Creating directory symlinks in a nested loop - ansible

I would appreciate it if anyone could point me in the right direction. For each user in a list, I'm trying to find the directories in that user's $HOME/dotfiles directory and then create a symlink to each of those directories in $HOME.
# Get list of directories in $HOME/dotfiles
- name: Get list of directories in $HOME/dotfiles
  find:
    paths: "/home/{{ user.username }}/dotfiles"
    file_type: directory
    recurse: false
  register: dirs_matched
  become: "{{ user.username }}"
  loop: "{{ users | flatten(levels=1) }}"
  loop_control:
    loop_var: user

# Symlink any directories in dotfiles to $HOME
- name: Symlink dirs in ~/dotfiles to $HOME
  file:
    src: "{{ item.0.path }}"
    dest: "/home/{{ item.1.username }}/{{ item.0.path | basename }}"
    state: link
    force: true
  loop: "{{ dirs_matched.files | product(users) | list }}"
  when: dirs_matched.matched > 0
I do get results, but they end up in dirs_matched.results.files, and I'm not sure how to map them into the loop of the file module task.

You need a subelements loop here, not a nested loop built with product.
For every top-level item (a user's find result), iterate over every subelement (its files).
- name: Symlink dirs in ~/dotfiles to $HOME
  file:
    src: "{{ item.1.path }}"
    dest: "/home/{{ item.0.user.username }}/{{ item.1.path | basename }}"
    state: link
    force: true
  loop: "{{ dirs_matched.results | subelements('files') }}"
A when condition is not required, because looping over zero elements simply does nothing.
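To see exactly what the subelements loop produces before wiring it into file, a debug task can print each user/path pair; this is just a sketch, assuming the dirs_matched variable registered by the find task above:
- name: Show each (user, directory) pair produced by subelements
  debug:
    msg: "{{ item.0.user.username }} -> {{ item.1.path }}"
  loop: "{{ dirs_matched.results | subelements('files') }}"
Here item.0 is one registered result of the find loop (which carries the original loop_var user), and item.1 is one entry of that result's files list.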

Related

Recursively sync files and links to multiple users, preserving permissions, but not owner and group

I have a directory with several tools and text files that I would like to copy to multiple users' home directories on a given host. There are a couple of caveats:
I don't want to list the files in the playbook
Some files are executable, some are not
I want the files to be owned by their respective users
I want the symlinks to be copied as-is
When pushing to the ansible_user home directory there is no issue, as ansible.posix.synchronize does the job very well with archive: true and owner/group set to false:
- name: Sync testing_files repo to ansible_user
  ansible.posix.synchronize:
    src: ~/testing_files/
    dest: ~/testing_files/
    mode: push
    archive: true
    delete: true
    owner: false
    group: false
  register: rsync_output
The links are properly handled as well (they are rsynced as symlinks).
However, the problem is populating the same content for the other users. I tried the following approach:
- name: Sync testing_files repo to extra_users
  ansible.builtin.copy:
    src: ~/testing_files/
    dest: "~{{ item }}/testing_files/"
    remote_src: true
    mode: preserve
    owner: "{{ item }}"
    follow: true  # I tried with `false` as well
  with_items:
    - "{{ extra_users if extra_users is not none else [] }}"
The file permissions are correct, and so is the owner; however:
the group of each file remains the same as that of the source file
the symlinks are ignored
How can I make it work? For the group issue, the only solution I came up with is another task that runs stat to check the group and saves it for later use, e.g. like this:
- name: Get group of homedir
  ansible.builtin.stat:
    path: "~{{ item }}"
  register: homedir
  with_items:
    - "{{ extra_users_or_empty }}"

- name: Sync testing_files repo to extra_users
  ansible.builtin.copy:
    src: ~/testing_files/
    dest: "~{{ item }}/testing_files/"
    remote_src: true
    mode: preserve
    owner: "{{ item }}"
    group: "{{ homedir.results | selectattr('item', 'eq', item) | map(attribute='stat.gr_name') | first }}"
    follow: true
  with_items:
    - "{{ extra_users_or_empty }}"
(NOTE: extra_users_or_empty: "{{ extra_users if extra_users is not none else [] }}")
However, that feels like something that should be achievable in a more elegant way. As for the symlinks, I have no idea why ansible.builtin.copy ignores them.
Any ideas?
Huh, OK, it seems that when using remote_src we should set local_follow instead of follow. The following solution handles symlinks properly:
- name: Get group of homedir
  ansible.builtin.stat:
    path: "~{{ item }}"
  register: homedir
  with_items:
    - "{{ extra_users_or_empty }}"

- name: Sync testing_files repo to extra_users
  ansible.builtin.copy:
    src: ~/testing_files/
    dest: "~{{ item }}/testing_files/"
    remote_src: true
    mode: preserve
    owner: "{{ item }}"
    group: "{{ homedir.results | selectattr('item', 'eq', item) | map(attribute='stat.gr_name') | first }}"
    local_follow: false
  with_items:
    - "{{ extra_users_or_empty }}"

Ansible delete Files with wildcard/regex/glob with exception

I want to delete files based on a wildcard but also add exceptions to the rule.
- hosts: all
  tasks:
    - name: Ansible delete file wildcard
      find:
        paths: /etc/wild_card/example
        patterns: "*.txt"
        use_regex: true
      register: wildcard_files_to_delete

    - name: Ansible remove file wildcard
      file:
        path: "{{ item.path }}"
        state: absent
      with_items: "{{ wildcard_files_to_delete.files }}"
For example, I want to exclude a file named "important.txt". How can I do that?
Just add a when condition to the task that deletes files. E.g., something like:
- name: Ansible remove file wildcard
  file:
    path: "{{ item.path }}"
    state: absent
  when: item.path != '/etc/wild_card/example/important.txt'
  with_items: "{{ wildcard_files_to_delete.files }}"
This will skip a specific file. If you have a list of files to skip you could do this instead:
- name: Ansible remove file wildcard
  file:
    path: "{{ item.path }}"
    state: absent
  when: item.path not in files_to_skip
  with_items: "{{ wildcard_files_to_delete.files }}"
  vars:
    files_to_skip:
      - /etc/wild_card/example/important.txt
      - /etc/wild_card/example/saveme.txt
And if you want to preserve files based on some sort of pattern, you could make use of ansible's match or search tests:
- name: Ansible remove file wildcard
  file:
    path: "{{ item.path }}"
    state: absent
  when: item.path is not search('important.txt')
  with_items: "{{ wildcard_files_to_delete.files }}"

Ansible Create SubFolders Matching Pattern

I have an Ansible playbook that creates directories from a passed-in array of directories, an owner, and permissions. Our admins are worried that someone will create directories under our O/S volumes and cause issues with the system. Since we only have a few folders that require root, I'm researching how to whitelist the specific folders that may be created as root; other directories that don't require root should be created with our internal user.
This is what I've come up with, but I'm concerned about /vs_volue/etc being passed instead of /etc. I can't find a "starts with /etc" test, for example. Is there a better way?
---
- name: Create Directories
  hosts: target_hosts
  vars:
    dir_list: '{{ dir_list }}'
    permissions: '{{ permissions }}'
    linux_user: 'webuser'
    whitelist_dir:
      - "/etc"
      - "/usr"
  tasks:
    - name: User to root when creating folders in /etc or /usr
      set_fact:
        linux_user: "root"
      when: dir_list | string | regex_search('{{ item }}')
      with_items:
        - "{{ whitelist_dir }}"

    - name: Create Directories as WebUser by Directory Array Lists by Line Feed
      file:
        path: "{{ item }}"
        mode: "{{ permissions }}"
        recurse: yes
        state: directory
      become: true
      become_user: "{{ linux_user }}"
      with_items: "{{ dir_list.split('\n') }}"
      when: dir_list | search('\n')
Try this.
main.yml
- hosts: target_hosts
  vars:
    default_linux_user: "webuser"
    permissions: "{{ permissions | default(0664) }}"
    whitelist_dir:
      - "^/etc/.*"
      - "^/usr/.*"
  tasks:
    - include_tasks: create_dir.yml
      loop: "{{ dir_list.split('\n') }}"
      loop_control:
        loop_var: dir
create_dir.yml
- block:
    - set_fact:
        linux_user: "{{ 'root' if dir is regex(item) else default_linux_user }}"
      when: linux_user is not defined or linux_user != 'root'
      loop: "{{ whitelist_dir }}"

    - debug:
        msg: "For {{ dir }} - {{ linux_user }} will be set as owner"

    - file:
        path: "{{ dir }}"
        state: directory
        mode: "{{ permissions }}"
        owner: "{{ linux_user | default(default_linux_user) }}"
        recurse: yes
      become: true
      become_user: root
      become_method: sudo
  always:
    - set_fact:
        linux_user: "{{ default_linux_user }}"

How to create multiple symlinks for the folders/files under the same source

I want to create a folder, temp2, that stores symlinks to all the subfolders/files of another folder, temp1. with_items can complete this task, but it requires listing every folder/file name, as the script below shows:
- name: "create folder: temp2 to store symlinks"
file:
path: "/etc/temp2"
state: directory
- name: "create symlinks & store in temp2"
file:
src: "/etc/temp1/{{ item.src }}"
dest: "/etc/temp2/{{ item.dest }}"
state: link
force: yes
with_items:
- { src: 'BEAM', dest: 'BEAM' }
- { src: 'cfg', dest: 'cfg' }
- { src: 'Core', dest: 'Core' }
- { src: 'Data', dest: 'Data' }
This is not flexible, because subfolders/files under temp1 may be added or removed, and I would need to update the script above frequently to keep the symlinks current.
Is there any way to detect all the files/folders under temp1 automatically instead of maintaining the with_items list?
The following code works under Ansible 2.8:
- name: Find all files in ~/commands
  find:
    paths: ~/commands
  register: find

- name: Create symlinks to /usr/local/bin
  become: True
  file:
    src: "{{ item.path }}"
    path: "/usr/local/bin/{{ item.path | basename }}"
    state: link
  with_items: "{{ find.files }}"
You can create a list of files using the find module:
Return a list of files based on specific criteria. Multiple criteria are AND'd together.
You'll likely want to leave recurse at its default of false, since subfolders may exist under temp1 and you only want to link the top-level entries.
You need to register the results of the module with a register declaration:
register: find
In the next step you need to iterate over the files list from the results:
with_items: "{{ find.files }}"
and refer to the value of the path key. You already know how to do it.
You will also need to extract the filename from the path, so that you can append it to the destination path. Use the basename filter for that.
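Putting those pieces together for the temp1/temp2 layout from the question, a minimal sketch (paths taken from the question; file_type: any is assumed so that both files and subfolders are linked) could look like this:
- name: Find all entries directly under /etc/temp1
  find:
    paths: /etc/temp1
    file_type: any
    recurse: false
  register: find

- name: Symlink each entry into /etc/temp2
  file:
    src: "{{ item.path }}"
    dest: "/etc/temp2/{{ item.path | basename }}"
    state: link
    force: yes
  with_items: "{{ find.files }}"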

Can the templates module handle multiple templates / directories?

I believe the Ansible copy module can take a whole bunch of "files" and copy them in one hit; I believe this is achieved by copying a directory recursively.
Can the Ansible template module take a whole bunch of "templates" and deploy them in one hit? Is there such a thing as deploying a folder of templates and applying them recursively?
The template module itself runs the action on a single file, but you can use with_filetree to loop recursively over a specified path:
- name: Ensure directory structure exists
  ansible.builtin.file:
    path: '{{ templates_destination }}/{{ item.path }}'
    state: directory
  with_community.general.filetree: '{{ templates_source }}'
  when: item.state == 'directory'

- name: Ensure files are populated from templates
  ansible.builtin.template:
    src: '{{ item.src }}'
    dest: '{{ templates_destination }}/{{ item.path }}'
  with_community.general.filetree: '{{ templates_source }}'
  when: item.state == 'file'
And for templates in a single directory you can use with_fileglob.
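For example, a flat directory of .j2 templates could be handled with a sketch like the following (templates_source and templates_destination are the same assumed variable names as in the tasks above):
- name: Template every file in a flat templates directory
  ansible.builtin.template:
    src: "{{ item }}"
    dest: "{{ templates_destination }}/{{ item | basename | regex_replace('\\.j2$', '') }}"
  with_fileglob:
    - "{{ templates_source }}/*.j2"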
This answer provides a working example of the approach laid out by techraf above.
with_fileglob expects only files to live within the templates folder - see https://serverfault.com/questions/578544/deploying-a-folder-of-template-files-using-ansible
with_fileglob will only parse files in the templates folder
with_filetree maintains the directory structure when moving the template files to dest. It auto creates those directories for you at dest.
with_filetree will parse all files in the templates folder and nested directories
- name: Approve certs server directories
  file:
    state: directory
    dest: '~/{{ item.path }}'
  with_filetree: '../templates'
  when: item.state == 'directory'

- name: Approve certs server files
  template:
    src: '{{ item.src }}'
    dest: '~/{{ item.path }}'
  with_filetree: '../templates'
  when: item.state == 'file'
Essentially, think of this approach as copying and pasting a directory and all its contents from A to B and whilst doing so, parsing all templates.
I could not manage to do it with the other answers. This is what worked for me:
- name: Template all the templates and place them in the corresponding path
  template:
    src: "{{ item.src }}"
    dest: "{{ destination_path }}/{{ item.path | regex_replace('\\.j2$', '') }}"
    force: yes
  with_filetree: '{{ role_path }}/templates'
  when: item.state == 'file'
In my case the folder contains both plain files and Jinja2 templates.
- name: copy all directories recursively
  file: dest={{ templates_dest_path }}/{{ item | replace(templates_src_path + '/', '') }} state=directory
  with_items: "{{ lookup('pipe', 'find ' + templates_src_path + '/ -type d').split('\n') }}"

- name: copy all files recursively
  copy: src={{ item }} dest={{ templates_dest_path }}/{{ item | replace(templates_src_path + '/', '') }}
  with_items: "{{ lookup('pipe', 'find ' + templates_src_path + '/ -type f -not -name *.j2').split('\n') }}"

- name: copy templates files recursively
  template: src={{ item }} dest={{ templates_dest_path }}/{{ item | replace(templates_src_path + '/', '') | replace('.j2', '') }}
  with_items: "{{ lookup('pipe', 'find ' + templates_src_path + '/*.j2 -type f').split('\n') }}"
I did it and it worked. \o/
- name: "Create file template"
template:
src: "{{ item.src }}"
dest: "{{ your_dir_remoto }}/{{ item.dest }}"
loop:
- { src: '../templates/file1.yaml.j2', dest: 'file1.yaml' }
- { src: '../templates/file2.yaml.j2', dest: 'file2.yaml' }
