Find and copy a file where the location is not fixed in Ansible

I am trying to find and copy files matching the pattern *-chrony-configuration.yaml from the machineconfig directory to the manifests directory.
The location of machineconfig varies depending on the user:
if the user is root, the folder is /root/machineconfig;
if the user is non-root, the folder is /home/machineconfig.
- name: Find machineconfig files generated from helpernode
  find:
    paths: "machineconfig/"
    patterns: "*-chrony-configuration.yaml"
  register: machine_file

- name: Copy machineconfig files generated from helpernode
  copy:
    src: "{{ item.path }}"
    dest: "{{ workdir }}/manifests"
    remote_src: yes
  with_items:
    - "{{ machine_file.files }}"
The above code errors out with:
"msg": "machineconfig/ was skipped as it does not seem to be a valid directory or it cannot be accessed\n"
Is there a way in Ansible to find the path of the file and use that path to make the copy?

What you could do, if your path is either /root/machineconfig or /home/machineconfig, as it seems, is to feed the paths parameter with a list, as the documentation proposes.
Given the task:
- find:
    paths:
      - /root/machineconfig
      - /home/machineconfig
    patterns: "*-chrony-configuration.yaml"
  register: machine_file
This will list the files you are looking for and raise a simple warning for any folder that does not exist.
With the playbook:
- hosts: localhost
  gather_facts: yes
  gather_subset:
    - min
  tasks:
    - find:
        paths:
          - /root/machineconfig
          - /home/machineconfig
        patterns: "*-chrony-configuration.yaml"
      register: machine_file

    - debug:
        var: machine_file.files | map(attribute='path')
This yields:
TASK [find] ****************************************************************
[WARNING]: Skipped '/root/machineconfig' path due to this access issue:
'/root/machineconfig' is not a directory
ok: [localhost]
TASK [debug] ***************************************************************
ok: [localhost] =>
  machine_file.files | map(attribute='path'):
  - /home/machineconfig/foobar-chrony-configuration.yaml
If the remote_user of the playbook is the user for which the machineconfig folder is created, then you can use the ansible_env.HOME fact in order to get the home directory of that user.
So, that would make your find task look like:
- name: Find machineconfig files generated from helpernode
  find:
    paths: "{{ ansible_env.HOME }}/machineconfig/"
    patterns: "*-chrony-configuration.yaml"
  register: machine_file
  become: no
Please mind that you need to gather some minimal facts for this to work:
- hosts: foobar
  gather_facts: yes
  gather_subset:
    - min

Related

Is there a way to skip a task/play when files are not found in Ansible?

Below I posted an example of what I currently have, but it doesn't resolve the issue.
ignore_errors still outputs the errors from the play but doesn't stop the tasks from completing. Is there a way to skip the play altogether and move on to the next?
- name: replace static with delta file
  copy:
    src: "/home/docs/delta.{{ inventory_hostname }}"
    dest: "/usr/share/static"
    backup: yes
  ignore_errors: yes
You could use a fileglob to prevent the task from running if the source file does not exist:
- name: replace static with delta file
  copy:
    src: "{{ item }}"
    dest: "/usr/share/static"
    backup: yes
  loop: "{{ query('fileglob', '/home/docs/delta.%s' % inventory_hostname) }}"
This fileglob will return either 0 or 1 results, so the task will be skipped if there is no matching file.
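If you want to check what the glob resolves to before relying on the skip behaviour, a quick debug task can preview it (a minimal sketch using the same path pattern as above):
- name: preview the fileglob result (an empty list means the copy task will be skipped)
  debug:
    msg: "{{ query('fileglob', '/home/docs/delta.%s' % inventory_hostname) }}"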
The first thing that comes to my mind is to create a task which checks whether the directory exists:
- name: Playbook name
  hosts: all
  tasks:
    - name: Task name
      stat:
        path: [path to the file or directory you want to check]
      register: register_name
And a second task that runs only if the directory exists:
- name: Task name
  debug:
    msg: "The file or directory exists"
  when: register_name.stat.exists
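Applied to the original question, the pattern could look like this (a sketch; note that stat runs on the managed host, so this assumes the delta file lives there, hence remote_src on the copy):
- name: check whether the delta file exists on the host
  stat:
    path: "/home/docs/delta.{{ inventory_hostname }}"
  register: delta_file

- name: replace static with delta file
  copy:
    src: "/home/docs/delta.{{ inventory_hostname }}"
    dest: /usr/share/static
    backup: yes
    remote_src: yes  # the source was stat'ed on the remote host, not the controller
  when: delta_file.stat.exists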

Pass parameters into ansible include_tasks

I wrote tasks that manipulate log files. I need to create a playbook that gets all log files and runs the tasks over them.
I get all log files using:
---
- name: loop task
  hosts: localhost
  tasks:
    - name: "find all log files in {{ folder_path }}"
      find:
        paths: "{{ folder_path }}/"
        file_type: file
        use_regex: yes
        patterns: "*.log"
      register: results

    - name: loop over all files
      include_tasks: file_tasks.yml
      vars:
        logs_path: ["{{ results.files.path }}"]
logs_path is the variable name in file_tasks.yml that does the manipulation.
How can I pass file_tasks.yml file by file?
Thanks
For example, given the files
shell> tree log
log
├── service1.log
├── service2.log
└── service3.log
the task
- find:
    paths: log
    file_type: file
    patterns: "*.log"
  register: results

- debug:
    var: item.path
  loop: "{{ results.files }}"
gives (abridged)
item.path: log/service3.log
item.path: log/service2.log
item.path: log/service1.log
If you want to process the iteration in an included file, create one. For example, put the debug task into the file:
shell> cat file_tasks.yml
- debug:
    var: item.path
Then the include gives the same results:
- include_tasks: file_tasks.yml
  loop: "{{ results.files }}"
You can specify the name of the variable for each loop using loop_var with loop_control.
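For example (a minimal sketch; log_item is an arbitrary name chosen for illustration):
- include_tasks: file_tasks.yml
  loop: "{{ results.files }}"
  loop_control:
    loop_var: log_item  # file_tasks.yml then refers to {{ log_item.path }} instead of {{ item.path }}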
The solution is to change the loop playbook like this:
---
- name: loop task
  hosts: localhost
  tasks:
    - name: "find all log files in {{ folder_path }}"
      find:
        paths: "{{ folder_path }}/"
        file_type: file
        patterns: "*.log"
      register: results

    - name: loop over all files
      include_tasks: file_tasks.yml
      vars:
        logs_path: "{{ item.path }}"
      loop: "{{ results.files }}"

restore backup file created with Copy

If a backup file is created using copy (with backup: yes), for example with this task:
- name: copy file
  copy:
    dest: path/to/dest
    src: path/to/src
    backup: yes
If the file path/to/dest already exists, it will be moved to a file looking like path/to/dest.12345.2006-07-08#09:10:11.
Is there a way to restore it, or to get the filename of the backup file in order to restore it?
The absolute file name of the backup file (if generated, so if changed is true) is returned in the output key backup_file, so (take the following as pseudocode, as I am not testing it):
- name: copy file
  copy:
    dest: path/to/dest
    src: path/to/src
    backup: yes
  register: copy_file

- debug: var=copy_file.backup_file

- name: restore backup
  copy:
    dest: path/to/dest
    src: "{{ copy_file.backup_file }}"
    remote_src: true
  when: copy_file.changed  # and some condition of yours
See: https://docs.ansible.com/ansible/latest/collections/ansible/builtin/copy_module.html#return-backup_file
@guido has demonstrated in his answer how to use the backup_file attribute returned by the copy module. As pointed out in my different comments, this is really handy if:
- you want to restore the file during the same run of Ansible, or
- you store the value somewhere when copying (e.g. a var in memory cache, a db entry, a file on disk...) in one run and retrieve it in a later run to perform your restore action (see the sketch after this list).
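To illustrate the second option, here is a minimal sketch that persists the value as a plain file on the controller (the /tmp location and file naming are arbitrary assumptions):
- name: copy file and remember the backup name for a later run
  copy:
    dest: path/to/dest
    src: path/to/src
    backup: yes
  register: copy_file

- name: persist the backup file name on the controller
  copy:
    content: "{{ copy_file.backup_file }}"
    dest: "/tmp/{{ inventory_hostname }}.backup_file"  # arbitrary path for the sketch
  delegate_to: localhost
  when: copy_file.changed
A later run can read the name back, e.g. with lookup('file', '/tmp/' ~ inventory_hostname ~ '.backup_file'), and feed it to a restore task like the one above.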
Here is another solution you can use if you don't have that info or don't want to manage it. For clarity's sake, I took the case where you want to restore the latest backup file available on disk. You can perfectly adapt this to meet any other specific requirement.
The basic workflow:
1. List all the backup files available on the target disk.
2. Choose the most recent one.
3. Copy that file back in place.
Note that although my demo playbook targets localhost, it will work with any target provided the file has backup candidates. I'll let you harden this point your own way and according to your own requirements.
Prior to running my playbook, I created a file and copied it several times with random content, creating backups each time. I used the following playbook to do so:
---
- name: Create some file with backup
  hosts: localhost
  gather_facts: false
  vars:
    my_file: /tmp/test_restore/toto.txt
  tasks:
    - name: copy file
      copy:
        dest: "{{ my_file }}"
        content: "{{ 1000 | random }}"
        backup: true
The resulting folder content is as follows:
/tmp/test_restore/
├── toto.txt
├── toto.txt.14644.2020-12-19#10:39:43~
├── toto.txt.14752.2020-12-19#10:39:45~
└── toto.txt.14861.2020-12-19#10:39:48~
Now comes the restore playbook:
---
- name: Restore latest file backup
  hosts: localhost
  gather_facts: false
  vars:
    my_file: /tmp/test_restore/toto.txt
  tasks:
    - name: "Find all backups for {{ my_file }}"
      find:
        recurse: no
        paths:
          - "{{ my_file | dirname }}"
        patterns:
          - '{{ my_file | basename }}\..*~'
        use_regex: true
      register: find_backup

    - name: Select the latest backup found on disk
      set_fact:
        latest_backup: "{{ (find_backup.files | sort(attribute='mtime') | last).path }}"

    - name: Show the latest selected backup
      debug:
        var: latest_backup

    - name: "Restore latest backup of {{ my_file }}"
      copy:
        src: "{{ latest_backup }}"
        remote_src: true
        dest: "{{ my_file }}"
Which gives:
PLAY [Restore latest file backup] ******************************************************************************************************************************************************************************************************
TASK [Find all backups for /tmp/test_restore/toto.txt] *********************************************************************************************************************************************************************************
ok: [localhost]
TASK [Select the latest backup found on disk] ******************************************************************************************************************************************************************************************
ok: [localhost]
TASK [Show the latest selected backup] *************************************************************************************************************************************************************************************************
ok: [localhost] => {
    "latest_backup": "/tmp/test_restore/toto.txt.14861.2020-12-19#10:39:48~"
}
TASK [Restore latest backup of /tmp/test_restore/toto.txt] *****************************************************************************************************************************************************************************
changed: [localhost]
PLAY RECAP *****************************************************************************************************************************************************************************************************************************
localhost : ok=4 changed=1 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
If you run that playbook a second time, the last task will report ok instead of changed, demonstrating, if needed, that the file has the same content as the latest backup.

ansible - ensure content of file is the same across servers

Using Ansible 2.9.12
Question: How do I configure Ansible to ensure the contents of a file are equal across at least 3 hosts, when the file is present on at least one host?
Imagine there are 3 hosts.
Host 1 does not have /file.txt.
Host 2 has /file.txt with contents hello.
Host 3 has /file.txt with contents hello.
Before the play is run, I am unaware whether the file is present or not. So the file could exist on host1, or host2 or host3. But the file exists on at least one of the hosts.
How would I ensure that each time Ansible runs, the files across the hosts are equal, so that in the end Host 1 has the same file with the same contents as Host 2 or Host 3?
I'd like this to be determined dynamically, instead of specifying host names or group names, e.g. when: inventory_hostname == host1.
I am not expecting a check to see whether the contents of Host 2 and Host 3 are equal.
I do, however, want this to be set up in an idempotent fashion.
The play below does the job, I think
shell> cat pb.yml
- hosts: all
  tasks:
    - name: Get status.
      stat:
        path: /file.txt
      register: status

    - block:
        - name: Create dictionary status.
          set_fact:
            status: "{{ dict(keys|zip(values)) }}"
          vars:
            keys: "{{ ansible_play_hosts }}"
            values: "{{ ansible_play_hosts|
                        map('extract', hostvars, ['status','stat','exists'])|
                        list }}"

        - name: Fail. No file exists.
          fail:
            msg: No file exists
          when: status.values()|list is not any

        - name: Set reference to first host with file present.
          set_fact:
            reference: "{{ status|dict2items|
                           selectattr('value')|
                           map(attribute='key')|
                           first }}"

        - name: Fetch file.
          fetch:
            src: /file.txt
            dest: /tmp
          delegate_to: "{{ reference }}"
          run_once: true

        - name: Copy file if not exist
          copy:
            src: "/tmp/{{ reference }}/file.txt"
            dest: /file.txt
          when: not status[inventory_hostname]
But this doesn't check that the existing files are in sync. It would be safer to sync all hosts, I think:
- name: Synchronize file
  synchronize:
    src: "/tmp/{{ reference }}/file.txt"
    dest: /file.txt
  when: not status[inventory_hostname]
Q: "FATAL. could not find or access '/tmp/test-multi-01/file.txt on the Ansible controller. However, folder /tmp/test-multi-03 is present with the file.txt in it."
A: There is a problem with the fetch module when the task is delegated to another host. When TASK [Fetch file.] is delegated to test-multi-01, which is localhost in this case (changed: [test-multi-03 -> 127.0.0.1]), the file will be fetched from test-multi-01 but will be stored in /tmp/test-multi-03/file.txt. The conclusion is that the fetch module ignores delegate_to when it comes to creating host-specific directories (not reported yet).
As a workaround, it's possible to set flat: true and store the files in a specific directory. For example, add the variable sync_files_dir with the directory, set flat: true on fetch, and use the directory to both fetch and copy the file:
- hosts: all
  vars:
    sync_files_dir: /tmp/sync_files
  tasks:
    - name: Get status.
      stat:
        path: /file.txt
      register: status

    - block:
        - name: Create dir for files to be fetched and synced
          file:
            state: directory
            path: "{{ sync_files_dir }}"
          delegate_to: localhost

        - name: Create dictionary status.
          set_fact:
            status: "{{ dict(keys|zip(values)) }}"
          vars:
            keys: "{{ ansible_play_hosts }}"
            values: "{{ ansible_play_hosts|
                        map('extract', hostvars, ['status','stat','exists'])|
                        list }}"

        - debug:
            var: status

        - name: Fail. No file exists.
          fail:
            msg: No file exists
          when: status.values()|list is not any

        - name: Set reference to first host with file present.
          set_fact:
            reference: "{{ status|dict2items|
                           selectattr('value')|
                           map(attribute='key')|
                           first }}"

        - name: Fetch file.
          fetch:
            src: /file.txt
            dest: "{{ sync_files_dir }}/"
            flat: true
          delegate_to: "{{ reference }}"
          run_once: true

        - name: Copy file if not exist
          copy:
            src: "{{ sync_files_dir }}/file.txt"
            dest: /file.txt
          when: not status[inventory_hostname]
We can achieve this by fetching the file from the hosts where it exists. The file(s) will then be available on the control machine. However, if the source file exists on more than one node, there will be no single source of truth.
Consider an inventory:
[my_hosts]
host1
host2
host3
Then the play below can fetch the file and use it to copy to all nodes:
# Fetch the file from remote host if it exists
- hosts: my_hosts
  tasks:
    - stat:
        path: /file.txt
      register: my_file

    - fetch:
        src: /file.txt
        dest: /tmp/
      when: my_file.stat.exists

    - find:
        paths:
          - /tmp
        patterns: file.txt
        recurse: yes
      register: local_file
      delegate_to: localhost

    - copy:
        src: "{{ local_file.files[0].path }}"
        dest: /tmp
If multiple hosts had this file, it would be in /tmp/{{ ansible_host }}. Then, as we won't have a single source of truth, our best estimate is to use the first file and apply it to all hosts.
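To actually put that first file back in place on every host, the final copy could target the real path instead of /tmp (a sketch assuming /file.txt is the intended destination):
- copy:
    src: "{{ local_file.files[0].path }}"
    dest: /file.txt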
Well, I believe the get_url module is pretty versatile: it allows for local file paths or paths from a web server. Try it and let me know.
- name: Download files on all hosts
  hosts: all
  tasks:
    - name: Download file from a file path
      get_url:
        url: file:///tmp/file.txt
        dest: /tmp/
Edited answer:
(From documentation: For the synchronize module, the “local host” is the host the synchronize task originates on, and the “destination host” is the host synchronize is connecting to)
- name: Check that the file exists
  stat:
    path: /etc/file.txt
  register: stat_result

- name: copy the file to other hosts by delegating the task to the source host
  synchronize:
    src: path/host
    dest: path/host
  delegate_to: my_source_host
  when: stat_result.stat.exists

Ansible Playbook - Synchronize module - Register variable and with_items

I'm trying to write a playbook that will rsync the folders from source to target after a database refresh. Our PeopleSoft HR application also requires a filesystem refresh along with the database. I'm new to Ansible and not an expert with Python. I've written this, but my playbook fails if any of the with_items folders doesn't exist. I'd like to use this playbook for all apps, and the folders may differ between apps. How can I skip the folders that don't exist in the source? I'm passing {{ target }} at the command line.
---
- hosts: '<hostname>'
  remote_user: <user>
  tasks:
    - shell: ls -l /opt/custhome/prod/
      register: folders

    - name: "Copy PROD filesystem to target"
      synchronize:
        src: "/opt/custhome/prod/{{ item }}"
        dest: "/opt/custhome/dev/"
        delete: yes
      when: "{{ folders == item }}"
      with_items:
        - 'src/cbl/'
        - 'sqr/'
        - 'bin/'
        - 'NVISION/'
In this case, NVISION doesn't exist in the HR app but it does in the FIN app. The playbook is failing because that folder doesn't exist in the source.
You can use the find module to find and store the paths to the source folders, and then iterate over the results. Example playbook:
- hosts: '<hostname>'
  remote_user: <user>
  tasks:
    - name: find all directories
      find:
        file_type: directory
        paths: /opt/custhome/prod/
        patterns:
          - "src"
          - "sqr"
          - "bin"
      register: folders

    # debug to understand contents of {{ folders }} variable
    # - debug: msg="{{ folders }}"

    - name: "Copy PROD filesystem to target"
      synchronize:
        src: "{{ item.path }}"
        dest: "/opt/custhome/dev/"
        delete: yes
      with_items: "{{ folders.files }}"
You may want to use recurse to descend into subdirectories and use_regex to use the power of Python regex instead of shell globbing.
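For example, a find task using both options might look like this (a sketch; the regex is just an illustration, matching directory basenames exactly):
- name: find matching directories recursively
  find:
    file_type: directory
    paths: /opt/custhome/prod/
    recurse: yes
    use_regex: yes
    patterns:
      - '^(src|sqr|bin|NVISION)$'  # find matches patterns against the basename
  register: folders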
