Is there a with_fileglob that works remotely in Ansible?

I want something similar to with_fileglob, but that globs files on the remote/target machine rather than on the machine running Ansible.

Use the find module to filter the files on the remote host, then process the resulting list:

- name: Get files on remote machine
  find:
    paths: /path/on/remote
  register: my_find

- debug:
    var: item.path
  with_items: "{{ my_find.files }}"
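
find can also filter before you loop, which keeps the play entirely in modules; a minimal sketch, where the *.conf pattern and the cleanup task are illustrative assumptions:

- name: Get only matching files on the remote machine
  find:
    paths: /path/on/remote
    patterns: "*.conf"   # hypothetical filter
  register: my_find

- name: Remove each matched file
  file:
    path: "{{ item.path }}"
    state: absent
  loop: "{{ my_find.files }}"

Each result item is a dictionary of file metadata, which is why the loops reference item.path rather than item.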

All of the with_* looping mechanisms are local lookups, unfortunately, so there's no really clean way to do this in Ansible. Remote operations by design must be enclosed in tasks, since they need to deal with connections, inventory, and so on.
What you can do is generate your fileglob by shelling out on the remote host, registering the output, and then looping over the stdout_lines part of the registered result.
So a trivial example may be something like this:
- name: get files in /path/
  shell: ls /path/*
  register: path_files

- name: fetch these back to the local Ansible host for backup purposes
  fetch:
    src: "{{ item }}"   # ls /path/* already returns absolute paths
    dest: /path/to/backups/
  with_items: "{{ path_files.stdout_lines }}"
This would connect to the remote host (e.g., host.example.com), get all the file names under /path/, and then copy them back to the Ansible host under /path/to/backups/host.example.com/path/, since fetch by default prefixes the destination with the inventory hostname and the full remote path.
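
If that extra hostname/path nesting is unwanted, fetch supports flat: yes at the cost of spelling out the per-host destination yourself; a sketch, reusing the registered path_files variable from above:

- name: fetch files into one directory per host
  fetch:
    src: "{{ item }}"
    dest: "/path/to/backups/{{ inventory_hostname }}/"
    flat: yes
  with_items: "{{ path_files.stdout_lines }}"

With flat: yes and a dest ending in /, only each file's basename is appended, so files with the same name in different remote directories would collide.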

Using ls /path/* didn't work for me, so here's an example that uses find with a simple -name pattern to delete all nginx managed virtual hosts:
- name: get all managed vhosts
  shell: find /etc/nginx/sites-enabled/ -type f -name \*-managed.conf
  register: nginx_managed_virtual_hosts

- name: delete all managed nginx virtual hosts
  file:
    path: "{{ item }}"
    state: absent
  with_items: "{{ nginx_managed_virtual_hosts.stdout_lines }}"
You could use it to find all files with a specific extension or any other mix. For instance, to simply get all files in a directory: find /etc/nginx/sites-enabled/ -type f. A module-only variant is sketched below.
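
The same cleanup can also stay entirely within Ansible modules by combining the find module from the first answer with the file module; a sketch of that variant (note the loop items are then dictionaries, hence item.path):

- name: get all managed vhosts
  find:
    paths: /etc/nginx/sites-enabled
    patterns: "*-managed.conf"
    file_type: file
  register: nginx_managed_virtual_hosts

- name: delete all managed nginx virtual hosts
  file:
    path: "{{ item.path }}"
    state: absent
  loop: "{{ nginx_managed_virtual_hosts.files }}"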

Here's a way to do it so that you can loop through everything found. In my case, I had to look for all instances of pip in order to remove awscli in preparation for installing awscli v2.0. I've done something similar with lineinfile to strip out vars in /etc/skel dotfiles.
- name: search for pip
  find:
    paths: [/usr/local/bin, /usr/bin]
    file_type: any
    pattern: pip*
  register: foundpip

- name: Parse out pip paths (say that 3 times fast)
  set_fact:
    pips: "{{ foundpip | json_query('files[*].path') }}"

- name: List all the found versions of pip
  debug:
    msg: "{{ pips }}"

# upgrading pip often leaves broken symlinks or older wrappers behind,
# which doesn't affect pip itself but breaks playbooks, so ignore errors
- name: remove awscli with found versions of pip
  pip:
    name: awscli
    state: absent
    executable: "{{ item }}"
  loop: "{{ pips }}"
  ignore_errors: yes
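
One caveat: the json_query filter requires the jmespath Python library on the control node. If you'd rather avoid that dependency, the same list can be built with core Jinja2 filters; a sketch:

- name: Parse out pip paths without jmespath
  set_fact:
    pips: "{{ foundpip.files | map(attribute='path') | list }}"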

Related

Delete files older than x days inside folder of folders

I would like to use Ansible to delete older files. I have a data log folder; inside this folder I have multiple directories:
/data/log/folder1/
/data/log/folder2/
....
I tried this Ansible playbook:

---
- hosts: all
  tasks:
    - name: find all files that are older than 10 days
      find:
        paths: /data/log/*/
        age: 10d
        recursive: yes
      register: filesOlderThan10
    - name: remove older than 10
      file:
        path: "{{ item.path }}"
        state: absent
      with_items: "{{ (filesOlderThan10.files }}"
When I launch the playbook nothing is deleted, and I'm not sure the /data/log/*/ syntax is valid. I am therefore looking for suggestions to improve this code.
As of now I've found three or four errors in the playbook:
1. Use become (or make sure it's set in the config/inventory) if you need to remove files you don't have permission on.
2. paths should be a fully qualified path, and no wildcards are accepted in it, I believe. It should be paths: /data/log.
3. recursive is not a valid option for the find module. It should be recurse.
4. There is an unneeded '(' in the last line.
The code below should work:
---
- hosts: all
  tasks:
    - name: find all files that are older than 10 days
      find:
        paths: /data/log
        age: 10d
        recurse: yes
      register: filesOlderThan10
    - name: remove older than 10
      file:
        path: "{{ item.path }}"
        state: absent
      with_items: "{{ filesOlderThan10.files }}"
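
If only some files should be purged, find also accepts patterns, and age_stamp controls which timestamp the age test uses (mtime is the default); a sketch with a hypothetical *.log pattern:

- name: find old log files only
  find:
    paths: /data/log
    patterns: "*.log"   # hypothetical; match log files only
    age: 10d
    age_stamp: mtime    # the default, shown for clarity
    recurse: yes
  register: filesOlderThan10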
I had previously been using a cron job with find, and after deciding to move to AWX and checking here and other articles, I came up with the following. Tested and working as we speak.
The first task registers all files older than 3 days as matched_files_dirs.
The second task removes them.
It does the job, but it is slower than just running cron on Linux.
---
- name: Cleanup
  hosts: linux
  gather_facts: false
  tasks:
    - name: Collect files
      shell: find /opt/buildagent/system*/target_directory -type f -mtime +3
      register: matched_files_dirs
    - name: Remove files
      become: yes   # become_user alone does not enable privilege escalation
      become_user: root
      file:
        path: "{{ item }}"
        state: absent
      with_items: "{{ matched_files_dirs.stdout_lines }}"
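
The shell find is used here because the find module's paths parameter does not expand wildcards like system*. If the directories are known in advance, a module-only equivalent might look like this sketch (the expanded path is an assumption); its results land in .files as dictionaries, so a removal loop would reference item.path:

- name: Collect files with the find module
  find:
    paths: /opt/buildagent/system1/target_directory   # hypothetical expanded path
    age: 3d
    recurse: yes
  register: found_old_files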

How to copy local files named with the destination server name?

How would you translate this small script into an Ansible playbook?
The files to copy are named [ServerName].[extension], and the destination server is ServerName.
for file in $(ls /var/tmp)
do
    ServerName=$(echo $file | awk -F. 'NF{NF--};1')   # strip the extension to get the server name
    scp /var/tmp/$file $ServerName:/var/tmp/
    scp /var/tmp/pkg.rpm $ServerName:/var/tmp/
    ssh $ServerName "cd /var/tmp; yum -y localinstall pkg.rpm"
done
Thanks for your help
The idea would be to have something like this (but working, of course)
- name: main loop
  copy:
    src: "{{ item }}"
    dest: "/var/tmp/myfile.json"

- name: Install package
  yum:
    name: "packageToInstall"
    state: present
  delegate_to: "{{ item.split('/')[-1][:-5] }}"
  with_fileglob:
    - "/var/temp/*json"
If you were to write this in YAML, you should use ansible_hostname; Ansible already has a lot of information about each host gathered via setup.
- name: copying files
  copy:
    src: /mine/file.ext
    dest: /etc/{{ ansible_hostname }}.ext
There is more about this in the copy module documentation, and all facts gathered during setup are listed in the setup module documentation.
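
One way to get the behavior of the original shell loop is to invert it: run the play against all target servers and let each host pick up the file carrying its own name via inventory_hostname. A sketch, where the .json extension, the paths, and the package install step are assumptions taken from the question:

- hosts: all
  tasks:
    - name: copy the file named after this server
      copy:
        src: "/var/tmp/{{ inventory_hostname }}.json"
        dest: /var/tmp/

    - name: copy the package
      copy:
        src: /var/tmp/pkg.rpm
        dest: /var/tmp/

    - name: install the package from the local rpm
      yum:
        name: /var/tmp/pkg.rpm
        state: present

This assumes the inventory hostnames match the file names; if the files are named after ansible_hostname instead, substitute that fact.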

Ansible Playbook - Synchronize module - Register variable and with_items

I'm trying to write a playbook that will rsync folders from a source to a target after a database refresh. Our PeopleSoft HR application requires a filesystem refresh along with the database. I'm new to Ansible and not an expert with Python. I've written this, but the playbook fails if any of the with_items entries doesn't exist. I'd like to use this playbook for all apps, and the folders may differ between apps. How can I skip the folders that don't exist on the source? I'm passing {{ target }} at the command line.
---
- hosts: '<hostname>'
  remote_user: <user>
  tasks:
    - shell: ls -l /opt/custhome/prod/
      register: folders
    - name: "Copy PROD filesystem to target"
      synchronize:
        src: "/opt/custhome/prod/{{ item }}"
        dest: "/opt/custhome/dev/"
        delete: yes
      when: "{{ folders == item }}"
      with_items:
        - 'src/cbl/'
        - 'sqr/'
        - 'bin/'
        - 'NVISION/'
In this case, NVISION doesn't exist in the HR app but it does in the FIN app. The playbook is failing because that folder doesn't exist on the source.
You can use the find module to locate and store the paths to the source folders and then iterate over the results. Example playbook:
- hosts: '<hostname>'
  remote_user: <user>
  tasks:
    - name: find all directories
      find:
        file_type: directory
        paths: /opt/custhome/prod/
        patterns:
          - "src"
          - "sqr"
          - "bin"
      register: folders
    # debug to understand the contents of the {{ folders }} variable
    # - debug: msg="{{ folders }}"
    - name: "Copy PROD filesystem to target"
      synchronize:
        src: "{{ item.path }}"
        dest: "/opt/custhome/dev/"
        delete: yes
      with_items: "{{ folders.files }}"
You may want to use recurse to descend into subdirectories and use_regex to use the power of Python regular expressions instead of shell globbing; an example of the latter is sketched below.
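
With use_regex, the shell-style patterns above collapse into one anchored expression, and a name like NVISION simply won't match on hosts where the directory doesn't exist; a sketch:

- name: find all directories matching a regex
  find:
    file_type: directory
    paths: /opt/custhome/prod/
    use_regex: yes
    patterns:
      - '^(src|sqr|bin|NVISION)$'
  register: folders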

Move files on remote system, only if destination doesn't exist, with Ansible

I'm trying to write an Ansible role that moves a number of files on the remote system. I found a Stack Overflow post about how to do this, which essentially says "just use the command module with 'mv'". I have a single task defined with a with_items statement, like this, where each item in dirs is a dictionary with src and dest keys:
- name: Move directories
  command: mv {{ item.src }} {{ item.dest }}
  with_items: dirs
This is good and it works, but I run into problems if the destination directory already exists. I don't want to overwrite it, so I thought about trying to stat each dest directory first. I wanted to update the dirs variable with the stat info, but as far as I know, there isn't a good way to set or update variables once they're defined. So I used stat to get the info on each directory and then saved the data with register:
- name: Check if directories already exist
  stat: path={{ item.dest }}
  with_items: dirs
  register: dirs_stat
Is there a way to tie the registered stat info to the mv commands? This would be easy if it were a single directory. The looping is what makes this tricky. Is there a way to do this without unrolling this loop into two tasks per directory?
This is not the simplest solution by any means, but if you wanted to use Ansible and not "unroll":
---
- hosts: all
  vars:
    dirs:
      - src: /home/ubuntu/src/test/src1
        dest: /home/ubuntu/src/test/dest1
      - src: /home/ubuntu/src/test/src2
        dest: /home/ubuntu/src/test/dest2
  tasks:
    - stat:
        path: "{{ item.dest }}"
      with_items: dirs
      register: dirs_stat
    - debug:
        msg: "should not copy {{ item.0.src }}"
      with_together:
        - dirs
        - dirs_stat.results
      when: item.1.stat.exists
Simply adapt the debug task to run the appropriate command task instead, and change the when: to when: not ....
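
Spelled out, the adapted task might look like this (same with_together structure as the debug example):

- name: move directory unless destination exists
  command: mv {{ item.0.src }} {{ item.0.dest }}
  with_together:
    - dirs
    - dirs_stat.results
  when: not item.1.stat.exists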
You can use the stat module in your playbook to check whether the destination exists, and move the file only if it doesn't.
---
- name: Demo Playbook
  hosts: all
  become: yes
  tasks:
    - name: check destination
      stat:
        path: /path/to/dest
      register: p
    - name: move file if destination does not exist
      command: mv /path/to/src /path/to/dest
      when: not p.stat.exists
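
As an aside, the command module's creates argument expresses the same guard without a separate stat task: the command is skipped whenever the given path already exists. A sketch:

- name: move file only if destination doesn't exist
  command: mv /path/to/src /path/to/dest
  args:
    creates: /path/to/dest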

How can I run an Ansible command if a certain file changed?

I am deploying my Django app with Ansible, using:

- name: Upgrade the virtualenv.
  pip: requirements={{project_root}}/www/requirements.txt virtualenv={{project_root}}/www/virtualenv

But I only want to run that if requirements.txt has changed since the last run.
We need to determine whether any of the requirement files have changed. The steps are as follows:
1. Touch the temp requirement files. (If they didn't exist, the md5 will be different for the new blank file.)
2. Calculate the md5 hash of the previous requirement files.
3. Calculate the md5 hash of the current requirement files (the ones just pulled down from Git).
4. Iterate through the results of these stat commands in step, comparing the md5 hashes, and register the output of the comparison.
5. Only if ANY of the results in step 4 changed do we install the pip packages.
6. Copy the current requirement files to the tmp location.
Here's my playbook; {{ virtualenv.requirements_files }} is a list of requirement files, e.g. ['/work/project/requirements.txt', '/work/project/requirements-prod.txt']:
- name: Assures temp requirements directory exists
  file: path="/tmp{{ virtualenv.path }}" state=directory
  sudo: yes
  when: install_pip_packages

- name: Assures temp requirements files exist
  file: path="/tmp{{ item }}" state=touch
  sudo: yes
  with_items: virtualenv.requirements_files
  when: install_pip_packages

- name: Calculate md5 of temp requirements
  stat: path="/tmp{{ item }}"
  with_items: virtualenv.requirements_files
  register: temp_requirements_stat
  when: install_pip_packages

- name: Calculate md5 of current requirements
  stat: path="{{ item }}"
  with_items: virtualenv.requirements_files
  register: current_requirements_stat
  when: install_pip_packages

- name: Check requirement files for changes
  command: test {{ temp_requirements_stat.results[item.0].stat.md5 }} = {{ current_requirements_stat.results[item.0].stat.md5 }}
  changed_when: "requirements_check.rc != 0"
  failed_when: requirements_check.stderr
  with_indexed_items: virtualenv.requirements_files
  register: requirements_check
  when: install_pip_packages

- name: Install packages required by the Django app inside virtualenv
  pip: virtualenv={{ virtualenv.path }} extra_args='-U' requirements="{{ virtualenv.requirements_files | join(' -r ') }}"
  when: install_pip_packages and requirements_check.changed

- name: Copy requirements to /tmp
  command: cp "{{ item }}" "/tmp{{ item }}"
  sudo: yes
  with_items: virtualenv.requirements_files
  when: install_pip_packages
Here are two options:
1. Put your requirements.txt under Ansible control and use the copy or template module, then invoke the pip module via a notify: statement (see the sketch after this list).
2. The second way is more complex:
   - retrieve the md5 sum of requirements.txt on each Ansible run and compare it with an md5 saved somewhere on the server (the stat module could be used)
   - retrieve the pre-saved md5 sum of requirements.txt
   - if the current md5 is not equal to the pre-saved one, invoke the pip task (with a when: statement)
   - save the new md5 somewhere on the server for the next Ansible run
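
Option 1 is the least code if requirements.txt can live alongside the playbook; a sketch, where the play layout, the source file location, and the handler name are assumptions:

- hosts: app
  tasks:
    - name: upload requirements.txt
      copy:
        src: requirements.txt
        dest: "{{ project_root }}/www/requirements.txt"
      notify: upgrade the virtualenv

  handlers:
    - name: upgrade the virtualenv
      pip:
        requirements: "{{ project_root }}/www/requirements.txt"
        virtualenv: "{{ project_root }}/www/virtualenv"

Because copy reports changed only when the file content actually differs, the handler (and therefore pip) runs only when requirements.txt really changed.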
I use this pretty short workaround for git repositories.

- name: get requirements changes since last pull
  shell: "cd {{ project_root }}; git log --name-status --oneline origin/master
    {{ git_result.before }}..{{ git_result.after }} | grep requirements.txt"
  register: pip_check
  failed_when: false

- name: update pip requirements
  pip: requirements={{ project_root }}/requirements.txt
       virtualenv=~/.virtualenvs/www/
  when: pip_check.stdout_lines

It's not a universal or cross-platform recipe, but it works well in many situations.
