Ansible playbook to run a command for each subdirectory in a directory

I have the following directory structure:
parent_dir/
├── subdir_1
├── subdir_2
└── subdir_3
The subdirectories don't have fixed names and there can be an arbitrary number of them.
How can I make Ansible run a task for each subdirectory?
(Any task will do; eventually every directory will be a Python package to install, but that isn't important for the context of this question.)

This is the solution I managed to come up with; perhaps there is a cleaner way with lookups to achieve this in a single task.
Copy-pasting the following code will create a directory structure with a minimal Ansible playbook that does what is required (tested on Ubuntu/dash).
mkdir action_per_dir
cd action_per_dir
mkdir -p parent_dir/subdir_1 parent_dir/subdir_2 parent_dir/subdir_3
cat > action_per_dir.yml << "EOF"
---
# Gets all the directories and stores the return value of `find`
# into result_of_find.
# The return value will consist of:
# https://docs.ansible.com/ansible/latest/modules/find_module.html#return-values
- hosts: localhost
  tasks:
    - name: Get all dirs
      find:
        paths: parent_dir
        file_type: directory
      register: result_of_find

    # We're interested only in the `files` part of the results of find.
    # In pseudo code, what's happening here is:
    #   for each item in result_of_find.files:
    #       print item.path
    #
    # The output will be very verbose, but for debugging purposes it can be filtered:
    #   ansible-playbook action_per_dir.yml | grep msg
    - name: Print all the dirs
      debug:
        msg: "{{ item.path }}"
      with_items: "{{ result_of_find.files }}"
EOF
After that it just needs to be run:
ansible-playbook action_per_dir.yml | grep msg
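Since the eventual goal is to install every directory as a Python package, the same find results can feed the pip module. A minimal sketch under that assumption (the --user flag and the idea that each subdirectory is pip-installable are mine, not part of the original playbook):
    - name: Install each subdirectory as a Python package
      pip:
        name: "{{ item.path }}"    # pip also accepts a local directory path
        extra_args: --user         # assumption: adjust for your environment/virtualenv
      with_items: "{{ result_of_find.files }}"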

Related

Get multiple file contents to one Ansible variable

On an Ubuntu 18 server, the directory /home/adminuser/keys contains 5 files that hold key parts:
/home/adminuser/key/
|- unseal_key_0
|- unseal_key_1
|- unseal_key_2
|- unseal_key_3
|- unseal_key_4
File contents:
1bbeaafab5037a287bde3e5203c8b2cd205f4cc55b4fcffe7931658dc20d8cdcdf
bdf7a6ee4c493aca5b9cc2105077ec67738a0e8bf21936abfc5d1ff8080b628fcb
545c087d3d59d02556bdbf8690c8cc9faafec0e9766bb42de3a7884159356e91b8
053207b0683a8a2886129f7a1988601629a9e7e0d8ddbca02333ce08f1cc7b3887
2320f6275804341ebe5d39a623dd309f233e454b4453c692233ca86212a3d40b5f
Part of Ansible playbook (task):
- name: Reading file contents
  command: cat {{ item }}
  register: unseal_keys
  with_fileglob: "/home/adminuser/keys/*"
The error that I get:
"[WARNING]: Unable to find '/home/adminuser/keys' in expected paths (use -vvvvv to see paths)"
I have tried to:
change the user that creates the directory and files
change the path to /home/adminuser/keys/ and /home/adminuser/keys
I expect all of the file contents (that is, the parts of a single key) to be merged into one string:
1bbeaafab5037a287bde3e5203c8b2cd205f4cc55b4fcffe7931658dc20d8cdcdfbdf7a6ee4c493aca5b9cc2105077ec67738a0e8bf21936abfc5d1ff8080b628fcb545c087d3d59d02556bdbf8690c8cc9faafec0e9766bb42de3a7884159356e91b8053207b0683a8a2886129f7a1988601629a9e7e0d8ddbca02333ce08f1cc7b38872320f6275804341ebe5d39a623dd309f233e454b4453c692233ca86212a3d40b5f
Given the files below for testing
shell> tree /tmp/admin/
/tmp/admin/
└── key
├── key_0
├── key_1
└── key_2
1 directory, 3 files
shell> cat /tmp/admin/key/key_0
abc
shell> cat /tmp/admin/key/key_1
def
shell> cat /tmp/admin/key/key_2
ghi
Use the module assemble to: "assemble a configuration file from fragments."
Declare the path
key_all_path: /tmp/admin/key_all
and assemble the fragments
- assemble:
    src: /tmp/admin/key
    dest: "{{ key_all_path }}"
This will create the file /tmp/admin/key_all
shell> cat /tmp/admin/key_all
abc
def
ghi
Read the file and join the lines. Declare the variable
key_all: "{{ lookup('file', key_all_path).splitlines()|join('') }}"
gives
key_all: abcdefghi
Example of a complete playbook for testing
- hosts: localhost
  vars:
    key_all_path: /tmp/admin/key_all
    key_all: "{{ lookup('file', key_all_path).splitlines()|join('') }}"
  tasks:
    - assemble:
        src: /tmp/admin/key
        dest: "{{ key_all_path }}"
    - debug:
        var: key_all
Thanks!
The problem was in the paths and in the host where the task had to be executed.
It is solved by locating and reading the files locally and executing this task:
- name: Reading file contents
  command: cat "{{ item }}"
  register: keys                # all file contents go into the variable "keys"
  with_fileglob: "~/keys/*"     # path to the directory where all the files are stored on my local machine
  delegate_to: localhost        # execute this task on the local machine
  become: false                 # drop sudo so that a password is not requested
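To get the single merged string the question asks for, the registered results can then be joined. A minimal sketch (the variable name single_key is an assumption, not from the original playbook):
- name: Merge all key parts into one string
  set_fact:
    single_key: "{{ keys.results | map(attribute='stdout') | join('') }}"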

Ansible with_fileglob - issue with a variable in a path

I have a problem when I want to add a variable to a path used by with_fileglob: it seems that the variable is always expanded to "[]".
I ran the playbook with the parameter --extra-vars environment="dev" and the debug output shows extra_vars: ('environment=dev',).
Unfortunately, the copy task with with_fileglob failed:
- name: Copy all files from environment subdirectory
  copy:
    src: "{{ item }}"
    dest: /etc/
  with_fileglob: directory/{{ environment }}/*
TASK [Copy all files from environment subdirectory] ************************************************************************
task path: /home/ansible/playbook/playbook.yml:511
looking for "files/directory/[]" at "/home/ansible/playbook/files/files/directory/[]"
looking for "files/directory/[]" at "/home/ansible/playbook/files/directory/[]"
looking for "files/directory/[]" at "/home/ansible/playbook/files/files/directory/[]"
looking for "files/directory/[]" at "/home/ansible/playbook/files/directory/[]"
[WARNING]: Unable to find 'files/directory/[]' in expected paths (use -vvvvv to see paths)
I am using Ansible 2.9.3.
May I ask you what I did wrong?
Thanks a lot for your hints in advance.
environment is a reserved keyword and can't be used as the name of a variable; see Creating valid variable names. The renamed variable in the playbook below works as expected:
shell> cat pb.yml
- hosts: localhost
  tasks:
    - debug:
        var: item
      with_fileglob: "directory/{{ env }}/*"
Given the tree
shell> tree directory
directory
└── dev
├── file1
├── file2
└── file3
1 directory, 3 files
the abridged result is
shell> ansible-playbook pb.yml -e "env=dev" | grep item:
item: /scratch/tmp/directory/dev/file2
item: /scratch/tmp/directory/dev/file1
item: /scratch/tmp/directory/dev/file3
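Applied to the original copy task from the question, the only change needed is the renamed variable (a sketch, assuming the same directory layout under files/):
- name: Copy all files from environment subdirectory
  copy:
    src: "{{ item }}"
    dest: /etc/
  with_fileglob: "directory/{{ env }}/*"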

ansible-playbook gather information from a file

I want to read a file with Ansible, find a specific thing in it, and store all of the matches in a file on my localhost.
For example, there is a /tmp/test file on all hosts, and I want to grep a specific thing from this file and store all of the matches in my home directory.
What should I do?
There might be many ways to accomplish this. The choice of Ansible modules (or even tools) can vary.
One approach, using only Ansible, is:
Slurp the remote file.
Write a new file with the filtered content.
Fetch the file to the control machine.
Example:
- hosts: remote_host
  tasks:
    # Slurp the file
    - name: Get contents of file
      slurp:
        src: /tmp/test
      register: testfile

    # Filter the contents to a new file
    - name: Save contents to a variable for looping
      set_fact:
        testfile_contents: "{{ testfile.content | b64decode }}"

    - name: Write a filtered file
      lineinfile:
        path: /tmp/filtered_test
        line: "{{ item }}"
        create: yes
      when: "'TEXT_YOU_WANT' in item"
      with_items: "{{ testfile_contents.split('\n') }}"

    # Fetch the file
    - name: Fetch the filtered file
      fetch:
        src: /tmp/filtered_test
        dest: /tmp/
This will fetch the file to /tmp/<ANSIBLE_HOSTNAME>/tmp/filtered_test.
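If you prefer to avoid the per-line lineinfile loop, the filtering step can also be written as a single copy task with the content parameter (a sketch using the same TEXT_YOU_WANT placeholder as above):
- name: Write a filtered file in one task
  copy:
    dest: /tmp/filtered_test
    content: "{{ testfile_contents.split('\n') | select('search', 'TEXT_YOU_WANT') | join('\n') }}"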
You can use the Ansible fetch module to download files from the remote system to your local system and then do the processing locally, as shown in this Ansible CLI example:
REMOTE=[YOUR_REMOTE_SERVER]; \
ansible -m fetch -a "src=/tmp/test dest=/tmp/ansiblefetch/" $REMOTE && \
grep "[WHAT_YOU_ARE_INTERESTED_IN]" /tmp/ansiblefetch/$REMOTE/tmp/test > /home/user/ansible_files/$REMOTE
This snippet runs Ansible in ad-hoc mode, calling the fetch module with the source path (on the remote) and the destination folder (local) as arguments. fetch copies the file into [DEST]/[REMOTE_NAME]/[SRC], from which we then grep what we are interested in and write the output to /home/user/ansible_files/$REMOTE.

How to delete the oldest directory with ansible

Suppose I have the following tree structure:
Parent Directory
-Dir2020-05-20
-Dir2020-05-21
-Dir2020-05-22
-Dir2020-05-23
Now, every time the Ansible playbook is run, it should delete the oldest directory. For example, in its first run it should delete Dir2020-05-20, if we consider its creation date to be 2020-05-20.
The age attribute of the file module does not seem helpful, as I have to run this playbook at random times and I want to keep only a limited number of these directories.
Just assign dir_path to the path of your parent directory where all these directories are present:
---
- hosts: localhost
  vars:
    # parent directory path, make sure it ends with a slash
    dir_path: "/home/harshit/ansible/test/"
  tasks:
    - name: Find oldest directory
      shell:
        # list only directories, oldest first (-t sorts by mtime, -r reverses), keep the first one
        cmd: "ls -tdr */ | head -n 1"
        chdir: "{{ dir_path }}"
      register: dir_name_to_delete

    - name: "Delete oldest directory: {{ dir_path }}{{ dir_name_to_delete.stdout }}"
      file:
        state: absent
        path: "{{ dir_path }}{{ dir_name_to_delete.stdout }}"
Considering the recommended practice of avoiding the shell and command modules wherever possible, I suggest a pure Ansible solution for this case:
- name: Get directory list
  find:
    paths: "{{ target_directory }}"
    file_type: directory
  register: found_dirs

- name: Get the oldest dir
  set_fact:
    oldest_dir: "{{ found_dirs.files | sort(attribute='mtime') | first }}"

- name: Delete oldest dir
  file:
    state: absent
    path: "{{ oldest_dir.path }}"
  when:
    - found_dirs.files | count > 3
There are two ways to know how many files were found by the find module: either use its return value matched, as in when: found_dirs.matched > 3, or use the count filter. I prefer the latter method because I use this filter in a lot of other cases, so it is just a habit.
For your reference, Ansible has a whole bunch of useful filters (e.g. I used count and sort here, but there are dozens of them). One does not need to remember all the filter names, of course; just keep in mind they exist and might be useful in many cases.
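For illustration, a quick debug task shows that both expressions report the same number (a sketch, reusing found_dirs from the play above):
- name: Show both ways of counting the find results
  debug:
    msg: "matched: {{ found_dirs.matched }}, count filter: {{ found_dirs.files | count }}"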

How to create directory using ansible with directory names taken from a file

How can I create directories using Ansible where the directory names are taken from a file?
I have tried using with_file, which didn't help much.
-bash-4.2$ cat main.yml
---
- name: It is a test yml
  file:
    dest: "/tmp/destination/{{ item }}"
    state: directory
  with_file:
    - "/tmp/stuff.yml"
-bash-4.2$ cat /tmp/stuff.yml
test
hello
world
The expected output is 3 folders created, as below:
/tmp/destination/test
/tmp/destination/hello
/tmp/destination/world
But the output I received is only 1 folder, created as below:
/tmp/destination/test?hello?world
Use with_lines:
- name: It is a test yml
  file:
    dest: "/tmp/destination/{{ item }}"
    state: directory
  with_lines: cat /tmp/stuff.yml
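If you would rather not shell out to cat, the same loop can be driven by the file lookup, which, like with_lines, reads on the control node. A sketch, assuming /tmp/stuff.yml is readable there:
- name: It is a test yml
  file:
    dest: "/tmp/destination/{{ item }}"
    state: directory
  loop: "{{ lookup('file', '/tmp/stuff.yml').splitlines() }}"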
