How to include variables with include_vars with the same name without overwriting previous values - Ansible

I have this playbook, let's call it include.yaml:
#- name: "Playing with Ansible and Include files"
- hosts: localhost
  connection: local
  tasks:
    - find: paths="./" recurse=yes patterns="test.yaml"
      register: file_to_exclude
    - debug: var=file_to_exclude.stdout_lines

    - name: shell
      shell: "find \"$(pwd)\" -name 'test.yaml'"
      register: files_from_dirs
    - debug: var=files_from_dirs.stdout_lines

    - name: Include variable files
      include_vars: "{{ item }}"
      with_items:
        - "{{ files_from_dirs.stdout_lines }}"
    - debug: var=files
and two or more test files:
./dir1/test.yaml, which contains

files:
  - file1
  - file2

./dir2/test.yaml, which contains

files:
  - file3
  - file4
The result is:

TASK [Include variable files] ******************************************************************************************
ok: [localhost] => (item=/mnt/c/Users/GFlorinescu/ansible_scripts/ansible/1st/test.yaml)
ok: [localhost] => (item=/mnt/c/Users/GFlorinescu/ansible_scripts/ansible/2nd/test.yaml)

TASK [debug] ***********************************************************************************************************
ok: [localhost] => {
    "files": [
        "file3",
        "file4"
    ]
}
How can I get all the values in files? At the moment, the files variable from the last included file overrides the values from the previously included files. And, of course, I want this without changing the variable names inside the test.yaml files.
In other words, I want files to be:

ok: [localhost] => {
    "files": [
        "file1",
        "file2",
        "file3",
        "file4"
    ]
}
To be more specific, I am asking for any kind of solution or module, even an unofficial one or something from GitHub; I am not looking for a solution specific to the include_vars module.

Put the included variables into dictionaries with unique names, for example, names created from the index of the loop. Then iterate over these names and concatenate the lists:
- command: "find {{ playbook_dir }} -name test.yaml"
  register: files_from_dirs

- include_vars:
    file: "{{ item }}"
    name: "{{ name }}"
  loop: "{{ files_from_dirs.stdout_lines }}"
  loop_control:
    extended: true
  vars:
    name: "files_{{ ansible_loop.index }}"

- set_fact:
    files: "{{ files|d([]) + lookup('vars', item).files }}"
  with_varnames: "files_[0-9]+"

- debug:
    var: files
give

files:
  - file1
  - file2
  - file3
  - file4
Notes:
You have to provide either a path relative to the home directory or an absolute path, because the command runs in the remote user's home directory. See the example below:
- command: "echo $PWD"
  register: out
- debug:
    var: out.stdout

give

out.stdout: /home/admin
For example, when you want to find the files relative to the directory of the playbook

- command: "find {{ playbook_dir }} -name test.yaml"
  register: files_from_dirs
- debug:
    var: files_from_dirs.stdout_lines

give

files_from_dirs.stdout_lines:
  - /export/scratch/tmp8/test-987/dir1/test.yaml
  - /export/scratch/tmp8/test-987/dir2/test.yaml
The same is valid for the module find. For example,

- find:
    paths: "{{ playbook_dir }}"
    recurse: true
    patterns: test.yaml
  register: files_from_dirs
- debug:
    var: files_from_dirs.files|map(attribute='path')|list

give the same result

files_from_dirs.files|map(attribute='path')|list:
  - /export/scratch/tmp8/test-987/dir1/test.yaml
  - /export/scratch/tmp8/test-987/dir2/test.yaml
Simplify the code and put the declaration of files into the vars. For example, the declaration below gives the same result

files: "{{ query('varnames', 'files_[0-9]+')|
           map('extract', hostvars.localhost, 'files')|
           flatten }}"
Example of a complete playbook for testing

- hosts: localhost
  vars:
    files: "{{ query('varnames', 'files_[0-9]+')|
               map('extract', hostvars.localhost, 'files')|
               flatten }}"
  tasks:
    - find:
        paths: "{{ playbook_dir }}"
        recurse: true
        patterns: test.yaml
      register: files_from_dirs

    - include_vars:
        file: "{{ item }}"
        name: "{{ name }}"
      loop: "{{ files_from_dirs.files|map(attribute='path')|list }}"
      loop_control:
        extended: true
      vars:
        name: "files_{{ ansible_loop.index }}"

    - debug:
        var: files
(maybe off-topic, see comments)
Q: "Is there a way to write the path where it was found?"
A: Yes, there is. See the self-explanatory example below. Given the inventory
shell> cat hosts
host_1 file_1=alice
host_2 file_2=bob
host_3
the playbook
- hosts: host_1,host_2,host_3
  vars:
    file_1_list: "{{ hostvars|json_query('*.file_1') }}"
    file_2_list: "{{ hostvars|json_query('*.file_2') }}"
    file_1_dict: "{{ dict(hostvars|dict2items|
                          selectattr('value.file_1', 'defined')|
                          json_query('[].[key, value.file_1]')) }}"
    file_1_lis2: "{{ hostvars|dict2items|
                     selectattr('value.file_1', 'defined')|
                     json_query('[].{key: key, file_1: value.file_1}') }}"
  tasks:
    - debug:
        msg: |-
          file_1_list: {{ file_1_list }}
          file_2_list: {{ file_2_list }}
          file_1_dict:
          {{ file_1_dict|to_nice_yaml|indent(2) }}
          file_1_lis2:
          {{ file_1_lis2|to_nice_yaml|indent(2) }}
      run_once: true
gives
msg: |-
  file_1_list: ['alice']
  file_2_list: ['bob']
  file_1_dict:
    host_1: alice
  file_1_lis2:
  - file_1: alice
    key: host_1

Related

Check if a file has certain strings

I have some files (file1) on some servers (group: myservers), which should look like this:
search www.mysebsite.com
nameserver 1.2.3.4
nameserver 1.2.3.5
This is an example of what this file should look like:
The first line is mandatory ("search www.mysebsite.com").
The second and the third lines are mandatory as well, but the IPs can change (although they should all match this pattern: ...).
I've been researching how to implement some tasks using Ansible to check whether the files are properly configured. I don't want to change any file, only check and report whether the files are OK or not.
I know I can use ansible.builtin.lineinfile to check it, but I still haven't managed to find out how to achieve this.
Can you help, please?
For example, given the inventory
shell> cat hosts
[myservers]
test_11
test_13
Create a dictionary of what you want to audit
audit:
  files:
    /etc/resolv.conf:
      patterns:
        - '^search example.com$'
        - '^nameserver \d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}$'
    /etc/rc.conf:
      patterns:
        - '^sshd_enable="YES"$'
        - '^syslogd_flags="-ss"$'
Declare the directory on the controller where the files will be stored
my_dest: /tmp/ansible/myservers
Fetch the files

- fetch:
    src: "{{ item.key }}"
    dest: "{{ my_dest }}"
  loop: "{{ audit.files|dict2items }}"
Take a look at the fetched files
shell> tree /tmp/ansible/myservers
/tmp/ansible/myservers
├── test_11
│   └── etc
│       ├── rc.conf
│       └── resolv.conf
└── test_13
    └── etc
        ├── rc.conf
        └── resolv.conf

4 directories, 4 files
Audit the files. Create the dictionary host_files_results in the loop
- set_fact:
    host_files_results: "{{ host_files_results|default({})|
                            combine(host_file_dict|from_yaml) }}"
  loop: "{{ audit.files|dict2items }}"
  loop_control:
    label: "{{ item.key }}"
  vars:
    host_file_path: "{{ my_dest }}/{{ inventory_hostname }}/{{ item.key }}"
    host_file_lines: "{{ lookup('file', host_file_path).splitlines() }}"
    host_file_result: |
      [{% for pattern in item.value.patterns %}
      {{ host_file_lines[loop.index0] is regex pattern }},
      {% endfor %}]
    host_file_dict: "{ {{ item.key }}: {{ host_file_result|from_yaml is all }} }"
gives
ok: [test_11] =>
  host_files_results:
    /etc/rc.conf: true
    /etc/resolv.conf: true
ok: [test_13] =>
  host_files_results:
    /etc/rc.conf: true
    /etc/resolv.conf: true
Declare the dictionary audit_files that aggregates host_files_results
audit_files: "{{ dict(ansible_play_hosts|
                      zip(ansible_play_hosts|
                          map('extract', hostvars, 'host_files_results'))) }}"
gives
audit_files:
  test_11:
    /etc/rc.conf: true
    /etc/resolv.conf: true
  test_13:
    /etc/rc.conf: true
    /etc/resolv.conf: true
Evaluate the audit results
- block:
    - debug:
        var: audit_files
    - assert:
        that: "{{ audit_files|json_query('*.*')|flatten is all }}"
        fail_msg: "[ERR] Audit of files failed. [TODO: list failed]"
        success_msg: "[OK] Audit of files passed."
  run_once: true
gives
msg: '[OK] Audit of files passed.'
Example of a complete playbook for testing
- hosts: myservers
  vars:
    my_dest: /tmp/ansible/myservers

    audit:
      files:
        /etc/resolv.conf:
          patterns:
            - '^search example.com$'
            - '^nameserver \d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}$'
        /etc/rc.conf:
          patterns:
            - '^sshd_enable="YES"$'
            - '^syslogd_flags="-ss"$'

    audit_files: "{{ dict(ansible_play_hosts|
                          zip(ansible_play_hosts|
                              map('extract', hostvars, 'host_files_results'))) }}"
  tasks:
    - fetch:
        src: "{{ item.key }}"
        dest: "{{ my_dest }}"
      loop: "{{ audit.files|dict2items }}"
      loop_control:
        label: "{{ item.key }}"

    - set_fact:
        host_files_results: "{{ host_files_results|default({})|
                                combine(host_file_dict|from_yaml) }}"
      loop: "{{ audit.files|dict2items }}"
      loop_control:
        label: "{{ item.key }}"
      vars:
        host_file_path: "{{ my_dest }}/{{ inventory_hostname }}/{{ item.key }}"
        host_file_lines: "{{ lookup('file', host_file_path).splitlines() }}"
        host_file_result: |
          [{% for pattern in item.value.patterns %}
          {{ host_file_lines[loop.index0] is regex pattern }},
          {% endfor %}]
        host_file_dict: "{ {{ item.key }}: {{ host_file_result|from_yaml is all }} }"

    - debug:
        var: host_files_results

    - block:
        - debug:
            var: audit_files
        - assert:
            that: "{{ audit_files|json_query('*.*')|flatten is all }}"
            fail_msg: "[ERR] Audit of files failed. [TODO: list failed]"
            success_msg: "[OK] Audit of files passed."
      run_once: true
... implement some tasks using Ansible to check if the files are properly configured. I don't want to change any file, only check and report whether the files are OK or not.
Since Ansible is mostly used as a Configuration Management tool, there is no need to check beforehand whether a file is properly configured. Just declare the Desired State and make sure that the file is in that state. Since this approach also works with Validating: check_mode, a Configuration Check or an Audit can be implemented simply as follows:
resolv.conf as it should be
# Generated by NetworkManager
search example.com
nameserver 192.0.2.1
hosts.ini
[test]
test.example.com NS_IP=192.0.2.1
resolv.conf.j2 template
# Generated by NetworkManager
search {{ DOMAIN }}
nameserver {{ NS_IP }}
A minimal example playbook for Configuration Check in order to audit the config
---
- hosts: test
  become: false
  gather_facts: false

  vars:
    # Ansible v2.9 and later
    DOMAIN: "{{ inventory_hostname.split('.', 1) | last }}"

  tasks:
    - name: Check configuration (file)
      template:
        src: resolv.conf.j2
        dest: resolv.conf
      check_mode: true # will never change existing config
      register: result

    - name: Config change
      debug:
        msg: "{{ result.changed }}"
will result in the following output if there are no changes

TASK [Check configuration (file)] ******
ok: [test.example.com]

TASK [Config change] *******************
ok: [test.example.com] =>
  msg: false

or, if there are changes, in

TASK [Check configuration (file)] ******
changed: [test.example.com]

TASK [Config change] *******************
ok: [test.example.com] =>
  msg: true

depending on what's in the config file.
If you are interested in a different message text and therefore need to invert the output, just use msg: "{{ not result.changed }}", which will report false if true and true if false.
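Not part of the original answer: if the goal is to make the play fail on configuration drift, a minimal sketch building on the result register from above could use assert (the message texts here are made up for illustration):

- name: Report configuration drift
  assert:
    # result was registered by the template task running in check_mode
    that: not result.changed
    fail_msg: "resolv.conf does not match the desired template"
    success_msg: "resolv.conf matches the desired template"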
Further Reading
Using the Ansible inventory, variables in inventory, the template module (to Template a file out to a target host) and Enforcing check_mode on tasks makes it extremely simple to prevent Configuration Drift.
And as a reference for getting the search domain, see Ansible: How to get hostname without domain name?

Ansible - loop over multiple items in stdout_lines

I am performing a grep with multiple items.
---
- hosts: my_host
  gather_facts: false
  vars:
    my_list:
      - whatever
      - something
  tasks:
    - name: grep for item in search path
      shell: "grep -rIL {{ item }} /tmp"
      register: the_grep
      loop: "{{ my_list }}"

    - debug:
        msg: "{{ item.stdout_lines }}"
      loop: "{{ the_grep.results }}"
Depending on the result, multiple files could match.
msg:
  - /tmp/something.conf
  - /tmp/folder/file.txt
Q: How would I configure Ansible to loop over the items in stdout_lines?
The use case I'm solving is to delete .ini sections based on the item, but in this case, Ansible doesn't loop over the stdout_lines.
- name: remove stanza from ini file
  ini_file:
    path: "{{ item.stdout_lines }}"
    section: "{{ item.item }}"
    mode: '0600'
    state: absent
  loop: "{{ the_grep.results }}"
  when: item.stdout_lines | length > 0
It seems that this doesn't work; using item.stdout_lines[0] gives a partially expected result, since Ansible will use only the first item in that list, but of course not the second and so on.
Perhaps there's a prettier answer, but I solved it by using with_nested and a json_query:
- name: remove stanza from ini file
  ini_file:
    path: "{{ item.0 }}"
    section: "{{ item.1.item }}"
    mode: '0600'
    state: absent
  with_nested:
    - "{{ the_grep | json_query('results[].stdout_lines[]') }}"
    - "{{ the_grep.results }}"

Ansible - Looking for files and comparing their hashes

Practice:
I have the file files.yml with a list of files and their respective md5 hashes, like:

files:
  - name: /opt/file_compare1.tar
    hash: 9cd599a3523898e6a12e13ec787da50a /opt/file_compare1.tar
  - name: /opt/file_compare2.tar.gz
    hash: d41d8cd98f00b204e9800998ecf8427e /opt/file_compare2.tar.gz
I need to create a playbook that checks, for each file in this list, whether the current hash is the same or has changed. The playbook should print a debug message like below:
---
- hosts: localhost
  connection: local
  vars_files:
    - files.yml
  tasks:
    - name: Use md5 to calculate checksum
      stat:
        path: "{{ item.name }}"
        checksum_algorithm: md5
      register: hash_check
      with_items:
        - "{{ files }}"

    - name: Debug files - Different
      debug:
        msg: |
          "Hash changed: {{ item.name }}"
      when:
        - item.hash != hash_check
      with_items:
        - "{{ files }}"

    - name: Debug files - Equal
      debug:
        msg: |
          "Hash NOT changed: {{ item.name }}"
      when:
        - item.hash == hash_check
      with_items:
        - "{{ files }}"

    - debug:
        msg: |
          - "{{ hash_check }} {{ item.name }}"
      with_items:
        - "{{ files }}"
For example, given the files
files:
  - name: /scratch/file_compare1.tar
    hash: 4f8805b4b64dcc575547ec1c63793aec /scratch/file_compare1.tar
  - name: /scratch/file_compare2.tar.gz
    hash: 2dc4f1e9ca4081cc49d25195627982ef /scratch/file_compare2.tar.gz
the tasks below
- name: Use md5 to calculate checksum
  stat:
    path: "{{ item.name }}"
    checksum_algorithm: md5
  register: hash_check
  loop: "{{ files }}"

- name: Debug files - Different
  debug:
    msg: |
      Hash NOT changed: {{ item.0.name }}
      {{ item.0.hash.split()|first }}
      {{ item.1 }}
  with_together:
    - "{{ files }}"
    - "{{ hash_check.results|map(attribute='stat.checksum')|list }}"
  when: item.0.hash.split()|first == item.1
give
msg: |-
  Hash NOT changed: /scratch/file_compare1.tar
  4f8805b4b64dcc575547ec1c63793aec
  4f8805b4b64dcc575547ec1c63793aec
msg: |-
  Hash NOT changed: /scratch/file_compare2.tar.gz
  2dc4f1e9ca4081cc49d25195627982ef
  2dc4f1e9ca4081cc49d25195627982ef
A more robust option would be to create a dictionary with the calculated hashes
- name: Use md5 to calculate checksum
  stat:
    path: "{{ item.name }}"
    checksum_algorithm: md5
  register: hash_check
  loop: "{{ files }}"

- set_fact:
    path_hash: "{{ dict(_path|zip(_hash)) }}"
  vars:
    _path: "{{ hash_check.results|map(attribute='stat.path')|list }}"
    _hash: "{{ hash_check.results|map(attribute='stat.checksum')|list }}"
gives
path_hash:
  /scratch/file_compare1.tar: 4f8805b4b64dcc575547ec1c63793aec
  /scratch/file_compare2.tar.gz: 2dc4f1e9ca4081cc49d25195627982ef
Then use this dictionary to compare the hashes. For example, the task below gives the same results
- name: Debug files - Different
  debug:
    msg: |
      Hash NOT changed: {{ item.name }}
      {{ item.hash.split()|first }}
      {{ path_hash[item.name] }}
  loop: "{{ files }}"
  when: item.hash.split()|first == path_hash[item.name]
The next option is to create a dictionary with the original hashes and both lists of original and calculated hashes
- name: Use md5 to calculate checksum
  stat:
    path: "{{ item.name }}"
    checksum_algorithm: md5
  register: hash_check
  loop: "{{ files }}"

- set_fact:
    hash_name: "{{ dict(_hash|zip(_name)) }}"
    hash_orig: "{{ _hash }}"
    hash_stat: "{{ hash_check.results|map(attribute='stat.checksum')|list }}"
  vars:
    _hash: "{{ files|map(attribute='hash')|map('split')|map('first')|list }}"
    _name: "{{ files|map(attribute='name')|list }}"
gives
hash_name:
  2dc4f1e9ca4081cc49d25195627982ef: /scratch/file_compare2.tar.gz
  4f8805b4b64dcc575547ec1c63793aec: /scratch/file_compare1.tar
hash_orig:
  - 4f8805b4b64dcc575547ec1c63793aec
  - 2dc4f1e9ca4081cc49d25195627982ef
hash_stat:
  - 4f8805b4b64dcc575547ec1c63793aec
  - 2dc4f1e9ca4081cc49d25195627982ef
Then calculate the difference of the lists and use it to extract both lists of changed and unchanged files
- set_fact:
    files_diff: "{{ _diff|map('extract', hash_name)|list }}"
    files_orig: "{{ _orig|map('extract', hash_name)|list }}"
  vars:
    _diff: "{{ hash_orig|difference(hash_stat) }}"
    _orig: "{{ hash_orig|difference(_diff) }}"

- name: Debug files changed
  debug:
    var: files_diff

- name: Debug files NOT changed
  debug:
    var: files_orig
gives
files_diff: []
files_orig:
  - /scratch/file_compare1.tar
  - /scratch/file_compare2.tar.gz
I used your suggestion to complement the playbook; it's working now.
The idea is to get a list of files and, for each one, compare the recorded hash with the file's current hash.
---
- hosts: localhost
  connection: local
  gather_facts: false
  vars_files:
    - files3.yml
  tasks:
    - stat:
        path: "{{ item.file }}"
        checksum_algorithm: md5
      loop: "{{ files }}"
      register: stat_results

    - name: NOT changed files
      debug:
        msg: "NOT changed: {{ item.stat.path }}"
      when: item.stat.checksum == item.item.checksum.split()|first
      loop: "{{ stat_results.results }}"
      loop_control:
        label: "{{ item.stat.path }}"

    - name: Changed files
      debug:
        msg: "CHANGED: {{ item.stat.path }}"
      when: item.stat.checksum != item.item.checksum.split()|first
      loop: "{{ stat_results.results }}"
      loop_control:
        label: "{{ item.stat.path }}"
Result:
>> ansible-playbook playbooks/check-file3.yml
PLAY [localhost] ********************************************************************************************************************************************************************************************************************
TASK [stat] *************************************************************************************************************************************************************************************************************************
ok: [localhost] => (item={'file': '/opt/file_compare1.tar', 'checksum': '9cd599a3523898e6a12e13ec787da50a /opt/file_compare1.tar'})
ok: [localhost] => (item={'file': '/opt/file_compare2.tar.gz', 'checksum': 'd41d8cd98f00b204e9800998ecf8427e /opt/file_compare2.tar.gz'})
TASK [NOT changed files] ************************************************************************************************************************************************************************************************************
skipping: [localhost] => (item=/opt/file_compare1.tar)
ok: [localhost] => (item=/opt/file_compare2.tar.gz) => {
    "msg": "NOT changed: /opt/file_compare2.tar.gz"
}

TASK [Changed files] ****************************************************************************************************************************************************************************************************************
ok: [localhost] => (item=/opt/file_compare1.tar) => {
    "msg": "CHANGED: /opt/file_compare1.tar"
}
skipping: [localhost] => (item=/opt/file_compare2.tar.gz)
PLAY RECAP **************************************************************************************************************************************************************************************************************************
localhost : ok=3 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Double iteration in Ansible

How can I iterate over a list of dictionaries, assign the list stored under a key of each dictionary to with_items, and then iterate over that list? For example:
- name: "Deploy"
template:
src: "myfile"
dst: "{{ item }}/myfile" //"item" is "mypath1", "mypath2"..
with_items: {{ item.paths }}
loop:
- {group: "mygroup1", paths: ["mypath1", "mypath2"]}
- {group: "mygroup2", paths: ["mypath3", "mypath4"]}
when: "{{ item.group }} in group_names"
Use with_subelements. For example, given the inventory
shell> cat hosts
[mygroup1]
srv1
[mygroup2]
srv2
The playbook
shell> cat pb1.yml
- hosts: srv1,srv2
  tasks:
    - name: Deploy
      debug:
        msg: "{{ inventory_hostname }}: Create myfile at {{ item.1 }}/myfile"
      with_subelements:
        - "{{ _list }}"
        - paths
      when: item.0.group in group_names
      vars:
        _list:
          - {group: mygroup1, paths: [mypath1, mypath2]}
          - {group: mygroup2, paths: [mypath3, mypath4]}
gives
msg: 'srv1: Create myfile at mypath1/myfile'
msg: 'srv1: Create myfile at mypath2/myfile'
msg: 'srv2: Create myfile at mypath3/myfile'
msg: 'srv2: Create myfile at mypath4/myfile'
You'll be better off with a dictionary, instead of a list. For example, the playbook below gives the same results using a simple loop
shell> cat pb2.yml
- hosts: srv1,srv2
  tasks:
    - name: Deploy
      debug:
        msg: "{{ inventory_hostname }}: Create myfile at {{ item }}/myfile"
      loop: "{{ group_names|map('extract', _dict)|flatten }}"
      vars:
        _dict:
          mygroup1: [mypath1, mypath2]
          mygroup2: [mypath3, mypath4]
But the Ansible way would be to put the data into group_vars. For example
shell> cat group_vars/mygroup1.yml
my_paths: [mypath1, mypath2]
shell> cat group_vars/mygroup2.yml
my_paths: [mypath3, mypath4]
and the simple playbook below gives the same results
shell> cat pb3.yml
- hosts: srv1,srv2
  tasks:
    - name: Deploy
      debug:
        msg: "{{ inventory_hostname }}: Create myfile at {{ item }}/myfile"
      loop: "{{ my_paths }}"

In Ansible loop, test existence of files from registered results

I have several files that I need to back up in different directories. I have tried the code below, but it is not working for me.
vars:
  file_vars:
    - {name: /file1}
    - {name: /etc/file2}
    - {name: /etc/file/file3}
tasks:
  - name: "Checking if config files exists"
    stat:
      path: "{{ item.name }}"
    with_items: "{{ file_vars }}"
    register: stat_result

  - name: Backup Files
    copy: src={{ item.name }} dest={{ item.name }}{{ ansible_date_time.date }}.bak
    with_items: "{{ file_vars }}"
    remote_src: yes
    when: stat_result.stat.exists == True
The problem is the condition
when: stat_result.stat.exists == True
There is no attribute stat_result.stat. Instead, the attribute stat_result.results is a list of the results from the loop. It's possible to create a dictionary of files and their statuses. For example
- set_fact:
    files_stats: "{{ dict(my_files|zip(my_stats)) }}"
  vars:
    my_files: "{{ stat_result.results|json_query('[].item.name') }}"
    my_stats: "{{ stat_result.results|json_query('[].stat.exists') }}"
Then simply use this dictionary in the condition
when: files_stats[item.name]
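For completeness, a sketch (not from the original answer) of how the backup task from the question might then look, assuming the same file_vars list and that facts are gathered so that ansible_date_time is available:

- name: Backup Files
  copy:
    src: "{{ item.name }}"
    dest: "{{ item.name }}.{{ ansible_date_time.date }}.bak"
    remote_src: true   # both src and dest live on the managed host
  with_items: "{{ file_vars }}"
  when: files_stats[item.name]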
Below is a shorter version which creates the dictionary more efficiently
- set_fact:
    files_stats: "{{ dict(stat_result.results|
                     json_query('[].[item.name, stat.exists]')) }}"
Please try the playbook below; it worked for me:
---
- name: Copy files
  hosts: localhost
  become: yes
  become_user: root
  vars_files:
    - files.yml
  tasks:
    - name: "Checking if config files exists"
      stat:
        path: "{{ item }}"
      with_items: "{{ files }}"
      register: stat_result

    - name: Ansible
      debug:
        msg: "{{ stat_result }}"

    - name: Backup Files
      copy:
        src: "{{ item.item }}"
        dest: "{{ item.item }}.bak"
      with_items: "{{ stat_result.results }}"
      when: item.stat.exists
and files.yml will look like:
---
files:
  - /tmp/file1
  - /tmp/file2
You can check your playbook syntax using the command below:

ansible-playbook copy.yml --syntax-check

You can also do a dry run of your playbook before the actual execution:

ansible-playbook -i localhost copy.yml --check

Resources