In this task I found a roundabout method to compare two files (dconfDump and dconfDumpLocalCurrent) and to set a variable (previously defined as false) to true if the two files differ.
The solution seems to work, but it looks ugly and, as a beginner with Ansible, I have the impression that a better solution must exist.
---
# vars file for dconfLoad
local_changed : false
target_changed : false
---
- name: local changed is true when previous target different than local current
shell: diff /home/frank/dconfDump /home/frank/dconfDumpLocalCurrent
register: diff_oldtarget_localCurrent
register: local_changed
ignore_errors: true
- debug:
msg: CHANGED LOCALLY
when: local_changed
Some background to the task, which is an attempt to synchronize files: a file LocalCurrent is compared with LocalOld and CurrentTarget to determine whether LocalCurrent has changed and whether it differs from CurrentTarget. If LocalCurrent is not changed and CurrentTarget is changed, then apply the change (and set LocalOld to CurrentTarget); if LocalCurrent is changed, then upload it to the controller.
What is the appropriate approach with Ansible? Thank you for the help!
You can use stat to get the checksum and then compare it. Please see below.
tasks:
- name: Stat of dconfDump
stat:
path : "/tmp/dconfDump"
register: dump
- name: SHA1 of dconfDump
set_fact:
dump_sha1: "{{ dump.stat.checksum }}"
- name: Stat of dconfDumpLocalCurrent
stat:
path : "/tmp/dconfDumpLocalCurrent"
register: dump_local
- name: SHA1 of dconfDumpLocalCurrent
set_fact:
local_sha1: "{{ dump_local.stat.checksum }}"
- name: Different
set_fact:
val: "False"
when: dump_sha1 != local_sha1
- name: Same
set_fact:
val: "True"
when: dump_sha1 == local_sha1
- name: Print
debug:
msg: "{{val}}"
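As a side note, the two conditional tasks could be collapsed into a single set_fact that stores the result of the comparison directly. This is only a sketch, not part of the original answer; it keeps the semantics of val being true when the checksums match:
- name: Compare the checksums in one step
  set_fact:
    val: "{{ dump.stat.checksum == dump_local.stat.checksum }}"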
Use stat and create a dictionary of checksums. For example
- stat:
path: "{{ item }}"
loop:
- LocalOld
- LocalCurrent
- CurrentTarget
register: result
- set_fact:
my_files: "{{ dict(paths|zip(chkms)) }}"
vars:
paths: "{{ result.results|map(attribute='stat.path')|list }}"
chkms: "{{ result.results|map(attribute='stat.checksum')|list }}"
- debug:
var: my_files
gives (abridged) if all files are the same
my_files:
CurrentTarget: 7c73e9f589ca1f0a1372aa4cd6944feec459c4a8
LocalCurrent: 7c73e9f589ca1f0a1372aa4cd6944feec459c4a8
LocalOld: 7c73e9f589ca1f0a1372aa4cd6944feec459c4a8
Then use the dictionary to compare the checksums and copy files. For example
# If LocalCurrent is not changed and CurrentTarget is changed,
# then apply the change (and set LocalOld to CurrentTarget)
- debug:
msg: Set LocalOld to CurrentTarget
when:
- my_files['LocalCurrent'] == my_files['LocalOld']
- my_files['LocalCurrent'] != my_files['CurrentTarget']
- debug:
msg: Do not copy anything
when:
- my_files['LocalCurrent'] == my_files['LocalOld']
- my_files['LocalCurrent'] == my_files['CurrentTarget']
gives
TASK [debug] ****
skipping: [localhost]
TASK [debug] ****
ok: [localhost] =>
msg: Do not copy anything
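Replacing the debug placeholders with real actions might look like the sketch below. The module parameters are assumptions (copy with remote_src on the managed host, fetch back to the controller); adjust the paths and destinations to your layout.
# If LocalCurrent is not changed and CurrentTarget is changed,
# apply the change and set LocalOld to CurrentTarget
- copy:
    src: CurrentTarget
    dest: "{{ item }}"
    remote_src: true
  loop: [LocalCurrent, LocalOld]
  when:
    - my_files['LocalCurrent'] == my_files['LocalOld']
    - my_files['LocalCurrent'] != my_files['CurrentTarget']
# If LocalCurrent is changed, upload it to the controller
- fetch:
    src: LocalCurrent
    dest: fetched/
  when: my_files['LocalCurrent'] != my_files['LocalOld']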
Related
I have this file, let's call it include.yaml:
#- name: "Playing with Ansible and Include files"
- hosts: localhost
connection: local
tasks:
- find: paths="./" recurse=yes patterns="test.yaml"
register: file_to_exclude
- debug: var=file_to_exclude.stdout_lines
- name: shell
shell: "find \"$(pwd)\" -name 'test.yaml'"
register: files_from_dirs
- debug: var=files_from_dirs.stdout_lines
- name: Include variable files
include_vars: "{{ item }}"
with_items:
- "{{ files_from_dirs.stdout_lines }}"
- debug: var=files
and two or more test files
./dir1/test.yaml
that contains
files:
- file1
- file2
./dir2/test.yaml
that contains
files:
- file3
- file4
the result is
TASK [Include variable files] ******************************************************************************************
ok: [localhost] => (item=/mnt/c/Users/GFlorinescu/ansible_scripts/ansible/1st/test.yaml)
ok: [localhost] => (item=/mnt/c/Users/GFlorinescu/ansible_scripts/ansible/2nd/test.yaml)
TASK [debug] ***********************************************************************************************************
ok: [localhost] => {
"files": [
"file3",
"file4"
]
}
How can I get all the values into files? At the moment, the files variable from the last included file overrides the values from the previously included files. Of course, without changing the variable names inside the test.yaml files.
In other words, I want files to be:
ok: [localhost] => {
"files": [
"file1",
"file2",
"file3",
"file4"
]
}
To be more specific, I am open to any kind of solution or module, even an unofficial one or some GitHub module; I am not looking specifically for an include_vars based solution.
Put the included variables into dictionaries with unique names. For example, create the names from the index of the loop. Then iterate over the names and concatenate the lists
- command: "find {{ playbook_dir }} -name test.yaml"
register: files_from_dirs
- include_vars:
file: "{{ item }}"
name: "{{ name }}"
loop: "{{ files_from_dirs.stdout_lines }}"
loop_control:
extended: true
vars:
name: "files_{{ ansible_loop.index }}"
- set_fact:
files: "{{ files|d([]) + lookup('vars', item).files }}"
with_varnames: "files_[0-9]+"
- debug:
var: files
gives
files:
- file1
- file2
- file3
- file4
Notes:
You have to provide either a path relative to the remote user's home directory or an absolute path, because that is the directory remote commands run from by default. See the example below
- command: "echo $PWD"
register: out
- debug:
var: out.stdout
gives
out.stdout: /home/admin
For example, when you want to find the files relative to the directory of the playbook
- command: "find {{ playbook_dir }} -name test.yaml"
register: files_from_dirs
- debug:
var: files_from_dirs.stdout_lines
gives
files_from_dirs.stdout_lines:
- /export/scratch/tmp8/test-987/dir1/test.yaml
- /export/scratch/tmp8/test-987/dir2/test.yaml
The same applies to the find module. For example,
- find:
paths: "{{ playbook_dir }}"
recurse: true
patterns: test.yaml
register: files_from_dirs
- debug:
var: files_from_dirs.files|map(attribute='path')|list
gives the same result
files_from_dirs.files|map(attribute='path')|list:
- /export/scratch/tmp8/test-987/dir1/test.yaml
- /export/scratch/tmp8/test-987/dir2/test.yaml
Simplify the code and put the declaration of files into the vars. For example, the below declaration gives the same result
files: "{{ query('varnames', 'files_[0-9]+')|
map('extract', hostvars.localhost, 'files')|
flatten }}"
Example of a complete playbook for testing
- hosts: localhost
vars:
files: "{{ query('varnames', 'files_[0-9]+')|
map('extract', hostvars.localhost, 'files')|
flatten }}"
tasks:
- find:
paths: "{{ playbook_dir }}"
recurse: true
patterns: test.yaml
register: files_from_dirs
- include_vars:
file: "{{ item }}"
name: "{{ name }}"
loop: "{{ files_from_dirs.files|map(attribute='path')|list }}"
loop_control:
extended: true
vars:
name: "files_{{ ansible_loop.index }}"
- debug:
var: files
(maybe off-topic, see comments)
Q: "Is there a way to write the path where it was found?"
A: Yes, there is. See the self-explanatory example below. Given the inventory
shell> cat hosts
host_1 file_1=alice
host_2 file_2=bob
host_3
the playbook
- hosts: host_1,host_2,host_3
vars:
file_1_list: "{{ hostvars|json_query('*.file_1') }}"
file_2_list: "{{ hostvars|json_query('*.file_2') }}"
file_1_dict: "{{ dict(hostvars|dict2items|
selectattr('value.file_1', 'defined')|
json_query('[].[key, value.file_1]')) }}"
file_1_lis2: "{{ hostvars|dict2items|
selectattr('value.file_1', 'defined')|
json_query('[].{key: key, file_1: value.file_1}') }}"
tasks:
- debug:
msg: |-
file_1_list: {{ file_1_list }}
file_2_list: {{ file_2_list }}
file_1_dict:
{{ file_1_dict|to_nice_yaml|indent(2) }}
file_1_lis2:
{{ file_1_lis2|to_nice_yaml|indent(2) }}
run_once: true
gives
msg: |-
file_1_list: ['alice']
file_2_list: ['bob']
file_1_dict:
host_1: alice
file_1_lis2:
- file_1: alice
key: host_1
Let's say I have this directory structure:
# ls /root/ansible_test/
one two
And the playbook looks like this:
- name: gathering all dirs
stat:
path: /root/ansible_test/{{ item }}/
register: dir_check
changed_when: false
check_mode: no
loop:
- "one"
- "two"
- "three"
- name: check all of the dirs are created
set_fact:
all_dirs_created: true
when: item.stat.exists == true
loop: "{{ dir_check.results }}"
- debug:
msg: "Not all dirs are created!"
when: all_dirs_created is not defined
My problem is that the "one" and "two" dirs exist, so the fact will be defined, because the condition is true for at least one dir in the loop. I also tried the opposite and checked item.stat.exists == false, but if one dir does not exist ("three"), the fact gets set as well.
I would like to run the set_fact task only if all of the items in the loop are true, or if one of them is false. How do I achieve this in this case?
Q: set_fact only if all of the items in the loop are true or if one of them is false
A: Count the items. For example
- set_fact:
dirs_missing: "{{ _all|int - _exist|int }}"
vars:
_all: "{{ dir_check.results|length }}"
_exist: "{{ dir_check.results|
map(attribute='stat.exists')|
select|length }}"
gives (in your case)
dirs_missing: '1'
Now, you can set whatever you want, e.g.
- name: check all of the dirs are created
set_fact:
all_dirs_created: true
when: dirs_missing|int == 0
- debug:
msg: "Not all dirs are created!
(Exactly {{ dirs_missing }} missing.)"
when: all_dirs_created is not defined
gives
TASK [check all of the dirs are created] ********************************
skipping: [localhost]
TASK [debug] ************************************************************
ok: [localhost] =>
msg: Not all dirs are created! (Exactly 1 missing.)
You can simplify the code by using the counter filter from the Community.General collection. See Counting elements in a sequence, e.g.
- name: check all of the dirs are created
set_fact:
all_dirs_created: true
when: _counts[false] is not defined
vars:
_counts: "{{ dir_check.results|
map(attribute='stat.exists')|
community.general.counter }}"
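For the directory layout from the question (two existing directories, one missing), the helper variable would contain something like the output below. This is only an illustrative sketch of what the counter filter produces:
- debug:
    var: _counts
  vars:
    _counts: "{{ dir_check.results|
                 map(attribute='stat.exists')|
                 community.general.counter }}"
gives
_counts:
  false: 1
  true: 2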
The solutions above count the items of the list so that the condition 'one of them is false' can also be evaluated. The code can be simplified further if counting the items is not necessary. For example, test whether all items are true
- name: check all of the dirs are created
set_fact:
all_dirs_created: true
when: dir_check.results|map(attribute='stat.exists') is all
However, you then have to test for the existence of the variable all_dirs_created. It is more practical to always set the variable. Ultimately, the expected functionality of your last two tasks can be reduced to the code below
- name: check all of the dirs are created
set_fact:
all_dirs_created: "{{ dir_check.results|
map(attribute='stat.exists') is all }}"
- debug:
msg: Not all dirs are created!
when: not all_dirs_created
I am trying to set up a playbook, shown below, where I need to send an email notification when my task fails. The alert with the email notification works as expected; however, I am unable to capture the output only for the failed conditions. The way I am currently registering gives me all results, including the variables (file paths in this example) where the condition does not fail.
---
- hosts: test-box
tasks:
- name: alert on failure
block:
- name: generate a list of desired file paths
find:
paths: /base/sdir
recurse: yes
file_type: file
patterns: "abrn.*.dat"
use_regex: yes
register: file_paths
- name: check if file stopped updating
vars:
msg: |
"{{ item }}"
"{{ ansible_date_time.epoch }}"
"{{ item.stat.mtime|int }}"
"{{ ( (ansible_date_time.epoch|int - item.stat.mtime|int) / 60 ) | int }} min"
with_items: "{{ ts.results }}"
fail:
msg: |
"{{ msg.split('\n') }}"
register: failed_items ### -> HOW TO REGISTER ONLY THE FILE PATHS (RESULTS) WHERE THIS FAIL CONDITION IS MET??
when: ( (ansible_date_time.epoch|int - item.stat.mtime|int) / 60 ) | int > 2
rescue:
- name: email notification
mail:
host: localhost
port: 25
from: A
to: B
subject: TASK FAILED
body: |
Failed subdirs: {{ failed_items }} ## This gives me all results including those where the failed condition is not met
delegate_to: localhost
...
In the body of the email, I want to capture only the file paths where the mtime condition is met, but currently I get all file paths.
Any suggestions on how I can filter the output to capture only the matching entries?
Thanks.
You should use set_fact; this way an item is added to the list only when your when condition is true for it.
---
- hosts: localhost
become: false
vars:
path: "{{ '%s/bin' | format(lookup('env', 'HOME')) }}"
time: "{{ 2 * 60 }}" # 2 mins
tasks:
- name: Find files in a defined path
find:
paths: "{{ path }}"
register: _result
- name: Add files not modified for longer time than defined to list
set_fact:
stale_files: "{{ stale_files | default([]) + [item.path] }}"
loop: "{{ _result.files }}"
when: ((ansible_date_time.epoch | float) - item.mtime) > (time | float)
- name: Show stale files
debug:
msg: "{{ stale_files }}"
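If you still want the notification to come from a rescue section, as in your original playbook, the sketch below reuses your mail parameters and assumes stale_files has been built by the tasks above; the block fails only when the list is non-empty:
- name: alert on failure
  block:
    - name: Fail when stale files were found (triggers the rescue)
      fail:
        msg: "{{ stale_files | length }} stale file(s) found"
      when: stale_files | default([]) | length > 0
  rescue:
    - name: email notification
      mail:
        host: localhost
        port: 25
        from: A
        to: B
        subject: TASK FAILED
        body: |
          Failed subdirs: {{ stale_files }}
      delegate_to: localhost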
You could also use another approach, i.e. filter the list directly instead of looping (for me the selectattr filter did not work with the lt test, so the json_query filter should work instead; see Can't compare attribute to a number. Error: "not supported between instances of 'AnsibleUnsafeText' and 'int'"). A rough sketch of that approach follows below.
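This is only a sketch, assuming the same _result and time variables as in the playbook above; the json_query filter requires the jmespath Python library on the controller:
- name: Select stale files without a loop
  set_fact:
    stale_files: "{{ _result.files | json_query(query) }}"
  vars:
    threshold: "{{ ansible_date_time.epoch | float - time | float }}"
    query: "[?mtime < `{{ threshold }}`].path"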
Problem
How can I load only the unique items of lists stored per file, identified via the same key path, into a single variable using Ansible?
Scenario
There is one specific folder /myfolder on a remote machine that contains files with pattern-based names file-*.yml, all with the same structure (the key path to the list is always the same: same-root-key.same-list-key). Their contents are shown below:
file file-1.yml with the following content:
# /myfolder/file-1.yml
---
same-root-key:
same-list-key:
- a
- b
- c
file file-2.yml with the following content:
# /myfolder/file-2.yml
---
same-root-key:
same-list-key:
- a
- x
file file-3.yml with the following content:
# /myfolder/file-3.yml
---
same-root-key:
same-list-key:
- b
file file-4.yml with the following content:
# /myfolder/file-4.yml
---
same-root-key:
same-list-key:
- y
file file-5.yml with the following content:
# /myfolder/file-5.yml
---
same-root-key:
same-list-key: []
Expectation
- name: load only the unique items of per file stored lists
set_fact:
result: # somehow load the expected items
- debug:
var: result
# "result": [
# "a",
# "b",
# "c",
# "x",
# "y"
# ]
Idea to concept (WORKING)
main.yml content:
# get list of file paths from remote machine
- find:
paths: "/myfolder"
patterns: "file-*.yml"
register: folder_files
# iteration
- name: concept
include_tasks: ./block-file.yml
loop: "{{ folder_files.files }}"
- debug:
    var: result
block-file.yml content:
---
# load remote file content
- slurp:
src: '{{ item.path }}'
register: tmp_file
# get list values
- set_fact:
tmp_fact: "{{ tmp_file.content | b64decode | from_yaml }}"
# lists union
- set_fact:
    result: "{{ result|default([]) | union(tmp_fact['same-root-key']['same-list-key']) }}"
Related
In Ansible, how to combine variables from separate files into one array?
with_fileglob works for local files (not for files on remote machine)
Ansible: read remote file
lookup() works for local files (not for files on remote machine)
https://docs.ansible.com/ansible/latest/collections/ansible/builtin/slurp_module.html
slurp should be used to load remote file content
Is there with_fileglob that works remotely in ansible?
with_fileglob on remote machine can be done using find
https://serverfault.com/questions/737007/how-to-combine-two-lists
union should be used to create a set (list consisting of unique values) from two lists
Issue looping on block module for a set of tasks in Ansible
our case may require iterating over a block, which is not supported (a block cannot be looped directly)
so the tasks must be stored in a separate file; looping over that file with include_tasks is supported
Notes
Later, after I posted this topic, I fixed the concept and now it is working. But I am still curious how I can achieve the same behavior in a more elegant and cleaner way. The concept above has been updated accordingly.
A simpler approach would be to fetch the files, e.g.
- fetch:
dest: "{{ fetch_dir }}"
src: "{{ item.path }}"
loop: "{{ folder_files.files }}"
Given that the remote host is "test_11" and fetch_dir=fetch, this gives
shell> tree fetch/test_11/myfolder/
fetch/test_11/myfolder/
├── file-1.yml
├── file-2.yml
├── file-3.yml
├── file-4.yml
└── file-5.yml
Then, collect the lists, e.g.
- set_fact:
tmp_fact: "{{ tmp_fact|default([]) +
(lookup('file', fetch_dir ~ '/' ~
inventory_hostname ~ '/' ~
item.path)|
from_yaml)['same-root-key']['same-list-key'] }}"
loop: "{{ folder_files.files }}"
gives
tmp_fact:
- a
- b
- c
- y
- b
- a
- x
Then, select the unique items
- set_fact:
result: "{{ tmp_fact|unique }}"
gives
result:
- a
- b
- c
- y
- x
Q: "The solution ... downloads files ... in contrast to the solution ... in the question (section: Idea to concept)"
A: You might want to reconsider the concept. fetch is more user-friendly than slurp. The proposed solution is idempotent, i.e. the files will be downloaded only when they change. Running the task repeatedly
- fetch:
dest: "{{ fetch_dir }}"
src: "{{ item.path }}"
loop: "{{ folder_files.files }}"
loop_control:
label: "{{ item.path }}"
gives
TASK [fetch] *************************************************************
ok: [test_11] => (item=/myfolder/file-1.yml)
ok: [test_11] => (item=/myfolder/file-4.yml)
ok: [test_11] => (item=/myfolder/file-3.yml)
ok: [test_11] => (item=/myfolder/file-2.yml)
ok: [test_11] => (item=/myfolder/file-5.yml)
As a result, the playbook
- hosts: test_11
gather_facts: false
vars:
fetch_dir: fetch
tasks:
- find:
paths: "/myfolder"
patterns: "file-*.yml"
register: folder_files
- fetch:
dest: "{{ fetch_dir }}"
src: "{{ item.path }}"
loop: "{{ folder_files.files }}"
loop_control:
label: "{{ item.path }}"
- set_fact:
tmp_fact: "{{ tmp_fact|default([]) +
(lookup('file', fetch_dir ~ '/' ~
inventory_hostname ~ '/' ~
item.path)|
from_yaml)['same-root-key']['same-list-key'] }}"
loop: "{{ folder_files.files }}"
shows no changes when running repeatedly
PLAY RECAP **************************************************************
test_11: ok=3 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
In your concept, you use slurp. This module always transfers the data from the remote host. The fetch module is more efficient. It compares the checksums and transfers the data only if the files are different. In addition to this, the fetch module supports check_mode.
I have a playbook with a block that has a when condition. Inside is a task with a loop. How can I change this loop so that when the condition is false the skipped task doesn't fail?
block:
- name: create a file
lineinfile:
line: "Hello World"
path: "{{my_testfile}}"
create: yes
- name: use the file
debug:
msg: "{{ item}}"
with_lines: cat "{{my_testfile}}"
when: false
TASK [create a file] ************************************************************************************************************************************************************
TASK [use the file] *************************************************************************************************************************************************************
cat: files/my/testfile: No such file or directory
fatal: [ipad-icpi01]: FAILED! => {"msg": "lookup_plugin.lines(cat \"files/mytestfile\") returned 1"}
Change your failing task to the following, which will always be able to run, even if the file does not exist, and which does not use shell or command where there is no need to:
- name: use the file
debug:
msg: "{{ item }}"
loop: "{{ (lookup('file', my_testfile, errors='ignore') | default('', true)).split('\n') }}"
The key points:
use the file lookup plugin with errors='ignore' so that it returns the file content, or None, rather than an error when the file does not exist.
use the default filter with its second option set to true so that it returns the default value when the variable exists but is null or empty.
split the result on newlines to get a list of lines (a single empty line if the file does not exist).
Note: as reported by @Vladimir, I corrected your variable name, which is not valid in Ansible.
Test the existence of the file. For example
- block:
- name: create a file
lineinfile:
line: "Hello World"
path: "{{ my_testfile }}"
create: yes
- name: use the file
shell: '[ -f "{{ my_testfile }}" ] && cat {{ my_testfile }}'
register: result
- name: use the file
debug:
msg: "{{ item }}"
loop: "{{ result.stdout_lines }}"
when: false
The lookup plugin file should be preferred.
I ended up with a mix of the provided answers. These tasks will be skipped without failing or creating a warning.
- block:
- name: create a file
lineinfile:
line: "Hello World"
path: "{{ my_testfile }}"
create: yes
- name: get the file
slurp:
src: "{{ my_testfile }}"
register: result
- name: use the file
debug:
msg: "{{ item }}"
loop: "{{ (result['content'] | b64decode).split('\n') }}"
when: false