I have a playbook with a block that has a when condition. Inside is a task with a loop. How can I change this loop so that when the condition is false the skipped task doesn't fail?
block:
  - name: create a file
    lineinfile:
      line: "Hello World"
      path: "{{my_testfile}}"
      create: yes
  - name: use the file
    debug:
      msg: "{{ item}}"
    with_lines: cat "{{my_testfile}}"
when: false
TASK [create a file] ************************************************************************************************************************************************************
TASK [use the file] *************************************************************************************************************************************************************
cat: files/my/testfile: No such file or directory
fatal: [ipad-icpi01]: FAILED! => {"msg": "lookup_plugin.lines(cat \"files/mytestfile\") returned 1"}
Change your failing task to the following, which will always be able to run, even if the file does not exist, and which avoids using shell or command where there is no need to:
- name: use the file
debug:
msg: "{{ item }}"
loop: "{{ (lookup('file', my_testfile, errors='ignore') | default('', true)).split('\n') }}"
The key points:
Use the file lookup plugin with errors='ignore' so that it returns the file content, or None rather than an error, when the file does not exist.
Use the default filter with its second option set to true so that it returns the default value when the variable exists but is null or empty (see the short illustration below).
Split the result on newlines to get a list of lines (an effectively empty result if the file does not exist).
Note: as reported by @Vladimir, I corrected your variable name, which is not valid in Ansible.
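To illustrate the second point, here is a minimal, hypothetical task (the variable maybe_empty is invented purely for this demonstration) showing how the boolean second argument of default treats an empty value as if it were undefined:

- name: show default with the boolean second argument
  debug:
    msg: "{{ maybe_empty | default('fallback', true) }}"
  vars:
    maybe_empty: ""   # empty, so debug prints 'fallback'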
Test the existence of the file. For example
- block:
    - name: create a file
      lineinfile:
        line: "Hello World"
        path: "{{ my_testfile }}"
        create: yes
    - name: use the file
      shell: '[ -f "{{ my_testfile }}" ] && cat {{ my_testfile }}'
      register: result
    - name: use the file
      debug:
        msg: "{{ item }}"
      loop: "{{ result.stdout_lines }}"
  when: false
The file lookup plugin should be preferred.
I ended up with a mix of the provided answers. These tasks will be skipped without failing or creating a warning.
- block:
    - name: create a file
      lineinfile:
        line: "Hello World"
        path: "{{ my_testfile }}"
        create: yes
    - name: get the file
      slurp:
        src: "{{ my_testfile }}"
      register: result
    - name: use the file
      debug:
        msg: "{{ item }}"
      loop: "{{ (result['content'] | b64decode).split('\n') }}"
  when: false
Related
I have this file, let's call it include.yaml:
#- name: "Playing with Ansible and Include files"
- hosts: localhost
  connection: local
  tasks:
    - find: paths="./" recurse=yes patterns="test.yaml"
      register: file_to_exclude
    - debug: var=file_to_exclude.stdout_lines
    - name: shell
      shell: "find \"$(pwd)\" -name 'test.yaml'"
      register: files_from_dirs
    - debug: var=files_from_dirs.stdout_lines
    - name: Include variable files
      include_vars: "{{ item }}"
      with_items:
        - "{{ files_from_dirs.stdout_lines }}"
    - debug: var=files
and 2 or more test files:
./dir1/test.yaml
that contains
files:
- file1
- file2
./dir2/test.yaml
that contains
files:
- file3
- file4
the result is
TASK [Include variable files] ******************************************************************************************
ok: [localhost] => (item=/mnt/c/Users/GFlorinescu/ansible_scripts/ansible/1st/test.yaml)
ok: [localhost] => (item=/mnt/c/Users/GFlorinescu/ansible_scripts/ansible/2nd/test.yaml)
TASK [debug] ***********************************************************************************************************
ok: [localhost] => {
    "files": [
        "file3",
        "file4"
    ]
}
How can I get all the values in files? At the moment the files variable from the last included file overrides the values from the previous files. Of course, this should be done without changing the variable names inside the test.yaml files.
In other words I want files to be:
ok: [localhost] => {
    "files": [
        "file1",
        "file2",
        "file3",
        "file4"
    ]
}
To be more specific, I am asking for any kind of solution or module, even an unofficial one or some GitHub module; I am not tied to a solution based on the include_vars module.
Put the included variables into dictionaries with unique names. For example, create the names from the index of the loop. Then iterate over the names and concatenate the lists:
- command: "find {{ playbook_dir }} -name test.yaml"
register: files_from_dirs
- include_vars:
file: "{{ item }}"
name: "{{ name }}"
loop: "{{ files_from_dirs.stdout_lines }}"
loop_control:
extended: true
vars:
name: "files_{{ ansible_loop.index }}"
- set_fact:
files: "{{ files|d([]) + lookup('vars', item).files }}"
with_varnames: "files_[0-9]+"
- debug:
var: files
gives
files:
- file1
- file2
- file3
- file4
Notes:
You have to provide either a path relative to the home directory or an absolute path. See the example below
- command: "echo $PWD"
register: out
- debug:
var: out.stdout
gives
out.stdout: /home/admin
For example, when you want to find the files relative to the directory of the playbook
- command: "find {{ playbook_dir }} -name test.yaml"
register: files_from_dirs
- debug:
var: files_from_dirs.stdout_lines
gives
files_from_dirs.stdout_lines:
- /export/scratch/tmp8/test-987/dir1/test.yaml
- /export/scratch/tmp8/test-987/dir2/test.yaml
The same applies to the find module. For example,
- find:
    paths: "{{ playbook_dir }}"
    recurse: true
    patterns: test.yaml
  register: files_from_dirs

- debug:
    var: files_from_dirs.files|map(attribute='path')|list
gives the same result
files_from_dirs.files|map(attribute='path')|list:
- /export/scratch/tmp8/test-987/dir1/test.yaml
- /export/scratch/tmp8/test-987/dir2/test.yaml
Simplify the code and put the declaration of files into the vars section. For example, the declaration below gives the same result
files: "{{ query('varnames', 'files_[0-9]+')|
map('extract', hostvars.localhost, 'files')|
flatten }}"
Example of a complete playbook for testing
- hosts: localhost
  vars:
    files: "{{ query('varnames', 'files_[0-9]+')|
               map('extract', hostvars.localhost, 'files')|
               flatten }}"
  tasks:
    - find:
        paths: "{{ playbook_dir }}"
        recurse: true
        patterns: test.yaml
      register: files_from_dirs

    - include_vars:
        file: "{{ item }}"
        name: "{{ name }}"
      loop: "{{ files_from_dirs.files|map(attribute='path')|list }}"
      loop_control:
        extended: true
      vars:
        name: "files_{{ ansible_loop.index }}"

    - debug:
        var: files
(maybe off-topic, see comments)
Q: "Is there a way to write the path where it was found?"
A: Yes, there is. See the self-explanatory example below. Given the inventory
shell> cat hosts
host_1 file_1=alice
host_2 file_2=bob
host_3
the playbook
- hosts: host_1,host_2,host_3
  vars:
    file_1_list: "{{ hostvars|json_query('*.file_1') }}"
    file_2_list: "{{ hostvars|json_query('*.file_2') }}"
    file_1_dict: "{{ dict(hostvars|dict2items|
                     selectattr('value.file_1', 'defined')|
                     json_query('[].[key, value.file_1]')) }}"
    file_1_lis2: "{{ hostvars|dict2items|
                     selectattr('value.file_1', 'defined')|
                     json_query('[].{key: key, file_1: value.file_1}') }}"
  tasks:
    - debug:
        msg: |-
          file_1_list: {{ file_1_list }}
          file_2_list: {{ file_2_list }}
          file_1_dict:
          {{ file_1_dict|to_nice_yaml|indent(2) }}
          file_1_lis2:
          {{ file_1_lis2|to_nice_yaml|indent(2) }}
      run_once: true
gives
msg: |-
  file_1_list: ['alice']
  file_2_list: ['bob']
  file_1_dict:
    host_1: alice
  file_1_lis2:
  - file_1: alice
    key: host_1
In this task I found a roundabout method to compare two files (dconfDump and dconfDumpLocalCurrent) and to set a variable (previously defined as false) to true if the two files differ.
The solution seems to work, but it looks ugly, and as a beginner with Ansible I have the impression that a better solution must exist.
---
# vars file for dconfLoad
local_changed : false
target_changed : false
---
- name: local changed is true when previous target different then local current
  shell: diff /home/frank/dconfDump /home/frank/dconfDumpLocalCurrent
  register: diff_oldtarget_localCurrent
  register: local_changed
  ignore_errors: true

- debug:
    msg: CHANGED LOCALLY
  when: local_changed
Some background to the task, which is an attempt to synchronize files: a file LocalCurrent is compared with LocalOld and CurrentTarget to determine whether LocalCurrent has changed and whether it differs from CurrentTarget. If LocalCurrent has not changed and CurrentTarget has changed, then apply the change (and set LocalOld to CurrentTarget); if LocalCurrent has changed, then upload it to the controller.
What is the appropriate approach with Ansible? Thank you for your help!
You can use stat to get the checksum and then compare it. Please see below.
tasks:
  - name: Stat of dconfDump
    stat:
      path: "/tmp/dconfDump"
    register: dump

  - name: SHA1 of dconfDump
    set_fact:
      dump_sha1: "{{ dump.stat.checksum }}"

  - name: Stat of dconfDumpLocalCurrent
    stat:
      path: "/tmp/dconfDumpLocalCurrent"
    register: dump_local

  - name: SHA1 of dconfDumpLocalCurrent
    set_fact:
      local_sha1: "{{ dump_local.stat.checksum }}"

  - name: Same
    set_fact:
      val: "False"
    when: dump_sha1 == local_sha1

  - name: Different
    set_fact:
      val: "True"
    when: dump_sha1 != local_sha1

  - name: Print
    debug:
      msg: "{{ val }}"
Use stat and create a dictionary of checksums. For example
- stat:
    path: "{{ item }}"
  loop:
    - LocalOld
    - LocalCurrent
    - CurrentTarget
  register: result

- set_fact:
    my_files: "{{ dict(paths|zip(chkms)) }}"
  vars:
    paths: "{{ result.results|map(attribute='stat.path')|list }}"
    chkms: "{{ result.results|map(attribute='stat.checksum')|list }}"

- debug:
    var: my_files
gives (abridged) if all files are the same
my_files:
  CurrentTarget: 7c73e9f589ca1f0a1372aa4cd6944feec459c4a8
  LocalCurrent: 7c73e9f589ca1f0a1372aa4cd6944feec459c4a8
  LocalOld: 7c73e9f589ca1f0a1372aa4cd6944feec459c4a8
Then use the dictionary to compare the checksums and copy files. For example
# If LocalCurrent is not changed and CurrentTarget is changed,
# then apply the change (and set LocalOld to CurrentTarget)
- debug:
    msg: Set LocalOld to CurrentTarget
  when:
    - my_files['LocalCurrent'] == my_files['LocalOld']
    - my_files['LocalCurrent'] != my_files['CurrentTarget']

- debug:
    msg: Do not copy anything
  when:
    - my_files['LocalCurrent'] == my_files['LocalOld']
    - my_files['LocalCurrent'] == my_files['CurrentTarget']
gives
TASK [debug] ****
skipping: [localhost]
TASK [debug] ****
ok: [localhost] =>
msg: Do not copy anything
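The third branch from the question ("if LocalCurrent is changed then upload to controller") is not shown above. A hedged sketch of it, assuming the same my_files dictionary and an arbitrary destination directory on the controller, could use the fetch module:

- name: Upload LocalCurrent to the controller
  fetch:
    src: LocalCurrent
    dest: "{{ playbook_dir }}/uploaded/"   # hypothetical destination on the controller
  when: my_files['LocalCurrent'] != my_files['LocalOld']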
I have to find all config.xml files on a given server and produce the list.
Once the file list is registered, I have to check the content of each file on the list using Ansible.
I tried to derive the paths for all config.xml files, register them and print the list, and then add the registered variable into the lineinfile path.
## Derive Config.xml path
- name: Find the location of xml file
  shell: find {{ wlx_mount_point }} -maxdepth 1 -name {{xml_file}} | rev | cut -d '/' -f3- | rev
  register: wlx_domain_home
  ignore_errors: yes

- debug:
    msg: "{{ wlx_domain_home.stdout_lines|list }}"

- name: check domain home directory exists
  stat:
    path: "{{ wlx_domain_home |list }}"
  ignore_errors: true

- debug:
    msg: "{{ wlx_domain_home.stdout_lines|list }}"

- name: "Ensure Logging Settings in config.xml"
  lineinfile:
    line: "{{ item.line }}"
    regexp: "{{ item.regexp }}"
    path: "{{ wlx_domain_home.stdout_lines|list }}/config/config.xml"
    state: present
    backrefs: yes
  register: config.xml.Logging
  with_fileglob: "{{ wlx_domain_home.stdout_lines|list }}/config/config.xml"
  with_items:
    - line: "<logger-severity>Info</logger-severity>"
      regexp: "^logger-severity.*"
The expected result is that it looks for those lines in each file and loops through the list. Instead, it prints the list and is not able to find the content:
"_ansible_ignore_errors": true, "msg": "Destination
[u'/appl/cmpas9/user_projects/pte-ipscasws',
u'/appl/bbb/user_projects/qa-ucxservices_bkp',
u'/appl/app/user_projects/weiss_apps12',
u'appl/view/user_projects/weiss_apps12_oldbkp',
u'appl/voc/user_projects/qa-voc']/config/config.xml does not exist !"
}
This is how I fixed the issue. Now it gives output:
- name: find all weblogic domain paths
  shell: find /tech/appl -type f -name config.xml
  register: wlx_domain_config

- debug:
    var: wlx_domain_config.stdout_lines

- name: "Ensure allow-unencrypted-null-cipher Encryption Settings in config.xml"
  shell: grep -i "" {{ item }}
  with_items: "{{ wlx_domain_config.stdout_lines }}"
  register: allowunencrypted

- debug:
    var: allowunencrypted.stdout_lines
I have created a playbook which runs on a remote host and checks whether the files exist or not. I want to extract only the files which are not present on the remote host, but my playbook gives all paths, whether they are present or not.
Playbook:
- name: Playbook for files not present on remote hosts
  hosts: source
  gather_facts: false
  vars:
    Filepath: /opt/webapps/obiee/oracle_common/inventory/ContentsXML/comps.xml
  tasks:
    - name: Getting files location path
      shell: grep -i "COMP NAME" {{ Filepath }} |sed 's/^.*INST_LOC="//'|cut -f1 -d'"' | sed '/^$/d;s/[[:blank:]]//g' # extract files from comps.xml
      register: get_element_attribute

    - name: check path present or not
      stat:
        path: "{{ item }}"
      with_items:
        - "{{ get_element_attribute.stdout_lines }}"
      register: path_output

    - name: path exists or not
      set_fact:
        path_item: "{{ item }}" # here I get the output as expected, i.e. the files not present on the remote host
      with_items: "{{ path_output.results }}"
      register: final_output
      when: item.stat.exists == False

    - debug:
        var: final_output # gives both outputs, i.e. files present and absent

    - name: Create a fact list
      set_fact:
        paths: "{{ final_output.results | map(attribute='item.item') | list }}" # I need to apply the condition "item.stat.exists == False" inside this statement
    - name: Print Fact
      debug:
        var: paths
The issue was resolved by using the command below:
- name: Create a fact list
  set_fact:
    paths: "{{ final_output.results | selectattr('item.stat.exists', 'equalto', false) | map(attribute='item.item') | list }}"
  register: config_facts
The following query should get all the file names which don't exist on the remote host and store them in the fact 'paths':
- name: Create a fact list
  set_fact:
    paths: "{{ final_output | json_query(query) }}"
  vars:
    query: "results[?(@._ansible_item_label.stat.exists==`false`)]._ansible_item_label.item"
I'd like to register the contents of bashrc for two users and edit as/if required. My play is as follows.
- name: Check bashrc
  shell: cat {{ item }}/.bashrc
  register: bashrc
  with_items:
    - "{{ nodepool_home }}"
    - "{{ zuul_home }}"

- name: Configure bashrc
  shell:
    cmd: |
      cat >> {{ item }}/.bashrc <<EOF
      STUFF
      EOF
  with_items:
    - "{{ nodepool_home }}"
    - "{{ zuul_home }}"
  when: '"STUFF" not in bashrc.stdout'
It fails as follows:
fatal: [ca-o3lscizuul]: FAILED! => {"failed": true, "msg": "The conditional check '\"STUFF\" not in bashrc.stdout' failed. The error was: error while evaluating conditional (\"STUFF\" not in bashrc.stdout): Unable to look up a name or access an attribute in template string ({% if \"STUFF\" not in bashrc.stdout %} True {% else %} False {% endif %}).\nMake sure your variable name does not contain invalid characters like '-': argument of type 'StrictUndefined' is not iterable\n\nThe error appears to have been in '/root/openstack-ci/infrastructure-setup/staging/zuul/create-user.yml': line 35, column 5, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n\n - name: Configure bashrc\n ^ here\n"}
I think, if I understand your requirement correctly, you can use the 'lineinfile' or 'blockinfile' modules and save yourself the hassle of testing for the existence of the content (your conditional fails because a task registered with with_items stores its per-item output under bashrc.results, so bashrc.stdout is undefined):
- name: Noddy example data
  set_fact:
    single_line: "STUFF"
    multi_line: |
      STUFF
      STUFF
    profile_dirs:
      - "{{ nodepool_home }}"
      - "{{ zuul_home }}"

- name: Ensure STUFF exists in file
  lineinfile:
    path: "{{ item }}/.bashrc"
    line: "{{ single_line }}"
  loop: "{{ profile_dirs }}"

- name: Ensure block of STUFF exists in file
  blockinfile:
    path: "{{ item }}/.bashrc"
    block: "{{ multi_line }}"
  loop: "{{ profile_dirs }}"
Both modules give a lot more control and you can find their docs here: lineinfile | blockinfile
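As a small, hedged illustration of that extra control (the marker text and the insertafter value below are example choices, not something taken from the play above):

- name: Ensure block of STUFF exists in file, with a custom marker
  blockinfile:
    path: "{{ item }}/.bashrc"
    block: "{{ multi_line }}"
    marker: "# {mark} ANSIBLE MANAGED BLOCK (zuul setup)"   # custom begin/end marker lines
    insertafter: EOF                                        # append at the end of the file when the block is missing
  loop: "{{ profile_dirs }}"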