I am performing a grep with multiple items.
---
- hosts: my_host
  gather_facts: false
  vars:
    my_list:
      - whatever
      - something
  tasks:
    - name: grep for item in search path
      shell: "grep -rIL {{ item }} /tmp"
      register: the_grep
      loop: "{{ my_list }}"

    - debug:
        msg: "{{ item.stdout_lines }}"
      loop: "{{ the_grep.results }}"
Depending on the result, multiple files could match.
msg:
- /tmp/something.conf
- /tmp/folder/file.txt
Q: How would I configure Ansible to loop over the items in stdout_lines?
The use case I'm solving is to delete .ini sections based on the item, but in this case, Ansible doesn't loop over the stdout_lines.
- name: remove stanza from ini file
  ini_file:
    path: "{{ item.stdout_lines }}"
    section: "{{ item.item }}"
    mode: '0600'
    state: absent
  loop: "{{ the_grep.results }}"
  when: item.stdout_lines | length > 0
This doesn't work, but configuring item.stdout_lines[0] gives a partially expected result, since Ansible will use only the first item in that list, but of course not the second and so on.
Perhaps there's a prettier answer, but I solved it by using with_nested and a json_query:
- name: remove stanza from ini file
  ini_file:
    path: "{{ item.0 }}"
    section: "{{ item.1.item }}"
    mode: '0600'
    state: absent
  with_nested:
    - "{{ the_grep | json_query('results[].stdout_lines[]') }}"
    - "{{ the_grep.results }}"
I have several files that I need to back up in different directories. I have tried the code below, but it is not working for me.
vars:
  file_vars:
    - {name: /file1}
    - {name: /etc/file2}
    - {name: /etc/file/file3}
tasks:
  - name: "Checking if config files exists"
    stat:
      path: "{{ item.name }}"
    with_items: "{{ file_vars }}"
    register: stat_result

  - name: Backup Files
    copy:
      src: "{{ item.name }}"
      dest: "{{ item.name }}{{ ansible_date_time.date }}.bak"
      remote_src: yes
    with_items: "{{ file_vars }}"
    when: stat_result.stat.exists == True
The problem is the condition
when: stat_result.stat.exists == True
There is no attribute stat_result.stat. Instead, the attribute stat_result.results is a list of the results from the loop. It's possible to create a dictionary of files and their statuses. For example
- set_fact:
    files_stats: "{{ dict(my_files|zip(my_stats)) }}"
  vars:
    my_files: "{{ stat_result.results|json_query('[].item.name') }}"
    my_stats: "{{ stat_result.results|json_query('[].stat.exists') }}"
Then simply use this dictionary in the condition
when: files_stats[item.name]
Below is a shorter version which creates the dictionary more efficiently
- set_fact:
    files_stats: "{{ dict(stat_result.results|
                    json_query('[].[item.name, stat.exists]')) }}"
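Putting it together, a minimal sketch of the backup task from the question using this dictionary in the condition (the destination naming and remote_src are taken from the question):

- name: Backup Files
  copy:
    src: "{{ item.name }}"
    dest: "{{ item.name }}{{ ansible_date_time.date }}.bak"
    remote_src: yes
  with_items: "{{ file_vars }}"
  when: files_stats[item.name]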
Please try the playbook below; it worked for me:
---
- name: Copy files
  hosts: localhost
  become: yes
  become_user: root
  vars_files:
    - files.yml
  tasks:
    - name: "Checking if config files exists"
      stat:
        path: "{{ item }}"
      with_items: "{{ files }}"
      register: stat_result

    - name: Ansible
      debug:
        msg: "{{ stat_result }}"

    - name: Backup Files
      copy:
        src: "{{ item.item }}"
        dest: "{{ item.item }}.bak"
      with_items: "{{ stat_result.results }}"
      when: item.stat.exists
and files.yml will look like:
---
files:
  - /tmp/file1
  - /tmp/file2
You can check your playbook syntax using the command below:
ansible-playbook copy.yml --syntax-check
You can also do a dry run of your playbook before the actual execution:
ansible-playbook -i localhost, copy.yml --check
---
- name: Mikrotik info
  hosts: mikrotik
  connection: network_cli
  remote_user: root
  gather_facts: false
  tasks:
    - name: show info
      routeros_command:
        commands: /system routerboard print
      register: rb_info

    - name: Debug info
      debug:
        msg: "{{ rb_info.stdout_lines }}"
Output:
routerboard: yes
model: 751G-2HnD
serial-number: 3A6502B2A2E7
firmware-type: ar7240
factory-firmware: 3.0
current-firmware: 6.42.3
upgrade-firmware: 6.43.4
I need to filter it for the "upgrade-firmware" string and get output like this:
upgrade-firmware: 6.43.4
Should I use regex_replace? Or can I use grep or something like that?
Any thoughts are greatly appreciated.
Thank you
(update)
Use from_yaml and combine the lines into a dictionary. For example
- set_fact:
    minfo: "{{ minfo|default({})|combine(item|from_yaml) }}"
  loop: "{{ rb_info.stdout_lines }}"

- debug:
    var: minfo['upgrade-firmware']
gives
minfo['upgrade-firmware']: 6.43.4
(for the record)
A robust solution is to write the data to a template and use include_vars. The tasks below
- tempfile:
  register: tempfile

- template:
    src: minfo.j2
    dest: "{{ tempfile.path }}"

- include_vars:
    file: "{{ tempfile.path }}"
    name: minfo

- debug:
    var: minfo
with the template
shell> cat minfo.j2
{% for item in rb_info.stdout_lines %}
{{ item }}
{% endfor %}
should give
"minfo": {
"current-firmware": "6.42.3",
"factory-firmware": 3.0,
"firmware-type": "ar7240",
"model": "751G-2HnD",
"routerboard": true,
"serial-number": "3A6502B2A2E7",
"upgrade-firmware": "6.43.4"
}
The tasks below create the variable upgrade_firmware
- set_fact:
    upgrade_firmware: "{{ item.split(':').1|trim }}"
  loop: "{{ rb_info.stdout_lines|map('quote')|map('trim')|list }}"
  when: item is search('^upgrade-firmware')

- debug:
    var: upgrade_firmware
It is possible to put all the parameters into a dictionary
- set_fact:
    minfo: "{{ minfo|default({})|
               combine({item.split(':').0: item.split(':').1|trim}) }}"
  loop: "{{ rb_info.stdout_lines|map('quote')|map('trim')|list }}"

- debug:
    var: minfo['upgrade-firmware']
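If only the matching line itself is wanted, a minimal sketch using the select test (assuming rb_info.stdout_lines is a flat list of "key: value" strings, as in the debug output above):

# Sketch: keep only the lines that match the wanted key
- debug:
    msg: "{{ rb_info.stdout_lines | select('search', '^upgrade-firmware') | list }}"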
I want 'lucy' to follow the user module's default behaviour, which is to create and use a group matching the user name 'lucy'. However, for 'frank' I want the primary group to be an existing one, gid 1003. So my hash looks like this:
lucy:
  comment: dog
frank:
  comment: cat
  group: 1003
And my task looks like this:
- name: Set up local unix user accounts
  user:
    name: "{{ item.key }}"
    comment: "{{ item.value.comment }}"
    group: "{{ item.value.group | default(undef) }}"
  loop: "{{ users|dict2items }}"
This doesn't work, as undef is not recognised. Nor is anything else I can think of: 'null', 'None', etc. all fail. '' creates an empty string, which is not right either. I can't figure out how to do it.
Any ideas?
default(omit) is what you are looking for. For example,
- name: Set up local Unix user accounts
  user:
    name: "{{ item.key }}"
    comment: "{{ item.value.comment }}"
    group: "{{ item.value.group | default(omit) }}"
  loop: "{{ users|dict2items }}"
Comments
Comment by Lucas Basquerotto: "... omit only works correctly when used directly in a module, it won't work in a set_fact ..."
A: You're wrong. For example, default(omit) works both in set_fact and in the module. In the playbook below, the first item in the list defaults to false, with the result "VARIABLE IS NOT DEFINED!". The second item defaults to omit; the omitted parameter get_checksum then defaults to true, with the checksum in the results
shell> cat pb.yml
- hosts: localhost
  tasks:
    - set_fact:
        test:
          - "{{ gchk|default(false) }}"
          - "{{ gchk|default(omit) }}"

    - stat:
        path: /etc/passwd
        get_checksum: "{{ item }}"
      loop: "{{ test }}"
      register: result

    - debug:
        var: item.stat.checksum
      loop: "{{ result.results }}"
gives
shell> ansible-playbook pb.yml | grep item.stat.checksum
item.stat.checksum: VARIABLE IS NOT DEFINED!
item.stat.checksum: 7c73e9f589ca1f0a1372aa4cd6944feec459c4a8
In addition to this, default(omit) also works as expected in some expressions. For example
- debug:
    msg: "{{ {'a': item}|combine({'b': true}) }}"
  loop: "{{ test }}"
gives
msg:
  a: false
  b: true
msg:
  b: true
See the results without default values
shell> ansible-playbook pb.yml -e "gchk={{ true|bool }}"
To start off, I have all my variables defined in YAML:
app_dir: "/mnt/{{ item.name }}"
app_dir_ext: "/mnt/{{ item.0.name }}"
workstreams:
- name: tigers
service_workstream: tigers-svc
app_sub_dir:
- inbound
- inbound/incoming
- inbound/error
- inbound/error/processed
- name: lions
service_workstream: lions-svc
app_sub_dir:
- inbound
- inbound/incoming
- inbound/error
- inbound/error/processed
You may notice that app_dir: "/mnt/{{ item.name }}" and app_dir_ext: "/mnt/{{ item.0.name }}" look odd. I originally had my variables set as below, but decided to use the above mainly because it needs fewer lines of YAML when I have a large number of workstreams.
workstreams:
  - name: tigers
    service_workstream: tigers-svc
    app_dir: /mnt/tigers
    ...
I then have Ansible code to check whether the directories exist and, if not, create them and apply permissions (note: I have taken this approach due to an SSH timeout when using the file module on a number of very big NFS-mounted shares).
- name: Check workstreams app_dir
  stat:
    path: "{{ app_dir }}"
  register: app_dir_status
  with_items:
    - "{{ workstreams }}"

- name: Check workstreams app_sub_dir
  stat:
    path: "{{ app_dir_ext }}/{{ item.1 }}/"
  register: app_sub_dir_status
  with_subelements:
    - "{{ workstreams }}"
    - app_sub_dir

- name: create workstreams app_dir
  file:
    path: "/mnt/{{ item.0.name }}"
    state: directory
    owner: "ftp"
    group: "ftp"
    mode: '0770'
    recurse: yes
  with_nested:
    - '{{ workstreams }}'
    - app_dir_status.results
  when:
    - '{{ item.1.stat.exists }} == false'
This is a little hacky but works; however, I have a 2nd, 3rd, 4th path to check, etc.
My question here is: how do I update/refactor the above code to use <register_name>.stat.exists == false from both app_dir_status and app_sub_dir_status to control my task?
You don't need to make a nested loop! All the required data is inside app_sub_dir_status – just strip the unnecessary items.
Here's a simplified example:
---
- hosts: localhost
  gather_facts: no
  vars:
    my_list:
      - name: zzz1
        sub:
          - aaa
          - ccc
      - name: zzz2
        sub:
          - aaa
          - bbb
  tasks:
    - stat:
        path: /tmp/{{ item.0.name }}/{{ item.1 }}
      register: stat_res
      with_subelements:
        - "{{ my_list }}"
        - sub

    - debug:
        msg: "path to create '{{ item }}'"
      with_items: "{{ stat_res.results | rejectattr('stat.exists') | map(attribute='invocation.module_args.path') | list }}"
You can iterate over stat_res.results | rejectattr('stat.exists') | list as well, but you will have to construct the path again as /tmp/{{ item.item.0.name }}/{{ item.item.1 }} – note the double item, because the first item is an element of stat_res.results, which contains another item: the element of your original loop for the stat task.
P.S. Also, I see no reason for your first task, as the subdir task can detect all the missing directories.
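Applied to the original question, a hedged sketch of the create step that loops over the registered app_sub_dir_status in the same way (owner, group, and mode are copied from the question's file task):

# Sketch: create every path whose stat reported it missing
- name: Create missing workstream sub-directories
  file:
    path: "{{ item }}"
    state: directory
    owner: ftp
    group: ftp
    mode: '0770'
  with_items: "{{ app_sub_dir_status.results | rejectattr('stat.exists') | map(attribute='invocation.module_args.path') | list }}"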
Consider the following play. What I am trying to do is add a field, tmp_path, which is basically the key and revision appended together, to each element in the scripts dict.
---
- hosts: localhost
  connection: local
  gather_facts: no
  vars:
    scripts:
      a.pl:
        revision: 123
      b.pl:
        revision: 456
  tasks:
    - with_dict: "{{ scripts }}"
      debug:
        msg: "{{ item.key }}_{{ item.value.revision }}"

    # - with_items: "{{ scripts }}"
    #   set_fact: {{item.value.tmp_path}}="{{item.key}}_{{item.value.revision}}"

    # - with_items: "{{ scripts }}"
    #   debug:
    #     msg: "{{ item.value.tmp_path }}"
...
Obviously the commented code doesn't work; any idea how I can get this working? Is it possible to alter the scripts dict directly, or should I somehow create a new dict to reference instead?
By the way, feel free to correct the terminology for what I am trying to do.
OK, I think I have a solution (below), at least to let me move forwards with this. The disadvantages are that it removes the structure of my dict and also seems a bit redundant, having to redefine all the fields and use a new variable. If anyone can provide a better solution, I will accept that instead.
---
- hosts: localhost
  connection: local
  gather_facts: no
  vars:
    scripts:
      a.pl:
        revision: 123
      b.pl:
        revision: 456
  tasks:
    - with_dict: "{{ scripts }}"
      debug:
        msg: "{{ item.key }}_{{ item.value.revision }}"

    - with_dict: "{{ scripts }}"
      set_fact:
        new_scripts: "{{ (new_scripts | default([])) + [ {'name': item.key, 'revision': item.value.revision, 'tmp_path': item.key ~ '_' ~ item.value.revision} ] }}"

    # - debug:
    #     var: x
    # - with_dict: "{{ scripts }}"
    - with_items: "{{ new_scripts }}"
      debug:
        msg: "{{ item.tmp_path }}"
...
BTW, credit to the following question, which pointed me in the right direction:
Using Ansible set_fact to create a dictionary from register results
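For comparison, a hedged sketch of an alternative that keeps the dict structure by merging tmp_path back into scripts with combine; this relies on the set_fact result shadowing the play-level scripts variable, which it does because set_fact has higher precedence than play vars:

# Sketch: merge tmp_path into each entry instead of building a flat list
- with_dict: "{{ scripts }}"
  set_fact:
    scripts: "{{ scripts | combine({item.key: item.value | combine({'tmp_path': item.key ~ '_' ~ item.value.revision})}) }}"

- with_dict: "{{ scripts }}"
  debug:
    msg: "{{ item.value.tmp_path }}"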