How to render all ansible variables to yml file with ansible 2.3 - ansible

I am storing all Ansible variables in a YAML file (filtering out those that start with 'ansible_') with this playbook:
- hosts: localhost
  tasks:
    - set_fact:
        all_vars: "{{ all_vars | default({}) | combine({item.key: item.value}) }}"
      when: not item.key.startswith('ansible_')
      with_dict: "{{ vars }}"
    - copy:
        content: "{{ all_vars }}"
        dest: /tmp/tmp.yml
This is group_vars/all/defaults.yml:
SOME_FACT1: "some-fact"
SOME_FACT2: "{{ SOME_FACT1 }}"
SOME_FACT3: "{{ SOME_FACT2 }}"
This works perfectly with Ansible 2.2, but with Ansible 2.3 (2.3.1.0) the variables are not rendered.
I get results like this:
... "SOME_FACT1": "some-fact", "SOME_FACT3": "{{ SOME_FACT2 }}", "SOME_FACT2": "{{ SOME_FACT1 }}" ...
How can I force Ansible 2.3 to render the variables?

The problem seems to be that Ansible will not render vars and (I do not know why) all_vars. However, any variable inside vars/all_vars is rendered properly when used directly.
So this works:
- hosts: localhost
  tasks:
    - set_fact:
        all_vars: "{{ all_vars | default([]) | union([item.key + ':{{' + item.key + '|to_json}}']) }}"
      when: not item.key.startswith('ansible_')
      with_dict: "{{ vars }}"
    - copy:
        content: "{{ all_vars | join('\n') }}"
        dest: /tmp/tmp1.yml
    - template:
        src: /tmp/tmp1.yml
        dest: /tmp/tmp.yml
The idea is:
Create a file that lists all variables in the format
SOME_VAR: {{ SOME_VAR | to_json }}
...
Render that file using the template module.
Not very nice, but it works.
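For what it's worth, on newer Ansible (2.5 and later) the vars lookup forces each value through the templating engine as it is read, so the intermediate file can be skipped entirely. A minimal sketch of that variant (it won't help on 2.3 itself, and it assumes every remaining value can actually be resolved):

- hosts: localhost
  tasks:
    # The vars lookup renders each value fully before it is stored
    - set_fact:
        all_vars: "{{ all_vars | default({}) | combine({item: lookup('vars', item)}) }}"
      loop: "{{ vars.keys() | reject('match', 'ansible_') | list }}"
    - copy:
        content: "{{ all_vars | to_nice_yaml }}"
        dest: /tmp/tmp.yml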

Related

Ansible - loop over multiple items in stdout_lines

I am performing a grep with multiple items.
---
- hosts: my_host
  gather_facts: false
  vars:
    my_list:
      - whatever
      - something
  tasks:
    - name: grep for item in search path
      shell: "grep -rIL {{ item }} /tmp"
      register: the_grep
      loop: "{{ my_list }}"
    - debug:
        msg: "{{ item.stdout_lines }}"
      loop: "{{ the_grep.results }}"
Depending on the result, multiple files could match.
msg:
- /tmp/something.conf
- /tmp/folder/file.txt
Q: How would I configure Ansible to loop over the items in stdout_lines?
The use case I'm solving is deleting .ini sections based on the item, but in this case Ansible doesn't loop over stdout_lines.
- name: remove stanza from ini file
  ini_file:
    path: "{{ item.stdout_lines }}"
    section: "{{ item.item }}"
    mode: '0600'
    state: absent
  loop: "{{ the_grep.results }}"
  when: item.stdout_lines | length > 0
This doesn't work; using item.stdout_lines[0] gives a partially expected result, since Ansible will then use only the first item in that list, but of course not the second and so on.
Perhaps there's a prettier answer, but I solved it by using with_nested and a json_query:
- name: remove stanza from ini file
  ini_file:
    path: "{{ item.0 }}"
    section: "{{ item.1.item }}"
    mode: '0600'
    state: absent
  with_nested:
    - "{{ the_grep | json_query('results[].stdout_lines[]') }}"
    - "{{ the_grep.results }}"

Can we have 2 with_items in ansible in a single task

Below is the setup:
- name: Find the image
  slurp:
    src: "{{ IMAGE }}"
  register: slurp_results

- name: Upload image
  shell: |
    skopeo copy docker-archive:{{ item }}.tar docker://{{ URL }}/TESTIMAGE
  with_items: "{{ (slurp_results.content | b64decode).splitlines() }}"
The above code works. But I would also need "TESTIMAGE" to be replaced with {{ item }}, like below:
skopeo copy docker-archive:{{ item }}.tar docker://{{ URL }}/{{ item }}
How do I define two with_items in a single shell task, with two different slurp results?
I believe you can, by using the subelements lookup. Try going by this example:
- name: Setup MySQL users, given the mysql hosts and privs subkey lists
  mysql_user:
    name: "{{ item.0.name }}"
    password: "{{ item.0.mysql.password }}"
    host: "{{ item.1 }}"
    priv: "{{ item.0.mysql.privs | join('/') }}"
  with_subelements:
    - "{{ users }}"
    - mysql.hosts
users is referred to as item.0, hosts as item.1, and so on.
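If the two slurped files should instead be paired line by line (line N of one with line N of the other), with_together, or the zip filter, may be a better fit than subelements. A sketch, where NAMES and both register names are made-up placeholders:

- name: Read the image archive list
  slurp:
    src: "{{ IMAGE }}"
  register: slurp_images

- name: Read the target image name list
  slurp:
    src: "{{ NAMES }}"
  register: slurp_names

- name: Upload images, pairing the two lists element-wise
  shell: "skopeo copy docker-archive:{{ item.0 }}.tar docker://{{ URL }}/{{ item.1 }}"
  with_together:
    - "{{ (slurp_images.content | b64decode).splitlines() }}"
    - "{{ (slurp_names.content | b64decode).splitlines() }}"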

How to extract the output from stdout.lines in ansible

---
- name: Mikrotik info
  hosts: mikrotik
  connection: network_cli
  remote_user: root
  gather_facts: false
  tasks:
    - name: show info
      routeros_command:
        commands: /system routerboard print
      register: rb_info
    - name: Debug info
      debug:
        msg: "{{ rb_info.stdout_lines }}"
Output:
routerboard: yes
model: 751G-2HnD
serial-number: 3A6502B2A2E7
firmware-type: ar7240
factory-firmware: 3.0
current-firmware: 6.42.3
upgrade-firmware: 6.43.4
I need to filter it for the "upgrade-firmware" string and get output like this:
upgrade-firmware: 6.43.4
Should I use regex_replace? Or can I use grep or something like that?
Any thoughts are greatly appreciated. Thank you.
(update)
Use from_yaml and combine to build a dictionary. For example:
- set_fact:
    minfo: "{{ minfo | default({}) | combine(item | from_yaml) }}"
  loop: "{{ rb_info.stdout_lines }}"
- debug:
    var: minfo['upgrade-firmware']
gives
minfo['upgrade-firmware']: 6.43.4
(for the record)
A robust solution is to write the data out through a template and load it back with include_vars. The tasks below
- tempfile:
  register: tempfile
- template:
    src: minfo.j2
    dest: "{{ tempfile.path }}"
- include_vars:
    file: "{{ tempfile.path }}"
    name: minfo
- debug:
    var: minfo
with the template
shell> cat minfo.j2
{% for item in rb_info.stdout_lines %}
{{ item }}
{% endfor %}
should give
"minfo": {
"current-firmware": "6.42.3",
"factory-firmware": 3.0,
"firmware-type": "ar7240",
"model": "751G-2HnD",
"routerboard": true,
"serial-number": "3A6502B2A2E7",
"upgrade-firmware": "6.43.4"
}
The tasks below create the variable upgrade_firmware:
- set_fact:
    upgrade_firmware: "{{ item.split(':').1 | trim }}"
  loop: "{{ rb_info.stdout_lines | map('trim') | list }}"
  when: item is search('^upgrade-firmware')
- debug:
    var: upgrade_firmware
It is possible to put all the parameters into a dictionary:
- set_fact:
    minfo: "{{ minfo | default({}) |
               combine({item.split(':').0: item.split(':').1 | trim}) }}"
  loop: "{{ rb_info.stdout_lines | map('trim') | list }}"
- debug:
    var: minfo['upgrade-firmware']
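Since the question mentions regex_replace: the same value can also be extracted in a single expression, without a set_fact loop. A sketch, assuming the line format shown in the output above:

- set_fact:
    upgrade_firmware: "{{ rb_info.stdout_lines
                          | map('trim')
                          | select('match', 'upgrade-firmware')
                          | map('regex_replace', '^upgrade-firmware:\\s*', '')
                          | first }}"
- debug:
    var: upgrade_firmware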

Ansible access same variables from multiple Json files

I have multiple .json files on the local host, where I place my playbook:
json-file-path/{{ testName }}.json
The {{ testName }}.json files are testA.json, testB.json, testC.json, etc.
All the .json files have the same keys with different values, like this:
json-file-path/testA.json:
{
    "a_key": "a_value1",
    "b_key": "b_value1"
}
json-file-path/testB.json:
{
    "a_key": "a_value2",
    "b_key": "b_value2"
}
json-file-path/testC.json:
{
    "a_key": "a_value3",
    "b_key": "b_value3"
}
.....
I need to access the key-value variables from all the .json files, and if the values meet some condition, I will perform a task on the target host. For example, I have:
a_value1=3
a_value2=4
a_value3=1
I go through my .json files one by one; if the value of a_key is greater than 3, I copy that .json file to the target host, otherwise I skip the task. In this case, I will only copy testB.json to the target host.
How would I achieve this? I was thinking of restructuring my .json files to use {{ testName }} as a dynamic dict key, like this:
{
    "testA": {
        "a_key": "a_value1",
        "b_key": "b_value1"
    }
}
So I can access my variable as {{ testName }}.a_key. So far I haven't been able to achieve this.
I have tried the following in my playbook:
---
- hosts: localhost
  tasks:
    - name: construct json files
      vars:
        my_vars:
          a_key: "{{ a_value }}"
          b_key: "{{ b_value }}"
      with_dict: "{{ testName }}"
      copy:
        content: "{{ my_vars | to_nice_json }}"
        dest: /json-file-path/{{ testName }}.json
My updated playbooks are:
/mypath/tmp/include.yaml:
---
- hosts: remote_hostName
  tasks:
    - name: load json files
      set_fact:
        json_data: "{{ lookup('file', item) | from_json }}"
    - name: copy json file if condition is met
      copy:
        src: "{{ item }}"
        dest: "/remote_host_path/tmp/{{ item | basename }}"
      delegate_to: "{{ remote_hostName }}"
      when: json_data.a_key|int > 5
/mypath/test.yml:
---
- hosts: localhost
  vars:
    local_src_dir: /mypath/tmp
    remote_host: remote_hostName
    remote_dest_dir: /remote_host_path/tmp
  tasks:
    - name: looping
      include: include.yaml
      with_fileglob:
        - "{{ local_src_dir }}/*json"
All the JSON files are on localhost under /mypath/tmp/.
The latest version of the playbook, which is working now:
/mypath/tmp/include.yaml:
---
- name: loading json files
  include_vars:
    file: "{{ item }}"
    name: json_data

- name: copy json file to remote if condition is met
  copy:
    src: "{{ item }}"
    dest: "/remote_host_path/tmp/{{ item | basename }}"
  delegate_to: "{{ remote_host }}"
  when: json_data.a_key > 5
/mypath/test.yml:
---
- hosts: localhost
  vars:
    local_src_dir: /mypath/tmp
    remote_host: remote_hostName
    remote_dest_dir: /remote_host_path/tmp
  tasks:
    - name: looping json files
      include: include.yaml
      with_fileglob:
        - "{{ local_src_dir }}/*json"
I am hoping that I have understood your requirements correctly, and that this helps move you forward.
Fundamentally, you can load each of the JSON files so you can query the values as native Ansible variables. Therefore you can loop through all the files, read each one, compare the value you are interested in and then conditionally copy to your remote host via a delegated task. Therefore, give this a try:
Create an include file include.yaml:
---
# 'item' contains a path to a local JSON file on each pass of the loop
- name: Load the json file
  set_fact:
    json_data: "{{ lookup('file', item) | from_json }}"

- name: Delegate a copy task to the remote host conditionally
  copy:
    src: "{{ item }}"
    dest: "{{ remote_dest_dir }}/{{ item | basename }}"
  delegate_to: "{{ remote_host }}"
  when: json_data.a_key > value_threshold
then in your playbook:
---
- hosts: localhost
  connection: local
  # Set some example vars, though these could be placed in a variety of places
  vars:
    local_src_dir: /some/local/path
    remote_host: <some_inventory_hostname>
    remote_dest_dir: /some/remote/path
    value_threshold: 3
  tasks:
    - name: Loop through all *json files, passing matches to include.yaml
      include: include.yaml
      loop: "{{ lookup('fileglob', local_src_dir + '/*json').split(',') }}"
Note: As you are running an old version of Ansible, you may need older alternate syntax for all of this to work:
In your include file:
- name: Load the json file
  include_vars: "{{ item }}"

- name: Delegate a copy task to the remote host conditionally
  copy:
    src: "{{ item }}"
    dest: "{{ remote_dest_dir }}/{{ item | basename }}"
  delegate_to: "{{ remote_host }}"
  when: a_key > value_threshold
and in your playbook:
- name: Loop through all *json files, passing matches to include.yaml
  include: include.yaml
  with_fileglob:
    - "{{ local_src_dir }}/*json"

Adding field to dict items

Consider the following play. What I am trying to do is add a field, tmp_path, which is basically the key and the revision appended together, to each element in the scripts dict.
---
- hosts: localhost
  connection: local
  gather_facts: no
  vars:
    scripts:
      a.pl:
        revision: 123
      b.pl:
        revision: 456
  tasks:
    - with_dict: "{{ scripts }}"
      debug:
        msg: "{{ item.key }}_{{ item.value.revision }}"
    # - with_items: "{{ scripts }}"
    #   set_fact: {{ item.value.tmp_path }}="{{ item.key }}_{{ item.value.revision }}"
    # - with_items: "{{ scripts }}"
    #   debug:
    #     msg: "{{ item.value.tmp_path }}"
...
Obviously the commented code doesn't work; any idea how I can get this working? Is it possible to alter the scripts dict directly, or should I somehow create a new dict to reference instead?
By the way, feel free to correct the terminology for what I am trying to do.
OK, I think I have a solution (below), at least one that lets me move forward. The disadvantages are that it removes the structure of my dict, and that it seems a bit redundant to redefine all the fields and use a new variable. If anyone can provide a better solution, I will accept that instead.
---
- hosts: localhost
  connection: local
  gather_facts: no
  vars:
    scripts:
      a.pl:
        revision: 123
      b.pl:
        revision: 456
  tasks:
    - with_dict: "{{ scripts }}"
      debug:
        msg: "{{ item.key }}_{{ item.value.revision }}"
    - with_dict: "{{ scripts }}"
      set_fact:
        new_scripts: "{{ (new_scripts | default([])) + [{'name': item.key, 'revision': item.value.revision, 'tmp_path': item.key ~ '_' ~ item.value.revision}] }}"
    # - debug:
    #     var: x
    # - with_dict: "{{ scripts }}"
    - with_items: "{{ new_scripts }}"
      debug:
        msg: "{{ item.tmp_path }}"
...
BTW, credit to the following question, which pointed me in the right direction:
Using Ansible set_fact to create a dictionary from register results
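For completeness, a sketch of a variant that keeps the original dict structure by merging the computed tmp_path into each value with combine, rather than rebuilding the data as a list:

- with_dict: "{{ scripts }}"
  set_fact:
    scripts: "{{ scripts | combine({item.key: item.value | combine({'tmp_path': item.key ~ '_' ~ item.value.revision})}) }}"

- with_dict: "{{ scripts }}"
  debug:
    msg: "{{ item.value.tmp_path }}"

After the loop, scripts still has the same shape, with tmp_path sitting next to revision in every entry.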
