Ansible - best method to copy results of json_query to file

Below is an excerpt from a playbook that queries a REST service, stores the results, and displays to the screen via the debug module. What is the best way to write the result to a file with the desired format?
vars:
  jquery: "json[].{File: filename, Path: filepath, Size: size}"
uri:
  url: "https://somewhere.com/subscriptions"
register: "subscriptions"
debug:
  msg: "{{ subscriptions | json_query(jquery) }}"
copy:
  content: "{{ subscriptions | json_query(jquery) }}"
  dest: "./subscriptions.txt"
The debug output looks like the below:
{ File: "afile",
  Path: "somepath/afile",
  Size: "9999.0"
},
{ File: "bfile",
  Path: "somepath/bfile",
  Size: "9999.0"
}
Using the copy module, the results all run together. What is the best way to preserve the formatting of the debug output? Bonus points if the "{}," characters can be removed.
I'm assuming the correct answer involves the use of templating?
Thank you.

The shortest answer is probably:
copy:
  content: "{{ subscriptions | json_query(jquery) | to_nice_json }}"
  dest: "./subscriptions.txt"
This will produce nicely formatted JSON output. Because your result is a list, the output file will look like:
[
    {
        "File": "afile",
        "Path": "somepath/afile",
        "Size": "9999.0"
    },
    {
        "File": "bfile",
        "Path": "somepath/bfile",
        "Size": "9999.0"
    }
]
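If you prefer a different indent width, the to_nice_json filter also accepts an indent argument (2 here is just an arbitrary choice):
copy:
  content: "{{ subscriptions | json_query(jquery) | to_nice_json(indent=2) }}"
  dest: "./subscriptions.txt"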
If you want output that isn't valid JSON, you'll probably have to use
the template module to produce it yourself.
If you want your output to look like this:
File: afile
Path: somepath/afile
Size: 9999.0
File: bfile
Path: somepath/bfile
Size: 9999.0
You might write:
copy:
  content: |
    {% for item in items %}
    File: {{ item.File }}
    Path: {{ item.Path }}
    Size: {{ item.Size }}
    {% endfor %}
  dest: "./subscriptions.txt"
vars:
  items: "{{ subscriptions | json_query(jquery) }}"
...but ideally you would use the template module instead of embedding the template in the copy task like this. That is just a matter of moving the content of the content key in this example into a file and using that file as the src of a template task.
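For illustration, here is a minimal sketch of that template-module variant, assuming the template is saved as subscriptions.txt.j2 (the filename is an arbitrary choice) with the same loop body:
{% for item in items %}
File: {{ item.File }}
Path: {{ item.Path }}
Size: {{ item.Size }}
{% endfor %}
and the task becomes:
template:
  src: subscriptions.txt.j2
  dest: "./subscriptions.txt"
vars:
  items: "{{ subscriptions | json_query(jquery) }}"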

Related

How do I use an Ansible string variable that contains Jinja delimiters?

This playbook:
- name: ugly
  hosts: localhost
  vars:
    badstr: "asdf{%jkl"
    mydir: "."
    mydict:
      filea:
        Value: "blue!42!"
      fileb:
        Value: "a{%isbad"
  tasks:
    - copy:
        dest: "{{ item.key }}"
        content: "{{ item.value.Value }}"
      loop: "{{ mydict|default({})|dict2items }}"
gives me this error:
fatal: [localhost]: FAILED! => {"msg": "An unhandled exception occurred while templating 'asdf{%jkl'. Error was a <class 'ansible.errors.AnsibleError'>, original message: template error while templating string: Encountered unknown tag 'jkl'.. String: asdf{%jkl"}
The 'mydict' structure is returned from a plugin and I do not get to define the members. One of the 'Value's contains a "{%". Any reference to it will cause an error, whether as a variable, file content or in a template.
I have tried all kinds of quoting and combinations of unsafe, {{, %raw, etc. It either gives me the error or puts the name of the variable in the file.
How can I write the value to a file? Or just use it as a variable?
Ansible 2.8.4 on macOS 11.3, also Ansible 2.9 on RHEL 7.
You can use !unsafe for the variables that are expected to contain these characters; check the documentation. When !unsafe is used, the string/variable will never get templated.
- name: ugly
  hosts: localhost
  vars:
    badstr: !unsafe "asdf{%jkl"
    mydir: "."
    mydict:
      filea:
        Value: !unsafe "blue!42!"
      fileb:
        Value: !unsafe "a{%isbad"
  tasks:
    - copy:
        dest: "{{ item.key }}"
        content: "{{ item.value.Value }}"
      loop: "{{ mydict|default({})|dict2items }}"
When handling values returned by lookup plugins, Ansible uses a data
type called unsafe to block templating. Marking data as unsafe
prevents malicious users from abusing Jinja2 templates to execute
arbitrary code on target machines. The Ansible implementation ensures
that unsafe values are never templated. It is more comprehensive than
escaping Jinja2 with {% raw %} ... {% endraw %} tags.
You can use the same unsafe data type in variables you define, to
prevent templating errors and information disclosure. You can mark
values supplied by vars_prompts as unsafe. You can also use unsafe in
playbooks. The most common use cases include passwords that allow
special characters like { or %, and JSON arguments that look like
templates but should not be templated. For example:
---
mypassword: !unsafe 234%234{435lkj{{lkjsdf
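To illustrate the vars_prompt case mentioned in the quote, here is a minimal hedged sketch (the variable name and prompt text are made up); the unsafe keyword keeps the entered value from being treated as a template:
- hosts: localhost
  vars_prompt:
    - name: mypassword
      prompt: "Password (may contain { or %)"
      private: yes
      unsafe: yes
  tasks:
    - debug:
        msg: "password received"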
The problem here is not in the copy task where the values are
evaluated; the problem is how they are being set. For example, if I
create a simple ansible module named example.sh that looks like
this:
#!/bin/sh
cat <<EOF
{
"files": {
"filea": {
"Value": "blue!42!"
},
"fileb": {
"Value": "a{%isbad"
}
}
}
EOF
I can write a playbook like this:
- name: ugly
  hosts: localhost
  tasks:
    - example:
      register: mydict
    - copy:
        dest: "{{ item.key }}"
        content: "{{ item.value.Value }}"
      loop: "{{ mydict.files|dict2items }}"
And this runs as expected, creating the file fileb with the following content, without any errors:
a{%isbad
Similarly, if I read the data from a JSON file and pass it through from_json, it also works fine:
- name: ugly
  hosts: localhost
  tasks:
    - set_fact:
        mydict: "{{ lookup('file', 'data.json')|from_json }}"
    - copy:
        dest: "{{ item.key }}"
        content: "{{ item.value.Value }}"
      loop: "{{ mydict.files|dict2items }}"
The problem only happens if you define the variables in a context in
which Ansible is looking for Jinja templating -- so, as the values of
variables in a playbook, a vars file, the arguments to set_fact,
etc.
You can potentially work around the problem by changing how you are
consuming these values.

How can I loop over a list of dicts and their content lists

I have the following var
---
- hosts: all
  vars:
    new_service:
      name: test
      Unit:
        - option: Description
          value: "Testname"
      Service:
        - option: ExecStart
          value: "/usr/bin/python3 -m http.server 8080"
        - option: WorkingDirectory
          value: /home/test/testservice/html
I want to be able to use the ini_file module to create a service template so that the above var is converted into the following ini file
[Unit]
Description=Testname
[Service]
ExecStart=/usr/bin/python3 -m http.server 8080
WorkingDirectory=/home/test/testservice/html
I cannot figure out how to loop over this. I was thinking of using product() to loop over the nested lists, maybe something like this?
- name: "Create new unit section of service file"
ini_file:
path: "~{{ USERNAME }}/.config/systemd/user/{{ new_service[name] }}"
section: "{{ item.0 }}"
option: "{{ item.1.option }}"
value: "{{ item.1.value }}"
loop: "{{ ['unit', 'service'] | product({{ new_service[item.0] }})"
But I do not believe item is defined in the loop definition itself
(The reason I'm going with ini_file rather than template is because I want the service file creation to be able to handle any number of fields on demand)
You can still use a template to handle a variable number of sections and options. Using loop with ini_file here is not efficient IMO. The only real use case would be if you need to keep the original contents of the file and only add new entries. But performance will be dramatically lower than a single template, especially if you have a lot of elements.
The only difficulty I see is that you have a name attribute in your dict which is not a section title, but it can easily be filtered out.
template.j2
{% for section in new_service %}
{% if section != 'name' %}
[{{ section }}]
{% for option in new_service[section] %}
{{ option.option }}={{ option.value }}
{% endfor %}
{% endif %}
{% endfor %}
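To render it, a minimal task sketch (reusing the USERNAME variable and destination path from the question; the template filename is an assumption):
- name: Render the service file from the template
  template:
    src: template.j2
    dest: "~{{ USERNAME }}/.config/systemd/user/{{ new_service.name }}"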
Back to the original question
If you really want to go down the loop route, it is still possible but will require quite a bit of effort with your current data structure (loop/set_fact/... to finally get a single loopable structure).
If possible, I would change it to the following:
new_service:
  name: test
  sections:
    - section: Unit
      options:
        - option: Description
          value: "Testname"
    - section: Service
      options:
        - option: ExecStart
          value: "/usr/bin/python3 -m http.server 8080"
        - option: WorkingDirectory
          value: /home/test/testservice/html
And you can then directly loop over this structure using the subelements lookup. Note that name (at the top level) is a plain key holding your service name, not a variable, so it should be referenced as new_service.name (fixed in the example below):
- name: "Create new unit section of service file"
ini_file:
path: "~{{ USERNAME }}/.config/systemd/user/{{ new_service.name }}"
section: "{{ item.0.section }}"
option: "{{ item.1.option }}"
value: "{{ item.1.value }}"
loop: "{{ lookup('subelements', new_service.sections, 'options') }}"
You can easily adapt my first example template to this new data structure as well if needed.
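If you want to check what the subelements lookup yields before wiring it into ini_file, here is a quick hedged debug sketch (same data structure as above):
- name: Show the section/option pairs produced by subelements
  debug:
    msg: "[{{ item.0.section }}] {{ item.1.option }}={{ item.1.value }}"
  loop: "{{ lookup('subelements', new_service.sections, 'options') }}"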

Ansible: How to do lookup in a CSV file and populate relevant file contents

So the logic I'm looking for is:
Get the list of filenames from a template directory
Tidy up each filename to match the format of EmployeeID from a CSV file
Check the EmployeeID against a lookup file (CSV file)
If found, get the other reference information and populate the template as employeeID.conf
I have data similar to the below in a lookup file:
EmployeeID,EmployeeName,EmployeeCountry
E123,John,USA
E345,George,UK
...
Set of template filenames
E123.conf.j2
E345.conf.j2
...
Each template contains (e.g. E123.conf.j2):
{
"id": {{EmployeeID}},
"name": {{EmployeeName}},
"country": {{EmployeeCountry}},
"somethingUnique": "hardcodedValueForEmployee"
}
I was able to get the logic to populate the template, but comparing against the filename is not working.
The code I have so far is below, but after registering the values I'm stuck:
- name: "List templates and get filenames from a huge list of templates"
find:
paths: "{{base_dir_template}}"
patterns: "*.j2"
file_type: file
register: emp_usecase_templates
- name: "Derive EmpID from filenames so as to compare it with lookup"
set_fact: emp_usecase_derived_list="{{item.path | basename | replace('.conf.j2', '')}}"
with_items: "{{emp_usecase_templates.files}}"
register: emp_usecase_derived_list_result
- name: "Set Employee variables into template. But not working."
set_fact:
EmployeeName: "{{ lookup_file | selectattr('EmployeeID','match',item) | map(attribute='EmployeeName') | list }}"
with_items: "{{emp_usecase_derived_list_result}}"
The tasks below
- read_csv:
    path: employees.csv
    key: EmployeeID
  register: employees
- name: List templates and get filenames from a huge list of templates
  find:
    paths: "{{ base_dir_template }}"
    patterns: "*.j2"
    file_type: file
  register: emp_usecase_templates
- name: Set Employee variables into template
  template:
    src: "{{ item }}"
    dest: "{{ my_filename }}"
  loop: "{{ emp_usecase_templates.files|map(attribute='path')|list }}"
  vars:
    my_template: "{{ item|basename }}"
    my_filename: "{{ (my_template|splitext).0 }}"
    EmployeeID: "{{ my_template.split('.').0 }}"
    EmployeeName: "{{ employees.dict[EmployeeID]['EmployeeName'] }}"
    EmployeeCountry: "{{ employees.dict[EmployeeID]['EmployeeCountry'] }}"
created the files
shell> cat E123.conf
{
"id": E123,
"name": John,
"country": USA,
"somethingUnique": "hardcodedValueForEmployee"
}
shell> cat E345.conf
{
"id": E345,
"name": George,
"country": UK,
"somethingUnique": "hardcodedValueForEmployee"
}
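Note that the rendered files above are not valid JSON because the values are unquoted. If that matters, one hedged tweak (same variable names assumed) is to pass each value through to_json in the templates:
{
    "id": {{ EmployeeID | to_json }},
    "name": {{ EmployeeName | to_json }},
    "country": {{ EmployeeCountry | to_json }},
    "somethingUnique": "hardcodedValueForEmployee"
}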

JMESPath query expression with value range

I have the below JSON that contains a range. I am trying to get values from the JSON for a specific entry in the range, to be used as an Ansible variable.
For instance, I would like to get the folder value of server002 from the JSON below, as an Ansible variable, using a JSON query filter. Please help.
[
  {
    "hosts": "server001:060",
    "values": {
      "folder": "/my_folder1/",
      "pool": "pool1",
      "dsname": "DS1",
      "network": "nw_1"
    }
  },
  {
    "hosts": "server061:080",
    "values": {
      "folder": "/my_folder2/",
      "pool": "pool2",
      "dsname": "DS2",
      "network": "nw_2"
    }
  }
]
I don't see a server002 in your example, but below is an example search for the second server in your list. (Change 'json_file_path' to the path where your JSON file is located.)
- name: Set search facts
  set_fact:
    host_to_find: 'server061:080'
    json_file_path: <path to json file>
- name: Get data for host
  vars:
    hosts_data: "{{ lookup('file', json_file_path) | from_json }}"
  set_fact:
    host: "{{ hosts_data | selectattr('hosts', 'match', host_to_find) | list | first }}"
- name: Display value of folder var
  debug:
    var: host['values']['folder']
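Since the question asks for a JMESPath query filter specifically, here is a hedged equivalent of the selectattr lookup above using json_query (it requires the jmespath Python library on the controller; the variable names are reused from the tasks above):
- name: Get folder for an exact hosts entry via json_query
  vars:
    hosts_data: "{{ lookup('file', json_file_path) | from_json }}"
    folder_query: "[?hosts=='server061:080'].values.folder | [0]"
  set_fact:
    folder: "{{ hosts_data | json_query(folder_query) }}"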
Below is a working play which should satisfy your use-case:
---
- name: JSON range extraction
  hosts: 127.0.0.1
  connection: local
  gather_facts: no
  tasks:
    - name: Set facts for search
      set_fact:
        host_to_find: '002'
        hosts_json_string: '[{"hosts":"server001:060","values":{"folder":"/my_folder1/","pool":"pool1","dsname":"DS1","network":"nw_1"}},{"hosts":"server061:080","values":{"folder":"/my_folder2/","pool":"pool2","dsname":"DS2","network":"nw_2"}}]'
    - name: Convert json string to facts
      set_fact:
        hosts_data: "{{ hosts_json_string | from_json }}"
    - name: Sort json by hosts and replace the value of hosts to make range extraction easier
      set_fact:
        sorted_hosts: "{{hosts_data|sort(attribute='hosts')|regex_replace('(server(\\d+):(\\d+))','\\2-\\3')}}"
    - name: Find index of host_to_find in sorted_hosts and set_fact
      vars:
        hosts_range: "{{sorted_hosts| json_query('[*].hosts')}}"
      set_fact:
        host_index: "{% for range in hosts_range %}{% set range_split = range.split('-') %}{% if ((host_to_find|int >= range_split[0]|int) and (host_to_find|int <= range_split[1]|int)) %}{{ loop.index0 }}{% endif %}{% endfor %}"
    - name: Get the folder location
      set_fact:
        folder_location: "{{ sorted_hosts[host_index|int].get('values').folder }}"
      when: not host_index == ""
...

How to combine two lists together?

I have two lists:
the_list:
  - { name: foo }
  - { name: bar }
  - { name: baz }
and I use a task which gets some value for its every element:
- name: Get values
  shell:
    magic_command {{ item.name }}
  with_items: the_list
  register: spells
From now on I can use the_list and its corresponding values together:
- name: Use both
  shell:
    do_something {{ item.0.name }} {{ item.1.stdout }}
  with_together:
    - "{{ the_list }}"
    - "{{ spells.results }}"
Everything works fine, but it's uncomfortable to use with_together for many tasks and the code will be hard to read in the future, so I would be more than happy to build a merged_list that I can use in a simple way. Let's say something like this:
merged_list:
  - { name: foo, value: jigsaw }
  - { name: bar, value: crossword }
  - { name: baz, value: monkey }
which completes the puzzle. Can anyone help?
I wrote two Ansible filters to tackle this problem, zip and todict, which are available in my repo at https://github.com/ahes/ansible-filter-plugins
Sample Ansible playbook:
- hosts: localhost
  vars:
    users:
      - { name: user1 }
      - { name: user2 }
  tasks:
    - name: Get uids for users
      command: id -u {{ item.name }}
      register: uid_results
      with_items: users
    - set_fact:
        uids: "{{ uid_results.results | map(attribute='stdout') | todict('uid') }}"
    - set_fact:
        users: "{{ users | zip(uids) }}"
    - name: Show users with uids
      debug: var=users
Result would be:
TASK [Show users with uids] ****************************************************
ok: [localhost] => {
    "users": [
        {
            "name": "user1",
            "uid": "1000"
        },
        {
            "name": "user2",
            "uid": "2000"
        }
    ]
}
It may be overkill, but you could try writing a custom filter plugin.
Each time you iterate the_list you simply want to add a value to that dict {name: 'foo'}, right?
After the update you just want the new dict to have the value, like: {name: 'foo', value: 'jigsaw'}.
The filter plugin for that is pretty simple:
from ansible import errors


def foo(my_list, spells):
    try:
        # merge each registered result dict into the matching dict from my_list
        for i in range(len(my_list)):
            my_list[i].update(spells[i])
        return my_list
    except Exception as e:
        raise errors.AnsibleFilterError(
            'foo plugin error: %s, arguments=%s' % (str(e), (my_list, spells)))


class FilterModule(object):
    def filters(self):
        return {
            'foo': foo
        }
After adding this Python code to a filter_plugins directory next to your playbook, you can easily call the foo plugin, passing spells.results as a parameter:
- name: Get values
  shell:
    magic_command {{ item.name }}
  with_items: the_list
  register: spells
- name: Use both
  shell:
    do_something {{ item.name }} {{ item.stdout }}
  with_items:
    - "{{ the_list | foo(spells.results) }}"
NOTE: The Python code is just an example. Read the Ansible documentation about developing filter plugins.
I think I've found a cleaner, easier way to deal with this kind of thing. Ansible runs all strings through Jinja and then tries to load the result as YAML. Because Jinja only outputs strings, this is what allows Ansible to load a data structure from a variable if there is one.
So any valid YAML in a string is loaded as a data structure -- if you template valid YAML, it will get loaded as data.
Templating correct YAML in the conventional, human-readable form is tricky, but YAML loads all JSON, and JSON is easier because there is no need to worry about whitespace. One bonus: YAML does not care about trailing commas, which makes templating it easier.
In this case here is the playbook from the top answer rewritten to use this method.
- hosts: localhost
  vars:
    users:
      - { name: "user1" }
      - { name: "user2" }
  tasks:
    - name: Get uids for users
      command: id -u {{ item.name }}
      register: uid_results
      loop: "{{ users }}"
    - name: Show users with uids
      debug: var=users_with_uids
      vars:
        users_with_uids: |
          [
          {% for user_dict, uid in users | zip(uids) %}
            {
              "name": {{ user_dict['name'] | to_json }},
              "uid": {{ uid | to_json }},
            },
          {% endfor %}
          ]
        uids: "{{ uid_results.results | map(attribute='stdout') }}"
Notes
The | character tells YAML to load a multi-line string. Instead of putting the variables in quotes, I use the to_json filter, which quotes them and, more importantly, automatically escapes anything in the variable that needs escaping. Also, remember the commas after list and dictionary elements.
The results should be the same:
TASK [Show users with uids] ************************************************************
ok: [localhost] => {
"users_with_uids": [
{
"name": "user1",
"uid": "1000"
},
{
"name": "user2",
"uid": "1001"
}
]
}
One more thing
I like to use the YAML stdout callback, especially for testing this. That way, if my JSON-looking YAML doesn't get loaded, I'll see a JSON-like structure; otherwise it comes back as normal-looking YAML once it has been loaded. You can enable it with an environment variable -- export ANSIBLE_STDOUT_CALLBACK=community.general.yaml.
