convert dictionary keys in playbook - ansible

I have an existing playbook variable dictionary defined like:
vars:
  resource_tags: {
    Name: "some name",
    Service: "some service"
  }
This is used in various calls to tasks in this form. But in another task, I need it in a different format, and rather than have it hard-coded, I was wondering if it could be built in a task.
I need it to look like:
{
  "tag:Name": "some name",
  "tag:Service": "some service"
}
I tried iterating using with_dict and setting a fact with combine:
- set_fact:
    ec2_remote_facts_filter: "{{ ec2_remote_facts_filter | default({}) | combine( { 'tag:'item.name: item.val } ) }}"
  with_dict: "{{ ec2_count_resource_tags }}"
And obviously that doesn't work.
Is this even possible?

If you don't mind a bit of hackery:
- debug: msg="{{ resource_tags | to_json(indent=0) | regex_replace('\n\"','\n\"tag:') }}"
This converts your dict into a JSON-formatted string with indent=0, so each key starts on a new line; the regex then inserts tag: after the first double quote on every line.
Because the result is valid JSON, the Ansible template engine converts it back into a dict as the last step of variable substitution, giving you:
ok: [localhost] => {
    "msg": {
        "tag:Name": "some name",
        "tag:Service": "some service"
    }
}
I suppose there may be some corner cases if there are newlines inside your values, but in general it should be fine.
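If you would rather avoid string manipulation, the loop-and-combine approach from the question also works once the key expression is fixed. A minimal sketch (with_dict exposes item.key and item.value, ~ concatenates strings, and resource_tags / ec2_remote_facts_filter are the names from the question):
- set_fact:
    ec2_remote_facts_filter: "{{ ec2_remote_facts_filter | default({}) | combine({'tag:' ~ item.key: item.value}) }}"
  with_dict: "{{ resource_tags }}"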

Maybe you need a custom lookup plugin in your case.
1) Edit ansible.cfg and uncomment the 'lookup_plugins' key, setting its value to './plugins/lookup'
2) Create a plugin file named 'ec2remote.py' in './plugins/lookup'
3) Use it in your playbook:
- debug:
    msg: "{{ item }}"
  with_ec2remote: "{{ ec2_count_resource_tags }}"
4) Implement your ec2remote.py (the lookup plugins bundled with Ansible provide many examples):
from ansible.plugins.lookup import LookupBase


class LookupModule(LookupBase):
    def run(self, terms, **kwargs):
        # Prefix every key of the passed dictionary with "tag:"
        result = {}
        for k, v in terms.items():
            result["tag:" + k] = v
        return result
Usually I prefer to develop plugins that are easy to use and test, which keeps the playbook understandable.

Related

Ansible - passing dynamic variables into Jinja2 template

I'm having a problem accessing dynamically named Ansible variables in a Jinja2 template. I have a list of tenants like this:
tenants:
  - liamtest1
  - liamtest2
In my playbook I create terraform configuration files for each of these tenants like this:
- name: Generate a .tf file for each tenant in list
  template:
    src: templates/tenant.tf.j2
    dest: "{{ enviro }}/terraform/{{ item }}.tf"
  with_items: "{{ hostvars[inventory_hostname][enviro]['tenants'] }}"
Later in the playbook I use the terraform module to apply my configuration and register the outputs to a variable:
- name: Run terraform
  terraform:
    project_path: "{{ enviro }}/terraform/"
    state: present
  register: tf_result
I've prefixed my terraform outputs with the tenant name so that I don't get duplicates. This bit is all working fine and I can display these outputs with a debug task, for example tenant_domain:
- debug:
    var: tf_result.outputs.{{ item + '_domain' }}.value
  with_items: "{{ hostvars[inventory_hostname][enviro]['tenants'] }}"
Produces this output:
ok: [localhost] => (item=liamtest1) => {
    "ansible_loop_var": "item",
    "item": "liamtest1",
    "tf_result.outputs.liamtest1_domain.value": "liamtest1.mydomain.com"
}
ok: [localhost] => (item=liamtest2) => {
    "ansible_loop_var": "item",
    "item": "liamtest2",
    "tf_result.outputs.liamtest2_domain.value": "liamtest2.mydomain.com"
}
The bit I can't seem to do is generate another set of files (this time javascript files for mongodb) from another Jinja2 template.
I've tried this:
- name: Generate a .js file for each tenant in list
  vars:
    domain: tf_result.outputs.{{ item + '_domain' }}.value
  template:
    src: templates/tenant.js.j2
    dest: "{{ enviro }}/mongodb/{{ item }}.js"
  with_items: "{{ hostvars[inventory_hostname][enviro]['tenants'] }}"
If I reference that in my Jinja2 template using {{ domain }} it ends up with just a string e.g. tf_result.outputs.liamtest1_domain.value in the first file and tf_result.outputs.liamtest2_domain.value in the second file.
I also tried using lookup in the Jinja2 template like this:
{{ lookup('vars', domain) }}
Which gives me:
"AnsibleUndefinedVariable: No variable found with this name: tf_result.outputs.liamtest1_domain.value"
I've also tried some other variations such as:
{{ lookup(hostvars[inventory_hostname], domain) }}
I've tried a few other things as well; I'm not sure they're all worth mentioning since none of them worked, but for example I tried setting the variable inside the Jinja template instead of at the task level, like this:
{% set domain = lookup('vars', 'tf_result.outputs.' + item + '_domain' %}
You simply have a syntax problem in your yaml.
# Wrong
vars:
  domain: tf_result.outputs.{{ item + '_domain' }}.value
This declares a var whose value is a concatenation of (literally) "tf_result.outputs." followed by the value of the current item and "_domain.value". What you want is the actual value contained in that full variable. This is the correct syntax:
# Correct
vars:
  domain: "{{ tf_result.outputs[item + '_domain'].value }}"
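Applied to the task from the question, the fixed version would look roughly like this (a sketch; the paths and the hostvars expression are taken verbatim from the question):
- name: Generate a .js file for each tenant in list
  vars:
    domain: "{{ tf_result.outputs[item + '_domain'].value }}"
  template:
    src: templates/tenant.js.j2
    dest: "{{ enviro }}/mongodb/{{ item }}.js"
  with_items: "{{ hostvars[inventory_hostname][enviro]['tenants'] }}"
Inside tenant.js.j2, {{ domain }} then expands to the actual value, e.g. liamtest1.mydomain.com.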

Ansible loop through a set of dicts, register that list and then print the specific output [duplicate]

This question already has answers here:
Using set_facts and with_items together in Ansible
(7 answers)
Closed 4 years ago.
Here is what I am trying to do.
log onto a networking switch using the built in networking modules and send a command.
register that command as a var
print that var or use that var elsewhere inside of a playbook.
This seems simple right? But here are the issues I am facing.
First of all, I am logging into one device (currently) and then issuing a simple command inside of a loop (this becomes 2 commands, and 2 outputs).
I want to put the outputs from both of the commands into a list.
Next I would like to loop through this list and inspect the returned value from each command (remember this is 2 outputs).
Here is the current play:
- name: Checking for free ports
  nxos_command:
    provider:
      host: "{{inventory_hostname}}"
      username: "{{user.stdout}}"
    commands: "show run interface {{ item.interface }}"
  when: device.ansible_facts.ansible_device_os == 'nxos'
  loop: "{{ device_vars[inventory_hostname] }}"
  register: ports
Then when I use debug, I get a bunch of data:
- debug:
    var: item.stdout
  loop: "{{ports.results}}"
  register: ports_output
I then set the fact and then debug (print) once more:
- name: Setting var
  set_fact:
    port_list: "{{item.stdout}}"
  loop: "{{ports.results}}"

- debug: var=port_list
The problem I am getting is that even though port_list is a list, ansible is only returning one value of that list. This is the last value/command from the initial play. So I am assuming it is being overwritten somewhere.
Here would be my desired output:
ok: [device1] => {
    "port_list": [
        "1st output from the device",
        "2nd output from the device"
    ]
}
But all I can get is this:
ok: [device1] => {
    "port_list": [
        "2nd output from the device"
    ]
}
Here are the vars I am declaring inside of my site.yml:
vars:
  device_vars:
    device1:
      - interface: Ethernet1/1
        description: "some description"
        vlan: 1
      - interface: Ethernet1/2
        description: "some description"
        vlan: 1
  port_list: []
I think my issue here is I am working with a dict of dicts of lists etc. and it doesn't seem that Ansible is very friendly with this.
I've managed to get the data into this format (omitted):
{
    "ports": {
        "results": [
            {
                "stdout": [
                    "1st output from the device"
                ]
            },
            {
                "stdout": [
                    "2nd output from the device"
                ]
            }
        ]
    }
}
I've spent 3 days on this and can't seem to find a solution.
During the loop, set_fact overwrites the variable, so you only see the last value that was set. However, you can also use set_fact to append to the previously assigned value and keep all of the outputs, as follows:
- name: Setting var
  set_fact:
    port_list: "{{ port_list|default([]) + [item.stdout] }}"
  loop: "{{ports.results}}"
The default([]) filter above assigns an initial value to the port_list variable.
Alternatively, since port_list is already initialized to [] in your vars, you can append to it directly:
port_list: "{{ port_list + [item.stdout] }}"
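Note that in the registered data shown in the question each stdout is itself a list, so the tasks above build a list of lists. If a flat list of strings is wanted, one option (a sketch under that assumption) is to concatenate the sublists instead of wrapping them:
- name: Setting var
  set_fact:
    # item.stdout is already a list, so adding it concatenates rather than nests
    port_list: "{{ port_list | default([]) + item.stdout }}"
  loop: "{{ ports.results }}"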

Ansible command loop amend string value and push original value into new array

Hi, I'm using Ansible to create some Elasticsearch indexes using Laravel's artisan commands. The issue is that when I create an index I have to use the PHP class name, whose base name is snake-cased from camel case, e.g. UserConfigurator = user_configurator. In my vars I have the following:
elastic_search:
  version: 6.3.2
  indexes:
    - "App\\ElasticSearch\\Configurators\\UserConfigurator"
    - "App\\ElasticSearch\\Configurators\\ClientConfigurator"
  models:
    - "App\\Models\\User"
    - "App\\Models\\Client"
and in my playbook I have the following:
- name: check if indexes exist
  uri: url='http://localhost:9200/{{ index_name }}'
       method=HEAD
       status_code=200,404

- name: create indexes
  command: php artisan elastic:create-index "{{ item }}"
  args:
    chdir: "{{site_root}}"
  with_items: "{{elastic_search.indexes}}"
The playbook isn't sufficient to do what I want, due to my lack of experience. Any ideas how I might loop over elastic_search.indexes, convert each class basename to snake case, check whether the index exists or not, and push the results into two separate arrays, so that I can use one of the new variables to create the indexes and the other to update them?
Any ideas how I may loop over each elastic_search.indexes and convert the class basename to a snake case
The algorithm I use for snake-casing something is to break the word at an upper-to-lower boundary, lowercase the entire thing, then reassemble the parts, being mindful that the first upper-to-lower transition is special and should not be separated by _.
We can leverage the behavior that when set_fact sees a block of text that looks like JSON, it will coerce that block of text into a data structure (you'll see that behavior on the command line with ansible -e '{"hello": "world"}' too), along with the infinitely handy regex_replace Jinja2 filter provided by Ansible:
- hosts: all
  gather_facts: no
  vars:
    class_names:
      - 'App\ElasticSearch\Configurators\UserConfigurator'
      - 'App\ElasticSearch\Configurators\ClientConfigurator'
  tasks:
    - set_fact:
        index_names: >-
          [
          {% for cn in class_names -%}
          {{ '' if loop.first else ',' }}
          {{ cn |
             regex_replace(".*\\([^\\]+$)", "\1") |
             regex_replace("([A-Z])([a-z])", "\x00\1\2") |
             lower | replace("\x00", "_") | regex_replace("^_", "") |
             to_json }}
          {% endfor %}
          ]
      with_items: '{{ class_names }}'

    - debug: var=index_names verbosity=0

    - debug: var=item verbosity=0
      with_items: '{{ index_names }}'
which produces the correct output:
TASK [debug] *******************************************************************
ok: [localhost] => (item=user_configurator) => {
    "item": "user_configurator"
}
ok: [localhost] => (item=client_configurator) => {
    "item": "client_configurator"
}
and now you can use those indices in a command of your choosing.
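To tie this back to the original playbook, the generated index_names can drive the existence check while the original class names still drive the artisan command. A sketch, assuming index_names was built from elastic_search.indexes in the same order (site_root comes from the question):
- name: check if indexes exist
  uri:
    url: "http://localhost:9200/{{ item }}"
    method: HEAD
    status_code: [200, 404]
  register: index_checks
  with_items: "{{ index_names }}"

- name: create missing indexes
  command: php artisan elastic:create-index "{{ item.0 }}"
  args:
    chdir: "{{ site_root }}"
  # item.1.status is the HTTP status returned by the uri check above
  when: item.1.status == 404
  with_together:
    - "{{ elastic_search.indexes }}"
    - "{{ index_checks.results }}"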

How to use a dictionary of registered ansible variables in vars?

I want to pass multiple variables to a task using vars. Currently, I am doing it like below
vars:
  var1_name: "var1_value"
  var2_name: "var2_value"
As the number of variables can grow, I'd prefer to pass a dictionary of variables to the task using vars. I have constructed a dictionary of variables like below:
- name: set fact
  hosts: localhost
  tasks:
    - set_fact:
        variables: "{{ variables|default({}) | combine( {item.variable: item.value} ) }}"
      with_items:
        - variable: var1_name
          value: "var1_value"
        - variable: var2_name
          value: "var2_value"
Dictionary looks something like this:
"variables": {
"var1_name": "var1_value",
"var2_name": "var2_value",
}
Now, I want to make variables in this dictionary available to roles executing on other hosts.
But when I tried to pass the dictionary to vars like below
vars: "{{ variables }}"
Ansible throws the error:
ERROR! Vars in a Play must be specified as a dictionary, or a list of dictionaries
How to pass a dictionary variable in vars?
After doing some searching through the Ansible source code, it looks like this is an issue even the developers of Ansible face. In some integration tests, there are specific tests that are commented out because of this same error.
https://github.com/ansible/ansible/blob/devel/test/integration/targets/include_import/role/test_include_role.yml#L96
## FIXME Currently failing with
## ERROR! Vars in a IncludeRole must be specified as a dictionary, or a list of dictionaries
# - name: Pass all variables in a variable to role
#   include_role:
#     name: role1
#     tasks_from: vartest.yml
#   vars: "{{ role_vars }}"
I also found out that this is the underlying function that is being called to include the variables:
https://github.com/ansible/ansible/blob/devel/lib/ansible/playbook/base.py#L681
def _load_vars(self, attr, ds):
    '''
    Vars in a play can be specified either as a dictionary directly, or
    as a list of dictionaries. If the later, this method will turn the
    list into a single dictionary.
    '''

    def _validate_variable_keys(ds):
        for key in ds:
            if not isidentifier(key):
                raise TypeError("'%s' is not a valid variable name" % key)

    try:
        if isinstance(ds, dict):
            _validate_variable_keys(ds)
            return combine_vars(self.vars, ds)
        elif isinstance(ds, list):
            all_vars = self.vars
            for item in ds:
                if not isinstance(item, dict):
                    raise ValueError
                _validate_variable_keys(item)
                all_vars = combine_vars(all_vars, item)
            return all_vars
        elif ds is None:
            return {}
        else:
            raise ValueError
    except ValueError as e:
        raise AnsibleParserError("Vars in a %s must be specified as a dictionary, or a list of dictionaries" % self.__class__.__name__,
                                 obj=ds, orig_exc=e)
    except TypeError as e:
        raise AnsibleParserError("Invalid variable name in vars specified for %s: %s" % (self.__class__.__name__, e), obj=ds, orig_exc=e)
It seems that since "{{ }}" is actually just a YAML string, Ansible doesn't recognize it as a dict; the vars attribute isn't passed through the Jinja2 engine but is instead evaluated for what it literally is.
The only way to pass YAML objects around would be to use anchors; however, this requires the object to be defined in whole rather than dynamically.
var: &_anchored_var
  attr1: "test"
  attr2: "bar"

vars:
  <<: *_anchored_var
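For illustration, a minimal sketch of the anchor workaround in a single playbook file (role1 is just a placeholder name, matching the commented-out test above, and the variable names are arbitrary):
- hosts: localhost
  vars:
    common_vars: &common_vars
      var1_name: "var1_value"
      var2_name: "var2_value"
  tasks:
    - include_role:
        name: role1
      vars:
        # merge key copies the anchored mapping into this vars dict
        <<: *common_vars
Because YAML anchors are resolved at parse time, this only works when the anchor and its use live in the same file.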
I recommend using a structured way of managing variables like:
File myvars1.yml
myvars:
  var1_name: "var1_value"
  var2_name: "var2_value"
Then read the variables like this:
- name: Read all variables
  block:
    - name: Get All Variables
      stat:
        path: "{{item}}"
      with_fileglob:
        - "/myansiblehome/vars/common/myvars1.yml"
        - "/myansiblehome/vars/common/myvars2.yml"
      register: _variables_stat

    - name: Include Variables when found
      include_vars: "{{item.stat.path}}"
      when: item.stat.exists
      with_items: "{{_variables_stat.results}}"
      no_log: true
  delegate_to: localhost
  become: false
After that, use them like this:
- name: My Running Module
  mymodule:
    myaction1: "{{ myvars.var1_name }}"
    myaction2: "{{ myvars.var2_name }}"
Hope it helps

How to combine two lists together?

I have two lists:
the_list:
  - { name: foo }
  - { name: bar }
  - { name: baz }
and I use a task which gets some value for each of its elements:
- name: Get values
  shell:
    magic_command {{ item.name }}
  with_items: the_list
  register: spells
from now on I can use the_list and its corresponding values together:
- name: Use both
  shell:
    do_something {{ item.0.name }} {{ item.1.stdout }}
  with_together:
    - "{{ the_list }}"
    - "{{ spells.results }}"
All works fine, but it's awkward to use with_together in many tasks and that code will be hard to read in the future, so I would be more than happy to build a merged_list from this that I can use in a simple way. Let's say something like this:
merged_list:
  - { name: foo, value: jigsaw }
  - { name: bar, value: crossword }
  - { name: baz, value: monkey }
which completes the puzzle. Can anyone help?
I wrote two ansible filters to tackle this problem: zip and todict which are available in my repo at https://github.com/ahes/ansible-filter-plugins
Sample ansible playbook:
- hosts: localhost
  vars:
    users:
      - { name: user1 }
      - { name: user2 }
  tasks:
    - name: Get uids for users
      command: id -u {{ item.name }}
      register: uid_results
      with_items: users

    - set_fact:
        uids: "{{ uid_results.results | map(attribute='stdout') | todict('uid') }}"

    - set_fact:
        users: "{{ users | zip(uids) }}"

    - name: Show users with uids
      debug: var=users
Result would be:
TASK [Show users with uids] ****************************************************
ok: [localhost] => {
    "users": [
        {
            "name": "user1",
            "uid": "1000"
        },
        {
            "name": "user2",
            "uid": "2000"
        }
    ]
}
It may be overkill, but you could try writing a custom filter plugin.
Each time you iterate the_list you simply want to add a value to that dict {name: 'foo'}, right?
After the update you just want the new dict to have the value, like: {name: 'foo', value: 'jigsaw'}
The filter plugin for that is pretty simple:
from ansible import errors


def foo(my_list, spells):
    # Merge each element of 'spells' into the corresponding dict of 'my_list'
    try:
        for i in range(len(my_list)):
            my_list[i].update(spells[i])
        return my_list
    except Exception as e:
        raise errors.AnsibleFilterError(
            'Foo plugin error: %s, arguments=%s' % (str(e), (my_list, spells)))


class FilterModule(object):
    def filters(self):
        return {
            'foo': foo
        }
After adding this Python code to your filter plugins directory, you can easily call the foo filter, passing the spells results as a parameter:
- name: Get values
  shell:
    magic_command {{ item.name }}
  with_items: the_list
  register: spells

- name: Use both
  shell:
    do_something {{ item.name }} {{ item.value }}
  with_items:
    - "{{ the_list | foo(spells.results) }}"
NOTE: The Python code is just an example. Read the Ansible documentation about developing filter plugins.
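If you would rather avoid a custom plugin, a similar merge can be sketched with the built-in combine filter and with_together, assuming the registered results expose stdout as in the question (merged_list and the task name are just illustrative):
- name: Build merged_list from the_list and the registered results
  set_fact:
    # pair each original dict with its result and add the result's stdout as 'value'
    merged_list: "{{ merged_list | default([]) + [item.0 | combine({'value': item.1.stdout})] }}"
  with_together:
    - "{{ the_list }}"
    - "{{ spells.results }}"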
I think I've found a cleaner, easier way to deal with this kind of thing. Ansible runs all strings through Jinja and then tries to load the result as YAML. Because Jinja only outputs strings, this is what allows a data structure to be loaded from a variable if there is one.
So any valid YAML in a string is loaded as a data structure -- if you template valid YAML, it will get loaded as data.
Trying to template correct YAML in the conventional, human-readable form is tricky. But YAML loads all JSON, and JSON is easier because there is no need to worry about whitespace. One bonus: since the string is loaded as YAML rather than strict JSON, trailing commas don't matter, which makes templating it easier.
In this case, here is the playbook from the top answer rewritten to use this method.
- hosts: localhost
  vars:
    users:
      - { name: "user1" }
      - { name: "user2" }
  tasks:
    - name: Get uids for users
      command: id -u {{ item.name }}
      register: uid_results
      loop: "{{ users }}"

    - name: Show users with uids
      debug: var=users_with_uids
      vars:
        users_with_uids: |
          [
          {% for user_dict, uid in users | zip(uids) %}
            {
              "name": {{ user_dict['name'] | to_json }},
              "uid": {{ uid | to_json }},
            },
          {% endfor %}
          ]
        uids: "{{ uid_results.results | map(attribute='stdout') }}"
Notes
The | character tells yaml to load a multi-line string. Instead of putting the variables in quotes I use the to_json filter which will quote it and, more importantly, automatically escape anything in the variable that needs escaping. Also, remember commas after list or dictionary elements.
The results should be the same:
TASK [Show users with uids] ************************************************************
ok: [localhost] => {
    "users_with_uids": [
        {
            "name": "user1",
            "uid": "1000"
        },
        {
            "name": "user2",
            "uid": "1001"
        }
    ]
}
One more thing
I like to use the YAML stdout callback, especially for testing this. That way, if my JSON-looking YAML doesn't get loaded, I'll see a JSON-like structure; otherwise it comes back as normal-looking YAML once it has been loaded. You can enable it with an environment variable -- export ANSIBLE_STDOUT_CALLBACK=community.general.yaml.
