I'm trying to create a playbook that's going to create multiple files in a number of different directories. The input variables are env & region and both are lists. I've then combined them to display all the different combinations.
What I want is to then loop through all the different folder paths by accessing the values of each individual list, and this is where I'm stuck.
---
- hosts: localhost
  gather_facts: false
  become: no
  vars:
    application_name: test
    region: [us-east-1, us-east-2]
    env: [test, prod]
  tasks:
    - name: allregions_and_enviroments
      set_fact:
        all_comb: "{{ region|product(env) }}"
    - debug:
        var: all_comb
    - name: Create new TF file
      template:
        src: template.j2
        dest: /Users/user/Documents/ansibletemplate/{{item[0]}}/{{item[1]}}/ab-{{application_name}}.tf
      with_nested: all_comb
The output of the merged list is this:
ok: [localhost] => {
    "all_comb": [
        [
            "us-east-1",
            "test"
        ],
        [
            "us-east-1",
            "prod"
        ],
        [
            "us-east-2",
            "test"
        ],
        [
            "us-east-2",
            "prod"
        ]
    ]
}
For the directory path, I want it to loop through the region and then the environment, create a file in ..../us-east-1/test/abc.tf, and then move on to the next combination.
Any assistance will be greatly appreciated.
Thanks!
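A possible direction, as an untested sketch rather than a verified answer: loop directly over the combined pairs (or build them inline with product), since with_nested is meant to combine several lists and a bare variable name there is deprecated; note also that the template module assumes the destination directories already exist.

- name: Create new TF file
  template:
    src: template.j2
    dest: "/Users/user/Documents/ansibletemplate/{{ item[0] }}/{{ item[1] }}/ab-{{ application_name }}.tf"
  loop: "{{ region | product(env) | list }}"   # each item is a [region, env] pair, e.g. ['us-east-1', 'test']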
I have this situation:
A play that runs on localhost uses include_tasks to create, on the fly with add_host, two sub-groups, each one extracting only one host from the two groups that are present in the inventory file.
Another play in the same YAML file uses such a sub-group as its hosts (hosts: sub-group).
This is the inventory file:
all:
  children:
    group_one:
      hosts:
        hostA01:
          ansible_host: host1a
        hostA02:
          ansible_host: host2a
        hostA03:
          ansible_host: host3a
      vars:
        cluster: hosta
        vip: 192.168.10.10
        home: /cluster/hosta
        user: usr_hosta
        pass: pass_hosta
    group_two:
      hosts:
        hostB01:
          ansible_host: host1b
        hostB02:
          ansible_host: host2b
        hostB03:
          ansible_host: host3b
      vars:
        cluster: hostb
        vip: 192.168.10.20
        home: /cluster/hostb
        user: usr_hostb
        pass: pass_hostb
    other groups...
I have created the sub-groups with add_host for each group in the inventory file. The sub-group names add the prefix "sub-" to the original inventory group names, like sub-(one/two/etc.).
In hostvars I get this situation:
"groups": {
    "all": [
        "hostA01",
        "hostA02",
        "hostA03",
        "hostB01",
        "hostB02",
        "hostB03",
        "other_host_from_other_groups"
    ],
    "group_one": [
        "hostA01",
        "hostA02",
        "hostA03"
    ],
    "group_two": [
        "hostB01",
        "hostB02",
        "hostB03"
    ],
    "other_group": [
        "other_host",
        .....
    ],
    "sub-group_one": [
        "hostA01"
    ],
    "sub-group_two": [
        "hostB01"
    ],
    "sub-other_group": [
        "other_first_host"
    ],
    "ungrouped": []
},
"vars_for_group": {
    "group_one": {
        "cluster": "hosta",
        "vip": "192.168.10.10",
        "home": "/cluster/hosta",
        "user": "usr_hosta",
        "pass": "pass_hosta",
        "ansible_host": "host1a",
        "host": "hostA01"
    },
    "group_two": {
        "cluster": "hostb",
        "vip": "192.168.10.20",
        "home": "/cluster/hostb",
        "user": "usr_hostb",
        "pass": "pass_hostb",
        "ansible_host": "host1b",
        "host": "hostB01"
    },
    "other_groups": {
        .......
    }
},
"inventory_hostname": "127.0.0.1",
"inventory_hostname_short": "127",
"module_setup": true,
"playbook_dir": "/home/foo/playbook",
"choice": "'sub-two'"
}
}
The "choice" variable in the last line of hostvars derives from an environment variable set by another tool and indicates the group on which the end user wants to operate (one, two, ..., all).
Now, my playbook is:
---
- hosts: 127.0.0.1
  become: yes
  gather_facts: yes
  remote_user: root
  tasks:
    - name: include news_groups
      ansible.builtin.include_tasks:
        newsgroups.yaml
      vars:
        choice: "{{ lookup('env','CHOICE') }}"

- hosts: "{{ hostvars['localhost']['groups']['{{ hostvars['localhost']['choice'] }}'] }}"
  name: second_play
  become: yes
  gather_facts: no
  remote_user: root
  tasks:

- hosts: all
  name: other play
  gather_facts: no
  vars:
    other_vars: ...
  tasks:
    .....
Unfortunately this line doesn't work:
- hosts: "{{ hostvars['localhost']['groups']['{{ hostvars['localhost']['choice'] }}'] }}"
I have made several attempts with many different configurations and different syntax (without {{...}}, with or without "double quotes" and 'single quotes'), but it always seems syntactically wrong or still cannot find the group indicated by the variable. Or could it also be that the approach is wrong?
Any suggestions?
Thanks in advance
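If it helps, here is an untested sketch of one possible direction. It rests on two assumptions that go beyond what is shown above: Jinja2 expressions cannot be nested, so the inner {{ ... }} has to go, and choice has to be stored with set_fact on the first play's host (a vars: entry on the include_tasks task does not persist into later plays).

- hosts: 127.0.0.1
  become: yes
  gather_facts: yes
  remote_user: root
  tasks:
    - name: keep the chosen group name as a fact for later plays
      set_fact:
        choice: "{{ lookup('env', 'CHOICE') }}"
    - name: include news_groups
      ansible.builtin.include_tasks: newsgroups.yaml

# 'choice' already holds a group name, so it can be used as the hosts pattern directly;
# the hostvars key must match the first play's inventory name (here 127.0.0.1).
- hosts: "{{ hostvars['127.0.0.1']['choice'] }}"
  name: second_play
  become: yes
  gather_facts: no
  remote_user: root
  tasks:
    - debug:
        msg: "running on {{ inventory_hostname }}"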
I created an Ansible playbook, and I want a task executed only if a json_query returns elements.
The JSON query has to return an array, built by checking whether any element of each inner array exists in another array.
I already tried using json_query (JMESPath) with simplified queries.
I read the JMESPath documentation and tried the website tutorial.
I also read the Ansible documentation and tried to find an example.
I think the right solution is to use the contains built-in function together with map, but the map examples in the documentation are not clear to me.
Example:
array_of_array_to_check: [
    {
        hosts: ["host1", "host2"],
        name: "name1"
    },
    {
        hosts: ["host3", "host1"],
        name: "name2"
    },
    {
        hosts: ["host4", "host5"],
        name: "name3"
    }
]
array_parameters: ["host1", "host18"]
Expected:
result: [
    {
        hosts: ["host1", "host2"],
        name: "name1"
    },
    {
        hosts: ["host3", "host1"],
        name: "name2"
    }
]
here is a way to do it:
---
- hosts: localhost
  gather_facts: false
  vars:
    array_of_array_to_check:
      - hosts:
          - host1
          - host2
        name: name1
      - hosts:
          - host3
          - host1
        name: name2
      - hosts:
          - host4
          - host5
        name: name3
    array_parameters:
      - host1
      - host18
  tasks:
    - name: parse array and add to results
      set_fact:
        results_array: "{{ results_array | default([]) + [item] }}"
      when: item.hosts | intersect(array_parameters) | length > 0
      with_items:
        - "{{ array_of_array_to_check }}"
    - debug:
        var: results_array
basically you parse the array_of_array_to_check list, and if you find common elements between an item's hosts list and array_parameters, then you add the whole "item" to results_array.
the intersect filter returns the "unique list of all items in both", so if its length is more than 0, there are matches.
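For reference, a quick illustration of that intersect behaviour with the literal values from the example above (not part of the original answer):

- debug:
    msg: "{{ ['host1', 'host2'] | intersect(['host1', 'host18']) }}"
  # prints ["host1"]; the length is 1 > 0, so the first item would be kept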
hope it helps.
I'm using Ansible to install packages on a new deployment. I have a pre-defined list of dicts in a variable.
I want to open an interface to update this list using Jenkins.
My list looks like this:
package_list: [
    {'name': 'python-devel', 'apt': 'python-dev'},
    {'name': 'python-pip'},
    {'name': 'postgresql-devel'},
    ...
]
The way I communicate Jenkins input to Ansible is using environment variables. I can pass a list of additional packages to be installed and read it as part of my Ansible configuration.
Question is: how do I convert a list of strings to a list of dictionaries that matches the structure of my package_list?
For example:
ENV:
PACKAGES=gcc,vim,ntp
ANSIBLE:
additional_packages = [
    {'name': 'gcc'},
    {'name': 'vim'},
    {'name': 'ntp'}
]
Is it even possible?
I believe this playbook will get you where you want. It assumes you have the env variable PACKAGES=gcc,vim,ntp set.
It converts the string variable to a list (split on ,), and then another loop converts it to a list of dictionaries:
playbook:
- hosts: localhost
  gather_facts: false
  tasks:
    - name: pick up env variable, convert to list
      set_fact:
        PACKAGES: "{{ lookup('env', 'PACKAGES').split(',') }}"
    - name: create dict list variable
      set_fact:
        PACKAGES_DICT: "{{ PACKAGES_DICT|default([]) + [{'name': item}] }}"
      with_items:
        - "{{ PACKAGES }}"
    - name: print results
      debug:
        var: PACKAGES_DICT
results:
TASK [print results] ************************************************************
ok: [localhost] => {
    "PACKAGES_DICT": [
        {
            "name": "gcc"
        },
        {
            "name": "vim"
        },
        {
            "name": "ntp"
        }
    ]
}
hope this helps
EDIT
Refining the code: removing the first set_fact task and declaring the PACKAGES variable in the vars section:
- hosts: localhost
  gather_facts: false
  vars:
    PACKAGES: "{{ lookup('env', 'PACKAGES').split(',') }}"
  tasks:
    - name: create dict list variable
      set_fact:
        PACKAGES_DICT: "{{ PACKAGES_DICT|default([]) + [{'name': item}] }}"
      with_items:
        - "{{ PACKAGES }}"
    - name: print results
      debug:
        var: PACKAGES_DICT
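As a loop-free variant (an untested sketch on my part; it assumes the community.general collection is installed, which the question does not mention), the same list of dicts can be built with the dict_kv filter and map:

- hosts: localhost
  gather_facts: false
  vars:
    PACKAGES: "{{ lookup('env', 'PACKAGES').split(',') }}"
  tasks:
    - name: build the dict list without a set_fact loop
      debug:
        # community.general.dict_kv wraps each string in a dict under the given key
        msg: "{{ PACKAGES | map('community.general.dict_kv', 'name') | list }}"
        # -> [{'name': 'gcc'}, {'name': 'vim'}, {'name': 'ntp'}]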
Using Ansible, the list should be written this way:
package_list:
  - name: "gcc"
  - name: "vim"
  - name: "ntp"
so to get that list from the string you can do it this way:
vars:
  package_list: "{{ packages.split(',') | list }}"
When we check hostvars with:
- name: Display all variables/facts known for a host
  debug: var=hostvars[inventory_hostname]
We get:
ok: [default] => {
    "hostvars[inventory_hostname]": {
        "admin_email": "admin#surfer190.com",
        "admin_user": "root",
        "ansible_all_ipv4_addresses": [
            "192.168.35.19",
            "10.0.2.15"
        ],
        ...
How would I specify the first element of the "ansible_all_ipv4_addresses" list?
Use dot notation
"{{ ansible_all_ipv4_addresses.0 }}"
This should work just like it would in Python, meaning you can access the keys with quotes and the index with an integer.
- set_fact:
    ip_address_1: "{{ hostvars[inventory_hostname]['ansible_all_ipv4_addresses'][0] }}"
    ip_address_2: "{{ hostvars[inventory_hostname]['ansible_all_ipv4_addresses'][1] }}"
- name: Display 1st ipaddress
  debug:
    var: ip_address_1
- name: Display 2nd ipaddress
  debug:
    var: ip_address_2
I had this same challenge when trying to parse the result of a command in Ansible.
So the result was:
{
    "changed": true,
    "instance_ids": [
        "i-0a243240353e84829"
    ],
    "instances": [
        {
            "id": "i-0a243240353e84829",
            "state": "running",
            "hypervisor": "xen",
            "tags": {
                "Backup": "FES",
                "Department": "Research"
            },
            "tenancy": "default"
        }
    ],
    "tagged_instances": [],
    "_ansible_no_log": false
}
And I wanted to parse the value of state out of the result register in the Ansible playbook.
Here's how I did it:
Since the result is a hash containing an array of hashes (that is, state is inside the index-0 hash of the instances array), I modified my playbook to look this way:
---
- name: Manage AWS EC2 instance
  hosts: localhost
  connection: local
  # gather_facts: false
  tasks:
    - name: AWS EC2 Instance Restart
      ec2:
        instance_ids: '{{ instance_id }}'
        region: '{{ aws_region }}'
        state: restarted
        wait: True
      register: result
    - name: Show result of task
      debug:
        var: result.instances.0.state
I saved the value of the command using register in a variable called result and then got the value of state in the variable using:
result.instances.0.state
This time when the command ran, I got the result as:
TASK [Show result of task] *****************************************************
ok: [localhost] => {
    "result.instances.0.state": "running"
}
That's all.
I hope this helps
I need to read from a config file which contains a list. The list needs to be passed to a role as an argument.
---
- name: run command on localhost
  hosts: localhost
  tasks:
    - name: read variables from file
      shell: cat {{ conf1/TMP.txt }}
      register: contents

- name: role to trigger the run script process
  hosts: otherhost
  roles:
    - { role: run_script, applist: "{{ contents }}" }
The content of the conf1/TMP.txt file is as follows:
[ 'a', 'b', 'c', 'd' ]
The above mentioned code segment is not working but the following code segment works:
---
- name: main yml file to trigger the whole process
  hosts: otherhost
  roles:
    - { role: run_script, applist: [ 'a', 'b', 'c', 'd' ] }
Try using a lookup filter instead of a shell command. Example below..
---
- name: run command on localhost
  hosts: localhost
  tasks:
    - set_fact:
        contents: "{{ lookup('file', 'tmp.txt') }}"
    - debug: var=contents

- name: role to trigger the run script process
  hosts: localhost
  roles:
    - { role: foo, applist: "{{ contents }}" }
The output of debug should look like this
ok: [localhost] => {
    "foo": [
        "1",
        "2",
        "3",
        "4"
    ]
}
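One caveat worth adding (my own note, not part of the answer above): lookup('file') returns the file contents as a string, so if the role needs an actual list, the string may still have to be parsed, for example with from_yaml:

- set_fact:
    contents: "{{ lookup('file', 'tmp.txt') | from_yaml }}"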