Ansible notify restart when a file changed

I am trying to restart a Docker container for specific items in the inventory (aa, bb) when a file is changed by the copy module. If copy changes the file, it should trigger a handler that restarts the specific item. The copy works, but the handler is never triggered.
instances.yml

instances:
  aa:
    name: "aa"
  bb:
    name: "bb"

playbook.yml

- name: Copy config
  copy:
    src: "roles/{{ item.value.name }}.yaml"
    dest: "etc/{{ item.value.name }}.yaml"
  with_dict: "{{ instances1 }}"
  register: template
  notify:
    - my_handler

handlers/main.yml

- name: my_handler
  shell: "notify {{ item.key }}"
  with_items: "{{ template.results | selectattr('changed', 'equalto', true) | map(attribute='item') | list }}"

A functional sample:

/my_project
|  playbook.yml
|--/roles
|  |--/test
|     |--/tasks
|     |    main.yml
|     |
|     |--/handlers
|     |    main.yml
|     |
|     |--/files
|     |    aa.yaml
|     |    bb.yaml
playbook.yml

- hosts: localhost
  gather_facts: false
  roles:
    - test

In folder my_project/roles/test/tasks: main.yml

- name: Copy config
  copy:
    src: "{{ item.value.name }}.yaml"
    dest: "zzz{{ item.value.name }}.txt"
  with_dict: "{{ instances }}"
  register: template
  vars:
    instances:
      aa:
        name: "aa"
      bb:
        name: "bb"
  notify:
    - my_handler

- name: Force all notified handlers to run at this point, not waiting for normal sync points
  meta: flush_handlers
In folder my_project/roles/test/handlers: main.yml

- name: my_handler
  debug:
    msg: notify called {{ item }}
  with_items: "{{ template.results | selectattr('changed', 'equalto', true) | map(attribute='item') | list }}"

Result:

RUNNING HANDLER [test : my_handler]
ok: [localhost] => (item={'key': 'aa', 'value': {'name': 'aa'}}) => {
    "msg": "notify called {'key': 'aa', 'value': {'name': 'aa'}}"
}
ok: [localhost] => (item={'key': 'bb', 'value': {'name': 'bb'}}) => {
    "msg": "notify called {'key': 'bb', 'value': {'name': 'bb'}}"
}
To execute the handlers immediately rather than at the end of the play, add the task meta: flush_handlers.
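The with_items expression in the handler does the filtering work: it keeps only the loop results whose changed flag is true and extracts the original item from each. In plain Python terms, the chain selectattr('changed', 'equalto', true) | map(attribute='item') | list behaves like a list comprehension (the data here is hypothetical, shaped like a registered template.results):

```python
# Hypothetical registered data, shaped like template.results above
results = [
    {"changed": True, "item": {"key": "aa", "value": {"name": "aa"}}},
    {"changed": False, "item": {"key": "bb", "value": {"name": "bb"}}},
]

# Equivalent of: selectattr('changed', 'equalto', true) | map(attribute='item') | list
changed_items = [r["item"] for r in results if r["changed"] is True]
print(changed_items)  # only the 'aa' entry remains
```

So the handler only loops over items whose copy actually changed the file.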

JMESPathError in json_query filter: Unknown function: regex_search()

Here is my playbook:

- hosts: localhost
  vars:
    {
      "result": [
        {
          "_ref": "vlan/ZG5zLnZsYW4kLmNvbS5pbmZvYmxveC5kbnMudmxhbl92aWV3JElORlJBTEFCLjEuNDA5NC4xMQ:LAB1/test1/11",
          "id": 11,
          "name": "test1",
          "parent": {
            "_ref": "vlanview/ZG5zLnZsYW5fdmlldyRJTkZSQUxBQi4xLjQwOTQ:LAB1/1/4094"
          }
        },
        {
          "_ref": "vlan/ZG5zLnZsYW4kLmNvbS5pbmZvYmxveC5kbnMudmxhbl92aWV3JFNDTEFCLU9PQi4xLjQwOTQuMTE:LAB2/test1/11",
          "id": 11,
          "name": "test1",
          "parent": {
            "_ref": "vlanview/ZG5zLnZsYW5fdmlldyRTQ0xBQi1PT0IuMS40MDk0:LAB2/1/4094"
          }
        }
      ]
    }
  tasks:
    - set_fact:
        var1: "{{ result | json_query(jquery) }}"
      vars:
        jquery: "[].{vlan_view: _ref|regex_search('(?<=:)[^/]*'), vlan_id: id, vlan_name: name}"
    - debug: msg={{ var1 }}
Which errors with:

fatal: [localhost]: FAILED! => {"msg": "JMESPathError in json_query filter plugin:\nUnknown function: regex_search()"}

My desired output:

[
  {
    "vlan_view": "LAB1",
    "vlan_id": 11,
    "vlan_name": "test1"
  },
  {
    "vlan_id": 11,
    "vlan_name": "test1",
    "vlan_view": "LAB2"
  }
]
You cannot do regex operations in JMESPath, as per this issue on their tracker.
And you surely cannot use a Jinja filter as a JMESPath function, as the error is pointing out.
So, you will have to achieve this with Jinja filters and Ansible alone.
And with a loop, it is definitely possible to create a list corresponding to your desired output:
- set_fact:
    var1: "{{ var1 | default([]) + [_vlan] }}"
  loop: "{{ result }}"
  loop_control:
    label: "{{ item.id }}"
  vars:
    _vlan:
      vlan_id: "{{ item.id }}"
      vlan_name: "{{ item.name }}"
      vlan_view: >-
        {{
          item.parent._ref
          | regex_search(':(.*?)\/', '\1')
          | first
        }}

Given the two tasks:

- set_fact:
    var1: "{{ var1 | default([]) + [_vlan] }}"
  loop: "{{ result }}"
  loop_control:
    label: "{{ item.id }}"
  vars:
    _vlan:
      vlan_id: "{{ item.id }}"
      vlan_name: "{{ item.name }}"
      vlan_view: >-
        {{
          item.parent._ref
          | regex_search(':(.*?)\/', '\1')
          | first
        }}

- debug:
    var: var1

This will yield:

TASK [set_fact] ***************************************************************
ok: [localhost] => (item=11)
ok: [localhost] => (item=11)

TASK [debug] ******************************************************************
ok: [localhost] =>
    var1:
    - vlan_id: '11'
      vlan_name: test1
      vlan_view: LAB1
    - vlan_id: '11'
      vlan_name: test1
      vlan_view: LAB2
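The regex_search(':(.*?)\/', '\1') call maps directly onto Python's re module: find the text between the first colon and the following slash, and take the first capture group. A quick sketch, using a shortened, made-up _ref value:

```python
import re

# Shortened, made-up _ref value for illustration
ref = "vlanview/ZG5zLnZsYW5fdmlldw:LAB1/1/4094"

# regex_search(':(.*?)\/', '\1') returns the capture group; "| first"
# in the playbook takes the first of the returned groups
view = re.search(r":(.*?)/", ref).group(1)
print(view)  # LAB1
```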
Get the attribute vlan_view:

vlan_view: "{{ result | map(attribute='_ref') |
               map('split', ':') | map('last') |
               map('split', '/') | map('first') |
               map('community.general.dict_kv', 'vlan_view') |
               list }}"

gives

vlan_view:
  - vlan_view: LAB1
  - vlan_view: LAB2

Then use json_query to get the other attributes and combine the dictionaries:

var1: "{{ result | json_query('[].{vlan_id: id, vlan_name: name}') |
          zip(vlan_view) | map('combine') | list }}"

gives the expected result

var1:
  - vlan_id: 11
    vlan_name: test1
    vlan_view: LAB1
  - vlan_id: 11
    vlan_name: test1
    vlan_view: LAB2
Example of a complete playbook (simplified for testing):

- hosts: localhost
  vars:
    result:
      - _ref: vlan/ZG5z...4xMQ:LAB1/test1/11
        id: 11
        name: test1
        parent:
          _ref: vlanview/ZG5zL...wOTQ:LAB1/1/4094
      - _ref: vlan/ZG5zL...uMTE:LAB2/test1/11
        id: 11
        name: test1
        parent:
          _ref: vlanview/ZG5zL...MDk0:LAB2/1/4094
    vlan_view: "{{ result | map(attribute='_ref') |
                   map('split', ':') | map('last') |
                   map('split', '/') | map('first') |
                   map('community.general.dict_kv', 'vlan_view') |
                   list }}"
    var1: "{{ result | json_query('[].{vlan_id: id, vlan_name: name}') |
              zip(vlan_view) | map('combine') | list }}"
  tasks:
    - debug:
        var: vlan_view
    - debug:
        var: var1
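The split/zip/combine pipeline above can be traced in plain Python, using abbreviated _ref strings in place of the real ones:

```python
# Abbreviated test data mirroring the playbook's "result" variable
result = [
    {"_ref": "vlan/ZG5z:LAB1/test1/11", "id": 11, "name": "test1"},
    {"_ref": "vlan/ZG5z:LAB2/test1/11", "id": 11, "name": "test1"},
]

# map('split', ':') | map('last') | map('split', '/') | map('first')
# | map('community.general.dict_kv', 'vlan_view')
vlan_view = [{"vlan_view": r["_ref"].split(":")[-1].split("/")[0]} for r in result]

# json_query('[].{vlan_id: id, vlan_name: name}') | zip(vlan_view) | map('combine')
var1 = [{**{"vlan_id": r["id"], "vlan_name": r["name"]}, **v}
        for r, v in zip(result, vlan_view)]
print(var1)
```

Each step of the filter chain corresponds to one string operation per element, and combine is just a dictionary merge.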

Nested loop with user and folder in Ansible

I have the following task:
- name: Create required folders.
  become: true
  ansible.builtin.file:
    owner: "{{ item.key }}"
    group: ftp
    mode: '0755'
    path: '/data/{{ item.key }}/in'
    state: directory
  loop: "{{ query('dict', ftp) | list }}"
  when: "'state' not in item.value or item.value.state == 'present'"

And the following host variables with different users:

ftp:
  test:
    ssh_public_key: "XXXX"
    password: "XXX"
    home: /data/test
  test2:
    ssh_public_key: "XXXX"
    password: "XXX"
    home: /data/test2

What I want is to create two directories for every user:

path: '/data/{{ user }}/in'   # item.key, in the code above
path: '/data/{{ user }}/out'  # item.key, in the code above

But I already need the loop for iterating over the users itself:

loop: "{{ query('dict', ftp) | list }}"

How can I handle this, for example with a nested loop?
Use the product filter to generate every possible combination of user/folder.

loop: "{{ ftp.keys() | product(['in', 'out']) }}"

Then, respectively,
- item.0 contains the users dictionary keys
- item.1 contains the folders

It is not fully clear what your when condition actually does, in order to adapt it too, but I guess that you do have an absent or present state in those user dictionaries.
So, the resulting task should be something along the lines of:

- name: Create required folders
  ansible.builtin.file:
    owner: "{{ item.0 }}"
    group: ftp
    mode: '0755'
    path: "/data/{{ item.0 }}/{{ item.1 }}"
    state: directory
  loop: "{{ ftp.keys() | product(['in', 'out']) }}"
  loop_control:
    label: "/data/{{ item.0 }}/{{ item.1 }}"
  when: "ftp[item.0].state | default('absent') == 'present'"
  become: true
Given the task above, when run on this data:

ftp:
  test:
    state: present
  test1:
  test2:
    state: present

It will yield:

TASK [Create required folders] ***************************************
ok: [localhost] => (item=/data/test/in)
ok: [localhost] => (item=/data/test/out)
skipping: [localhost] => (item=/data/test1/in)
skipping: [localhost] => (item=/data/test1/out)
ok: [localhost] => (item=/data/test2/in)
ok: [localhost] => (item=/data/test2/out)
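The product filter behaves like Python's itertools.product. With the same demo data as above, the loop and the when condition can be sketched as:

```python
from itertools import product

# Same demo data as above: only users with state 'present' get folders
ftp = {
    "test": {"state": "present"},
    "test1": {},
    "test2": {"state": "present"},
}

# loop: "{{ ftp.keys() | product(['in', 'out']) }}"
pairs = product(ftp.keys(), ["in", "out"])

# when: ftp[item.0].state | default('absent') == 'present'
paths = [f"/data/{user}/{folder}" for user, folder in pairs
         if ftp[user].get("state", "absent") == "present"]
print(paths)
```

The test1 pairs are generated but filtered out, which is exactly why the output shows them as "skipping" rather than absent.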
Test it first, for example

- debug:
    msg: "Create /data/{{ item.0.key }}/{{ item.1 }}"
  with_nested:
    - "{{ ftp|dict2items }}"
    - [in, out]
  when: item.0.value.state|d('present') == 'present'

gives (abridged)

msg: Create /data/test/in
msg: Create /data/test/out
msg: Create /data/test2/in
msg: Create /data/test2/out

Then try to create the directories

- file:
    owner: "{{ item.0.key }}"
    group: ftp
    mode: '0755'
    path: "/data/{{ item.0.key }}/{{ item.1 }}"
    state: directory
  with_nested:
    - "{{ ftp|dict2items }}"
    - [in, out]
  when: item.0.value.state|d('present') == 'present'

(not tested)
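The dict2items + with_nested combination can likewise be traced in plain Python, with the question's ftp data reduced to the relevant keys:

```python
# The question's ftp data, reduced to the fields that matter here
ftp = {
    "test": {"home": "/data/test"},
    "test2": {"home": "/data/test2"},
}

# dict2items turns the mapping into a list of {key, value} dicts
items = [{"key": k, "value": v} for k, v in ftp.items()]

# with_nested pairs every item with every folder; the when condition
# keeps entries whose state defaults to 'present'
paths = [f"/data/{i['key']}/{folder}"
         for i in items
         for folder in ["in", "out"]
         if i["value"].get("state", "present") == "present"]
print(paths)
```

Note the difference between the two answers: this one defaults a missing state to 'present', the product-based one defaults it to 'absent'.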

Ansible when condition registered from csv

I'm using a CSV file as ingest data for my playbooks, but I'm having trouble with my when condition: either both tasks are skipped or both tasks are ok. My objective is that if Ansible sees the string in the when condition, it skips that specific instance.
Here is my playbook:

- name: "Read ingest file from CSV return a list"
  community.general.read_csv:
    path: sample.csv
  register: ingest

- name: debug ingest
  debug:
    msg: "{{ item.AWS_ACCOUNT }}"
  with_items:
    - "{{ ingest.list }}"
  register: account

- name: debug account
  debug:
    msg: "{{ account.results | map(attribute='msg') }}"
  register: accountlist

- name:
  become: yes
  become_user: awx
  delegate_to: localhost
  environment: "{{ proxy_env }}"
  block:
    - name: "Assume role"
      community.aws.sts_assume_role:
        role_arn: "{{ item.ROLE_ARN }}"
        role_session_name: "pm"
      with_items:
        - "{{ ingest.list }}"
      register: assumed_role
      when: "'aws-account-rnd' not in account.results | map(attribute='msg')"
Here is the content of sample.csv:

HOSTNAME,ENVIRONMENT,AWS_ACCOUNT,ROLE_ARN
test1,dev,aws-account-rnd,arn:aws:iam::XXXX1
test2,uat,aws-account-uat,arn:aws:iam::XXXX2
My objective is to skip all items in the CSV file with aws-account-rnd.
Your condition does not mention item, so it will have the same result for all loop items.
Nothing you've shown requires the weird abuse of debug + register that you're doing; in fact, it is getting in your way.
- name: Read CSV file
  community.general.read_csv:
    path: sample.csv
  register: ingest

- name: Assume role
  community.aws.sts_assume_role:
    role_arn: "{{ item.ROLE_ARN }}"
    role_session_name: pm
  delegate_to: localhost
  become: true
  become_user: awx
  environment: "{{ proxy_env }}"
  loop: "{{ ingest.list }}"
  when: item.AWS_ACCOUNT != 'aws-account-rnd'
  register: assumed_role

If you'll only ever care about one match, you can also do this without a loop or condition at all:

- name: Assume role
  community.aws.sts_assume_role:
    role_arn: "{{ ingest.list | rejectattr('AWS_ACCOUNT', '==', 'aws-account-rnd') | map(attribute='ROLE_ARN') | first }}"
    role_session_name: pm
  delegate_to: localhost
  become: true
  become_user: awx
  environment: "{{ proxy_env }}"
  register: assumed_role
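The rejectattr/map/first chain is easy to verify in Python against the sample CSV, using the standard csv module:

```python
import csv
import io

# The sample.csv content from the question
csv_text = """HOSTNAME,ENVIRONMENT,AWS_ACCOUNT,ROLE_ARN
test1,dev,aws-account-rnd,arn:aws:iam::XXXX1
test2,uat,aws-account-uat,arn:aws:iam::XXXX2
"""

# read_csv returns a list of dicts, one per row, keyed by the header
rows = list(csv.DictReader(io.StringIO(csv_text)))

# rejectattr('AWS_ACCOUNT', '==', 'aws-account-rnd')
# | map(attribute='ROLE_ARN') | first
role_arn = next(r["ROLE_ARN"] for r in rows if r["AWS_ACCOUNT"] != "aws-account-rnd")
print(role_arn)  # arn:aws:iam::XXXX2
```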
"My objective is to skip all items in the CSV file with aws-account-rnd."
The multiple debug tasks you have with register seem to be a long-winded approach, IMHO.
Here is a simple task to debug the Role ARN, only if the account does not match aws-account-rnd:
- name: show ROLE_ARN when account not equals aws-account-rnd
  debug:
    var: item['ROLE_ARN']
  loop: "{{ ingest.list }}"
  when: item['AWS_ACCOUNT'] != 'aws-account-rnd'

This results in:

TASK [show ROLE_ARN when account not equals aws-account-rnd] **********************************************************************************************************************
skipping: [localhost] => (item={'HOSTNAME': 'test1', 'ENVIRONMENT': 'dev', 'AWS_ACCOUNT': 'aws-account-rnd', 'ROLE_ARN': 'arn:aws:iam:XXXX1'})
ok: [localhost] => (item={'HOSTNAME': 'test2', 'ENVIRONMENT': 'uat', 'AWS_ACCOUNT': 'aws-account-uat', 'ROLE_ARN': 'arn:aws:iam:XXXX2'}) => {
    "ansible_loop_var": "item",
    "item": {
        "AWS_ACCOUNT": "aws-account-uat",
        "ENVIRONMENT": "uat",
        "HOSTNAME": "test2",
        "ROLE_ARN": "arn:aws:iam:XXXX2"
    },
    "item['ROLE_ARN']": "arn:aws:iam:XXXX2"
}
The same logic can be used to pass the item.ROLE_ARN to community.aws.sts_assume_role task.

Parse yaml files in Ansible

I have got multiple YAML files on a remote machine. I would like to parse those files in order to get the name for each kind of object (Deployment, ConfigMap, Secret). For example:

...
kind: Deployment
metadata:
  name: pr1-dep
...

...
kind: Secret
metadata:
  name: pr1
...

...
kind: ConfigMap
metadata:
  name: cm-pr1
...

Expected result: 3 variables:

deployments = [pr1-dep]
secrets = [pr1]
configmaps = [cm-pr1]

I started with:

- shell: cat "{{ item.desc }}"
  with_items:
    - "{{ templating_register.results }}"
  register: objs

but I have no idea how to correctly parse item.stdout from objs.
Ansible has a from_yaml filter that takes YAML text as input and outputs an Ansible data structure. So, for example, you can write something like this:

- hosts: localhost
  gather_facts: false
  tasks:
    - name: Read objects
      command: "cat {{ item }}"
      register: objs
      loop:
        - deployment.yaml
        - configmap.yaml
        - secret.yaml

    - debug:
        msg:
          - "kind: {{ obj.kind }}"
          - "name: {{ obj.metadata.name }}"
      vars:
        obj: "{{ item.stdout | from_yaml }}"
      loop: "{{ objs.results }}"
      loop_control:
        label: "{{ item.item }}"

Given your example files, this playbook would output:

PLAY [localhost] ***************************************************************

TASK [Read objects] ************************************************************
changed: [localhost] => (item=deployment.yaml)
changed: [localhost] => (item=configmap.yaml)
changed: [localhost] => (item=secret.yaml)

TASK [debug] *******************************************************************
ok: [localhost] => (item=deployment.yaml) => {
    "msg": [
        "kind: Deployment",
        "name: pr1-dep"
    ]
}
ok: [localhost] => (item=configmap.yaml) => {
    "msg": [
        "kind: ConfigMap",
        "name: cm-pr1"
    ]
}
ok: [localhost] => (item=secret.yaml) => {
    "msg": [
        "kind: Secret",
        "name: pr1"
    ]
}

PLAY RECAP *********************************************************************
localhost : ok=2 changed=1 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
Creating the variables you've asked for is a little trickier. Here's one option:

- hosts: localhost
  gather_facts: false
  tasks:
    - name: Read objects
      command: "cat {{ item }}"
      register: objs
      loop:
        - deployment.yaml
        - configmap.yaml
        - secret.yaml

    - name: Create variables
      set_fact:
        names: >-
          {{
            names | combine({
              obj.kind.lower(): [obj.metadata.name]
            }, list_merge='append')
          }}
      vars:
        names: {}
        obj: "{{ item.stdout | from_yaml }}"
      loop: "{{ objs.results }}"
      loop_control:
        label: "{{ item.item }}"

    - debug:
        var: names

This creates a single variable named names that at the end of the playbook will contain:

{
    "configmap": [
        "cm-pr1"
    ],
    "deployment": [
        "pr1-dep"
    ],
    "secret": [
        "pr1"
    ]
}
The key to the above playbook is our use of the combine filter, which can be used to merge dictionaries and, when we add list_merge='append', handles keys that resolve to lists by appending to the existing list, rather than replacing the existing key.
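The append behaviour of combine(..., list_merge='append') matches Python's dict.setdefault pattern. A sketch over already-parsed documents (as from_yaml would return them):

```python
# Already-parsed YAML documents, as from_yaml would return them
docs = [
    {"kind": "Deployment", "metadata": {"name": "pr1-dep"}},
    {"kind": "ConfigMap", "metadata": {"name": "cm-pr1"}},
    {"kind": "Secret", "metadata": {"name": "pr1"}},
]

# combine({kind: [name]}, list_merge='append'): append to the existing
# list when the key is already present, otherwise create it
names = {}
for obj in docs:
    names.setdefault(obj["kind"].lower(), []).append(obj["metadata"]["name"])
print(names)
```

A second Deployment document would extend the "deployment" list rather than replace it, which is exactly what list_merge='append' buys you over a plain combine.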
Include the dictionaries from the files into new variables, e.g.

- include_vars:
    file: "{{ item }}"
    name: "objs_{{ item|splitext|first }}"
  register: result
  loop:
    - deployment.yaml
    - configmap.yaml
    - secret.yaml

This will create the dictionaries objs_deployment, objs_configmap, and objs_secret. Next, you can either use the dictionaries

- set_fact:
    objs: "{{ objs|d({})|combine({_key: _val}) }}"
  loop: "{{ query('varnames', 'objs_') }}"
  vars:
    _obj: "{{ lookup('vars', item) }}"
    _key: "{{ _obj.kind }}"
    _val: "{{ _obj.metadata.name }}"

, or the registered data

- set_fact:
    objs: "{{ dict(_keys|zip(_vals)) }}"
  vars:
    _query1: '[].ansible_facts.*.kind'
    _keys: "{{ result.results|json_query(_query1)|flatten }}"
    _query2: '[].ansible_facts.*.metadata[].name'
    _vals: "{{ result.results|json_query(_query2)|flatten }}"

Both options give

objs:
  ConfigMap: cm-pr1
  Deployment: pr1-dep
  Secret: pr1

Is it possible to use variable in Jinja2 default filter?

I guess it is not possible to refer to the item variable when using the Jinja2 default filter? Like in this example playbook:

---
- hosts: localhost
  become: no
  gather_facts: no
  vars:
    users:
      - foo:
        name: "foo"
        home: /home/foo
      - bar:
        name: "bar"
  tasks:
    - name: debug
      debug:
        msg: "{{ item.home | default('/home/{{ item.name }}') }}"
      loop: "{{ users }}"

If tried, I get output like:

$ ansible-playbook test.yml | grep item
ok: [localhost] => (item={u'home': u'/home/foo', u'foo': None, u'name': u'foo'}) => {
ok: [localhost] => (item={u'bar': None, u'name': u'bar'}) => {
    "msg": "/home/{{ item.name }}"

Obviously, I want "/home/bar", not "/home/{{ item.name }}".
Just use string concatenation in the expression; don't use nested handlebars:

"{{ item.home | default('/home/' + item.name) }}"

This appends the item.name variable to the static /home/ part.
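The same default-with-concatenation logic, traced in plain Python with the question's users list flattened to the fields that matter:

```python
# The question's users list, flattened to the fields that matter
users = [
    {"name": "foo", "home": "/home/foo"},
    {"name": "bar"},
]

# Equivalent of: item.home | default('/home/' + item.name)
homes = [u.get("home", "/home/" + u["name"]) for u in users]
print(homes)  # ['/home/foo', '/home/bar']
```

The fallback expression is only evaluated when home is missing, just as Jinja2's default filter only kicks in for undefined values.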
