Ansible conditional task processing

I need some help with an Ansible playbook. I would like to modify the sample playbook below so that if the hostname of webservers1 equals "123.456.000", the remaining parts of the playbook are not run.
- name: test play 1
  hosts: webservers1
  serial: 2
  gather_facts: False
  tasks:
    - name: first task
      command: hostname

- name: test play 2
  hosts: webservers2
  serial: 2
  gather_facts: False
  tasks:
    - name: first task
      command: hostname
    - name: second task
      command: hostname

- name: test play 3
  hosts: webservers3
  serial: 2
  gather_facts: False
  tasks:
    - name: first task
      command: hostname
    - name: second task
      command: hostname

- name: test play 4
  hosts: webservers4
  serial: 2
  gather_facts: False
  tasks:
    - name: first task
      command: hostname
    - name: second task
      command: hostname

Q: "If the hostname of webservers1 equals "123.456.000", do not bother running the remaining parts of the playbook."
A: It's not possible to break a playbook from a play, e.g.
- hosts: localhost
  tasks:
    - meta: end_play

- hosts: localhost
  tasks:
    - debug:
        msg: Start play 2
the playbook proceeds to the second play
PLAY [localhost] **************************************************************
PLAY [localhost] **************************************************************
TASK [debug] ******************************************************************
ok: [localhost] =>
msg: Start play 2
There is still an option to test a variable at the beginning of each play. For example, given the inventory
shell> cat hosts
[webservers1]
srv1 ansible_host=123.456.000
srv2 ansible_host=123.456.001
[webservers2]
srv3 ansible_host=123.456.002
srv4 ansible_host=123.456.003
The playbook below tests the condition in the first play and sets the variable. The next plays test this variable, e.g.
- hosts: all
  tasks:
    - set_fact:
        _break: "{{ '123.456.000' in groups.webservers1|
                    map('extract', hostvars, 'ansible_host')|
                    list }}"
      run_once: true

- hosts: webservers1
  tasks:
    - meta: end_play
      when: _break|bool
    - debug:
        msg: Start webservers1

- hosts: webservers2
  tasks:
    - meta: end_play
      when: _break|bool
    - debug:
        msg: Start webservers2
should break the next two plays
PLAY [all] ********************************************************************
TASK [set_fact] ***************************************************************
ok: [srv1]
PLAY [webservers1] ************************************************************
PLAY [webservers2] ************************************************************
PLAY RECAP ********************************************************************
srv1: ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
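If the comparison really has to be made against the output of the hostname command (as in the original playbook) rather than against ansible_host from the inventory, the same gating idea can be adapted. The sketch below is illustrative; the registered variable hn and the when guard are additions, not part of the answer above:
- hosts: all
  gather_facts: false
  tasks:
    - name: first task
      command: hostname
      register: hn
      when: inventory_hostname in groups.webservers1
    - set_fact:
        _break: "{{ '123.456.000' in groups.webservers1|
                    map('extract', hostvars, ['hn', 'stdout'])|
                    list }}"
      run_once: true
The subsequent plays can then test _break|bool exactly as shown above.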

Related

Ansible: How to check multiple servers for a text file value, to decide which servers to run the script on?

I am trying to ask Ansible to check whether a server is passive or active based on the value of a specific file on each server, so that Ansible can decide which server to run the next script on.
For example with 2 servers:
Server1
cat /tmp/currentstate
PASSIVE
Server2
cat /tmp/currentstate
ACTIVE
In Ansible:
Trigger the next set of jobs on the server where the output was ACTIVE.
Once those jobs complete, trigger the next set of jobs on the server where the output was PASSIVE.
What I have done so far to grab the state, and output the value to Ansible is
- hosts: "{{ hostname1 | mandatory }}"
gather_facts: no
tasks:
- name: Grab state of first server
shell: |
cat {{ ans_script_path }}currentstate.log
register: state_server1
- debug:
msg: "{{ state_server1.stdout }}"
- hosts: "{{ hostname2 | mandatory }}"
gather_facts: no
tasks:
- name: Grab state of second server
shell: |
cat {{ ans_script_path }}currentstate.log
register: state_server2
- debug:
msg: "{{ state_server2.stdout }}"
What I have done so far to trigger the script
- hosts: "{{ active_hostname | mandatory }}"
tasks:
- name: Run the shutdown on active server first
shell: sh {{ ans_script_path }}stopstart_terracotta_main.sh shutdown
register: run_result
- debug:
msg: "{{ run_result.stdout }}"
- hosts: "{{ passive_hostname | mandatory }}"
tasks:
- name: Run the shutdown on passive server first
shell: sh {{ ans_script_path }}stopstart_terracotta_main.sh shutdown
register: run_result
- debug:
msg: "{{ run_result.stdout }}"
but I don't know how to set the value of active_hostname & passive_hostname based on the value from the script above.
How can I set the Ansible variable of active_hostname & passive_hostname based on the output of the first section?
A better solution that came to my mind is to add hosts to new groups according to their state.
This is more practical when there are more than two hosts.
- hosts: all
  gather_facts: no
  vars:
    ans_script_path: /tmp/
  tasks:
    - name: Grab state of server
      shell: |
        cat {{ ans_script_path }}currentstate.log
      register: server_state
    - add_host:
        hostname: "{{ item }}"
        # every host will be added to a new group according to its state
        groups: "{{ 'active' if hostvars[item].server_state.stdout == 'ACTIVE' else 'passive' }}"
        # Shorter, but the new groups will be in capital letters
        # groups: "{{ hostvars[item].server_state.stdout }}"
      loop: "{{ ansible_play_hosts }}"
      changed_when: false
    - name: show the groups the host(s) are in
      debug:
        msg: "{{ group_names }}"

- hosts: active
  gather_facts: no
  tasks:
    - name: Run the shutdown on active server first
      shell: hostname -f # changed that for debugging
      register: run_result
    - debug:
        msg: "{{ run_result.stdout }}"

- hosts: passive
  gather_facts: no
  tasks:
    - name: Run the shutdown on passive server first
      shell: hostname -f
      register: run_result
    - debug:
        msg: "{{ run_result.stdout }}"
test-001 is PASSIVE
test-002 is ACTIVE
PLAY [all] ***************************************************************
TASK [Grab state of server] **********************************************
ok: [test-002]
ok: [test-001]
TASK [add_host] **********************************************************
ok: [test-001] => (item=test-001)
ok: [test-001] => (item=test-002)
TASK [show the groups the host(s) are in] ********************************
ok: [test-001] => {
"msg": [
"passive"
]
}
ok: [test-002] => {
"msg": [
"active"
]
}
PLAY [active] *************************************************************
TASK [Run the shutdown on active server first] ****************************
changed: [test-002]
TASK [debug] **************************************************************
ok: [test-002] => {
"msg": "test-002"
}
PLAY [passive] ************************************************************
TASK [Run the shutdown on passive server first] ****************************
changed: [test-001]
TASK [debug] **************************************************************
ok: [test-001] => {
"msg": "test-001"
}
PLAY RECAP ****************************************************************
test-001 : ok=5 changed=1 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
test-002 : ok=4 changed=1 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
For example, given two remote hosts
shell> ssh admin@test_11 cat /tmp/currentstate.log
ACTIVE
shell> ssh admin@test_13 cat /tmp/currentstate.log
PASSIVE
The playbook below reads the files and runs the commands on active and passive servers
shell> cat pb.yml
- hosts: "{{ host1 }},{{ host2 }}"
gather_facts: false
vars:
server_states: "{{ dict(ansible_play_hosts|
zip(ansible_play_hosts|
map('extract', hostvars, ['server_state', 'stdout'])|
list)) }}"
server_active: "{{ server_states|dict2items|
selectattr('value', 'eq', 'ACTIVE')|
map(attribute='key')|list }}"
server_pasive: "{{ server_states|dict2items|
selectattr('value', 'eq', 'PASSIVE')|
map(attribute='key')|list }}"
tasks:
- command: cat /tmp/currentstate.log
register: server_state
- debug:
var: server_state.stdout
- block:
- debug:
var: server_states
- debug:
var: server_active
- debug:
var: server_pasive
run_once: true
- command: echo 'Shutdown active server'
register: out_active
delegate_to: "{{ server_active.0 }}"
- command: echo 'Shutdown passive server'
register: out_pasive
delegate_to: "{{ server_pasive.0 }}"
- debug:
msg: |
{{ server_active.0 }}: [{{ out_active.stdout }}] {{ out_active.start }}
{{ server_pasive.0 }}: [{{ out_pasive.stdout }}] {{ out_pasive.start }}
run_once: true
shell> ansible-playbook pb.yml -e host1=test_11 -e host2=test_13
PLAY [test_11,test_13] ***********************************************************************
TASK [command] *******************************************************************************
changed: [test_13]
changed: [test_11]
TASK [debug] *********************************************************************************
ok: [test_11] =>
server_state.stdout: ACTIVE
ok: [test_13] =>
server_state.stdout: PASSIVE
TASK [debug] *********************************************************************************
ok: [test_11] =>
server_states:
test_11: ACTIVE
test_13: PASSIVE
TASK [debug] *********************************************************************************
ok: [test_11] =>
server_active:
- test_11
TASK [debug] *********************************************************************************
ok: [test_11] =>
server_pasive:
- test_13
TASK [command] *******************************************************************************
changed: [test_11]
changed: [test_13 -> test_11]
TASK [command] *******************************************************************************
changed: [test_11 -> test_13]
changed: [test_13]
TASK [debug] *********************************************************************************
ok: [test_11] =>
msg: |-
test_11: [Shutdown active server] 2022-10-27 11:16:00.766309
test_13: [Shutdown passive server] 2022-10-27 11:16:02.501907
PLAY RECAP ***********************************************************************************
test_11: ok=8 changed=3 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
test_13: ok=4 changed=3 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
From the description of your use case I understand that you would like to perform tasks on certain servers which have a service role installed (here: Terracotta Server), based on a certain service state.
Therefore, I'd like to recommend an approach with custom facts.
Depending on whether you have control over where currentstate.log is placed or how it is structured, you could use, for example, something like
cat /tmp/ansible/service/terracotta.fact
[currentstate]
ACTIVE = true
PASSIVE = false
or add dynamic facts by adding executable scripts to facts.d.
In other words, you can alternatively add the current service state to your host facts by creating an executable script in facts.d which simply reads the content of /tmp/currentstate.log, as sketched below.
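A minimal sketch of such a script, assuming the same fact_path and file name as the static example above (executable facts must print JSON on stdout; the state key name is an assumption):
shell> cat /tmp/ansible/service/terracotta.fact
#!/bin/sh
# Executable fact script: must be executable and print valid JSON on stdout.
# Reads the state that the application maintains in /tmp/currentstate.log.
state=$(cat /tmp/currentstate.log 2>/dev/null)
echo "{\"currentstate\": \"${state:-UNKNOWN}\"}"
With this variant the fact shows up as ansible_local.terracotta.currentstate, and the condition in the playbook below would become something like when: ansible_local.terracotta.currentstate == 'ACTIVE'.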
Then, a sample playbook like
---
- hosts: localhost
  become: false
  gather_facts: true
  fact_path: /tmp/ansible/service
  gather_subset:
    - "!all"
    - "!min"
    - "local"
  tasks:
    - name: Show Gathered Facts
      debug:
        msg: "{{ ansible_facts }}"
      when: ansible_local.terracotta.currentstate.active | bool
will result in an output like
TASK [Show Gathered Facts] ******
ok: [localhost] =>
  msg:
    ansible_local:
      terracotta:
        currentstate:
          active: 'true'
          passive: 'false'
    gather_subset:
      - '!all'
      - '!min'
      - local
    module_setup: true
Another approach is to address how the inventory is built and to group the hosts
[terracotta:children]
terracotta_active
terracotta_passive
[terracotta_active]
terracotta1.example.com
[terracotta_passive]
terracotta2.example.com
You can then easily and simply define where a playbook or task should run, just by targeting hosts and groups
ansible-inventory -i hosts --graph
@all:
|--@terracotta:
| |--@terracotta_active:
| | |--terracotta1.example.com
| |--@terracotta_passive:
| | |--terracotta2.example.com
|--@ungrouped:
ansible-inventory -i hosts terracotta_active --graph
@terracotta_active:
|--terracotta1.example.com
or conditionals based on group membership, for example
when: "'terracotta_active' in group_names"
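As a small illustrative sketch (the echo command is a placeholder, not the real shutdown script), such a condition can gate a task inside a play that targets the parent group:
- hosts: terracotta
  gather_facts: false
  tasks:
    - name: Run the shutdown on active servers only
      command: echo 'Shutdown active server'
      when: "'terracotta_active' in group_names"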
From my understanding, both would be minimal and simple solutions that avoid re-implementing functionality which is already there.

How to execute on multiple hosts in ansible

I have a script that will execute in two parts. First it will execute on localhost and query a database table to get a hostname. The second part of the script should run on the host that was registered by the query. I am not able to set the host with the set_fact I did in the first part of the code.
This is what I am trying to do:
- hosts: localhost
  gather_facts: false
  become: yes
  become_user: oracle
  vars_files:
    - vars/main.yml
  tasks:
    - name: Get new hostname
      tempfile:
        state: file
      register: tf
    - name: create sql file
      template:
        src: get_hostname.sql.j2
        dest: "{{ tf.path }}"
        mode: 0775
    - name: login
      command:
        argv:
          - "sqlplus"
          - -s
          - "@{{ tf.path }}"
      environment:
        ORACLE_HOME: "oracle/home"
      register: command_out
    - set_fact:
        NEW_HOST: "{{ command_out.stdout }}"

- hosts: "{{ NEW_HOST }}"
  gather_facts: false
  become: yes
  become_user: oracle
  vars_files:
    - vars/main.yml
  tasks:
    - name: debug
      command: hostname
      register: new_host_out
    - debug:
        msg: "new host is {{ new_host_out.stdout }}"
Everything works fine in the first part of the code, but it errors out in the second part, saying it cannot find NEW_HOST.
Use hostvars to reference such a variable. Create a dummy host to keep this variable. For example, given the inventory
shell> cat hosts
dummy
[test]
test_11
test_12
test_13
The playbook creates the variable. See Delegated facts
shell> cat pb.yml
- hosts: localhost
  tasks:
    - set_fact:
        NEW_HOST: test_12
      delegate_to: dummy
      delegate_facts: true
    - debug:
        var: hostvars.dummy.NEW_HOST

- hosts: "{{ hostvars.dummy.NEW_HOST }}"
  gather_facts: false
  tasks:
    - debug:
        var: inventory_hostname
gives
shell> ansible-playbook pb.yml
PLAY [localhost] ****************************************************************************
TASK [set_fact] *****************************************************************************
ok: [localhost -> dummy]
TASK [debug] ********************************************************************************
ok: [localhost] =>
hostvars.dummy.NEW_HOST: test_12
PLAY [test_12] ******************************************************************************
TASK [debug] ********************************************************************************
ok: [test_12] =>
inventory_hostname: test_12
PLAY RECAP **********************************************************************************
localhost: ok=2 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
test_12 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
You can use localhost for this purpose as well. The playbook below works as expected
- hosts: localhost
  tasks:
    - set_fact:
        NEW_HOST: test_12

- hosts: "{{ hostvars.localhost.NEW_HOST }}"
  gather_facts: false
  tasks:
    - debug:
        var: inventory_hostname

Ansible cannot set hosts: value as previously defined variable with set_fact

I have checked this subject on SO but still cannot fix my issue.
I set a variable in play 1 to use as the hosts: value in play 2, but Ansible 2.9 says the variable is undefined.
Can someone help, please?
Below is the example :
---
############ Play 1 ###############
- name: Test Play 1
  hosts: localhost
  gather_facts: no
  tasks:
    - name: Set facts
      set_fact:
        action_host: localhost

############ Play 2 ###############
- name: Test Play 2
  hosts: "{{ action_host }}"
  #hosts: localhost
  gather_facts: no
  tasks:
    - name: Test Play 2
      shell: |
        echo toto
Output:
PLAY [Test Play 1] ***********************************************************************************************
TASK [Set facts] *************************************************************************************************
ok: [localhost]
ERROR! The field 'hosts' has an invalid value, which includes an undefined variable. The error was: 'action_host' is undefined
I tried this, but it still errors:
############ Play 2 ###############
- name: Test Play 2
  #hosts: localhost
  hosts: hostvars.localhost.action_host
  gather_facts: no
  tasks:
    - name: Test Play 2
      shell: |
        echo toto
Output :
[WARNING]: Could not match supplied host pattern, ignoring: hostvars.localhost.action_host
PLAY [Test Play 2] ***********************************************************************************************
skipping: no hosts matched
Update:
So I've gotten halfway there with the solution provided by @Zeitounator.
It seems set_fact works between plays, as long as it's on the same host (here localhost).
It's a pity I cannot dynamically change hosts: between plays.
My idea is to find the "correct host" whose VM has a specific type of disks, and then run play 2 only on this host. This host is already in the inventory.
I need to find a different workaround.
The final YAML is here, if someone has an idea.
---
############ Play 1 ###############
- name: Test Play 1
  hosts: localhost
  gather_facts: yes
  tasks:
    - name: Set facts
      set_fact:
        action_host: localhost # OK with Play 2, only when localhost
        #action_host: host0008.net.intra # NOT OK with play 2, even this host is in inventory
        test_string: "Test string from Play 1" # OK with Play 2, only when action_host=localhost
      run_once: yes # No effect

############ Play 2 ###############
- name: Test Play 2
  #hosts: localhost # OK With play 2
  # hosts: "{{ action_host }}" # Not OK, never
  hosts: "{{ hostvars.localhost.action_host }}" # OK with Play 2, only when action_host=localhost
  #hosts: "{{ hostvars[action_host].inventory_hostname }}" # Not OK
  tasks:
    - name: Test Play 2
      shell: |
        echo toto
    - name: Display Play 1 variables
      debug:
        msg: "{{ test_string }}"
Output :
PLAY [Test Play 1] ***********************************************************************************************
TASK [Gathering Facts] *******************************************************************************************
ok: [localhost]
TASK [Set facts] *************************************************************************************************
ok: [localhost]
PLAY [Test Play 2] ***********************************************************************************************
TASK [Test Play 2] ***********************************************************************************************
changed: [localhost]
TASK [Display Play 1 variables] **********************************************************************************
ok: [localhost] => {
"msg": "Test string from Play 1"
}
PLAY RECAP *******************************************************************************************************
localhost : ok=4 changed=1 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
$ ansible-inventory --graph
@all:
|--@all_pure:
| |--@pure_dr:
| | |--host0003.net.intra
| | |--host0008.net.intra
| |--@pure_prod:
| | |--host60003.net.intra
| | |--host60008.net.intra
|--@local:
| |--localhost
|--@ungrouped:
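One possible workaround, not from the original post, is to register the chosen host into a helper group with add_host in play 1 and let play 2 target that group by its static name. A minimal sketch (host0008.net.intra is taken from the inventory above; the selection logic itself, e.g. based on disk facts, is omitted):
---
############ Play 1 ###############
- name: Test Play 1
  hosts: localhost
  gather_facts: no
  tasks:
    - name: Put the selected host into a helper group
      add_host:
        name: host0008.net.intra # in reality this would be computed, e.g. from disk facts
        groups: selected_host
      changed_when: false

############ Play 2 ###############
- name: Test Play 2
  hosts: selected_host
  gather_facts: no
  tasks:
    - name: Test Play 2
      shell: |
        echo toto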

How to delegate facts to localhost from a play targeting remote hosts

Ansible version 2.9.16 running on RHEL 7.9 with Python 2.7.5, targeting Windows 2016 servers (it should behave the same for Linux target servers too).
EDIT: Switched to using host-specific variables in the inventory to avoid the impression that I am just trying to print the hostnames of a group. Even here it's a gross simplification; pretend that var1 is obtained dynamically for each server instead of being declared in the inventory file.
My playbook has two plays. One targets 3 remote servers (note: serial: 0, i.e. concurrently) and another just the localhost. In play 1 I am trying to delegate facts obtained from each of these hosts to the localhost using delegate_facts and delegate_to. The intent is to have these facts delegated to a single host (localhost) so I can use them later in play 2 (using hostvars) that targets the localhost. But strangely that's not working. It only has information from the last host of play 1.
Any help will be greatly appreciated.
My inventory file inventory/test.ini looks like this:
[my_servers]
svr1 var1='abc'
svr2 var1='xyz'
svr3 var1='pqr'
My Code:
## Play1
- name: Main play that runs against multiple remote servers and builds a list.
  hosts: 'my_servers' # my inventory group that contains 3 servers svr1,svr2,svr3
  any_errors_fatal: false
  ignore_unreachable: true
  gather_facts: true
  serial: 0
  tasks:
    - name: initialize my_server_list as a list and delegate to localhost
      set_fact:
        my_server_list: []
      delegate_facts: yes
      delegate_to: localhost
    - command: /root/complex_script.sh
      register: result
    - set_fact:
        my_server_list: "{{ my_server_list + hostvars[inventory_hostname]['result.stdout'] }}"
      # run_once: true ## Commented as I need to query the hostvars for each host where this executes.
      delegate_to: localhost
      delegate_facts: true
    - name: "Print list - 1"
      debug:
        msg:
          - "{{ hostvars['localhost']['my_server_list'] | default(['NotFound']) | to_nice_yaml }}"
      # run_once: true
    - name: "Print list - 2"
      debug:
        msg:
          - "{{ my_server_list | default(['NA']) }}"

## Play2
- name: Print my_server_list which was built in Play1
  hosts: localhost
  gather_facts: true
  serial: 0
  tasks:
    - name: "Print my_server_list without hostvars "
      debug:
        msg:
          - "{{ my_server_list | to_nice_json }}"
      # delegate_to: localhost
    - name: "Print my_server_list using hostvars"
      debug:
        msg:
          - "{{ hostvars['localhost']['my_server_list'] | to_nice_yaml }}"
      # delegate_to: localhost
###Output###
$ ansible-playbook -i inventory/test.ini delegate_facts.yml
PLAY [Main playbook that runs against multiple remote servers and builds a list.] ***********************************************************************************************************
TASK [Gathering Facts] **********************************************************************************************************************************************************************
ok: [svr3]
ok: [svr1]
ok: [svr2]
TASK [initialize] ***************************************************************************************************************************************************************************
ok: [svr1]
ok: [svr2]
ok: [svr3]
TASK [Build a list of servers] **************************************************************************************************************************************************************
ok: [svr1]
ok: [svr2]
ok: [svr3]
TASK [Print list - 1] ***********************************************************************************************************************************************************************
ok: [svr1] =>
msg:
- |-
- pqr
ok: [svr2] =>
msg:
- |-
- pqr
ok: [svr3] =>
msg:
- |-
- pqr
TASK [Print list - 2] ***********************************************************************************************************************************************************************
ok: [svr1] =>
msg:
- - NA
ok: [svr2] =>
msg:
- - NA
ok: [svr3] =>
msg:
- - NA
PLAY [Print my_server_list] *****************************************************************************************************************************************************************
TASK [Gathering Facts] **********************************************************************************************************************************************************************
ok: [localhost]
TASK [Print my_server_list without hostvars] ************************************************************************************************************************************************
ok: [localhost] =>
msg:
- |-
[
"pqr"
]
TASK [Print my_server_list using hostvars] **************************************************************************************************************************************************
ok: [localhost] =>
msg:
- |-
- pqr
PLAY RECAP **********************************************************************************************************************************************************************************
localhost : ok=3 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
svr1 : ok=5 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
svr2 : ok=5 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
svr3 : ok=5 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
Playbook run took 0 days, 0 hours, 0 minutes, 13 seconds
###Expected Output###
I was expecting the last two debug statements in Play2 to contain the values of var1 for all the servers something like this:
TASK [Print my_server_list using hostvars] **************************************************************************************************************************************************
ok: [localhost] =>
msg:
- |-
- abc
- xyz
- pqr
Use Special Variables, e.g.
- hosts: all
  gather_facts: false
  tasks:
    - set_fact:
        my_server_list: "{{ ansible_play_hosts_all }}"
      run_once: true
      delegate_to: localhost
      delegate_facts: true

- hosts: localhost
  gather_facts: false
  tasks:
    - debug:
        var: my_server_list
gives
ok: [localhost] =>
my_server_list:
- svr1
- svr2
- svr3
There are many other ways to create the list, e.g.
- hosts: all
  gather_facts: false
  tasks:
    - debug:
        msg: "{{ groups.my_servers }}"
      run_once: true

- hosts: all
  gather_facts: false
  tasks:
    - debug:
        msg: "{{ hostvars|json_query('*.inventory_hostname') }}"
      run_once: true
Q: "Fill the list with outputs gathered by running complex commands."
A: Last example above shows how to create a list from hostvars. Register the result from the complex command, e.g.
shell> ssh admin@srv1 cat /root/complex_script.sh
#!/bin/sh
ifconfig wlan0 | grep inet | cut -w -f3
The playbook
- hosts: all
  gather_facts: false
  tasks:
    - command: /root/complex_script.sh
      register: result
    - set_fact:
        my_server_list: "{{ hostvars|json_query('*.result.stdout') }}"
      run_once: true
      delegate_to: localhost
      delegate_facts: true

- hosts: localhost
  gather_facts: false
  tasks:
    - debug:
        var: my_server_list
gives
my_server_list:
- 10.1.0.61
- 10.1.0.62
- 10.1.0.63
Q: "Why the logic of delegating facts to localhost and keep appending them to that list does not work?"
A: The code below (simplified) can't work because the right-hand-side value of msl still comes from the hostvars of the inventory_hostname, despite delegate_facts: true; delegation merely puts the resulting variable msl into localhost's hostvars.
- hosts: my_servers
  tasks:
    - set_fact:
        msl: "{{ msl|default([]) + [inventory_hostname] }}"
      delegate_to: localhost
      delegate_facts: true
Quoting from Delegating facts
To assign gathered facts to the delegated host instead of the current host, set delegate_facts to true
As a result, the variable msl keeps only the last assigned value.
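If the goal is to collect one value per host into a single list on localhost, a working pattern (a sketch in the spirit of the json_query example above, not the poster's code) is to build the whole list in a single run_once task:
- hosts: my_servers
  gather_facts: false
  tasks:
    - command: /root/complex_script.sh
      register: result
    - set_fact:
        msl: "{{ ansible_play_hosts|
                 map('extract', hostvars, ['result', 'stdout'])|
                 list }}"
      run_once: true
      delegate_to: localhost
      delegate_facts: true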

Ansible run task multiple times based on groups

How do you run a task on each group, not just once for the groups?
I was expecting the Run this on each host tasks to run once for each group_vars value. Instead it seems to just pick one and run it.
I plan on spreading these across multiple servers later, but for now it should run on one autoscale group and then be easily broken up into multiple autoscale groups as demand increases.
playbook.yml:
---
# Run with: ansible-playbook -i localhost, playbook.yml
- name: Register Groups
hosts: localhost
connection: local
tasks:
- name: Add the groups
add_host:
name: localhost
ansible_connection: local
groups: rest-api, msg-consumer
- name: Run this on each host
hosts:
- rest-api
- msg-consumer
tasks:
- name: Say type
debug: var=item
with_items: run_type
group_vars/rest-api:
---
run_type: web
group_vars/msg-consumer:
---
run_type: consumer
Output Ansible 1.8.2:
$ ansible-playbook -i localhost, playbook.yml
PLAY [Register Groups] ********************************************************
GATHERING FACTS ***************************************************************
ok: [localhost]
TASK: [Add the groups] ********************************************************
ok: [localhost]
PLAY [Run this on each host] **************************************************
GATHERING FACTS ***************************************************************
ok: [localhost]
TASK: [Say type] **************************************************************
ok: [localhost] => (item=consumer) => {
"item": "consumer"
}
PLAY RECAP ********************************************************************
localhost : ok=4 changed=0 unreachable=0 failed=0
Note: It may be something else. I thought I could also clutter my playbook by breaking up the tasks as follows:
---
- name: Register Groups
  hosts: localhost
  connection: local
  tasks:
    - name: Add the groups
      add_host:
        name: localhost
        ansible_connection: local
        groups: rest-api, msg-consumer

- name: Run this on each host
  hosts:
    - msg-consumer
  tasks:
    - name: Say type
      debug: var=item
      with_items: run_type

- name: Run this on each host
  hosts:
    - rest-api
  tasks:
    - name: Say type
      debug: var=item
      with_items: run_type
But the output for the 2nd playbook is:
$ ansible-playbook -i localhost, playbook2.yml
PLAY [Register Groups] ********************************************************
GATHERING FACTS ***************************************************************
ok: [localhost]
TASK: [Add the groups] ********************************************************
ok: [localhost]
PLAY [Run this on each host] **************************************************
GATHERING FACTS ***************************************************************
ok: [localhost]
TASK: [Say type] **************************************************************
ok: [localhost] => (item=consumer) => {
"item": "consumer"
}
PLAY [Run this on each host] **************************************************
GATHERING FACTS ***************************************************************
ok: [localhost]
TASK: [Say type] **************************************************************
ok: [localhost] => (item=consumer) => {
"item": "consumer"
}
PLAY RECAP ********************************************************************
localhost : ok=6 changed=0 unreachable=0 failed=0
Edit: Yet another attempt to access the data; it looks like group_vars isn't behaving as I expect. The following also outputs consumer twice.
---
# Run with: ansible-playbook -i localhost, playbook.yml
- name: Register Groups
  hosts: localhost
  connection: local
  tasks:
    - name: Add the groups
      add_host:
        name: localhost
        ansible_connection: local
        groups: rest-api, msg-consumer

- name: Run this on each host
  hosts:
    - msg-consumer
    - rest-api
  tasks:
    - name: What's your run type
      debug: var=hostvars[groups[item][0]]['run_type']
      with_items: group_names
The easiest way to do this is to use aliases for the hostnames instead of the real hosts:
---
- name: Register Groups
  hosts: localhost
  connection: local
  tasks:
    - name: Add the rest-api alias for my app
      add_host:
        name: my-app-rest-api
        ansible_ssh_host: 127.0.0.1
        groups: rest-api
    - name: Add the msg-consumer alias for my app
      add_host:
        name: my-app-msg-consumer
        ansible_ssh_host: 127.0.0.1
        groups: msg-consumer

- name: Test Run Types
  hosts:
    - msg-consumer
    - rest-api
  tasks:
    - name: What's your run type
      debug: msg="Run Type of {{ ansible_ssh_host }} is {{ run_type }}"
now you can use your group_vars again:
group_vars/rest-api:
---
run_type: web
group_vars/msg-consumer:
---
run_type: consumer
and the output will be:
PLAY [Register Groups] ********************************************************
TASK: [Add the rest-api alias for my app] *************************************
ok: [localhost]
TASK: [Add the msg-consumer alias for my app] *********************************
ok: [localhost]
PLAY [Test Run Types] *********************************************************
TASK: [What's your run type] **************************************************
ok: [my-app-msg-consumer] => {
"msg": "Run Type of 127.0.0.1 is consumer"
}
ok: [my-app-rest-api] => {
"msg": "Run Type of 127.0.0.1 is web"
}
For now this is the best I can come up with:
---
- name: Register Groups
  hosts: localhost
  connection: local
  tasks:
    - name: Add new host group
      add_host:
        name: 127.0.0.1
        ansible_connection: local
        groups: new-server
        run_types:
          - rest-api
          - msg-consumer
    - name: Add another new host group
      add_host:
        name: 127.0.0.2
        ansible_connection: local
        groups: new-server
        run_types:
          - nothing

- name: Test Run Types Server 1
  hosts:
    - new-server
  tasks:
    - name: What's your run type
      debug: var=item
      with_items: run_types
Note: The hosts must be different for this to work, otherwise it will override and use the last variable value used with add_host.
See my answer under Ansible run task once per database-name.
Basically, there is no run_once_per_group, and the closest method I'm aware of is a true run_once that loops over groups. To make matters more cluttered, there is no group_vars dictionary variable.
---
- hosts: all
  tasks:
    - name: "do this once per group"
      delegate_to: localhost
      debug:
        msg: "do something on {{ hostvars[groups[item.key].0]['somevar'] }} for group named {{ item }}"
      run_once: yes
      with_dict: groups
      when: item.key not in ['all', 'ungrouped']
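On current Ansible versions the same idea can be written with loop and dict2items instead of the older with_dict syntax; a sketch:
- hosts: all
  tasks:
    - name: "do this once per group"
      delegate_to: localhost
      debug:
        msg: "do something on {{ hostvars[item.value.0]['somevar'] }} for group named {{ item.key }}"
      run_once: yes
      loop: "{{ groups | dict2items }}"
      when: item.key not in ['all', 'ungrouped']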
