Background: I have a dynamic Ansible inventory that is built by a previously run process; I don't know the IPs until after that task completes. I have two groups, db servers and web servers, defined in the inventory file. The specific task I am trying to complete is to create 'some_user'@'dynamic_ip_of_webserver_group'.
I think I am close, but something's not quite right. In my dbserver role's main task I have:
- name: Create DB User
  mysql_user:
    name: dbuser
    host: "{{ item }}"
    password: "{{ mysql_wordpress_password }}"
    priv: "someDB.*:ALL"
  with_items:
    - "{{ ansible_hostname }}"
    - 127.0.0.1
    - ::1
    - localhost
    - "{{ hostvars[groups['webservers']] }}"
This errors out with:
TASK [dbservers : Create DB User] *******************************************************************************************************************************************************************
fatal: [10.10.10.13]: FAILED! => {"msg": "ansible.vars.hostvars.HostVars object has no element [u'10.10.10.30', u'10.10.10.240']"}
It is showing the right IPs, and there are only two, so both of those are correct. I think it is trying to access the inventory item as an object instead of the actual input?
Inventory File:
[webservers]
10.10.10.30
10.10.10.240
Simply:
- "{{ groups['webservers'] }}"
This works because with_items flattens the first nested level of lists.
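Applied to the task above, a minimal sketch of the corrected loop (same variables as before, only the last item changed) would be:

- name: Create DB User
  mysql_user:
    name: dbuser
    host: "{{ item }}"
    password: "{{ mysql_wordpress_password }}"
    priv: "someDB.*:ALL"
  with_items:
    - "{{ ansible_hostname }}"
    - 127.0.0.1
    - ::1
    - localhost
    - "{{ groups['webservers'] }}"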
I can't figure this out. I have host_vars that I want to pass into my Ansible role. I can iterate through the items in my playbook, but it doesn't work if I pass the item into my Ansible role. My host inventory looks like this:
hosts:
  host_one:
    domain: one
    ip: 172.18.1.1
    connection:
      - connection_name: two
        connection_ip: 172.18.1.2
      - connection_index: three
        local_hub_ip: 172.18.1.3
  host_two:
    domain: two
    ip: 172.18.1.2
For instance, this works correctly:
tasks:
  - debug:
      msg: "{{ item.connection_name }}"
    with_items:
      - "{{ connection }}"
will correctly print out the connections.connection_name for each connection I have, "two" and "three". However, if I try to pass it into a role:
tasks:
  - name: Adding several connections
    include_role:
      name: connection-create
    with_items:
      - "{{ connection }}"
in which my role "connection-create" uses a variable called "connection_name", I get:
FAILED! => {"msg": "'connection_name' is undefined"}
Why doesn't this work?
The loop with_items: "{{ connection }}" creates the loop variable item. The included role can use
item.connection_name
item.connection_ip
The loop variable can be renamed, if needed. See Loop control
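A minimal sketch of both sides, assuming the role name from the question; the loop_var rename to conn and the debug task are illustrative, not part of the original role:

# Playbook side: rename the loop variable if 'item' is too generic
- name: Adding several connections
  include_role:
    name: connection-create
  loop: "{{ connection }}"
  loop_control:
    loop_var: conn

# roles/connection-create/tasks/main.yml: use the loop variable directly
- debug:
    msg: "Creating {{ conn.connection_name | default('unnamed') }} at {{ conn.connection_ip | default('n/a') }}"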
I want to synchronize (copy) files between two remote hosts.
In my playbook, I'm running my tasks on several groups, depending on each task's case.
My playbook looks like this:
---
- hosts: myHosts
  gather_facts: true
  become: true
  become_user: "{{ ansi_user }}"
  vars:
    - buildServer_host: "127.78.11.04"
  roles:
    # ... Different roles ...
    - { role: myRole }
My inventory file looks like this:
[myHosts]
myGroupA
myGroupB
myGroupC
Under myRole, I have this task:
- name: Copy jars to API server
  synchronize:
    src: "{{ Workspace_GEN_COLIS }}/{{ item.item }}/target/{{ item.stdout }}"
    dest: "/opt/application/i99was/{{ RCD_DEF_VERSION }}/{{ item.item }}.jar"
  with_items: "{{ jarFileNames.results }}"
  when:
    - ansible_host in groups['myGroupB']
  delegate_to: "{{ buildServer_host }}"
As you can see, I want to transfer files from my "buildServer_host" to the other remote destination, which is only the group "myGroupB".
This fails, and I have no idea why.
It seems like it doesn't understand ansible_host in groups['myGroupB'] as a destination host.
Suggestions?
The delegate_to is more than enough; you don't need the when condition, as the delegation is done on the host you have selected in the delegate_to value.
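As a sketch of that advice (same variables as above; limiting the play to hosts: myGroupB is my assumption for keeping the copy restricted to that group without the when clause, and jarFileNames.results is assumed to be registered earlier):

- hosts: myGroupB
  become: true
  tasks:
    - name: Copy jars to API server
      synchronize:
        src: "{{ Workspace_GEN_COLIS }}/{{ item.item }}/target/{{ item.stdout }}"
        dest: "/opt/application/i99was/{{ RCD_DEF_VERSION }}/{{ item.item }}.jar"
      with_items: "{{ jarFileNames.results }}"
      delegate_to: "{{ buildServer_host }}"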
I have a dynamic inventory that assigns a "fact" to each host, called a 'cluster_number'.
The cluster numbers are not known in advance, but one or more hosts are assigned the same number. The inventory has hundreds of hosts and 2-3 dozen unique cluster numbers.
I want to run a task for all hosts in the inventory, however I want to execute it only once per each group of hosts sharing the same 'cluster_number' value. It does not matter which specific host is selected for each group.
I feel like there should be a relatively straightforward way to do this with Ansible, but I can't figure out how. I've looked at group_by, when, loop, delegate_to, etc., but no success yet.
An option would be to group_by the cluster_number, then run_once a loop over the cluster numbers and pick the first host from each group.
For example, given the hosts
[test]
test01 cluster_number='1'
test02 cluster_number='1'
test03 cluster_number='1'
test04 cluster_number='1'
test05 cluster_number='1'
test06 cluster_number='2'
test07 cluster_number='2'
test08 cluster_number='2'
test09 cluster_number='3'
test10 cluster_number='3'
[test:vars]
cluster_numbers=['1','2','3']
the following playbook
- hosts: all
  gather_facts: no
  tasks:
    - group_by: key=cluster_{{ cluster_number }}
    - debug: var=groups['cluster_{{ item }}'][0]
      loop: "{{ cluster_numbers }}"
      run_once: true
gives
> ansible-playbook test.yml | grep groups
"groups['cluster_1'][0]": "test01",
"groups['cluster_2'][0]": "test06",
"groups['cluster_3'][0]": "test09",
To execute tasks at the targets, use include_tasks (instead of debug in the loop above) and delegate_to the target:
- set_fact:
    my_group: "cluster_{{ item }}"
- command: hostname
  delegate_to: "{{ groups[my_group][0] }}"
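A minimal sketch of that wiring, assuming the two tasks above are saved in a file named cluster_tasks.yml (the file name is hypothetical):

- include_tasks: cluster_tasks.yml
  loop: "{{ cluster_numbers }}"
  run_once: true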
Note: Collect the list cluster_numbers from the inventory
cluster_numbers: "{{ hostvars|json_query('*.cluster_number')|unique }}"
If you don't mind cluttering the play logs, here's a way:
- hosts: all
  gather_facts: no
  serial: 1
  tasks:
    - group_by:
        key: "single_{{ cluster_number }}"
      when: groups['single_' + cluster_number] | default([]) | count == 0

- hosts: single_*
  gather_facts: no
  tasks:
    - debug:
        msg: "{{ inventory_hostname }}"
serial: 1 is crucial in the first play, so that the when statement is re-evaluated for every host.
After the first play you'll have N groups, one per cluster, each containing a single host.
I want to dynamically create an in-memory inventory that is a filtered version of a standard inventory, including only the hosts where a specific service is installed. The filtered inventory is to be used in a subsequent play.
So I identify the IP address of the host where the service is installed.
- name: find where the service is installed
  win_service:
    name: "{{ service }}"
  register: service_info
This returns a boolean exists value. Using this value as a condition, an attempt is made to add the host where the service is running.
- name: create filtered in memory inventory
  add_host:
    name: "{{ ansible_host }}"
  when: service_info.exists
The add_host module bypasses the play host loop and only runs once for all the hosts in the play; as such, this only works if the host that add_host happens to run against is the one that has the service installed.
Below is an attempt to force add_host to iterate across the hosts in the inventory; however, it appears that the hostvars, and therefore service_info.exists, are not being passed through to add_host, so the when check always returns false.
- name: create filtered in memory inventory
  add_host:
    name: "{{ ansible_host }}"
  when: service_info.exists
  with_items: "{{ ansible_play_batch }}"
Is there a way to pass the hosts with their hostvars to add_host as an iterator?
I suggest creating a task before add_host that writes a temporary file on the executor with the list of servers matching the condition, and then looping over that file in the add_host module.
The example is taken from Improving use of add_host in ansible, a question I asked before.
---
- hosts: servers
  tasks:
    - name: find where the service is installed
      win_service:
        name: "{{ service }}"
      register: service_info

    - name: write server name in file on control node
      lineinfile:
        path: /tmp/servers_foo.txt
        state: present
        line: "{{ inventory_hostname }}"
      delegate_to: 127.0.0.1
      when: service_info.exists

    - name: assign target to group
      add_host:
        name: "{{ item }}"
        groups:
          - foo
      with_lines: cat /tmp/servers_foo.txt
      delegate_to: 127.0.0.1
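An alternative sketch that skips the temporary file, assuming the same service_info register: registered results are reachable through hostvars, so add_host can loop over the play hosts and check each host's own result directly (run_once stops the loop from repeating for every host):

- name: assign targets to group without a temp file
  add_host:
    name: "{{ item }}"
    groups:
      - foo
  when: hostvars[item].service_info.exists | default(false)
  loop: "{{ ansible_play_hosts }}"
  run_once: true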
I want to run an Ansible playbook on multiple hosts and register the output to a variable. Then, using this variable, I want to copy the output to a single file. The issue is that in the end the file contains the output of only one host. How can I append the output of all the hosts to the file, one after the other? I don't want to use serial: 1, as it slows down execution considerably when there are many hosts.
- hosts: all
  remote_user: cisco
  connection: local
  gather_facts: no
  vars_files:
    - group_vars/passwords.yml
  tasks:
    - name: Show command collection
      ntc_show_command:
        connection: ssh
        template_dir: /ntc-ansible/ntc-templates/templates
        platform: cisco_ios
        host: "{{ inventory_hostname }}"
        username: "{{ ansible_ssh_user }}"
        password: "{{ ansible_ssh_pass }}"
        command: "{{ commands }}"
      register: result

    - local_action: copy content="{{ result.response }}" dest='/home/user/show_cmd_ouput.txt'
The result variable will be registered as a fact on each host the ntc_show_command task ran on, so you should access the values through the hostvars dictionary.
- local_action:
    module: copy
    content: "{{ groups['all'] | map('extract', hostvars, 'result') | map(attribute='response') | list }}"
    dest: /home/user/show_cmd_ouput.txt
  run_once: true
You also need run_once, because otherwise the action would still be run as many times as there are hosts in the group.
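If the raw list rendering is hard to read, a hedged variation is to pretty-print it with to_nice_json (same assumptions about result.response as above):

- local_action:
    module: copy
    content: "{{ groups['all'] | map('extract', hostvars, 'result') | map(attribute='response') | list | to_nice_json }}"
    dest: /home/user/show_cmd_ouput.txt
  run_once: true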