My playbook is as follows
- hosts: nodes
  become: yes
  tasks:
    - name: Run Shell Script to get IPs with 4xx and 5xx errors
      script: /home/ubuntu/ips.sh
      args:
        chdir: /home/ubuntu
      register: ips

    - name:
      shell: echo "{{ hostvars[groups['nodes'][0]].ips.stdout }}" > pip.txt
      delegate_to: localhost
There are 10 Ansible hosts. Is there a way I can access ips.stdout from all 10 hosts from my local server? I'm able to get the first host's output with the above command. How can I access the stdout of all 10 hosts from a single variable?
Yes, using map('extract') followed by map(attribute='stdout'):
- shell: echo "{{ groups.nodes | map('extract', hostvars, 'ips') | map(attribute='stdout') | join(', ') }}" > pip.txt
  delegate_to: localhost
  run_once: true
You'll want run_once: true; otherwise, yes, it will delegate to your local machine, but it will also run that action once for every host in the inventory, which is wasteful.
If you're interested, you can also use copy: to make it more Ansible-y, since it won't touch your file if it knows the contents haven't changed:
- copy:
    dest: pip.txt
    content: "{{ groups.nodes | map('extract', hostvars, 'ips') | map(attribute='stdout') | join(', ') }}"
  delegate_to: localhost
  run_once: true
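If you only want the hosts that actually ran the script in the current play (rather than every host in the nodes group), the ansible_play_hosts variable used in later answers on this page should work as a drop-in replacement. A minimal, untested sketch:

- copy:
    dest: pip.txt
    # ansible_play_hosts only lists hosts that are still active in the current play
    content: "{{ ansible_play_hosts | map('extract', hostvars, 'ips') | map(attribute='stdout') | join(', ') }}"
  delegate_to: localhost
  run_once: true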
Related
The script for my ansible job is located in a gitlab repository.
E.g.: /Ansible/job.yaml
I want to create a new file from my Ansible job that contains the response of another Ansible task that I run, in the same location as my job script.
E.g.: /Ansible/output.txt
Is it possible? Usually I write the file to the server host, but this time I need it to be in GitLab.
Given that GitLab does indeed let you write in that folder, you can:
delegate the task to write to localhost
use the special variable playbook_dir to write it in the same folder as the playbook
So, something like:
- copy:
    content: "{{ some_other_registered_task_output }}"
    dest: "{{ playbook_dir }}/output.txt"
  delegate_to: localhost
Mind that if you are targeting multiple hosts in that playbook, you will have to aggregate all the nodes' registered output, otherwise you will end up with the output of one node only.
Which you can achieve with something like:
- copy:
    content: "{{
        hostvars
        | dict2items
        | selectattr('key', 'in', ansible_play_hosts)
        | map(attribute='value.some_other_registered_task_output')
        | join('\n\n')
      }}"
    dest: "{{ playbook_dir }}/output.txt"
  delegate_to: localhost
For example, the two tasks:
- command: echo '{{ inventory_hostname }}'
  register: some_other_registered_task_output

- copy:
    content: "{{
        hostvars
        | dict2items
        | selectattr('key', 'in', ansible_play_hosts)
        | map(attribute='value.some_other_registered_task_output.stdout')
        | join('\n\n')
      }}"
    dest: "{{ playbook_dir }}/output.txt"
  delegate_to: localhost
Run against nodes called node1, node2 and node3, this would create a file output.txt, in the same folder as the playbook, on the controller, containing:
node3
node1
node2
I want to write the hostname and ssh_host of the hosts under a specific group from an inventory file to a file on my localhost.
I have tried this:
my_role

  set_fact:
    hosts: "{{ hosts_list|default({}) | combine( {inventory_hostname: ansible_ssh_host} ) }}"
  when: "'my_group' in group_names"

Playbook

  become: no
  gather_facts: no
  roles:
    - my_role
But now if I try to write this to a file in another task, it creates a file for all the hosts in the inventory file.
Can you please suggest a better way to create a file containing a dictionary with inventory_hostname and ansible_ssh_host as key and value respectively, by running the playbook on localhost?
An option would be to use lineinfile and delegate_to localhost.
- hosts: test_01
  tasks:
    - set_fact:
        my_inventory_hostname: "{{ ansible_host }}"

    - tempfile:
      delegate_to: localhost
      register: result

    - lineinfile:
        path: "{{ result.path }}"
        line: "inventory_hostname: {{ my_inventory_hostname }}"
      delegate_to: localhost

    - debug:
        msg: "inventory_hostname: {{ ansible_host }}
              stored in {{ result.path }}"
Note: There is no need to quote the condition as a whole. Conditions are expanded by default, e.g.
when: inventory_hostname in groups['my_group']
Thanks a lot for your answer, @Vladimir Botka.
I came up with another requirement to write the hosts belonging to a certain group to a file; the code below can help with that (adding to @Vladimir's solution).
- hosts: all
  become: no
  gather_facts: no
  tasks:
    - set_fact:
        my_inventory_hostname: "{{ ansible_host }}"

    - lineinfile:
        path: temp/hosts
        line: "inventory_hostname: {{ my_inventory_hostname }}"
      when: inventory_hostname in groups['my_group']
      delegate_to: localhost

    - debug:
        msg: "inventory_hostname: {{ ansible_host }}"
I want to run an Ansible playbook on multiple hosts and register the output to a variable. Now, using this variable, I want to copy the output to a single file. The issue is that in the end the file contains the output of only one host. How can I add the output of all the hosts to a file, one after the other? I don't want to use serial: 1 as it slows down execution considerably when there are multiple hosts.
- hosts: all
  remote_user: cisco
  connection: local
  gather_facts: no
  vars_files:
    - group_vars/passwords.yml
  tasks:
    - name: Show command collection
      ntc_show_command:
        connection: ssh
        template_dir: /ntc-ansible/ntc-templates/templates
        platform: cisco_ios
        host: "{{ inventory_hostname }}"
        username: "{{ ansible_ssh_user }}"
        password: "{{ ansible_ssh_pass }}"
        command: "{{ commands }}"
      register: result

    - local_action:
        copy content="{{ result.response }}" dest='/home/user/show_cmd_ouput.txt'
The result variable will be registered as a fact on each host the ntc_show_command task ran on, so you should access the values through the hostvars dictionary.
- local_action:
    module: copy
    content: "{{ groups['all'] | map('extract', hostvars, 'result') | map(attribute='response') | list }}"
    dest: /home/user/show_cmd_ouput.txt
  run_once: true
You also need run_once because the action would otherwise still be run as many times as there are hosts in the group.
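For reference, the same aggregation written with delegate_to instead of local_action (the two forms are equivalent; this is just the spelling used elsewhere on this page):

- copy:
    content: "{{ groups['all'] | map('extract', hostvars, 'result') | map(attribute='response') | list }}"
    dest: /home/user/show_cmd_ouput.txt
  delegate_to: localhost
  run_once: true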
I have the following playbook
- hosts: all
  gather_facts: False
  tasks:
    - name: Check status of applications
      shell: somecommand
      register: result
      changed_when: False
      always_run: yes
After this task, I want to run a mail task that mails the accumulated output of all the commands from the above task, registered in the variable result. As of right now, when I try to do this, I get a mail for every single host. Is there some way to accumulate the output across multiple hosts and register that to a variable?
You can extract result from hostvars inside a run_once task:
- hosts: mygroup
  gather_facts: false
  tasks:
    - shell: date
      register: date_res
      changed_when: false

    - debug:
        msg: "{{ ansible_play_hosts | map('extract', hostvars, 'date_res') | map(attribute='stdout') | list }}"
      run_once: yes
This will print out a list of all date_res.stdout from all hosts in the current play and run this task only once.
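Applied to the original question, a mail task can consume the same aggregated expression. A rough sketch, assuming the community.general.mail module is available and a local SMTP relay runs on the controller; host, port and addresses are placeholders:

- community.general.mail:
    host: localhost
    port: 25
    to: ops@example.com
    subject: Accumulated command output from all hosts
    # join the per-host stdout values registered in 'result' into one message body
    body: "{{ ansible_play_hosts | map('extract', hostvars, 'result') | map(attribute='stdout') | join('\n\n') }}"
  delegate_to: localhost
  run_once: yes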
While trying to copy the results of date_res.stdout to a file on the host, only a single host's data is copied; not all hosts' data is available:
- name: copy all
  copy:
    content: "{{ allhost_out.stdout }}"
    dest: "/ngs/app/user/outputsecond-{{ inventory_hostname }}.txt"
I'm creating a playbook with this play:
On hosts hypervisors:
- retrieve the list of virtual machines from all hosts
- use the add_host module to add all of them to a new inventory group called guests
My inventory:
[hypervisors]
host1
host2
My playbook:
- hosts: hypervisors
  tasks:
    - shell: virsh list | awk 'NR>2' | awk '{print $2}'
      register: result_virsh

    - add_host:
        name: "{{ item }}"
        group: "guests"
      with_items: "{{ result_virsh.stdout_lines }}"
The add_host module bypasses the play host loop and only runs once for all the hosts in the play.
So it is called only once (for host1); it's a particular case of the use of this module (see the module documentation), as if run_once were implicitly set to true.
How can I use it for all hosts in group hypervisors?
EDIT: Example to reproduce it on your computer with only localhost
Create file /tmp/host1_test to simulate a return of guests vm1 and vm2:
vm1
vm2
Create file /tmp/host2_test to simulate a return of guests vm3 and vm4:
vm3
vm4
Use this inventory (test_add_host.ini) with two hosts, both with fixed IP address 127.0.0.1:
[hypervisors]
host1 ansible_host=127.0.0.1 test_filename=/tmp/host1_test
host2 ansible_host=127.0.0.1 test_filename=/tmp/host2_test
Use this playbook (test_add_host.yml):
- hosts: hypervisors
  gather_facts: no
  tasks:
    - shell: "cat {{ test_filename }}"
      register: result_virsh

    - add_host:
        name: "{{ item }}"
        group: "guests"
      with_items: "{{ result_virsh.stdout_lines }}"

- hosts: guests
  gather_facts: no
  tasks:
    - local_action: ping
Call this playbook locally with command:
ansible-playbook -c local -i test_add_host.ini test_add_host.yml
The first play calls hosts host1 and host2.
The second play calls hosts vm1 and vm2 only.
What should I do to call all hosts (vm1, vm2, vm3 and vm4) in the second play?
As you noted, there's a thing about add_host: BYPASS_HOST_LOOP = True.
So it's a kind of forced run_once.
If you don't mind running over hypervisors in sequential manner, you can simply use serial: 1:
- hosts: hypervisors
  serial: 1
  tasks:
    - shell: virsh list | awk 'NR>2' | awk '{print $2}'
      register: result_virsh

    - add_host:
        name: "{{ item }}"
        group: "guests"
      with_items: "{{ result_virsh.stdout_lines }}"
This ensures that every play batch consists of only one host, so add_host executes for every host.
If you don't want to run the play serially you can aggregate the results with ansible_play_hosts and map. The results can be used in the next play.
- hosts: all
  gather_facts: false
  tasks:
    - shell: virsh list | awk 'NR>2' | awk '{print $2}'
      register: result_virsh
      changed_when: false

    - add_host:
        name: "{{ item }}"
        group: guests
      changed_when: false
      loop: "{{ ansible_play_hosts | map('extract', hostvars, 'result_virsh') | map(attribute='stdout_lines') | flatten }}"

- hosts: guests
  gather_facts: false
  tasks:
    - ping:
This answer was derived from Ansible: Accumulate output across multiple hosts on task run.
I solved this problem (with my localhost example) with the following playbook. This solution is very complex; if you have a simpler one, share it!
I didn't want to use dynamic inventories
# Get list of virtual machines in hostvars[inventory_hostname].vms
- hosts: hypervisors
  gather_facts: no
  tasks:
    - shell: "cat {{ test_filename }}"
      register: result_virsh

    - set_fact:
        vms: "{{ result_virsh.stdout_lines }}"

# Remove previous vm_hosts file
- hosts: localhost
  gather_facts: no
  tasks:
    - file:
        path: /tmp/vm_hosts
        state: absent
# Build file vm_hosts with the list of virtual machines, in serial (working in parallel on the same file causes some trouble)
- hosts: hypervisors
  gather_facts: no
  serial: 1
  tasks:
    - block:
        - file:
            path: /tmp/vm_hosts
            mode: 0644
            state: touch
          run_once: yes

        - lineinfile:
            dest: /tmp/vm_hosts
            line: '{{ item }}'
          with_items: "{{ hostvars[inventory_hostname].vms }}"
      delegate_to: localhost

# Add list of virtual machines from file vm_hosts to in-memory inventory
- hosts: localhost
  gather_facts: no
  tasks:
    - add_host:
        name: "{{ item }}"
        group: "guests"
      with_lines: cat /tmp/vm_hosts

- hosts: guests
  gather_facts: no
  tasks:
    - local_action: ping