Ansible and debugging script

I want to run a python script locally and print (debug) the output. Running the script outside of Ansible gives me this:
$ python ../scripts/test-winrm-v5.py
i-746726df
Here's my playbook:
$ cat test4.yml
- name: list instance ids
  hosts: all
  gather_facts: false
  tasks:
    local_action: shell python ../scripts/test-winrm-v5.sh
    register: result
    debug: msg="{{ result }}"
Here's what I get when I run the playbook:
$ ansible-playbook -i ../inventory/phani test4.yml
PLAY [list instance ids] ******************************************************
PLAY RECAP ********************************************************************
Why doesn't the output from the script get displayed?

Looks like I had some control character on the debug line. I typed that line again, and now it's working. Here's the playbook:
- name: list instance ids
  hosts: windows
  tasks:
    - debug: msg="System {{ inventory_hostname }}"
    - local_action: shell python ../scripts/test-winrm-v5.py
      register: result
    - debug: msg="Result of running script is {{ result.stdout }}"
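As an aside, on current Ansible versions the same local run can be written with delegate_to instead of local_action; a minimal sketch, assuming the same script path as above:
- name: list instance ids
  hosts: windows
  tasks:
    - name: run the script on the controller
      command: python ../scripts/test-winrm-v5.py
      delegate_to: localhost
      register: result

    - name: show what the script printed
      debug:
        msg: "Result of running script is {{ result.stdout }}"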

Related

Ansible - dynamic variable with host_vars and string in the name

I'm trying to create a role in Ansible to run yum install/update of packages. The package lists will be provided by a 3rd-party system as .yml files in the role's vars directory, following the convention server01.yml, server02.yml, serverX.yml, with a variable of the form packageList_serverNumber: 'list of packages'.
This variable will be read using a task:
- name: server update packages from host_vars
  yum:
    name: "{{ install_pkgs }}"
    state: latest
This should point to the host_vars file for the specific host:
install_pkgs: "{{ packageList_server01 }}"
As this task should only run when the variable is defined, I was trying to use a when clause with a variable that points to packageList_serverNumber. When I hardcode it, as below, it works:
when: packageList_server01 is defined
Can you please advise how to make it dynamic?
I was trying with:
when: packageList_{{hostvars[inventory_hostname]}} is defined
But unfortunately this is not working.
Use the vars lookup plugin. Run the command below to see the details:
shell> ansible-doc -t lookup vars
Given the vars files
shell> cat roles/test4/vars/server01.yml
packageList_server01: [pkg1, pkg2, pkg3]
shell> cat roles/test4/vars/server02.yml
packageList_server02: [pkg4, pkg5, pkg6]
shell> cat roles/test4/vars/server03.yml
packageList_server03: [pkg7, pkg8, pkg9]
read the vars, declare the variable install_pkgs, and use it
shell> cat roles/test4/tasks/main.yml
- include_vars: "vars/{{ inventory_hostname }}.yml"
- set_fact:
    install_pkgs: "{{ lookup('vars', 'packageList_' ~ inventory_hostname) }}"
- debug:
    msg: "Install {{ install_pkgs }}"
For example, the playbook
- hosts: server01,server02,server03
  gather_facts: false
  roles:
    - test4
gives (abridged)
TASK [test4 : debug] ****
ok: [server01] =>
msg: Install ['pkg1', 'pkg2', 'pkg3']
ok: [server03] =>
msg: Install ['pkg7', 'pkg8', 'pkg9']
ok: [server02] =>
msg: Install ['pkg4', 'pkg5', 'pkg6']
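If you also want the conditional the question asked about, the same lookup can drive the when: clause; a rough sketch, assuming the packageList_<hostname> naming used above:
# assumes vars named packageList_<inventory_hostname>, as in the vars files above
- name: server update packages from host_vars
  yum:
    name: "{{ lookup('vars', 'packageList_' ~ inventory_hostname) }}"
    state: latest
  when: lookup('vars', 'packageList_' ~ inventory_hostname, default=[]) | length > 0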

Pause time between hosts in the Ansible Inventory

I am trying the task below in my playbook, but the pause is not executed. I want the play to pause for 30 seconds after each host is deleted.
name: delete host from the NagiosXI
shell: curl -k -XDELETE "https://10.000.00.00/nagiosxi/api/v1/config/host?apikey=qdjcwc&pretty=1&host_name={{ item }}&applyconfig=1"
- pause:
    seconds: 120
  ignore_error: yes
  with_items:
    - "{{ groups['grp1'] }}"
Can someone confirm whether this is the right way of doing it, or suggest the right way? I also tried serial: 1, but it's still not working.
You can use pause under your loop:
- name: Pause
  hosts: all
  gather_facts: False
  tasks:
    - name: delete host from the NagiosXI
      shell: curl -k -XDELETE "https://10.000.00.00/nagiosxi/api/v1/config/host?apikey=qdjcwc&pretty=1&host_name={{ item }}&applyconfig=1"
      ignore_errors: True
      with_items:
        - "{{ groups['grp1'] }}"
      loop_control:
        pause: 120
Unfortunately, applying multiple tasks to with_items is not possible in Ansible at the moment, but it is still doable with the include directive. As an example, the main play file would be:
---
- hosts: localhost
  connection: local
  gather_facts: no
  remote_user: me
  tasks:
    - include: sub_play.yml nagios_host={{ item }}
      with_items:
        - host1
        - host2
        - host3
The sub_play.yml which is included in the main play would be:
---
- shell: echo "{{ nagios_host }}"
- pause:
    prompt: "Waiting for {{ nagios_host }}"
    seconds: 5
In this case, the include statement is executed in a loop, which runs all the tasks in sub_play.yml for each item.
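On newer Ansible versions, where the bare include is deprecated, the same pattern can be written with include_tasks; a minimal sketch, untested:
---
- hosts: localhost
  connection: local
  gather_facts: no
  tasks:
    # sub_play.yml is the same task file shown above
    - include_tasks: sub_play.yml
      vars:
        nagios_host: "{{ item }}"
      with_items:
        - host1
        - host2
        - host3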

How to get value of --limit argument inside an Ansible playbook?

In Ansible, is it possible to get the value of the argument to the --limit option within a playbook? What I want to do is something like this:
---
- hosts: all
  remote_user: root
  tasks:
    - name: The value of the --limit argument
      debug:
        msg: "argument of --limit is {{ ansible-limit-arg }}"
Then when I run the command:
$ ansible-playbook getLimitArg.yaml --limit webhosts
I'll get this output:
argument of --limit is webhost
Of course, I made up the variable name "ansible-limit-arg", but is there a valid way of doing this? I could specify "webhosts" twice, the second time with --extra-vars, but that seems a roundabout way of doing it.
Since Ansible 2.5 you can access the value using a new magic variable ansible_limit, so:
- debug:
    var: ansible_limit
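As a usage example, ansible_limit is often used as a guard so a playbook refuses to run against the whole inventory; a short sketch, assuming ansible_limit is undefined when no --limit was passed:
- hosts: all
  gather_facts: false
  tasks:
    # assumes ansible_limit is undefined when the playbook is run without --limit
    - name: require --limit
      fail:
        msg: "Please run this playbook with --limit"
      when: ansible_limit is not defined
      run_once: true

    - name: show the limit that was passed
      debug:
        var: ansible_limit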
Have you considered using the {{ ansible_play_batch }} built-in variable?
- hosts: all
  become: "False"
  gather_facts: "False"
  tasks:
    - name: The value of the --limit argument
      debug:
        msg: "argument of --limit is {{ ansible_play_batch }}"
      delegate_to: localhost
It won't tell you exactly what was entered as the argument but it will tell you how Ansible interpreted the --limit arg.
You can't do this without an additional plugin or module. If you really need it, write an action plugin and access the options via cli.options (see example).
P.S. If you are trying to use --limit to run playbooks against different environments, don't do it: you can accidentally blow away your whole infrastructure. Use different inventories instead.
Here is a small code block to achieve the same:
- block:
    - shell: 'echo {{inventory_hostname}} >> /tmp/hosts'
    - shell: cat /tmp/hosts
      register: servers
    - file:
        path: /tmp/hosts
        state: absent
  delegate_to: 127.0.0.1
- debug: var=servers.stdout_lines
Then use the stdout_lines output as you wish, for example:
- add_host:
    name: "{{ item }}"
    groups: cluster
    ansible_user: "ubuntu"
  with_items:
    - "{{ servers.stdout_lines }}"

Ansible: Accumulate output across multiple hosts on task run

I have the following playbook
- hosts: all
  gather_facts: False
  tasks:
    - name: Check status of applications
      shell: somecommand
      register: result
      changed_when: False
      always_run: yes
After this task, I want to run a mail task that mails the accumulated output of the command above, which is registered in the variable result. Right now, when I try to do this, I get mailed for every single host. Is there some way to accumulate the output across multiple hosts and register that to a variable?
You can extract result from hostvars inside a run_once task:
- hosts: mygroup
  gather_facts: false
  tasks:
    - shell: date
      register: date_res
      changed_when: false
    - debug:
        msg: "{{ ansible_play_hosts | map('extract', hostvars, 'date_res') | map(attribute='stdout') | list }}"
      run_once: yes
This will print out a list of all date_res.stdout from all hosts in the current play and run this task only once.
While trying to copy the results of date_res.stdout to a file, only a single host's data is copied; not all hosts' data ends up in the file:
- name: copy all
  copy:
    content: "{{ allhost_out.stdout }}"
    dest: "/ngs/app/user/outputsecond-{{ inventory_hostname }}.txt"

Ansible writing output from multiple task to a single file

In Ansible, I have written a YAML playbook that takes a list of host names and executes a command for each host. I register a variable for these tasks and, at the end of each task, append the output of the command to a single file.
But every time I try to append to my output file, only the last record gets persisted.
---
- hosts: list_of_hosts
  become_user: some user
  vars:
    output: []
  tasks:
    - name: some name
      command: some command
      register: output
      failed_when: "'FAILED' in output"
    - debug: msg="{{output | to_nice_json}}"
    - local_action: copy content='{{output | to_nice_json}}' dest="/path/to/my/local/file"
I even tried to append using lineinfile with the insertafter parameter, but was not successful.
Is there anything that I am missing?
You can try something like this:
- name: dummy
  hosts: myhosts
  serial: 1
  tasks:
    - name: create file
      file:
        dest: /tmp/foo
        state: touch
      delegate_to: localhost
    - name: run cmd
      shell: echo "{{ inventory_hostname }}"
      register: op
    - name: append
      lineinfile:
        dest: /tmp/foo
        line: "{{ op }}"
        insertafter: EOF
      delegate_to: localhost
I have used serial: 1 as I am not sure whether lineinfile tasks running in parallel would garble the output file.
The Ansible docs recommend using copy:
- name: get jstack
  shell: "/usr/lib/jvm/java/bin/jstack -l {{PID_JAVA_APP}}"
  args:
    executable: /bin/bash
  register: jstackOut
- name: write jstack
  copy:
    content: "{{jstackOut.stdout}}"
    dest: "tmp/jstack.txt"
If you want to write a local file, add this:
delegate_to: localhost
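Putting the two together, a sketch of writing each host's output to a separate local file on the controller (the filename is illustrative):
# filename is illustrative
- name: write jstack locally, one file per host
  copy:
    content: "{{ jstackOut.stdout }}"
    dest: "/tmp/jstack-{{ inventory_hostname }}.txt"
  delegate_to: localhost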
Why complicate things?
I did it like this and it worked:
ansible-playbook your_playbook.yml >> /file/you/want/to/redirect/output.txt
You can also do some parsing with grep, or append with tee -a.
