The script for my Ansible job is located in a GitLab repository, e.g. /Ansible/job.yaml.
I want my Ansible job to create a new file that contains the response of another Ansible job I run, placed in the same location as my job script, e.g. /Ansible/output.txt.
Is this possible? Usually I put the file on the server host, but this time I need it to be in GitLab.
Provided that GitLab does indeed let you write to that folder, you can:
delegate the task to write to localhost
use the special variable playbook_dir to write it in the same folder as the playbook
So, something like:
- copy:
    content: "{{ some_other_registered_task_output }}"
    dest: "{{ playbook_dir }}/output.txt"
  delegate_to: localhost
Mind that if you are targeting multiple hosts in that playbook, you will have to aggregate all the nodes' registered outputs, otherwise you will end up with the output of one node only.
Which you can achieve with something like:
- copy:
    content: "{{
        hostvars
        | dict2items
        | selectattr('key', 'in', ansible_play_hosts)
        | map(attribute='value.some_other_registered_task_output')
        | join('\n\n')
      }}"
    dest: "{{ playbook_dir }}/output.txt"
  delegate_to: localhost
For example, the two tasks:
- command: echo '{{ inventory_hostname }}'
  register: some_other_registered_task_output

- copy:
    content: "{{
        hostvars
        | dict2items
        | selectattr('key', 'in', ansible_play_hosts)
        | map(attribute='value.some_other_registered_task_output.stdout')
        | join('\n\n')
      }}"
    dest: "{{ playbook_dir }}/output.txt"
  delegate_to: localhost
Run on nodes called node1, node2 and node3, this would create a file output.txt, in the same folder as the playbook, on the controller, containing:
node3
node1
node2
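Note that the order of the entries is not guaranteed. If you need deterministic output, an untested variation of the task above is to sort the entries by host name before mapping:

- copy:
    content: "{{
        hostvars
        | dict2items
        | selectattr('key', 'in', ansible_play_hosts)
        | sort(attribute='key')
        | map(attribute='value.some_other_registered_task_output.stdout')
        | join('\n\n')
      }}"
    dest: "{{ playbook_dir }}/output.txt"
  delegate_to: localhost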
I created a Workflow job in AWX containing 2 jobs:
Job 1 uses the credentials of the Windows server we get the JSON file from. It reads the content and puts it in a variable using set_stats.
Job 2 uses the credentials of the server the JSON file should be uploaded to. It reads the content of the variable set by the set_stats task in job 1 and creates a JSON file with that content.
First job:
- name: get content
  win_shell: 'type {{ file_dir }}{{ file_name }}'
  register: content

- name: write content
  debug:
    msg: "{{ content.stdout_lines }}"
  register: result

- set_fact:
    this_local: "{{ content.stdout_lines }}"

- set_stats:
    data:
      test_stat: "{{ this_local }}"

- name: set hostname in a variable
  set_stats:
    data:
      current_hostname: "{{ ansible_hostname }}"
    per_host: no
Second job:
- name: convert to json and copy the file to destination control node
  copy:
    content: "{{ test_stat | to_json }}"
    dest: "/tmp/{{ current_hostname }}.json"
How can I get the current_hostname, so that the created JSON file is named <original_hostname>.json? In my case it's concatenating the two hosts which I passed in the first job.
In my case it's concatenating the two hosts which I passed in the first job
... which is precisely what you asked for: you used per_host: no as a parameter to set_stats, which gathers the current_hostname stat globally for all hosts, and aggregate: yes is the default.
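For illustration, here is a hedged, untested sketch of that last task with both parameters spelled out; aggregate: no is the non-default setting that would stop the concatenation by keeping only the last value written:

- set_stats:
    data:
      current_hostname: "{{ ansible_hostname }}"
    per_host: no     # default: a single global stat shared by all hosts
    aggregate: no    # non-default: overwrite instead of merging each host's value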
Anyhow, this is not exactly the intended use of set_stats, and you are making this overly complicated IMO.
You don't need two jobs. In this particular case, you can delegate the write task to a Linux host in the middle of a play dedicated to Windows hosts (and one AWX job can use several credentials).
Here is a pseudo, untested playbook to give you the idea. You'll want to read the slurp module documentation (it returns the file content base64-encoded, hence the b64decode filter below); I used it to replace your shell task that reads the file, which is a bad practice.
Assuming your inventory looks something like:
---
windows_hosts:
  hosts:
    win1:
    win2:
linux_hosts:
  hosts:
    json_file_target_server:
The playbook would look like:
- name: Gather jsons from win and write to linux target
  hosts: windows_hosts
  tasks:
    - name: Get file content
      slurp:
        src: "{{ file_dir }}{{ file_name }}"
      register: json_file

    - name: Push json content to target linux
      copy:
        content: "{{ json_file.content | b64decode | to_json }}"
        dest: "/tmp/{{ inventory_hostname }}.json"
      delegate_to: json_file_target_server
My playbook is as follows:
- hosts: nodes
  become: yes
  tasks:
    - name: Run Shell Script to get IPs with 4xx and 5xx errors
      script: /home/ubuntu/ips.sh
      args:
        chdir: /home/ubuntu
      register: ips

    - name:
      shell: echo "{{ hostvars[groups['nodes'][0]].ips.stdout }}" > pip.txt
      delegate_to: localhost
There are 10 Ansible hosts. Is there a way I can access ips.stdout from all 10 hosts from my local server? I'm able to get the first host with the above command. How can I access all 10 hosts' stdout from a single variable?
How can I access all 10 hosts' stdout from a single variable?
Yes, using map("extract") followed by map(attribute=):
- shell: echo "{{ groups.nodes | map('extract', hostvars, 'ips') | map(attribute='stdout') | join(', ') }}" > pip.txt
  delegate_to: localhost
  run_once: true
You'll want run_once: true; otherwise, yes, it will delegate to your local machine, but it will also do that action once for every host in the inventory, which is wasteful.
If you're interested, you can also use copy to make it more Ansible-y, since it won't change your file if it knows the contents haven't changed:
- copy:
    dest: pip.txt
    content: "{{ groups.nodes | map('extract', hostvars, 'ips') | map(attribute='stdout') | join(', ') }}"
  delegate_to: localhost
  run_once: true
I want to run an Ansible playbook on multiple hosts and register the output to a variable. Using this variable, I then want to copy the output to a single file. The issue is that in the end the file contains the output of only one host. How can I add the output of all the hosts to the file, one after the other? I don't want to use serial: 1 as it slows down execution considerably when there are many hosts.
- hosts: all
  remote_user: cisco
  connection: local
  gather_facts: no
  vars_files:
    - group_vars/passwords.yml
  tasks:
    - name: Show command collection
      ntc_show_command:
        connection: ssh
        template_dir: /ntc-ansible/ntc-templates/templates
        platform: cisco_ios
        host: "{{ inventory_hostname }}"
        username: "{{ ansible_ssh_user }}"
        password: "{{ ansible_ssh_pass }}"
        command: "{{ commands }}"
      register: result

    - local_action: copy content="{{ result.response }}" dest='/home/user/show_cmd_ouput.txt'
The result variable will be registered as a fact on each host the ntc_show_command task ran on, so you should access the values through the hostvars dictionary:
- local_action:
    module: copy
    content: "{{ groups['all'] | map('extract', hostvars, 'result') | map(attribute='response') | list }}"
    dest: /home/user/show_cmd_ouput.txt
  run_once: true
You also need run_once, because otherwise the action would be run as many times as there are hosts in the group.
I have the following playbook:
- hosts: all
  gather_facts: False
  tasks:
    - name: Check status of applications
      shell: somecommand
      register: result
      changed_when: False
      always_run: yes
After this task, I want to run a mail task that mails the accumulated output of the above command, registered in the variable result, across all hosts. Right now, when I try to do this, I get mailed for every single host. Is there some way to accumulate the output across multiple hosts and register it in a variable?
You can extract result from hostvars inside a run_once task:
- hosts: mygroup
  gather_facts: false
  tasks:
    - shell: date
      register: date_res
      changed_when: false

    - debug:
        msg: "{{ ansible_play_hosts | map('extract', hostvars, 'date_res') | map(attribute='stdout') | list }}"
      run_once: yes
This will print a list of the date_res.stdout values from all hosts in the current play, and run the task only once.
While trying to copy the results of date_res.stdout to a file on the host, only a single host's data is copied; the data from all the hosts is not available:
- name: copy all
  copy:
    content: "{{ allhost_out.stdout }}"
    dest: "/ngs/app/user/outputsecond-{{ inventory_hostname }}.txt"
Below is a part of a playbook in Ansible 2.1:
- hosts: localhost
  any_errors_fatal: true
  tasks:
    - name: Bla Bla
      file: path=/var/tmp/somedir state=directory
      #ignore_errors: no

    - name: Create directory for every host
      file: path=/var/tmp/somedir/{{ item }} state=directory
      with_items: "{{ groups['XYZ'] }}"

    - name: Get File contents of NewFile
      shell: cat NewFile.txt executable=/bin/bash
      register: file_contents

- hosts: XYZ
  #any_errors_fatal: true
  vars:
    num_hosts: "{{ groups['XYZ'] | length }}"
  serial: num_hosts
  tasks:
    - name: Copy files to corresponding directories
      vars:
        path: /var/tmp/somedir/{{ item[0] }}
      synchronize: mode=pull src={{ item[1] }} dest={{ path }}
      with_nested:
        - "{{ groups['XYZ'] }}"
        - with_lines: cat NewFile.txt
This does not work.
The problem is that I am not able to reference file_contents, which was registered under localhost, and Ansible does not support running cat on NewFile from the hosts: XYZ play.
Is there any way to do this in some simple manner? I need to check the contents of NewFile in this playbook only, and then use the same playbook to copy files from remote to local.
As mentioned in the comments, facts (and variables in general) are stored on a per-host basis. If you have registered a value from a task running on localhost, you can access it from any task running in the context of other hosts through the global hostvars dict. All hosts and their facts are stored in there:
hostvars['localhost']['file_contents']
I am not entirely sure simply registered variables are available in the hostvars dict. If not, you have to use set_fact in the first play to store it as a fact.
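For completeness, here is a minimal, untested sketch of the whole pattern (recent Ansible versions do expose registered variables through hostvars; if yours does not, add the set_fact step mentioned above):

- hosts: localhost
  tasks:
    - name: Read NewFile on the controller
      shell: cat NewFile.txt
      register: file_contents

- hosts: XYZ
  tasks:
    - name: Use the value registered on localhost
      debug:
        msg: "{{ hostvars['localhost']['file_contents'].stdout }}"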