My Ansible playbook deploys to both database and webservers and I need to use some shared variables between them. The answer from this question almost gives me what I need:
---
- hosts: all
  tasks:
    - set_fact: my_global_var='hello'

- hosts: db
  tasks:
    - debug: msg={{my_global_var}}

- hosts: web
  tasks:
    - debug: msg={{my_global_var}}
However, in my case the variable is a password that is generated randomly by the playbook on each run and then has to be shared:
---
- hosts: all
  tasks:
    - name: Generate new password
      shell: "tr -dc _[:alnum:] < /dev/urandom | head -c${1:-20}"
      register: new_password

    - name: Set password as fact
      set_fact:
        my_global_var: "{{ new_password.stdout }}"

- hosts: db
  tasks:
    - debug: msg={{my_global_var}}

- hosts: web
  tasks:
    - debug: msg={{my_global_var}}
The above example doesn't work, because the password is regenerated for each host in the all group, so every host ends up with a completely different value (unless you coincidentally use the same machine/hostname for your db and web servers).
Ideally I don't want someone to have to remember to pass a good random password in on the command-line using --extra-vars, it should be generated and handled by the playbook.
Is there any suggested mechanism in Ansible for creating variables within a playbook and having it accessible to all hosts within that playbook?
You may want to generate the password on localhost and then copy it to every other host:
---
- hosts: localhost
  tasks:
    - name: Generate new password
      shell: "tr -dc _[:alnum:] < /dev/urandom | head -c${1:-20}"
      register: new_password

- hosts: all
  tasks:
    - name: Set password as fact
      set_fact:
        my_global_var: "{{ hostvars['localhost'].new_password.stdout }}"

- hosts: db
  tasks:
    - debug: msg={{my_global_var}}

- hosts: web
  tasks:
    - debug: msg={{my_global_var}}
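A variant of the same idea keeps everything in a single play: run the generation task only once, since run_once propagates the registered result to every host in the play. A minimal, untested sketch:

---
- hosts: all
  tasks:
    - name: Generate new password once for the whole play
      shell: "tr -dc _[:alnum:] < /dev/urandom | head -c${1:-20}"
      delegate_to: localhost
      run_once: true # run_once registers the result on every host in the play
      register: new_password

    - name: Set password as fact
      set_fact:
        my_global_var: "{{ new_password.stdout }}"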
I know this is an old question, but I settled on an alternative method that combines the two answers provided here and this issue, by using an implicit localhost reference and doing everything within the same play. I think it's a bit more elegant. Tested with Ansible 2.8.4.
This is the working solution in my context, where I wanted a common timestamped backup directory across all my hosts, for later restore:
---
- hosts: all
  tasks:
    - name: Set local fact for the universal backup string
      set_fact:
        thisHostTimestamp: "{{ ansible_date_time.iso8601 }}"
      delegate_to: localhost
      delegate_facts: true

    - name: Distribute backup datestring to all hosts in group
      set_fact:
        backupsTimeString: "{{ hostvars['localhost']['thisHostTimestamp'] }}"
I believe this should translate to the OP's original example like this, but I have not tested it:
---
- hosts: all
  tasks:
    - name: Generate new password
      shell: "tr -dc _[:alnum:] < /dev/urandom | head -c${1:-20}"
      register: new_password
      delegate_to: localhost
      delegate_facts: true

    - name: Set password as fact
      set_fact:
        my_global_var: "{{ hostvars['localhost'].new_password.stdout }}"
I have the following task set:
- name: Initialise inventory_data variable
  set_fact:
    inventory_data: ""

- name: Get Instance Inventory
  remote_user: me
  ansible.builtin.script: scripts/inventory.sh
  register: inventory

- name: Set inventory variable
  set_fact:
    inventory_data: "{{ inventory_data }} {{ inventory.stdout_lines | join('\n') }}"

- name: Send to API
  remote_user: me
  ansible.builtin.uri:
    url: https://myapi.com/endpoint
    method: POST
    body: "{{ inventory_data }}"
    status_code: 200
The desired result is that I need to gather the results from inventory.sh and send them only once at the end of the run.
I've tried different variations with run_once, delegate_to, etc., but I cannot seem to get this to work!
Edit:
I am trying to gather some data from my script, which is run on every host; however, I wish to aggregate the results from all hosts and send them once to an API.
First, if your play looks something like this:
- hosts: all
  tasks:
    - name: Initialise inventory_data variable
      set_fact:
        inventory_data: ""

    - name: Get Instance Inventory
      remote_user: me
      ansible.builtin.script: scripts/inventory.sh
      register: inventory

    - name: Set inventory variable
      set_fact:
        inventory_data: "{{ inventory_data }} {{ inventory.stdout_lines | join('\n') }}"
It's not going to do you any good. Your inventory.sh script will run on each host, which will set the inventory variable for that host, and the subsequent task will append inventory.stdout_lines to inventory_data for that host. This won't collect the output from multiple hosts. You need to restructure your playbook. First, run the inventory script on each host:
- hosts: all
  gather_facts: false
  tasks:
    - name: Get Instance Inventory
      ansible.builtin.script: scripts/inventory.sh
      register: inventory
Then in a second play targeting localhost, build your merged inventory variable and send the data to the API:
- hosts: localhost
  gather_facts: false
  tasks:
    - name: create merged inventory
      set_fact:
        inventory_data: "{{ inventory_data + hostvars[item].inventory.stdout }}"
      vars:
        inventory_data: ""
      loop: "{{ groups.all }}"

    - name: Send to API
      remote_user: me
      ansible.builtin.uri:
        url: https://myapi.com/endpoint
        method: POST
        body: "{{ inventory_data }}"
        status_code: 200
This way, (a) you build the inventory_data variable correctly and (b) you only make a single API call.
I've made a complete, runnable example of this solution available here.
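As an aside, the per-host loop in the merge task can also be collapsed into a single filter chain; an untested sketch, assuming every host in groups.all actually ran the script and registered inventory:

- name: create merged inventory (filter-chain variant)
  set_fact:
    inventory_data: "{{ groups.all | map('extract', hostvars, ['inventory', 'stdout']) | join('') }}"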
I am using set_fact and hostvars to pass variables between plays within a playbook. My code looks something like this:
- name: Staging play
  hosts: localhost
  gather_facts: no
  vars_prompt:
    - name: hostname
      prompt: "Enter hostname or group"
      private: no
    - name: vault
      prompt: "Enter vault name"
      private: no
    - name: input
      prompt: "Enter input for role"
      private: no
  tasks:
    - set_fact:
        target_host: "{{ hostname }}"
        target_vault: "{{ vault }}"
        for_role: "{{ input }}"

- name: Execution play
  hosts: "{{ hostvars['localhost']['target_host'] }}"
  gather_facts: no
  vars_files:
    - "vault/{{ hostvars['localhost']['target_vault'] }}.yml"
  tasks:
    - include_role:
        name: target_role
      vars:
        param: "{{ hostvars['localhost']['for_role'] }}"
This arrangement has worked without issue for months. However, our environment has changed and now I need to take a timestamp and pass that to the role as well as the other variable, so I made the following changes (denoted by comments):
- name: Staging play
  hosts: localhost
  gather_facts: yes # Changed from 'no' to 'yes'
  vars_prompt:
    - name: hostname
      prompt: "Enter hostname or group"
      private: no
    - name: vault
      prompt: "Enter vault name"
      private: no
    - name: input
      prompt: "Enter input for role"
      private: no
  tasks:
    - set_fact:
        target_host: "{{ hostname }}"
        target_vault: "{{ vault }}"
        for_role: "{{ input }}"
        current_time: "{{ ansible_date_time.iso8601 }}" # Added fact for current time

- name: Execution play
  hosts: "{{ hostvars['localhost']['target_host'] }}"
  gather_facts: no
  vars_files:
    - "vault/{{ hostvars['localhost']['target_vault'] }}.yml"
  tasks:
    - include_role:
        name: target_role
      vars:
        param: "{{ hostvars['localhost']['for_role'] }}"
        timestamp: "{{ hostvars['localhost']['current_time'] }}" # Passed current_time to Execution Play via hostvars
Now, when I execute, I get an error that the vault hostvars variable is undefined in the Execution Play. After some experimenting, I've found that setting gather_facts: yes in the Staging Play is what triggers the issue.
However, I need gather_facts enabled in order to use ansible_date_time. I've already verified via debug that the facts are being recorded properly and can be read via hostvars within the Staging Play, just not in the following Execution Play. After hours of research, I can't find any reason why gathering facts in the Staging Play should affect hostvars for the Execution Play, or any idea of how to fix it.
At the end of the day, all I need is the current time passed to the included role. Anyone who can come up with a solution that actually works in this use case wins Employee of the Month. Bonus points if you can explain the initial issue with gather_facts.
Thanks!
So, I had to reinvent the wheel a bit, but came up with a much cleaner solution. I simply created a default value for a timestamp in the role itself and added a setup call for date/time at the appropriate point, conditional on there being no existing value for the variable in question.
- name: Gather date and time.
  setup:
    gather_subset: date_time
  when: timestamp is undefined and ansible_date_time is undefined
I was able to leave gather_facts set to no in the dependent playbook but I still have no idea why setting it to yes broke anything in the first place. Any insight in this regard would be appreciated.
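For illustration, inside the role the conditional gather can be paired with a fallback so a caller-supplied timestamp always wins; a sketch (effective_timestamp is a made-up name):

- name: Gather date and time.
  setup:
    gather_subset: date_time
  when: timestamp is undefined and ansible_date_time is undefined

- name: Resolve the timestamp, preferring a caller-supplied value
  set_fact:
    effective_timestamp: "{{ timestamp | default(ansible_date_time.iso8601) }}"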
... if you can explain the initial issue with gather_facts ... Any insight in this regard would be appreciated.
This is caused by variable precedence, and because Ansible does not simply "overwrite or set a new value" for a variable. So it will depend on when and where the variables become defined.
You may test this with the following example:
---
- hosts: localhost
  become: false
  gather_facts: false
  tasks:
    - name: Show Gathered Facts
      debug:
        msg: "{{ hostvars['localhost'].ansible_facts }}" # will be {} only

    - name: Gather date and time only
      setup:
        gather_subset:
          - 'date_time'
          - '!min'

    - name: Show Gathered Facts
      debug:
        msg: "{{ ansible_facts }}" # from hostvars['localhost'] again
and "try to break it" by adding
- name: Set Fact
  set_fact:
    ansible_date_time:
      date: '1970-01-01'

- name: Show Facts
  debug:
    msg: "{{ hostvars['localhost'] }}"
I'd just like to note that for your use case you should use
gather_subset:
  - 'date_time'
  - '!min'
since you are interested in ansible_date_time only. See What is the exact list of Ansible setup min?
Also be aware of fact caching, since "When created with set_facts’s cacheable option, variables have the high precedence in the play, but are the same as a host facts precedence when they come from the cache."
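For reference, that cache-backed kind of fact is created like this (minimal sketch; the cached behaviour additionally requires fact caching to be enabled in ansible.cfg):

- set_fact:
    my_cached_var: 'some value'
    cacheable: true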
I have an ansible playbook that interacts with the management card in a bunch of servers, and then produces a report based on that information. Structurally it looks like:
- hosts: all
  tasks:
    - name: do something with redfish
      uri:
        ...
      register: something

- hosts: localhost
  tasks:
    - name: produce report
      template:
        ...
      loop: "{{ SOME_LIST_OF_HOSTS }}"
Originally, the template task in the second play was looping over groups.all, but that causes a number of complications if we limit the target hosts using -l on the command line (like ansible-playbook -l only_cluster_a ...). In that case, I would like the template task to loop over only the hosts targeted by the first play. In other words, I want to know ansible_play_hosts_all from the previous play.
This is what I've come up with:
- hosts: all
  gather_facts: false
  tasks:
    - set_fact:
        saved_play_hosts: "{{ ansible_play_hosts_all }}"
      delegate_to: localhost
      delegate_facts: true
      run_once: true

    ...other tasks go here...

- hosts: localhost
  gather_facts: false
  tasks:
    - debug:
        msg:
          play_hosts: "{{ saved_play_hosts }}"
Is that the best way of doing this?
You could use the add_host module: at the end of the first play, add a task:
- name: add variables to dummy host
  add_host:
    name: "variable_holder"
    shared_variable: "{{ saved_play_hosts }}"
and you can pick up the value in the second play:
- name: second play
  hosts: localhost
  vars:
    play_hosts: "{{ hostvars['variable_holder']['shared_variable'] }}"
  tasks:
    :
    :
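Note that add_host bypasses the per-host loop and effectively runs only once per play, so the dummy host is created a single time. Inspecting the trapped value from the second play could look like this (sketch):

    - debug:
        var: hostvars['variable_holder']['shared_variable']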
I'm trying to create a playbook that has to complete the following tasks:
retrieve hostnames and releases from a file (file)
connect to those hostnames one by one
retrieve the contents of another file (another_file) on each host, which will give us the environment (dev, qa, prod)
So my first task is to retrieve the names of the hosts I need to connect to:
- name: retrieve nodes
  shell: cat file | grep ";;" | grep foo | grep -v "bar" | sort | uniq
  register: nodes

- name: add nodes
  add_host:
    name: "{{ item }}"
    group: "servers"
  with_items: "{{ nodes.stdout_lines }}"
The next play connects to those hosts:
- hosts: "{{ nodes }}"
become: yes
become_user: user1
become_method: sudo
gather_facts: no
tasks:
- name: check remote file
slurp:
src: /xyz/directory/another_file
register: thing
- debug: msg="{{ thing['content'] | b64decode }}"
But it doesn't work; when I add verbosity, I still see the task being executed as the user running the playbook (local_user):
<node> ESTABLISH SSH CONNECTION FOR USER: local_user
What am I doing wrong?
The Ansible version is 2.7.1 on RHEL 6.1.
UPDATE
I've also tried remote_user: user1:
- hosts: "{{ nodes }}"
remote_user: user1
gather_facts: no
tasks:
- name: check remote file
slurp:
src: /xyz/directory/another_file
register: thing
- debug: msg="{{ thing['content'] | b64decode }}"
No luck so far; same error.
You need to set remote_user, which is what controls the username used when connecting to the server. Only after the connection is established is become_user used.
https://docs.ansible.com/ansible/latest/user_guide/playbooks_intro.html#hosts-and-users
Using become_user is like using the sudo command on the remote machine. You want to connect to your machines as a specific user, and only after the connection is established issue commands with sudo.
This question has some answers that should help you.
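Concretely, that suggestion translates to something like the following untested sketch, reusing the servers group and user1 from the question:

- hosts: servers # the group populated by add_host in the first play
  remote_user: user1 # the user used for the SSH connection
  become: yes # escalate with sudo once connected
  become_method: sudo
  gather_facts: no
  tasks:
    - name: check remote file
      slurp:
        src: /xyz/directory/another_file
      register: thing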
I am creating a playbook which first creates a new username. I then want to run "moretasks.yml" as that newly created user. Currently, I'm setting remote_user for every task. Is there a way I can set it once for the entire set of tasks? I couldn't find any examples of this, nor did any of my attempts to move remote_user around help.
Below is main.yml:
---
- name: Configure Instance(s)
  hosts: all
  remote_user: root
  gather_facts: true
  tags:
    - config
    - configure
  tasks:
    - include: createuser.yml new_user=username
    - include: moretasks.yml new_user=username
    - include: roottasks.yml # some tasks unrelated to username
moretasks.yml:
---
- name: Task1
  copy:
    src: /vagrant/FILE
    dest: ~/FILE
  remote_user: "{{ newuser }}"

- name: Task2
  copy:
    src: /vagrant/FILE
    dest: ~/FILE
  remote_user: "{{ newuser }}"
First of all, you surely want to use sudo_user (remote_user is the one that logs in; sudo_user is the one who executes the task).
In your case, to execute the tasks as another user (the one previously created), just set:
- include: moretasks.yml
  sudo: yes
  sudo_user: "{{ newuser }}"
and those tasks will be executed as {{ newuser }} (Don't forget the quotes)
Remark: In most cases you should consider remote_user as a host parameter. It is the user that is allowed to login on the machine and that has sufficient rights to do things. For operational stuff you should use sudo / sudo_user
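For what it's worth, on current Ansible versions the same thing is spelled with become, which replaced sudo/sudo_user around Ansible 1.9:

- include: moretasks.yml
  become: yes
  become_user: "{{ newuser }}"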
You could split this up into two separate plays (playbooks can contain multiple plays):
---
- name: PLAY 1
  hosts: all
  remote_user: root
  gather_facts: true
  tasks:
    - include: createuser.yml new_user=username
    - include: roottasks.yml # some tasks unrelated to username

- name: PLAY 2
  hosts: all
  remote_user: username
  gather_facts: false
  tasks:
    - include: moretasks.yml new_user=username
There is a gotcha when using separate plays: you can't directly use variables set with register: or set_fact: in the first play to do things in the second play (strictly speaking, the variables are still available via hostvars, but I recommend not passing variables between plays that way). Variables defined in group_vars and host_vars work just fine.
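For completeness, reading a first-play value from the second play via hostvars looks roughly like this (a sketch; my_registered_var is a made-up name):

- debug:
    msg: "{{ hostvars[groups['all'][0]]['my_registered_var'] }}"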
Another tip I'd like to give is to look into using roles: http://docs.ansible.com/playbooks_roles.html. While it might seem more complicated at first, it's much easier to re-use roles (as you seem to be doing with createuser.yml). Looking at the type of things you are trying to achieve, the 'include all the things' approach won't last much longer.
Somewhat in line with your issue; hope it helps. This came up while updating my playbooks for Ansible 2.5 support for the Cisco IOS network_cli connection.
Credential file created with ansible-vault: auth/secrets.yml
---
creds:
  username: 'ansible'
  password: 'user_password'
Playbook:
---
- hosts: ios
  gather_facts: yes
  connection: network_cli
  become: yes
  become_method: enable
  ignore_errors: yes
  tasks:
    - name: obtain login credentials
      include_vars: auth/secrets.yml

    - name: Set Username/Password
      set_fact:
        remote_user: "{{ creds['username'] }}"
        ansible_ssh_pass: "{{ creds['password'] }}"

    - name: Find info for "{{ inventory_hostname }}" via ios_facts
      ios_facts:
        gather_subset: all
      register: hardware_fact
Running the playbook without the auth/secrets.yml creds:
ansible-playbook -u ansible -k playbook.yml -l inventory_hostname
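And when the vaulted auth/secrets.yml is used instead, the vault password has to be supplied as well, for example:
ansible-playbook --ask-vault-pass playbook.yml -l inventory_hostname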