How to use a variable in a different playbook? - ansible

How to use a variable in a different playbook? (ansible 2.7.10)
username.yml

- hosts: host
  vars_prompt:
    - name: username
      prompt: 'Username...'
      private: no
  tasks:
    - name: Show username
      debug:
        msg: "{{ username }}"

- import_playbook: dns.yml

dns.yml

- hosts: DNS
  tasks:
    - name: Message
      debug:
        msg: "{{ username }}"

Running this fails with:

FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: 'username' is undefined"}

The scope of a variable declared by vars_prompt is the play. Use set_fact to share such a variable with the rest of the playbook.
The set_fact module caches key=value pairs as host facts, which stay available to later plays in the playbook run.
- hosts: host
  vars_prompt:
    - name: username
      prompt: 'Username...'
      private: no
  tasks:
    - name: Show username
      debug:
        msg: "{{ username }}"
    - set_fact:
        username: "{{ username }}"
In the second play (dns.yml), use hostvars to reference the variables cached on host by the first play.
- hosts: DNS
  tasks:
    - name: Message
      debug:
        msg: "{{ hostvars['host'].username }}"

This approach leaves the dns.yml playbook untouched, so you can also pass -e username=myuser and execute it separately if needed.
username.yml
Add a set_fact with a different variable name (uname) and pass it to the imported playbook (dns.yml):
- hosts: localhost
  vars_prompt:
    - name: username
      prompt: 'Username...'
      private: no
  tasks:
    - name: Show username
      debug:
        msg: "{{ username }}"
    - name: Set fact
      set_fact:
        uname: "{{ username }}"

- import_playbook: dns.yml
  vars:
    username: "{{ uname }}"
dns.yml
No change to this playbook.
- hosts: DNS
  tasks:
    - name: Message
      debug:
        msg: "{{ username }}"
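Running dns.yml on its own would then look something like this; the inventory path is illustrative:

ansible-playbook -i inventory dns.yml -e username=myuser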

Related

Ansible: Get Variable with inventory_hostname

I have the following passwords file vault.yml:
---
server1: "pass1"
server2: "pass2"
server3: "pass3"
I am loading these values into a variable called passwords:
- name: Get Secrets
  set_fact:
    passwords: "{{ lookup('template', './vault.yml') | from_yaml }}"
  delegate_to: localhost

- name: debug it
  debug:
    var: passwords.{{ inventory_hostname }}
The debug task shows me the result I want: the password for the specific host.
But if I set the following in a variables file:
---
ansible_user: root
ansible_password: passwords.{{ inventory_hostname }}
This does not give me the desired result: ansible_password takes "passwords" literally, not as a variable.
How can I achieve the same result I got when debugging the passwords.{{ inventory_hostname }}?
Regarding the part
... if I set the following in a variables file ...
I am not sure, since some information about your use case and data flow is missing. However, in general the syntax ansible_password: "{{ PASSWORDS[inventory_hostname] }}" might work for you.
---
- hosts: localhost
  become: false
  gather_facts: false
  vars:
    PASSWORDS:
      SERVER1: "pass1"
      SERVER2: "pass2"
      SERVER3: "pass3"
      localhost: "pass_local"
  tasks:
    - name: Debug var
      debug:
        var: PASSWORDS
    - name: Set Fact 'ansible_password'
      set_fact:
        ansible_password: "{{ PASSWORDS[inventory_hostname] }}"
    - name: Debug var
      debug:
        var: ansible_password
That way you can access an element by name.
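Regarding the variables-file variant from the original question: Ansible templates variables lazily, so quoting the expression as Jinja2 in the vars file should also work, provided passwords is already defined for the host by the time a connection is attempted. A minimal sketch of such a vars file, under that assumption:

---
ansible_user: root
ansible_password: "{{ passwords[inventory_hostname] }}"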

How to check the OS version of a host which is dynamically added to the inventory

I'm trying to get the server name as user input, and if the server OS is RHEL 7, it should proceed with further tasks. I'm trying with hostvars but it is not helping; kindly help me find the OS version with a when condition:
---
- name: Add hosts
  hosts: localhost
  vars:
    - username: test
      password: test
  vars_prompt:
    - name: server1
      prompt: Server_1 IP or hostname
      private: no
    - name: server2
      prompt: Server_2 IP or hostname
      private: no
  tasks:
    - add_host:
        name: "{{ server1 }}"
        groups:
          - cluster_nodes
          - primary
          - management
        ansible_user: "{{ username }}"
        ansible_password: "{{ password }}"
    - add_host:
        name: "{{ server2 }}"
        groups:
          - cluster_nodes
          - secondary
        ansible_user: "{{ username }}"
        ansible_password: "{{ password }}"
    - debug:
        msg: "{{ hostvars['server1'].ansible_distribution_major_version }}"
When I execute the playbook, I get the error below:
fatal: [localhost]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: \"hostvars['server1']\" is undefined\n\nThe error appears to be in '/var/lib/awx/projects/pacemaker_RHEL_7_ST/main_2.yml': line 33, column 7, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n\n - debug:\n ^ here\n"}
You need to gather facts on the newly added hosts before you consume the variable. As an example, this will do it with automatic fact gathering:
---
- name: Add hosts
  hosts: localhost
  vars:
    - username: test
      password: test
  vars_prompt:
    - name: server1
      prompt: Server_1 IP or hostname
      private: no
    - name: server2
      prompt: Server_2 IP or hostname
      private: no
  tasks:
    - add_host:
        name: "{{ server1 }}"
        groups:
          - cluster_nodes
          - primary
          - management
        ansible_user: "{{ username }}"
        ansible_password: "{{ password }}"
    - add_host:
        name: "{{ server2 }}"
        groups:
          - cluster_nodes
          - secondary
        ansible_user: "{{ username }}"
        ansible_password: "{{ password }}"

- name: Gather facts for newly added targets
  hosts: cluster_nodes
  # gather_facts: true <= this is the default

- name: Do <whatever> targeting localhost again
  hosts: localhost
  gather_facts: false # already gathered in play1
  tasks:
    # Warning!! bad practice. Looping on a group usually
    # shows you should have a play targeting that specific group
    - debug:
        msg: "OS version for {{ item }} is 7"
      when: hostvars[item].ansible_distribution_major_version | int == 7
      loop: "{{ groups['cluster_nodes'] }}"
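As the inline warning suggests, the cleaner pattern is usually a play that targets the group itself, so each host's own facts are in scope. A minimal sketch of that alternative, relying on the default fact gathering:

- name: Check OS version on the cluster nodes themselves
  hosts: cluster_nodes
  tasks:
    - debug:
        msg: "OS version for {{ inventory_hostname }} is 7"
      when: ansible_distribution_major_version | int == 7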
If you don't want to rely on automatic gathering, you can run the setup module manually, e.g. for the second play:
- name: Gather facts for newly added targets
  hosts: cluster_nodes
  gather_facts: false
  tasks:
    - name: get facts from targets
      setup:
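Alternatively, facts can be gathered without a dedicated play by delegating setup from the first play; delegate_facts: true stores the gathered facts under each delegated host rather than under localhost. A sketch of a task that could be appended to the first play's task list:

- name: Gather facts from the newly added hosts
  setup:
  delegate_to: "{{ item }}"
  delegate_facts: true
  loop: "{{ groups['cluster_nodes'] }}"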

Making prompted vars usable across the playbook

I have one playbook with multiple tasks that have to run on different hosts.
In the beginning of the play I want to prompt the operator for their credentials which are the same for every host in the play. I want to have those credentials "stored" somewhere so they can be used across the tasks to log in on the provided host(s).
The playbook looks as follows:
---
- name: Ask for credentials
  vars_prompt:
    - name: username
      prompt: "Username?"
    - name: password
      prompt: "Password?"
  tasks:
    - set_fact:
        username: "{{ username }}"
    - set_fact:
        password: "{{ password }}"

- hosts: Host1
  vars:
    ansible_user: "{{ username }}"
    ansible_password: "{{ password }}"
  tasks:
    - name: Do stuff

- hosts: Host2
  vars:
    ansible_user: "{{ username }}"
    ansible_password: "{{ password }}"
  tasks:
    - name: Do stuff
...
From the moment the play hits the first task, it fails with the following error:
msg: 'The field ''remote_user'' has an invalid value, which includes an undefined variable. The error was: ''username'' is undefined'
Does anyone have experience making prompted vars usable across the whole play and all tasks?
Q: "Make prompted vars usable across the whole play and all tasks?"
A: Run the first play against the group of all hosts that will be connected to later, and run the set_fact task once (run_once). This creates the variables username and password for all hosts in the group.
For example, if the group test_jails comprises the hosts test_01, test_02, and test_03, the play
- hosts: test_jails
  vars_prompt:
    - name: "username"
      prompt: "Username?"
    - name: "password"
      prompt: "Password?"
  tasks:
    - set_fact:
        username: "{{ username }}"
        password: "{{ password }}"
      run_once: true

- hosts: test_01
  vars:
    ansible_user: "{{ username }}"
    ansible_password: "{{ password }}"
  tasks:
    - debug:
        msg: "{{ ansible_user }} {{ ansible_password }}"
gives
ok: [test_01] => {
    "msg": "admin 1234"
}
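Because the set_fact task ran with run_once, the facts were created for every host in test_jails, so any other member of the group resolves the same credentials; a minimal check, assuming test_02 belongs to the group:

- hosts: test_02
  vars:
    ansible_user: "{{ username }}"
    ansible_password: "{{ password }}"
  tasks:
    - debug:
        msg: "{{ ansible_user }} {{ ansible_password }}"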

How to set a fact which is visible on all hosts in an Ansible role

I'm setting a fact in a role:
- name: Check if manager already configured
  shell: >
    docker info | perl -ne 'print "$1" if /Swarm: (\w+)/'
  register: swarm_status

- name: Init cluster
  shell: >-
    docker swarm init
    --advertise-addr "{{ ansible_default_ipv4.address }}"
  when: "'active' not in swarm_status.stdout_lines"

- name: Get worker token
  shell: docker swarm join-token -q worker
  register: worker_token_result

- set_fact:
    worker_token: "{{ worker_token_result.stdout }}"
Then I want to access worker_token on other hosts. Here's my main playbook; the fact is defined in the swarm-master role:
- hosts: swarm_cluster
  become: yes
  roles:
    - docker

- hosts: swarm_cluster:&manager
  become: yes
  roles:
    - swarm-master

- hosts: swarm_cluster:&node
  become: yes
  tasks:
    - debug:
        msg: "{{ worker_token }}"
I'm getting an undefined variable error. How do I make it visible globally?
Of course, it works perfectly if I run debug on the same host.
If your goal is just to access worker_token from another host, you can use the hostvars variable and iterate through the group where you defined the variable, like this:
- hosts: swarm_cluster:&node
  tasks:
    - debug:
        msg: "{{ hostvars[item]['worker_token'] }}"
      with_items: "{{ groups['manager'] }}"
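If the manager group holds a single host, the loop can be skipped by picking that host directly; a minimal sketch:

- hosts: swarm_cluster:&node
  tasks:
    - debug:
        msg: "{{ hostvars[groups['manager'] | first]['worker_token'] }}"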
If your goal is to define the variable globally, you can add a step that defines the variable on all hosts, like this:
- hosts: all
  tasks:
    - set_fact:
        worker_token_global: "{{ hostvars[item]['worker_token'] }}"
      with_items: "{{ groups['manager'] }}"

- hosts: swarm_cluster:&node
  tasks:
    - debug:
        var: worker_token_global

How to use variables defined through vars_prompt of one host in another host vars?

I have 2 plays in a playbook. One of them prompts for input from the user, and I want to use that variable in the other play. Please suggest how to do that.
---
- hosts: workers
  gather_facts: false
  sudo: true
  vars_prompt:
    - name: "server_ip"
      prompt: "Enter the Server IP"
      private: no
  roles:
    - client-setup-worker

- hosts: master
  gather_facts: false
  sudo: true
  vars:
    server: "{{ hostvars['workers']['server_ip'] }}"
  roles:
    - client-setup-master
In the above playbook, I want the server_ip prompted on the workers hosts to be used on the master hosts.
I am facing the error "The error was: \"hostvars['workers']\" is undefined" while doing so.
I am facing the error "The error was: \"hostvars['workers']\" is undefined" while doing so
That's because workers is evidently a group, and not a host, which is the only thing one will find declared in hostvars.
You'll need to grab one of the hosts from the workers group and then extract its fact; I believe this will do that (updated after seeing the mostly correct answer by @VladimirBotka):
- hosts: workers
  vars_prompt:
    # as before ...
  pre_tasks:
    - name: re-export the vars_prompt for cross playbook visibility
      set_fact:
        server_ip: '{{ server_ip }}'
  roles:
    # as before

- hosts: masters
  vars:
    server: '{{ hostvars[(groups.workers|first)].server_ip }}'
The scope of variables declared in vars_prompt is the play.
Such a variable may be put into hostvars by set_fact "to use that variable in another play". For example, the play below
- hosts: test_01
  gather_facts: false
  vars_prompt:
    - name: "server_ip"
      prompt: "Enter the Server IP"
      private: no
  tasks:
    - set_fact:
        stored_server_ip: "{{ server_ip }}"
    - debug:
        var: stored_server_ip

- hosts: test_02
  gather_facts: false
  vars:
    server: "{{ hostvars.test_01.stored_server_ip }}"
  tasks:
    - debug:
        var: server
gives (abridged):
ok: [test_01] => {
    "stored_server_ip": "10.1.0.10"
}
ok: [test_02] => {
    "server": "10.1.0.10"
}
