Ansible does not read in variables when connected to another host

I have a playbook which reads in a list of variables:
vars_files:
  - vars/myvariables.yml
tasks:
  - name: Debug Variable List
    debug:
      msg: "An item: {{ item }}"
    with_list: "{{ myvariables }}"
This prints out the list of "myvariables" from the file vars/myvariables.yml, which contains:
---
myvariables:
  - variable1
  - variable2
I get the following as expected.
"msg": "An item: variable1"
"msg": "An item: variable2"
However, when I connect to another host and run the same debug statement, it throws an error:
vars_files:
  - vars/myvariables.yml
tasks:
- name: Configure instance(s)
  hosts: launched
  become: True
  remote_user: ubuntu
  port: 22
  gather_facts: False
  tasks:
    - name: Wait for SSH to come up
      delegate_to: ***
      remote_user: ubuntu
      connection: ssh
      register: item
    - name: Debug Variable List
      debug:
        msg: "An item: {{ item }}"
      with_list: "{{ myvariables }}"
OUTPUT:
"msg": "'myvariables' is undefined"
How do I define the variables file when connecting to another host that is not localhost?
Any help on this would be greatly appreciated.

With "hosts: launched" you started new playbook. Put the vars_files: into the scope of this playbook (see below).
- name: Configure instance(s)
  hosts: launched
  become: True
  remote_user: ubuntu
  port: 22
  gather_facts: False
  vars_files:
    - vars/myvariables.yml
  tasks:
Review Scoping variables in the Ansible documentation.
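For completeness, a minimal sketch of the corrected play, reusing the vars/myvariables.yml and the debug task from the question, might look like this:
- name: Configure instance(s)
  hosts: launched
  become: True
  remote_user: ubuntu
  gather_facts: False
  vars_files:
    - vars/myvariables.yml
  tasks:
    - name: Debug Variable List
      debug:
        msg: "An item: {{ item }}"
      with_list: "{{ myvariables }}"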

Related

How to check the OS version of a host which is dynamically added to inventory

I'm trying to get the server name as user input, and if the server OS is RHEL 7 the playbook should proceed with further tasks. I'm trying with hostvars but it is not helping; how can I find the OS version for a when condition?
---
- name: Add hosts
  hosts: localhost
  vars:
    - username: test
      password: test
  vars_prompt:
    - name: server1
      prompt: Server_1 IP or hostname
      private: no
    - name: server2
      prompt: Server_2 IP or hostname
      private: no
  tasks:
    - add_host:
        name: "{{ server1 }}"
        groups:
          - cluster_nodes
          - primary
          - management
        ansible_user: "{{ username }}"
        ansible_password: "{{ password }}"
    - add_host:
        name: "{{ server2 }}"
        groups:
          - cluster_nodes
          - secondary
        ansible_user: "{{ username }}"
        ansible_password: "{{ password }}"
    - debug:
        msg: "{{ hostvars['server1'].ansible_distribution_major_version }}"
When I execute the playbook, I get the error below:
fatal: [localhost]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: \"hostvars['server1']\" is undefined\n\nThe error appears to be in '/var/lib/awx/projects/pacemaker_RHEL_7_ST/main_2.yml': line 33, column 7, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n\n - debug:\n ^ here\n"}
You need to gather facts on the newly added hosts before you consume the variable. As an example, this will do it with automatic fact gathering.
---
- name: Add hosts
  hosts: localhost
  vars:
    - username: test
      password: test
  vars_prompt:
    - name: server1
      prompt: Server_1 IP or hostname
      private: no
    - name: server2
      prompt: Server_2 IP or hostname
      private: no
  tasks:
    - add_host:
        name: "{{ server1 }}"
        groups:
          - cluster_nodes
          - primary
          - management
        ansible_user: "{{ username }}"
        ansible_password: "{{ password }}"
    - add_host:
        name: "{{ server2 }}"
        groups:
          - cluster_nodes
          - secondary
        ansible_user: "{{ username }}"
        ansible_password: "{{ password }}"

- name: Gather facts for newly added targets
  hosts: cluster_nodes
  # gather_facts: true <= this is the default

- name: Do <whatever> targeting localhost again
  hosts: localhost
  gather_facts: false # already gathered in play1
  tasks:
    # Warning!! bad practice. Looping on a group usually
    # shows you should have a play targeting that specific group
    - debug:
        msg: "OS version for {{ item }} is 7"
      when: hostvars[item].ansible_distribution_major_version | int == 7
      loop: "{{ groups['cluster_nodes'] }}"
If you don't want to rely on automatic gathering, you can run the setup module manually, e.g. for the second play:
- name: Gather facts for newly added targets
  hosts: cluster_nodes
  gather_facts: false
  tasks:
    - name: get facts from targets
      setup:
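If you would rather keep everything inside the first localhost play, another option is to gather facts for the new hosts by delegation. This is only a sketch (not from the original answer), assuming the connection variables set by add_host above are sufficient to reach the targets:
- name: Gather facts for the newly added hosts from the localhost play
  setup:
  delegate_to: "{{ item }}"
  delegate_facts: true   # store the facts under hostvars[item] instead of localhost
  loop: "{{ groups['cluster_nodes'] }}"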

Ansible vars_prompt for host list into Import_playbook

I want to use a specific host / host list for an imported playbook, which I get from a vars_prompt input. How can I do this? So far I wasn't able to get this running.
I have two playbooks which I need to run separately; ios_check_routerports.yaml is the parent playbook:
ios_check_routerports.yaml
---
- hosts: '{{ branch_number }}'
  connection: network_cli
  gather_facts: False
  any_errors_fatal: no
  throttle: 75
  vars_prompt:
    - name: "branch_number"
      prompt: "Which branch do you want to check?"
      default: all
      private: no
  tasks:
    - name: Check facts
      ios_facts:
        gather_subset: hardware
    - name: Create directory
      file:
        path: /root/ansible/pb-outputs/ios_check_routerports/
        state: directory
      delegate_to: 127.0.0.1

- name: Run playbook
  import_playbook: ios_check_routerports_main.yaml
ios_check_routerports_main.yaml
---
- hosts: '{{ branch_number }}'
  connection: network_cli
  gather_facts: False
  any_errors_fatal: no
  throttle: 75
  tasks:
    - name: Check default-gateway
      ios_command:
        commands: sh run | i default-gateway
      register: default_gateway
I tried to set a fact for the var {{ branch_number }} like this:
ios_check_routerports.yaml
- set_fact:
    devices: "{{ branch_number }}"
ios_check_routerports_main.yaml
---
- hosts: '{{ devices }}'
  connection: network_cli
The playbook always runs into an error because the hosts var is not defined. What am I doing wrong here?
Try this: there is no need to create a new variable devices; register a dummy host instead.
In ios_check_routerports.yaml, add a task:
- name: Register dummy host with variable
  add_host:
    name: "DUMMY_HOST"
    DEVICES: "{{ branch_number }}"
Then, in ios_check_routerports_main.yaml:
- hosts: "{{ hostvars['DUMMY_HOST']['DEVICES'] }}"
connection: network_cli
Since you create a new host, I suggest removing it afterwards if you no longer need the branch_number variable. As a remove_host module does not exist, either:
run a first task - meta: refresh_inventory, or
modify your hosts pattern like this:
- hosts: "{{ hostvars['DUMMY_HOST']['DEVICES'] }},!DUMMY_HOST"

Deactivate the Current Ansible User with Ansible

In setting up a new Raspberry Pi with Ansible, I would like to perform the following actions:
1. Using the default pi user, create a new user named my_new_admin
2. Using the new my_new_admin user, deactivate the default pi user
3. Continue executing the playbook as my_new_admin
I am finding this difficult to achieve in a single playbook. Is it even possible to switch the active user like this in Ansible?
# inventory.yaml
---
all:
  children:
    rpis:
      hosts:
        myraspberrypi.example.com:
          ansible_user: my_new_admin # or should `pi` go here?
...
# initialize.yaml
- hosts: rpis
  remote_user: 'pi'
  become: true
  tasks:
    - name: 'create new user'
      user:
        name: 'my_new_admin'
        append: true
        groups:
          - 'sudo'
    - name: 'add SSH key to my_new_admin'
      *snip*
    - name: 'lock default user'
      remote_user: 'my_new_admin'
      user:
        name: 'pi'
        expires: '{{ ("1970-01-02 00:00:00" | to_datetime).timestamp() | float }}'
        password_lock: true
...
If you want to switch users, the easiest solution is to start another play. For example, the following playbook will run the first play as user pi and the second play as user root:
- hosts: pi
  gather_facts: false
  remote_user: pi
  tasks:
    - command: whoami
      register: whoami
    - debug:
        msg: "{{ whoami.stdout }}"

- hosts: pi
  gather_facts: false
  remote_user: root
  tasks:
    - command: whoami
      register: whoami
    - debug:
        msg: "{{ whoami.stdout }}"
In this playbook I'm being explicit about remote_user in both plays, but you could also set a user in your inventory and only override it when necessary. E.g., if I have:
pi ansible_host=raspberrypi.local ansible_user=root
Then I could rewrite the above playbook like this:
- hosts: pi
  gather_facts: false
  vars:
    ansible_user: pi
  tasks:
    - command: whoami
      register: whoami
    - debug:
        msg: "{{ whoami.stdout }}"

- hosts: pi
  gather_facts: false
  tasks:
    - command: whoami
      register: whoami
    - debug:
        msg: "{{ whoami.stdout }}"
Note that I'm setting the ansible_user variable here rather than using remote_user, because it looks as if ansible_user has precedence.
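Applied to the original question, a minimal sketch (untested, and assuming the SSH key for my_new_admin is installed in the first play, e.g. with authorized_key) could split the work across two plays: the first runs as pi and creates my_new_admin, the second runs as my_new_admin and locks pi.
- hosts: rpis
  remote_user: pi
  become: true
  tasks:
    - name: create new admin user
      user:
        name: my_new_admin
        append: true
        groups:
          - sudo
    # add the SSH public key for my_new_admin here

- hosts: rpis
  remote_user: my_new_admin
  become: true
  tasks:
    - name: lock the default pi user
      user:
        name: pi
        password_lock: true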

How to use variables defined through vars_prompt of one host in another host vars?

I have two plays in a playbook. One of them prompts the user for input, and I want to use that variable in the other play. How can I do that?
---
- hosts: workers
  gather_facts: false
  sudo: true
  vars_prompt:
    - name: "server_ip"
      prompt: "Enter the Server IP"
      private: no
  roles:
    - client-setup-worker

- hosts: master
  gather_facts: false
  sudo: true
  vars:
    server: "{{ hostvars['workers']['server_ip'] }}"
  roles:
    - client-setup-master
In the above playbook I want the server_ip prompted for in the workers play to be available in the master play.
I am facing the error "The error was: \"hostvars['workers']\" is undefined" while doing so
That's because workers is evidently a group, not a host, and hosts are the only thing one will find declared in hostvars.
You'll need to grab one of the hosts at random from the workers group, and then extract its fact; I believe this will do that:
Update after seeing the mostly correct answer by @VladimirBotka:
- hosts: workers
  vars_prompt:
    # as before ...
  pre_tasks:
    - name: re-export the vars_prompt for cross playbook visibility
      set_fact:
        server_ip: '{{ server_ip }}'
  roles:
    # as before

- hosts: masters
  vars:
    server: '{{ hostvars[(groups.workers|first)].server_ip }}'
The scope of variables declared in vars_prompt is the play.
Such a variable may be put into hostvars with set_fact "to use that variable in another play". For example, the playbook below
- hosts: test_01
  gather_facts: false
  vars_prompt:
    - name: "server_ip"
      prompt: "Enter the Server IP"
      private: no
  tasks:
    - set_fact:
        stored_server_ip: "{{ server_ip }}"
    - debug:
        var: stored_server_ip

- hosts: test_02
  gather_facts: false
  vars:
    server: "{{ hostvars.test_01.stored_server_ip }}"
  tasks:
    - debug:
        var: server
gives (abridged):
ok: [test_01] => {
    "stored_server_ip": "10.1.0.10"
}
ok: [test_02] => {
    "server": "10.1.0.10"
}

Ansible - Execute a mandatory last role in the playbook even if any previous role in the execution flow fails

I have an Ansible playbook something like this:
---
- hosts: localhost
  connection: local
  tasks:
    - set_fact:
        build_date_time: "{{ ansible_date_time }}"

- hosts: localhost
  connection: local
  gather_facts: false
  vars:
    role: Appvariables
    base_image_tag: Base
  roles:
    - role: begin-building-ami

- hosts: just_created
  remote_user: "{{ default_user }}"
  vars:
    role: AppName
  roles:
    - { role: role1, become: yes }
    - role: role2
    - { role: role3, become: yes }
    - { role: role4, become: yes }
    - { role: role5, become: yes }

- hosts: localhost
  connection: local
  gather_facts: false
  vars:
    role: Appname
    ansible_date_time: "{{ build_date_time }}"
  roles:
    - finish-building-ami
In this case the finish-building-ami role terminates the instance after baking the AMI. If any of role1-role5 fails along the way, the playbook stops and we are left with a failed instance, which we currently go and terminate manually.
So finish-building-ami (the mandatory role that stops the instance, takes the AMI, and terminates the instance at the end) needs to run even if any of role1-role5 in the above playbook fails.
You can rewrite your existing play to use import_role or include_role tasks instead of the roles section. This allows you to use blocks:
---
- hosts: localhost
  gather_facts: false
  tasks:
    - block:
        - import_role:
            name: role1
        - import_role:
            name: role2
          become: true
        - import_role:
            name: role3
      rescue:
        - set_fact:
            role_failed: true

- hosts: localhost
  gather_facts: false
  tasks:
    - debug:
        msg: This task runs after our roles.
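If the goal is specifically to guarantee that finish-building-ami runs, an always section can be combined with the rescue shown above. This is only a sketch, reusing the role names from the question and the localhost target from the example; it is not taken from the original answer:
- hosts: localhost
  gather_facts: false
  tasks:
    - block:
        - import_role:
            name: role1
        - import_role:
            name: role2
          become: true
      rescue:
        - set_fact:
            role_failed: true
      always:
        # runs whether or not the roles in the block failed
        - import_role:
            name: finish-building-ami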
