Using a variable as a default value in vars_prompt in Ansible

I was trying to use vars_prompt in Ansible with default values taken from facts (or otherwise from a previously defined variable). The playbook is intended to be used as an ad-hoc playbook for initial provisioning.
My playbook:
---
- hosts: server01
  gather_facts: True
  vars_prompt:
    - name: new_hostname
      prompt: please enter the name for the target
      default: "{{ ansible_hostname }}"
      private: no
  tasks:
    - debug: msg="{{ new_hostname }}"
Current result:
please enter the name for the target [{{ ansible_hostname }}]:
ERROR! 'ansible_hostname' is undefined
Expected result (assuming ansible_hostname=server01):
please enter the name for the target [server01]:
Is it possible to achieve this in Ansible?

vars_prompt is evaluated before the play starts executing, so facts gathered during the play are not yet available to it. The same effect can be achieved with the pause module instead, which runs as a regular task after fact gathering:
---
- hosts: server01
  gather_facts: True
  tasks:
    - pause:
        prompt: please enter the name for the target [{{ ansible_hostname }}]
      register: prompt
    - debug:
        msg: "{{ prompt.user_input if prompt.user_input else ansible_hostname }}"

Related

'gather_facts' seems to break 'set_fact' and 'hostvars'

I am using set_fact and hostvars to pass variables between plays within a playbook. My code looks something like this:
- name: Staging play
  hosts: localhost
  gather_facts: no
  vars_prompt:
    - name: hostname
      prompt: "Enter hostname or group"
      private: no
    - name: vault
      prompt: "Enter vault name"
      private: no
    - name: input
      prompt: "Enter input for role"
      private: no
  tasks:
    - set_fact:
        target_host: "{{ hostname }}"
        target_vault: "{{ vault }}"
        for_role: "{{ input }}"

- name: Execution play
  hosts: "{{ hostvars['localhost']['target_host'] }}"
  gather_facts: no
  vars_files:
    - "vault/{{ hostvars['localhost']['target_vault'] }}.yml"
  tasks:
    - include_role:
        name: target_role
      vars:
        param: "{{ hostvars['localhost']['for_role'] }}"
This arrangement has worked without issue for months. However, our environment has changed and now I need to take a timestamp and pass that to the role as well as the other variable, so I made the following changes (denoted by comments):
- name: Staging play
  hosts: localhost
  gather_facts: yes # Changed from 'no' to 'yes'
  vars_prompt:
    - name: hostname
      prompt: "Enter hostname or group"
      private: no
    - name: vault
      prompt: "Enter vault name"
      private: no
    - name: input
      prompt: "Enter input for role"
      private: no
  tasks:
    - set_fact:
        target_host: "{{ hostname }}"
        target_vault: "{{ vault }}"
        for_role: "{{ input }}"
        current_time: "{{ ansible_date_time.iso8601 }}" # Added fact for current time

- name: Execution play
  hosts: "{{ hostvars['localhost']['target_host'] }}"
  gather_facts: no
  vars_files:
    - "vault/{{ hostvars['localhost']['target_vault'] }}.yml"
  tasks:
    - include_role:
        name: target_role
      vars:
        param: "{{ hostvars['localhost']['for_role'] }}"
        timestamp: "{{ hostvars['localhost']['current_time'] }}" # Passed current_time to Execution Play via hostvars
Now, when I execute, I get the error that the vault hostvars variable is undefined in the Execution Play. After some experimenting, I've found that setting gather_facts: yes in the Staging Play is what is triggering the issue.
However, I need gather_facts enabled in order to use ansible_date_time. I've already verified via debug that the facts are being recorded properly and can be read via hostvars within the Staging Play; just not in the following Execution Play. After hours of research, I can't find any reason why gathering facts in the Staging Play should affect hostvars for the Execution Play, or any idea on how to fix it.
At the end of the day, all I need is the current time passed to the included role. Anyone who can come up with a solution that actually works in this use case wins Employee of the Month. Bonus points if you can explain the initial issue with gather_facts.
Thanks!
So, I had to reinvent the wheel a bit, but came up with a much cleaner solution. I simply created a default value for a timestamp in the role itself and added a setup call for date/time at the appropriate point, conditional on there being no existing value for the variable in question.
- name: Gather date and time.
  setup:
    gather_subset: date_time
  when: timestamp is undefined and ansible_date_time is undefined
I was able to leave gather_facts set to no in the dependent playbook but I still have no idea why setting it to yes broke anything in the first place. Any insight in this regard would be appreciated.
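To make that pattern concrete, a minimal sketch of how the role tasks might resolve the value (the name role_timestamp is illustrative, not from the original answer):

- name: Gather date and time only when nothing was passed in.
  setup:
    gather_subset: date_time
  when: timestamp is undefined and ansible_date_time is undefined

- name: Resolve the timestamp, preferring a caller-supplied value.
  set_fact:
    role_timestamp: "{{ timestamp | default(ansible_date_time.iso8601) }}"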
... if you can explain the initial issue with gather_facts ... Any insight in this regard would be appreciated.
This is caused by variable precedence, and because Ansible does not simply "overwrite or set a new value" for a variable; the effective value depends on when and where the variable becomes defined.
You can test this with the following example:
---
- hosts: localhost
  become: false
  gather_facts: false
  tasks:
    - name: Show Gathered Facts
      debug:
        msg: "{{ hostvars['localhost'].ansible_facts }}" # will be {} only

    - name: Gather date and time only
      setup:
        gather_subset:
          - 'date_time'
          - '!min'

    - name: Show Gathered Facts
      debug:
        msg: "{{ ansible_facts }}" # from hostvars['localhost'] again
and "try to break it" by adding
- name: Set Fact
  set_fact:
    ansible_date_time:
      date: '1970-01-01'

- name: Show Facts
  debug:
    msg: "{{ hostvars['localhost'] }}"
Note that for your use case you should use

gather_subset:
  - 'date_time'
  - '!min'

since you are interested in ansible_date_time only. See What is the exact list of Ansible setup min? for details.
Also be aware of fact caching, since "When created with set_facts’s cacheable option, variables have the high precedence in the play, but are the same as a host facts precedence when they come from the cache."
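As a small, hedged illustration of that cacheable behavior (the variable name and value are arbitrary):

- set_fact:
    build_id: '20240101'
    cacheable: true # saved to the fact cache; on later runs it returns with host-fact precedence rather than set_fact precedence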

Getting ansible_play_hosts from previous play?

I have an ansible playbook that interacts with the management card in a bunch of servers, and then produces a report based on that information. Structurally it looks like:
- hosts: all
  tasks:
    - name: do something with redfish
      uri:
        ...
      register: something

- hosts: localhost
  tasks:
    - name: produce report
      template:
        ...
      loop: "{{ SOME_LIST_OF_HOSTS }}"
Originally, the template task in the second play was looping over groups.all, but that causes a number of complications if we limit the target hosts using -l on the command line (like ansible-playbook -l only_cluster_a ...). In that case, I would like the template task to loop over only the hosts targeted by the first play. In other words, I want to know ansible_play_hosts_all from the previous play.
This is what I've come up with:
- hosts: all
  gather_facts: false
  tasks:
    - delegate_to: localhost
      delegate_facts: true
      run_once: true
      set_fact:
        saved_play_hosts: "{{ ansible_play_hosts_all }}"
    # ...other tasks go here...

- hosts: localhost
  gather_facts: false
  tasks:
    - debug:
        msg:
          play_hosts: "{{ saved_play_hosts }}"
Is that the best way of doing this?
You could use the add_host module: at the end of the first play, add a task:
- name: add variables to dummy host
  add_host:
    name: "variable_holder"
    shared_variable: "{{ saved_play_hosts }}"
and then pick the value up in the second play:
- name: second play
  hosts: localhost
  vars:
    play_hosts: "{{ hostvars['variable_holder']['shared_variable'] }}"
  tasks:
    # ...
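Putting the two pieces together, a minimal end-to-end sketch (the dummy host name variable_holder comes from the answer above; reading via hostvars directly avoids shadowing the built-in play_hosts variable):

- hosts: all
  gather_facts: false
  tasks:
    - name: stash the targeted host list on a dummy host
      add_host:
        name: variable_holder
        shared_variable: "{{ ansible_play_hosts_all }}"
      run_once: true
    # ...other tasks go here...

- hosts: localhost
  gather_facts: false
  tasks:
    - debug:
        msg: "{{ hostvars['variable_holder']['shared_variable'] }}"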

lineinfile ansible module skips a line

I have a need to know the index of host names in the inventory. I am using the code below to create a variable file that I can use in a subsequent playbook:
- name: Debug me
  hosts: hosts
  tasks:
    - debug: msg="{{ inventory_hostname }}"
    - debug: msg="{{ play_hosts.index(inventory_hostname) }}"
    - local_action: 'lineinfile create=yes dest=/tmp/test.conf
        line="host{{ play_hosts.index(inventory_hostname) }}= {{ inventory_hostname }}"'
I have the following inventory file
[hosts]
my.host1.com
my.host2.com
Now when I run this, the test.conf generated under /tmp sometimes has both hostnames, like this:
host1= my.host2.com
host0= my.host1.com
When I run the same playbook a few times, emptying test.conf before each run, quite often the file ends up with only one entry:
host1= my.host2.com
or
host0= my.host1.com
How come the same Ansible playbook behaves differently between runs?
I believe the issue is that you're running two threads against different hosts, and using local_action this way is not thread safe.
Try using the serial keyword, which runs the play to completion on one host at a time:
- name: Debug me
  hosts: hosts
  serial: 1
  tasks:
    - debug: msg="{{ inventory_hostname }}"
    - debug: msg="{{ play_hosts.index(inventory_hostname) }}"
    - local_action: 'lineinfile create=yes dest=/tmp/test.conf
        line="host{{ play_hosts.index(inventory_hostname) }}= {{ inventory_hostname }}"'
Edit: A better way to do this, if you're just trying to operate on the list of hosts in inventory from localhost, is to avoid doing the action per host with local_action in the first place.
- name: Debug me
  hosts: localhost
  tasks:
    - lineinfile:
        create: yes
        dest: /tmp/test.conf
        line: "host{{ groups['hosts'].index(item) }}={{ item }}"
      with_items: "{{ groups['hosts'] }}"
This will get you the results you desire. Then you can add another play to do operations against the hosts themselves.
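With the two-host inventory shown earlier, this version is deterministic; /tmp/test.conf should always come out as:

host0=my.host1.com
host1=my.host2.com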
My solution to avoid race conditions with the non-thread-safe local_action: lineinfile writing gathered data to a local file: split the work across two plays in the same file.
eg:
- name: gather_date
  hosts: all
  any_errors_fatal: false
  gather_facts: no
  tasks:
    - name: get_Aptus_device_count_list
      shell: gather_data.sh
      become: true
      register: Aptus_device_count_list
      changed_when: false
- name: Log_gathered_date
  hosts: all
  any_errors_fatal: false
  gather_facts: no
  tasks:
    - name: log_gathered_info
      local_action:
        module: lineinfile
        dest: /home/rms-mit/MyAnsible/record_Device_count_collection.out
        line: "\n--- {{ inventory_hostname }} --- \n {{ Aptus_device_count_list.stdout }} \n.\n---\n"
      changed_when: false
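On Ansible 2.9 and later, an alternative not mentioned in the original answer is to keep a single play and serialize only the file-writing task with the throttle keyword (the path here is illustrative):

- name: log_gathered_info
  local_action:
    module: lineinfile
    dest: /tmp/record_device_count.out # illustrative path
    line: "--- {{ inventory_hostname }} ---\n{{ Aptus_device_count_list.stdout }}"
  throttle: 1 # only one worker runs this task at a time, avoiding the race
  changed_when: false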

How to prompt user for a target host in Ansible?

I want to write a bootstrapper playbook for new machines in Ansible which will reconfigure the network settings. At the time of the first execution, target machines will have a DHCP-assigned address.
The user who is supposed to execute the playbook knows the assigned IP address of a new machine. I would like to prompt the user for its value.
The vars_prompt section allows getting input from the user; however, it is defined at the play level alongside hosts, which effectively prevents using the prompted value as the host address.
Is this possible without a wrapper script that modifies the inventory file?
The right way to do this is to create a dynamic host with add_host and place it in a new group, then start a new play that targets that group. That way, if you have other connection vars that need to be set ahead of time (credentials/keys/etc) you could set them on an empty group in inventory, then add the host to it dynamically (see the inventory sketch after the example below). For example:
- hosts: localhost
  gather_facts: no
  vars_prompt:
    - name: target_host
      prompt: please enter the target host IP
      private: no
  tasks:
    - add_host:
        name: "{{ target_host }}"
        groups: dynamically_created_hosts

- hosts: dynamically_created_hosts
  tasks:
    - debug: msg="do things on target host here"
You could pass it with extra-vars instead.
Simply make your hosts section a variable such as {{ hosts_prompt }} and then pass the host on the command line like so:
ansible-playbook -i inventory/environment playbook.yml --extra-vars "hosts_prompt=192.168.1.10"
Or if you are using the default inventory file location of /etc/ansible/hosts you could simply use:
ansible-playbook playbook.yml --extra-vars "hosts_prompt=192.168.1.10"
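The playbook side of this approach would look roughly like (a minimal sketch):

- hosts: "{{ hosts_prompt }}"
  tasks:
    - debug: msg="do things on target host here"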
Adding to Matt's answer for multiple hosts.
An example input would be 192.0.2.10,192.0.2.11:
- hosts: localhost
  gather_facts: no
  vars_prompt:
    - name: target_host
      prompt: please enter the target host IP
      private: no
  tasks:
    - add_host:
        name: "{{ item }}"
        groups: dynamically_created_hosts
      with_items: "{{ target_host.split(',') }}"

- hosts: dynamically_created_hosts
  tasks:
    - debug: msg="do things on target host here"
Disclaimer: The accepted answer offers the best solution to the problem. While this one works, it is based on a hack, and I leave it here as a reference.
I found out it is possible to use a currently undocumented hack (credit to Bruce P for pointing me to the post) that turns the value of the -i / --inventory parameter into an ad hoc list of hosts (reference). With just the hostname/IP address and a trailing comma (like below) it refers to a single host, without the need for the inventory file to exist.
Command:
ansible-playbook -i "192.168.1.21," playbook.yml
And accordingly playbook.yml can be run against all hosts (which in the above example will be equal to a single host 192.168.1.21):
- hosts: all
The list might contain more than one IP address: -i "192.168.1.21,192.168.1.22"
Adding to Jacob's and Matt's examples, with the inclusion of a username and password prompt:
---
- hosts: localhost
  pre_tasks:
    - name: verify_ansible_version
      assert:
        that: "ansible_version.full is version_compare('2.10.7', '>=')"
        msg: "Error: You must update Ansible to at least version 2.10.7 to run this playbook..."
  vars_prompt:
    - name: target_hosts
      prompt: |
        Enter Target Host IP[s] or Hostname[s] (comma separated)
        (example: 1.1.1.1,myhost.example.com)
      private: false
    - name: username
      prompt: Enter Target Host[s] Login Username
      private: false
    - name: password
      prompt: Enter Target Host[s] Login Password
      private: true
  tasks:
    - add_host:
        name: "{{ item }}"
        groups: host_groups
      with_items:
        - "{{ target_hosts.split(',') }}"
    - add_host:
        name: login
        username: "{{ username }}"
        password: "{{ password }}"

- hosts: host_groups
  remote_user: "{{ hostvars['login']['username'] }}"
  vars:
    ansible_password: "{{ hostvars['login']['password'] }}"
    ansible_become: yes
    ansible_become_method: sudo
    ansible_become_pass: "{{ hostvars['login']['password'] }}"
  roles:
    - my_role
