I have a play, "partial_upgrade.yml", that uses vars_prompt to ask the user for input.
I import this playbook from another playbook, "choose_play.yml", which imports it based on a condition.
Even though "partial_upgrade.yml" is skipped, it still prompts for user input.
1. choose_play.yml
---
- hosts: localhost
  vars_prompt:
    - name: "option"
      prompt: |
        >>> 1. Partial Upgrade
        >>> 2. Full Upgrade
        >>> Enter the option which you want to run -
      private: no
  tasks:
    - set_fact:
        option: "{{ option }}"
    - debug:
        var: option

- import_playbook: partial_upgrade.yml
  vars:
    partial_upgrade: true
    full_upgrade: false
  when: hostvars["localhost"]["option"]|int == 1

- import_playbook: full-upgrade.yml
  vars:
    partial_upgrade: false
    full_upgrade: true
  when: hostvars["localhost"]["option"]|int == 2
2. deploy.yml
---
- hosts: nodes
  gather_facts: false
  sudo: true
  vars_prompt:
    - name: "server_ip"
      prompt: "Enter Server IP"
      private: no
    - name: "server_path"
      prompt: "Enter Server path"
      private: no
  roles:
    - setup-master
    - setup-worker
When I run "choose_play.yml" and press 2, it skips "partial_upgrade.yml" but still prompts for "Server IP" and "Server path".
I don't want to enter the details for the skipped play.
How can I disable the vars_prompt when the play is skipped?
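One workaround (a sketch, not from the original thread): vars_prompt is evaluated when the play starts, not per task, so it always fires. Moving the prompt into a pause task makes it an ordinary task, and the when: condition inherited from import_playbook then skips it along with everything else. Variable names follow the question; the filename is assumed:

```yaml
# partial_upgrade.yml (sketch): prompt via a task instead of vars_prompt,
# so the `when:` inherited from import_playbook skips it with the rest.
- hosts: nodes
  gather_facts: false
  tasks:
    # run_once: the prompt appears a single time, and the registered
    # result is broadcast to every host in the play
    - pause:
        prompt: "Enter Server IP"
      register: ip_prompt
      run_once: true

    - set_fact:
        server_ip: "{{ ip_prompt.user_input }}"
```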
I have this simple playbook named delete.yml
- hosts: all
  become: false
  tasks:
    - pause:
        prompt: "Are you sure you want to delete \" EVERYTHING \"? Please confirm with \"yes\". Abort with \"no\" or Ctrl+c and then \"a\""
      register: confirm_delete
    - set_fact:
        confirm_delete_fact: "{{ confirm_delete.user_input | bool }}"

- hosts: all
  become: false
  roles:
    - { role: destroy, when: confirm_delete_fact }
my inventory is
[my_group]
192.168.10.10
192.168.10.11
192.168.10.12
so I run the playbook with
ansible-playbook delete.yml -i inventoryfile -l my_group
Everything works, but only for one host; the others in my_group are skipped because of the conditional check.
What is wrong?
You could try this:
- hosts: all
  become: false
  tasks:
    - pause:
        prompt: "Are you sure you want to delete \" EVERYTHING \"? Please confirm with \"yes\". Abort with \"no\" or Ctrl+c and then \"a\""
      register: confirm_delete
    - name: Register dummy host with variable
      add_host:
        name: "DUMMY_HOST"
        confirm_delete_fact: "{{ confirm_delete.user_input | bool }}"

- hosts: all
  become: false
  vars:
    confirm_delete_fact: "{{ hostvars['DUMMY_HOST']['confirm_delete_fact'] }}"
  roles:
    - { role: destroy, when: confirm_delete_fact }
If you don't want an error on DUMMY_HOST (Ansible trying to connect to it over SSH), just use:
- hosts: all,!DUMMY_HOST
Explanation:
If you put your prompt in a task, it runs only once and its registered result belongs to the hostvars of the first host. So I create a dummy host and use it to pass the variable to the other play.
You could avoid all of that by putting the prompt above the tasks (as vars_prompt) and reading the variable through hostvars:
- hosts: all
  become: false
  vars_prompt:
    - name: confirm_delete
      prompt: "Are you sure you want to delete \" EVERYTHING \"? Please confirm with \"yes\". Abort with \"no\" or Ctrl+c and then \"a\""
      private: no
      default: no
  tasks:
    - set_fact:
        confirm_delete_fact: "{{ confirm_delete | bool }}"

- hosts: all
  become: false
  roles:
    - { role: destroy, when: hostvars[inventory_hostname]['confirm_delete_fact'] }
You can use the second solution here because you have the same hosts in both plays. If the hosts differ, I suggest the first solution.
When I run ansible-playbook --tags tag2, why doesn't it skip the vars_prompt from tag1? It does skip the debug msg from tag1. This is forcing me to write two different playbooks.
---
- name: variable print using var
  hosts: all
  gather_facts: no
  tags: tag1
  vars_prompt:
    - name: ask_user
      prompt: enter your name
      private: no
  tasks:
    - debug:
        msg: "{{ ask_user }} works in ABC company"

- hosts: all
  gather_facts: no
  tags: tag2
  tasks:
    - name: normal message
      debug:
        msg: "This is 2nd tag"
Q: "--tags does not skip vars_prompt in Ansible"
A: vars_prompt is not a task and therefore can't be skipped by tags. Quoting from the Tags documentation:
Using tags to execute or skip selected tasks is a two-step process:
Add tags to your tasks, either individually or with tag inheritance from a block, play, role, or import.
Select or skip tags when you run your playbook.
If you want to be able to skip the prompting for a variable, use pause instead of vars_prompt. For example, the playbook below does what you want:
- hosts: localhost
  gather_facts: false
  tags: tag1
  tasks:
    - pause:
        prompt: Enter your name
        echo: true
      register: result
    - set_fact:
        ask_user: "{{ result.user_input }}"
    - debug:
        msg: "{{ ask_user }} works in ABC company"

- hosts: localhost
  gather_facts: false
  tags: tag2
  tasks:
    - debug:
        msg: This is tag2
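With the prompt living in a tagged task, tag selection now behaves as expected. The invocations below are a sketch (the playbook filename is assumed):

```shell
# Runs only the tag2 play; the pause prompt in the tag1 play is skipped.
ansible-playbook playbook.yml --tags tag2

# Runs the tag1 play, including the prompt.
ansible-playbook playbook.yml --tags tag1
```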
In setting up a new Raspberry Pi with Ansible, I would like to perform the following actions:
Using the default pi user, create a new user named my_new_admin
Using the new my_new_admin user, deactivate the default pi user
Continue executing the playbook as my_new_admin
I am finding this difficult to achieve in a single playbook. Is it even possible to switch the active user like this in Ansible?
# inventory.yaml
---
all:
  children:
    rpis:
      hosts:
        myraspberrypi.example.com:
          ansible_user: my_new_admin # or should `pi` go here?
...
# initialize.yaml
- hosts: rpis
  remote_user: 'pi'
  become: true
  tasks:
    - name: 'create new user'
      user:
        name: 'my_new_admin'
        append: true
        groups:
          - 'sudo'
    - name: 'add SSH key to my_new_admin'
      *snip*
    - name: 'lock default user'
      remote_user: 'my_new_admin'
      user:
        name: 'pi'
        expires: '{{ ("1970-01-02 00:00:00" | to_datetime).timestamp() | float }}'
        password_lock: true
...
If you want to switch users, the easiest solution is to start another play. For example, the following playbook will run the first play as user pi and the second play as user root:
- hosts: pi
  gather_facts: false
  remote_user: pi
  tasks:
    - command: whoami
      register: whoami
    - debug:
        msg: "{{ whoami.stdout }}"

- hosts: pi
  gather_facts: false
  remote_user: root
  tasks:
    - command: whoami
      register: whoami
    - debug:
        msg: "{{ whoami.stdout }}"
In this playbook I'm being explicit about remote_user in both plays, but you could also set a user in your inventory and only override it when necessary. E.g., if I have:
pi ansible_host=raspberrypi.local ansible_user=root
Then I could rewrite the above playbook like this:
- hosts: pi
  gather_facts: false
  vars:
    ansible_user: pi
  tasks:
    - command: whoami
      register: whoami
    - debug:
        msg: "{{ whoami.stdout }}"

- hosts: pi
  gather_facts: false
  tasks:
    - command: whoami
      register: whoami
    - debug:
        msg: "{{ whoami.stdout }}"
Note that I'm setting the ansible_user variable here rather than using remote_user, because ansible_user appears to take precedence.
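If you only need a one-off override, extra vars sit at the top of Ansible's precedence order, so the connection user can also be forced from the command line (inventory and playbook filenames assumed):

```shell
# -e (extra vars) outranks both the inventory ansible_user
# and any play-level vars:
ansible-playbook -i inventory playbook.yml -e ansible_user=root
```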
I have 2 plays in a playbook. One of them prompts the user for input, and I want to use that variable in the other play. How can I do that?
---
- hosts: workers
  gather_facts: false
  sudo: true
  vars_prompt:
    - name: "server_ip"
      prompt: "Enter the Server IP"
      private: no
  roles:
    - client-setup-worker

- hosts: master
  gather_facts: false
  sudo: true
  vars:
    server: "{{ hostvars['workers']['server_ip'] }}"
  roles:
    - client-setup-master
In the above playbook I want the server_ip prompted for on the workers hosts to be usable on the master hosts.
I am getting the error "The error was: \"hostvars['workers']\" is undefined" when doing so.
That's because workers is evidently a group, and not a host, and only hosts appear in hostvars.
You'll need to grab one of the hosts from the workers group and then read its fact; I believe this will do that:
(updated after seeing the mostly correct answer by @VladimirBotka)
- hosts: workers
  vars_prompt:
    # as before ...
  pre_tasks:
    - name: re-export the vars_prompt for cross playbook visibility
      set_fact:
        server_ip: '{{ server_ip }}'
  roles:
    # as before

- hosts: masters
  vars:
    server: '{{ hostvars[(groups.workers|first)].server_ip }}'
The scope of a variable declared in vars_prompt is the play.
Such a variable may be put into hostvars with set_fact "to use that variable in another play". For example, the plays below
- hosts: test_01
  gather_facts: false
  vars_prompt:
    - name: "server_ip"
      prompt: "Enter the Server IP"
      private: no
  tasks:
    - set_fact:
        stored_server_ip: "{{ server_ip }}"
    - debug:
        var: stored_server_ip

- hosts: test_02
  gather_facts: false
  vars:
    server: "{{ hostvars.test_01.stored_server_ip }}"
  tasks:
    - debug:
        var: server
gives (abridged):
ok: [test_01] => {
    "stored_server_ip": "10.1.0.10"
}
ok: [test_02] => {
    "server": "10.1.0.10"
}
I have a playbook which reads in a list of variables:
vars_files:
- vars/myvariables.yml
tasks:
- name: Debug Variable List
debug:
msg: "An item: {{item}}"
with_list: "{{ myvariables }}"
This prints out the list "myvariables" from the file vars/myvariables.yml, which contains:
---
myvariables:
  - variable1
  - variable2
I get the following as expected.
"msg": "An item: variable1"
"msg": "An item: variable2"
However, when I connect to another host, and run the same Debug statement, it throws an error:
vars_files:
  - vars/myvariables.yml
tasks:
  - name: Configure instance(s)
    hosts: launched
    become: True
    remote_user: ubuntu
    port: 22
    gather_facts: False
    tasks:
      - name: Wait for SSH to come up
        delegate_to: ***
        remote_user: ubuntu
        connection: ssh
        register: item
      - name: Debug Variable List
        debug:
          msg: "An item: {{ item }}"
        with_list: "{{ myvariables }}"
OUTPUT:
"msg": "'myvariables' is undefined"
How do I define the variables file when connecting to another host that is not localhost?
Any help on this would be greatly appreciated.
With "hosts: launched" you start a new play. Put the vars_files: into the scope of that play (see below).
- name: Configure instance(s)
  hosts: launched
  become: True
  remote_user: ubuntu
  port: 22
  gather_facts: False
  vars_files:
    - vars/myvariables.yml
  tasks:
Review the documentation on scoping variables.