Deactivate the Current Ansible User with Ansible

In setting up a new Raspberry Pi with Ansible, I would like to perform the following actions:
1. Using the default pi user, create a new user named my_new_admin.
2. Using the new my_new_admin user, deactivate the default pi user.
3. Continue executing the playbook as my_new_admin.
I am finding this difficult to achieve in a single playbook. Is it even possible to switch the active user like this in Ansible?
# inventory.yaml
---
all:
  children:
    rpis:
      hosts:
        myraspberrypi.example.com:
          ansible_user: my_new_admin # or should `pi` go here?
...
# initialize.yaml
- hosts: rpis
  remote_user: 'pi'
  become: true
  tasks:
    - name: 'create new user'
      user:
        name: 'my_new_admin'
        append: true
        groups:
          - 'sudo'
    - name: 'add SSH key to my_new_admin'
      *snip*
    - name: 'lock default user'
      remote_user: 'my_new_admin'
      user:
        name: 'pi'
        expires: '{{ ("1970-01-02 00:00:00" | to_datetime).timestamp() | float }}'
        password_lock: true
...

If you want to switch users, the easiest solution is to start another play. For example, the following playbook will run the first play as user pi and the second play as user root:
- hosts: pi
  gather_facts: false
  remote_user: pi
  tasks:
    - command: whoami
      register: whoami
    - debug:
        msg: "{{ whoami.stdout }}"

- hosts: pi
  gather_facts: false
  remote_user: root
  tasks:
    - command: whoami
      register: whoami
    - debug:
        msg: "{{ whoami.stdout }}"
In this playbook I'm being explicit about remote_user in both plays, but you could also set a user in your inventory and only override it when necessary. E.g., if I have:
pi ansible_host=raspberrypi.local ansible_user=root
Then I could rewrite the above playbook like this:
- hosts: pi
  gather_facts: false
  vars:
    ansible_user: pi
  tasks:
    - command: whoami
      register: whoami
    - debug:
        msg: "{{ whoami.stdout }}"

- hosts: pi
  gather_facts: false
  tasks:
    - command: whoami
      register: whoami
    - debug:
        msg: "{{ whoami.stdout }}"
Note that I'm setting the ansible_user variable here rather than using remote_user, because it looks as if ansible_user has precedence.
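Applied to the question above, a minimal sketch could look like the following. It assumes the inventory keeps ansible_user: my_new_admin and the first play overrides it with pi; the public-key path files/my_new_admin.pub is only an illustrative placeholder:

# initialize.yaml (sketch)
- hosts: rpis
  become: true
  vars:
    ansible_user: pi            # override the inventory user for this play only
  tasks:
    - name: 'create new user'
      user:
        name: 'my_new_admin'
        append: true
        groups:
          - 'sudo'
    - name: 'add SSH key to my_new_admin'
      authorized_key:
        user: 'my_new_admin'
        key: "{{ lookup('file', 'files/my_new_admin.pub') }}"   # assumed key location

- hosts: rpis                   # connects with the inventory user, my_new_admin
  become: true
  tasks:
    - name: 'lock default user'
      user:
        name: 'pi'
        password_lock: true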

Related

Ansible vars_prompt for host list into Import_playbook

I want to use a specific host / host list for an imported playbook, which I get from a vars_prompt input. How can I do this? So far I haven't been able to get this running.
I have two playbooks which I need to run separately; ios_check_routerports.yaml is the parent playbook:
ios_check_routerports.yaml
---
- hosts: '{{ branch_number }}'
  connection: network_cli
  gather_facts: False
  any_errors_fatal: no
  throttle: 75
  vars_prompt:
    - name: "branch_number"
      prompt: "Which branch do you want to check?"
      default: all
      private: no
  tasks:
    - name: Check facts
      ios_facts:
        gather_subset: hardware
    - name: Create directory
      file:
        path: /root/ansible/pb-outputs/ios_check_routerports/
        state: directory
      delegate_to: 127.0.0.1

- name: Run playbook
  import_playbook: ios_check_routerports_main.yaml
ios_check_routerports_main.yaml
---
- hosts: '{{ branch_number }}'
  connection: network_cli
  gather_facts: False
  any_errors_fatal: no
  throttle: 75
  tasks:
    - name: Check default-gateway
      ios_command:
        commands: sh run | i default-gateway
      register: default_gateway
I tried to set a fact for the var {{ branch_number }} like this:
ios_check_routerports.yaml
- set_fact:
    devices: "{{ branch_number }}"
ios_check_routerports_main.yaml
---
- hosts: '{{ devices }}'
  connection: network_cli
The playbook always runs into an error because the hosts var is not defined. What am I doing wrong here?
Try this: there is no need to create a new devices variable; register a dummy host instead.
In ios_check_routerports.yaml, add a task:
- name: Register dummy host with variable
  add_host:
    name: "DUMMY_HOST"
    DEVICES: "{{ branch_number }}"
Then, in ios_check_routerports_main.yaml:
- hosts: "{{ hostvars['DUMMY_HOST']['DEVICES'] }}"
  connection: network_cli
Since you create a new host and there is no remove_host module, I suggest excluding it if you no longer need the branch_number variable:
either run - meta: refresh_inventory as a first task,
or modify your hosts pattern like this:
- hosts: "{{ hostvars['DUMMY_HOST']['DEVICES'] }},!DUMMY_HOST"
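Putting the pieces together, the flow could look roughly like this (same names as above; the dummy host only carries the prompted value into the imported playbook):

# ios_check_routerports.yaml (sketch)
---
- hosts: '{{ branch_number }}'
  connection: network_cli
  gather_facts: False
  vars_prompt:
    - name: "branch_number"
      prompt: "Which branch do you want to check?"
      default: all
      private: no
  tasks:
    - name: Register dummy host with variable
      add_host:
        name: "DUMMY_HOST"
        DEVICES: "{{ branch_number }}"

- name: Run playbook
  import_playbook: ios_check_routerports_main.yaml

# ios_check_routerports_main.yaml (sketch)
---
- hosts: "{{ hostvars['DUMMY_HOST']['DEVICES'] }},!DUMMY_HOST"
  connection: network_cli
  gather_facts: False
  tasks:
    - name: Check default-gateway
      ios_command:
        commands: sh run | i default-gateway
      register: default_gateway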

Ansible - prompt for a confirmation to run tasks and share the fact across multiple hosts

I have this simple playbook named delete.yml
- hosts: all
  become: false
  tasks:
    - pause:
        prompt: "Are you sure you want to delete \" EVERYTHING \"? Please confirm with \"yes\". Abort with \"no\" or Ctrl+c and then \"a\""
      register: confirm_delete
    - set_fact:
        confirm_delete_fact: "{{ confirm_delete.user_input | bool }}"

- hosts: all
  become: false
  roles:
    - { role: destroy, when: confirm_delete_fact }
my inventory is
[my_group]
192.168.10.10
192.168.10.11
192.168.10.12
so I run the playbook with
ansible-playbook delete.yml -i inventoryfile -l my_group
Everything works, but only for one host; the others in my_group are skipped because of the conditional check.
What is wrong?
You could try this:
- hosts: all
  become: false
  tasks:
    - pause:
        prompt: "Are you sure you want to delete \" EVERYTHING \"? Please confirm with \"yes\". Abort with \"no\" or Ctrl+c and then \"a\""
      register: confirm_delete
    - name: Register dummy host with variable
      add_host:
        name: "DUMMY_HOST"
        confirm_delete_fact: "{{ confirm_delete.user_input | bool }}"

- hosts: all
  become: false
  vars:
    confirm_delete_fact: "{{ hostvars['DUMMY_HOST']['confirm_delete_fact'] }}"
  roles:
    - { role: destroy, when: confirm_delete_fact }
If you don't want an error on DUMMY_HOST (Ansible trying to connect to it over SSH), just use:
- hosts: all,!DUMMY_HOST
Explanation:
If you put the prompt in a task, it runs only once and its result belongs to the hostvars of the first host, so I create a new dummy host and pass the variable to the other play through it.
You can avoid that by putting the prompt above the tasks (vars_prompt) and testing the variable through hostvars:
- hosts: all
  become: false
  vars_prompt:
    - name: confirm_delete
      prompt: "Are you sure you want to delete \" EVERYTHING \"? Please confirm with \"yes\". Abort with \"no\" or Ctrl+c and then \"a\""
      private: no
      default: no
  tasks:
    - set_fact:
        confirm_delete_fact: "{{ confirm_delete | bool }}"

- hosts: all
  become: false
  roles:
    - { role: destroy, when: hostvars[inventory_hostname]['confirm_delete_fact'] }
You could use the second solution here because you have the same hosts in both plays. If the hosts were different, I would suggest the first solution.

How to set a fact which is visible on all hosts in an Ansible role

I'm setting a fact in a role:
- name: Check if manager already configured
  shell: >
    docker info | perl -ne 'print "$1" if /Swarm: (\w+)/'
  register: swarm_status

- name: Init cluster
  shell: >-
    docker swarm init
    --advertise-addr "{{ ansible_default_ipv4.address }}"
  when: "'active' not in swarm_status.stdout_lines"

- name: Get worker token
  shell: docker swarm join-token -q worker
  register: worker_token_result

- set_fact:
    worker_token: "{{ worker_token_result.stdout }}"
Then I want to access worker_token on other hosts. Here's my main playbook; the fact is defined in the swarm-master role:
- hosts: swarm_cluster
  become: yes
  roles:
    - docker

- hosts: swarm_cluster:&manager
  become: yes
  roles:
    - swarm-master

- hosts: swarm_cluster:&node
  become: yes
  tasks:
    - debug:
        msg: "{{ worker_token }}"
I'm getting an undefined variable error. How do I make it visible globally?
Of course, it works perfectly if I run the debug on the same host.
If your goal is just to access worker_token from another host, you can use the hostvars variable and iterate through the group where you've defined the variable, like this:
- hosts: swarm_cluster:&node
  tasks:
    - debug:
        msg: "{{ hostvars[item]['worker_token'] }}"
      with_items: "{{ groups['manager'] }}"
If your goal is to define the variable globally, you can add a step to define a variable on all hosts like this:
- hosts: all
  tasks:
    - set_fact:
        worker_token_global: "{{ hostvars[item]['worker_token'] }}"
      with_items: "{{ groups['manager'] }}"

- hosts: swarm_cluster:&node
  tasks:
    - debug:
        var: worker_token_global
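Alternatively, the extra play against all hosts can be avoided by delegating the fact from a manager to each node. This is only a sketch of that pattern, assuming the manager and node groups from the playbook above:

- hosts: swarm_cluster:&manager
  become: yes
  tasks:
    - name: Push worker_token to every node as a fact
      set_fact:
        worker_token: "{{ worker_token }}"
      delegate_to: "{{ item }}"
      delegate_facts: true      # assign the fact to the delegated host, not the manager
      run_once: true
      loop: "{{ groups['node'] }}"

With delegate_facts: true the fact lands on each delegated host, so the later swarm_cluster:&node play can reference worker_token directly.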

Ansible does not read in variables when connected to another host

I have a playbook which reads in a list of variables:
vars_files:
  - vars/myvariables.yml
tasks:
  - name: Debug Variable List
    debug:
      msg: "An item: {{item}}"
    with_list: "{{ myvariables }}"
This prints out the list of "myvariables" from the file vars/myvariables.yml, which contains:
---
myvariables:
  - variable1
  - variable2
I get the following as expected.
"msg": "An item: variable1"
"msg": "An item: variable2"
However, when I connect to another host and run the same debug statement, it throws an error:
vars_files:
  - vars/myvariables.yml
tasks:

- name: Configure instance(s)
  hosts: launched
  become: True
  remote_user: ubuntu
  port: 22
  gather_facts: False
  tasks:
    - name: Wait for SSH to come up
      delegate_to: ***
      remote_user: ubuntu
      connection: ssh
      register: item
    - name: Debug Variable List
      debug:
        msg: "An item: {{item}}"
      with_list: "{{ myvariables }}"
OUTPUT:
"msg": "'myvariables' is undefined"
How do I define the variables file when connecting to another host that is not localhost?
Any help on this would be greatly appreciated.
With "hosts: launched" you started a new play. Put vars_files: into the scope of that play (see below).
- name: Configure instance(s)
  hosts: launched
  become: True
  remote_user: ubuntu
  port: 22
  gather_facts: False
  vars_files:
    - vars/myvariables.yml
  tasks:
    - name: Debug Variable List
      debug:
        msg: "An item: {{ item }}"
      with_list: "{{ myvariables }}"
Review Scoping variables in the Ansible documentation.

Ansible - Execute a final mandatory role in the playbook even if any previous role in the execution flow fails

I have an Ansible playbook something like this:
---
- hosts: localhost
  connection: local
  tasks:
    - set_fact:
        build_date_time: "{{ ansible_date_time }}"

- hosts: localhost
  connection: local
  gather_facts: false
  vars:
    role: Appvariables
    base_image_tag: Base
  roles:
    - role: begin-building-ami

- hosts: just_created
  remote_user: "{{ default_user }}"
  vars:
    role: AppName
  roles:
    - { role: role1, become: yes }
    - role: role2
    - { role: role3, become: yes }
    - { role: role4, become: yes }
    - { role: role5, become: yes }

- hosts: localhost
  connection: local
  gather_facts: false
  vars:
    role: Appname
    ansible_date_time: "{{ build_date_time }}"
  roles:
    - finish-building-ami
In this case we need the finish-building-ami role to run so that the instance is terminated after the AMI is baked. If any of the previous roles (role1-role5) fails, the playbook stops and we are left with a failed instance that should have been terminated automatically; right now we terminate it manually.
So finish-building-ami (the mandatory role that stops the instance, takes the AMI, and terminates the instance at the end) needs to run even if any of role1-role5 fails in the playbook above.
You can rewrite your existing play to use import_role or include_role tasks instead of the roles section. This allows you to use blocks:
---
- hosts: localhost
  gather_facts: false
  tasks:
    - block:
        - import_role:
            name: role1
        - import_role:
            name: role2
          become: true
        - import_role:
            name: role3
      rescue:
        - set_fact:
            role_failed: true

- hosts: localhost
  gather_facts: false
  tasks:
    - debug:
        msg: This task runs after our roles.
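Adapted to the playbook in the question, the middle play could wrap role1-role5 in a block so that the final finish-building-ami play still runs when one of them fails. This is only a sketch reusing the names from the question:

- hosts: just_created
  remote_user: "{{ default_user }}"
  vars:
    role: AppName
  tasks:
    - block:
        - import_role:
            name: role1
          become: yes
        - import_role:
            name: role2
        - import_role:
            name: role3
          become: yes
        - import_role:
            name: role4
          become: yes
        - import_role:
            name: role5
          become: yes
      rescue:
        # record the failure but let the playbook continue
        - set_fact:
            role_failed: true

# the final play from the question then runs as before, even after a rescue
- hosts: localhost
  connection: local
  gather_facts: false
  vars:
    role: Appname
    ansible_date_time: "{{ build_date_time }}"
  roles:
    - finish-building-ami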
