Ansible Vault password in a variable - ansible

Is there a way to access the vault password as a variable in an Ansible playbook? I am looking for something like this:
---
- debug: var=ansible_vault_password

I ended up solving this by copying the local vault password file to the server. The task to do that looks like this:
- name: setup ansible vault password file
  copy:
    src: /path/to/local/vault_pass
    dest: /root/.vault_pass
    mode: 0600
    owner: root
    group: root
And then the root user will execute the ansible-pull command.
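For reference, a minimal sketch of the matching ansible-pull invocation run as root (the repository URL and playbook name are placeholders):
ansible-pull --url https://git.example.com/config.git --vault-password-file /root/.vault_pass local.yml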

Try saving the password in a separate file and using vars_files to include it. Example:
In password.yml:
ansible_vault_password: redhat
In playbook.yml:
---
- hosts: xyz
  vars_files:
    - password.yml
  tasks:
    - debug:
        var: ansible_vault_password
Try this and please let me know.
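Note that password.yml should itself be encrypted with Ansible Vault, so the playbook run still needs a vault password to decrypt it, for example:
ansible-vault encrypt password.yml
ansible-playbook playbook.yml --ask-vault-pass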

Related

Is it possible to parse a template file in an Ansible role with the role itself as the destination?

I'm trying to parse a template file with Ansible, but I don't want the file to be created on any of my remote hosts; I just want to create it in my role_path.
I have the following in my role.
---
- name: Create configuration.yml on ansible role
  ansible.builtin.template:
    src: configuration.j2
    dest: "{{ role_path | default('') }}{{ stack_name | mandatory }}/configuration.yml"
  vars:
    stack_env: "dev"
    app_network: "my_network"

- name: Run tasks/main.yml from compose role
  ansible.builtin.include_role:
    name: compose
  vars:
    stack_name: "logging"
    stack_path: "{{ ansible_base_path }}/"
When I run it, my pipeline says that the directory doesn't exist, which is correct: the directory exists on my control node, not on the remote host.
I basically want to render this template file into my role, to be used by another role dependency.
Does anyone know if this is possible?
I found the solution myself: it's possible to make use of local_action.
This is what my final playbook looks like:
- name: Create configuration.yml parsing variables
  local_action:
    module: template
    src: configuration.j2
    dest: "{{ role_path }}/logging/configuration.yml"

- name: Run tasks/main.yml from compose role
  ansible.builtin.include_role:
    name: compose
  vars:
    stack_name: "logging"
    stack_path: "{{ ansible_base_path }}/"
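As an aside, local_action is the older shorthand; the same task can also be written with delegate_to, which many style guides prefer. A sketch of the equivalent form:
- name: Create configuration.yml parsing variables
  ansible.builtin.template:
    src: configuration.j2
    dest: "{{ role_path }}/logging/configuration.yml"
  delegate_to: localhost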

Switching user for delegation to host outside of inventory with Ansible/awx

I am trying to do the following using Ansible 2.8.4 and awx:
Read some facts from Cisco IOS devices (works)
Put results into a local file using a template (works)
Copy/Move the resulting file to a different server
Since I have to use a different user to access IOS devices and servers, and the servers in question aren't part of the inventory used for the playbook, I am trying to achieve this using become_user and delegate_to.
The initial user (defined in the awx template) is allowed to connect to the IOS devices, while different_user can connect to servers using a ssh private key.
The playbook:
---
- name: Read Switch Infos
  hosts: all
  gather_facts: no
  tasks:
    - name: Gather IOS Facts
      ios_facts:

    - debug: var=ansible_net_version

    - name: Set Facts IOS
      set_fact:
        ios_version: "{{ ansible_net_version }}"

    - name: Create Output file
      file: path=/tmp/test state=directory mode=0755
      delegate_to: 127.0.0.1
      run_once: true

    - name: Run Template
      template:
        src: ios_firmware_check.j2
        dest: /tmp/test/output.txt
      delegate_to: 127.0.0.1
      run_once: true

    - name: Set up keys
      become: yes
      become_method: su
      become_user: different_user
      authorized_key:
        user: different_user
        state: present
        key: "{{ lookup('file', '/home/different_user/.ssh/key_file') }}"
      delegate_to: 127.0.0.1
      run_once: true

    - name: Copy to remote server
      remote_user: different_user
      copy:
        src: /tmp/test/output.txt
        dest: /tmp/test/output.txt
      delegate_to: remote.server.fqdn
      run_once: true
When run, the playbook fails in the Set up keys task trying to access the home directory with the ssh key:
TASK [Set up keys] *************************************************************
task path: /tmp/awx_2206_mz90qvh9/project/IOS/ios_version.yml:23
[WARNING]: Unable to find '/home/different_user/.ssh/key_file' in expected paths
(use -vvvvv to see paths)
File lookup using None as file
fatal: [host]: FAILED! => {
"msg": "An unhandled exception occurred while running the lookup plugin 'file'. Error was a <class 'ansible.errors.AnsibleError'>, original message: could not locate file in lookup: /home/different_user/.ssh/key_file"
}
I'm assuming my mistake is somehow related to which user is trying to access the /home/ directory on which device.
Is there a better/more elegant/working way of connecting to a different server using an ssh key to move around files?
I know one possibility would be to just scp using the shell module, but that always feels a bit hacky.
(Sort of) solved using encrypted variables in hostvars with Ansible Vault.
How to get there:
Encrypting the passwords:
This needs to be done from any command line with Ansible installed; for some reason it can't be done in Tower/awx:
ansible-vault encrypt_string "password"
You'll be prompted for a password to encrypt/decrypt.
If you're doing this for Cisco devices, you'll want to encrypt both the ssh and the enable password using this method.
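The encrypt_string command also accepts a --name option, which prints the output as a ready-to-paste variable definition (the variable name below is just an example):
ansible-vault encrypt_string "password" --name "ansible_ssh_pass"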
Add encrypted passwords to inventory
For testing, I put them in host vars for a single switch; it should be fine to put them into group vars and use them for multiple switches as well.
ansible_ssh_pass should be the password to access the switch; ansible_become_pass is the enable password.
---
all:
  children:
    Cisco:
      children:
        switches:
    switches:
      hosts:
        HOSTNAME:
          ansible_host: ip-address
          ansible_user: username
          ansible_ssh_pass: !vault |
            $ANSIBLE_VAULT;1.1;AES256
            [encrypted string]
          ansible_connection: network_cli
          ansible_network_os: ios
          ansible_become: yes
          ansible_become_method: enable
          ansible_become_pass: !vault |
            $ANSIBLE_VAULT;1.1;AES256
            [encrypted string]
Adding the vault password to tower/awx
Add a new credential with credential type "Vault" and the password you used earlier to encrypt the strings.
Now, all you need to do is add the credential to your job template (the template can have one "normal" credential (machine, network, etc.) and multiple vaults).
The playbook then automagically accesses the vault credential to decrypt the strings in the inventory.
Playbook to get Switch Infos and drop template file on a server
The playbook now looks something like the one below and does the following:
Gather Facts on all Switches in Inventory
Write all facts into a .csv using a template, save the file on the ansible host
Copy said file to a different server using a different user
The job template is configured with the user that is able to access the server; the user that accesses the switches with a password is stored in the inventory, as shown above.
---
- name: Read Switch Infos
  hosts: all
  gather_facts: no
  tasks:
    - name: Create Output file
      file: path=/output/directory state=directory mode=0755
      delegate_to: 127.0.0.1
      run_once: true

    - debug:
        var: network

    - name: Gather IOS Facts
      remote_user: username
      ios_facts:

    - debug: var=ansible_net_version

    - name: Set Facts IOS
      set_fact:
        ios_version: "{{ ansible_net_version }}"

    - name: Run Template
      template:
        src: ios_firmware_check.csv.j2
        dest: /output/directory/filename.csv
      delegate_to: 127.0.0.1
      run_once: true

    - name: Create Destination folder on remote server outside inventory
      remote_user: different_username
      file: path=/destination/directory state=directory mode=0755
      delegate_to: remote.server.fqdn
      run_once: true

    - name: Copy to remote server outside inventory
      remote_user: different_username
      copy:
        src: /output/directory/filename.csv
        dest: /destination/directory/filename.csv
      delegate_to: remote.server.fqdn
      run_once: true

Ansible: connect via SSH after certain tasks have populated passwords

In Ansible I need to execute a set of tasks that obtain passwords from a third party (this part is handled) and then use those SSH credentials to connect.
It seems the best way to loop through my inventory while doing this is to include a list of tasks, which is fine. The major problem is that I can only get this to work if I set hosts in my main.yml playbook to localhost (or set it to the name of the server group and specify connection: local), which makes the command module execute locally and defeats the purpose.
I have tried looking into the ssh connection plugin, but it doesn't seem to register, giving me a no_action detected. I am aware I am likely overlooking something glaring.
I will post something closer to the exact code later, but what I have now is:
main.yml
---
- hosts: localhost
  tasks:
    - name: subplay
      include: secondary.yml
      vars:
        user: myUser
        address: "{{ hostvars[item].address }}"
      with_items: hostvars['mygroup']
secondary.yml
---
- name: fetch password
  [...fetchMyPassword, it works]
  register: password

- name:
  [...need to connect with the fetched user for this task and this task only...]
  command: /my/local/usr/task.sh
I want to connect and execute the script there, but no matter what I try it either fails to execute at all or executes locally.
Additionally, I might note I checked out https://docs.ansible.com/ansible/latest/plugins/connection/paramiko_ssh.html and
https://docs.ansible.com/ansible/latest/plugins/connection/ssh.html but must be doing something wrong
It looks to me like only your fetch task needs to be delegated to localhost; the rest should run on mygroup. Once you have all your connection info, set up the connection with set_fact by assigning values to ansible_user, ansible_ssh_pass, and ansible_host. Try this:
main.yml
---
- hosts: mygroup # inventory_hostname will loop through all your hosts in mygroup
  tasks:
    - name: subplay
      include: secondary.yml
      vars:
        user: myUser
        address: "{{ hostvars[inventory_hostname].address }}"
secondary.yml
---
- name: fetch password
  [...fetchMyPassword, it works]
  delegate_to: localhost # this task is only run on localhost
  register: password

- set_fact: # use the registered password and vars to set up the connection
    ansible_user: "{{ user }}"
    ansible_ssh_pass: "{{ password }}" # a registered result is a dict, so you may need a field such as password.stdout
    ansible_host: "{{ address }}"

- name: Launch task # this task is run on each host of mygroup
  [...need to connect with the fetched user for this task and this task only...]
  command: /my/local/usr/task.sh
Launch this with:
ansible-playbook main.yml
Then try to turn your secondary.yml into a role, and your main.yml into a playbook that calls it.
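A sketch of what that layout could look like (role and file names are illustrative):
site.yml                  # the play from main.yml, with roles: [fetch_and_run]
roles/
  fetch_and_run/
    tasks/
      main.yml            # the tasks from secondary.yml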

roles override tasks in playbook

I have an Ansible playbook which looks similar to the code below:
---
- hosts: localhost
  connection: local
  tasks:
    - name: "Create custom fact directory"
      file:
        path: "/etc/ansible/facts.d"
        state: "directory"
    - name: "Insert custom fact file"
      copy:
        src: custom_fact.fact
        dest: /etc/ansible/facts.d/custom_fact.fact
        mode: 0755
  roles:
    - role1
    - role2
When I run the playbook with the ansible-playbook command, only the roles run, but the tasks do not.
If I comment out the roles in the playbook, the tasks run.
How can I make the tasks run before the roles?
Put the tasks in a pre_tasks section, which runs before roles.
You may also find post_tasks useful, which runs tasks after roles.
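Applied to the playbook above, that looks like this:
---
- hosts: localhost
  connection: local
  pre_tasks:
    - name: "Create custom fact directory"
      file:
        path: "/etc/ansible/facts.d"
        state: "directory"
    - name: "Insert custom fact file"
      copy:
        src: custom_fact.fact
        dest: /etc/ansible/facts.d/custom_fact.fact
        mode: 0755
  roles:
    - role1
    - role2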
Correct the indentation:
- hosts: localhost
  connection: local
  tasks:
    - name: "Create custom fact directory"
      file:
        path: ...

How to create users per environment using the Ansible inventory and the htpasswd module

I'm a newbie in Ansible. I wrote an Ansible role that creates a user and password in /etc/httpd/.htpasswd like this:
- name: htpasswd
  htpasswd:
    path: /etc/httpd/.htpasswd
    name: dev
    password: dev
    group: apache
    mode: 0640
  become: true
Now I'm trying to understand how I can set user and password placeholder variables per environment for this task using the inventory (or any other way). For example, if I ran ansible-playbook -i inventories/dev, the role could be set up like this:
- name: htpasswd
  htpasswd:
    path: /etc/httpd/.htpasswd
    name: "{{ inventory.htpasswd.name }}"
    password: "{{ inventory.htpasswd.password }}"
    group: apache
    mode: 0640
  become: true
And in the inventory folder, each environment would have an htpasswd file with name and password content like this:
name: dev
password: dev
Does Ansible have something like that? Or can someone explain the best practices to me?
By default, Ansible assigns each host to the all group. With the following structure you can define group vars per inventory:
inventories/dev/hosts
inventories/dev/group_vars/all.yml
inventories/staging/hosts
inventories/staging/group_vars/all.yml
In inventories/dev/group_vars/all.yml:
name: dev
password: dev
In inventories/staging/group_vars/all.yml:
name: staging
password: staging
And then in your tasks, reference the vars with their names:
- name: htpasswd
  htpasswd:
    path: /etc/httpd/.htpasswd
    name: "{{ name }}"
    password: "{{ password }}"
    group: apache
    mode: 0640
  become: true
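With that structure in place, the environment is selected purely by the inventory you pass on the command line (the playbook name site.yml is an assumption here):
ansible-playbook -i inventories/dev site.yml      # name=dev, password=dev
ansible-playbook -i inventories/staging site.yml  # name=staging, password=staging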
