Inline encrypted variable not JSON serializable - ansible

I'm trying to understand how to encrypt single variables with vault. First I encrypt the string with ansible-vault encrypt_string -n -p, then I write the output into my playbook. When I execute the playbook it says that the decrypted string isn't JSON serializable.
Encrypted string: "inline_name"
I also tried it with inline_name and inlinename, every time with the same result.
My playbook:
---
- name: Build System
  hosts: dev
  tasks:
    - name: Create
      mysql_db:
        state: present
        name: !vault |
          $ANSIBLE_VAULT;1.1;AES256
          39613261386438623937643062636166663638633062323939343734306334346537613233623064
          3761633832326365356231633338396132646532313861350a316666376566616633376238313636
          39343833306462323534623238333639663734626662623731666239366566643636386261643164
          3861363730336331660a316165633232323732633364346636363764623639356562336536636136
          6364
        login_host: "{{ mysql_host }}"
        login_user: "{{ mysql_user }}"
        login_password: "{{ mysql_pass }}"
    - name: Check if can access plain text vars
      debug:
        msg: "{{ my_plain_txt }}"
Error message:
An exception occurred during task execution. To see the full traceback, use -vvv.
The error was: TypeError: u'"inline_name"' is not JSON serializable
fatal: [127.0.0.1]: FAILED! => {"failed": true, "msg": "Unexpected failure during module execution.", "stdout": ""}

Add task-level variable:
- name: Create
  mysql_db:
    state: present
    name: "{{ mysql_name }}"
    login_host: "{{ mysql_host }}"
    login_user: "{{ mysql_user }}"
    login_password: "{{ mysql_pass }}"
  vars:
    mysql_name: !vault |
      $ANSIBLE_VAULT;1.1;AES256
      39613261386438623937643062636166663638633062323939343734306334346537613233623064
      3761633832326365356231633338396132646532313861350a316666376566616633376238313636
      39343833306462323534623238333639663734626662623731666239366566643636386261643164
      3861363730336331660a316165633232323732633364346636363764623639356562336536636136
      6364
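To confirm that the vaulted value decrypts cleanly, a temporary debug task helps (a sketch; it assumes mysql_name is defined somewhere the debug task can see it, e.g. at play level, and it prints the secret, so remove it after verifying). If the output shows the value wrapped in an extra pair of double quotes, the quotes were encrypted as part of the string, which would explain the u'"inline_name"' in the traceback.
- name: Temporarily print the decrypted value (remove afterwards, it exposes the secret)
  debug:
    msg: "{{ mysql_name }}"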

Double quotes could explain this error, but that wasn't it in my case. Look at the entire error/warning to see what is attempting to parse JSON. In my case:
[WARNING]: Failure using method (v2_runner_on_ok) in callback plugin
(): u'secret_value' is not JSON serializable
An older AWX callback plugin called json.load and logged a warning along with secrets in plain text. It needed an upgrade.
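If stray quotes picked up from the shell are the cause, re-encrypting the value without them avoids the problem. A minimal sketch (assuming a vault password file at ~/.vault_pass and the variable name mysql_name) that reads the plaintext from stdin so no shell quoting is involved:
# Type the plaintext without quotes, then end input with Ctrl-D
ansible-vault encrypt_string --vault-password-file ~/.vault_pass --stdin-name mysql_name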

I have implemented the same approach for sending email with the mail module, and it works as expected.
ansible-vault encrypt_string yourgmailapppassword --name gmail_password
Use the command above to encrypt the Gmail app password with the ansible-vault encrypt_string option, then define the encrypted variable in the playbook.
cat fetch-users-deatils.yml
- name: Linux servers user audit report preparation
  hosts: "{{ HOSTS }}"
  roles:
    - user-collections

- name: Refreshing user Dashboard & sending email from localhost
  hosts: localhost
  become: false
  vars:
    - gmail_password: !vault |
        $ANSIBLE_VAULT;1.1;AES256
        62613232383962323430633831113465356231563163366235353034393230656331663436646233
        3266353862303738303737383530313664356135336661390a336562613436626665333833323030
        61393135643433313930643337363465343332353716333831222766376137396430426361663633
        6233313433633231320a663435636230636431643731333166366435346564316331323361633566
        38622138392437888466666535323432653034323936353961646233613437343831
  tasks:
    - name: Collecting the user details information and recreating the users dashboard
      script: dashboard_user.sh
      tags: user_dashboard
    - name: User Audit data output file stored on below location
      debug:
        msg: /tmp/user_collection/user_details.csv
    - name: 'Sending Ansible users report email'
      mail:
        host: smtp.gmail.com
        subtype: html
        port: 587
        password: "{{ gmail_password }}"
        to: abcdefghijkl#gmail.com
        from: abcdefghijkl#gmail.com
        username: abcdefghijkl#gmail.com
        subject: User details report
        attach: /tmp/user_collection/user_details.csv
        body: <pre> {{ lookup('file', '/tmp/user_collection/user_details.csv') }} </pre>
      delegate_to: localhost
Below is the playbook execution command:
ansible-playbook fetch-users-deatils.yml -e "HOSTS=all" --ask-vault-pass
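If prompting is not desired (for example in cron or CI), the same playbook can read the vault password from a file instead. A sketch, assuming the password is stored in ~/.vault_pass with restrictive permissions:
chmod 600 ~/.vault_pass
ansible-playbook fetch-users-deatils.yml -e "HOSTS=all" --vault-password-file ~/.vault_pass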

Ansible & Juniper Junos - Unable to make a PyEZ connection: ConnectError()

I am trying to use juniper_junos_facts from the Ansible Junos module to query some VM's that I provisioned using Vagrant. However I am getting the following error.
fatal: [r1]: FAILED! => {"changed": false, "msg": "Unable to make a PyEZ connection: ConnectUnknownHostError(r1)"}
fatal: [r2]: FAILED! => {"changed": false, "msg": "Unable to make a PyEZ connection: ConnectUnknownHostError(r2)"}
I see in the following document Here on juniper.net that this error occurs when you don't have the host defined correctly in the inventory file. I don't believe this to be an issue with my inventory file, because when I run ansible-inventory --host everything appears to be in order:
~/vagrant-projects/junos$ ansible-inventory --host r1
{
    "ansible_ssh_host": "127.0.0.1",
    "ansible_ssh_port": 2222,
    "ansible_ssh_private_key_file": ".vagrant/machines/r1/virtualbox/private_key",
    "ansible_ssh_user": "root"
}
~/vagrant-projects/junos$ ansible-inventory --host r2
{
    "ansible_ssh_host": "127.0.0.1",
    "ansible_ssh_port": 2200,
    "ansible_ssh_private_key_file": ".vagrant/machines/r2/virtualbox/private_key",
    "ansible_ssh_user": "root"
}
My playbook is copied from the following document which I got from Here on juniper.net.
My Inventory File
[vsrx]
r1 ansible_ssh_host=127.0.0.1 ansible_ssh_port=2222 ansible_ssh_private_key_file=.vagrant/machines/r1/virtualbox/private_key
r2 ansible_ssh_host=127.0.0.1 ansible_ssh_port=2200 ansible_ssh_private_key_file=.vagrant/machines/r2/virtualbox/private_key
[vsrx:vars]
ansible_ssh_user=root
My Playbook
---
- name: show version
  hosts: vsrx
  roles:
    - Juniper.junos
  connection: local
  gather_facts: no
  tasks:
    - name: retrieve facts
      juniper_junos_facts:
        host: "{{ inventory_hostname }}"
        savedir: "{{ playbook_dir }}"
    - name: print version
      debug:
        var: junos.version
As you're using connection: local you need to give the module full connection details (usually packaged in a provider dictionary at the play level to reduce repetition):
- name: retrieve facts
  juniper_junos_facts:
    host: "{{ ansible_ssh_host }}"
    port: "{{ ansible_ssh_port }}"
    user: "{{ ansible_ssh_user }}"
    passwd: "{{ ansible_ssh_pass }}"
    ssh_private_key_file: "{{ ansible_ssh_private_key_file }}"
    savedir: "{{ playbook_dir }}"
Full docs are here (watch out for the correct role version in the URL): https://junos-ansible-modules.readthedocs.io/en/2.1.0/juniper_junos_facts.html where you can also see what the defaults are.
To fully explain the "provider" method, your playbook should look something like this:
---
- name: show version
  hosts: vsrx
  roles:
    - Juniper.junos
  connection: local
  gather_facts: no
  vars:
    connection_info:
      host: "{{ ansible_ssh_host }}"
      port: "{{ ansible_ssh_port }}"
      user: "{{ ansible_ssh_user }}"
      passwd: "{{ ansible_ssh_pass }}"
      ssh_private_key_file: "{{ ansible_ssh_private_key_file }}"
  tasks:
    - name: retrieve facts
      juniper_junos_facts:
        provider: "{{ connection_info }}"
        savedir: "{{ playbook_dir }}"
    - name: print version
      debug:
        var: junos.version
This answer is for people who find this question by the error message.
If you use a connection plugin other than local, the error can be, and usually is, caused by this bug related to variable ordering.
The bug is already fixed in release 2.2.1 and later, so try updating the role from Galaxy.
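For example, the role can be refreshed from Galaxy like this (a sketch; pin a specific version if you need one):
# Reinstall/upgrade the Juniper.junos role from Ansible Galaxy
ansible-galaxy install Juniper.junos --force
# or pin a known-good release, e.g.
ansible-galaxy install Juniper.junos,2.2.1 --force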

How to use the lookup plugin to get the directory path and file in Ansible

I have a two playbooks where one creates SSH Keys and the other one creates a new user and deploys the public ssh key for the new user created.
My issue is I created a task that create a new directory with a timestamp to store the relevant data, I was able to get the path to a variable where I added it as a dummy host so that I can be able to call that path with all my plays but it seems like I am unable to use the same variable in lookup so that I can be able to deploy the ssh key. Kindly assist, below are the relevant tasks.
# Create the directory with timestamp
- name: Create Directory with timestamp to store data that was run multiple times that day
  when: inventory_hostname in groups['local']
  file:
    path: "{{ store_files_path }}/{{ ansible_date_time.date }}/{{ ansible_date_time.time }}"
    state: directory
    mode: "0755"
  register: dir_path

# Add the directory path to dummy host called save so that I can call it from other plays
- name: Add dir path:"{{ dir_path.path }}" as a 'save' host
  when: inventory_hostname in groups['local']
  add_host:
    name: "save"
    dir: "{{ dir_path.path }}"

# Deploying SSH Key I tried this -->
- name: Deploy Public Key to the server
  when: inventory_hostname in groups['Servers']
  authorized_key:
    user: "{{ hostvars['new-user']['user'] }}"
    state: present
    key: "{{ dir_path.path }}/SSH-Key.pub"

# ...this -->
- name: Deploy Public Key to the server
  when: inventory_hostname in groups['Servers']
  authorized_key:
    user: "{{ hostvars['new-user']['user'] }}"
    state: present
    key: "{{ lookup('file', '{{ dir_path.path }}/SSH-Key.pub') }}"

# .... and this -->
- name: Deploy Public Key to the server
  when: inventory_hostname in groups['Servers']
  authorized_key:
    user: "{{ hostvars['new-user']['user'] }}"
    state: present
    key: "{{ lookup('file', '{{ hostvars['save']['dir'] }}/SSH-Key.pub') }}"
None of them worked, what am I doing wrong?
If you put a Jinja expression into a string inside another Jinja expression, then you indeed end up with your variable not being interpreted.
A basic example of this is:
- hosts: all
  gather_facts: no
  tasks:
    - debug:
        msg: "{{ '{{ foo }}' }}"
      vars:
        foo: bar
Which gives:
ok: [localhost] => {
    "msg": "{{ foo }}"
}
When
- hosts: all
  gather_facts: no
  tasks:
    - debug:
        msg: "{{ foo }}"
      vars:
        foo: bar
gives the expected:
ok: [localhost] => {
    "msg": "bar"
}
So, to achieve what you want here, use Jinja's concatenation operator ~ to let Jinja interpret your variable and concatenate it with the rest of your "hardcoded" string.
Effectively ending with the instruction:
key: "{{ lookup('file', hostvars['save']['dir'] ~ '/SSH-Key.pub') }}"
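Putting that into the last task from the question, the deploy step would look something like this (same task otherwise, only the key lookup changes):
- name: Deploy Public Key to the server
  when: inventory_hostname in groups['Servers']
  authorized_key:
    user: "{{ hostvars['new-user']['user'] }}"
    state: present
    key: "{{ lookup('file', hostvars['save']['dir'] ~ '/SSH-Key.pub') }}"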

Making prompted vars usable across the playbook

I have one playbook with multiple tasks that have to run on different hosts.
In the beginning of the play I want to prompt the operator for their credentials which are the same for every host in the play. I want to have those credentials "stored" somewhere so they can be used across the tasks to log in on the provided host(s).
The playbook looks as follows:
---
- name: Ask for credentials
  vars_prompt:
    - name: username
      prompt: "Username?"
    - name: password
      prompt: "Password?"
  tasks:
    - set_fact:
        username: "{{ username }}"
    - set_fact:
        password: "{{ password }}"

- hosts: Host1
  vars:
    ansible_user: "{{ username }}"
    ansible_password: "{{ password }}"
  tasks:
    - name: Do stuff

- hosts: Host2
  vars:
    ansible_user: "{{ username }}"
    ansible_password: "{{ password }}"
  tasks:
    - name: Do stuff
...
From the moment the play hits the first task, it fails with the following error:
msg: 'The field ''remote_user'' has an invalid value, which includes an undefined variable. The error was: ''username'' is undefined'
Anyone that has experience in making prompted vars usable across the whole play and all tasks?
Q: "Make prompted vars usable across the whole play and all tasks?"
A: Run the first play against the group of all hosts that will be connected to later, and run the set_fact task with run_once. This creates the variables username and password for all hosts in the group.
For example, if the group test_jails comprises the hosts test_01, test_02, and test_03, the play
- hosts: test_jails
  vars_prompt:
    - name: "username"
      prompt: "Username?"
    - name: "password"
      prompt: "Password?"
  tasks:
    - set_fact:
        username: "{{ username }}"
        password: "{{ password }}"
      run_once: true

- hosts: test_01
  vars:
    ansible_user: "{{ username }}"
    ansible_password: "{{ password }}"
  tasks:
    - debug:
        msg: "{{ ansible_user }} {{ ansible_password }}"
gives
ok: [test_01] => {
    "msg": "admin 1234"
}
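A variation of the same idea is to set the connection variables themselves in the first play, so the later plays don't need a vars block at all. A sketch, not tested against every Ansible version:
- hosts: test_jails
  vars_prompt:
    - name: "username"
      prompt: "Username?"
    - name: "password"
      prompt: "Password?"
  tasks:
    # Setting ansible_user/ansible_password as facts makes them available
    # to subsequent plays targeting the same hosts
    - set_fact:
        ansible_user: "{{ username }}"
        ansible_password: "{{ password }}"
      run_once: true

- hosts: test_01
  tasks:
    - debug:
        msg: "{{ ansible_user }}"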

How add hosts from user's input in a ansible playbook?

I need to add a host from the user's input. Now I'm trying to use the ansible in-memory inventory, add_host module and prompt to add the target host to execute the remaining tasks. This is the content of my playbook:
Deploy.yml
- name: Adding the host server
  hosts: localhost
  - vars_prompt:
    - name: "Server IP"
      prompt: "Server"
      private: no
    - name: "Username (default: Ubuntu)"
      prompt: "User"
      default: "Ubuntu"
      private: no
    - name: "Password"
      prompt: "Passwd"
      private: yes
      encrypt: "sha512_crypt"
    - name: "Identity file path"
      prompt: "IdFile"
      private: no
      when: Passwd is undefined
  tasks:
    - name: Add host server
      add_host:
        name: "{{ Server }}"
        ansible_ssh_user: "{{ User }}"
        ansible_ssh_private_key_file: "{{ IdFile }}"
      when: IdFile is defined
    - name: Add host server
      add_host:
        name: "{{ Server }}"
        ansible_ssh_user: "{{ User }}"
        ansible_ssh_pass: "{{ Passwd }}"
      when: Passwd is defined

- hosts: "{{ Server }}"
  tasks:
    - name: Copy the script file to the server
      copy:
        src: script.sh
        dest: "{{ ansible_env.HOME }}/folder/"
        mode: 755
        force: yes
        attr:
          - +x
When I run this playbook with the command $ ansible-playbook Deploy.yml, the output is:
[WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all'
PLAY [Adding the host server] ***********************************************************************************************
TASK [Gathering Facts] ******************************************************************************************************
ok: [localhost]
Server: <server-ip>
User [Ubuntu]:
Passwd:
IdFile: <path/to/id/file>
ERROR! the field 'hosts' is required but was not set
I don't know why it throws this error:
ERROR! the field 'hosts' is required but was not set
How can I do what I need to do?
UPDATE:
It is still not working. This is the content of my playbook:
Deploy.yml
- name: Adding the host server
  hosts: localhost
  vars_prompt:
    - name: "Server"
      prompt: "Server IP"
      private: no
    - name: "User"
      prompt: "Username"
      default: "Ubuntu"
      private: no
    - name: "Passwd"
      prompt: "Password"
      private: yes
      encrypt: "sha512_crypt"
    - name: "IdFile"
      prompt: "Identity file path"
      private: no
      when: Passwd is undefined
  tasks:
    - name: Add host server
      add_host:
        name: "{{ Server }}"
        ansible_ssh_user: "{{ User }}"
        ansible_ssh_private_key_file: "{{ IdFile }}"
      when: IdFile is defined
    - name: Add host server
      add_host:
        name: "{{ Server }}"
        ansible_ssh_user: "{{ User }}"
        ansible_ssh_pass: "{{ Passwd }}"
      when: IdFile is undefined

- hosts: "{{ Server }}"
  tasks:
    - name: Copy the script file to the server
      copy:
        src: script.sh
        dest: "{{ ansible_env.HOME }}/folder/"
        mode: 755
        force: yes
        attr:
          - +x
When I run this playbook with the command $ ansible-playbook Deploy.yml, the output is:
[WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all'
PLAY [Adding the host server] ***********************************************************************************************
TASK [Gathering Facts] ******************************************************************************************************
ok: [localhost]
Server IP: <server-ip>
Username [Ubuntu]:
Password:
Identity file path: <path/to/id/file>
ERROR! the field 'hosts' is required but was not set
I don't know why it throws this error:
ERROR! The field 'hosts' has an invalid value, which includes an undefined variable. The error was: 'Server' is undefined
Here is a flowchart of how the playbook should work:
+------------------+     +---------------+     +-----------------+
|Use ansible to run|     |Get host IP fom|     |Get ssh User from|
|  this playbook   +---->+ user's input  +---->+  user's input   |
+------------------+     +---------------+     +--------+--------+
                                                         |
                                                         v
                                              +----------+----------+
                                              |Get ssh password from|
                                              |    user's input     |
                                              +----------+----------+
                                                         |
                                                         v
                      +---------------+      *************************
                      |Add a host with|  Yes | Did the user inputted |
     v----------------+   password    +<-----+      a password?      |
+----------------+    +---------------+      ************+************
||Run some tasks||                                        |No
||in recently   ||                                        v
||added host    ||    +---------------+      +-----------+---------+
+----------------+    |Add a host with|      |Get ssh identity file|
     ^----------------+ identity file +<-----+  from user's input  |
                      +---------------+      +---------------------+
Ok I've updated my answer to suit the changes in your question, with the original answer left for historic reasons.
To solve the substitution error you are seeing, which results in an empty host list in your second play, I would instead use an inventory group.
There are also two other syntax errors in the second play:
The file mode needs to be octal (i.e. 0700).
The attribute is invalid. My assumption is you are trying to make the file executable, so fix the file mode and remove the attribute.
Here is an updated playbook:
- name: Adding the host server
  hosts: localhost
  vars_prompt:
    - name: "Server"
      prompt: "Server IP"
      private: no
    - name: "User"
      prompt: "Username"
      default: "Ubuntu"
      private: no
    - name: "Passwd"
      prompt: "Password"
      private: yes
      encrypt: "sha512_crypt"
    - name: "IdFile"
      prompt: "Identity file path"
      private: no
      when: Passwd is undefined
  tasks:
    - name: Add host server
      add_host:
        name: "{{ Server }}"
        ansible_ssh_user: "{{ User }}"
        ansible_ssh_private_key_file: "{{ IdFile }}"
        group: added_hosts
      when: IdFile is defined
    - name: Add host server
      add_host:
        name: "{{ Server }}"
        ansible_ssh_user: "{{ User }}"
        ansible_ssh_pass: "{{ Passwd }}"
        group: added_hosts
      when: IdFile is undefined

- hosts: added_hosts
  tasks:
    - name: Copy the script file to the server
      copy:
        src: script.sh
        dest: "{{ ansible_env.HOME }}/folder/"
        mode: 0755
        force: yes
=== OLD ANSWER ===
User input is stored in whatever variable you are using for the name attribute in each of the variable prompts.
You need to switch around your name and prompt values under vars_prompt
There are also YAML formatting issues
For example:
- vars_prompt:
    - name: "Server IP"
      prompt: "Server"
      private: no
should be:
vars_prompt:
  - name: "server"
    prompt: "Server IP"
    private: no
Then you can refer to the {{ server }} variable in your tasks
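With the corrected prompts in place, the add_host task would reference those lowercase variables, roughly like this (a sketch assuming prompt names server, user, and idfile):
- name: Add host server
  add_host:
    name: "{{ server }}"
    ansible_ssh_user: "{{ user }}"
    ansible_ssh_private_key_file: "{{ idfile }}"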
Your Ansible script has a problem with this line:
vars_prompt:
Remove the - in front of vars_prompt and it will work properly.
I tried the same script on my local server and it works properly.
- name: Adding the host server
  hosts: localhost
  vars_prompt:
    - name: "Server"
      prompt: "Server IP"
      private: no
    - name: "User"
      prompt: "Username"
      default: "Ubuntu"
      private: no
    - name: "Passwd"
      prompt: "Password"
      private: yes
      encrypt: "sha512_crypt"
    - name: "IdFile"
      prompt: "Identity file path"
      private: no
      when: Passwd is undefined
  tasks:
    - name: Add host server
      add_host:
        name: "{{ Server }}"
        ansible_ssh_user: "{{ User }}"
        ansible_ssh_private_key_file: "{{ IdFile }}"
      when: IdFile is defined
    - name: Add host server
      add_host:
        name: "{{ Server }}"
        ansible_ssh_user: "{{ User }}"
        ansible_ssh_pass: "{{ Passwd }}"
      when: Passwd is defined
    - name: Create a file
      shell: touch newfile
      delegate_to: "{{ Server }}"
Update the last task as follows and run it:
- name: Create a file
  shell: touch newfile
  delegate_to: "{{ Server }}"
