Ansible - Reference 'ansible_ssh_pass' in a task?

I understand that the ansible_ssh_pass (and similarly ansible_become_pass) variables are settable via inventories. E.g.:
[some_group:vars]
ansible_ssh_pass=some_password
But is the same referenceable from a task without explicitly setting it in an inventory? E.g. if I simply provide the password with --ask-pass?
The use case would be to mount a CIFS share with an authorized account (which would simply be a user's SSH account as we have Active Directory in our environment). I've tried using the documented variables, e.g.:
- name: Mount a drive
  sudo: true
  mount: state="mounted" fstype="cifs" opts="username={{ ansible_ssh_user }},password={{ ansible_ssh_pass }}" src=...
But this results in an error:
fatal: [some.machine] => One or more undefined variables: 'ansible_ssh_pass' is undefined

Regarding your question
But is the same referencable from a task without explicitly setting it in an inventory? E.g. if I simply provide the password with --ask-pass?
the short answer is yes. A test remote playbook
---
- hosts: test
  become: yes
  gather_facts: no
  tasks:
    - name: Show variables
      debug:
        msg:
          - "Provided user: {{ ansible_user }}"
          - "Provided password: {{ ansible_password }}"
called via
ansible-playbook --user ${ADMIN_USER} --ask-pass remote.yml
results in the output
TASK [Show variables] ***********
ok: [test.example.com] =>
  msg:
  - 'Provided user: admin_user'
  - 'Provided password: 12345678'
simply echoing back the password provided at the prompt.
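Applied to the original CIFS use case, a minimal sketch could look like the task below. The share path and mount point are placeholders, and it assumes a recent Ansible where the connection variables are named ansible_user and ansible_password (the older ansible_ssh_user / ansible_ssh_pass names from the question map to these):
- name: Mount a drive
  become: true
  mount:
    state: mounted
    fstype: cifs
    src: //fileserver.example.com/share    # placeholder share
    path: /mnt/share                       # placeholder mount point
    opts: "username={{ ansible_user }},password={{ ansible_password }}"
Note that the mount module also writes the entry, including these options, to /etc/fstab on the target.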

Related

Passing hostname to ansible playbook through extravars

I have to pass the host on which the Ansible command will be executed through extra vars.
I don't know in advance which hosts the tasks will be applied to, and therefore my inventory file currently has no hosts defined.
If I understood the article "How to pass extra variables to an Ansible playbook" correctly, overwriting hosts is only possible with already composed groups of hosts.
From the post Ansible issuing warning about localhost I gathered that referencing the hosts to be managed in an Ansible inventory is a must; however, I still have doubts, since the usage of extra vars was not mentioned in that question.
So my question is: what can I do to make this playbook work?
- hosts: "{{ host }}"
tasks:
- name: KLIST COMMAND
command: klist
register: klist_result
- name: TEST COMMAND
ansible.builtin.shell: echo hi > /tmp/test_result.txt
... referencing hosts to be managed in an Ansible inventory is a must
Yes, that's the case. Regarding your question
What can I do in order to make this playbook work? (annot. without a "valid" inventory file)
you could try the following workaround.
---
- hosts: localhost
  become: false
  gather_facts: false
  tasks:
    - add_host:
        hostname: "{{ target_hosts }}"
        group: dynamic

- hosts: dynamic
  become: true
  gather_facts: true
  tasks:
    - name: Show hostname
      shell:
        cmd: "hostname && who am i"
      register: result
    - name: Show result
      debug:
        var: result
A call with
ansible-playbook hosts.yml --extra-vars="target_hosts=test.example.com"
results in execution on
TASK [add_host] ***********
changed: [localhost]
PLAY [dynamic] ************
TASK [Show hostname] ******
changed: [test.example.com]
In any case it is recommended to check how to build your inventory.
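For completeness, if the machines can be listed ahead of time, a minimal static inventory removes the need for the workaround entirely (the group name app_servers is made up for this example):
[app_servers]
test.example.com
The playbook could then simply target hosts: app_servers.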
Further Documentation
add_host module – Add a host (and alternatively a group) to the ansible-playbook in-memory inventory

Run Ansible playbook task with predefined username and password

This is the code of my Ansible playbook:
---
- hosts: "{{ host }}"
  remote_user: "{{ user }}"
  ansible_become_pass: "{{ pass }}"
  tasks:
    - name: Creates directory to keep files on the server
      file: path=/home/{{ user }}/fabric_shell state=directory
    - name: Move sh file to remote
      copy:
        src: /home/pankaj/my_ansible_scripts/normal_script/installation/install.sh
        dest: /home/{{ user }}/fabric_shell/install.sh
    - name: Execute the script
      command: sh /home/{{ user }}/fabric_shell/install.sh
      become: yes
I am running the Ansible playbook using the command:
ansible-playbook send_run_shell.yml --extra-vars "user=sakshi host=192.168.0.238 pass=Welcome01"
But I don't know why I am getting the error:
ERROR! 'ansible_become_pass' is not a valid attribute for a Play
The error appears to have been in '/home/pankaj/go/src/shell_code/send_run_shell.yml': line 2, column 3, but may
be elsewhere in the file depending on the exact syntax problem.
The offending line appears to be:
---
- hosts: "{{ host }}"
^ here
We could be wrong, but this one looks like it might be an issue with
missing quotes. Always quote template expression brackets when they
start a value. For instance:
with_items:
  - {{ foo }}
Should be written as:
with_items:
  - "{{ foo }}"
Please guide me on what I am doing wrong.
Thanks in advance.
ansible_become_pass is a connection parameter which you can set as a variable:
---
- hosts: "{{ host }}"
  remote_user: "{{ user }}"
  vars:
    ansible_become_pass: "{{ pass }}"
  tasks:
    # ...
That said, you can move remote_user to variables too (refer to the whole list of connection parameters), save it to a separate host_vars or group_vars file, and encrypt it with Ansible Vault.
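As a sketch, a group_vars file could hold the connection settings; the file name is illustrative and the values are taken from the question:
# group_vars/all.yml (illustrative file name)
ansible_user: sakshi
ansible_become_pass: Welcome01
Encrypting it is then a single step: ansible-vault encrypt group_vars/all.yml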
Take a look at this thread and the Ansible documentation page. I propose using become_user in this way:
- hosts: all
  tasks:
    - include_tasks: task/java_tomcat_install.yml
      when: activity == 'Install'
      become: yes
      become_user: "{{ aplication_user }}"
Try not to pass pass=Welcome01 on the command line:
When speaking with remote machines, Ansible by default assumes you are using SSH keys. SSH keys are encouraged but password authentication can also be used where needed by supplying the option --ask-pass. If using sudo features and when sudo requires a password, also supply --ask-become-pass (previously --ask-sudo-pass which has been deprecated).
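Following that advice, the call from the question could drop the password from --extra-vars and prompt for it instead, for example:
ansible-playbook send_run_shell.yml --extra-vars "user=sakshi host=192.168.0.238" --ask-pass --ask-become-pass
(with the ansible_become_pass line removed from the playbook, since --ask-become-pass supplies that value).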

In ansible variable for hosts from vars_prompt no longer accepted [duplicate]

I want to write a bootstrapper playbook for new machines in Ansible which will reconfigure the network settings. At the time of the first execution, target machines will have a DHCP-assigned address.
The user who is supposed to execute the playbook knows the assigned IP address of a new machine. I would like to prompt the user for this value.
vars_prompt allows getting input from the user; however, it is defined within the play, alongside the hosts section, which effectively prevents using the prompted value as the host address.
Is it possible without using a wrapper script modifying inventory file?
The right way to do this is to create a dynamic host with add_host and place it in a new group, then start a new play that targets that group. That way, if you have other connection vars that need to be set ahead of time (credentials/keys/etc) you could set them on an empty group in inventory, then add the host to it dynamically. For example:
- hosts: localhost
  gather_facts: no
  vars_prompt:
    - name: target_host
      prompt: please enter the target host IP
      private: no
  tasks:
    - add_host:
        name: "{{ target_host }}"
        groups: dynamically_created_hosts

- hosts: dynamically_created_hosts
  tasks:
    - debug: msg="do things on target host here"
You could pass it with extra-vars instead.
Simply make your hosts section a variable such as {{ hosts_prompt }} and then pass the host on the command line like so:
ansible-playbook -i inventory/environment playbook.yml --extra-vars "hosts_prompt=192.168.1.10"
Or if you are using the default inventory file location of /etc/ansible/hosts you could simply use:
ansible-playbook playbook.yml --extra-vars "hosts_prompt=192.168.1.10"
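The playbook itself then only needs the templated hosts line; a minimal sketch based on this answer:
- hosts: "{{ hosts_prompt }}"
  tasks:
    - debug: msg="do things on target host here"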
Adding to Matt's answer for multiple hosts.
An example input would be 192.0.2.10,192.0.2.11
- hosts: localhost
  gather_facts: no
  vars_prompt:
    - name: target_host
      prompt: please enter the target host IP
      private: no
  tasks:
    - add_host:
        name: "{{ item }}"
        groups: dynamically_created_hosts
      with_items: "{{ target_host.split(',') }}"

- hosts: dynamically_created_hosts
  tasks:
    - debug: msg="do things on target host here"
Disclaimer: The accepted answer offers the best solution to the problem. While this one is working it is based on a hack and I leave it as a reference.
I found out it was possible to use a currently undocumented hack (credit to Bruce P for pointing me to the post) that turns the value of the -i / --inventory parameter into an ad hoc list of hosts (reference). With just the hostname/IP address and a trailing comma (like below) it refers to a single host without the need for an inventory file to exist.
Command:
ansible-playbook -i "192.168.1.21," playbook.yml
And accordingly playbook.yml can be run against all hosts (which in the above example will be equal to a single host 192.168.1.21):
- hosts: all
The list might contain more than one IP address: -i "192.168.1.21,192.168.1.22"
Adding to Jacob's and Matt's examples, with the inclusion of a username and password prompt:
---
- hosts: localhost
  pre_tasks:
    - name: verify_ansible_version
      assert:
        that: "ansible_version.full is version_compare('2.10.7', '>=')"
        msg: "Error: You must update Ansible to at least version 2.10.7 to run this playbook..."
  vars_prompt:
    - name: target_hosts
      prompt: |
        Enter Target Host IP[s] or Hostname[s] (comma separated)
        (example: 1.1.1.1,myhost.example.com)
      private: false
    - name: username
      prompt: Enter Target Host[s] Login Username
      private: false
    - name: password
      prompt: Enter Target Host[s] Login Password
      private: true
  tasks:
    - add_host:
        name: "{{ item }}"
        groups: host_groups
      with_items:
        - "{{ target_hosts.split(',') }}"
    - add_host:
        name: login
        username: "{{ username }}"
        password: "{{ password }}"

- hosts: host_groups
  remote_user: "{{ hostvars['login']['username'] }}"
  vars:
    ansible_password: "{{ hostvars['login']['password'] }}"
    ansible_become: yes
    ansible_become_method: sudo
    ansible_become_pass: "{{ hostvars['login']['password'] }}"
  roles:
    - my_role

Ansible become_user with variable

I am using Ansible 2.1.0.0
I try to use become_user with a variable in a task, but I receive the following message:
fatal: [host]: FAILED! => {"failed": true, "msg": "'ansible_user' is undefined"}
The task executing this is
- name: Config git user name
  git_config: name=user.name scope=global value={{ ansible_host }}
  become: Yes
  become_user: "{{ansible_user}}"
And the playbook has the following line to define the remote user:
- name: Foo
  hosts: foo
  vars:
    http_port: 80
  remote_user: admin
I've seen this response which seems to be the same problem, but this does not work for me.
I have also seen a set_fact solution, but I would like to use the remote_user var if possible, so that no extra lines have to be added when a playbook already has remote_user set.
Does anyone know how to do this or what I am doing wrong?
What about this:
- name: Foo
  hosts: foo
  vars:
    http_port: 80
    my_user: admin
  remote_user: "{{my_user}}"
then:
- name: Config git user name
  git_config: name=user.name scope=global value={{ ansible_host }}
  become: Yes
  become_user: "{{my_user}}"
I think I found it:
become_user: "{{ansible_ssh_user}}"
In fact, remote_user: admin is another way of defining the variable ansible_ssh_user. I don't know why remote_user is not accessible as a variable, but what I do know is that when you set remote_user, it changes the variable ansible_ssh_user.
Not sure if it's a clean solution, but it works.
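If a play has to run on both older and newer Ansible releases, where the variable is named ansible_ssh_user or ansible_user respectively, one possible sketch is to fall back from one name to the other; this only works when at least one of the two variables is actually defined for the connection:
- name: Config git user name
  git_config: name=user.name scope=global value={{ ansible_host }}
  become: yes
  become_user: "{{ ansible_user | default(ansible_ssh_user) }}"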
I had a similar problem trying to use {{ ansible_ssh_user }}
fatal: [xxx]: FAILED! => {"msg": "The field 'become_user' has an
invalid value, which includes an undefined variable. The error was:
'ansible_user' is undefined"}
I fixed this error using this approach:
- name: Backups - Start backups service
  shell:
    cmd: systemctl --user enable backups.service && systemctl --user restart backups.service
    executable: /bin/bash
  become: true
  become_method: sudo
  become_user: "{{ lookup('env','USER') }}"
I hope this helps.

Ansible: variable interpolation in task name

I cannot get this seemingly simple example to work in Ansible 1.8.3. The variable interpolation does not kick in for the task name. All examples I have seen seem to suggest this should work. Given that the variable is defined in the vars section, I expected the task name to print the value of the variable. Why doesn't this work?
Even the example from the Ansible documentation seems not to print the variable value.
---
- hosts: 127.0.0.1
  gather_facts: no
  vars:
    vhost: "foo"
  tasks:
    - name: create a virtual host file for {{ vhost }}
      debug: msg="{{ vhost }}"
This results in the following output:
PLAY [127.0.0.1]
**************************************************************
TASK: [create a virtual host file for {{ vhost }}]
****************************
ok: [127.0.0.1] => {
"msg": "foo"
}
PLAY RECAP
********************************************************************
127.0.0.1 : ok=1 changed=0 unreachable=0 failed=0
Update
This works with 1.7.2 but does not work with 1.8.3. So either this is a bug or a feature.
Variables are not resolved inside the name. Only inside the actual tasks/conditions etc. will the placeholders be resolved. I guess this is by design. Imagine you have a with_items loop and use {{ item }} in the name. The task's name will only be printed once, but {{ item }} would change in every iteration.
I see the examples, even the one in the docs you linked to, use variables in the name. But that doesn't mean the result will be what you expect. The docs are community managed; it might be that someone put that line there without testing it, or maybe it used to work like that in a previous version of Ansible and the docs have not been updated since (I've only been using Ansible for about a year). But even though it doesn't work the way we wish it would, I still use variables in my names, just to indicate that the task is based on dynamic parameters. Maybe the examples were written with the same intention.
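To make the with_items point concrete, here is a minimal sketch (the items are made up for illustration): the task name is rendered once, so {{ item }} stays literal in the header, while the message resolves on every iteration.
- name: process {{ item }}    # printed once, with the placeholder unresolved
  debug:
    msg: "processing {{ item }}"
  with_items:
    - alpha
    - beta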
An interesting observation I recently made (Ansible 1.9.4) is that default values are written out in the task name.
- name: create a virtual host file for {{ vhost | default("foo") }}
When executed, Ansible would show the task title as:
TASK: [create a virtual host file for foo]
This way you can avoid ugly task names in the output.
Explanation
Whether the variable gets interpolated depends on where it has been declared.
Imagine you have two hosts: A and B.
If the variable foo has only per-host values, Ansible cannot decide which value to use when it runs the play.
On the other hand, if it has a global value (global in the sense of being host-invariant), there is no confusion about which value to use.
Source: https://github.com/ansible/ansible/issues/3103#issuecomment-18835432
Hands on playbook
ansible_user is an inventory variable
greeting is an invariant variable
- name: Test variable substitution in names
  hosts: localhost
  connection: local
  vars:
    greeting: Hello
  tasks:
    - name: Sorry {{ ansible_user }}
      debug:
        msg: this won't work
    - name: You say '{{ greeting }}'
      debug:
        var: ansible_user
I experienced the same problem today in one of my Ansible roles and I noticed something interesting.
When I use the set_fact module before I use the vars in the task name, they actually get translated to their correct values.
In this example I wanted to set the password for a remote user:
Notice that I use the vars test_user and user_password that I set as facts before.
- name: Prepare to set user password
  set_fact:
    user_password: "{{ linux_pass }}"
    user_salt: "s0m3s4lt"
    test_user: "{{ ansible_user }}"

- name: "Changing password for user {{ test_user }} to {{ user_password }}"
  user:
    name: "{{ ansible_user }}"
    password: "{{ user_password | password_hash('sha512', user_salt) }}"
    state: present
    shell: /bin/bash
    update_password: always
This gives me the following output:
TASK [install : Changing password for user linux to LiNuXuSeRPaSs#]
So this solved my problem.
It might be ugly, but you can somewhat work around it with something like this:
- name: create a virtual host file
  debug:
    msg: "Some command result"
  loop: "{{ [ vhost ] }}"
or
- name: create a virtual host file
  debug:
    msg: "Some command result"
  loop_control:
    label: "{{ vhost }}"
  loop: [1]
I wouldn't do this in general, but it shows how you can use items or label to give information outside of the command result. While it might not be pretty, it does get the variable value into the output.
Source: https://docs.ansible.com/ansible/latest/user_guide/playbooks_loops.html
