I need to access a few servers whose passwords are built from each server's hostname: if the hostname is fancyHost and the password prefix is "principio", then the sudoer user's (contoso-r in this example) password is "principiofancyHost". The goal is to assign a new password to each host.
The problem: I managed to connect, establish a custom password as a variable, and escalate to sudoer, but it seems ansible_become_pass is not meant to be manipulated inside a task, which makes it difficult to concatenate the prefix with the hostname.
- hosts: test
  vars:
    prefix: "principio"
    ansible_become_pass: "hardToGuessPassword123"
  tasks:
    - name: getting current server hostname
      command: hostname
      register: hostname

    - name: getting current user name
      command: whoami
      register: current_username

    - name: print current server hostname
      debug:
        msg: "Current user: {{ current_username.stdout }} ||| Hostname is: {{ hostname.stdout }} ||| Password should be: {{ prefix + hostname.stdout }}"

    - name: changing contoso-r's password
      become: true
      become_user: contoso-r
      become_method: su
      user:
        name: contoso-r
        password: "{{ (prefix + hostname.stdout) | password_hash('sha512') }}"
        state: present
        shell: /bin/bash
        system: no
        createhome: no
How could I set a custom password relative to each host instead of a global become_pass or ansible_become_pass?
Thanks for your time
You stated:
I managed to connect ... and escalate to sudoer, ...
It's not necessary to use become_user: contoso-r when changing contoso-r's password.
- name: changing contoso-r's password
  user:
    name: contoso-r
    password: "{{ (prefix + hostname.stdout) | password_hash('sha512') }}"
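Since the password only depends on the hostname, the intermediate command/register steps could also be skipped by using the ansible_hostname fact instead. A sketch, assuming fact gathering is enabled (note that ansible_hostname holds the short hostname and may differ from the output of the hostname command on some systems):

```yaml
- name: changing contoso-r's password (using the hostname fact)
  user:
    name: contoso-r
    # ansible_hostname is populated by gather_facts; prefix comes from the play vars
    password: "{{ (prefix + ansible_hostname) | password_hash('sha512') }}"
```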
I'm new to Ansible and trying to create a new user with an encrypted password using ansible-vault. The target system is OpenBSD, and I'm using Ansible 2.10 on Ubuntu 20.04.
The "problem" is that once the playbook finishes, I get this message in the output:
"password": "NOT_LOGGING_PASSWORD"
and the password is not set/updated.
I first create and edit my vault file using ansible-vault.
Content of my vault file:
user_pass: pass
Here is my playbook:
- name: Add new user
  hosts: all
  vars_files:
    - "../vars/pass.yml"
  tasks:
    - name: Add regular user
      user:
        name: foo
        update_password: always
        password: "{{ vault_user_pass | password_hash('sha512') }}"
        create_home: yes
        shell: /bin/sh
        generate_ssh_key: yes
        ssh_key_type: rsa
        ssh_key_bits: 2048
        ssh_key_passphrase: ''
      become_user: root
Do you have any idea why the password is not set/updated? I printed the vault variable with the debug module to check whether it is readable, and it is. The user is created, but with another password. I also tried to hash the password using mkpasswd, with the same result.
If you need further information, don't hesitate :).
Thank you in advance.
The variable name is user_pass; even though the variable lives in a vault file, you don't need to add a vault_ prefix.
Try as below
- name: Add new user
  hosts: all
  vars_files:
    - "../vars/pass.yml"
  tasks:
    - name: Add regular user
      user:
        name: foo
        update_password: always
        password: "{{ user_pass | password_hash('sha512') }}"
        create_home: yes
        shell: /bin/sh
        generate_ssh_key: yes
        ssh_key_type: rsa
        ssh_key_bits: 2048
        ssh_key_passphrase: ''
      become_user: root
I am creating a playbook that logs in to a virtual machine and performs initial configuration. The image used to create the VM has a default user account that requires a password change on first login. How can I handle this in Ansible?
I found a solution to my problem using the Ansible "expect" module: https://docs.ansible.com/ansible/latest/collections/ansible/builtin/expect_module.html
- name: Change password on initial login
  delegate_to: 127.0.0.1
  become: no
  expect:
    command: ssh {{ ansible_ssh_common_args }} {{ user_expert }}@{{ inventory_hostname }}
    timeout: 20
    responses:
      Password: "{{ user_expert_password }}"
      UNIX password: "{{ user_expert_password }}"
      New password: "{{ user_expert_password_new }}"
      new password: "{{ user_expert_password_new }}"
      "\\~\\]\\$": exit
  register: status
You are looking for the user module, especially the password option.
Keep in mind that the password option needs the hash of the actual password; check here how to generate it. (It needs to be hashed, otherwise you would have a cleartext password in your playbook or inventory, which would be a security risk.)
Example:
- name: ensure password of default user is changed
  user:
    name: yourdefaultuser
    password: '$6$QkjC8ur2WfMfYGA$ZNUxTGoe5./F0b4GJGrcEA.ff9An473wmPsmU4xv00nSrN4D/Nxk8aKro/E/LlQVkUJLbLL6qk2/Lxw5Oxs2m.'
Note that the password hash was generated with mkpasswd --method=sha-512 for the password somerandompassword.
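Alternatively, the hash can be generated at run time with Ansible's password_hash filter, so only the cleartext has to be protected (for example in a vault). A sketch, where vaulted_password is an assumed variable name holding the cleartext; note that without a fixed salt the filter produces a different hash on every run, so the task reports "changed" each time:

```yaml
- name: ensure password of default user is changed (hash generated at run time)
  user:
    name: yourdefaultuser
    # vaulted_password is hypothetical here, e.g. defined in an ansible-vault file
    password: "{{ vaulted_password | password_hash('sha512') }}"
```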
I'm trying to automate hostname configuration for 10 machines using Ansible roles. When executing the playbook, I want it to wait for the hostname to be entered manually.
I tried the vars_prompt keyword to satisfy the requirement. In a single *.yml playbook I see the expected results:
Without a role, the input is read into the variable host and this works fine.
#host.yml
---
- hosts: ubuntu
  user: test
  sudo: yes
  vars_prompt:
    - name: host
      prompt: "Specify host name?"
      private: no
  tasks:
    - debug:
        msg: 'log as {{ host }}'
    - name: Changing hostname
      hostname:
        name: '{{ host }}'
With a role, this is not working (vars_prompt is not applied):
#role.yml
---
- hosts: ubuntu
  user: test
  sudo: yes
  roles:
    - hostname

#hostname/tasks/main.yml
- name: testing prompt
  vars_prompt:
    - name: host
      prompt: "Specify host name?"

- name: Ansible prompt example
  debug:
    msg: "{{ host }}"

#Changing hostname
- name: Changing hostname
  hostname:
    name: "{{ host }}"
Here I'm getting this error:
ERROR! no action detected in task. This often indicates a misspelled module name, or incorrect module path.
The error appears to have been in '/root/roles/hostname/tasks/main.yml': line 1, column 3, but may
be elsewhere in the file depending on the exact syntax problem.
The offending line appears to be:
- name: testing prompt
^ here
My expectation is to have some parameters entered manually as input during playbook execution. I need vars_prompt (or an equivalent) to work inside a role.
vars_prompt is a play-level keyword and cannot be used inside a role's tasks. Instead, run the 1st play with serial: 1 and enter the hostnames there. The 2nd play (in the same playbook) will use the facts.
- hosts: ubuntu
  serial: 1
  user: test
  sudo: yes
  tasks:
    - pause:
        prompt: "Specify hostname for {{ inventory_hostname }}?"
        echo: yes
      register: result
    - set_fact:
        host: "{{ result.user_input }}"

- hosts: ubuntu
  user: test
  sudo: yes
  roles:
    - hostname
I have some servers which I want to administer with Ansible. Currently I need to create user accounts on all of them; on some, the accounts are already present. I want to create the users with a default password, but if a user already exists, leave their password unchanged.
Can someone help me with this condition?
Here is my playbook:
---
- hosts: all
  become: yes
  vars:
    sudo_users:
      # password is generated through "mkpasswd" command from 'whois' package
      - login: user1
        password: hashed_password
      - login: user1
        password: hashed_password
  tasks:
    - name: Make sure we have a 'sudo' group
      group:
        name: sudo
        state: present

    - user:
        name: "{{ item.login }}"
        #password: "{{ item.password }}"
        shell: /bin/bash
        groups: "{{ item.login }},sudo"
        append: yes
      with_items: "{{ sudo_users }}"
From the docs of the user module:
update_password (added in 1.3): always / on_create
always will update passwords if they differ; on_create will only set the password for newly created users.
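Applied to the playbook above, that means re-enabling the password parameter and adding update_password: on_create, so existing users keep their current password. A sketch using the names from the question:

```yaml
- user:
    name: "{{ item.login }}"
    password: "{{ item.password }}"
    update_password: on_create  # only set the password when the user is first created
    shell: /bin/bash
    groups: "{{ item.login }},sudo"
    append: yes
  with_items: "{{ sudo_users }}"
```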
I want to write an Ansible playbook where we can provide a username and Ansible will display the authorized keys for that user. The path to the authorized keys is {{ user_home_dir }}/.ssh/authorized_keys.
I tried with the shell module like below:
---
- name: Get authorized_keys
  shell: cat "{{ user_home_dir }}"/.ssh/authorized_keys
  register: read_key

- name: Prints out authorized_key
  debug: var=read_key.stdout_lines
The problem is that it shows me the file /home/ansible/.ssh/authorized_keys; "ansible" is the user that I am using to connect to the remote machine.
Below is vars/main.yml
---
authorized_user: username
user_home_dir: "{{ lookup('env','HOME') }}"
Any idea? FYI I am new to ansible and tried this link already.
In your vars file, you have:
user_home_dir: "{{ lookup('env','HOME') }}"
Thanks to Konstantin for pointing it out: all lookups are executed on the control host, so the lookup of the HOME environment variable will always resolve to the home directory of the user who invoked ansible.
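A minimal task to see the difference, assuming fact gathering is enabled so that ansible_env is populated:

```yaml
- name: compare control-node and remote HOME
  debug:
    # lookup() runs on the control node; ansible_env.HOME comes from the remote host's facts
    msg: "control node HOME: {{ lookup('env', 'HOME') }} / remote HOME: {{ ansible_env.HOME }}"
```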
You could use Ansible's getent module to retrieve a user's info. The snippet below should help:
---
- hosts: localhost
  connection: local
  remote_user: myuser
  gather_facts: no
  vars:
    username: myuser
  tasks:
    - name: get user info
      getent:
        database: passwd
        key: "{{ username }}"
      register: info

    - shell: "echo {{ getent_passwd[username][4] }}"
Below is what worked. We also need become, otherwise we get a permission-denied error.
---
- hosts: local
  remote_user: ansible
  gather_facts: no
  become: yes
  become_method: sudo
  vars:
    username: myuser
  tasks:
    - name: get user info
      getent:
        split: ":"
        database: passwd
        key: "{{ username }}"

    - name: Get authorized_keys
      shell: cat "{{ getent_passwd[username][4] }}"/.ssh/authorized_keys
      register: read_key

    - name: Prints out authorized_key
      debug: var=read_key.stdout_lines