Ansible: get the --key-file value specified on the command line inside a task

I'm specifying the custom ssh key file with the following command:
ansible-playbook playbook.yml -i inventory.cfg --key-file ssh_keyfile
Now I want to add the same key to the host I'm managing with Ansible, using the following task:
- name: add ssh key
  authorized_key:
    user: "{{ user }}"
    state: present
    key: "{{ lookup('file', ssh_keyfile) }}"
How can I get the value of --key-file specified on the command line when running the playbook, as a variable inside the playbook?

The --key-file ssh_keyfile option is the path to a private key file, which is used to authenticate to the remote server.
The authorized_key module expects a public key, so you have to use a lookup of the corresponding public key file instead.
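As far as I know, the value of the --key-file flag itself is not exposed as a playbook variable, so one workaround is to pass the path a second time as an extra var and derive the public key from it. A rough sketch, assuming the public key sits next to the private key with the usual .pub suffix:
# invoked as, for example:
#   ansible-playbook playbook.yml -i inventory.cfg --key-file ssh_keyfile -e ssh_keyfile=ssh_keyfile
- name: add ssh key
  authorized_key:
    user: "{{ user }}"
    state: present
    # ssh_keyfile is the extra var passed above; appending '.pub' assumes
    # the OpenSSH convention of storing the public key next to the private one
    key: "{{ lookup('file', ssh_keyfile + '.pub') }}"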

How to add SSH fingerprints to known hosts in Ansible?

I have the following inventory file:
[all]
192.168.1.107
192.168.1.108
192.168.1.109
I want to add fingerprints for these hosts to known_hosts file on local machine.
I know that I can use ansible.builtin.known_hosts, but based on the docs:
Name parameter must match with "hostname" or "ip" present in key attribute.
it seems like I must already have the keys generated, and I must have three sets of keys, one set per host. I would like to have just one key for all my hosts.
Right now I can use this:
- name: accept new remote host ssh fingerprints at the local host
  shell: "ssh-keyscan -t 'ecdsa' {{ item }} >> {{ ssh_dir }}known_hosts"
  with_inventory_hostnames:
    - all
but the problem with this approach is that it is not idempotent: if I run it three times, it will add three similar lines to the known_hosts file.
Another solution would be to check the known_hosts file for the presence of a host IP and add it only if it is not present, but I could not figure out how to use variables in a when condition to check for more than one host.
So the question is: how can I add host fingerprints to the local known_hosts file, before generating a set of private/public keys, in an idempotent manner?
In my answer to "How to include all host keys from all hosts in group" I created a small Ansible lookup plugin, host_ssh_keys, to extract public SSH keys from the host inventory. Adding all hosts' public SSH keys to /etc/ssh/ssh_known_hosts is then as simple as this, thanks to Ansible's integration of loops with lookup plugins:
- name: Add public keys of all inventory hosts to known_hosts
  ansible.builtin.known_hosts:
    path: /etc/ssh/ssh_known_hosts
    name: "{{ item.host }}"
    key: "{{ item.known_hosts }}"
  with_host_ssh_keys: "{{ ansible_play_hosts }}"
For public SSH keys I use this one:
- hosts: localhost
  tasks:
    - set_fact:
        linuxkey: "{{ lookup('file', '~/.ssh/id_rsa.pub') }}"
      check_mode: no

- hosts: all
  tasks:
    - shell:
        cmd: "sudo su - {{ application_user }}"
        stdin: "[[ ! `grep \"{{ hostvars['localhost']['linuxkey'] }}\" ~/.ssh/authorized_keys` ]] && echo '{{ hostvars['localhost']['linuxkey'] }}' >> ~/.ssh/authorized_keys"
        warn: no
        executable: /bin/bash
      register: results
      failed_when: results.rc not in [0,1]
I think you can easily adapt it for the known_hosts file.
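For the known_hosts case, that adaptation could look roughly like this (a sketch only, reusing the ssh_dir variable and the ecdsa key type from the question; the grep guard keeps repeated runs from appending duplicate lines):
- hosts: localhost
  tasks:
    - name: add missing host fingerprints to the local known_hosts
      # only scan and append a host if it is not already listed
      shell: >
        grep -q "{{ item }}" {{ ssh_dir }}known_hosts
        || ssh-keyscan -t ecdsa {{ item }} >> {{ ssh_dir }}known_hosts
      with_inventory_hostnames:
        - all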

Run ansible-vault encrypt_string in an Ansible playbook

I have a job in Rundeck which requires users to pass a database password to Ansible, and Ansible will take it as an extra variable.
ansible-playbook test.yml -e "password=123"
However, we would like to vault the password at runtime. Ansible's best practice would require the password to be stored in a file
and the entire file vaulted using ansible-vault create.
Since we have a large number of passwords to pass in, and I noticed there is a command called encrypt_string, I tried calling it in a playbook to generate a vaulted password on the fly, but I'm getting the error below:
"ERROR! Only one --vault-id can be used for encryption. This includes
passwords from configuration and cli."
Here is my playbook test.yml:
---
- name: test
  hosts: localhost
  tasks:
    - name: vault var
      command: ansible-vault encrypt_string "{{ password }}" --vault-password-file ~/.vault_pass.txt
      register: var
    - name: variable
      set_fact:
        mypass: var
    - name: test encrypt_string
      debug:
        msg: "{{ mypass }}"
I'm not sure if this is the correct way to do it or best practice; if anyone can shed some light, it would be very much appreciated.
Thanks,
You may update your task by removing the --vault-password-file option, as Ansible seems to be getting/reading it from your environment in some way.
...
...
- name: test
  hosts: localhost
  tasks:
    - name: vault var
      command: ansible-vault encrypt_string "{{ password }}"
      register: var
...
...
If you prefer to keep this option in the playbook, you may need to find where Ansible is reading it from. Ansible may be reading it from its default config file, generally found at ~/.ansible.cfg (look for vault_password_file), from an alias, or from somewhere else.
You may find more details, with examples, in the Ansible Vault documentation.
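Once the duplicate password source is sorted out, the rest of the playbook can stay close to the original. A rough sketch, assuming the vault password is still picked up from ansible.cfg or the environment as described above (the --name option and no_log are optional suggestions, not requirements):
- name: vault var
  # --name wraps the output as a ready-to-paste 'db_password: !vault |' block
  command: ansible-vault encrypt_string "{{ password }}" --name 'db_password'
  register: var
  no_log: true   # keep the plaintext password out of the task output
- name: show the vaulted value
  debug:
    msg: "{{ var.stdout }}"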

Can Ansible deploy public SSH key asking password only once?

I wonder how to copy my SSH public key to many hosts using Ansible.
First attempt:
ansible all -i inventory -m local_action -a "ssh-copy-id {{ inventory_hostname }}" --ask-pass
But I got the error: The module local_action was not found in configured module paths.
Second attempt using a playbook:
- hosts: all
  become: no
  tasks:
    - local_action: command ssh-copy-id {{ inventory_hostname }}
Finally I have entered my password for each managed host:
ansible all -i inventory --list-hosts | while read h ; do ssh-copy-id "$h" ; done
How can I enter the password only once while deploying my public SSH key to many hosts?
EDIT: I have succeeded in copying my SSH public key to multiple remote hosts using the following playbook, based on Konstantin Suvorov's answer.
- hosts: all
  tasks:
    - authorized_key:
        key: "{{ lookup('file', '~/.ssh/id_rsa.pub') }}"
The user field should be mandatory according to the documentation, but it seems to work without it. Therefore the above generic playbook may be used for any user when run with this command line:
ansible-playbook -i inventory authorized_key.yml -u "$USER" -k
Why don't you use the authorized_key module?
- hosts: all
  tasks:
    - authorized_key:
        user: remote_user_name
        state: present
        key: "{{ lookup('file', '/local/path/.ssh/id_rsa.pub') }}"
and run the playbook with -u remote_user_name -k.

How to get IP address of host when passed as extra-vars in Ansible?

Say I am passing the hostnames to an Ansible playbook like so:
ansible-playbook ansible/db-playbook.yml --extra-vars "master=mydb-master, slave=mydb-slave"
In the playbook I want to access the actual IP address of the mydb-slave host:
- name: Copy ssh key to Slave
  command: ... "{{ mydb-slave }}"
In this case the output is the string literal mydb-slave, but I require the full IP address.
Make sure that the Python module dnspython is installed on the host you are running Ansible from; then the following should work (if the hosts given are DNS names):
- name: Lookup slave IP
  set_fact:
    slave_ip: "{{ lookup('dig', slave) }}"
- name: Copy ssh key to Slave
  command: ... "{{ slave_ip }}"
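If dnspython is not present on the control node, a task along these lines could install it first (a sketch; using the pip module and delegating to localhost is just one way to do it):
- name: ensure dnspython is available for the dig lookup
  pip:
    name: dnspython
  delegate_to: localhost
  run_once: true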
If the hosts are in your ansible inventory, the following should work:
- name: Copy ssh key to Slave
  command: ... "{{ hostvars[slave]['ansible_eth0']['ipv4']['address'] }}"
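Note that this only works if facts have been gathered for the slave host, and hard-coding eth0 ties it to one interface name. A variant using the default-interface fact might look like this (a sketch using a debug task just to show the value):
- name: show the slave IP without hard-coding the interface name
  debug:
    msg: "{{ hostvars[slave]['ansible_default_ipv4']['address'] }}"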

How to add a host to the known_hosts file with Ansible?

I want to add the SSH key for my private Git server to the known_hosts file with Ansible 1.9.3, but it doesn't work.
I have the following entry in my playbook:
- name: add SSH host key
  known_hosts:
    name: myhost.com
    key: "{{ lookup('file', 'host_key.pub') }}"
I have copied /etc/ssh/ssh_host_rsa_key.pub to host_key.pub and the file looks like:
ssh-rsa AAAAB3NzaC1... root@myhost.com
If I run my playbook I always get the following error message:
TASK: [add SSH host key]
******************************************************
failed: [default] => {"cmd": "/usr/bin/ssh-keygen -F myhost.com -f /tmp/tmpe5KNIW", "failed": true, "rc": 1}
What am I doing wrong?
You can directly use ssh-keyscan within the ansible task:
- name: Ensure servers are present in known_hosts file
  known_hosts:
    name: "{{ hostvars[item].ansible_host }}"
    state: present
    key: "{{ lookup('pipe', 'ssh-keyscan ' ~ hostvars[item].ansible_host) }}"
    hash_host: true
  with_items: "{{ groups.servers }}"
In the above snippet, we iterate over all hosts in the group "servers" defined in your inventory, run ssh-keyscan on them, read the result with the pipe lookup, and add it using known_hosts.
If you have only one host that you want to add, it's even simpler:
- name: Ensure server is present in known_hosts file
  known_hosts:
    name: "myhost.com"
    state: present
    key: "{{ lookup('pipe', 'ssh-keyscan myhost.com') }}"
    hash_host: true
Whether you need hash_host or not depends on your system.
Your copy of the remote host's public key needs a name, and that name needs to match what you specify for your known hosts.
In your case, prepend "myhost.com " to your host_key.pub key file as follows:
myhost.com ssh-rsa AAAAB3NzaC1... root@myhost.com
Reference:
Ansible known_hosts module, specifically the name parameter
Using ssh-keyscan to generate host_key.pub is another way:
ssh-keyscan myhost.com > host_key.pub
This command generates output in a format like this:
$ ssh-keyscan github.com > github.com.pub
# github.com SSH-2.0-libssh-0.7.0
# github.com SSH-2.0-libssh-0.7.0
$ cat github.com.pub
github.com ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEAq2A7hRGmdnm9tUDbO9IDSwBK6TbQa+PXYPCPy6rbTrTtw7PHkccKrpp0yVhp5HdEIcKr6pLlVDBfOLX9QUsyCOV0wzfjIJNlGEYsdlLJizHhbn2mUjvSAHQqZETYP81eFzLQNnPHt4EVVUh7VfDESU84KezmD5QlWpXLmvU31/yMf+Se8xhHTvKSCZIFImWwoG6mbUoWf9nzpIoaSjB+weqqUUmpaaasXVal72J+UX2B+2RPW3RcT0eOzQgqlJL3RKrTJvdsjE3JEAvGq3lGHSZXy28G3skua2SmVi/w4yCE6gbODqnTWlg7+wC604ydGXA8VJiS5ap43JXiUFFAaQ==
