I'm provisioning a new server, and want to automatically add its public key to my local known_hosts file. My server is running on port 2222.
hosts:
[remotes]
my_server ansible_host=42.42.42.42 ansible_port=2222
playbook.yml:
---
- hosts: all
  gather_facts: no
  tasks:
    - name: get host key
      local_action: command ssh-keyscan -t rsa -p {{ ansible_port }} -H {{ ansible_host }}
      register: host_key
    - name: add host key
      when: host_key is success
      delegate_to: localhost
      known_hosts:
        name: "{{ item }}"
        state: present
        hash_host: yes
        key: "{{ host_key.stdout }}"
      with_items:
        - "{{ ansible_host }}"
        - "{{ inventory_hostname }}"
This adds new entries to known_hosts.
BUT ssh -p 2222 42.42.42.42 and ssh -p 2222 my_server still show the unknown host key warning.
I suspect it's because 1) I'm running on a non-standard port (the docs for the known_hosts module don't show an option for setting the port), or 2) something to do with the hashing option.
How do I do this?
I found a solution buried in an old issue. The trick is to use [host]:port instead of host.
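To see why the bracketed form matters: when a server listens on a non-standard port, ssh records and looks up its key under [host]:port rather than the bare hostname. A minimal offline sketch of the entry format (fake key material, throwaway file):

```shell
# ssh stores keys for non-default ports as "[host]:port key-type key".
# Write a sample entry (the key here is a fake placeholder):
printf '[42.42.42.42]:2222 ssh-rsa AAAAB3FAKEKEY\n' > /tmp/known_hosts_demo

# A bare "42.42.42.42" entry would NOT match a connection on port 2222;
# only the bracketed form does.
grep -c '^\[42.42.42.42\]:2222 ' /tmp/known_hosts_demo   # prints 1
```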
---
- hosts: all
  gather_facts: no
  tasks:
    # add entry to known_hosts for server's IP address
    - name: get host key
      local_action: command ssh-keyscan -t rsa -p {{ ansible_port }} -H {{ ansible_host }}
      register: host_key
    - name: add host key
      when: host_key is success
      delegate_to: localhost
      known_hosts:
        name: "[{{ ansible_host }}]:{{ ansible_port }}" # <--- here
        state: present
        hash_host: yes
        key: "{{ host_key.stdout }}"
    # add entry to known_hosts for server's hostname
    - name: get host key
      local_action: command ssh-keyscan -t rsa -p {{ ansible_port }} -H {{ inventory_hostname }}
      register: host_key
    - name: add host key
      when: host_key is success
      delegate_to: localhost
      known_hosts:
        name: "[{{ inventory_hostname }}]:{{ ansible_port }}" # <--- here
        state: present
        hash_host: yes
        key: "{{ host_key.stdout }}"
I couldn't find a way to avoid the repetition, since with_items can't be applied across multiple tasks at once; it's ugly, but it works.
This allows both ssh -p 2222 42.42.42.42 and ssh -p 2222 my_server without prompts (though my_server must be defined in /etc/hosts and/or ~/.ssh/config).
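On the repetition: it could likely be collapsed into a single looped task by moving the keyscan into a pipe lookup, since ssh-keyscan with a non-default -p already prints names in [host]:port form. An untested sketch (-H is dropped from keyscan because hash_host hashes the entry on write):

```yaml
- name: add host keys for both IP and hostname
  delegate_to: localhost
  known_hosts:
    name: "[{{ item }}]:{{ ansible_port }}"
    state: present
    hash_host: yes
    key: "{{ lookup('pipe', 'ssh-keyscan -t rsa -p ' ~ ansible_port ~ ' ' ~ item) }}"
  loop:
    - "{{ ansible_host }}"
    - "{{ inventory_hostname }}"
```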
I have three hosts:
my local ansible controller
a jump/bastion host (jump_host) for my infrastructure
a target host I want to run ansible tasks against (target_host) which is only accessible through jump_host
As part of my inventory file, I have the details of both jump_host and target_host as follows:
jump_host:
  ansible_host: "{{ jump_host_ip }}"
  ansible_port: 22
  ansible_user: root
  ansible_password: password
target_host:
  ansible_host: "{{ target_host_ip }}"
  ansible_port: 22
  ansible_user: root
  ansible_password: password
  ansible_ssh_common_args: '-o ProxyCommand="ssh -W %h:%p -q root@{{ jump_host_ip }}"'
How can we configure ansible to use the password mentioned in the jump_host settings from the inventory file instead of using any additional configurations from ~/.ssh/config file?
There is no direct way to provide the password for the jump host as part of the ProxyCommand.
So, I ended up doing the following:
# Generate SSH keys on the controller
- hosts: localhost
  become: false
  tasks:
    - name: Generate the localhost ssh keys
      community.crypto.openssh_keypair:
        path: ~/.ssh/id_rsa
        force: no

# Copy the controller's public key into the jump_host .ssh/authorized_keys file
# to ensure that no password is prompted while logging into jump_host
- hosts: jump_host
  become: false
  tasks:
    - name: make sure public key exists on target for user
      ansible.posix.authorized_key:
        user: "{{ ansible_user }}"
        key: "{{ lookup('file', '~/.ssh/id_rsa.pub') }}"
        state: present
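With key-based login to the jump host in place, the target's proxy settings can pull the jump host's details straight from inventory instead of hard-coding them. A sketch (the hostvars names are taken from the inventory in the question):

```yaml
target_host:
  ansible_host: "{{ target_host_ip }}"
  ansible_port: 22
  ansible_user: root
  ansible_ssh_common_args: >-
    -o ProxyCommand="ssh -W %h:%p -q
    {{ hostvars['jump_host'].ansible_user }}@{{ hostvars['jump_host'].ansible_host }}"
```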
I have the following inventory file:
[all]
192.168.1.107
192.168.1.108
192.168.1.109
I want to add fingerprints for these hosts to known_hosts file on local machine.
I know that I can use ansible.builtin.known_hosts, but based on the docs:
    Name parameter must match with "hostname" or "ip" present in key attribute.
it seems like I must already have the keys generated, and I must have three sets of keys - one set per host. I would like to have just one key for all my hosts.
Right now I can use this:
- name: accept new remote host ssh fingerprints at the local host
  shell: "ssh-keyscan -t 'ecdsa' {{ item }} >> {{ ssh_dir }}known_hosts"
  with_inventory_hostnames:
    - all
but the problem with this approach is that it is not idempotent: if I run it three times, it will add three identical lines to the known_hosts file.
Another solution would be to check the known_hosts file for presence of a host ip and add it only if it is not present, but I could not figure out how to use variables in when condition to check for more than one host.
So the question is: how can I add host fingerprints to the local known_hosts file, before generating a set of private/public keys, in an idempotent manner?
In my answer to "How to include all host keys from all hosts in group" I created a small Ansible lookup plugin, host_ssh_keys, to extract public SSH keys from the host inventory. Thanks to Ansible's integration of loops with lookup plugins, adding all hosts' public SSH keys to /etc/ssh/ssh_known_hosts is then as simple as this:
- name: Add public keys of all inventory hosts to known_hosts
  ansible.builtin.known_hosts:
    path: /etc/ssh/ssh_known_hosts
    name: "{{ item.host }}"
    key: "{{ item.known_hosts }}"
  with_host_ssh_keys: "{{ ansible_play_hosts }}"
For public SSH keys I use this one:
- hosts: localhost
  tasks:
    - set_fact:
        linuxkey: "{{ lookup('file', '~/.ssh/id_rsa.pub') }}"
      check_mode: no

- hosts: all
  tasks:
    - shell:
        cmd: "sudo su - {{ application_user }}"
        stdin: "[[ ! `grep \"{{ hostvars['localhost']['linuxkey'] }}\" ~/.ssh/authorized_keys` ]] && echo '{{ hostvars['localhost']['linuxkey'] }}' >> ~/.ssh/authorized_keys"
        warn: no
        executable: /bin/bash
      register: results
      failed_when: results.rc not in [0, 1]
I think you can easily adapt it for the known_hosts file.
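The grep-guard idea above generalizes to a short idempotent append. A runnable sketch against a scratch file (placeholder key, throwaway path):

```shell
key='ssh-ed25519 AAAAPLACEHOLDER user@example'  # placeholder key line
f=/tmp/authorized_keys_demo
: > "$f"                                        # start from an empty file

# Append only when the exact line is absent (-x: whole line, -F: fixed string);
# running the same command twice still leaves a single copy.
grep -qxF "$key" "$f" || printf '%s\n' "$key" >> "$f"
grep -qxF "$key" "$f" || printf '%s\n' "$key" >> "$f"

wc -l < "$f"   # the file contains exactly one line
```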
I can ssh to a remote server if I use the ansible command module, e.g.:
tasks:
  - name: ssh to remote machine
    command: ssh -i key ansible@172.16.2.2
However, as this will be stored in GitHub, I encrypted the private ssh key with ansible-vault.
Once I rerun the same command with the vault decryption password (--ask-vault-pass), it will not connect. It's as if the encryption/decryption does not return the same ssh key.
What am I doing wrong here?
My legendary colleague found a solution if anyone else comes across the same issue.
Ansible SSH private key in source control?
You need to copy your encrypted ssh private key to another file first to decrypt it and then you can use it e.g.
- hosts: localhost
  gather_facts: false
  vars:
    source_key: key
    dest_key: key2
  tasks:
    - name: Install ssh key
      copy:
        src: "{{ source_key }}"
        dest: "{{ dest_key }}"
        mode: 0600
    - name: ssh to the remote server with the decrypted key
      command: ssh -i key2 ec2-user@1.1.1.1
When using an ssh proxy and the network_cli connection method, I've been getting an "Error reading SSH protocol banner" error in Ansible.
It seems like it may be the banner_timeout setting, which I can set if I write a netmiko script, but I don't think I can set that in a playbook.
This is what the playbook looks like:
- name: playbook
  hosts: target_dev_hostname
  gather_facts: no
  connection: network_cli
  vars:
    ansible_user: username
    ansible_ssh_pass: password
    ansible_become_pass: password
    ansible_network_os: ios
  tasks:
    - name: set mode for private key
      file:
        path: jump_host.pem
        mode: 0400
      delegate_to: localhost
    - name: config proxy
      set_fact:
        ansible_ssh_common_args: '-o ProxyCommand="ssh -o StrictHostKeyChecking=no -i /jump_host.pem -W %h:%p -q jump_user@jump_host.me.com"'
      delegate_to: localhost
    - name: basic show run
      ios_command:
        commands: show run
      register: running_config
    - name: show run results
      debug: var=running_config
Any suggestions? This issue seems to have been popping up in Ansible since about last year; I found this report:
https://github.com/ansible-collections/ansible.netcommon/issues/46
I should add that this same playbook works in Ansible 2.5.1.
I want to add the ssh key for my private git server to the known_hosts file with Ansible 1.9.3, but it doesn't work.
I have the following entry in my playbook:
- name: add SSH host key
  known_hosts: name='myhost.com'
               key="{{ lookup('file', 'host_key.pub') }}"
I have copied /etc/ssh/ssh_host_rsa_key.pub to host_key.pub and the file looks like:
ssh-rsa AAAAB3NzaC1... root@myhost.com
If I run my playbook I always get the following error message:
TASK: [add SSH host key]
******************************************************
failed: [default] => {"cmd": "/usr/bin/ssh-keygen -F myhost.com -f /tmp/tmpe5KNIW", "failed": true, "rc": 1}
What am I doing wrong?
You can directly use ssh-keyscan within the ansible task:
- name: Ensure servers are present in known_hosts file
  known_hosts:
    name: "{{ hostvars[item].ansible_host }}"
    state: present
    key: "{{ lookup('pipe', 'ssh-keyscan ' + hostvars[item].ansible_host) }}"
    hash_host: true
  with_items: "{{ groups.servers }}"
In the above snippet, we iterate over all hosts in the group "servers" defined in your inventory, run ssh-keyscan on them, read the result with the pipe lookup, and add it using known_hosts. (Note that templates cannot be nested, so the hostname is concatenated into the lookup string rather than wrapped in a second pair of braces.)
If you have only one host that you want to add, it's even simpler:
- name: Ensure server is present in known_hosts file
  known_hosts:
    name: myhost.com
    state: present
    key: "{{ lookup('pipe', 'ssh-keyscan myhost.com') }}"
    hash_host: true
Whether you need hash_host or not depends on your system.
Your copy of the remote host's public key needs a name, and that name needs to match what you specify for your known hosts.
In your case, prepend "myhost.com " to your host_key.pub key file as follows:
myhost.com ssh-rsa AAAAB3NzaC1... root@myhost.com
Reference:
Ansible known_hosts module, specifically the name parameter
Using ssh-keyscan to generate host_key.pub is another way:
ssh-keyscan myhost.com > host_key.pub
This command produces output in the following format:
$ ssh-keyscan github.com > github.com.pub
# github.com SSH-2.0-libssh-0.7.0
# github.com SSH-2.0-libssh-0.7.0
$ cat github.com.pub
github.com ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEAq2A7hRGmdnm9tUDbO9IDSwBK6TbQa+PXYPCPy6rbTrTtw7PHkccKrpp0yVhp5HdEIcKr6pLlVDBfOLX9QUsyCOV0wzfjIJNlGEYsdlLJizHhbn2mUjvSAHQqZETYP81eFzLQNnPHt4EVVUh7VfDESU84KezmD5QlWpXLmvU31/yMf+Se8xhHTvKSCZIFImWwoG6mbUoWf9nzpIoaSjB+weqqUUmpaaasXVal72J+UX2B+2RPW3RcT0eOzQgqlJL3RKrTJvdsjE3JEAvGq3lGHSZXy28G3skua2SmVi/w4yCE6gbODqnTWlg7+wC604ydGXA8VJiS5ap43JXiUFFAaQ==