I want to add the ssh key for my private git server to the known_hosts file with ansible 1.9.3 but it doesn't work.
I have the following entry in my playbook:
- name: add SSH host key
  known_hosts: name='myhost.com'
               key="{{ lookup('file', 'host_key.pub') }}"
I have copied /etc/ssh/ssh_host_rsa_key.pub to host_key.pub and the file looks like:
ssh-rsa AAAAB3NzaC1... root@myhost.com
If I run my playbook I always get the following error message:
TASK: [add SSH host key]
******************************************************
failed: [default] => {"cmd": "/usr/bin/ssh-keygen -F myhost.com -f /tmp/tmpe5KNIW", "failed": true, "rc": 1}
What am I doing wrong?
You can directly use ssh-keyscan within the ansible task:
- name: Ensure servers are present in known_hosts file
  known_hosts:
    name: "{{ hostvars[item].ansible_host }}"
    state: present
    key: "{{ lookup('pipe', 'ssh-keyscan ' ~ hostvars[item].ansible_host) }}"
    hash_host: true
  with_items: "{{ groups.servers }}"
In the above snippet we iterate over all hosts in the group "servers" defined in your inventory, run ssh-keyscan against them, read the result with the pipe lookup, and add it using known_hosts.
If you have only one host that you want to add, it's even simpler:
- name: Ensure server is present in known_hosts file
  known_hosts:
    name: "myhost.com"
    state: present
    key: "{{ lookup('pipe', 'ssh-keyscan myhost.com') }}"
    hash_host: true
Whether you need hash_host or not depends on your system.
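To check afterwards whether an entry made it into the file (hashed or not), you can query it with ssh-keygen; the per-user path here is an assumption:
# prints matching known_hosts entries for myhost.com, hashed or plain
ssh-keygen -F myhost.com -f ~/.ssh/known_hosts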
Your copy of the remote host's public key needs a name, and that name must match the name you pass to the known_hosts module.
In your case, prepend "myhost.com " to your host_key.pub key file as follows:
myhost.com ssh-rsa AAAAB3NzaC1... root@myhost.com
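A quick way to add that prefix (a shell sketch operating on the question's host_key.pub):
# prepend "myhost.com " to every line of the key file
sed -i 's/^/myhost.com /' host_key.pub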
Reference:
Ansible known_hosts module, specifically the name parameter
Another way is to use ssh-keyscan to generate host_key.pub:
ssh-keyscan myhost.com > host_key.pub
The command produces entries in the required hostname-prefixed format, for example:
$ ssh-keyscan github.com > github.com.pub
# github.com SSH-2.0-libssh-0.7.0
# github.com SSH-2.0-libssh-0.7.0
$ cat github.com.pub
github.com ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEAq2A7hRGmdnm9tUDbO9IDSwBK6TbQa+PXYPCPy6rbTrTtw7PHkccKrpp0yVhp5HdEIcKr6pLlVDBfOLX9QUsyCOV0wzfjIJNlGEYsdlLJizHhbn2mUjvSAHQqZETYP81eFzLQNnPHt4EVVUh7VfDESU84KezmD5QlWpXLmvU31/yMf+Se8xhHTvKSCZIFImWwoG6mbUoWf9nzpIoaSjB+weqqUUmpaaasXVal72J+UX2B+2RPW3RcT0eOzQgqlJL3RKrTJvdsjE3JEAvGq3lGHSZXy28G3skua2SmVi/w4yCE6gbODqnTWlg7+wC604ydGXA8VJiS5ap43JXiUFFAaQ==
I have the following inventory file:
[all]
192.168.1.107
192.168.1.108
192.168.1.109
I want to add fingerprints for these hosts to known_hosts file on local machine.
I know that I can use the ansible.builtin.known_hosts but based on the docs:
Name parameter must match with "hostname" or "ip" present in key attribute.
it seems like I must already have keys generated and I must have three sets of keys - one set per host. I would like to have just one key for all my hosts.
Right now I can use this:
- name: accept new remote host ssh fingerprints at the local host
  shell: "ssh-keyscan -t 'ecdsa' {{ item }} >> {{ ssh_dir }}known_hosts"
  with_inventory_hostnames:
    - all
but the problem with this approach is that it is not idempotent - if I run it three times it will add three similar lines in the known_hosts file.
Another solution would be to check the known_hosts file for the presence of a host IP and add it only if it is not present, but I could not figure out how to use variables in a when condition to check for more than one host.
So the question is how can I add hosts fingerprints to local known_hosts file before generating a set of private/public keys in idempotent manner?
Here in my answer to "How to include all host keys from all hosts in group" I created a small Ansible look-up module host_ssh_keys to extract public SSH keys from the host inventory. Adding all hosts' public ssh keys to /etc/ssh/ssh_known_hosts is then as simple as this, thanks to Ansible's integration of loops with look-up plugins:
- name: Add public keys of all inventory hosts to known_hosts
  ansible.builtin.known_hosts:
    path: /etc/ssh/ssh_known_hosts
    name: "{{ item.host }}"
    key: "{{ item.known_hosts }}"
  with_host_ssh_keys: "{{ ansible_play_hosts }}"
For public SSH keys I use this one:
- hosts: localhost
  tasks:
    - set_fact:
        linuxkey: "{{ lookup('file', '~/.ssh/id_rsa.pub') }}"
      check_mode: no

- hosts: all
  tasks:
    - shell:
        cmd: "sudo su - {{ application_user }}"
        stdin: "[[ ! `grep \"{{ hostvars['localhost']['linuxkey'] }}\" ~/.ssh/authorized_keys` ]] && echo '{{ hostvars['localhost']['linuxkey'] }}' >> ~/.ssh/authorized_keys"
        warn: no
        executable: /bin/bash
      register: results
      failed_when: results.rc not in [0,1]
I think you can easily adapt it for the known_hosts file.
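As one possible adaptation (a sketch, not from the original answer: hostkey_line is a hypothetical variable holding a full "hostname key-type key" entry, and application_user is reused from the snippet above), the same grep-then-append idea could target ~/.ssh/known_hosts:
- hosts: all
  tasks:
    # append the entry to the target user's known_hosts only when grep
    # cannot find it there yet (rc 1 from grep triggers the echo)
    - shell:
        cmd: "sudo su - {{ application_user }}"
        stdin: "grep -qF '{{ hostkey_line }}' ~/.ssh/known_hosts 2>/dev/null || echo '{{ hostkey_line }}' >> ~/.ssh/known_hosts"
        executable: /bin/bash
      register: results
      failed_when: results.rc not in [0, 1]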
I'm provisioning a new server, and want to automatically add its public key to my local known_hosts file. My server is running on port 2222.
hosts:
[remotes]
my_server ansible_host=42.42.42.42 ansible_port=2222
playbook.yml:
---
- hosts: all
  gather_facts: no
  tasks:
    - name: get host key
      local_action: command ssh-keyscan -t rsa -p {{ ansible_port }} -H {{ ansible_host }}
      register: host_key

    - name: add host key
      when: host_key is success
      delegate_to: localhost
      known_hosts:
        name: "{{ item }}"
        state: present
        hash_host: yes
        key: "{{ host_key.stdout }}"
      with_items:
        - "{{ ansible_host }}"
        - "{{ inventory_hostname }}"
This adds new entries to the known_hosts.
BUT ssh -p 2222 42.42.42.42 and ssh -p 2222 my_server still show the unknown key warning.
I suspect it's because 1) I'm running on a non-standard port (the docs for the known_hosts module don't show an option for setting the port), or 2) something to do with the hashing option.
How do I do this?
I found a solution buried in an old issue. The trick is to use [host]:port instead of host.
---
- hosts: all
  gather_facts: no
  tasks:
    # add entry to known_hosts for server's IP address
    - name: get host key
      local_action: command ssh-keyscan -t rsa -p {{ ansible_port }} -H {{ ansible_host }}
      register: host_key

    - name: add host key
      when: host_key is success
      delegate_to: localhost
      known_hosts:
        name: "[{{ ansible_host }}]:{{ ansible_port }}"   # <--- here
        state: present
        hash_host: yes
        key: "{{ host_key.stdout }}"

    # add entry to known_hosts for server's hostname
    - name: get host key
      local_action: command ssh-keyscan -t rsa -p {{ ansible_port }} -H {{ inventory_hostname }}
      register: host_key

    - name: add host key
      when: host_key is success
      delegate_to: localhost
      known_hosts:
        name: "[{{ inventory_hostname }}]:{{ ansible_port }}"   # <--- here
        state: present
        hash_host: yes
        key: "{{ host_key.stdout }}"
I couldn't find a way to avoid the repetition, because with_items can't be applied to multiple tasks at once, so it's ugly but it works.
This allows ssh -p 2222 42.42.42.42 and ssh -p 2222 my_server without prompts (though my_server must be defined in /etc/hosts and/or ~/.ssh/config).
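For what it's worth, the repetition could likely be folded into one task by moving the keyscan into a pipe lookup and looping over both names; this is an untested sketch using the same module options as above, not part of the original answer:
- name: add host key for both the IP address and the hostname
  delegate_to: localhost
  known_hosts:
    name: "[{{ item }}]:{{ ansible_port }}"
    state: present
    hash_host: yes
    key: "{{ lookup('pipe', 'ssh-keyscan -t rsa -p ' ~ ansible_port ~ ' -H ' ~ item) }}"
  with_items:
    - "{{ ansible_host }}"
    - "{{ inventory_hostname }}"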
Similar questions have been asked before, but none have been answered or are specific to Vagrant.
I have a directory on host master which I would like to synchronize with my vagrant instance. Here's my playbook:
- hosts: master
  vars:
    backup_dir: /var/backups/projects/civi.common.scot/backups/latest/
    dest_dir: /var/import
  tasks:
    - name: Synchronize directories
      synchronize:
        src: "{{ backup_dir }}"
        dest: "{{ dest_dir }}"
        mode: pull
      delegate_to: default
Here is my inventory:
default ansible_host=192.168.121.199 ansible_port=22 ansible_user='vagrant' ansible_ssh_private_key_file='/run/media/daniel/RAIDStore/Workspace/docker/newhume/.vagrant/machines/default/libvirt/private_key'
master ansible_host=hume.common.scot
When I run this play, the process does not seem to copy any files to disk, but does not error or quit either.
With ssh.config.forward_agent = true in my Vagrantfile, I am able to issue the following command from the Vagrant guest:
rsync --rsync-path='sudo rsync' -avz -e ssh $remote_user@$remote_host:$remote_path $local_path
However, the following playbook does not work (same problem as when using the synchronize module):
- name: synchronize directories (bugfix for above)
  command: "rsync --rsync-path='sudo rsync' -avz -e ssh {{ remote_user }}@{{ remote_host }}:{{ backup_directory }} {{ dest_dir }}"
I have also tried using shell instead of command.
How can I copy these files to my vagrant instance?
The "synchronize" Ansible module "is run and originates on the local host where Ansible is being run" (quote from the manpage), so it copies from local to remote. What you want to do is copy from remote A (master) to remote B (default).
To accomplish that you'd have to exchange ssh keys for a specific user from B to A, and add A's host key to B's known_hosts. The following should guide you through the process:
- hosts: default
  tasks:
    # transfer local pub-key to remote authorized_keys
    - name: fetch local ssh key from root user
      shell: cat /root/.ssh/id_rsa.pub
      register: ssh_keys
      changed_when: false

    - name: deploy ssh key to remote server
      authorized_key:
        user: "root"
        key: "{{ item }}"
      delegate_to: "master"
      with_items:
        - "{{ ssh_keys.stdout }}"

    # fetch remote host key and add to local known_hosts
    # to omit key accept prompt
    - name: fetch ssh rsa host key from remote server
      shell: cat /etc/ssh/ssh_host_rsa_key.pub
      register: ssh_host_rsa_key
      delegate_to: master
      changed_when: false

    - name: create /root/.ssh/ if not existent
      file:
        path: "/root/.ssh/"
        owner: root
        group: root
        mode: 0700
        state: directory

    - name: add hostkey to root known host file
      lineinfile:
        path: "/root/.ssh/known_hosts"
        line: "{{ master.fqdn }} {{ ssh_host_rsa_key.stdout }}"
        mode: 0600
        create: yes
        state: present
      with_items:
        - "{{ ssh_keys.stdout }}"

    # now call rsync to fetch from master
    - name: fetch from remote
      shell: rsync --rsync-path='sudo rsync' -avz -e ssh root@{{ master.fqdn }}:{{ backup_directory }} {{ dest_dir }}
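Note that the first task assumes root on the Vagrant guest already has a keypair under /root/.ssh/. If that is not guaranteed, one could be generated beforehand; this is a sketch using the user module's key-generation options (key size and file name are arbitrary choices):
- name: ensure root has an ssh keypair
  user:
    name: root
    generate_ssh_key: yes
    ssh_key_bits: 2048
    ssh_key_file: .ssh/id_rsa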
I wonder how to copy my SSH public key to many hosts using Ansible.
First attempt:
ansible all -i inventory -m local_action -a "ssh-copy-id {{ inventory_hostname }}" --ask-pass
But I get the error The module local_action was not found in configured module paths.
Second attempt using a playbook:
- hosts: all
  become: no
  tasks:
    - local_action: command ssh-copy-id {{ inventory_hostname }}
Finally I have entered my password for each managed host:
ansible all -i inventory --list-hosts | while read h ; do ssh-copy-id "$h" ; done
How can I enter the password only once while deploying my public SSH key to many hosts?
EDIT: I have succeeded in copying my SSH public key to multiple remote hosts using the following playbook, based on Konstantin Suvorov's answer.
- hosts: all
  tasks:
    - authorized_key:
        key: "{{ lookup('file', '~/.ssh/id_rsa.pub') }}"
The user field should be mandatory according to the documentation, but it seems to work without it. Therefore the above generic playbook may be used for any user when run with this command line:
ansible-playbook -i inventory authorized_key.yml -u "$USER" -k
Why don't you use the authorized_key module?
- hosts: all
  tasks:
    - authorized_key:
        user: remote_user_name
        state: present
        key: "{{ lookup('file', '/local/path/.ssh/id_rsa.pub') }}"
and run playbook with -u remote_user_name -k
I need to make sure that all my servers trust each other by default, so I am not prompted to trust the hosts when I set up ssh tunnels between my servers. How can I do that with Ansible?
All my servers are set up with host names which can be accessed via "{{ groups['all'] }}".
You've got a couple of options here.
You can ignore known_hosts entirely by adding
StrictHostKeyChecking no
to each server's /etc/ssh/ssh_config.
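If you want to roll that setting out with Ansible itself, a minimal sketch with lineinfile could look like this (it assumes appending a global option to /etc/ssh/ssh_config is acceptable on your systems):
- name: disable strict host key checking for outgoing ssh on every server
  lineinfile:
    path: /etc/ssh/ssh_config
    line: "StrictHostKeyChecking no"
    state: present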
Or you could use ssh-keyscan to add all of the hosts to each server's known_hosts with the equivalent of:
ssh-keyscan -H {host_name} >> path/to/known_hosts
This might look something like:
- name: Add all hosts in inventory to known_hosts
  shell: ssh-keyscan -H {{ item }} >> path/to/known_hosts
  with_items: "{{ groups['all'] }}"
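Note that the shell approach above appends another line on every run. If idempotence matters, the same loop can feed the known_hosts module instead; a sketch, restricted to RSA keys so the pipe lookup returns a single entry (by default the module writes to the remote user's ~/.ssh/known_hosts; set path: to override):
- name: Add all hosts in inventory to known_hosts (idempotent variant)
  known_hosts:
    name: "{{ item }}"
    key: "{{ lookup('pipe', 'ssh-keyscan -t rsa ' ~ item) }}"
    state: present
  with_items: "{{ groups['all'] }}"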
Here in my answer to "How to include all host keys from all hosts in group" I created a small Ansible look-up module host_ssh_keys to extract public SSH keys from the host inventory, which I believe addresses your use-case.
Adding all hosts' public ssh keys to /etc/ssh/ssh_known_hosts is then as simple as this, thanks to Ansible's integration of loops with look-up plugins:
- name: Add public keys of inventory hosts to known_hosts
  ansible.builtin.known_hosts:
    path: /etc/ssh/ssh_known_hosts
    name: "{{ item.host }}"
    key: "{{ item.known_hosts }}"
  with_host_ssh_keys: "{{ groups['mygroup'] }}"