Failed to connect to the host...permission denied (publickey, password) unreachable - ansible

I'm finding it difficult to run a simple playbook. I already pinged the target successfully. When I run the playbook I get this error:
PLAY [install httpd and start services] ***********************************
TASK [Gathering Facts] ****************************************************
fatal: [192.168.112.66]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: jay@192.168.112.66: Permission denied (publickey,password).", "unreachable": true}
What's the problem with this?

The remote server is denying you access because your key is protected with a password.
Try this before running the playbook:
$ eval `ssh-agent`
$ ssh-add /path/to/your/private/key
Then run the playbook with the -u and --private-key options, pointing to the user that has access to the remote server and to the private key you use.
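For example, a minimal sketch assuming the playbook file is called playbook.yml and the remote user is jay (the user shown in the error above); adjust the key path to your own:
$ ansible-playbook playbook.yml -u jay --private-key /path/to/your/private/key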

I am guessing you used a password instead of an SSH key. So at the end of your command, add
--ask-pass
Let's say you're running your playbook. Your command will become:
ansible-playbook playbook.yml --ask-pass


Run playbook as a different user

I have added the ansible user's SSH keys to the other hosts, so the ansible user is allowed on all hosts. Now I want to run the playbook as root or as other service users like apache. I have already set the user to ansible in my playbook. I get the errors below when I run the playbook while logged in as root, but everything works fine when I run it while logged in as the ansible user.
- hosts: nameservers
  user: ansible
  tasks:
    - name: check hostname
      command: hostname
Error:
[root@dev playbooks]# ansible-playbook pingtest.yml
PLAY [nameservers] *********************************************************************************************************************
TASK [Gathering Facts] *****************************************************************************************************************
fatal: [x.x.x.x]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).", "unreachable": true}
fatal: [x.x.x.x]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).", "unreachable": true}
fatal: [x.x.x.x]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).", "unreachable": true}
Note: I have replaced the IPs with x.

Set the user in the playbook:
user: ansible
Set the key path in the Ansible configuration file:
private_key_file = /home/ansible/.ssh/id_rsa
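For context, that setting lives under the [defaults] section of ansible.cfg; a minimal sketch (only the key path comes from the answer above):
[defaults]
private_key_file = /home/ansible/.ssh/id_rsa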

Ansible remote_user: root, ssh: Permission denied (publickey)

I'm trying to change the password using a playbook, but I'm not getting permission to do so.
I'm running the command:
ansible-playbook playbook.yml -k
- hosts: servers
  remote_user: root
  vars:
    password: $1$Izd9zEZS$T11sNBK3bQgbzWkBMZq.
  tasks:
    - name: Changing Passwords
      user:
        name: root
        password: "{{ password }}"
fatal: [host1]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: Permission denied (publickey).", "unreachable": true}
Share the key with the host machine using the commands below.
Generate the key pair (this saves the .pub key):
ssh-keygen
Copy the key to the host machine using the ssh-copy-id command:
ssh-copy-id <IP address>
Problem
fatal: [host1]:..."msg": "Failed to connect to the host via ssh: Permission denied (publickey)."
The error message says that ansible_user, i.e. the user who is running the ansible-playbook command, or the ansible_user set in the inventory for the group servers, is not able to connect via SSH to root@host1 (see remote_user: root in the playbook), because the public key of ansible_user is missing from authorized_keys of root@host1.
Solution
To fix this problem, put the public key of ansible_user (in most cases ~/.ssh/id_rsa.pub) into the authorized_keys of root@host1 (in most cases /root/.ssh/authorized_keys).
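A minimal sketch of one way to do that, assuming the default key path above and that root can still authenticate by some other means (e.g. a password) while the key is being copied:
ssh-copy-id -i ~/.ssh/id_rsa.pub root@host1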
Best practice
The best practice is not to allow root to login via SSH. Secure systems disable root login via SSH by default.
$ grep PermitRootLogin /etc/ssh/sshd_config
PermitRootLogin no
Instead, the best practice is to SSH as an unprivileged user, e.g. remote_user: admin, and escalate privileges with become: yes. See the details in Understanding Privilege Escalation.
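For illustration, a minimal sketch of the play from the question rewritten along those lines (the admin user name is only an example, not something from the original question):
- hosts: servers
  remote_user: admin
  become: yes
  vars:
    password: $1$Izd9zEZS$T11sNBK3bQgbzWkBMZq.
  tasks:
    - name: Changing Passwords
      user:
        name: root
        password: "{{ password }}"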
Put the username and password in /etc/ansible/hosts:
[server]
172.30.141.1 ansible_password=xxx ansible_user=root
and test the connectivity by executing the following command:
ansible all -m ping
It works for me.

Problem when invoking Ansible from Jenkins

When invoking Ansible through Jenkins, I have added the below play to my playbook:
- name: HELLO WORLD PLAY
  hosts: webserver
  become: yes
  become_method: sudo
  tasks:
    - debug:
        msg: "HELLO......."
    - shell: echo "HELLO WORLD"
I am getting the below error when I build the job:
TASK [setup] *******************************************************************
fatal: [10.142.0.13]: UNREACHABLE! =>
{
"changed": false,
"msg": "ERROR! SSH encountered an unknown error during the connection. We recommend you re-run the command using -vvvv, which will enable SSH debugging output to help diagnose the issue",
"unreachable": true
}
When I run this playbook through the CLI it runs successfully, but I am not able to run it through Jenkins (I have already done the setup by pasting the private key in Jenkins).
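As the error itself suggests, the first step is usually to re-run with SSH debugging enabled; a minimal sketch, assuming the Jenkins job invokes the playbook as site.yml (the file name here is just a placeholder):
ansible-playbook site.yml -vvvv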

I get the "Failed to connect to the host via ssh" error when Ansible tries to connect to a machine via ssh using private key

I am trying to provision a machine using Ansible. I must connect to it via SSH using a private key instead of a password.
This is the content of my inventory.txt file:
target ansible_host=<ip_address> ansible_ssh_private_key_file=~/.ssh/<private_key_name>.pem
This is the content of my playbook.yaml file:
-
  name: Playbook name
  hosts: target
  tasks:
    <task_list>
When I am executing the command ansible-playbook <playbook_name>.yaml -i inventory.txt I get the following error:
fatal: [target]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: Permission denied (publickey).\r\n", "unreachable": true}
I also tried executing the following command: ansible-playbook <playbook_name>.yaml --private-key=~/.ssh/<private_key_name>.pem -i inventory.txt, without the ansible_ssh_private_key_file property inside the inventory.txt file.
Note: I can connect to the machine using the command ssh -i <private_key_name>.pem <username>@<ip_address>.
How can I resolve this issue?
I suspect you are connecting as a different user. In the example above you use <username>@<ip_address> during your SSH checks, but you don't have the ansible_user=... field configured. Try providing the username this way in the hosts file.
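For example, a minimal sketch of the inventory line with the user added (reusing the placeholders from the question):
target ansible_host=<ip_address> ansible_user=<username> ansible_ssh_private_key_file=~/.ssh/<private_key_name>.pem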

Cloud9 and ansible

When trying to run Ansible on Cloud9, some of my tasks have:
sudo_user: emr-user
HOSTS file:
[development]
localhost ansible_connection=local ansible_ssh_user=ubuntu
Running with:
ansible-playbook -i hosts site.yml --limit=development
keeps failing on this task with:
failed: [localhost] => {"failed": true, "parsed": false}
[sudo via ansible, key=zacflhyhixxhiajrlmtitjxgpxqimnmn] password:
I believe it is related to the fact that Cloud9 runs on a password-less Ubuntu root.
I was able to bypass it by using sudo su and then running:
ansible-playbook -i hosts site.yml --limit=development
but it doesn't feel right. Any other ideas?
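For reference, the same workaround collapsed into a single command (this is only a restatement of the sudo su approach above, not a proper fix):
sudo ansible-playbook -i hosts site.yml --limit=development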
