I am trying to use Ansible to connect to my switches and just do a show version. For some reason, when I run the Ansible playbook I keep getting the error "Failed to open session", and I don't know why. I am able to SSH directly to the box with no issues.
ansible.cfg:
[defaults]
enable_task_debugger=True
hostfile=inventory
transport=paramiko
host_key_checking=False
inventory/hosts:
127.0.0.1 ansible_connection=local
[routers]
192.168.10.1
test.yaml:
---
- hosts: routers
  gather_facts: true
  connection: paramiko
  tasks:
    - name: show run
      ios_command:
        commands:
          - show version
Then I try to run it like this:
ansible-playbook -vvv -i inventory test.yaml -u username -k
And this is the tail end of the error output:
EXEC /bin/sh -c 'echo ~ && sleep 0'
fatal: [192.168.10.1]: UNREACHABLE! => {
"changed": false,
"msg": "Failed to open session",
"unreachable": true
}
Ansible version is 2.4.2.0.
Please use:
connection: local
and change - hosts: routers to - hosts: localhost.
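With connection: local on Ansible 2.4, the ios_command module usually takes its device credentials from a provider dictionary instead of the play's SSH connection. A minimal sketch of the adjusted playbook (the provider values are assumptions taken from your inventory and command line, so replace them with your own):
---
- hosts: localhost
  gather_facts: no
  connection: local
  tasks:
    - name: show version
      ios_command:
        commands:
          - show version
        provider:
          host: 192.168.10.1                  # the switch from your inventory
          username: username                  # the user you passed with -u
          password: "{{ ansible_password }}"  # e.g. prompted for with -k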
I am trying to run Ansible playbooks within Concourse for remote hosts; however, I cannot do that. Below are my steps:
Concourse YAML file:
---
resource_types:
  - name: ansible-playbook
    type: docker-image
    source:
      repository: troykinsella/concourse-ansible-playbook-resource
      tag: latest
resources:
  - name: ansible
    type: ansible-playbook
    source:
      debug: true
      user: cloud_user
      ssh_private_key: ((ssh-key))
      verbose: vvv
  - name: source-code
    type: git
    source:
      uri: ((git-repo))
      branch: master
      private_key: ((ssh-key))
jobs:
  - name: ansible-concourse
    plan:
      - get: source-code # git resource
      - put: ansible
        params:
          check: true
          diff: true
          become: true
          become_user: root
          inventory: inventory/hosts
          playbook: site.yml
          path: source-code
Host file:
[test]
localhost
Inside the container:
I intercepted the container, and from inside it I can start an SSH connection to any IP; however, I am not able to actually log in.
Ansible playbook:
---
- name: "Running Current Working Directory"
  hosts: test
  gather_facts: no
  tasks:
    - name: "Current Working Directory"
      shell: pwd
      register: value
    - debug:
        msg: "The Current Working Directory {{ value.stdout_lines }}"
Output in Concourse:
ansible-playbook -i inventory/hosts --private-key /tmp/ansible-playbook-resource-ssh-private-key --user cloud_user -vvv site.yml
ansible-playbook 2.9.0
config file = /tmp/build/put/source-code/ansible.cfg
configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/lib/python3.6/dist-packages/ansible
executable location = /usr/local/bin/ansible-playbook
python version = 3.6.8 (default, Oct 7 2019, 12:59:55) [GCC 8.3.0]
Using /tmp/build/put/source-code/ansible.cfg as config file
host_list declined parsing /tmp/build/put/source-code/inventory/hosts as it did not pass its verify_file() method
script declined parsing /tmp/build/put/source-code/inventory/hosts as it did not pass its verify_file() method
auto declined parsing /tmp/build/put/source-code/inventory/hosts as it did not pass its verify_file() method
Parsed /tmp/build/put/source-code/inventory/hosts inventory source with ini plugin
PLAYBOOK: site.yml *************************************************************
1 plays in site.yml
PLAY [Running Current Working Directory] ***************************************
META: ran handlers
TASK [Current Working Directory] ***********************************************
task path: /tmp/build/put/source-code/site.yml:7
Monday 18 November 2019 12:38:49 +0000 (0:00:00.084) 0:00:00.085 *******
<localhost> ESTABLISH SSH CONNECTION FOR USER: cloud_user
<localhost> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/tmp/ansible-playbook-resource-ssh-private-key"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="cloud_user"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/dc52b3112c localhost '/bin/sh -c '"'"'echo ~cloud_user && sleep 0'"'"''
<localhost> (255, b'', b'')
fatal: [localhost]: UNREACHABLE! => {
"changed": false,
"msg": "Failed to connect to the host via ssh: ",
"unreachable": true
}
PLAY RECAP *********************************************************************
localhost : ok=0 changed=0 unreachable=1 failed=0 skipped=0 rescued=0 ignored=0
Monday 18 November 2019 12:38:49 +0000 (0:00:00.029) 0:00:00.114 *******
===============================================================================
Current Working Directory ----------------------------------------------- 0.03s
/tmp/build/put/source-code/site.yml:7 -----------------------------------------
localhost is normally accessed through the local connection plugin (unless you are trying to do something really special and have configured access through SSH, which does not seem to be the case judging from your error message).
If you don't declare it in your inventory, localhost is implicit, uses the local connection, and is not matched by the all group.
However, if you declare localhost explicitly in your inventory, the default connection plugin becomes ssh and the all group matches this host too. You have to set the connection back to local yourself in that case.
You have two options to make your current test work:
Delete your inventory (or use one that does not explicitly declare localhost) and modify your playbook to target localhost directly => hosts: localhost (see the sketch after the inventory below)
Keep your playbook as is and modify your inventory:
[test]
localhost ansible_connection=local
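For the first option, only the play header needs to change; a minimal sketch (the tasks stay exactly as in your site.yml):
---
- name: "Running Current Working Directory"
  hosts: localhost
  gather_facts: no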
I want to provision a new VPS. The way this is typically done: 1) try to log in as a non-root user, and 2) if that fails, perform the provisioning.
But I can't connect. I can't even log in as root. (I can SSH from the shell, so the password is correct.)
hosts:
[server]
42.42.42.42
playbook.yml:
---
- hosts: all
  vars:
    ROOT_PASSWORD: foo
  gather_facts: no
  tasks:
    - name: set root password
      set_fact: ansible_password={{ ROOT_PASSWORD }}
    - name: try login with password
      local_action: "command ssh -q -o BatchMode=yes -o ConnectTimeout=3 root@{{ inventory_hostname }} 'echo ok'"
      ignore_errors: true
      changed_when: false
    # more stuff here...
I tried the following:
I stored the password in a variable like above
I prompted for the password using ansible-playbook -k playbook.yml
I moved the password to the inventory file
[server]
42.42.42.42 ansible_user=root ansible_password=foo
I added the ssh flag -o PreferredAuthentications=password to force password auth
But none of the above connects; I always get the error:
root@42.42.42.42: Permission denied (publickey,password).
If I remove -o BatchMode=yes, then it prompts me for a password and does connect. But that prevents automation; the idea is to do this without user intervention.
What am I doing wrong?
This is a new VPS, nothing is set up yet, so I'm looking for the simplest possible example of a playbook that connects as root using a password.
You're close. The variable is ansible_password, not ansible_ssh_pass; the variables with _ssh in the name are legacy names, so just use ansible_user and ansible_password.
If I have an inventory like this:
[server]
example ansible_host=192.168.122.148 ansible_user=root ansible_password=secret
Then I can run this command successfully:
$ ansible all -i hosts -m ping
example | SUCCESS => {
"ansible_facts": {
"discovered_interpreter_python": "/usr/bin/python"
},
"changed": false,
"ping": "pong"
}
If the above ad-hoc command works correctly, then a playbook should work correctly as well. E.g., still assuming the above inventory, I can use the following playbook:
---
- hosts: all
  gather_facts: false
  tasks:
    - ping:
And I can call it like this:
$ ansible-playbook playbook.yml -i hosts
PLAY [all] ***************************************************************************
TASK [ping] **************************************************************************
ok: [example]
PLAY RECAP ***************************************************************************
example : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
...and it all works just fine.
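If you would rather not keep the password in the inventory, drop ansible_password from the host line and let Ansible prompt for it instead (-k is short for --ask-pass):
$ ansible-playbook playbook.yml -i hosts -k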
Try using --ask-become-pass
ansible-playbook -k playbook.yml --ask-become-pass
That way it's not hardcoded.
Also, inside the playbook you can enable privilege escalation:
---
- hosts: all
  become: true
  gather_facts: no
All SO answers and blog articles I've seen so far recommend doing it the way I've shown.
But after spending much time on this, I don't believe it can work that way, so I don't understand why it is always recommended. I noticed that Ansible has changed its API many times, and maybe that approach is simply outdated!
So I came up with an alternative, using sshpass:
hosts:
[server]
42.42.42.42
playbook.yml:
---
- hosts: all
  vars:
    ROOT_PASSWORD: foo
  gather_facts: no
  tasks:
    - name: try login with password (using out-of-band ssh connection)
      local_action: command sshpass -p {{ ROOT_PASSWORD }} ssh -q -o ConnectTimeout=3 root@{{ inventory_hostname }} 'echo ok'
      ignore_errors: true
      register: exists_user
    - name: ping if above succeeded (using in-band ssh connection)
      remote_user: root
      block:
        - name: set root ssh password
          set_fact:
            ansible_password: "{{ ROOT_PASSWORD }}"
        - name: ping
          ping:
            data: pong
      when: exists_user is success
This is just a tiny proof of concept.
The actual use case is to try to connect with a non-root user and, if that fails, to provision the server. The above is the starting point for such a playbook.
Unlike @larsks' excellent alternative, this does not assume Python is installed on the remote host, and it performs the SSH connection test out of band, assisted by sshpass.
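As a sketch of that end goal (the deploy user name and the provisioning task are assumptions for illustration, not part of the original), the same pattern extends to a key-based non-root check followed by conditional provisioning as root:
---
- hosts: all
  vars:
    ROOT_PASSWORD: foo
  gather_facts: no
  tasks:
    - name: try login as the non-root user (out-of-band, key-based)
      local_action: command ssh -q -o BatchMode=yes -o ConnectTimeout=3 deploy@{{ inventory_hostname }} 'echo ok'
      ignore_errors: true
      changed_when: false
      register: exists_user
    - name: provision as root only if the non-root login failed
      remote_user: root
      block:
        - name: set root ssh password
          set_fact:
            ansible_password: "{{ ROOT_PASSWORD }}"
        - name: create the non-root user (hypothetical provisioning step)
          user:
            name: deploy
      when: exists_user is failed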
The end goal is for me to copy file.txt from Host2 over to Host1. However, I keep getting the same error whenever I run the task. I have triple-checked my spacing and made sure I spelled everything correctly, but nothing seems to work.
Command to start the playbook:
ansible-playbook playbook_name.yml -i inventory/inventory_name -u username -k
My Code:
- hosts: Host1
  tasks:
    - name: Synchronization using rsync protocol on delegate host (pull)
      synchronize:
        mode: pull
        src: rsync://Host2.linux.us.com/tmp/file.txt
        dest: /tmp
      delegate_to: Host2.linux.us.com
Expected Result:
Successfully working
Actual Result:
fatal: [Host1.linux.us.com]: FAILED! => {"changed": false, "cmd": "sshpass", "msg": "[Errno 2] No such file or directory", "rc": 2}
I had the same problem as you. Installing sshpass on the target host makes it work normally:
yum install -y sshpass
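If the target host uses apt rather than yum (an assumption about your distribution), the equivalent would be:
apt-get install -y sshpass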
I'm very, very new to Ansible, so I just need someone to break down how to set up a YAML file to use as a playbook.
I wrote this command, which does work:
ansible Test --user exampleuser --ask-pass -c local -m ping
Output:
192.168.1.4 | SUCCESS => {
    "changed": false,
    "ping": "pong"
}
How do I format what I wrote so I can just type:
ansible-playbook test.yaml
Below is what the content of the YAML file should look like:
---
- hosts: Test
  connection: local
  remote_user: exampleuser
  tasks:
    - ping:
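Assuming your default inventory already defines the Test group (your ad-hoc command implies it does), you can then run it with just:
ansible-playbook test.yaml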
I'm running Ansible 2.3.1.0 on my local machine (macOS) and trying to achieve:
connecting to user1@host1
copying a file from user2@host2:/path/to/file to user1@host1:/tmp/path/to/file
I'm on my local machine, with host1 as hosts and user1 as remote_user:
- synchronize: mode=pull src=user2@host2:/path/to/file dest=/tmp/path/to/file
Wrong output:
/usr/bin/rsync (...) user1@host1:user2@host2:/path/to/file /tmp/path/to/file
Conclusion
I've been trying different options. I've debugged Ansible. I can't understand what's wrong.
Help!
Edit 1
I've also tried adding delegate_to:
- synchronize: mode=pull src=/path/to/file dest=/tmp/path/to/file
  delegate_to: host2
It gives:
fatal: [host1]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: Permission denied (publickey,password,keyboard-interactive).\r\n", "unreachable": true}
And also:
- synchronize: mode=pull src=/path/to/file dest=/tmp/path/to/file
  delegate_to: user2@host2
Which gives:
fatal: [host1 -> host2]: FAILED! => {"changed": false, "cmd": "/usr/bin/rsync --delay-updates -F --compress --archive --rsh=/usr/bin/ssh -S none -o StrictHostKeyChecking=no --rsync-path=sudo rsync --out-format=<<CHANGED>>%i %n%L host1:/path/to/file /tmp/path/to/file", "failed": true, "msg": "Permission denied (publickey).\r\nrsync: connection unexpectedly closed (0 bytes received so far) [Receiver]\nrsync error: unexplained error (code 255) at io.c(235) [Receiver=3.1.2]\n", "rc": 255}
NB: ssh user1@host1 and then ssh user2@host2 works with SSH keys (no password required)
Please pay attention to these notes from the module's docs:
For the synchronize module, the “local host” is the host the synchronize task originates on, and the “destination host” is the host synchronize is connecting to.
The “local host” can be changed to a different host by using delegate_to. This enables copying between two remote hosts or entirely on one remote machine.
I guess you may want to try the following (assuming Ansible can connect to host2). With delegate_to: host2, host2 becomes the task's "local host", and the default push mode then copies the file from host2 to host1:
- synchronize:
    src: /path/to/file
    dest: /tmp/path/to/file
  delegate_to: host2
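Note that with delegate_to: host2, rsync runs on host2 and must itself be able to open an SSH connection to host1. If the required key only lives on your local machine, one possibility (a sketch, assuming your key is loaded into ssh-agent) is to forward the agent to host2 for this task:
- synchronize:
    src: /path/to/file
    dest: /tmp/path/to/file
  delegate_to: host2
  vars:
    ansible_ssh_common_args: -o ForwardAgent=yes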