At my job there is a playbook, built in the following way, that is executed by Ansible Tower.
This is the file that Ansible Tower executes; it calls the role:
report.yaml:
- hosts: localhost
  gather_facts: false
  connection: local
  tasks:
    - name: "Execute"
      include_role:
        name: 'fusion'
main.yaml from fusion role:
- name: "hc fusion"
  include_tasks: "hc_fusion.yaml"
hc_fusion.yaml from fusion role:
- name: "FUSION"
  shell: ansible-playbook roles/fusion/tasks/fusion.yaml --extra-vars 'fusion_ip_ha={{item.ip}} fusion_user={{item.username}} fusion_pass={{item.password}} fecha="{{fecha.stdout}}" fusion_ansible_become_user={{item.ansible_become_user}} fusion_ansible_become_pass={{item.ansible_become_pass}}'
fusion.yaml from fusion role:
- hosts: localhost
  vars:
    ansible_become_user: "{{fusion_ansible_become_user}}"
    ansible_become_pass: "{{fusion_ansible_become_pass}}"
  tasks:
    - name: Validate
      ignore_unreachable: yes
      shell: service had status
      delegate_to: "{{fusion_user}}@{{fusion_ip_ha}}"
      become: True
      become_method: su
This is a summary of the whole flow.
It worked previously, but now it throws the following error.
stdout: PLAY [localhost] \nTASK [Validate] fatal: [localhost -> gandalf@10.66.173.14]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: Warning: Permanently added '10.66.173.14' (RSA) to the list of known hosts.\ngandalf@10.66.173.14: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password)", "skip_reason": "Host localhost is unreachable"}
When I execute ansible-playbook roles/fusion/tasks/fusion.yaml --extra-vars XXXXXXXX from the command line as the user awx, it works.
I also validated the connection from the server where Ansible Tower runs to the host I want to connect to, using the ssh command, and it lets me connect without asking for a password as the user awx.
fusion.yaml does not explicitly specify a connection plugin, so the default ssh type is used. For localhost this approach usually brings a number of related problems (ssh keys, known_hosts, loopback interfaces, etc.). If you need to run tasks on localhost you should set the connection plugin to local, just like in your report.yaml playbook.
Additionally, as Zeitounator mentioned, running one Ansible playbook from another with the shell module is really bad practice. Please avoid this. Ansible has a number of mechanisms for code re-use (includes, imports, roles, etc.).
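For illustration only, here is a minimal sketch of how hc_fusion.yaml could run the same check without shelling out to ansible-playbook. The loop variable fusion_targets is an assumption (the question only shows {{ item.* }}), and the connection variables are set per task (password-based SSH also needs sshpass on the controller):

- name: "FUSION"
  shell: service had status
  delegate_to: "{{ item.ip }}"                             # run the check on the HA node itself
  vars:
    ansible_user: "{{ item.username }}"                    # connection credentials taken from the loop item
    ansible_password: "{{ item.password }}"
    ansible_become_user: "{{ item.ansible_become_user }}"
    ansible_become_pass: "{{ item.ansible_become_pass }}"
  become: true
  become_method: su
  ignore_unreachable: yes
  loop: "{{ fusion_targets }}"                             # assumed list of target dictionaries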
Related
How can I run a local command on an Ansible control server, if that control server does not have an SSH daemon running?
If I run the following playbook:
- name: Test commands
  hosts: localhost
  connection: local
  gather_facts: false
  tasks:
    - name: Test local action
      local_action: command echo "hello world"
I get the following error:
fatal: [localhost]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: ssh: connect to host localhost port 22: Connection refused", "unreachable": true}
It seems that local_action is the same as delegate_to: 127.0.0.1, so Ansible tries to ssh to the localhost. However, there is no SSH daemon running on the local controller host (only on the remote machines).
So my immediate question is how to run a specific command from Ansible, without Ansible first trying to SSH to localhost.
Crucial addition, not in the original question:
My host_vars contained the following line:
ansible_connection: ssh
how to run a specific command from Ansible, without Ansible first trying to SSH to localhost.
connection: local is sufficient to make the tasks run in the controller without using SSH.
Try,
- name: Test commands
  hosts: localhost
  connection: local
  gather_facts: false
  tasks:
    - name: Test local action
      command: echo "hello world"
I'll answer the details myself, perhaps it is useful to someone:
In my case:
ansible_connection was set to ssh in the host_vars.
ansible_host was set to localhost by local_action.
Combined, this led to an SSH connection to localhost, which failed.
Further considerations:
delegate_to and local_action set ansible_host and ansible_connection, but any setting in the host_vars, playbook or task overrides that.
connection: local only sets ansible_connection (ansible_host is unmodified), but any setting of ansible_connection in the host_vars, playbook or task overrides it.
So my solution was either to remove ansible_connection from the host_vars, or to set the variable ansible_connection in a task.
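As a minimal sketch of the second option, the connection variable can be overridden for a single task, so a host_vars entry of ansible_connection: ssh no longer forces SSH for it (the task name here is illustrative):

- name: Run a command on the controller regardless of host_vars
  command: echo "hello world"
  delegate_to: localhost
  vars:
    ansible_connection: local   # task vars take precedence over host_vars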
That looks wrong to me:
- name: import profiles of VMs
  connection: local
  hosts: localhost
  gather_facts: false
  tasks:
    - name: list files
      find:
        paths: .
        recurse: no
      delegate_to: localhost
It is still asking me for the SSH password:
❯ ansible-playbook playbooks/import_vm_profiles.yml -i localhost, -k
[WARNING]: Unable to parse the plugin filter file /Users/fredericclement/devops/ansible_refactored/etc/Plugin_filters.yml as module_blacklist is not a list. Skipping.
SSH password:
In Ansible I need to execute a set of tasks, obtain the passwords from a third party (this part is handled), and then use those SSH credentials to connect.
The problem is that the best way I have found to loop through my inventory is to include a list of tasks, which is fine. The major issue is that I can only get this to work if I set hosts in my main.yml playbook to localhost (or set it to the name of the server group and specify connection: local); this makes the command module execute locally, which defeats the purpose.
I have tried looking into the SSH connection plugin, but it does not seem to register and gives me a no_action detected. I am aware I am likely overlooking something glaring.
I will post code closer to the exact version later, but what I have now is:
main.yml
---
- hosts: localhost
  tasks:
    - name: subplay
      include: secondary.yml
      vars:
        user: myUser
        address: "{{ hostvars[item].address }}"
      with_items: hostvars['mygroup']
secondary.yml
---
- name: fetch password
  [...fetchMyPassword, it works]
  register: password
- name:
  [...Need to connect with fetched user for this task and this task only..]
  command: /my/local/usr/task.sh
I want to connect and execute the script there, but it seems that no matter what I try, it either fails to execute at all or executes locally.
Additionally, I might note that I checked out https://docs.ansible.com/ansible/latest/plugins/connection/paramiko_ssh.html and
https://docs.ansible.com/ansible/latest/plugins/connection/ssh.html but I must be doing something wrong.
It looks to me like only your fetch task needs to be delegated to localhost; the rest should run on mygroup. Once you have all your connection info, set up the connection with set_fact by assigning values to ansible_user, ansible_ssh_pass and ansible_host. Try this:
main.yml
---
- hosts: mygroup  # inventory_hostname will loop through all your hosts in mygroup
  tasks:
    - name: subplay
      include: secondary.yml
      vars:
        user: myUser
        address: "{{ hostvars[inventory_hostname].address }}"
secondary.yml
---
- name: fetch password
  [...fetchMyPassword, it works]
  delegate_to: localhost  # this task is only run on localhost
  register: password
- set_fact:  # use the registered password and vars to set up the connection
    ansible_user: "{{ user }}"
    ansible_ssh_pass: "{{ password }}"
    ansible_host: "{{ address }}"
- name: Launch task  # this task is run on each host of mygroup
  [...Need to connect with fetched user for this task and this task only..]
  command: /my/local/usr/task.sh
Launch this with:
ansible-playbook main.yml
Then try to write a role from your secondary.yml, and a playbook from your main.yml.
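As a rough sketch of that refactor (the role name fetch_and_run is my assumption, not from the answer), main.yml would shrink to a play that applies the role, and the tasks from secondary.yml would move into roles/fetch_and_run/tasks/main.yml:

---
- hosts: mygroup
  roles:
    - fetch_and_run  # its tasks/main.yml holds what used to be in secondary.yml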
I'm trying to develop an Ansible script to generate a VM. I wrote a myvm role that contains the script that orchestrates vmware_guest. This script contains a delegate_to: localhost which vmware_guest requires.
Then I added my soon-to-be VM to the inventory, adding the following to hosts:
[myvms]
myvm1
and extended site.yml with:
- hosts: myvms
  roles:
    - myvm
Now, when I run:
ansible-playbook site.yml -i hosts --limit myvm1
it fails with:
fatal: [myvm1]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: Connection reset by 192.168.10.13 port 22\r\n", "unreachable": true}
It seems Ansible tries to connect to the VM's IP before reading the actual role that creates the VM, where it delegates to localhost. Adding delegate_to to site.yml fails, however.
How can I fix my Ansible scripts to properly generate the VM for me?
Add gather_facts: false to the play.
- hosts: myvms
  gather_facts: false
  roles:
    - myvm
By default, Ansible connects to the target machines and runs a script which collects data (facts).
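If facts from the new VM are needed later in the same play, they can be gathered explicitly once the machine exists; a sketch under that assumption (the setup task is the only addition):

- hosts: myvms
  gather_facts: false
  roles:
    - myvm  # creates the VM, delegating to localhost
  post_tasks:
    - name: Gather facts now that the VM is reachable
      setup: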
I want to use Ansible to disable SELinux on some remote servers. I don't know the full list of servers yet; it will grow over time.
It would be great if the ssh-copy-id phase were somehow integrated into the playbook; you would expect that from an automation system. I don't mind being asked for the password once per server.
From various reading, I understand I can run a local_action in my tasks:
---
- name: Disable SELinux
  hosts: all
  remote_user: root
  gather_facts: False
  tasks:
    - local_action: command ssh-copy-id {{remote_user}}@{{hostname}}
    - selinux:
        state: disabled
However:
It fails because {{remote_user}} and {{hostname}} are not accessible in this context.
I need to set gather_facts to False, because fact gathering is executed before the local_action.
Any idea if that's possible within Ansible playbooks?
You may try this:
- hosts: all
  gather_facts: no
  tasks:
    - set_fact:
        rem_user: "{{ ansible_user | default(lookup('env','USER')) }}"
        rem_host: "{{ ansible_host }}"
    - local_action: command ssh-copy-id {{ rem_user }}@{{ rem_host }}
    - setup:
    - selinux:
        state: disabled
Define the remote user and remote host first, then run the local action, then force fact gathering with setup.
I'm starting out with Ansible and I'm looking for a way to create a boilerplate project on the server and in the local environment with Ansible playbooks.
I want to use Ansible templates locally to create some generic files.
But how do I get Ansible to execute something locally?
I read something about local_action, but I guess I did not get it right.
This is for the webserver... but how do I take this and create some files locally?
- hosts: webservers
  remote_user: someuser
  tasks:
    - name: create some file
      template: src=~/workspace/ansible_templates/somefile_template.j2 dest=/etc/somefile/apps-available/someproject.ini
You can delegate tasks with the param delegate_to to any host you like, for example:
- name: create some file
  template: src=~/workspace/ansible_templates/somefile_template.j2 dest=/etc/somefile/apps-available/someproject.ini
  delegate_to: localhost
See Playbook Delegation in the docs.
If your playbook should in general run locally and no external hosts are involved though, you can simply create a group which contains localhost and then run the playbook against this group. In your inventory:
[local]
localhost ansible_connection=local
and then in your playbook:
hosts: local
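Put together, a minimal sketch of such a purely local playbook (reusing the file paths from the question) would be:

- hosts: local
  tasks:
    - name: create some file
      template: src=~/workspace/ansible_templates/somefile_template.j2 dest=/etc/somefile/apps-available/someproject.ini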
Ansible has a local_action directive to support these scenarios which avoids the localhost and/or ansible_connection workarounds and is covered in the Delegation docs.
To modify your original example to use local_action:
- name: create some file
  local_action: template src=~/workspace/ansible_templates/somefile_template.j2 dest=/etc/somefile/apps-available/someproject.ini
which looks cleaner.
If you cannot do/allow SSH to localhost, you can split the playbook into local actions and remote actions.
The connection: local setting tells Ansible not to use SSH for a play, as shown here: https://docs.ansible.com/ansible/latest/user_guide/playbooks_delegation.html#local-playbooks
Example:
# myplaybook.yml
- hosts: remote_machines
  tasks:
    - debug: msg="do stuff in the remote machines"

- hosts: 127.0.0.1
  connection: local
  tasks:
    - debug: msg="ran in local ansible machine"

- hosts: remote_machines
  tasks:
    - debug: msg="do more stuff in remote machines"