Ansible connect via ssh after certain tasks have populated passwords - bash

In Ansible I need to execute a set of tasks, obtain the passwords from a third party (this part is already handled), and then use those SSH credentials to connect.
The problem is that the best way I have found to loop through my inventory is to include a list of tasks, which is fine. The major problem is that this only works if I set hosts in my main.yml playbook to localhost (or set it to the name of the server group and specify connection: local). That makes the command module execute locally, which defeats the purpose.
I have tried looking into the SSH connection plugin, but it does not seem to register anything and reports no action detected. I am aware I am likely overlooking something glaring.
I will post closer-to-exact code later, but what I have now is:
main.yml
---
- hosts: localhost
  tasks:
    - name: subplay
      include: secondary.yml
      vars:
        user: myUser
        address: "{{ hostvars[item].address }}"
      with_items: "{{ groups['mygroup'] }}"
secondary.yml
---
- name: fetch password
  [...fetchMyPassword, it works]
  register: password
- name:
  [...need to connect with the fetched user for this task and this task only...]
  command: /my/local/usr/task.sh
I want to connect and execute the script there, but no matter what I try it either fails to execute at all or executes locally.
Additionally, I checked out https://docs.ansible.com/ansible/latest/plugins/connection/paramiko_ssh.html and
https://docs.ansible.com/ansible/latest/plugins/connection/ssh.html but must be doing something wrong.

It looks to me like only your fetch task needs to be delegated to localhost; the rest runs on mygroup. Once you have all your connection info, set up the connection with set_fact by setting values for ansible_user, ansible_ssh_pass and ansible_host. Try this:
main.yml
---
- hosts: mygroup # inventory_hostname will loop through all the hosts in mygroup
  tasks:
    - name: subplay
      include: secondary.yml
      vars:
        user: myUser
        address: "{{ hostvars[inventory_hostname].address }}"
secondary.yml
---
- name: fetch password
  [...fetchMyPassword, it works]
  delegate_to: localhost # this task runs only on localhost
  register: password
- set_fact: # use the registered password and vars to set up the connection
    ansible_user: "{{ user }}"
    ansible_ssh_pass: "{{ password }}" # if the fetch task is a command/shell task, this may need to be password.stdout
    ansible_host: "{{ address }}"
- name: Launch task # this task runs on each host of mygroup
  [...need to connect with the fetched user for this task and this task only...]
  command: /my/local/usr/task.sh
Launch this with
ansible-playbook main.yml
Try to write a role with your secondary.yml, and a playbook with your main.yml.
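As a sketch of that role split (the role name and layout are assumptions, not from the original): the contents of secondary.yml move into the role's tasks file, and main.yml shrinks to a plain play.

```yaml
# roles/fetch_and_connect/tasks/main.yml  -- holds what was in secondary.yml

# main.yml then becomes:
---
- hosts: mygroup
  roles:
    - fetch_and_connect
```

This keeps the per-host loop implicit in the play's hosts line, with no with_items needed.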

Related

How to run tasks file as a playbook and as an include_file

I have a generic task file, for example updating DNS records for my machines on a server, which I use in my playbooks with include_tasks like so:
- name: (include) Update DNS
  include_tasks: task_update_dns.yml
I have many of these "generic tasks" that simply act as "shared tasks" across many playbooks.
But in some cases I just want to run one of these generic tasks on a server, and running the following gives the error below:
ansible-playbook playbooks/task_update_dns.yml --limit $myserver
# ERROR
ERROR! 'set_fact' is not a valid attribute for a Play
simply because it is not a "playbook", just "tasks".
File: playbooks/task_update_dns.yml
---
- name: dig mydomain.com
  set_fact:
    my_hosts: '{{ lookup("dig", "mydomain.com").split(",") }}'
  tags: always
- name: Set entries
  blockinfile:
    ....
I know I can write an empty playbook that only "includes" the task file, but I don't want to create a shallow playbook for each task.
Is there a way to configure the task file so that I can run it both via include_tasks and as a stand-alone play from the command line?
Have you tried to use custom tags for those tasks?
Create a playbook with all the tasks:
---
- name: Update web servers
  hosts: webservers
  remote_user: root
  tasks:
    - name: dig mydomain.com
      set_fact:
        my_hosts: '{{ lookup("dig", "mydomain.com").split(",") }}'
      tags: always
    - name: Set entries
      blockinfile:
      tags:
        - set_entries
    - name: (include) Update DNS
      include_tasks: task_update_dns_2.yml
      tags:
        - dns
...
When you need to run only one specific task, just add the --tags parameter to the run with that task's tag:
ansible-playbook playbooks/task_update_dns.yml --limit $myserver --tags set_entries

Ansible: Host localhost is unreachable

At my job there is a playbook, developed as follows, that is executed by Ansible Tower.
This is the file that Ansible Tower executes; it calls a playbook.
report.yaml:
- hosts: localhost
  gather_facts: false
  connection: local
  tasks:
    - name: "Execute"
      include_role:
        name: 'fusion'
main.yaml from the fusion role:
- name: "hc fusion"
  include_tasks: "hc_fusion.yaml"
hc_fusion.yaml from the fusion role:
- name: "FUSION"
  shell: ansible-playbook roles/fusion/tasks/fusion.yaml --extra-vars 'fusion_ip_ha={{item.ip}} fusion_user={{item.username}} fusion_pass={{item.password}} fecha="{{fecha.stdout}}" fusion_ansible_become_user={{item.ansible_become_user}} fusion_ansible_become_pass={{item.ansible_become_pass}}'
fusion.yaml from the fusion role:
- hosts: localhost
  vars:
    ansible_become_user: "{{fusion_ansible_become_user}}"
    ansible_become_pass: "{{fusion_ansible_become_pass}}"
  tasks:
    - name: Validate
      ignore_unreachable: yes
      shell: service had status
      delegate_to: "{{fusion_user}}#{{fusion_ip_ha}}"
      become: True
      become_method: su
This is a summary of the entire run. It worked previously but now throws the following error:
stdout: PLAY [localhost] \nTASK [Validate] [1;31mfatal: [localhost -> gandalf#10.66.173.14]: UNREACHABLE! => {\"changed\": false, \"msg\": \"Failed to connect to the host via ssh: Warning: Permanently added '10.66.173.14' (RSA) to the list of known hosts.\ngandalf#10.66.173.14: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password), \"skip_reason\": \"Host localhost is unreachable\"
When I execute ansible-playbook roles/fusion/tasks/fusion.yaml --extra-vars XXXXXXXX from the command line as user awx, it works.
I also validated the connection from the server where Ansible Tower is running to the target host with the ssh command, and it lets me connect without requesting a password as user awx.
fusion.yaml does not explicitly specify a connection plugin, so the default ssh type is used. For localhost this approach usually brings a number of related problems (ssh keys, known_hosts, loopback interfaces, etc.). If you need to run tasks on localhost you should set the connection plugin to local, just like in your report.yaml playbook.
Additionally, as Zeitounator mentioned, running one Ansible playbook from another with the shell module is a really bad practice. Please avoid this. Ansible has a number of mechanisms for code re-use (includes, imports, roles, etc.).
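A minimal sketch of the suggested fix, adding the local connection plugin to fusion.yaml's play header (the debug task is a placeholder, not from the original):

```yaml
---
- hosts: localhost
  connection: local   # use the local connection plugin instead of ssh-ing to the loopback
  gather_facts: false
  tasks:
    - debug:
        msg: "this task runs locally, no ssh involved"
```

With connection: local, the "Failed to connect to the host via ssh" class of errors for localhost disappears, because no SSH session is opened at all.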

roles override tasks in playbook

I have an Ansible playbook which looks similar to the code below:
---
- hosts: localhost
  connection: local
  tasks:
    - name: "Create custom fact directory"
      file:
        path: "/etc/ansible/facts.d"
        state: "directory"
    - name: "Insert custom fact file"
      copy:
        src: custom_fact.fact
        dest: /etc/ansible/facts.d/custom_fact.fact
        mode: 0755
  roles:
    - role1
    - role2
Once I run the playbook with the ansible-playbook command, only the roles run; the tasks do not.
If I remove the roles from the playbook, the tasks run.
How can I make the tasks run before the roles?
Put the tasks in a pre_tasks section, which runs before roles.
You may also find post_tasks useful, which runs tasks after roles.
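Applied to the playbook in the question, a sketch would be:

```yaml
---
- hosts: localhost
  connection: local
  pre_tasks:               # run before any role
    - name: "Create custom fact directory"
      file:
        path: "/etc/ansible/facts.d"
        state: "directory"
  roles:
    - role1
    - role2
  post_tasks:              # run after all roles
    - debug:
        msg: "roles finished"
```

Note that a plain tasks section runs between roles and post_tasks, which is why the roles appeared to "override" the tasks in the original playbook.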
Correct the indentation and the unterminated name string:
- hosts: localhost
  connection: local
  tasks:
    - name: "Create custom fact directory"
      file:
        path: ...

How do I apply an Ansible task to multiple hosts from within a playbook?

I am writing an ansible playbook to rotate IAM access keys. It runs on my localhost to create a new IAM Access Key on AWS. I want to push that key to multiple other hosts' ~/.aws/credentials files.
---
- name: Roll IAM access keys
  hosts: localhost
  connection: local
  gather_facts: false
  strategy: free
  roles:
    - iam-rotation
In the iam-rotation role, I have something like this:
- name: Create new Access Key
  iam:
    iam_type: user
    name: "{{ item }}"
    state: present
    access_key_state: create
    key_count: 2
  with_items:
    - ansible-test-user
  register: aws_user
- set_fact:
    aws_user_name: "{{ aws_user.results.0.user_name }}"
    created_keys_count: "{{ aws_user.results.0.created_keys | length }}"
    aws_user_keys: "{{ aws_user.results[0]['keys'] }}"
I want to push the newly created access keys out to the Jenkins builders. How would I use the list of hosts from with_items in the task? The debug task is just a placeholder.
# Deploy to all Jenkins builders
- name: Deploy new keys to jenkins builders
  debug:
    msg: "Deploying key to host {{ item }}"
  with_items:
    - "{{ groups.jenkins_builders }}"
The hosts file includes the list of hosts I want to apply this to:
[jenkins_builders]
builder1.example.org
builder2.example.org
builder3.example.org
I am executing the playbook on localhost, but within the playbook I want one task to execute on remote hosts, which I'm getting from the hosts file. The question was: how would I use the list of hosts from with_items in the task?
Separate the tasks into two roles, then execute the first role against localhost and the second against jenkins_builders:
---
- name: Rotate IAM access keys
  hosts: localhost
  connection: local
  gather_facts: false
  strategy: free
  roles:
    - iam-rotation
- name: Push out IAM access keys
  hosts: jenkins_builders
  roles:
    - iam-propagation
Per AWS best practices recommendations, if you are running an application on an Amazon EC2 instance and the application needs access to AWS resources, you should use IAM roles for EC2 instead of keys:
https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-ec2.html
You can use
delegate_to: servername
in the task, and it will run only on that particular host.
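For example, keeping a single localhost play, the deploy task could delegate to each builder in a loop (the template file name is an assumption, not from the original):

```yaml
- name: Deploy new keys to jenkins builders
  template:
    src: aws_credentials.j2         # hypothetical template rendering the new key
    dest: ~/.aws/credentials
  delegate_to: "{{ item }}"         # run this task on each builder host
  with_items: "{{ groups['jenkins_builders'] }}"
```

The two-play approach above is usually cleaner, but delegation avoids splitting the role when only one task needs to touch the remote hosts.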

How to create a file locally with ansible templates on the development machine

I'm starting out with Ansible and I'm looking for a way to create a boilerplate project on the server and in the local environment with Ansible playbooks.
I want to use Ansible templates locally to create some generic files.
But how would I get Ansible to execute something locally?
I read something about local_action but I guess I did not get it right.
This works for the web server, but how do I take this and create some files locally?
- hosts: webservers
  remote_user: someuser
  tasks:
    - name: create some file
      template: src=~/workspace/ansible_templates/somefile_template.j2 dest=/etc/somefile/apps-available/someproject.ini
You can delegate tasks with the delegate_to parameter to any host you like, for example:
- name: create some file
  template: src=~/workspace/ansible_templates/somefile_template.j2 dest=/etc/somefile/apps-available/someproject.ini
  delegate_to: localhost
See Playbook Delegation in the docs.
If your playbook should in general run locally and no external hosts are involved though, you can simply create a group which contains localhost and then run the playbook against this group. In your inventory:
[local]
localhost ansible_connection=local
and then in your playbook:
hosts: local
Ansible has a local_action directive to support these scenarios, which avoids the localhost and/or ansible_connection workarounds and is covered in the Delegation docs.
To modify your original example to use local_action:
- name: create some file
  local_action: template src=~/workspace/ansible_templates/somefile_template.j2 dest=/etc/somefile/apps-available/someproject.ini
which looks cleaner.
If you cannot do/allow SSH to localhost, you can split the playbook into local actions and remote actions.
connection: local tells a play not to use SSH, as shown here: https://docs.ansible.com/ansible/latest/user_guide/playbooks_delegation.html#local-playbooks
Example:
# myplaybook.yml
- hosts: remote_machines
  tasks:
    - debug: msg="do stuff in the remote machines"
- hosts: 127.0.0.1
  connection: local
  tasks:
    - debug: msg="ran in local ansible machine"
- hosts: remote_machines
  tasks:
    - debug: msg="do more stuff in remote machines"