Ansible Dynamic Inventory

I'm running a playbook that houses multiple roles and targets multiple hosts.
The goal is to deploy a VM and use its IP to deploy an app.
My playbook has two roles. With the "build_vm" role I'm able to display the IP address via debug, yet when passing the ipaddr variable to the second role, Ansible complains that the variable is not defined:
- hosts: linux
  become: true
  roles:
    - build_vm
  tasks:
    - debug: msg="{{ ipaddr }}"

- hosts: "{{ ipaddr }}"
  roles:
    - deploy_app
I have also tried set_fact and ran into the same issue. I wonder what I should be using here — dynamic inventory? I have searched the sparse docs online and I'm unable to find an intuitive example to follow.

There are many ways to use add_host. In this example, I am adding the new host to a group and using it in a later play.
- hosts: linux
  become: true
  roles:
    - build_vm
  tasks:
    - debug: msg="{{ ipaddr }}"

    - name: Add ipaddr to host inventory
      add_host: name="{{ ipaddr }}" group=NewHostGroup

- hosts: NewHostGroup
  roles:
    - deploy_app
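One detail worth adding (an assumption about the environment, not stated in the question): a freshly built VM has no connection variables yet, so add_host often needs to carry them along for the later play to be able to log in. A sketch, with a hypothetical login user and key path:

```yaml
- name: Add ipaddr to host inventory, including connection details
  add_host:
    name: "{{ ipaddr }}"
    groups: NewHostGroup
    ansible_user: centos                            # hypothetical login user on the new VM
    ansible_ssh_private_key_file: ~/.ssh/id_rsa     # assumed key path
```

Note also that add_host only runs once per play even when the play targets several hosts, which is fine here since a single VM is being built.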


Access variables at play level

I use localhost and set_fact to store variables and access them in different playbooks.
---
- hosts: localhost
  connection: local
  gather_facts: False
  tasks:
    - name: set_variables
      set_fact:
        cloudinit_fqdn: 'server1.example.com'
        additional_container_config_values:
          security.nesting: 'false'
          security.privileged: 'false'
        cloudinit_network_raw:
          version: 2
          renderer: networkd
          ethernets:
            eth0:
              dhcp4: False
              addresses: [192.168.178.35/24]
              gateway4: 192.168.178.1
              nameservers:
                addresses: [192.168.178.13]
Now I want to use the cloudinit_fqdn at import_playbook:
- name: system configuration
  import_playbook: "{{ hostvars['localhost']['cloudinit_fqdn'] }}_server_config.yml"
I tried different ways to get that variable, but I get errors like:
'ERROR! 'hostvars' is undefined'
I am not able to get access to that variable by:
- debug:
    msg: '{{ vars }}'
ERROR! 'debug' is not a valid attribute for a Play
How can I use a variable at play-level?
Regarding your use case, I've set up a short test to work around the syntax errors in the variable reference, as well as in the debug task.
---
- hosts: localhost
  become: false
  gather_facts: false
  tasks:
    - name: Set variables
      set_fact:
        example_fqdn: 'test.example.com'

    - name: Show variables
      debug:
        msg: "{{ hostvars['localhost'].example_fqdn }}"
While that example is working, adding

- name: Import playbook
  import_playbook: "{{ hostvars['localhost'].example_fqdn }}.yml"

or even a simple

- name: Import playbook
  import_playbook: "{{ example_fqdn }}.yml"

makes the playbook run fail with

ERROR! 'hostvars' is undefined
ERROR! 'example_fqdn' is undefined
since the import is done at compile time, whereas the variable is only defined at runtime. Also not possible is

- name: Import playbook
  import_playbook: "{{ to_import }}.yml"
  vars:
    to_import: "{{ example_fqdn }}"

as the import is static, not dynamic. Importing playbooks and Re-using playbooks do not seem to work that way.
What is actually working is
- name: Import playbook
  import_playbook: test.example.com.yml
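One workaround that does keep the file name variable (a sketch, not from the answer above): variables passed on the command line with --extra-vars are already available when the playbook is parsed, so import_playbook can use them:

```yaml
# hypothetical top-level playbook, run as:
#   ansible-playbook site.yml -e example_fqdn=test.example.com
- name: Import the playbook chosen on the command line
  import_playbook: "{{ example_fqdn }}.yml"
```

This only works for extra vars (and a few other parse-time sources), not for facts set during the run.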
Further Questions and Answers
Ansible: import_playbook fails with variable undefined error
Ansible: Skip import_playbook with variable definition
What's the difference between include_tasks and import_tasks

How can I more easily delegate all tasks in a role in Ansible?

I'm still somewhat new to Ansible so I'm sure this isn't the proper way of doing this, but it's what I've come up with considering the requirements I was given.
I have to perform tasks on a server, which I do not have credentials to access since they are locked in a vault. My way of working around this is to get the credentials from the vault, then delegate tasks to that server. I've accomplished this, but I'm wondering if there is a cleaner or more adequate way of doing it. So, here's my setup:
I have a playbook that just has:
---
- hosts: localhost
  roles:
    - role: get_credentials   # <-- Not the real role names
    - role: use_credentials
Basically, get_credentials gets some credentials from a vault and then use_credentials performs tasks, but each task has
delegate_to: protected_server
vars:
  ansible_ssh_user: "{{ user }}"
  ansible_ssh_pass: "{{ password }}"
at the end of it
Is there a way I can delegate all the tasks in use_credentials without having to delegate each task individually?
I'd move both of your roles from the roles: section to tasks:, using include_role. Something like this:
tasks:
  - name: Get credentials
    include_role:
      name: get_credentials   # I expect this one to set_fact user_from_get_credential and password_from_get_credential
    delegate_to: protected_server

  - name: Use credentials
    include_role:
      name: use_credentials
    vars:
      ansible_ssh_user: "{{ user_from_get_credential }}"
      ansible_ssh_pass: "{{ password_from_get_credential }}"
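On Ansible 2.7 and later there is a more direct option: include_role supports an apply keyword, whose keywords are applied to every task inside the role. A sketch, assuming the same role and fact names as above:

```yaml
- name: Use credentials on the protected server
  include_role:
    name: use_credentials
    apply:
      delegate_to: protected_server
  vars:
    ansible_ssh_user: "{{ user_from_get_credential }}"
    ansible_ssh_pass: "{{ password_from_get_credential }}"
```

With this, no task inside use_credentials needs its own delegate_to line.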

Change ansible inventory based on variables

Can a playbook load inventory list from variables? So I can easily customize the run based on chosen environment?
tasks:
  - name: include environment config variables
    include_vars:
      file: "{{ item }}"
    with_items:
      - "../../environments/default.yml"
      - "../../environments/{{ env_name }}.yml"

  - name: set inventory
    set_fact:
      inventory.docker_host = " {{ env_docker_host }}"
Yes. Use the add_host module: https://docs.ansible.com/ansible/latest/modules/add_host_module.html
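A minimal sketch of that approach (the group name dockerhosts and the variable env_docker_host are assumptions based on the question):

```yaml
- hosts: localhost
  tasks:
    - name: include environment config variables
      include_vars:
        file: "../../environments/{{ env_name }}.yml"

    - name: add the environment's docker host to the in-memory inventory
      add_host:
        name: "{{ env_docker_host }}"
        groups: dockerhosts

- hosts: dockerhosts
  tasks:
    - ping:
```

The added host exists only for the duration of the run, which is usually exactly what is wanted for environment switching.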
As I'm on Ansible 2.3 I can't use the add_host module (see Jack's answer and the add_host docs), and that would be the superior solution. Therefore, I'll use a different trick: augment an existing Ansible inventory file, reload it, and use it.
hosts.inv

[remotehosts]

main.yml

- hosts: localhost
  pre_tasks:
    - name: include environment config variables
      include_vars:
        file: "{{ item }}"
      with_items:
        - "../environments/default.yml"
        - "../environments/{{ env_name }}.yml"

    - name: inventory facts
      run_once: true
      set_fact:
        my_host: "{{ env_host_name }}"

    - name: update inventory for env
      local_action: lineinfile path=hosts.inv regexp={{ my_host }} insertafter="[remotehosts]" line={{ my_host }}

    - meta: refresh_inventory

- hosts: remotehosts
  ...
The pre_tasks process the environment YAML files, with all the variable replacement etc., and use them to populate hosts.inv before reloading via refresh_inventory.
Any tasks defined beneath - hosts: remotehosts will then execute on the remote host or hosts.

How do I apply an Ansible task to multiple hosts from within a playbook?

I am writing an ansible playbook to rotate IAM access keys. It runs on my localhost to create a new IAM Access Key on AWS. I want to push that key to multiple other hosts' ~/.aws/credentials files.
---
- name: Roll IAM access keys
  hosts: localhost
  connection: local
  gather_facts: false
  strategy: free
  roles:
    - iam-rotation
In the iam-rotation role, I have something like this:
- name: Create new Access Key
  iam:
    iam_type: user
    name: "{{ item }}"
    state: present
    access_key_state: create
    key_count: 2
  with_items:
    - ansible-test-user
  register: aws_user

- set_fact:
    aws_user_name: "{{ aws_user.results.0.user_name }}"
    created_keys_count: "{{ aws_user.results.0.created_keys | length }}"
    aws_user_keys: "{{ aws_user.results[0]['keys'] }}"
I want to push the newly created access keys out to the Jenkins builders. How would I use the list of hosts from with_items in the task? The debug task is just a placeholder.
# Deploy to all Jenkins builders
- name: Deploy new keys to jenkins builders
  debug:
    msg: "Deploying key to host {{ item }}"
  with_items:
    - "{{ groups.jenkins_builders }}"
Hosts file that includes the list of hosts I want to apply to
[jenkins_builders]
builder1.example.org
builder2.example.org
builder3.example.org
I am executing the playbook on localhost. But within the playbook I want one task to execute on remote hosts which I'm getting from the hosts file. The question was...
How would I use the list of hosts from with_items in the task?
Separate the tasks into two roles. Then execute the first role against localhost and the second one against jenkins_builders:
---
- name: Rotate IAM access keys
  hosts: localhost
  connection: local
  gather_facts: false
  strategy: free
  roles:
    - iam-rotation

- name: Push out IAM access keys
  hosts: jenkins_builders
  roles:
    - iam-propagation
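Note that facts set in the first play live on localhost, so the propagation role has to fetch them via hostvars. A sketch of what a task in iam-propagation might look like (the structure of aws_user_keys is an assumption, since the iam module's full return format isn't shown in the question):

```yaml
- name: Write the new access key to the builder's credentials file
  lineinfile:
    path: ~/.aws/credentials
    regexp: '^aws_access_key_id'
    line: "aws_access_key_id = {{ hostvars['localhost'].aws_user_keys[-1].access_key_id }}"
```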
Per AWS best practices recommendations, if you are running an application on an Amazon EC2 instance and the application needs access to AWS resources, you should use IAM roles for EC2 instead of keys:
https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-ec2.html
You can use

delegate_to: servername

on the task, and that task will then run only on the particular server.
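For completeness, a delegated task in that style might look like this (the host name is taken from the inventory example above; the source file is hypothetical):

```yaml
- name: Copy the new credentials to one specific builder
  copy:
    src: files/credentials        # hypothetical local file holding the new key
    dest: ~/.aws/credentials
  delegate_to: builder1.example.org
```

This is handy for a single host; for the whole jenkins_builders group, the two-play approach above scales better.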

Ansible - Get Facts from Remote Windows Hosts

I am using Ansible / Ansible Tower and would like to determine what facts are available on my Windows host. The documentation states that I can run the following:
ansible hostname -m setup
How would I incorporate this into a playbook I run from Tower so I can gather the information from hosts?
Here is the current Playbook per the assistance given:
# This play outputs the facts of the Windows targets in scope
- name: Gather Windows Facts
  hosts: "{{ target }}"
  gather_facts: yes
  tasks:
    - setup:
      register: ansible_facts
    - debug: item
      with_dict: ansible_facts
However, running this produces the following error:
ERROR! this task 'debug' has extra params, which is only allowed in
the following modules: command, shell, script, include, include_vars,
add_host, group_by, set_fact, raw, meta
Use gather_facts, which is true by default. It is equivalent to running the setup module.

- hosts: ....
  gather_facts: yes
The facts are saved in ansible variables to be used in playbooks. See System Facts
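A minimal play that simply dumps everything gather_facts collected for each host (the group name windows is an assumption about your inventory):

```yaml
- hosts: windows
  gather_facts: yes
  tasks:
    - name: Show all gathered facts for this host
      debug:
        var: ansible_facts
```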
There are many ways to display the Ansible facts. For you to understand how it works, try the following (note the debug task needs var=item, which is what was missing in your playbook above):

- hosts: 127.0.0.1
  gather_facts: true
  tasks:
    - setup:
      register: ansible_facts
    - debug: var=item
      with_dict: ansible_facts
Testing and working through it, this is working for me:
- name: Gather Windows Facts
  hosts: "{{ target }}"
  tasks:
    - debug: var=vars
    - debug: var=hostvars[inventory_hostname]
