Save playbook information to hosts file - ansible

I have a playbook that spins up a new droplet on DigitalOcean using the core module built into Ansible:
- name: Provision droplet on DigitalOcean
  local_action: digital_ocean
                state=present
                ssh_key_ids=1234
                name=mydroplet
                client_id=ABC
                api_key=ABC
                size_id=1
                region_id=2
                image_id=3
                wait_timeout=500
  register: my_droplet

- debug: msg="ID is {{ my_droplet.droplet.id }}"
- debug: msg="Your droplet has the IP address of {{ my_droplet.droplet.ip_address }}"
I run this using (note the -c local connection argument):
ansible-playbook playbooks/create_droplet.yml -c local -i playbooks/hosts
My hosts file initially looks like this:
[production]
TBA
[localhost]
localhost
When the above playbook finishes I can see the debug information in STDOUT:
ok: [localhost] => {
    "msg": "Your droplet has the IP address of 255.255.255.255"
}
Is there any way for this playbook to retain the my_droplet.ip_address variable and save the TBA in the hosts file instead of having to manually copy-pasta it there? I ask because I want to add this provisioning playbook to a ruby script that subsequently bootstraps the VPS with another playbook.

I am doing the same, having written a play that creates servers from a dict (approximately 53 servers at a go, dynamically creating a full test environment). To use a static hosts file, I add the following:
- name: Create in-memory inventory
  add_host:
    name: "{{ item.value.ServerName }}"
    groups: "{{ item.value.RoleTag }},tag_Environment_{{ env_name }}"
  when: item.value.template is defined
  with_dict: server_details

- name: create file
  become: yes
  template:
    src: ansible-hosts.j2
    dest: "{{ wm_inventory_file }}"
The ansible-hosts.j2 template is simply:
{% for item in groups %}
[{{item}}]
{% for entry in groups[item] %}
{{ entry }}
{% endfor %}
{% endfor %}
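For reference, a rendered result might look like the following (host names here are hypothetical; note that groups also contains Ansible's implicit all and ungrouped groups, so those sections appear as well):
[all]
web-01
db-01
[webservers]
web-01
[tag_Environment_test]
web-01
db-01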

I'm launching instances with ec2, and I originally wanted to do the same thing. I found some examples using lineinfile and massaged them to do the same thing, like so:
- name: Launching new instances for {{ application }}
  local_action: ec2 group={{ security_group }} instance_type={{ instance_type }} image={{ image }} wait=true region={{ region }} keypair={{ keypair }} vpc_subnet_id={{ subnet }} count={{ instance_count }}
  register: ec2

- name: Add instance to local host group
  local_action: lineinfile dest=ansible_hosts regexp="{{ item.public_ip }}" insertafter="\[{{ application }}\]" line="{{ item.public_ip }} ansible_ssh_private_key_file=~/ec2-keys/{{ keypair }}.pem" state=present
  with_items: ec2.instances
But, I have to agree that it's generally not something you want to do. I found it to be quite a pain. I've since switched to using add_host and life is much better. BTW, application would be the value I used for the group name...
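A minimal sketch of the add_host variant, assuming the same registered ec2 result (the launched group name is arbitrary):
- name: Add instance to in-memory host group
  add_host:
    hostname: "{{ item.public_ip }}"
    groupname: launched
    ansible_ssh_private_key_file: "~/ec2-keys/{{ keypair }}.pem"
  with_items: ec2.instances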

Is there any way for this playbook to retain the my_droplet.ip_address
variable and save the TBA in the hosts file instead of having to
manually copy-pasta it there?
You can retain the IP address of your new host by using the add_host module, which lets you dynamically modify the in-memory inventory during an ansible-playbook run. This is useful when you want to provision a new host and then configure it in a single playbook.
For example
local_action: >
  add_host
  hostname={{ my_droplet.droplet.id }}
  groupname=launched
And then later in your playbook:
- name: Configure instance(s)
  hosts: launched
  tasks:
    ...
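Note that for Ansible to reach the new host in that second play, the name passed to add_host generally has to be resolvable; a variant of the task above using the droplet's IP instead of its ID (a sketch based on the same registered result):
local_action: >
  add_host
  hostname={{ my_droplet.droplet.ip_address }}
  groupname=launched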
As for the second part of your question:
... and save the TBA in the hosts file instead of having to
manually copy-pasta it there?
There is no built-in Ansible way to write additions to the inventory file on disk, and it's generally not something you want to do. You would need to add the host manually, or use a dynamic inventory script to discover it for future configuration runs.
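That said, if you do want to persist the address into the static hosts file, a lineinfile task along these lines could replace the TBA placeholder (a sketch, assuming the my_droplet variable registered above and the playbooks/hosts path from the question):
- name: Replace the TBA placeholder with the droplet IP
  local_action: lineinfile dest=playbooks/hosts regexp='^TBA$' line='{{ my_droplet.droplet.ip_address }}'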

You should use a dynamic inventory script for this. Using the droplet name to distinguish instances, you can refer to your droplets in subsequent runs.
Check https://github.com/ansible/ansible/blob/devel/contrib/inventory/digital_ocean.py for an example script.
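Typical usage is to point -i at the script instead of a static file (assuming API credentials are configured as described in the script's documentation; configure_droplet.yml is a hypothetical follow-up playbook):
ansible-playbook -i digital_ocean.py configure_droplet.yml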

Related

Ansible use 'raw' module to scp from one host to list of hosts (preferably in inventory)

I am limited to using the raw module in this task, which is what is causing the complications.
I simply want to grab a file from the local host, and scp it to the list of hosts in my inventory file. I don't want to have to set up ssh keys from all destination hosts to the source host, so I want the scp to go outwards, from source to list of destination hosts. I could do something along the lines of:
raw: "scp {{ file }} {{ user_id }}#{{ item }}:."
with_items: "{{ list_of_hosts }}"
but then I'd have to define the list of hosts in two places, which, with a dynamic list, you don't want to be doing.
Is there any way to have an inventory file such as:
[src-host]
hostA
[dest-hosts]
host1
host2
host3
[dest-hosts:vars]
user_id="foo"
file="bar"
and a playbook such as:
- hosts: src-host
  tasks:
    - raw: "scp {{ file }} {{ user_id }}@{{ item }}:."
      with_items: "{{ dest-hosts }}"
EDIT: To clarify based on comment 1, I may ONLY use the 'raw' module due to the limitations of the target (destination) host.
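A sketch of how this could work with only raw: the destination list can come straight from the inventory via the groups magic variable, and the group vars are read through hostvars because the task runs on src-host (assuming the inventory above):
- hosts: src-host
  tasks:
    - raw: "scp {{ hostvars[item].file }} {{ hostvars[item].user_id }}@{{ item }}:."
      with_items: "{{ groups['dest-hosts'] }}"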

Ansible wait_for to test firewall openings to a host in the inventory with host alias [duplicate]


Ansible: set_fact variables across hosts blocks [duplicate]


lineinfile module of ansible with delegate_to localhost doesn't write all data to localhost, it writes only 1 random entry on localhost

I have 3 remote VMs and 1 ansible node.
I am getting the hostname of some VMs by running the hostname command on those remote VMs through the Ansible shell module and registering the output in the hostname_output variable.
Then I want to take each VM's IP (collected via gather_facts: True as {{ ansible_default_ipv4.address }}) together with its hostname and append them to a file temp_hostname on localhost, hence I am delegating the task to localhost.
But here is the issue: on the console, the lineinfile module reports that the line has been added for each node, yet when I check the file on localhost, it contains only 1 entry instead of 3.
---
- name: get hostnames of dynamically created VMs
  hosts: all
  remote_user: "{{ remote_user }}"
  gather_facts: True
  tasks:
    - name: save hostname in variable, as this command is executed remotely, and we want the value on the ansible node
      shell: hostname
      register: hostname_output
    - name: writing hostname_output in ansible node in file on ansible node
      lineinfile:
        line: "{{ ansible_default_ipv4.address }} {{ hostname_output.stdout }}"
        dest: temp_hostname
        state: present
      delegate_to: 127.0.0.1
I even tried the copy module, as suggested in Ansible writing output from multiple task to a single file, but that gave the same result, i.e. only 1 entry:
---
- name: get hostnames of dynamically created VMs
  hosts: all
  remote_user: "{{ remote_user }}"
  gather_facts: True
  tasks:
    - name: save hostname in variable, as this command is executed remotely, and we want the value on the ansible node
      shell: hostname
      register: hostname_output
    - name: writing hostname_output in ansible node in file on ansible node
      copy:
        content: "{{ ansible_default_ipv4.address }} {{ hostname_output.stdout }}"
        dest: /volume200gb/sushil/test/code_hostname/temp_hostname
      delegate_to: 127.0.0.1
Finally, when I used the shell module with a redirection operator, it worked as I wanted, i.e. 3 entries in the file on localhost:
---
- name: get hostnames of dynamically created VMs
  hosts: all
  remote_user: "{{ remote_user }}"
  gather_facts: True
  tasks:
    - name: save hostname in variable, as this command is executed remotely, and we want the value on the ansible node
      shell: hostname
      register: hostname_output
    - name: writing hostname_output in ansible node in file on ansible node
      shell: echo -e "{{ ansible_default_ipv4.address }} {{ hostname_output.stdout }}" >> temp_hostname
      delegate_to: 127.0.0.1
I am running this playbook, get_hostname.yml, with the command:
ansible-playbook -i hosts get_hostname.yml --ssh-extra-args="-o StrictHostKeyChecking=no" --extra-vars "remote_user=cloud-user" -vvv
My hosts file is:
10.194.11.86 private_key_file=/root/.ssh/id_rsa
10.194.11.87 private_key_file=/root/.ssh/id_rsa
10.194.11.88 private_key_file=/root/.ssh/id_rsa
I am using ansible 2.1.0.0 with the default ansible.cfg, no modifications.
My question is: why didn't the lineinfile and copy modules work? Did I miss anything or write something wrong?
I tried to reproduce your issue and it did not happen for me, so I suspect this is a problem with your version of ansible; try with the latest.
That being said, I think you might be able to make it work using serial: 1; it is probably an issue with file locking that I don't see happening in ansible 2.3. Also, instead of using a shell task to gather the hostname, you could use the ansible_hostname variable, which is provided as an ansible fact, and you can avoid gathering ALL facts if all you want is the hostname by adding a task for that specifically. In the end, it would look like this:
---
- name: get hostnames of dynamically created VMs
  hosts: all
  serial: 1
  remote_user: "{{ remote_user }}"
  tasks:
    - name: Get hostnames
      setup:
        filter: ansible_hostname
    - name: writing hostname_output in ansible node in file on ansible node
      lineinfile:
        line: "{{ ansible_default_ipv4.address }} {{ ansible_hostname }}"
        dest: temp_hostname
        state: present
      delegate_to: 127.0.0.1
I get inconsistent results using your first code block with lineinfile. Sometimes I get all 3 IPs and hostnames in the destination file and sometimes I only get 2. I'm not sure why this is happening but my guess is that Ansible is trying to save changes to the file at the same time and only one change gets picked up.
The second code block won't work since copy will overwrite the file unless content matches what is already there. The last host that runs will be the only IP/hostname in the destination file.
To work around this, you can loop over your play_hosts (the active hosts in the current play) and reference their variables using hostvars.
- name: writing hostname_output in ansible node in file on ansible node
  lineinfile:
    line: "{{ hostvars[item]['ansible_default_ipv4'].address }} {{ hostvars[item]['hostname_output'].stdout }}"
    dest: temp_hostname
    state: present
  delegate_to: 127.0.0.1
  run_once: True
  with_items: "{{ play_hosts }}"
Or you can use a template with the same logic
- name: writing hostname_output in ansible node in file on ansible node
  template:
    src: IP_hostname.j2
    dest: temp_hostname
  delegate_to: 127.0.0.1
  run_once: True
IP_hostname.j2
{% for host in play_hosts %}
{{ hostvars[host]['ansible_default_ipv4'].address }} {{ hostvars[host]['hostname_output'].stdout }}
{% endfor %}
The problem here is that there are multiple concurrent writes to a single file, which leads to unexpected results.
One solution is to use serial: 1 in your play, which forces non-parallel execution across your hosts, but it can be a performance killer depending on the number of hosts.
I would suggest another solution: instead of writing to a single file, have each delegated task write to its own file (named here after the inventory_hostname value), so there are no concurrent writes.
After that, you can use the assemble module to merge all the files into one. Here is an example (untested):
---
- name: get hostnames of dynamically created VMs
  hosts: all
  remote_user: "{{ remote_user }}"
  gather_facts: True
  tasks:
    - name: save hostname in variable, as this command is executed remotely, and we want the value on the ansible node
      shell: hostname
      register: hostname_output
    - name: deleting tmp folder
      file: path=/tmp/temp_hostname state=absent
      delegate_to: 127.0.0.1
      run_once: true
    - name: create tmp folder
      file: path=/tmp/temp_hostname state=directory
      delegate_to: 127.0.0.1
      run_once: true
    - name: writing hostname_output in ansible node in file on ansible node
      template: src=tpl.j2 dest=/tmp/temp_hostname/{{ inventory_hostname }}
      delegate_to: 127.0.0.1
    - name: assemble hostnames
      assemble: src=/tmp/temp_hostname/ dest=temp_hostname
      delegate_to: 127.0.0.1
      run_once: true
Obviously you have to create the tpl.j2 file.
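Given the register above, a minimal tpl.j2 needs only the one line per host that assemble will concatenate:
{{ ansible_default_ipv4.address }} {{ hostname_output.stdout }}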

Accessing inventory host variable in Ansible playbook

In Ansible 2.1, I have a role being called by a playbook that needs access to a host file variable. Any thoughts on how to access it?
I am trying to access the ansible_ssh_host in the test1 section of the following inventory host file:
[test1]
test-1 ansible_ssh_host=abc.def.ghi.jkl ansible_ssh_port=1212
[test2]
test2-1 ansible_ssh_host=abc.def.ghi.mno ansible_ssh_port=1212
[test3]
test3-1 ansible_ssh_host=abc.def.ghi.pqr ansible_ssh_port=1212
test3-2 ansible_ssh_host=abc.def.ghi.stu ansible_ssh_port=1212
[all:children]
test1
test2
test3
I have tried accessing the variable in the following ways:
{{ hostvars.ansible_ssh_host }}
and
{{ hostvars.test1.ansible_ssh_host }}
I get this error:
fatal: [localhost]: FAILED! => {"failed": true, "msg": "'ansible.vars.hostvars.HostVars object' has no attribute 'ansible'"}
You are on the right track about hostvars.
This magic variable is used to access information about other hosts.
hostvars is a hash with inventory hostnames as keys.
To access fields of each host, use hostvars['test-1'], hostvars['test2-1'], etc.
ansible_ssh_host is deprecated in favor of ansible_host since 2.0.
So you should first remove "_ssh" from inventory hosts arguments (i.e. to become "ansible_user", "ansible_host", and "ansible_port"), then in your role call it with:
{{ hostvars['your_host_group'].ansible_host }}
[host_group]
host-1 ansible_ssh_host=192.168.0.21 node_name=foo
host-2 ansible_ssh_host=192.168.0.22 node_name=bar
[host_group:vars]
custom_var=asdasdasd
You can access host group vars using:
{{ hostvars['host_group'].custom_var }}
If you need a specific value from specific host, you can use:
{{ hostvars[groups['host_group'][0]].node_name }}
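A quick sanity check of these lookups (a sketch against the hypothetical host_group inventory above; hostvars is keyed by host names, and group vars are flattened onto each member host):
- hosts: localhost
  gather_facts: no
  tasks:
    - debug:
        msg: "{{ hostvars[groups['host_group'][0]].node_name }}"
    - debug:
        msg: "{{ hostvars['host-1'].custom_var }}"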
You should be able to use the variable name directly
ansible_ssh_host
Or you can go through hostvars without having to specify the host literally
by using the magic variable inventory_hostname
hostvars[inventory_hostname].ansible_ssh_host
I struggled with this, too. My specific setup is your host.ini (with the modern names):
[test3]
test3-1 ansible_host=abc.def.ghi.pqr ansible_port=1212
test3-2 ansible_host=abc.def.ghi.stu ansible_port=1212
plus a play fill_file.yml
---
- remote_user: ec2-user
  hosts: test3
  tasks:
    - name: fill file
      template:
        src: file.j2
        dest: filled_file.txt
plus a template file.j2, like
{% for host in groups['test3'] %}
{{ hostvars[host].ansible_host }}
{% endfor %}
This worked for me, the result is
abc.def.ghi.pqr
abc.def.ghi.stu
I have to admit it's ansible 2.7, not 2.1. The template is a variation of an example in https://docs.ansible.com/ansible/latest/user_guide/playbooks_variables.html.
The accepted answer didn't work in my setup. With a template
{{ hostvars['test3'].ansible_host }}
my play fails with "AnsibleUndefinedVariable: \"hostvars['test3']\" is undefined" .
Remark: I tried some variations, but failed, occasionally with "ansible.vars.hostvars.HostVars object has no element"; some of this might be explained by what they say in https://github.com/ansible/ansible/issues/13343#issuecomment-160992631
hostvars emulates a dictionary [...]. hostvars is also lazily loaded
I've also found a nice and simple way to address hostvars, right in one of Ansible's GitHub issues.
Looks like you can do this as well:
- debug:
    msg: "{{ ansible_ssh_host }}"
Thanks a lot, this note was very useful for me!
I was able to use the variable defined under group_vars/vars in the ansible playbook, as indicated below.
tasks:
  - name: check service account password expiry
    command: sh /home/monit/get_ldap_attr.sh {{ item }} {{ LDAP_AUTH_USR }}
