Is it possible to use dynamic host_vars files? - ansible

I'm aware I can add host variables dynamically via an inventory script. I'm wondering if I can put scripts in the host_vars directory which will be executed instead of simply read.
I have tried to create a simple script that outputs some variables. It seems only files with a .json or .yml extension, or with no extension at all, are read by ansible-playbook. Since these are not executed, the raw source results in an error.
Hence the question: is this even possible, and if not, are you aware of a method to achieve the same result, i.e. query a (local) dynamic source for the variables of a particular host?

I'm pretty sure lookup("pipe") will do what you want; keep in mind that lookup plugins run on the control machine, so that's where the script needs to be available:
- set_fact:
    my_vars: '{{ lookup("pipe", "./my_script.py") | from_json }}'
(substituting from_json with from_yaml or whatever is needed to coerce the textual output from the script into a Python data structure; it's possible that Ansible would coerce it automagically, but explicit is better than implicit)
If you want the resulting variables available to every host in the play, rather than just the host that ran the set_fact:, you'll likely have to do some hoopjumpery with run_once: or delegate_to: and some hostvars ninjary to promote the fact from that one host over to all playbook hosts.
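A minimal sketch of that promotion pattern, reusing the script name from the example above (the run_once:/hostvars combination is my assumption, not something spelled out in the original answer):
- hosts: all
  gather_facts: false
  tasks:
    # lookup('pipe') runs ./my_script.py on the control machine; run_once
    # keeps it from being evaluated separately for every host in the play.
    - set_fact:
        my_vars: "{{ lookup('pipe', './my_script.py') | from_json }}"
      run_once: true

    # Any host can reach the value through hostvars of the host that ran
    # the task (the first host in the play).
    - debug:
        msg: "{{ hostvars[ansible_play_hosts[0]].my_vars }}"
The hostvars indirection works whether or not your Ansible version broadcasts run_once facts to the whole play.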

Related

Is there any way to use environmental variables inside Ansible's AWS dynamic inventory file (aws_ec2)

I'd like to use an environment variable inside an aws_ec2 inventory file, for the simple case of being able to easily separate different environments. Let's take this configuration as an example:
plugin: aws_ec2
filters:
  # tag:Cust_code: "01"
  tag:Cust_code: "{{ lookup('env','CUSTOMER_CODE') }}"
While the first filter line (shown commented out) works, the second obviously doesn't, and an empty host list is returned:
$ export CUSTOMER_CODE="01"
$ echo $CUSTOMER_CODE
01
$ ansible-inventory -i inventory/aws_ec2.yaml --graph
@all:
  |--@aws_ec2:
  |--@ungrouped:
I've read that the reason is that Jinja2 templates are not supported in inventory files, even though they seem to work for some specific parameters according to this post: https://stackoverflow.com/a/72241930/19407408 .
I don't want to use a dynamic inventory script, because I feel it might be too complicated and I don't understand the official documentation for it. I would also prefer not to use different inventory files for different environments, as I already have to use two different inventory files for the same environment (for some hosts I need to set "ansible_host: private_ip_address" via compose, and for jumphosts I can't), although the latter will have to be the solution if there's no better alternative.
Has anyone been able to come up with a clever solution to this problem?
No, filters: (and its exclude_filters: and include_filters: siblings) are not Jinja2-aware. The good thing about Ansible being open source is that one can see under the hood how things work:
- _query applies the include and exclude filters
- ansible_dict_to_boto3_filter_list merely pivots tag:Name=whatever over to the [{"Name":"tag:Name","Values":["whatever"]}] format that boto wants, without further touching the key or the values
- _get_instances_by_region just calls describe-instances with those filters, again without involving Jinja2
Depending on how many instances the unfiltered list contains, using groups: or keyed_groups: may be an option along with a parameterized - hosts: in your playbook (e.g. - hosts: cust_code{{ CUSTOMER_CODE }})
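As a sketch of that keyed_groups: route (the tag name comes from your config above; the cust_code variable passed with -e is my own illustration):
# inventory/aws_ec2.yaml -- no filters; group every instance by its tag instead
plugin: aws_ec2
keyed_groups:
  - key: tags.Cust_code
    prefix: cust_code

# playbook -- pick the group at runtime, e.g. ansible-playbook site.yml -e cust_code=01
- hosts: "cust_code_{{ cust_code }}"
  tasks:
    - ping: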
Otherwise, I'd guess your best bet would be to use add_host: in a separate play in the playbook, since that allows you to have almost unlimited customization:
- hosts: localhost
  tasks:
    - add_host:
        name: ...
        groups:
          - cust_code_machines

- hosts: cust_code_machines
  tasks:
    - debug: msg="off to the races"
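As a rough sketch of how that first play might be filled in (the amazon.aws.ec2_instance_info query and the use of private_ip_address are my assumptions, not part of the original answer):
- hosts: localhost
  gather_facts: false
  tasks:
    # Query EC2 from the control machine; env lookups work fine here.
    - amazon.aws.ec2_instance_info:
        filters:
          "tag:Cust_code": "{{ lookup('env', 'CUSTOMER_CODE') }}"
      register: ec2

    # Add each matching instance to an in-memory group for the next play.
    - add_host:
        name: "{{ item.private_ip_address }}"
        groups:
          - cust_code_machines
      loop: "{{ ec2.instances }}"
The second play (- hosts: cust_code_machines) then runs against whatever was added.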

Ansible: How to use multi-value data list as inventory and pass to target host

The source of my host inventory is an internal tool that outputs pairs of values. For example, here are six observations; I currently have 160 observations:
servername1 processname1
servername1 processname2
servername1 processname3
servername2 processname1
servername3 processname1
servername4 processname1
So column 1 is my target host list (my inventory). Column 2 contains unique processname values, assigned specifically to the servername on the same line. The same server often occurs more than once: some servers have only one processname, others may have 2 to N, meaning my target host may repeat, each time with a unique processname. I want to use dynamic inventory built from this output list of pairs, and I need both values on each observation to be associated and assigned to variables. I'm not absolutely required to use dynamic inventory; I just need a solution. I also need to pass the value in {{ processname# }} to the target host via the command: or shell: modules. (This is unique; there are no modules related to this need.)
If required, I have a way to filter this data and output it in JSON or YAML format, making a separate YML file for each host. While I'd prefer to process these dynamically, pre-processing the list is acceptable.
Because ansible-playbook requires some known host inventory list, I'm getting stuck understanding how I can create this list from my dynamic output at the time I start the play.
What I've done so far: I've tried reading up and setting these pairs in /etc/ansible/hosts/host_vars/servername#.yml files. This is extremely ugly, as I have to pre-process the output of the data into YML format, and it does not give me a host list to reference in my playbook. So while it seems that host_vars is the logical choice, I cannot get my head around it.
What I need:
- The suggested format of the data? JSON? YAML? Other? (if I cannot read it in dynamically)
- Is putting this in host_vars correct?
- Last night I saw another answer using set_fact; would that help?
Thank you for any insight. I've now been using Ansible for 3.5 weeks! I've done pretty well using static and dynamic inventories, but this stumps me, as the inventory list is not obvious given the format of the matched pairs.
Note: MANY have suggested using host_vars, but that seems to me to be reserved for hostnames and related port and proxy values. I could be wrong.
===================================================================
UPDATE: Thanks for the help in the right direction.
I have updated our inventory script. The first new option is to output the host list in JSON.
Example:
{"my_host": ["servername1", "servername2"]}
Calling this as a dynamic inventory script works great!
ansible all -m ping
servername1 | SUCCESS => {
    "changed": false,
    "ping": "pong"
}
servername2 | SUCCESS => {
    "changed": false,
    "ping": "pong"
}
Next: the second new option added to the inventory script is a switch that takes a hostname as input. This part is still confusing me. Here is the output:
showInv --host=servername1
{"servername1": ["processname1", "processname2", "processname3"]}
The final part that I am missing is how to call the inventory script with a specific --host={{ my_host }} from within my playbook.
It seems that I need to find the variable holding the current hostname and pass that back to the inventory script via the --host= switch.
You say that you are OK with dynamic inventories. Make your own.
Here are the docs.
You need to make a script that will do two things:
when executed with --list, it processes your file and prints this JSON to stdout:
{ "myhosts": ["servername1", "servername2", "servername3"] }
when executed with --host servername1, it prints this JSON to stdout:
{ "myprocesses": ["processname1", "processname2"] }
So with --list you should provide a unique list of hosts. In my example they belong to the myhosts group.
And with --host <hostname> you should provide a dict of host vars for that host (<hostname>). In my example there is a list variable myprocesses that contains all processes for that host.
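A side note the original answers don't mention: if the --list output also carries a top-level _meta key with per-host variables, Ansible skips the per-host --host calls entirely, which is noticeably faster for large inventories. A sketch of that output shape:
{
    "myhosts": {
        "hosts": ["servername1", "servername2"]
    },
    "_meta": {
        "hostvars": {
            "servername1": {"myprocesses": ["processname1", "processname2"]},
            "servername2": {"myprocesses": ["processname1"]}
        }
    }
}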
Then just call ansible-playbook -i my_inv_script myplaybook.yml.
Example playbook:
---
- hosts: myhosts
  tasks:
    - debug:
        msg: "Process name is {{ item }}"
      with_items: "{{ myprocesses }}"
This playbook will go through all hosts in your dynamic inventory and print all processes for each host.
You will need to develop a dynamic inventory script that takes the first column as the hostname and the second column as variables for that host.
Please find below the link to my dynamic inventory script, written in PHP:
https://github.com/walden-it/ansible-ij/blob/master/inventory.php
Take a look at the functions get_vars() and get_hosts() to see how the array is being populated.
And in case you need it, here is the dump for the database this script is looking at:
https://github.com/walden-it/ansible-ij/blob/master/ansible.sql
Then you just specify it with -i inventory.php in the Ansible run, or set it as the inventory path in your ansible.cfg.
Closing this out. With the help of Konstantin's suggestions, I now have a working play. What is not immediately apparent is that Ansible is doing some "magic" behind the scenes. I had to modify the inventory script that generates my dynamic inventory to accept the --list switch and the --host <hostname> option.
Once this was done, I could run the playbook with -i listInv, and Ansible internally calls the script as listInv --list, which produces my dynamic inventory list. It then internally calls the script as listInv --host <hostname> for each host, and the matching processnames come back as host variables for the with_items loop.
Additionally, the JSON output generated by my script had to use "myprocess" as the "group" (first) field. Initially I had it as "my_process" and this failed; removing the underscore fixed that error.
All working now. This is a great example for learning, but it still feels like magic.
The playbook looks like this:
- hosts: all
  gather_facts: no
  connection: local
  tasks:
    - debug:
        msg: "Process name is {{ item }}"
      with_items: "{{ myprocess }}"

Defining host as variable in Ansible hosts file

I have a rather simple hosts file
[clients]
qas0062
[dbs_server]
qas0063
For the users of the project we don't want them to modify the hosts file; instead we have a separate user.config.yml file that contains various user-configurable parameters. There we have an entry such as
dbs_server: qas0065
So the question is: is it possible to use a variable in the hosts file that would take the value defined in user.config.yml? And what would the format be?
Pretty sure you can't templatize the actual host key entry in the inventory, but you can templatize the value of its ansible_host connection var to achieve roughly the same effect, eg:
[clients]
clienthost ansible_host="{{ clienthost_var }}"
[dbs_server]
dbsserver ansible_host="{{ dbsserver_var }}"
then set the value of those vars from external vars before the play starts executing (eg, with the vars_files directive or -e).
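For instance, a user.config.yml along those lines (reusing the variable names from the inventory snippet above; the exact layout is just an illustration) could be loaded with ansible-playbook -e @user.config.yml site.yml or via a vars_files entry:
# user.config.yml
clienthost_var: qas0062
dbsserver_var: qas0065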
There is another way to do the same thing. We can simply refer to values in the hosts (inventory) file by using the following syntax in our playbook
host={{ groups['dbs_server'][0] }}
This works well when you have one entry in the group (dbs_server in this specific case).
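A minimal sketch of that usage (group name taken from the hosts file above):
- hosts: clients
  tasks:
    # groups is a built-in magic variable mapping group names to host lists
    - debug:
        msg: "The DB server is {{ groups['dbs_server'][0] }}"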

Read Ansible command invocation

Is it possible to detect within a playbook what the Ansible invocation was? Specifically, I'd like to detect whether the --ask-vault-pass option was supplied and, if not, exit the playbook.
I think the easiest way might be to use another task to check the content of your file. For example,
- name: check if a string is in the content
  failed_when: "'expected_string' not in file.content"
If the vault password or password file is not supplied, the file cannot be decrypted. However, you will also need to make sure ansible.cfg is not already configured with a vault password file path.
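One hedged way to turn that idea into a guard play (the file and variable names here are illustrative, not from the original answer): keep a small vault-encrypted canary.yml containing vault_canary: "ok". If no vault password was supplied, include_vars fails to decrypt it and the run stops before anything else executes:
- hosts: localhost
  gather_facts: false
  tasks:
    # Fails with a decryption error when --ask-vault-pass (or a vault
    # password file) was not provided.
    - include_vars: canary.yml

    - assert:
        that:
          - vault_canary == "ok"
        fail_msg: "Vault password missing or wrong"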

Using a variable from one Ansible var file in a second var file

Using Ansible, I'm trying to use a vaulted vars file to store private variables and then use those in another vars file in the same role. (The idea is from 'Vault Pseudo leaf encryption' here.)
e.g. I have one standard vars file, roles/myrole/vars/main.yml:
---
my_variable: '{{ my_variable_vaulted }}'
and then one which is encrypted, roles/myrole/vars/vaulted_vars.yml:
---
my_variable_vaulted: 'SECRET!'
But when I run the playbook I always get "ERROR! 'my_variable_vaulted' is undefined".
I've tried it without encrypting the second file, to make sure it's not an issue with encryption, and I'm getting the same error.
The reason why my_variable_vaulted wasn't available was that I hadn't included the variable file. I'd assumed that all files in a role's vars/ directory were picked up automatically, but I think that's only the case for vars/main.yml.
So, to make the vaulted variables available to all tasks within the role, in roles/myrole/tasks/main.yml I added this before all the tasks:
- include_vars: vars/vaulted_vars.yml
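In context, the role's tasks/main.yml then starts roughly like this (the debug task is just an illustration):
# roles/myrole/tasks/main.yml
- include_vars: vars/vaulted_vars.yml

- debug:
    msg: "{{ my_variable }}"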
That is not the best way to handle vault in Ansible. A much better approach is outlined in the vault documentation for Ansible. You would create your basic variable for the environment in group_vars/all.yml like this:
my_variable: "{{ vault_my_variable }}"
Then in your inventory you decide which hosts should load which vault file to satisfy this variable. As an example, you can have this in your inventories/main:
[production:children]
myhost1
[development:children]
myhost2
[production_vault:children]
production
[development_vault:children]
development
Then Ansible will automatically fetch production_vault.yml or development_vault.yml from group_vars, depending on which environment the box belongs to. You can then use my_variable in your roles/playbooks as before, without having to worry about fetching it from the right place.
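The matching group_vars files would then look roughly like this (the values are placeholders; the two *_vault.yml files are the ones you actually encrypt with ansible-vault):
# group_vars/all.yml
my_variable: "{{ vault_my_variable }}"

# group_vars/production_vault.yml  (ansible-vault encrypted)
vault_my_variable: "the production secret"

# group_vars/development_vault.yml  (ansible-vault encrypted)
vault_my_variable: "the development secret"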
