Change ansible inventory based on variables

Can a playbook load its inventory list from variables, so I can easily customize the run based on the chosen environment?
tasks:
  - name: include environment config variables
    include_vars:
      file: "{{ item }}"
    with_items:
      - "../../environments/default.yml"
      - "../../environments/{{ env_name }}.yml"
  - name: set inventory
    set_fact:
      inventory.docker_host: "{{ env_docker_host }}"

Yes. Use the add_host module: https://docs.ansible.com/ansible/latest/modules/add_host_module.html
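A minimal sketch of that approach, assuming env_docker_host has already been loaded via include_vars as in the question, and using a made-up group name dynamic_docker_hosts:

- hosts: localhost
  tasks:
    - name: add the configured docker host to the in-memory inventory
      add_host:
        name: "{{ env_docker_host }}"
        groups: dynamic_docker_hosts

- hosts: dynamic_docker_hosts
  tasks:
    - debug:
        msg: "running against {{ inventory_hostname }}"

Note that add_host only changes the in-memory inventory for the current run; nothing is written to disk.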

As I'm on Ansible 2.3, I can't use the add_host module (see Jack's answer and the add_host docs), which would otherwise be the superior solution. Therefore, I'll use a different trick: augment an existing Ansible inventory file, reload it, and use it.
hosts.inv
[remotehosts]
main.yml
- hosts: localhost
  pre_tasks:
    - name: include environment config variables
      include_vars:
        file: "{{ item }}"
      with_items:
        - "../environments/default.yml"
        - "../environments/{{ env_name }}.yml"
    - name: inventory facts
      run_once: true
      set_fact:
        my_host: "{{ env_host_name }}"
    - name: update inventory for env
      local_action:
        module: lineinfile
        path: hosts.inv
        regexp: "{{ my_host }}"
        insertafter: '\[remotehosts\]'
        line: "{{ my_host }}"
    - meta: refresh_inventory

- hosts: remotehosts
  ...
The pre_tasks process the environment YAML files, perform all the variable replacement, and use the result to populate hosts.inv before the inventory is reloaded via refresh_inventory.
Any tasks defined beneath - hosts: remotehosts then execute on the remote host or hosts.
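As a usage note (the file names and the staging value are my assumptions), the playbook would be run with the static file as the inventory and the environment name as an extra variable:

ansible-playbook -i hosts.inv main.yml -e env_name=staging

The localhost play rewrites hosts.inv, refresh_inventory re-reads it, and the remotehosts play then targets the freshly inserted host.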

Related

Access variables at play level

I use localhost and set_fact to store variables and access them in different playbooks.
---
- hosts: localhost
  connection: local
  gather_facts: False
  tasks:
    - name: set_variables
      set_fact:
        cloudinit_fqdn: 'server1.example.com'
        additional_container_config_values:
          security.nesting: 'false'
          security.privileged: 'false'
        cloudinit_network_raw:
          version: 2
          renderer: networkd
          ethernets:
            eth0:
              dhcp4: False
              addresses: [192.168.178.35/24]
              gateway4: 192.168.178.1
              nameservers:
                addresses: [192.168.178.13]
Now I want to use the cloudinit_fqdn at import_playbook:
- name: system configuration
  import_playbook: "{{ hostvars['localhost']['cloudinit_fqdn'] }}_server_config.yml"
I tried different ways to get that variable, but I get errors like:
'ERROR! 'hostvars' is undefined'
I am also not able to access that variable with:
- debug:
    msg: '{{ vars }}'
ERROR! 'debug' is not a valid attribute for a Play
How can I use a variable at play-level?
Regarding your use case, I've set up a short test to get around the syntax errors with the variable, as well as the debug task.
---
- hosts: localhost
  become: false
  gather_facts: false
  tasks:
    - name: Set variables
      set_fact:
        example_fqdn: 'test.example.com'
    - name: Show variables
      debug:
        msg: "{{ hostvars['localhost'].example_fqdn }}"
While the example works, adding
- name: Import playbook
  import_playbook: "{{ hostvars['localhost'].example_fqdn }}.yml"
or even a simple
- name: Import playbook
  import_playbook: "{{ example_fqdn }}.yml"
makes the playbook run fail with
ERROR! 'hostvars' is undefined
ERROR! 'example_fqdn' is undefined
since the import is done at compile time, whereas the variable is only defined at runtime. Also not possible is
- name: Import playbook
  import_playbook: "{{ to_import }}.yml"
  vars:
    to_import: "{{ example_fqdn }}"
as the import is static, not dynamic. According to Importing playbooks and Re-using playbooks, it does not seem to work that way.
What actually works is
- name: Import playbook
  import_playbook: test.example.com.yml
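One workaround that stays within this parse-time constraint (my addition, not part of the answer above): a variable that already exists at parse time, such as an extra var passed on the command line, can be used in the import. A minimal sketch with a made-up variable name target_env:

- name: Import playbook
  import_playbook: "{{ target_env }}_server_config.yml"

invoked as, for example, ansible-playbook site.yml -e target_env=test.example.com.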
Further Questions and Answers
Ansible: import_playbook fails with variable undefined error
Ansible: Skip import_playbook with variable definition
What's the difference between include_tasks and import_tasks

How to pass a files/ folder in ansible include_role option

I am trying to reuse an existing role via the include_role feature in Ansible, but I cannot seem to find a way to pass a file from the calling role's files/ folder; it always uses the file (testrole1.yaml) from the common role's files/ directory.
Here is the structure and code I came up with so far:
---
- name: importing tasks from role1
  include_role:
    name: service-deploy-role1
    tasks_from: "{{ item }}"
  loop:
    - install
    - setup
The above code always uses the testrole1.yaml file. Is it possible to pass testrole2.yml when I call the install task from service-deploy-role1?
I managed to figure out a solution:
---
- name: workaround
  set_fact:
    role_location: "{{ role_path }}"

- name: debug role path
  debug:
    msg: "{{ role_location }}"

- name: importing tasks from role1
  include_role:
    name: service-deploy-role1
    tasks_from: "{{ item }}"
  vars:
    role_dir: "{{ role_location }}"
  loop:
    - install
    - setup
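For completeness, a hedged sketch of how a task inside service-deploy-role1 could then pick up a file shipped by the calling role; the copy task, the /tmp destination, and the testrole2.yaml name are assumptions, not taken from the answer above:

- name: use a file shipped by the calling role
  copy:
    # role_dir was passed in via include_role vars and points at the calling role
    src: "{{ role_dir }}/files/testrole2.yaml"
    dest: /tmp/testrole2.yaml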

ansible with_items pass variables from outside file

I just want to put a list of RPM packages in a YAML file and use it with with_items inside my tasks.
What format should the YAML file have? I have googled a lot and am still confused. The goal is to be able to change only the package names in the outside file, without changing the main file.
Ex: files.yaml
---
- vars:
    modules:
      - firmware-system-p89-2.56_2018_01_22-1.1.i386.rpm
      - firmware-smartarray-ea3138d8e8-6.30-1.1.x86_64.rpm
=> passing to with_items in another file
---
- name: List required packages
  include_vars:
    - files.yml
  set_fact: pkglist="{{ item }}"
  with_items:
    - "{{ modules }}"
  register: pkglist_result
Comment:
Thanks a lot, that helps.
I did the following to finally accomplish it; I wasn't going about it right. I only listed the package names in files.yml and placed the actual packages in the files directory that sits next to the tasks directory.
- name: List required packages
  include_vars: files.yml
  register: pkglist_result

- name: make a list
  set_fact: pkg_list="{{ pkglist_result.ansible_facts.modules }}"

- debug: var=pkg_list
files.yml:
---
modules:
  - firmware-system-p89-2.56_2018_01_22-1.1.i386.rpm
  - firmware-smartarray-ea3138d8e8-6.30-1.1.x86_64.rpm
example playbook:
---
- hosts: my_hosts
  vars_files:
    - files.yml
  tasks:
    - name: print module name one by one
      debug:
        msg: "{{ item }}"
      with_items: "{{ modules }}"
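Building on the comment above, a hedged sketch of actually shipping and installing those RPMs; the /tmp destination and the yum usage are my assumptions, not part of the original answer:

- name: copy RPMs from the role's files/ directory to the host
  copy:
    src: "{{ item }}"
    dest: "/tmp/{{ item }}"
  with_items: "{{ modules }}"

- name: install the copied RPMs
  yum:
    name: "/tmp/{{ item }}"
    state: present
  with_items: "{{ modules }}"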

Read a file locally and use the vars remote in Ansible

I read a YAML file locally with the following playbook:
- name: Ensure the deploy_manifest var is defined and read deploy manifest
  hosts: localhost
  connection: local
  gather_facts: False
  tasks:
    - assert:
        that: deploy_manifest is defined
        msg: |
          Error: Must provide providers config path. Fix: Add '-e deploy_manifest=/path/to/manifest' to the ansible-playbook command
    - name: Read deploy manifest
      include_vars:
        file: "{{ deploy_manifest }}"
        name: manifest
      register: manifest
    - debug:
        msg: "[{{ manifest.key }}]: {{ manifest.value }}"
      with_dict: "{{ manifest.ansible_facts }}"
and then in the same playbook YAML file I run:
- name: Deploy Backend services
  hosts: backend
  remote_user: ubuntu
  gather_facts: False
  vars:
    env: "{{ env }}"
    services: "{{ manifest.ansible_facts }}"
  tasks:
    - include_role:
        name: services_backend
      when: backend | default(true) | bool
However, it doesn't work because the debug task fails: it says that manifest is empty.
What is the best way to read a YAML file (or, generally, a configuration) in a playbook and then have the variables passed to another play?
Your debug task doesn't say "that manifest is empty"; it says the key manifest.key does not exist, because it does not.
You registered a fact named manifest with:
    register: manifest
You then try to refer to a key of the above manifest named key and another key (!) named value:
    msg: "[{{ manifest.key }}]: {{ manifest.value }}"
Please read the Looping over Hashes chapter and note that (without using loop_control) you refer to the iterated variable using item.
Please also note that with name: manifest and register: manifest you read your vars file into manifest.ansible_facts.manifest.
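Putting those observations together, a minimal sketch of a corrected debug task (my reconstruction, not verbatim from the answer above):

- debug:
    msg: "[{{ item.key }}]: {{ item.value }}"
  with_dict: "{{ manifest.ansible_facts.manifest }}"

If the values are needed in the second play, they have to be looked up through hostvars['localhost'], because a variable registered on localhost is not automatically visible as a bare name in plays that target other hosts.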

Ansible Dynamic Inventory

I'm running a playbook which houses multiple roles and targets multiple hosts.
The goal is to deploy a VM and use its IP to deploy an app.
My playbook has two roles. Using the "build_vm" role I'm able to display the IP address via debug, yet when passing the ipaddr variable to the second role, Ansible complains that the variable is not defined.
- hosts: linux
  become: true
  roles:
    - build_vm
  tasks:
    - debug: msg="{{ ipaddr }}"

- hosts: "{{ ipaddr }}"
  roles:
    - deploy_app
I have used set_fact as well and ran into the same issue. I wonder what I should be using here; dynamic inventory? I have searched the sparse docs online and I'm unable to find an intuitive example to follow.
There are many ways of using add_host. In this example, I am adding the new host to a group and using it in a later play.
- hosts: linux
  become: true
  roles:
    - build_vm
  tasks:
    - debug: msg="{{ ipaddr }}"
    - name: Add ipaddr to host inventory
      add_host: name="{{ ipaddr }}" group=NewHostGroup

- hosts: NewHostGroup
  roles:
    - deploy_app
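add_host also accepts additional key=value pairs, which become host variables of the new host. That is handy when the freshly built VM needs specific connection settings; a hedged sketch (the ansible_user value is an assumption):

- name: Add ipaddr to host inventory with connection details
  add_host:
    name: "{{ ipaddr }}"
    groups: NewHostGroup
    ansible_user: ubuntu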
