Ansible dynamic inventory: unable to use group_vars - ansible

Here is my directory structure,
├── README.md
├── internal-api.retry
├── internal-api.yaml
├── ec2.py
├── environments
│   ├── alpha
│   │   ├── group_vars
│   │   │   ├── alpha.yaml
│   │   │   ├── internal-api.yaml
│   │   ├── host_vars
│   │   ├── internal_ec2.ini
│   ├── prod
│   │   ├── group_vars
│   │   │   ├── prod.yaml
│   │   │   ├── internal-api.yaml
│   │   │   ├── tag_Name_prod-internal-api-3.yml
│   │   ├── host_vars
│   │   ├── internal_ec2.ini
│   └── stage
│       ├── group_vars
│       │   ├── internal-api.yaml
│       │   ├── stage.yaml
│       ├── host_vars
│       ├── internal_ec2.ini
├── roles
│   ├── internal-api
├── roles.yaml
I am using a separate config for an EC2 instance with tag Name = prod-internal-api-3, so I have defined a separate file, tag_Name_prod-internal-api-3.yaml, in the environments/prod/group_vars/ folder.
Here is my tag_Name_prod-internal-api-3.yaml,
---
internal_api_gunicorn_worker_type: gevent
Here is my main playbook, internal-api.yaml
- hosts: all
  any_errors_fatal: true
  vars_files:
    - "environments/{{env}}/group_vars/{{env}}.yaml" # this has the ssh key, users config according to environments
    - "environments/{{env}}/group_vars/internal-api.yaml"
  become: yes
  roles:
    - internal-api
For prod deployments, I do export EC2_INI_PATH=environments/prod/internal_ec2.ini, and likewise for stage and alpha. In environments/prod/internal_ec2.ini I have added an instance filter: instance_filters = tag:Name=prod-internal-api-3
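So a prod run ends up looking roughly like this (just a sketch; passing env on the command line with -e is simply how I do it here):
export EC2_INI_PATH=environments/prod/internal_ec2.ini
ansible-playbook -i ec2.py internal-api.yaml -e env=prod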
When I run my playbook, I get this error:
fatal: [xx.xx.xx.xx]: FAILED! => {"changed": false, "msg": "AnsibleUndefinedVariable: 'internal_api_gunicorn_worker_type' is undefined"}
It means that it is not able to pick up the variable from the file tag_Name_prod-internal-api-3.yaml. Why is this happening? Do I need to add it manually with include_vars (I don't think that should be the case)?

Okay, so it is really weird, like really really weird. I don't know whether it has been documented or not (please provide a link if it is).
If your tag Name is like prod-my-api-1, then the file name tag_Name_prod-my-api-1 will not work.
Your filename has to be tag_Name_prod_my_api_1: the EC2 dynamic inventory script sanitizes group names and replaces the dashes with underscores, so the group_vars file has to match the sanitized group name. Yeah, thanks ansible for making me cry for 2 days.
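To see the group names ec2.py actually generates, you can dump the inventory directly (a quick sketch; the grep is only there to cut down the output):
EC2_INI_PATH=environments/prod/internal_ec2.ini ./ec2.py --list | grep tag_Name_
Newer versions of ec2.py also have a replace_dash_in_groups setting in ec2.ini that controls this dash-to-underscore substitution.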

Related

Copying entire folder structure from a role merging/overwriting with already existing files

I use Ansible to deploy my user-specific configuration (shell, text editor, etc.) on a newly installed system. That's why I have all config files in my role's files directory, structured the same way they should be placed in my home directory.
What's the correct way to do this? I don't want to list every single file in the role; existing files should be overwritten and existing directories should be merged.
I've tried the copy module, but the whole task is skipped; I assume because the parent directory (.config) already exists.
Edit: added the requested additional information.
Ansible version: 2.9.9
The role's copy task:
- name: Install user configurations
  copy:
    # trailing slash: copy the contents of files/home/ into $HOME,
    # merging into existing directories and overwriting existing files
    src: "home/"
    dest: "{{ ansible_env.HOME }}"
The files to copy in the role directory:
desktop-enviroment
├── defaults
│   └── main.yml
├── files
│   └── home
│       ├── .config
│       │   ├── autostart-scripts
│       │   │   └── ssh-keys.sh
│       │   ├── MusicBrainz
│       │   │   ├── Picard
│       │   │   ├── Picard.conf
│       │   │   └── Picard.ini
│       │   ├── sublime-text-3
│       │   │   ├── Installed Packages
│       │   │   ├── Lib
│       │   │   ├── Local
│       │   │   └── Packages
│       │   └── yakuakerc
│       └── .local
│           └── share
│               ├── plasma
│               └── yakuake
├── handlers
│   └── main.yml
├── meta
│   └── main.yml
├── tasks
│   ├── desktop-common.yaml
│   ├── desktop-gnome.yaml
│   ├── desktop-kde.yaml
│   └── main.yml
├── templates
└── vars
    └── main.yml
The relevant ansible output:
TASK [desktop-enviroment : Install user configurations] **
ok: [localhost]

Ansible Molecule how to use multiple group_vars

I have a folder structure like this in Ansible where global variables are at the root group_vars and environment-specific variables are in inventories/dev/group_vars/all etc.
.
├── ansible.cfg
├── group_vars
│   └── all
├── inventories
│   ├── dev
│   │   ├── group_vars
│   │   │   └── all
│   │   └── hosts
│   └── prod
│       ├── group_vars
│       │   └── all
│       └── hosts
└── playbook.yml
I want to be able to reuse the existing variables from both var files in Molecule, but I am unable to do so as it cannot find the variable. Something similar to the below works, but I need both group_vars/all and inventories/dev/group_vars/all.
Extract of my molecule.yml:
provisioner:
  name: ansible
  inventory:
    links:
      group_vars: ../../../group_vars
I tried comma-separated paths and that doesn't work because, after all, it's just a symlink.

Ansible: how to get access to the main-level roles dir so I can include generic tasks from any playbook?

I have a role which I try to include using include_role.
Here is my file structure:
.
├── foo_A
│   └── roles
│       ├── foo_deploy
├── foo_B
│   └── roles
│       ├── db_foo
│       │   └── tasks
├── foo_C
│   └── roles
│       ├── package_deploy
│       │   ├── defaults
│       │   ├── files
│       │   └── tasks
│       │       └── main.yml
├── group_vars
└── roles
    └── utilities
        ├── defaults
        ├── files
        ├── handlers
        ├── meta
        ├── tasks
        │   └── dpackage.yml
        ├── templates
        └── vars
I'm calling include_role with the utilities name from main.yml, but I'm getting an error that the main-level role is not in the search paths:
ERROR! the role 'utilities' was not found in /home/ec2-user/ansible/foo_C/roles:/home/ec2-user/.ansible/roles:/usr/share/ansible/roles:/etc/ansible/roles:/home/ec2-user/ansible/foo_C
The error appears to be in '/home/ec2-user/ansible/foo_C/roles/package_deploy/tasks/main.yml': line 78, column 11, but may
be elsewhere in the file depending on the exact syntax problem.
The offending line appears to be:
  include_role:
    name: utilities
    ^ here
How can I get access to the main roles dir under /home/ec2-user/ansible?
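One way to make that shared roles directory visible from the sub-projects is to add it to Ansible's role search path, either in ansible.cfg or via the environment. A minimal sketch, assuming the playbooks are run from inside foo_A/foo_B/foo_C:
[defaults]
# search the top-level roles dir first, then the local one
roles_path = /home/ec2-user/ansible/roles:./roles
or, per shell session:
export ANSIBLE_ROLES_PATH=/home/ec2-user/ansible/roles
With either of these in place, include_role with name: utilities should resolve against /home/ec2-user/ansible/roles/utilities.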

Ansible vars in inventories directory not applying

I am using a role (zaxos.lvm-ansible-role) to manage LVMs on a few hosts. Initially I had my vars for the LVMs under host_vars/server.yaml, which works.
Here is the working layout:
├── filter_plugins
├── group_vars
├── host_vars
│   ├── server1.yaml
│   └── server2.yaml
├── inventories
│   ├── preprod
│   ├── preprod.yml
│   ├── production
│   │   ├── group_vars
│   │   └── host_vars
│   ├── production.yaml
│   ├── staging
│   │   ├── group_vars
│   │   └── host_vars
│   └── staging.yml
├── library
├── main.yaml
├── module_utils
└── roles
    └── zaxos.lvm-ansible-role
        ├── defaults
        │   └── main.yml
        ├── handlers
        │   └── main.yml
        ├── LICENSE
        ├── meta
        │   └── main.yml
        ├── README.md
        ├── tasks
        │   ├── create-lvm.yml
        │   ├── main.yml
        │   ├── mount-lvm.yml
        │   ├── remove-lvm.yml
        │   └── unmount-lvm.yml
        ├── tests
        │   ├── inventory
        │   └── test.yml
        └── vars
            └── main.yml
For my environment it would make more sense to have the host_vars under the inventories directory, which is also supported (Alternative Directory Layout) per the Ansible docs.
However, when I change to this layout the vars are not initialized and the LVMs on the host don't change.
├── filter_plugins
├── inventories
│   ├── preprod
│   │   ├── group_vars
│   │   └── host_vars
│   │       ├── server1.yaml
│   │       └── server2.yaml
│   ├── preprod.yml
│   ├── production
│   │   ├── group_vars
│   │   └── host_vars
│   ├── production.yaml
│   ├── staging
│   │   ├── group_vars
│   │   └── host_vars
│   └── staging.yml
├── library
├── main.yaml
├── module_utils
└── roles
    └── zaxos.lvm-ansible-role
        ├── defaults
        │   └── main.yml
        ├── handlers
        │   └── main.yml
        ├── LICENSE
        ├── meta
        │   └── main.yml
        ├── README.md
        ├── tasks
        │   ├── create-lvm.yml
        │   ├── main.yml
        │   ├── mount-lvm.yml
        │   ├── remove-lvm.yml
        │   └── unmount-lvm.yml
        ├── tests
        │   ├── inventory
        │   └── test.yml
        └── vars
            └── main.yml
Any idea why this approach is not working?
Your host_vars directory must reside in ansible's discovered inventory_dir.
With the above file tree, I guess you are launching your playbook with ansible-playbook -i inventories/preprod.yml yourplaybook.yml. In this context, ansible discovers inventory_dir as inventories.
The solution is to move your inventory files inside each directory for your environment, e.g. for preprod => mv inventories/preprod.yml inventories/preprod/
You can then launch your playbook with ansible-playbook -i inventories/preprod/preprod.yml yourplaybook.yml and it should work as you expect.
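With that change, the preprod part of the tree looks like this (a sketch based on the layout above; main.yaml is the playbook from the question):
├── inventories
│   ├── preprod
│   │   ├── preprod.yml
│   │   ├── group_vars
│   │   └── host_vars
│   │       ├── server1.yaml
│   │       └── server2.yaml
ansible-playbook -i inventories/preprod/preprod.yml main.yaml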

Callback plugin didn't work with Ansible v2.0

I am using Ansible v2.0 and this plugin, which shows the time that each task consumes. Here is my directory structure:
.
├── aws.yml
├── callback_plugins
│   ├── profile_tasks.py  
├── inventory
│   └── hosts
├── roles
│   ├── ec2instance
│   │   ├── defaults
│   │   │   └── main.yml
│   │   └── tasks
│   │       └── main.yml
│   ├── ec2key
│   │   ├── defaults
│   │   │   └── main.yml
│   │   └── tasks
│   │       └── main.yml
│   ├── ec2sg
│   │   ├── defaults
│   │   │   └── main.yml
│   │   └── tasks
│   │       └── main.yml
│   ├── elb
│   │   ├── defaults
│   │   │   └── main.yml
│   │   └── tasks
│   │       └── main.yml
│   ├── rds
│   │   ├── defaults
│   │   │   └── main.yml
│   │   └── tasks
│   │       └── main.yml
│   └── vpc
│       ├── defaults
│       │   └── main.yml
│       └── tasks
│           └── main.yml
└── secret_vars
    ├── backup.yml
    └── secret.yml
But when I run the playbook, it doesn't show the result. Can you please point out where I am making a mistake?
I was able to solve this problem by adding this to the ansible.cfg file:
[defaults]
callback_whitelist = profile_tasks
The plugin is included with Ansible 2.0 and, like most of the included ones, it requires whitelisting in ansible.cfg.
Hope this will help others.
Did you set the callback directory in your ansible.cfg file?
If not, just add an ansible.cfg file at the root level of your directory and specify the path to your callback folder.
Because there are other plugin types, I suggest placing callback_plugins inside a plugins folder.
[defaults]
callback_plugins = ./plugins/callback_plugins
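Putting the two answers together, a minimal ansible.cfg at the root of the project would look something like this (the plugins/ path simply follows the layout suggested above):
[defaults]
# load callback plugins from the project instead of the default locations
callback_plugins = ./plugins/callback_plugins
# profile_tasks ships with Ansible 2.x but only produces output once whitelisted
callback_whitelist = profile_tasks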
