Ansible - Importing group_vars all.yml from a playbook

I have a directory structure like this:
.
├── inventories
│   └── production
│       ├── ansible.cfg
│       ├── group_vars
│       │   ├── all.yml
│       │   └── gitlab.yml
│       ├── hosts
│       └── host_vars
├── playbooks
│   └── gitlab.yml
└── roles
    └── gitlab
        ├── defaults
        │   └── main.yml
        ├── handlers
        │   ├── gitlab.yml
        │   └── main.yml
        ├── meta
        │   └── main.yml
        ├── README.md
        ├── tasks
        │   ├── configure.yml
        │   ├── conf_integrity.yml
        │   ├── hook.yml
        │   ├── install.yml
        │   └── main.yml
        ├── templates
        │   ├── certificates
        │   ├── gitlab.rb.j2
        │   ├── post-receive-gdys.sh.j2
        │   ├── post-receive-mys.sh.j2
        │   ├── post-receive-no-backup-gdys.sh.j2
        │   ├── post-receive-no-backup-mys.sh.j2
        │   ├── post-receive.sh.j2
        │   ├── ssl-crt.j2
        │   └── ssl-key.j2
        └── vars
            ├── conf_list.yml
            ├── hook.yml
            ├── main.yml
            └── package.yml
I want to import the vars in all.yml to all of my hosts (in any role) for the production inventory, and import group/role specific vars (like gitlab.yml) only into the relevant roles. How can I do this? What should the content of my gitlab.yml playbook be? In my setup, Ansible can't import group_vars/all.yml into my playbook's tasks.
inventories/production/hosts:
[ansible]
[gitlab]
gitlab.zek.local
inventories/production/group_vars/all.yml:
gitlab_settings:
  Fqdn: "git.zek.local"
  Rails_shell_ssh_port: "22"
  Use_self_signed_certs: "yes"
  Backup:
    enabled: "no"
    Server: git02.zek.local
    Port: 22
ssh_settings:
  Port: "22"
  PasswordAuthentication: "no"
roles/gitlab/tasks/configure.yml:
- name: Generating ssl cert for GitLab
  command: >
    openssl req -x509 -nodes -subj '/CN={{ gitlab_settings['Fqdn'] }}' -days 365
    -newkey rsa:4096 -sha256 -keyout /etc/gitlab/{{ gitlab_settings['Fqdn'] }}.key -out /etc/gitlab/ssl/{{ gitlab_settings['Fqdn'] }}.crt
    creates=/etc/gitlab/ssl/{{ gitlab_settings['Fqdn'] }}.crt
  when: gitlab_settings['Use_self_signed_certs'] == "yes"
  notify:
    - GitLab servisi yeniden baslatiliyor
  sudo: yes
  tags: ssl
playbooks/gitlab.yml:
---
- hosts: gitlab
  remote_user: zek
  sudo: yes
  vars_files:
    - ../roles/gitlab/vars/package.yml
    - ../roles/gitlab/vars/hook.yml
    - ../roles/gitlab/vars/conf_list.yml
  roles:
    - { role: gitlab }
The command I use to run the playbook:
ansible-playbook -i inventories/production playbooks/gitlab.yml --flush-cache

Use a separate inventory:
inventories/production
inventories/global
In global, put your global vars: inventories/global/group_vars/all.yml
Execute your playbook with the two inventories:
ansible-playbook -i inventories/production -i inventories/global playbooks/gitlab.yml
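When -i is given more than once, Ansible merges all sources into a single in-memory inventory, and the group_vars directory sitting next to each source is loaded for it, so group_vars/all.yml from the global inventory applies to the production hosts as well. A minimal sketch of the global side, assuming you move the shared variables there (contents taken from the question's all.yml):
inventories/global/group_vars/all.yml:
---
# shared across every environment; applies to all hosts
ssh_settings:
  Port: "22"
  PasswordAuthentication: "no"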

Related

Ansible dynamic inventory: unable to use group_vars

Here is my directory structure,
├── README.md
├── internal-api.retry
├── internal-api.yaml
├── ec2.py
├── environments
│   ├── alpha
│   │   ├── group_vars
│   │   │   ├── alpha.yaml
│   │   │   └── internal-api.yaml
│   │   ├── host_vars
│   │   └── internal_ec2.ini
│   ├── prod
│   │   ├── group_vars
│   │   │   ├── prod.yaml
│   │   │   ├── internal-api.yaml
│   │   │   └── tag_Name_prod-internal-api-3.yml
│   │   ├── host_vars
│   │   └── internal_ec2.ini
│   └── stage
│       ├── group_vars
│       │   ├── internal-api.yaml
│       │   └── stage.yaml
│       ├── host_vars
│       └── internal_ec2.ini
├── roles
│   └── internal-api
└── roles.yaml
I am using a separate config for an ec2 instance with tag Name = prod-internal-api-3, so I have defined a separate file, tag_Name_prod-internal-api-3.yaml, in the environments/prod/group_vars/ folder.
Here is my tag_Name_prod-internal-api-3.yaml,
---
internal_api_gunicorn_worker_type: gevent
Here is my main playbook, internal-api.yaml
- hosts: all
  any_errors_fatal: true
  vars_files:
    - "environments/{{env}}/group_vars/{{env}}.yaml" # ssh key and user config per environment
    - "environments/{{env}}/group_vars/internal-api.yaml"
  become: yes
  roles:
    - internal-api
For prod deployments, I do export EC2_INI_PATH=environments/prod/internal_ec2.ini, and likewise for stage and alpha. In environments/prod/internal_ec2.ini I have added an instance filter: instance_filters = tag:Name=prod-internal-api-3
When I run my playbook,
I get this error,
fatal: [xx.xx.xx.xx]: FAILED! => {"changed": false, "msg": "AnsibleUndefinedVariable: 'internal_api_gunicorn_worker_type' is undefined"}
It means that it is not able to pick up the variable from the file tag_Name_prod-internal-api-3.yaml. Why is this happening? Do I need to manually add it with include_vars? (I don't think that should be the case.)
Okay, so it is really weird, like really really weird. I don't know whether it has been documented or not (please provide a link if it is).
If your tag Name is like prod-my-api-1, then the file name tag_Name_prod-my-api-1 will not work.
Your filename has to be tag_Name_prod_my_api_1. Yeah, thanks Ansible for making me cry for 2 days.
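The underscores come from the inventory script itself: ec2.py sanitizes tag values when it turns them into group names, replacing characters that are not safe in group names (such as hyphens) with underscores, so the group_vars file has to match the sanitized name. If in doubt, a small task like this (a sketch, run against the same dynamic inventory) prints the exact group names each host ended up in:
- hosts: all
  gather_facts: no
  tasks:
    - name: Show the sanitized group names built from EC2 tags
      debug:
        var: group_names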

Ansible vars in inventories directory not applying

I am using a role (zaxos.lvm-ansible-role) to manage LVMs on a few hosts. Initially I had my vars for the LVMs under host_vars/, which works.
Here is the working layout
├── filter_plugins
├── group_vars
├── host_vars
│   ├── server1.yaml
│   └── server2.yaml
├── inventories
│   ├── preprod
│   ├── preprod.yml
│   ├── production
│   │   ├── group_vars
│   │   └── host_vars
│   ├── production.yaml
│   ├── staging
│   │   ├── group_vars
│   │   └── host_vars
│   └── staging.yml
├── library
├── main.yaml
├── module_utils
└── roles
    └── zaxos.lvm-ansible-role
        ├── defaults
        │   └── main.yml
        ├── handlers
        │   └── main.yml
        ├── LICENSE
        ├── meta
        │   └── main.yml
        ├── README.md
        ├── tasks
        │   ├── create-lvm.yml
        │   ├── main.yml
        │   ├── mount-lvm.yml
        │   ├── remove-lvm.yml
        │   └── unmount-lvm.yml
        ├── tests
        │   ├── inventory
        │   └── test.yml
        └── vars
            └── main.yml
For my environment it would make more sense to have host_vars under the inventories directory, which is also supported (the Alternative Directory Layout in the Ansible docs).
However, when I change to this layout the vars are not picked up and the LVMs on the hosts don't change.
├── filter_plugins
├── inventories
│   ├── preprod
│   │   ├── group_vars
│   │   └── host_vars
│   │       ├── server1.yaml
│   │       └── server2.yaml
│   ├── preprod.yml
│   ├── production
│   │   ├── group_vars
│   │   └── host_vars
│   ├── production.yaml
│   ├── staging
│   │   ├── group_vars
│   │   └── host_vars
│   └── staging.yml
├── library
├── main.yaml
├── module_utils
└── roles
    └── zaxos.lvm-ansible-role
        ├── defaults
        │   └── main.yml
        ├── handlers
        │   └── main.yml
        ├── LICENSE
        ├── meta
        │   └── main.yml
        ├── README.md
        ├── tasks
        │   ├── create-lvm.yml
        │   ├── main.yml
        │   ├── mount-lvm.yml
        │   ├── remove-lvm.yml
        │   └── unmount-lvm.yml
        ├── tests
        │   ├── inventory
        │   └── test.yml
        └── vars
            └── main.yml
Any idea why this approach is not working?
Your host_vars directory must reside in Ansible's discovered inventory_dir.
With the above file tree, I guess you are launching your playbook with ansible-playbook -i inventories/preprod.yml yourplaybook.yml. In this context, Ansible discovers inventory_dir as inventories.
The solution is to move your inventory file inside the directory for each environment, e.g. for preprod: mv inventories/preprod.yml inventories/preprod/
You can then launch your playbook with ansible-playbook -i inventories/preprod/preprod.yml yourplaybook.yml and it should work as you expect.
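After the move, the preprod part of the tree would look like this (a sketch of the layout the fix produces), with host_vars now inside the discovered inventory_dir:
inventories
└── preprod
    ├── preprod.yml
    ├── group_vars
    └── host_vars
        ├── server1.yaml
        └── server2.yaml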

Ansible with "Alternative Directory Layout" and using vaults

I am trying to use the Alternative Directory Layout with Ansible vaults.
But when I run my playbook, vault-encrypted variables cannot be resolved with that directory structure. What am I doing wrong?
I execute via:
ansible-playbook -i inventories/inv/hosts playbooks/inv/invTest.yml --check --ask-vault-pass
Here is my structure:
.
├── inventories
│   ├── inv
│   │   ├── group_vars
│   │   │   ├── var.yml
│   │   │   └── vault.yml
│   │   └── hosts
│   └── staging
│       ├── group_vars
│       │   ├── var.yml
│       │   └── vault.yml
│       └── hosts
├── playbooks
│   ├── staging
│   │   └── stagingTest.yml
│   └── inv
│       ├── invTest.retry
│       └── invTest.yml
└── roles
    ├── basic-linux
    │   ├── defaults
    │   │   └── main.yml
    │   └── tasks
    │       └── main.yml
    ├── test
    │   ├── defaults
    │   │   └── main.yml
    │   └── tasks
    │       └── main.yml
    └── webserver
        ├── defaults
        │   └── main.yml
        ├── files
        ├── handler
        │   └── main.yml
        ├── tasks
        │   └── main.yml
        └── templates
this is my hosts file (inventories/inv/hosts):
[inv]
testvm-01 ansible_ssh_port=22 ansible_ssh_host=172.16.0.101 ansible_ssh_user=root
testvm-02 ansible_ssh_port=22 ansible_ssh_host=172.16.0.102 ansible_ssh_user=root
playbook (playbooks/inv/invTest.yml):
---
- name: this is test
  hosts: inv
  roles:
    - { role: ../../roles/test }
...
role which uses the vault encrypted var (roles/test/tasks/main.yml):
---
- name: create test folder
  file:
    path: "/opt/test/{{ app_user }}/"
    state: directory
    owner: "{{ default_user }}"
    group: "{{ default_group }}"
    mode: 2755
    recurse: yes
...
var which points to vault (inventories/inv/group_vars/var.yml):
---
app_user: '{{ vault_app_user }}'
app_pass: '{{ vault_app_pass }}'
...
vault file (ansible-vault edit inventories/inv/group_vars/vault.yml):
vault_app_user: itest
vault_app_pass: itest123
The error message I am getting is something like this:
FAILED! => {"failed": true, "msg": "the field 'args' has an invalid value, which appears to include a variable that is undefined. The error was: {{ app_user }}: 'app_user' is undefined\n\nThe error appears to have been in 'roles/test/tasks/main.yml': but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n\n - name: create test folder\n ^ here\n"}
You define the variable app_user in a file called var.yml stored in the group_vars folder.
In your execution line you point to inventories/inv/hosts as your inventory.
It doesn't matter what strings you used in this path -- from Ansible's point of view it sees only:
hosts
group_vars
├── var.yml
└── vault.yml
It will read var.yml for a host group called var and vault.yml for a host group called vault.
In your case -- never.
You likely wanted to organise your files this way:
inventories
└── production
    ├── group_vars
    │   └── inv
    │       ├── var.yml
    │       └── vault.yml
    └── hosts
This way, files in group_vars/inv will be read for hosts in group inv.
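With that layout the group_vars directory is discovered next to the hosts file, so the invocation stays the same apart from the renamed environment directory, and vault.yml is decrypted on the fly when you supply the password:
ansible-playbook -i inventories/production/hosts playbooks/inv/invTest.yml --check --ask-vault-pass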

Ansible runs the entire dependency role even when specifying specific tags

Ansible executes the entire dependency role, but my main.yml in the meta folder looks like this:
---
dependencies:
  - { role: common, caller_role: docker, tags: ['packages'] }
So Ansible should execute only the part of role common that contains the following:
---
- name: Install required packages
  package: name={{ item.name }} state=present
  with_items:
    - "{{ vars[caller_role]['SYSTEM']['PACKAGES'] }}"
  tags:
    - packages

- name: Modify /etc/hosts
  lineinfile:
    dest: /etc/hosts
    line: "{{ vars[caller_role]['REGISTRY']['ip'] }} {{ vars[caller_role]['REGISTRY']['hostname'] }}"
  tags:
    - write_etc_hosts
I run Ansible 2.1.1.0 as follows: ansible-playbook --list-tags site.yml. Here is site.yml:
- hosts: localhost
  connection: local
  remote_user: root
  become: yes
  roles:
    - docker
And finally the tree:
├── common
│   ├── defaults
│   │   └── main.yml
│   ├── files
│   ├── handlers
│   │   └── main.yml
│   ├── meta
│   │   └── main.yml
│   ├── README.md
│   ├── tasks
│   │   └── main.yml
│   ├── templates
│   ├── tests
│   │   ├── inventory
│   │   └── test.yml
│   └── vars
│       └── main.yml
├── docker
│   ├── defaults
│   │   └── main.yml
│   ├── files
│   ├── handlers
│   │   └── main.yml
│   ├── meta
│   │   └── main.yml
│   ├── README.md
│   ├── tasks
│   │   └── main.yml
│   ├── templates
│   ├── tests
│   │   ├── inventory
│   │   └── test.yml
│   └── vars
│       └── main.yml
└── site.yml
I fail to understand what is happening.
If you specify tags for a role, Ansible applies them to every task in that role.
In your example, the tag packages will be added to every task in the role common.
Please see the tag inheritance section of the documentation:
You can apply tags to more than tasks, but they ONLY affect the tasks themselves. Applying tags anywhere else is just a convenience so you don’t have to write it on every task
All of these [samples] apply the specified tags to EACH task inside the play, included file, or role, so that these tasks can be selectively run when the playbook is invoked with the corresponding tags.
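In other words, tags on a dependency only label its tasks; the selection happens at invocation time. To actually limit a run to the package tasks, you would pass the tag filter on the command line (using the site.yml from the question):
ansible-playbook site.yml --tags packages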
OK, thank you Konstantin. For this purpose I think I will use:
- include: foo.yml
  tags: [web, foo]
Regards

Callback plugin didn't work with Ansible v2.0

I am using Ansible v2.0 with this plugin, which shows the time each task consumes. Here is my directory structure:
.
├── aws.yml
├── callback_plugins
│   └── profile_tasks.py
├── inventory
│   └── hosts
├── roles
│   ├── ec2instance
│   │   ├── defaults
│   │   │   └── main.yml
│   │   └── tasks
│   │       └── main.yml
│   ├── ec2key
│   │   ├── defaults
│   │   │   └── main.yml
│   │   └── tasks
│   │       └── main.yml
│   ├── ec2sg
│   │   ├── defaults
│   │   │   └── main.yml
│   │   └── tasks
│   │       └── main.yml
│   ├── elb
│   │   ├── defaults
│   │   │   └── main.yml
│   │   └── tasks
│   │       └── main.yml
│   ├── rds
│   │   ├── defaults
│   │   │   └── main.yml
│   │   └── tasks
│   │       └── main.yml
│   └── vpc
│       ├── defaults
│       │   └── main.yml
│       └── tasks
│           └── main.yml
└── secret_vars
    ├── backup.yml
    └── secret.yml
But when I run the playbook, it doesn't show the timing results. Can you please point out where I am making a mistake?
I was able to solve this problem by adding this to the ansible.cfg file:
[defaults]
callback_whitelist = profile_tasks
The plugin is included with Ansible 2.0, and like most bundled callback plugins it requires whitelisting in ansible.cfg.
Hope this will help others.
Did you set the callback directory in your ansible.cfg file?
If not, just add an ansible.cfg file at the root level of your directory and specify the path to your callback folder.
Because there are other plugin types, I suggest placing callback_plugins inside a plugins folder:
[defaults]
callback_plugins = ./plugins/callback_plugins
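Both suggestions can live in the same ansible.cfg at the project root. A combined sketch, where the callback_plugins path is only needed if you ship your own copy of the plugin instead of relying on the one bundled with Ansible (the path here is assumed):
[defaults]
callback_whitelist = profile_tasks
callback_plugins = ./plugins/callback_plugins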
