How to override a role's file in Ansible? - provisioning

I am using the zzet.rbenv role in my playbook. It has a files/default-gems file that it copies to the provisioned system.
I need my playbook to check for a myplaybook/files/default-gems and use it if it exists, falling back to zzet.rbenv/files/default-gems otherwise.
How can I do that?

After some research and trial/error, I found out that Ansible is not able to check whether files exist between roles. This is due to the way role dependencies (which are roles themselves) get expanded into the role requiring them, making them part of the playbook. There are no tasks that will let you differentiate my_role/files/my_file.txt from required_role/files/my_file.txt.
One approach to the problem (the one I found the easiest and cleanest) was to:
Add a variable to my_role with the path to the file I want to use (overriding the default one)
Add a task (identical to the one that uses the default file) that checks whether the above variable is defined and runs the task using it
Example
required_role
# Existing task
- name: some task
  copy: src=roles_file.txt dest=/some/directory/file.txt
  when: my_file_path is not defined

# My custom task
- name: my custom task (an alteration of the above task)
  copy: src={{ my_file_path }} dest=/some/directory/file.txt
  when: my_file_path is defined
my_role
#... existing code
my_file_path: "path/to/my/file"
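The same override could also be wired in where my_role declares the dependency; a minimal sketch of my_role/meta/main.yml, assuming my_role pulls in required_role as a role dependency (exactly where you choose to set my_file_path is up to you):
# my_role/meta/main.yml (sketch)
dependencies:
  - { role: required_role, my_file_path: "path/to/my/file" }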
As mentioned by Ramon de la Fuente: this solution was accepted into the zzet.rbenv repo :)

Related

Determine return type for object returned from a task

I have a playbook which should retrieve artifacts from a Maven repo, extract them to a temp dir and copy some file to the destination folder. Currently it works pretty fine - artifacts are downloaded using the maven_artifact task. But some requirements have changed and I need to use the get_url task now. After changing to get_url, the whole rest of the playbook is broken, because the objects returned from maven_artifact and get_url are of different types. How to determine what type, with what fields, is returned from a task?
No matter which Ansible module you use, there is the option to create variables from the output of the task by using register.
The Ansible documentation states which return values are available to you when doing so; each module's page lists its specific return values, and the common ones are documented here: https://docs.ansible.com/ansible/latest/reference_appendices/common_return_values.html
In that case you may do something like the following to retrieve the status code of the get_url module:
- name: Download foo.conf
  get_url:
    url: http://example.com/path/file.conf
    dest: /etc/foo.conf
    mode: '0440'
  register: my_result

- name: Print status code of get_url
  debug:
    var: my_result.status_code
Each module returns an object of a different type.
There is no way within Ansible to identify the type of a registered variable (i.e. what attributes you can read from it); however, a given module will always return an object of the same type.
The return values of a module are listed at the bottom of that module's documentation page.
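If you just want to see every field a task actually returns, a minimal sketch (reusing the my_result variable registered above) is to print the whole registered object; running the play with -v or -vvv will also show each task's full result:
# Sketch: dump the entire registered result to inspect its fields
- name: Show everything get_url returned
  debug:
    var: my_result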

Set Ansible facts through a generated name

I'm working on an Ansible playbook to deploy some service configuration to different regions, and I have initial vars imported by include_vars as follows:
common: [...]
us_local: [...]
uk_local: [...]
us_global: [...]
uk_global: [...]
Basically, I want to generate the configuration by including vars from common, all the global configs, as well as the local config of that region, using the {{ site }} variable which is defined in hosts.yaml.
For example, if the deployed host is us, then I want to use common, us_local, us_global, uk_global.
I will use a Jinja2 template to generate the final config, and from my understanding the easiest way is to create another variable called current_site_local and copy everything from {{ site }}_local into it, so that later on I can directly reference it inside the template. However, I'm having trouble making it work through set_fact.
Any help would be appreciated.
UPDATE:
I used the following syntax and it works:
- name: generate curr_site_local
  set_fact:
    current_site_local: '{{ vars[site + "_local"] }}'
Try using the combine filter.
- name: Set site config '{{ site }}'
  set_fact:
    current_site_local: '{{ common
                            | combine(vars[site + "_global"])
                            | combine(vars[site + "_local"]) }}'
In this case, the order of precedence is that the local config will override the global, and that will override the common.
Not sure if that is what you wanted, but it is the order you gave in your question; note that the other region's <other>_global is not included now.
If you want common to have the highest precedence, just reverse the order.
See docs for combine.
Updated my answer with suggestion from Matthew L Daniel.
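If, as in the question, every region's _global config should be merged in as well, a minimal sketch assuming only the us and uk regions listed above (the current_site_config name is purely illustrative, and the site's local config still takes the highest precedence) could chain additional combine calls:
# Sketch: current_site_config is an illustrative name; local overrides global, which overrides common
- name: Merge common, all globals, and this site's local config
  set_fact:
    current_site_config: '{{ common
                             | combine(us_global)
                             | combine(uk_global)
                             | combine(vars[site + "_local"]) }}'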

Resolve Local Files by Playbook Directory?

I have the following Ansible role which simply does the following:
Create a temporary directory.
Download Goss, a server testing tool, into that temporary directory.
Upload a main Goss YAML file for the tests.
Upload additional directories for additional included tests.
Here are a couple places where I'm using it:
naftulikay.python-dev
naftulikay.ruby-dev
Specifically, these playbooks upload a local file named goss.yml adjacent to the playbook, and a directory goss.d, again adjacent to the playbook.
Unfortunately, it seems that Ansible logic has changed recently, causing my tests to not work as expected. My role ships with a default goss.yml, and it appears that when I set goss_file: goss.yml within my playbook, it uploads degoss/files/goss.yml instead of the Goss file adjacent to my playbook.
If I'm passing the name of a file to a role, is there a way to specify that Ansible should look up the file in the context of the playbook or the current working directory?
The actual role logic that is no longer working is this:
# deploy test files including the main and additional test files
- name: deploy test files
  copy: src={{ item }} dest={{ degoss_test_root }} mode=0644 directory_mode=0755 setype=user_tmp_t
  with_items: "{{ [goss_file] + goss_addtl_files + goss_addtl_dirs }}"
  changed_when: degoss_changed_when
I am on Ansible 2.3.2.0 and I can reproduce this across distributions (namely CentOS 7, Ubuntu 14.04, and Ubuntu 16.04).
Ansible searches for relative paths in the role's scope first, then in the playbook's scope.
For example, if you want to copy the file test.txt in role r1, the search order is:
/path/to/playbook/roles/r1/files/test.txt
/path/to/playbook/roles/r1/test.txt
/path/to/playbook/roles/r1/tasks/files/test.txt
/path/to/playbook/roles/r1/tasks/test.txt
/path/to/playbook/files/test.txt
/path/to/playbook/test.txt
You can inspect your search_path order by calling ansible with ANSIBLE_DEBUG=1.
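For instance, a quick way to dump that information for a run (site.yml here is just a placeholder playbook name):
ANSIBLE_DEBUG=1 ansible-playbook site.yml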
To answer your question, you have two options:
Use a filename that doesn't exist within the role's scope, like:
goss_file: local_goss.yml
Supply an absolute path. For example, you can use:
goss_file: '{{ playbook_dir }}/goss.yml'
Ansible doesn't apply search logic if the path is absolute.
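Applied to the degoss role from the question, a minimal sketch of the second option (the play header is only illustrative) would pass the absolute path when the role is applied:
# Sketch: point goss_file at the file next to the playbook instead of the role default
- hosts: all
  roles:
    - role: degoss
      goss_file: "{{ playbook_dir }}/goss.yml"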

How to include a task from another role in Ansible?

I want to include a task from a different role.
I would not want to hardcode it like
- name: Set topology based on Jenkins job name
  include: ../../pre-req/tasks/set-topo.yml
  tags: core
Is there a way to do this with a dependency? I tried creating a meta directory with files and tasks, but somehow it's not getting triggered.
Something like this:
vim roles/pre-req/meta/main.yml
---
allow_duplicates: yes
dependencies:
  - { role: topo, tags: ['core'] }
I would not want to hardcode it like
Why not? You want to include a task, and that's how you include a task.
If what you want to do is include the entire other role, Ansible 2.2 (released yesterday) added include_role.
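For completeness, a minimal sketch of the include_role route, reusing the pre-req role and set-topo.yml task file from the question (the task name here is only illustrative):
# Sketch: run just one task file from another role via include_role
- name: Set topology based on Jenkins job name
  include_role:
    name: pre-req
    tasks_from: set-topo
  tags: core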

Have an Ansible role retrieve its files from an external location as part of its own role

So one thing we've encountered in our project is that we do not want to store our large files in our git repo for our Ansible roles, because it slows down cloning (and our git host limits files to 100 MB anyway).
What we've done is store our files in a separate internal location, where our files can sit statically and have no size restrictions. Our roles are written so that they first pull these static files into their local files folder and then continue as normal.
i.e.
roles/foo/tasks/main.yml
- name: Create role's files directory
  file:
    path: "{{ roles_files_directory }}"
    state: directory

- name: Copy static foo to local
  get_url:
    url: "{{ foo_static_gz }}"
    dest: "{{ roles_files_directory }}/{{ foo_gz }}"

# ....Do rest of the tasks...
roles/foo/vars/main.yml
roles_files_directory: "/some/path/roles/foo/files"
foo_static_gz: "https://internal.foo.tar.gz"
foo_gz: "foo.tar.gz"
The main thing I don't find really sound is the hard-coded path to the role's files directory. I would prefer to look up the path dynamically when running Ansible, but I haven't been able to find documentation on that. The issue can arise because different users may check roles out to different root paths. Does anyone know how to dynamically determine the role path, or have some other pattern that solves the overall problem?
Edit:
I discovered there's actually a {{ playbook_dir }} variable that would return "/some/path", which might be dynamic enough in this case. It still isn't safe against the situation where the role name might change, but that's a far rarer occurrence and can be handled through version control.
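Applying that edit to the vars file above, a minimal sketch (the trailing roles/foo/files segment is still spelled out by hand, so a renamed role would still require an edit):
roles/foo/vars/main.yml
# Sketch: anchor the role's files directory to wherever the playbook is checked out
roles_files_directory: "{{ playbook_dir }}/roles/foo/files"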
What about passing values from the command line?
---
- hosts: '{{ hosts }}'
  remote_user: '{{ user }}'
  tasks:
    - ...
ansible-playbook release.yml --extra-vars "hosts=vipers user=starbuck"
http://docs.ansible.com/playbooks_variables.html#passing-variables-on-the-command-line
I just want to add another possible solution: you can try adding custom "facts".
Here is a link to official documentation: http://docs.ansible.com/setup_module.html
And I found this article that might be useful: http://serverascode.com/2015/01/27/ansible-custom-facts.html
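A minimal sketch of how such a custom local fact could be read back once it is in place (all names here are illustrative: it assumes a fact file foo.fact under /etc/ansible/facts.d on the target with a [general] section containing files_directory):
# Sketch: local facts show up under the ansible_local variable after fact gathering
- name: Use a custom local fact
  debug:
    var: ansible_local.foo.general.files_directory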
