I have 4 playbooks. 2 of them are deploying services on my target machines and 2 of them are removing them again.
Now I want to put them into roles, but I'm not sure what the best practice is.
The 2 deploy playbooks are doing the exact same thing only with different variables and templates. Same applies to the remove playbooks.
At the moment my structure looks like this:
ansible.cfg
ssh_key
inventoryfile
group_vars
    ....
roles
    deployservicegroupA
        vars
            ...
        templates
            ...
        tasks
            main.yml (this file simply includes the two tasks right below)
            copy-service-templates.yml
            start-services.yml
    deployservicegroupB
        vars
            ...
        templates
            ...
        tasks
            main.yml (this file simply includes the two tasks right below)
            copy-service-templates.yml
            start-services.yml
    removeservicegroupA
        vars
            ...
        templates
            ...
        tasks
            main.yml (this file simply includes the two tasks right below)
            remove-services.yml
            cleanup.yml
    removeservicegroupB
        vars
            ...
        templates
            ...
        tasks
            main.yml (this file simply includes the two tasks right below)
            remove-services.yml
            cleanup.yml
Is this the way it is intended to be done?
I'm especially wondering about my tasks that do the exact same thing but live in different roles, and also whether I should include my task files from the main.yml task file.
As per your comment, you are using a group for each service, so you can use group_vars to specify the variables you want to use.
You can then merge the roles together; the only thing you will have to do is load the specific templates based on the group you are running your play on.
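For example, a minimal sketch of what a single merged deploy role plus per-group group_vars could look like (all file and variable names below are illustrative, not taken from your setup):

roles/deployservice/tasks/main.yml
- name: copy service templates for this group
  template:
    src: "{{ item.src }}"
    dest: "{{ item.dest }}"
  with_items: "{{ service_templates }}"

- name: start services for this group
  service:
    name: "{{ item }}"
    state: started
  with_items: "{{ service_names }}"

group_vars/servicegroupA.yml
service_templates:
  - { src: serviceA.conf.j2, dest: /etc/serviceA/serviceA.conf }
service_names:
  - serviceA

The remove playbooks can collapse into a second role following the same pattern.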
TASK [Test: Install Test Authentication] ***********************************
fatal: [ubuntu2004]: FAILED! => {"reason": "Could not find or access '/home/test/test.playbook/molecule/default/test.yaml' on the Ansible Controller."}
Molecule handles files and templates fine relative to the .roles folder, so why does this not happen for tasks when they are called via include_tasks:
- name: Install Test Authentication
  include_tasks:
    file: test.yaml
  when: test is defined
Folder structure is pretty simple.
~/playbook
|___molecule
|   |___default
|       |___converge.yaml
|___.roles
    |___files
    |___tasks
        |___main.yaml   <- called with no issues
        |___test.yaml   <- will not find when used in case above.
Obviously, there are other files, but my template tasks and file tasks work fine, following their relative paths; tasks won't. Why is this, or what am I doing wrong? I can find no documentation on it, and I am sure others have run into this issue, yet all I can find is the following:
https://github.com/ansible-community/molecule/issues/2171, which is about ansible-lint, but it's the closest thing I could find.
Also, there seem to be a total of 4 places to discuss/ask questions regarding Molecule, so I am not sure which will get answered first.
TL;DR:
How do I get Molecule to follow include_tasks correctly?
From what I see, you have the wrong role structure.
The molecule folder should be inside the role folder.
That is where you should start. After that, recheck your file naming; I recently had a similar problem that turned out to be caused by a missing file extension.
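For reference, a layout along these lines (file names are illustrative) keeps the Molecule scenario inside the role:

myrole
|___tasks
|   |___main.yaml
|   |___test.yaml
|___files
|___molecule
    |___default
        |___molecule.yml
        |___converge.yaml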
I have an Ansible role which simply does the following:
Create a temporary directory.
Download Goss, a server testing tool, into that temporary directory.
Upload a main Goss YAML file for the tests.
Upload additional directories for additional included tests.
Here are a couple places where I'm using it:
naftulikay.python-dev
naftulikay.ruby-dev
Specifically, these playbooks upload a local file named goss.yml adjacent to the playbook, and a directory goss.d, again adjacent to the playbook.
Unfortunately, it seems that Ansible logic has changed recently, causing my tests to not work as expected. My role ships with a default goss.yml, and it appears that when I set goss_file: goss.yml within my playbook, it uploads degoss/files/goss.yml instead of the Goss file adjacent to my playbook.
If I'm passing the name of a file to a role, is there a way to specify that Ansible should look up the file in the context of the playbook or the current working directory?
The actual role logic that is no longer working is this:
# deploy test files including the main and additional test files
- name: deploy test files
  copy: src={{ item }} dest={{ degoss_test_root }} mode=0644 directory_mode=0755 setype=user_tmp_t
  with_items: "{{ [goss_file] + goss_addtl_files + goss_addtl_dirs }}"
  changed_when: degoss_changed_when
I am on Ansible 2.3.2.0 and I can reproduce this across distributions (namely CentOS 7, Ubuntu 14.04, and Ubuntu 16.04).
Ansible searches for relative paths in the role's scope first, then in the playbook's scope.
For example, if you want to copy the file test.txt in role r1, the search order is:
/path/to/playbook/roles/r1/files/test.txt
/path/to/playbook/roles/r1/test.txt
/path/to/playbook/roles/r1/tasks/files/test.txt
/path/to/playbook/roles/r1/tasks/test.txt
/path/to/playbook/files/test.txt
/path/to/playbook/test.txt
You can inspect your search_path order by calling ansible with ANSIBLE_DEBUG=1.
To answer your question, you have two options:
Use a filename that doesn't exist within the role's scope, e.g.:
goss_file: local_goss.yml
Supply an absolute path. For example, you can use:
goss_file: '{{ playbook_dir }}/goss.yml'
Ansible doesn't apply search logic if the path is absolute.
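For example, a play applying the role could pass the path explicitly using the second option (a sketch, reusing the role name from your question):

- hosts: all
  roles:
    - role: degoss
      goss_file: "{{ playbook_dir }}/goss.yml"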
I want to deploy staging and production environments using Ansible. All components are the same for both staging and production except for a few configuration values, so I created the following files:
group_vars
    - all.yml
    - production.yml
    - staging.yml
Whenever I run ansible-playbook, it loads the configuration from all.yml, but I also want to load either production.yml or staging.yml.
How do I include this configuration when running the ansible-playbook command?
This should actually work out of the box, given you have your hosts grouped in your inventory.
So let's say your inventory looks like this:
[production]
host.a
host.b
[staging]
host.c
host.d
And then you'd have the following yaml files, relative to your playbook:
group_vars/all
group_vars/production
group_vars/staging
The vars from all matching groups will be loaded, and of course additionally the all file.
Instead of files, the group names could also be directories, and all YAML files inside them would then be loaded.
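For example (file names are illustrative):

group_vars/
    all.yml
    production/
        common.yml
        secrets.yml
    staging/
        common.yml
        secrets.yml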
Update after discussion in comments:
So if your inventory for production looks like this:
[redisServers]
host.a
host.b
[apiServers]
host.c
host.d
[SQLServers]
host.e
host.f
Then you'd add another group, production. To avoid repeating all the hostnames, you can create a group of groups, like so:
[production:children]
redisServers
apiServers
SQLServers
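With that in place, group_vars/production (or group_vars/production.yml) is applied to every host in those three groups, e.g. (the variable is just an illustration):

env_name: production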
So one thing we've encountered in our project is that we do not want to store large files in the git repo for our Ansible roles, because it slows down cloning (and GitHub limits files to 100 MB anyway).
What we've done is store our files in a separate internal location, where our files can sit statically and have no size restrictions. Our roles are written so that they first pull these static files to their local files folder and then continue like normal.
i.e.
roles/foo/tasks/main.yml
- name: Create role's files directory
  file:
    path: "{{ roles_files_directory }}"
    state: directory

- name: Copy static foo to local
  get_url:
    url: "{{ foo_static_gz }}"
    dest: "{{ roles_files_directory }}/{{ foo_gz }}"

# ....Do rest of the tasks...
roles/foo/vars/main.yml
roles_files_directory: "/some/path/roles/foo/files"
foo_static_gz: "https://internal.foo.tar.gz"
foo_gz: "foo.tar.gz"
The main thing I don't find really sound is the hard-coded path to the role's files directory. I would prefer to look the path up dynamically when running Ansible, but I haven't been able to find documentation on that. The issue can arise because different users may check out the roles to different root paths. Does anyone know how to dynamically determine the role path, or have some other pattern that solves the overall problem?
Edit:
I discovered there's actually a {{ playbook_dir }} variable that would return "/some/path", which might be dynamic enough in this case. It still isn't safe against the role name changing, but that's a much rarer occurrence and can be handled through version control.
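With that, my vars file could drop the hard-coded prefix, something like (untested sketch):

roles_files_directory: "{{ playbook_dir }}/roles/foo/files"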
What about passing values from the command line?
---
- hosts: '{{ hosts }}'
  remote_user: '{{ user }}'
  tasks:
    - ...
ansible-playbook release.yml --extra-vars "hosts=vipers user=starbuck"
http://docs.ansible.com/playbooks_variables.html#passing-variables-on-the-command-line
I just want to add another possible solution: you can try adding custom "facts".
Here is a link to official documentation: http://docs.ansible.com/setup_module.html
And I found this article that might be useful: http://serverascode.com/2015/01/27/ansible-custom-facts.html
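As a rough sketch of that approach (the file and key names are made up): drop a local fact file onto the managed host, e.g. /etc/ansible/facts.d/deploy.fact containing

[general]
environment=staging

and after fact gathering it becomes available as {{ ansible_local.deploy.general.environment }}, usable in templates and conditionals like any other variable.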
I have the following problem. I'm keeping two separate Ansible project directories for two different technologies. Imagine you have a nice Ansible setup and want to pull an Ansible project and use some of your established structure without integrating it completely.
The first statement does what I want: it gives a fully qualified path.
debug: msg="{{lynx_ansible}}/roles/centos_common/centos_{{jdk_provider}}.yml"
include: "{{lynx_ansible}}/roles/centos_common/centos_{{jdk_provider}}.yml"
The include prepends the Ansible project's root dir to the path and doesn't expand the variables. Is there a way to do this?
Try $lynx_ansible rather than {{ lynx_ansible }}. Include doesn't seem to support jinja2 syntax.