We have sets of config files that end in an environment extension (e.g. app.properties.prod, app.properties.dev, db.prod, db.dev, and so on). I am passing Ansible a variable (environment=prod) with the intention of pulling just the files ending in a .prod extension from a file repo, and then I need to drop that suffix from the filename so it ends up as app.properties.
Something like this will find the correct files (env = prod):

- copy:
    src: "{{ item }}"
    dest: /app/homedir
  with_fileglob:
    - /go/to/my/repo/*{{ env }}

This copies the correct files to my /app/homedir.
However, trying to drop the env file extension does not work:

- copy:
    src: "{{ dropsuffix }}"
    dest: "{{ dropsuffix.split('.{{env}}')[0] }}"
  with_fileglob:
    - /app/homedir/*.{{ env }}
  loop_control:
    loop_var: dropsuffix
However, removing the {{env}} and hard-coding the text 'prod' does work:

dest: "{{ dropsuffix.split('.prod')[0] }}"

I'm assuming there is some Jinja formatting issue with the variable nested inside the expression; I've tried various permutations and I'm stumped.
It's possible to use the regex_replace filter to remove the extension:
- copy:
    src: "{{ item }}"
    dest: "/app/homedir/{{ item | basename | regex_replace(regex, replace) }}"
  loop: "{{ lookup('fileglob', '/go/to/my/repo/*.' ~ env, wantlist=True) }}"
  vars:
    regex: "{{ '^(.*)\\.' ~ env ~ '$' }}"
    replace: "{{ '\\1' }}"
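As for why the original attempt failed: Jinja2 does not expand {{ }} inside another {{ }}, so '.{{env}}' inside split() is taken literally. Within an expression you reference the variable directly and concatenate with ~. A sketch of the question's second task with only that change (not tested):

- copy:
    src: "{{ dropsuffix }}"
    # concatenate with ~ instead of nesting {{ env }} inside the expression
    dest: "{{ dropsuffix.split('.' ~ env)[0] }}"
  with_fileglob:
    - /app/homedir/*.{{ env }}
  loop_control:
    loop_var: dropsuffix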
I'm facing a problem: I don't know how to pass multiple matching variables through with_items. I have these vars:
vars:
  paths:
    - /tmp/tecom/python3/
    - /tmp/tecom/pip/
    - /tmp/tecom/psycopg2/
    - /tmp/tecom/docker/
    - /tmp/tecom/postgresql/
  files:
    - python3.tar
    - pip.tar
    - psycopg2.tar
    - docker.tar
    - postgresql.tar
And I have the task that should extract the archives:
- name: "unarchive {{ item }}"
  unarchive:
    src: "{{ item }}"
    dest: # should take the paths above; every path should match its own .tar file
  with_items: "{{ files }}"
Every path should match its own .tar file.
For example,
- debug:
    msg: "unarchive {{ item }} to {{ _path }}"
  loop: "{{ files }}"
  vars:
    _path: "{{ path }}/{{ item | splitext | first }}/"
    path: /tmp/tecom
    files:
      - python3.tar
      - pip.tar
      - psycopg2.tar
      - docker.tar
      - postgresql.tar
gives (abridged)
msg: unarchive python3.tar to /tmp/tecom/python3/
msg: unarchive pip.tar to /tmp/tecom/pip/
msg: unarchive psycopg2.tar to /tmp/tecom/psycopg2/
msg: unarchive docker.tar to /tmp/tecom/docker/
msg: unarchive postgresql.tar to /tmp/tecom/postgresql/
Q: "How do I unarchive those files to the folders?"
A: Use the same expression in the unarchive task, e.g.
- name: "unarchive {{ item }}"
unarchive:
src: "{{ item }}"
dest: "{{ path }}/{{ item|splitext|first }}/"
loop: "{{ files }}"
(not tested)
It's up to you where you put the variables files and path. See Variable precedence: Where should I put a variable?
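If you'd rather keep the two separate lists from the question, a sketch (not tested, and it assumes both lists stay in the same order) pairs them with the zip filter:

- name: "unarchive {{ item.0 }}"
  unarchive:
    src: "{{ item.0 }}"   # element from files
    dest: "{{ item.1 }}"  # matching element from paths
  loop: "{{ files | zip(paths) | list }}"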
I have a directory with several tools and text files that I would like to copy to multiple users' home dirs on a given host. There are a couple of caveats:
I don't want to list the files in the playbook
Some files are executable, some are not
I want the files to be owned by their respective users
I want the symlinks to be copied as-is
When pushing to the ansible_user home dir there's no issue, as ansible.posix.synchronize does the job very well with archive=true and owner/group set to false:
- name: Sync testing_files repo to ansible_user
  ansible.posix.synchronize:
    src: ~/testing_files/
    dest: ~/testing_files/
    mode: push
    archive: true
    delete: true
    owner: false
    group: false
  register: rsync_output
The links are properly handled as well (they are rsynced as symlinks).
However, the problem is populating the same files for the other users. I tried the following approach:
- name: Sync testing_files repo to extra_users
  ansible.builtin.copy:
    src: ~/testing_files/
    dest: "~{{ item }}/testing_files/"
    remote_src: true
    mode: preserve
    owner: "{{ item }}"
    follow: true  # I tried with `false` as well
  with_items:
    - "{{ extra_users if extra_users is not none else [] }}"
The file permissions are correct, and so is the owner; however:
the group of each file remains the same as that of the source file
the symlinks are ignored
How can I make it work? For the group issue the only solution I came up with is another task that runs stat to look up the group and saves it for later use, e.g.:
- name: Get group of homedir
  ansible.builtin.stat:
    path: "~{{ item }}"
  register: homedir
  with_items:
    - "{{ extra_users_or_empty }}"

- name: Sync testing_files repo to extra_users
  ansible.builtin.copy:
    src: ~/testing_files/
    dest: "~{{ item }}/testing_files/"
    remote_src: true
    mode: preserve
    owner: "{{ item }}"
    group: "{{ homedir.results | selectattr('item', 'eq', item) | map(attribute='stat.gr_name') | first }}"
    follow: true
  with_items:
    - "{{ extra_users_or_empty }}"
(NOTE: extra_users_or_empty: "{{ extra_users if extra_users is not none else [] }}")
However, that feels like something that should be achievable in a more elegant way. As for the symlinks, I have no idea why ansible.builtin.copy ignores them.
Any ideas?
Huh, OK, it seems that when using remote_src we should set local_follow instead of follow. The following solution handles symlinks properly:
- name: Get group of homedir
  ansible.builtin.stat:
    path: "~{{ item }}"
  register: homedir
  with_items:
    - "{{ extra_users_or_empty }}"

- name: Sync testing_files repo to extra_users
  ansible.builtin.copy:
    src: ~/testing_files/
    dest: "~{{ item }}/testing_files/"
    remote_src: true
    mode: preserve
    owner: "{{ item }}"
    group: "{{ homedir.results | selectattr('item', 'eq', item) | map(attribute='stat.gr_name') | first }}"
    local_follow: false
  with_items:
    - "{{ extra_users_or_empty }}"
I want to delete files based on a wildcard but also add exceptions to the rule.
- hosts: all
  tasks:
    - name: Ansible delete file wildcard
      find:
        paths: /etc/wild_card/example
        patterns: "*.txt"
      register: wildcard_files_to_delete

    - name: Ansible remove file wildcard
      file:
        path: "{{ item.path }}"
        state: absent
      with_items: "{{ wildcard_files_to_delete.files }}"
For example, I want to exempt a file named "important.txt". How can I do that?
Just add a when condition to the task that deletes files. E.g., something like:
- name: Ansible remove file wildcard
  file:
    path: "{{ item.path }}"
    state: absent
  when: item.path != '/etc/wild_card/example/important.txt'
  with_items: "{{ wildcard_files_to_delete.files }}"
This will skip a specific file. If you have a list of files to skip you could do this instead:
- name: Ansible remove file wildcard
  file:
    path: "{{ item.path }}"
    state: absent
  when: item.path not in files_to_skip
  with_items: "{{ wildcard_files_to_delete.files }}"
  vars:
    files_to_skip:
      - /etc/wild_card/example/important.txt
      - /etc/wild_card/example/saveme.txt
And if you want to preserve files based on some sort of pattern, you can make use of Ansible's match or search tests:
- name: Ansible remove file wildcard
  file:
    path: "{{ item.path }}"
    state: absent
  when: item.path is not search('important.txt')
  with_items: "{{ wildcard_files_to_delete.files }}"
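Alternatively, the exceptions can be handled in the find task itself via its excludes option, so the protected files never appear in the registered result at all. A sketch (not tested), reusing the paths from the question:

- name: Ansible find files to delete, excluding the ones to keep
  find:
    paths: /etc/wild_card/example
    patterns: "*.txt"
    # excludes matches against basenames
    excludes:
      - important.txt
      - saveme.txt
  register: wildcard_files_to_delete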
Is it possible to loop through a list of items only if a string is defined in a variable I specify at runtime?
Essentially, I want to have a list of variables defined and use the aws_s3 module to download only the files that are named when running the playbook.
E.g., say I have the list "var1,var2"
and I have the following variables defined:
apps_location:
  - { name: 'vars1', src: 'vars1.tgz', dest: '/tmp/vars1_file.tgz' }
  - { name: 'vars2', src: 'vars2.tgz', dest: '/tmp/vars2_file.tgz' }
  - { name: 'vars3', src: 'vars3.tgz', dest: '/tmp/vars3_file.tgz' }
Task:
- name: "Splunk Search Head | Download Splunk Apps from S3"
aws_s3:
bucket: "{{ resource_bucket_name }}"
object: "{{ item.src }}"
dest: "{{ item.dest }}"
mode: get
with_items: "{{ apps_location }}"
I want to run the command:
ansible-playbook -i inventory -e "var1,var2"
and download only var1 and var2 on that specific run.
I tried utilizing lookups but couldn't get the syntax right. I'm not entirely sure this is the best way of doing it, but I want to have a predefined list of file locations and only download the ones that I pass at runtime.
Note: the only reason "name" exists in apps_location is to see if I could do a lookup and install only that one, but I couldn't get the syntax right.
Define a variable containing a list of defined apps. I'm trying:
- name: "Set Fact"
set_fact:
dict: "{{ apps_location[item].dest }}"
with_items: "{{ my_vars|default([]) }}"
However, whenever I output dict I only get the last value.
Any help would be appreciated :)
The extra vars must be an assignment of a variable to a value. For example:
shell> ansible-playbook -i inventory -e "my_vars=['vars1','vars2']"
A more convenient structure for the data is a dictionary. For example:
apps_location:
  vars1:
    src: 'vars1.tgz'
    dest: '/tmp/vars1_file.tgz'
  vars2:
    src: 'vars2.tgz'
    dest: '/tmp/vars2_file.tgz'
  vars3:
    src: 'vars3.tgz'
    dest: '/tmp/vars3_file.tgz'
Then the loop might look like
- aws_s3:
    bucket: "{{ resource_bucket_name }}"
    object: "{{ apps_location[item].src }}"
    dest: "{{ apps_location[item].dest }}"
    mode: get
  loop: "{{ my_vars | default([]) }}"
Q: "Define a variable containing a list of defined apps."
A: Try this
- set_fact:
    my_list: "{{ my_list | default([]) + [apps_location[item].dest] }}"
  loop: "{{ my_vars | default([]) }}"
(not tested)
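With the dictionary form of apps_location shown above, the same list can also be built in a single expression using the extract filter, avoiding the set_fact loop entirely. A sketch (not tested):

- set_fact:
    # for each name in my_vars, look up apps_location[name].dest
    my_list: "{{ my_vars | default([]) | map('extract', apps_location, 'dest') | list }}"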
I believe the Ansible copy module can take a whole bunch of "files" and copy them in one hit, by copying a directory recursively.
Can the Ansible template module similarly take a whole bunch of "templates" and deploy them in one hit? Is there such a thing as deploying a folder of templates and applying them recursively?
The template module itself runs the action on a single file, but you can use with_filetree to loop recursively over a specified path:
- name: Ensure directory structure exists
  ansible.builtin.file:
    path: '{{ templates_destination }}/{{ item.path }}'
    state: directory
  with_community.general.filetree: '{{ templates_source }}'
  when: item.state == 'directory'

- name: Ensure files are populated from templates
  ansible.builtin.template:
    src: '{{ item.src }}'
    dest: '{{ templates_destination }}/{{ item.path }}'
  with_community.general.filetree: '{{ templates_source }}'
  when: item.state == 'file'
And for templates in a single directory you can use with_fileglob.
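For illustration, a minimal with_fileglob sketch (not tested; the /etc/myapp destination is only a placeholder) that renders every top-level *.j2 template and strips the .j2 suffix:

- name: Template all files in a flat templates directory
  ansible.builtin.template:
    src: "{{ item }}"
    # fileglob returns full paths, hence the basename filter
    dest: "/etc/myapp/{{ item | basename | regex_replace('\\.j2$', '') }}"
  with_fileglob:
    - ../templates/*.j2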
This answer provides a working example of the approach laid out by @techraf.
with_fileglob expects only files to live within the templates folder - see https://serverfault.com/questions/578544/deploying-a-folder-of-template-files-using-ansible
with_fileglob will only parse files in the templates folder
with_filetree maintains the directory structure when moving the template files to dest. It auto creates those directories for you at dest.
with_filetree will parse all files in the templates folder and nested directories
- name: Approve certs server directories
  file:
    state: directory
    dest: '~/{{ item.path }}'
  with_filetree: '../templates'
  when: item.state == 'directory'

- name: Approve certs server files
  template:
    src: '{{ item.src }}'
    dest: '~/{{ item.path }}'
  with_filetree: '../templates'
  when: item.state == 'file'
Essentially, think of this approach as copying and pasting a directory and all its contents from A to B and whilst doing so, parsing all templates.
I could not manage to do it with the other answers. This is what worked for me:
- name: Template all the templates and place them in the corresponding path
  template:
    src: "{{ item.src }}"
    dest: "{{ destination_path }}/{{ item.path | regex_replace('\\.j2$', '') }}"
    force: yes
  with_filetree: '{{ role_path }}/templates'
  when: item.state == 'file'
In my case the folder contains both plain files and Jinja2 templates.
- name: copy all directories recursively
  file:
    dest: "{{ templates_dest_path }}/{{ item | replace(templates_src_path + '/', '') }}"
    state: directory
  with_items: "{{ lookup('pipe', 'find ' + templates_src_path + '/ -type d').split('\n') }}"

- name: copy all non-template files recursively
  copy:
    src: "{{ item }}"
    dest: "{{ templates_dest_path }}/{{ item | replace(templates_src_path + '/', '') }}"
  with_items: "{{ lookup('pipe', 'find ' + templates_src_path + '/ -type f -not -name \"*.j2\"').split('\n') }}"

- name: copy template files recursively
  template:
    src: "{{ item }}"
    dest: "{{ templates_dest_path }}/{{ item | replace(templates_src_path + '/', '') | replace('.j2', '') }}"
  with_items: "{{ lookup('pipe', 'find ' + templates_src_path + '/ -type f -name \"*.j2\"').split('\n') }}"
I did it and it worked. \o/
- name: "Create file template"
template:
src: "{{ item.src }}"
dest: "{{ your_dir_remoto }}/{{ item.dest }}"
loop:
- { src: '../templates/file1.yaml.j2', dest: 'file1.yaml' }
- { src: '../templates/file2.yaml.j2', dest: 'file2.yaml' }