Ansible: Archive module does not take into account exclude_path param

I want to archive the following directory:
temp_build_directory: /tmp/mdr-upgrade
Where
$ ls -1 /tmp/mdr-upgrade
ansible
atr
composefiles
data
images
packs
wheelhouse
and the task is:
- name: archive_artifacts.yml --> Archive artifacts
  archive:
    path: "{{ temp_build_directory }}/*"
    dest: "{{ target_tmp_dir }}/{{ artifacts_file_name }}"
    exclude_path: "{{ target_tmp_dir }}/{{ ansible_dir }}"
And ansible_dir: ansible
The tarball always ends up containing the ansible folder.
Why is that?
edit: I am using target_tmp_dir: "/tmp"

exclude_path needs an absolute path that matches an entry produced by the path glob (see the docs). With target_tmp_dir: "/tmp", your exclude_path resolves to /tmp/ansible, which matches nothing under /tmp/mdr-upgrade.
Try again with:
- name: archive_artifacts.yml --> Archive artifacts
  archive:
    path: "{{ temp_build_directory }}/*"
    dest: "{{ target_tmp_dir }}/{{ artifacts_file_name }}"
    exclude_path: "{{ temp_build_directory }}/{{ ansible_dir }}"
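To confirm the exclusion worked, you could list the resulting archive on the target. This check task is a sketch (not from the original answer) and assumes the archive module's default gz format:

```yaml
# Sketch (untested): list the tarball to confirm the ansible folder
# is gone; assumes archive's default gz format.
- name: Inspect archive contents
  command: tar -tzf "{{ target_tmp_dir }}/{{ artifacts_file_name }}"
  register: tar_listing
  changed_when: false
- debug:
    msg: "{{ tar_listing.stdout_lines }}"
```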

Related

Iterate over dict object using ansible

I have a local.yml file, shown below, where I define the variable JDK_VERSION as a list and call the role from a different repo:
- role: jdk_install
  vars:
    JDK_VERSION: ['jdk1.8', 'jdk11', 'jdk17']
The list items defined in local.yml act as keys in vars.yml:
JDK_VERSIONS:
  "jdk1.8": ["1.8.0_352", "1.8.0_322"]
  "jdk11": ["11.0.17_8", "11.0.18_8"]
  "jdk17": "17.0.5_8"
1.8.0_352:
  "package_name": "sincro-jdk-1.8.0_352"
  "jdk_dirname": "jdk1.8.0_352"
  "sym_link": "/opt/jdk1.8"
  "installer": "rpm"
1.8.0_322:
  "package_name": "sincro-jdk-1.8.0_322"
  "jdk_dirname": "jdk1.8.0_322"
  "sym_link": "/opt/jdk1.8"
  "installer": "rpm"
11.0.17_8:
  "download_url": https://artifactory.sincrod.com/artifactory/github-releases/adoptium/temurin11-binaries/releases/download/jdk-11.0.17+8/OpenJDK11U-jdk_x64_linux_hotspot_11.0.17_8.tar.gz
  "package_name": "OpenJDK11U-jdk_x64_linux_hotspot_11.0.17_8.tar.gz"
  "jdk_dirname": "jdk-11.0.17+8"
  "sym_link": "/opt/jdk11"
  "installer": "tar"
17.0.5_8:
  "download_url": https://artifactory.sincrod.com/artifactory/github-releases/adoptium/temurin17-binaries/releases/download/jdk-17.0.5+8/OpenJDK17U-jdk_x64_linux_hotspot_17.0.5_8.tar.gz
  "package_name": "OpenJDK17U-jdk_x64_linux_hotspot_17.0.5_8.tar.gz"
  "jdk_dirname": "jdk-17.0.5+8"
  "sym_link": "/opt/jdk17"
  "installer": "tar"
The main.yml file looks like this:
---
- name: Install Java
  include: install_jdk.yml
  vars:
    install_jdk: "{{ item }}"
  with_items:
    - "{{ JDK_VERSION }}"
- name: Setting default jdk
  include: default_jdk.yml
  vars:
    default_jdk: "{{ JDK_VERSION.0 }}"
And install.yml looks like this:
---
- name: JDK version
  debug:
    msg: "{{ install_jdk }}"
- name: Process JDK details
  set_fact:
    jdk_details: "{{ lookup('vars', JDK_VERSIONS[install_jdk], default='1.8.0_352') }}"
- name: Print JDK version
  debug:
    msg: "{{ jdk_details }}"
- name: Installing JDK from rpm
  yum:
    name: "{{ item }}"
    update_cache: true
    state: installed
  when: jdk_details.installer == "rpm"
- name: Installing JDK from source
  block:
    - name: download jdk tar
      get_url:
        url: "{{ jdk_details.download_url }}"
        dest: /tmp
        mode: 0755
        group: root
        owner: root
    - name: Create jdk installation directory path
      file:
        path: "/opt/data/services/jdks/"
        state: directory
    - name: Untar JDK installation files
      unarchive:
        src: /tmp/{{ jdk_details.package_name }}
        dest: /opt/data/services/jdks
        remote_src: True
  when: jdk_details.installer == "tar"
- name: create symlink for JDK version
  file:
    src: "/opt/data/services/jdks/{{ jdk_details.jdk_dirname }}"
    dest: "{{ jdk_details.sym_link }}"
    state: link
    force: yes
    follow: False
I want to iterate over the list defined in local.yml. Each local.yml list item acts as a key into vars.yml, and vars.yml (or vars/main.yml) maps each key to multiple values.
So, basically, I want to iterate over a nested structure, something like list[dict[list]].
In the example above the key "jdk1.8": ["1.8.0_352", "1.8.0_322"] has two items, so it should install two packages.
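One way to sketch the nested iteration (untested; the variable and file names are taken from the question) is to expand each JDK_VERSION key into its concrete versions with the extract filter and run the install tasks once per version:

```yaml
# Sketch (untested): map each JDK_VERSION key ('jdk1.8', ...) to its
# version list in JDK_VERSIONS, flatten the result, and include the
# install tasks once per concrete version string.
- name: Install Java
  include_tasks: install_jdk.yml
  vars:
    install_jdk: "{{ item }}"
  loop: "{{ JDK_VERSION | map('extract', JDK_VERSIONS) | flatten }}"
```

Inside install_jdk.yml the per-version details could then be fetched with lookup('vars', install_jdk), since each version string is a top-level key in vars.yml.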

how to pass many variables in ansible?

I ran into a problem: I don't know how to pass multiple variables through with_items.
I have these vars:
vars:
  paths:
    - /tmp/tecom/python3/
    - /tmp/tecom/pip/
    - /tmp/tecom/psycopg2/
    - /tmp/tecom/docker/
    - /tmp/tecom/postgresql/
  files:
    - python3.tar
    - pip.tar
    - psycopg2.tar
    - docker.tar
    - postgresql.tar
And I have the task that should extract the archives:
- name: unarchive "{{ item }}"
  unarchive:
    src: "{{ item }}"
    dest: # should take the paths above; every path should match its own `.tar` file
  with_items: "{{ files }}"
Every path should match its own .tar file.
For example,
- debug:
    msg: "unarchive {{ item }} to {{ _path }}"
  loop: "{{ files }}"
  vars:
    _path: "{{ path }}/{{ item|splitext|first }}/"
    path: /tmp/tecom
    files:
      - python3.tar
      - pip.tar
      - psycopg2.tar
      - docker.tar
      - postgresql.tar
gives (abridged)
msg: unarchive python3.tar to /tmp/tecom/python3/
msg: unarchive pip.tar to /tmp/tecom/pip/
msg: unarchive psycopg2.tar to /tmp/tecom/psycopg2/
msg: unarchive docker.tar to /tmp/tecom/docker/
msg: unarchive postgresql.tar to /tmp/tecom/postgresql/
Q: "How do I unarchive those files to the folders?"
A: Use the expressions as appropriate, e.g.
- name: "unarchive {{ item }}"
  unarchive:
    src: "{{ item }}"
    dest: "{{ path }}/{{ item|splitext|first }}/"
  loop: "{{ files }}"
(not tested)
It's up to you where you put the variables files and path. See Variable precedence: Where should I put a variable?

Ansible delete Files with wildcard/regex/glob with exception

I want to delete files based on a wildcard but also add exceptions to the rule.
- hosts: all
  tasks:
    - name: Ansible delete file wildcard
      find:
        paths: /etc/wild_card/example
        patterns: "*.txt"
        use_regex: true
      register: wildcard_files_to_delete
    - name: Ansible remove file wildcard
      file:
        path: "{{ item.path }}"
        state: absent
      with_items: "{{ wildcard_files_to_delete.files }}"
For example I want to except a file named "important.txt". How can I do that?
Just add a when condition to the task that deletes files. E.g., something like:
- name: Ansible remove file wildcard
  file:
    path: "{{ item.path }}"
    state: absent
  when: item.path != '/etc/wild_card/example/important.txt'
  with_items: "{{ wildcard_files_to_delete.files }}"
This will skip a specific file. If you have a list of files to skip you could do this instead:
- name: Ansible remove file wildcard
  file:
    path: "{{ item.path }}"
    state: absent
  when: item.path not in files_to_skip
  with_items: "{{ wildcard_files_to_delete.files }}"
  vars:
    files_to_skip:
      - /etc/wild_card/example/important.txt
      - /etc/wild_card/example/saveme.txt
And if you want to preserve files based on some sort of pattern, you could make use of ansible's match or search tests:
- name: Ansible remove file wildcard
  file:
    path: "{{ item.path }}"
    state: absent
  when: item.path is not search('important.txt')
  with_items: "{{ wildcard_files_to_delete.files }}"
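As a variant, the find module's own excludes parameter can filter matches at search time, so the delete task needs no when condition at all. A sketch, reusing the paths and pattern from the question:

```yaml
# Sketch: excludes culls matches by basename pattern before they
# ever reach the delete task.
- name: Find txt files, excluding important.txt
  find:
    paths: /etc/wild_card/example
    patterns: "*.txt"
    excludes: "important.txt"
  register: wildcard_files_to_delete
```

Note that excludes takes file name patterns, not full paths.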

Pass get_url downloaded zip files as variable to unarchive

I've just started to use Ansible to automate binary deployments.
When I download the zip files and try to unzip them by passing the downloaded file names as a variable to unarchive, an error is always thrown.
Snippet of the YML below:
- name: Download binaries
  get_url:
    url={{ download_server }}
    url_username={{ username }}
    url_password={{ passwd }}
    dest={{ base_dir }}
  register: bin_files
- set_fact:
    my_unzipped_file: "{{ bin_files[0].stdout }}"
- name: UNZIPPING the files
  unarchive: src={{ base_dir }}/{{ item }} dest={{ base_dir }} copy=no
  with_items: my_unzipped_file
If it weren't a user/pass-protected URL, you could drop the get_url module and put the URL directly in the src: of the unarchive module.
Check the examples:
http://docs.ansible.com/ansible/latest/modules/unarchive_module.html
Another way is to download all your files into a directory, {{ bin_dir }} for example, and use with_fileglob with the unarchive module to unzip all .zip/.tar.gz files and such. (Note that with_fileglob matches files on the control node, so this fits when the archives live on the machine running Ansible.)
Example:
- name: UNZIPPING the files
  unarchive:
    src: "{{ item }}"
    dest: "{{ base_dir }}/"
    copy: no
  with_fileglob:
    - "{{ base_dir }}/*.zip"
    - "{{ base_dir }}/*.tar.gz"
Another tip: IMHO you should drop the '=' style in modules and move to ':', as you can see above; it's more human-readable.
Your corrected snippet:
- name: Download binaries
  get_url:
    url: "{{ download_server }}"
    url_username: "{{ username }}"
    url_password: "{{ passwd }}"
    dest: "{{ base_dir }}"
  register: bin_files
- name: UNZIPPING the files
  unarchive:
    src: "{{ item }}"
    dest: "{{ base_dir }}"
    copy: no
  with_items:
    - "{{ bin_files.dest }}"
(get_url registers the saved file's full path as bin_files.dest; there is no stdout in its result.)

Can the templates module handle multiple templates / directories?

I believe the Ansible copy module can take a whole bunch of files and copy them in one hit, by copying a directory recursively.
Can the Ansible template module take a whole bunch of "templates" and deploy them in one hit? Is there such a thing as deploying a folder of templates and applying them recursively?
The template module itself runs the action on a single file, but you can use with_filetree to loop recursively over a specified path:
- name: Ensure directory structure exists
  ansible.builtin.file:
    path: '{{ templates_destination }}/{{ item.path }}'
    state: directory
  with_community.general.filetree: '{{ templates_source }}'
  when: item.state == 'directory'
- name: Ensure files are populated from templates
  ansible.builtin.template:
    src: '{{ item.src }}'
    dest: '{{ templates_destination }}/{{ item.path }}'
  with_community.general.filetree: '{{ templates_source }}'
  when: item.state == 'file'
And for templates in a single directory you can use with_fileglob.
This answer provides a working example of the approach laid down by @techraf.
with_fileglob expects only files to live within the templates folder and will only parse files directly in that folder - see https://serverfault.com/questions/578544/deploying-a-folder-of-template-files-using-ansible
with_filetree parses all files in the templates folder and nested directories, maintains the directory structure when moving the template files to dest, and auto-creates those directories at dest.
- name: Approve certs server directories
  file:
    state: directory
    dest: '~/{{ item.path }}'
  with_filetree: '../templates'
  when: item.state == 'directory'
- name: Approve certs server files
  template:
    src: '{{ item.src }}'
    dest: '~/{{ item.path }}'
  with_filetree: '../templates'
  when: item.state == 'file'
Essentially, think of this approach as copying and pasting a directory and all its contents from A to B and whilst doing so, parsing all templates.
I could not manage to do it with the other answers. This is what worked for me:
- name: Template all the templates and place them in the corresponding path
  template:
    src: "{{ item.src }}"
    dest: "{{ destination_path }}/{{ item.path | regex_replace('\\.j2$', '') }}"
    force: yes
  with_filetree: '{{ role_path }}/templates'
  when: item.state == 'file'
In my case the folder contains both plain files and Jinja2 templates.
- name: copy all directories recursively
  file: dest={{ templates_dest_path }}/{{ item | replace(templates_src_path + '/', '') }} state=directory
  with_items: "{{ lookup('pipe', 'find ' + templates_src_path + '/ -type d').split('\n') }}"
- name: copy all files recursively
  copy: src={{ item }} dest={{ templates_dest_path }}/{{ item | replace(templates_src_path + '/', '') }}
  with_items: "{{ lookup('pipe', 'find ' + templates_src_path + '/ -type f -not -name *.j2').split('\n') }}"
- name: copy templates files recursively
  template: src={{ item }} dest={{ templates_dest_path }}/{{ item | replace(templates_src_path + '/', '') | replace('.j2', '') }}
  with_items: "{{ lookup('pipe', 'find ' + templates_src_path + '/*.j2 -type f').split('\n') }}"
I did it and it worked. \o/
- name: "Create file template"
  template:
    src: "{{ item.src }}"
    dest: "{{ your_dir_remoto }}/{{ item.dest }}"
  loop:
    - { src: '../templates/file1.yaml.j2', dest: 'file1.yaml' }
    - { src: '../templates/file2.yaml.j2', dest: 'file2.yaml' }
