Ansible create tar.gz with archive module from a list

I'm trying to find files older than one day in Ansible and then create a tar.gz of those files. I've tried the archive module, but it creates a tar containing only the last element of the list. Is there any way to create a tar.gz that includes all the files?
Below is my script:
- name: Check all files
  find:
    paths: /myfiles
    file_type: file
    age: 1
    age_stamp: mtime
  register: files
  failed_when: files.matched < 10

- name: Remove previous tarFile
  file:
    path: /tmp/test.tar.gz
    state: absent

- name: compress all the files in tar.gz
  archive:
    path: "{{ item.path }}"
    dest: /tmp/test.tar.gz
    format: gz
  with_items: "{{ files.files }}"

it is creating only a tar of the last element in the list
It is creating and overwriting /tmp/test.tar.gz for each file in the loop. When you check, you see only the last one.
If you look at the archive module docs, you will see:
path: Remote absolute path, glob, or list of paths or globs for the file or files to compress or archive.
So you can provide the list of files as a value for the path parameter:
- name: compress all the files in tar.gz
  archive:
    path: "{{ files_to_archive }}"
    dest: /tmp/test.tar.gz
    format: gz
  vars:
    files_to_archive: "{{ files.files | map(attribute='path') | list }}"

I have created multiple zip files using the method below.
- name: 'Create zip archive of {{ date_input }} NMON HTML files'
  archive:
    path: /tmp/{{ inventory_hostname }}_*.html
    dest: /tmp/NMON-{{ inventory_hostname }}.zip
    format: zip
  when: option == "5"
  delegate_to: localhost

Related

Ansible playbook for unzipping GZ and ZIP files

I have an integration where I download one or more ZIP files. Within those ZIP files, there are dozens of GZ files that also need to be uncompressed. Below is an example of the file structure:
metrics.zip
-> 239238923323.gz
-> 839389239232.gz
-> 928392892839.gz
metrics-001.zip
-> 29389238923.gz
-> 39828393822.gz
-> 09320930323.gz
(etc)
I was struggling to write the playbook needed to loop through the ZIP file(s), then through all of the GZ files, and uncompress them all.
Created an answer from the author's original post:
- hosts: localhost
  gather_facts: no
  tasks:
    # Download the report into a temporary directory on the Ansible Playbook host
    - name: Create temporary directory
      ansible.builtin.tempfile:
        state: directory
        suffix: unique_suffix
      register: temp_dir

    - name: Download File
      ansible.builtin.get_url:
        url: "https://path.to/file/download.zip"
        dest: "{{ temp_dir.path }}/download_file_name.zip"

    # Unzip the ZIP files
    - name: Extract all ZIP files
      ansible.builtin.unarchive:
        src: "{{ item }}"
        dest: "{{ temp_dir.path }}"
      with_fileglob:
        - "{{ temp_dir.path }}/*.zip"

    # Unzip the GZ files
    - name: Extract all of the GZ files
      ansible.builtin.command: find "{{ temp_dir.path }}" -name '*.gz' -exec gzip -d {} \;

    - name: Merge CSVs into a single file
      ansible.builtin.assemble:
        src: "{{ temp_dir.path }}"
        dest: "{{ temp_dir.path }}/extract.x"
        regexp: '\.csv$'

# Use a Windows host to copy the file over to a file share (copying from Linux to a
# Windows file share requires mounting the FS; using a Windows host is easier)
- hosts: "{{ windows_host_name_in_inventory }}"
  tasks:
    - name: Copy Extract file to File Share
      win_copy:
        src: "{{ hostvars['localhost']['temp_dir'].path }}/extract.x"
        dest: "{{ Extract_To }}\\unique_name.csv"

# Remove the temporary folder
- hosts: localhost
  tasks:
    - name: Remove Temporary Directory
      ansible.builtin.file:
        path: "{{ hostvars['localhost']['temp_dir'].path }}"
        state: absent
      when: hostvars['localhost']['temp_dir'].path is defined

Ansible: archive files into a parent path

Below is the folder structure
playbook
|- groups_Vars
|- host
|- roles
|  |- archive-artifact
|     |- task
|        |- main.yml
|- archive-playbook.yml
myfile
In my main.yml, I need to archive the playbook in playbook.tar.gz.
- archive:
    path: "<earlier path>/playbook/*"
    dest: "<earlier path>/playbook.tar.gz"
    format: gz
The folder that holds the playbook is accessible via the special variable playbook_dir.
Getting the parent directory of a file or directory in Ansible is possible via the dirname filter.
And, as pointed out in the documentation, path can be either a single element or a list of elements, so you could also include myfile in that list.
So, to archive the playbook directory in the parent folder of the playbook directory, one could do:
- archive:
    path:
      - "{{ playbook_dir }}/*"
      - "{{ playbook_dir | dirname }}/myfile"
    dest: "{{ playbook_dir | dirname }}/playbook.tar.gz"
    format: gz

Need assistance with an Ansible playbook for compressing files and excluding files based on a certain condition

I have created an Ansible playbook which finds files older than 7 days. I'm trying to exclude files from a particular directory, as well as files that are already zipped, but I'm not able to exclude them.
tasks:
  - name: Find files ending with extensions
    become: true
    find:
      path:
        - /home/test/replicate/vol/gunnsc01/prod/scoreout
      recurse: yes
      file_type: any
      age: 7d
      age_stamp: mtime
      excludes: "delete"
    register: output

  - name: archive the files
    become: yes
    archive:
      remove: yes
      path: "{{ item.path }}"
      dest: "{{ item.path }}-{{ ansible_date_time.date.replace('-','') }}.gz"
      format: zip
    with_items: "{{ output.files }}"
I am not able to exclude a specific directory. I have tried exclude, excludes, exclude_path, exclude_paths and exclude_patterns, however I am not able to exclude the files under the delete directory.
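One possible workaround, as a minimal sketch (this is an assumption, not from the original thread, and it assumes the directory to skip is literally named delete): since the find module's excludes patterns match item basenames, files inside a delete directory may still be returned, so you can instead filter the registered results before archiving, for example with rejectattr:

- name: archive the files, skipping anything under "delete" and already-zipped files
  become: yes
  archive:
    remove: yes
    path: "{{ item.path }}"
    dest: "{{ item.path }}-{{ ansible_date_time.date.replace('-','') }}.zip"
    format: zip
  with_items: >-
    {{ output.files
       | rejectattr('path', 'search', '/delete/')
       | rejectattr('path', 'search', '\.zip$')
       | list }}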

Ansible: how to rename a file when copying it

I have a task where I should copy a file from its source to its destination, renaming it at the destination.
My task looks like this:
- name: Go to the target folder
  shell: ls
  args:
    chdir: "{{ pathTest }}/target"
  register: resultLS

- debug:
    msg: "{{ resultLS }}"

- name: copy jar file
  copy:
    src: "{{ resultLS.stdout }}"
    dest: "{{ pathTest }}"
    mode: 0777
But like this it copies the jar file with the same name; my goal is to rename it at the destination (ideally with the copy action).
Ideas?
Rename it to: renamed.jar
Here you are:
- name: Ensure the first matched file from {{ pathTest }}/target is present on the target
  copy:
    src: "{{ lookup('fileglob', pathTest + '/target/*') | first }}"
    dest: "{{ pathTest }}/renamed.jar"
    mode: 0777
Remarks:
Don't parse ls output!
Think about how you should handle multiple files; in the example above, only the first one is copied.
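If every matched file should be copied rather than just the first one, a minimal sketch (an assumption on my part, keeping each file's own basename, since a single fixed target name no longer makes sense with several sources):

- name: Copy every jar from {{ pathTest }}/target, keeping each original basename
  copy:
    src: "{{ item }}"
    dest: "{{ pathTest }}/{{ item | basename }}"
    mode: 0777
  with_fileglob:
    - "{{ pathTest }}/target/*.jar"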

Ansible: copy a directory's content to another directory

I am trying to copy the content of dist directory to nginx directory.
- name: copy html file
  copy: src=/home/vagrant/dist/ dest=/usr/share/nginx/html/
But when I execute the playbook it throws an error:
TASK [NGINX : copy html file] **************************************************
fatal: [172.16.8.200]: FAILED! => {"changed": false, "failed": true, "msg": "attempted to take checksum of directory:/home/vagrant/dist/"}
How can I copy a directory that has another directory and a file inside?
You could use the synchronize module. The example from the documentation:
# Synchronize two directories on one remote host.
- synchronize:
    src: /first/absolute/path
    dest: /second/absolute/path
  delegate_to: "{{ inventory_hostname }}"
This has the added benefit that it will be more efficient for large/many files.
EDIT: This solution worked when the question was posted. Later Ansible deprecated recursive copying with remote_src
The Ansible copy module by default copies files/dirs from the control machine to the remote machine. If you want to copy files/dirs within the remote machine and you have Ansible 2.0, set remote_src to yes:
- name: copy html file
  copy: src=/home/vagrant/dist/ dest=/usr/share/nginx/html/ remote_src=yes directory_mode=yes
To copy a directory's content to another directory you CAN use Ansible's copy module:
- name: Copy content of directory 'files'
  copy:
    src: files/ # note the '/' <-- !!!
    dest: /tmp/files/
From the docs about the src parameter:
If (src!) path is a directory, it is copied recursively...
... if path ends with "/", only inside contents of that directory are copied to destination.
... if it does not end with "/", the directory itself with all contents is copied.
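To make the difference concrete, a minimal sketch (the paths are placeholders, not from the original question):

# trailing slash: only the contents of files/ end up in /tmp/files/
- copy:
    src: files/
    dest: /tmp/files/

# no trailing slash: the directory itself is copied, ending up as /tmp/files/files/
- copy:
    src: files
    dest: /tmp/files/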
Resolved answer:
To copy a directory's content to another directory I use the following:
- name: copy consul_ui files
  command: cp -r /home/{{ user }}/dist/{{ item }} /usr/share/nginx/html
  with_items:
    - "index.html"
    - "static/"
It copies both items to the other directory. In the example, one of the items is a directory and the other is not. It works perfectly.
The simplest solution I've found to copy the contents of a folder without copying the folder itself is to use the following:
- name: Move directory contents
  command: cp -r /<source_path>/. /<dest_path>/
This resolves #surfer190's follow-up question:
Hmmm what if you want to copy the entire contents? I noticed that * doesn't work – surfer190 Jul 23 '16 at 7:29
* is a shell glob: it relies on your shell to enumerate all the files within the folder before running cp, while . directly instructs cp to copy the directory contents (see https://askubuntu.com/questions/86822/how-can-i-copy-the-contents-of-a-folder-to-another-folder-in-a-different-directo)
Ansible remote_src does not support recursive copying. See the remote_src description in the Ansible copy docs.
To recursively copy the contents of a folder and to make sure the task stays idempotent I usually do it this way:
- name: get file names to copy
  command: "find /home/vagrant/dist -type f"
  register: files_to_copy

- name: copy files
  copy:
    src: "{{ item }}"
    dest: "/usr/share/nginx/html"
    owner: nginx
    group: nginx
    remote_src: True
    mode: 0644
  with_items:
    - "{{ files_to_copy.stdout_lines }}"
The downside is that the find command still shows up as 'changed'.
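That can be suppressed by marking the task as never changed; a small sketch of the same find task:

- name: get file names to copy
  command: "find /home/vagrant/dist -type f"
  register: files_to_copy
  changed_when: false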
The Ansible doc is quite clear (https://docs.ansible.com/ansible/latest/collections/ansible/builtin/copy_module.html); for the parameter src it says the following:
Local path to a file to copy to the remote server.
This can be absolute or relative.
If path is a directory, it is copied recursively. In this case, if path ends with "/",
only inside contents of that directory are copied to destination. Otherwise, if it
does not end with "/", the directory itself with all contents is copied. This behavior
is similar to the rsync command line tool.
So what you need is to skip the / at the end of your src path.
- name: copy html file
  copy: src=/home/vagrant/dist dest=/usr/share/nginx/html/
I found a workaround for recursive copying from remote to remote:
- name: List files in /usr/share/easy-rsa
  find:
    path: /usr/share/easy-rsa
    recurse: yes
    file_type: any
  register: find_result

- name: Create the directories
  file:
    path: "{{ item.path | regex_replace('/usr/share/easy-rsa','/etc/easy-rsa') }}"
    state: directory
    mode: "{{ item.mode }}"
  with_items:
    - "{{ find_result.files }}"
  when:
    - item.isdir

- name: Copy the files
  copy:
    src: "{{ item.path }}"
    dest: "{{ item.path | regex_replace('/usr/share/easy-rsa','/etc/easy-rsa') }}"
    remote_src: yes
    mode: "{{ item.mode }}"
  with_items:
    - "{{ find_result.files }}"
  when:
    - item.isdir == False
I got involved for a whole day too, and finally found the solution with a shell command instead of copy: or command:, as below:
- hosts: remote-server-name
  gather_facts: no
  vars:
    src_path: "/path/to/source/"
    des_path: "/path/to/dest/"
  tasks:
    - name: Ansible copy files remote to remote
      shell: 'cp -r {{ src_path }}/. {{ des_path }}'
Strictly note:
1. src_path and des_path end with the / symbol.
2. In the shell command, src_path ends with ., which selects all the content of the directory.
3. I used my remote-server-name both in hosts: and in the Execute Shell section of Jenkins, instead of the remote_src: specifier in the playbook.
I guess it is good advice to run the below command in the Execute Shell section in Jenkins:
ansible-playbook copy-payment.yml -i remote-server-name
The below worked for me:
- name: Upload html app directory to Deployment host
  copy: src=/var/lib/jenkins/workspace/Demoapp/html dest=/var/www/ directory_mode=yes
This I found an ideal solution for copying a file from the Ansible server to a remote host.
Copying yaml file:
- hosts: localhost
  user: "{{ user }}"
  connection: ssh
  become: yes
  gather_facts: no
  tasks:
    - name: Creation of directory on remote server
      file:
        path: /var/lib/jenkins/.aws
        state: directory
        mode: 0755
      register: result

    - debug:
        var: result

    - name: get file names to copy
      command: "find conf/.aws -type f"
      register: files_to_copy

    - name: copy files
      copy:
        src: "{{ item }}"
        dest: "/var/lib/jenkins/.aws"
        owner: "{{ user }}"
        group: "{{ group }}"
        remote_src: True
        mode: 0644
      with_items:
        - "{{ files_to_copy.stdout_lines }}"
How to copy a directory with its sub-directories and files from the Ansible server to a remote host:
- name: copy nmonchart39 directory to {{ inventory_hostname }}
  copy:
    src: /home/ansib.usr.srv/automation/monitoring/nmonchart39
    dest: /var/nmon/data
Where:
- copy entire directory: src: /automation/monitoring/nmonchart39
- copy directory contents only: src: nmonchart39/
