Copy multiple files with Ansible

How can I copy more than a single file into remote nodes by Ansible in a task?
I've tried to duplicate the copy module line in my task to define files but it only copies the first file.

You can use the with_fileglob loop for this:
- copy:
    src: "{{ item }}"
    dest: /etc/fooapp/
    owner: root
    mode: "0600"
  with_fileglob:
    - "/playbooks/files/fooapp/*"

- name: copy multiple items
  copy:
    src: "{{ item.src }}"
    dest: "{{ item.dest }}"
  loop:
    - src: containerizers
      dest: /etc/mesos/containerizers
    - src: another_file
      dest: /etc/somewhere
    - src: dynamic
      dest: "{{ var_path }}"

Since Ansible 2.5 the with_* constructs are no longer recommended; the loop syntax should be used instead. A simple practical example:
- name: Copy CA files
  copy:
    src: '{{ item }}'
    dest: '/etc/pki/ca-trust/source/anchors'
    owner: root
    group: root
    mode: "0644"
  loop:
    - symantec-private.crt
    - verisignclass3g2.crt

You can use with_together for this purpose:
- name: Copy multiple files to multiple directories
  copy: src={{ item.0 }} dest={{ item.1 }}
  with_together:
    - [ 'file1', 'file2', 'file3' ]
    - [ '/dir1/', '/dir2/', '/dir3/' ]

If you need more than one destination, you need more than one task. One copy task can copy only from one location (possibly multiple files) to a single destination on the node.
- copy: src=/file1 dest=/destination/file1
- copy: src=/file2 dest=/destination/file2

# copy each file over that matches the given pattern
- copy: src={{ item }} dest=/destination/
  with_fileglob:
    - /files/*

You can use the find module, and then copy the files it returns:
---
- hosts: lnx
  tasks:
    - find:
        paths: /appl/scripts/inq
        recurse: true
        patterns: "inq.Linux*"
      register: files_to_copy

    - copy:
        src: "{{ item.path }}"
        dest: /usr/local/sbin/
        owner: root
        mode: "0775"
        remote_src: true # the paths returned by find exist on the remote host
      loop: "{{ files_to_copy.files }}"

Or you can use with_items:
- copy:
    src: "{{ item }}"
    dest: /etc/fooapp/
    owner: root
    mode: "0600"
  with_items:
    - dest_dir

- name: find inq.Linux*
  find: paths="/appl/scripts/inq" recurse=yes patterns="inq.Linux*"
  register: find_files

- name: set fact
  set_fact:
    all_files: "{{ find_files.files | map(attribute='path') | list }}"
  when: find_files.matched > 0

- name: copy files
  copy:
    src: "{{ item }}"
    dest: /destination/
  with_items: "{{ all_files }}"
  when: find_files.matched > 0

The copy module is the wrong tool for copying many files and/or a directory structure; use the synchronize module instead, which uses rsync as its backend. Mind you, it requires rsync to be installed on both the controller and the target host. It's really powerful; check the Ansible documentation.
Example - copy files from build directory (with subdirectories) of controller to /var/www/html directory on target host:
synchronize:
  src: ./my-static-web-page/build/
  dest: /var/www/html
  rsync_opts:
    - "--chmod=D2755,F644" # copy from windows - force permissions
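Since synchronize needs rsync on both ends, you may want to ensure the package is present before the sync runs; a minimal sketch, assuming the package is simply named rsync on your distribution:

```yaml
# Hedged sketch: install rsync on the target before using synchronize.
# The package name "rsync" is an assumption; adjust for your distro.
- name: Ensure rsync is installed
  package:
    name: rsync
    state: present
  become: true
```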

You can loop through a variable holding a list of directories:
- name: Copy files from several directories
  copy:
    src: "{{ item }}"
    dest: "/etc/fooapp/"
    owner: root
    mode: "0600"
  loop: "{{ files }}"
  vars:
    files:
      - "dir1/"
      - "dir2/"

Use the following playbook to copy multiple files to your client machines.
- name: Copy data to the client machine
  hosts: hostname
  become_method: sudo
  become_user: root
  become: true
  tasks:
    # Copy twice as sometimes files get skipped (mostly only one file skipped from a folder if the folder does not exist)
    - name: Copy UFO-Server
      copy:
        src: "source files path"
        dest: "destination file path"
        owner: root
        group: root
        mode: "0644"
        backup: yes
      ignore_errors: true
Note:
If you are passing multiple paths using a variable, then:
src: "/root/{{ item }}"
If you are passing the path using a variable with different items, then:
src: "/root/{{ item.source_path }}"
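The second form in the note can be sketched as a loop over dicts; the source_path/target_path keys and file names below are illustrative, not from the original answer:

```yaml
# Illustrative sketch: each item carries its own source and destination.
- name: Copy several files described by variables
  copy:
    src: "/root/{{ item.source_path }}"
    dest: "{{ item.target_path }}"
  loop:
    - { source_path: app.conf, target_path: /etc/fooapp/app.conf }
    - { source_path: app.env, target_path: /etc/fooapp/app.env }
```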

Copy files from multiple directories to multiple directories with Ansible
I found guenhter's answer helpful but needed to change the remote files' mode as well. I don't have enough reputation to put this as a comment, which would be a more appropriate place for it. In the example, I copy two files from two directories into /tmp and /tmp/bin, which I create first, and modify the remote files' mode.
- name: cpldupd
  hosts: test
  remote_user: root
  become: true
  vars:
    rpth: /tmp
  tasks:
    - name: Create '{{ rpth }}/bin'
      file:
        path: '{{ rpth }}/bin'
        state: directory

    - name: Transfer
      copy: src={{ item.src }} dest={{ item.dest }} mode=0775
      with_items:
        - { src: '../utils/cpldupd', dest: '{{ rpth }}/cpldupd' }
        - { src: '../utils/bin/cpldupd', dest: '{{ rpth }}/bin/cpldupd' }

Here is a generic solution for copying files:
...
- name: Find files you want to move
  ansible.builtin.find:
    paths: /path/to/files/
    file_type: file
    excludes: "*.txt" # Whatever pattern you want to exclude
  register: files_output

- name: Copy the files
  ansible.builtin.copy:
    src: "{{ item.path }}"
    dest: /destination/directory/
  loop: "{{ files_output.files }}"
...
This is more powerful than using with_fileglob, as find can also match using regular expressions (set use_regex: true). Here is this play in action:
$ ls /path/to/files
demo.yaml test.sh ignore.txt
$ ls /destination/directory
file.h
$ ansible-playbook playbook.yaml
...[some output]...
$ ls /destination/directory
file.h demo.yaml test.sh
As you can see from the above example, ignore.txt was not copied over to the destination directory because of the excludes pattern in the playbook. Excluding files like this is not possible when simply using with_fileglob.
Additionally, you can move files from multiple directories with relative ease:
...
- name: Find files you want to move
  ansible.builtin.find:
    paths: /path/to/files/
    # ... the rest of the task
  register: list1

- name: Find more files you want to move
  ansible.builtin.find:
    paths: /different/path/
    # ... the rest of the task
  register: list2

- name: Copy the files
  ansible.builtin.copy:
    src: "{{ item.path }}"
    dest: /destination/directory/
  loop: "{{ list1.files + list2.files }}"
...

Here is a sample Ansible script to copy multiple files to remote hosts:
- name: Copy Multiple Files on remote Hosts
  ansible.windows.win_copy:
    src: "{{ srcPath }}/{{ item }}" # Remember to use {{ item }} as a postfix to the source path
    dest: "{{ destPath }}"
    remote_src: yes # if the source path is available on the remote host
  with_items:
    - abc.txt
    - abc.properties

- hosts: test
  gather_facts: false
  become: true
  vars:
    path: '/home/ansibm/playbooks'
    remote_path: '/home/{{ ansible_ssh_user }}'
    dir: 'yml_files'
  tasks:
    - name: "creating directory for backup file"
      file:
        path: '{{ remote_path }}/{{ dir }}'
        state: directory
        owner: '{{ ansible_ssh_user }}'
        group: '{{ ansible_ssh_user }}'
        mode: "0700"

    - name: "copying yml files"
      copy:
        src: '{{ item }}'
        dest: '{{ remote_path }}/{{ dir }}'
        owner: '{{ ansible_ssh_user }}'
        group: '{{ ansible_ssh_user }}'
        mode: "0644"
      loop:
        - '{{ path }}/ab.html'
        - '{{ path }}/cp.yml'

Related

Ansible playbook for unzipping GZ and ZIP files

I have an integration where I download one or more ZIP files. Within those ZIP files, there are dozens of GZ files that also need to be uncompressed. Below is an example of the file structure:
metrics.zip
-> 239238923323.gz
-> 839389239232.gz
-> 928392892839.gz
metrics-001.zip
-> 29389238923.gz
-> 39828393822.gz
-> 09320930323.gz
(etc)
I was struggling to write the playbook needed to loop through the ZIP file(s), then all of the GZ files and uncompress them all.
Created an answer from the author's original post:
- hosts: localhost
  gather_facts: no
  tasks:
    # Download the report into a temporary directory on the Ansible Playbook host
    - name: Create temporary directory
      ansible.builtin.tempfile:
        state: directory
        suffix: unique_suffix
      register: temp_dir

    - name: Download File
      ansible.builtin.get_url:
        url: "https://path.to/file/download.zip"
        dest: "{{ temp_dir.path }}/download_file_name.zip"

    # Unzip the ZIP files
    - name: Extract all ZIP files
      ansible.builtin.unarchive:
        src: "{{ item }}"
        dest: "{{ temp_dir.path }}"
      with_fileglob:
        - "{{ temp_dir.path }}/*.zip"

    # Unzip the GZ files
    - name: Extract all of the GZ files
      ansible.builtin.command: find "{{ temp_dir.path }}" -name '*.gz' -exec gzip -d {} \;

    - name: Merge CSVs into a single file
      ansible.builtin.assemble:
        src: "{{ temp_dir.path }}"
        dest: "{{ temp_dir.path }}/extract.x"
        regexp: '\.csv$'

# Use a Windows host to copy the file over to a file share (copying from Linux to a Windows file share requires mounting the FS. Using a Windows host is easier)
- hosts: "{{ windows_host_name_in_inventory }}"
  tasks:
    - name: Copy Extract file to File Share
      win_copy:
        src: "{{ hostvars['localhost']['temp_dir'].path }}/extract.x"
        dest: "{{ Extract_To }}\\unique_name.csv"

# Remove the Temporary folder
- hosts: localhost
  tasks:
    - name: Remove Temporary Directory
      ansible.builtin.file:
        path: "{{ hostvars['localhost']['temp_dir'].path }}"
        state: absent
      when: hostvars['localhost']['temp_dir'].path is defined

How to define multiple with_items using registered variables in Ansible

I am in the process of achieving the below list of tasks; could someone please rectify the playbook or suggest a way to get this done?
High level purpose of the activity is below:
Find the previous day's log files in multiple paths and archive them under a date-wise folder (which has to be created for the particular date) in a different path.
My approach is:
Create a date-wise directory, then search for the previous day's log files, copy them into the newly created directory, and then archive it.
I am having an issue when defining the paths and variables in the copy section. Can someone help with this?
- name: Purge old spider logs
  become: true
  hosts: node1
  vars:
    date: "{{ lookup('pipe', 'date +%Y-%m-%d') }}"
  tasks:
    - name: create a directory
      file:
        path: /path/{{ date }}
        state: directory
        mode: '777'
      register: logdir

    - name: Find log files
      find:
        path: /test/logs
        age: 3600
        patterns:
          - "name.log.*"
        recurse: yes
      register: testlogs

    - debug:
        var: testlogs.path

    - debug:
        var: item.files
      with_items: '{{ testlogs.files }}'

    - name: Copy files in to backup location
      copy:
        src: "{{ item.files }}"
        dest: "{{ item.path }}"
      with_items:
        - '{{ item.files.testlog.files }}'
        - '{{ item.path.logdir.path }}'
If I understand your problem correctly, you want to copy all remote log files to another destination under a dated folder:
- name: Purge old spider logs
  become: true
  hosts: node1
  vars:
    date: "{{ lookup('pipe', 'date +%Y-%m-%d') }}"
  tasks:
    - name: create a remote directory
      file:
        path: /path/{{ date }}
        state: directory
        mode: '777'
      register: logdir

    - name: Find log files
      find:
        path: logs
        age: 3600
        patterns:
          - "name.log.*"
        recurse: yes
      register: testlogs

    - name: Copy (remote) files in to backup location (remote)
      copy:
        remote_src: yes
        src: "{{ item.path }}"
        dest: "{{ logdir.path }}/"
      with_items:
        - '{{ testlogs.files }}'

How to capture and display filename

I used the below Ansible playbook to capture the filename, but I'm facing an error:
tasks:
  - copy:
      src: /tmp/example.tar.gz
      dest: /opt/
    register: filename

  - debug: var=filename
I need to capture the filename and have it displayed on the console.
According to your example of copying files to remote locations, the filename seems to be already known.
If you would like to set host variable(s) and fact(s), you could use a construct like:
- name: Set fact
  set_fact:
    FILENAME: "example.tar.gz"

- name: Show fact
  debug:
    msg: "{{ FILENAME }}"

- name: Copy file {{ FILENAME }}
  copy:
    src: /tmp/{{ FILENAME }}
    dest: /opt
If you are interested in registering the return values of the copy module you could use
- name: Copy file
  copy:
    src: /tmp/example.tar.gz
    dest: /opt
  register: result

- name: Show copied file
  debug:
    msg: "{{ result.dest }}"

How do you use Ansible win_find and fetch to copy a group of files from remote Windows to the local Ansible controller?

What's the right way to copy a group of files from a Windows system to the ansible controller?
I can find the files, but I don't know how to reference the registered variable data to locate the path to hand to fetch.
- win_find: paths="C:\\ADirectory" recurse=no patterns="*.log"
  register: file_to_copy

- name: copy files
  fetch: src="{{ item }}" dest=output
  with_items: files_to_copy.files.path
You need to iterate over a list, and it is files that is the list in the output of win_find, not path.
This should work for you:
- name: copy files
  fetch: src="{{ item.path }}" dest=output
  with_items: "{{ files_to_copy.files }}"
This seems to work
- name: copy files
  fetch:
    src: "{{ item.path }}"
    dest: output/
    flat: yes
  with_items: "{{ files_to_copy.files }}"
- name: Copy files
  win_copy:
    remote_src: yes
    src: "{{ item.path }}"
    dest: \\Xxx\\XXX
  with_items: "{{ files_matched.files }}"
  become: yes
  become_method: runas
I used this to copy matched files across a file share.

Ansible: How to delete files and folders inside a directory?

The below code only deletes the first file it gets inside the web dir. I want to remove all the files and folders inside the web directory and retain the web directory. How can I do that?
- name: remove web dir contents
  file: path='/home/mydata/web/{{ item }}' state=absent
  with_fileglob:
    - /home/mydata/web/*
Note: I've tried rm -rf using command and shell, but they don't work. Perhaps I am using them wrongly.
Any help in the right direction will be appreciated.
I am using ansible 2.1.0.0
- name: Delete content & directory
  file:
    state: absent
    path: /home/mydata/web/
Note: this will delete the directory too.
To remove the directory (basically a copy of https://stackoverflow.com/a/38201611/1695680): Ansible does this operation with rmtree under the hood.
- name: remove files and directories
  file:
    state: "{{ item }}"
    path: "/srv/deleteme/"
    owner: 1000 # set your owner, group, and mode accordingly
    group: 1000
    mode: '0777'
  with_items:
    - absent
    - directory
If you don't have the luxury of removing the whole directory and recreating it, you can scan it for files (and directories) and delete them one by one, which will take a while. You probably want to make sure you have pipelining enabled in the [ssh_connection] section of your ansible.cfg:
[ssh_connection]
pipelining = True
- block:
    - name: 'collect files'
      find:
        paths: "/srv/deleteme/"
        hidden: True
        recurse: True
        # file_type: any # Added in ansible 2.3
      register: collected_files

    - name: 'collect directories'
      find:
        paths: "/srv/deleteme/"
        hidden: True
        recurse: True
        file_type: directory
      register: collected_directories

    - name: remove collected files and directories
      file:
        path: "{{ item.path }}"
        state: absent
      with_items: >
        {{
          collected_files.files
          + collected_directories.files
        }}
Using the shell module (idempotent too):
- shell: /bin/rm -rf /home/mydata/web/*
If there are dot/hidden files:
- shell: /bin/rm -rf /home/mydata/web/* /home/mydata/web/.*
The cleanest solution, if you don't care about creation date and owner/permissions:
- file: path=/home/mydata/web state=absent
- file: path=/home/mydata/web state=directory
I really didn't like the rm solution; also, Ansible gives you warnings about using rm.
Here is how to do it without the need for rm and without Ansible warnings.
- hosts: all
  tasks:
    - name: Ansible delete file glob
      find:
        paths: /etc/Ansible
        patterns: "*.txt"
      register: files_to_delete

    - name: Ansible remove file glob
      file:
        path: "{{ item.path }}"
        state: absent
      with_items: "{{ files_to_delete.files }}"
source: http://www.mydailytutorials.com/ansible-delete-multiple-files-directories-ansible/
Try the below tasks; they should work:
- shell: ls -1 /some/dir
  register: contents

- file: path=/some/dir/{{ item }} state=absent
  with_items: "{{ contents.stdout_lines }}"
That's what I come up with:
- name: Get directory listing
  find:
    path: "{{ directory }}"
    file_type: any
    hidden: yes
  register: directory_content_result

- name: Remove directory content
  file:
    path: "{{ item.path }}"
    state: absent
  with_items: "{{ directory_content_result.files }}"
  loop_control:
    label: "{{ item.path }}"
First, we get the directory listing with find, setting:
- file_type to any, so we don't miss nested directories and links
- hidden to yes, so we don't skip hidden files
Also, do not set recurse to yes: it is not only unnecessary, it may increase execution time.
Then we go through that list with the file module. Its output is a bit verbose, so loop_control.label will help us limit the output (I found this advice here).
But I found previous solution to be somewhat slow, since it iterates through the content, so I went with:
- name: Get directory stats
  stat:
    path: "{{ directory }}"
  register: directory_stat

- name: Delete directory
  file:
    path: "{{ directory }}"
    state: absent

- name: Create directory
  file:
    path: "{{ directory }}"
    state: directory
    owner: "{{ directory_stat.stat.pw_name }}"
    group: "{{ directory_stat.stat.gr_name }}"
    mode: "{{ directory_stat.stat.mode }}"
- get the directory properties with stat
- delete the directory
- recreate the directory with the same properties
That was enough for me, but you can add other attributes as well, if you want.
Using a fileglob will also work. There were some syntax errors in the code you posted; I have modified and tested it, and this should work:
- name: remove web dir contents
  file:
    path: "{{ item }}"
    state: absent
  with_fileglob:
    - "/home/mydata/web/*"
Following up on the most upvoted answer here (which I cannot edit since the edit queue is full):
- name: Delete content & directory
  file:
    state: absent
    path: /home/mydata/web/

- name: Re-create the directory
  file:
    state: directory
    path: /home/mydata/web/
While Ansible is still debating to implement state = empty
https://github.com/ansible/ansible-modules-core/issues/902
vars:
  my_folder: "/home/mydata/web/"
  empty_path: "/tmp/empty"

- name: "Create empty folder for wiping."
  file:
    path: "{{ empty_path }}"
    state: directory

- name: "Wipe clean {{ my_folder }} with empty folder hack."
  synchronize:
    mode: push
    # note the trailing slash here
    src: "{{ empty_path }}/"
    dest: "{{ my_folder }}"
    recursive: yes
    delete: yes
  delegate_to: "{{ inventory_hostname }}"
Note though, with synchronize you should be able to sync your files (with delete) properly anyway.
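That direct approach can be sketched like this (the paths are illustrative); with delete: yes, the remote directory becomes an exact mirror of the local one in a single task:

```yaml
# Illustrative sketch: push local content and prune remote files
# that no longer exist locally.
- name: Sync web content and remove stale remote files
  synchronize:
    src: ./my-web-content/ # trailing slash: copy the contents, not the directory
    dest: /home/mydata/web/
    delete: yes
    recursive: yes
```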
Here is an overall rehauled and fail-safe implementation built from all the comments and suggestions:
# collect stats about the dir
- name: check directory exists
  stat:
    path: '{{ directory_path }}'
  register: dir_to_delete

# delete directory if condition is true
- name: purge {{ directory_path }}
  file:
    state: absent
    path: '{{ directory_path }}'
  when: dir_to_delete.stat.exists and dir_to_delete.stat.isdir

# create directory if deleted (or if it didn't exist at all)
- name: create directory again
  file:
    state: directory
    path: '{{ directory_path }}'
  when: dir_to_delete is defined or dir_to_delete.stat.exists == False
The below code worked for me:
- name: Get directory listing
  become: yes
  find:
    paths: /applications/cache
    patterns: '*'
    hidden: yes
  register: directory_content_result

- name: Remove directory content
  become: yes
  file:
    path: "{{ item.path }}"
    state: absent
  with_items: "{{ directory_content_result.files }}"
There is an issue open with respect to this.
For now, this solution works for me: create an empty folder locally and synchronize it with the remote one.
Here is a sample playbook:
- name: "Empty directory"
  hosts: all
  tasks:
    - name: "Create an empty directory (locally)"
      local_action:
        module: file
        state: directory
        path: "/tmp/empty"

    - name: Empty remote directory
      synchronize:
        src: /tmp/empty/
        dest: /home/mydata/web/
        delete: yes
        recursive: yes
I want to make sure that the find command only deletes everything inside the directory and leaves the directory itself intact, because in my case the directory is a filesystem. The system will generate an error when trying to delete a filesystem, and that is not a nice option. I am using the shell option because it is the only working option I have found so far for this problem.
What I did:
Edit the hosts file to put in some variables:
[all:vars]
COGNOS_HOME=/tmp/cognos
find=/bin/find
And create a playbook:
- hosts: all
  tasks:
    - name: Ansible remove files
      shell: "{{ find }} {{ COGNOS_HOME }} -xdev -mindepth 1 -delete"
This will delete all files and directories in the COGNOS_HOME variable directory/filesystem. The "-mindepth 1" option makes sure that the current directory will not be touched.
I have written a custom Ansible module to clean up files based on multiple filters like age, timestamp, glob patterns, etc.
It is also compatible with older Ansible versions. It can be found here.
Here is an example:
- cleanup_files:
    path_pattern: /tmp/*.log
    state: absent
    excludes:
      - foo*
      - bar*
Just a slightly cleaner copy-and-paste template of ThorSummoner's answer, if you are using Ansible >= 2.3 (the distinction between files and dirs is not necessary anymore):
- name: Collect all fs items inside dir
  find:
    path: "{{ target_directory_path }}"
    hidden: true
    file_type: any
  changed_when: false
  register: collected_fsitems

- name: Remove all fs items inside dir
  file:
    path: "{{ item.path }}"
    state: absent
  with_items: "{{ collected_fsitems.files }}"
  when: collected_fsitems.matched|int != 0
Isn't it that simple? Tested and working, e.g.:
---
- hosts: localhost
  vars:
    cleandir: /var/lib/cloud/
  tasks:
    - shell: ls -a -I '.' -I '..' {{ cleandir }}
      register: ls2del
      ignore_errors: yes

    - name: Cleanup {{ cleandir }}
      file:
        path: "{{ cleandir }}{{ item }}"
        state: absent
      with_items: "{{ ls2del.stdout_lines }}"
- name: Files to delete search
  find:
    paths: /home/mydata/web/
    file_type: any
  register: files_to_delete

- name: Deleting files to delete
  file:
    path: '{{ item.path }}'
    state: absent
  with_items: "{{ files_to_delete.files }}"
I like the following solution:
- name: remove web dir contents
  command:
    cmd: "find . -path '*/*' -delete -print"
    chdir: "/home/mydata/web/"
  register: web_files_list
  changed_when: web_files_list.stdout | length > 0
because it is:
- simple
- idempotent
- fast
Assuming you are always on Linux, try the find command:
- name: Clean everything inside {{ item }}
  shell: test -d {{ item }} && find {{ item }} -path '{{ item }}/*' -prune -exec rm -rf {} \;
  with_items: [/home/mydata/web]
This should wipe out all files/folders/hidden files under /home/mydata/web.
- name: delete old data and clean cache
  file:
    path: "{{ item[0] }}"
    state: "{{ item[1] }}"
  with_nested:
    - [ "/data/server/{{ app_name }}/webapps/", "/data/server/{{ app_name }}/work/" ]
    - [ "absent", "directory" ]
  ignore_errors: yes
Below worked for me,
- name: Ansible delete html directory
  file:
    path: /var/www/html
    state: directory
