I am in the process of achieving the list of tasks below; could someone please rectify the playbook or suggest a way to get the requirement done?
The high-level purpose of the activity is:
Find the previous day's log files in multiple paths and archive them under a date-wise folder (a folder has to be created for each particular date) in a different path.
My approach is:
Create a date-wise directory, then search for the previous day's log files, copy them into the newly created directory, and then archive it.
I am having an issue defining the paths and variables in the copy section. Can someone help with this?
- name: Purge old spider logs
  become: true
  hosts: node1
  vars:
    date: "{{ lookup('pipe', 'date +%Y-%m-%d') }}"
  tasks:
    - name: create a directory
      file:
        path: /path/{{ date }}
        state: directory
        mode: '777'
      register: logdir

    - name: Find log files
      find:
        path: /test/logs
        age: 3600
        patterns:
          - "name.log.*"
        recurse: yes
      register: testlogs

    - debug:
        var: testlogs.path

    - debug: var=item.files
      with_items: '{{ testlogs.files }}'

    - name: Copy files in to backup location
      copy:
        src: "{{ item.files }}"
        dest: "{{ item.path }}"
      with_items:
        - '{{ item.files.testlog.files }}'
        - '{{ item.path.logdir.path }}'
If I understand your problem correctly, you want to copy all remote log files into a dated folder at another destination:
- name: Purge old spider logs
  become: true
  hosts: node1
  vars:
    date: "{{ lookup('pipe', 'date +%Y-%m-%d') }}"
  tasks:
    - name: create a remote directory
      file:
        path: /path/{{ date }}
        state: directory
        mode: '777'
      register: logdir

    - name: Find log files
      find:
        path: logs
        age: 3600
        patterns:
          - "name.log.*"
        recurse: yes
      register: testlogs

    - name: Copy (remote) files in to backup location (remote)
      copy:
        remote_src: yes
        src: "{{ item.path }}"
        dest: "{{ logdir.path }}/"
      with_items:
        - '{{ testlogs.files }}'
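The question also asks to archive the dated folder afterwards. As a sketch (assuming the community.general collection is installed; the .tar.gz name is illustrative), an archive task could be appended to the same play:

```yaml
# Sketch: compress the dated backup directory created above.
- name: Archive the dated backup directory
  community.general.archive:
    path: "{{ logdir.path }}"
    dest: "{{ logdir.path }}.tar.gz"
    format: gz
```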
Using Ansible 2.9.12
Question: How do I configure Ansible to ensure the contents of a file are equal across at least 3 hosts, when the file is present on at least one host?
Imagine there are 3 hosts.
Host 1 does not have /file.txt.
Host 2 has /file.txt with contents hello.
Host 3 has /file.txt with contents hello.
Before the play is run, I am unaware whether the file is present or not. So the file could exist on host1, host2, or host3, but it exists on at least one of the hosts.
How would I ensure that each time Ansible runs, the files across the hosts are equal, so that in the end Host 1 has the same file with the same contents as Host 2 and Host 3?
I'd like this to be determined dynamically, instead of specifying host names or group names, e.g. when: inventory_hostname == host1.
I am not expecting a check to see whether the contents of Host 2 and Host 3 are equal.
I do, however, want this to be set up in an idempotent fashion.
The play below does the job, I think
shell> cat pb.yml
- hosts: all
  tasks:
    - name: Get status.
      stat:
        path: /file.txt
      register: status

    - block:
        - name: Create dictionary status.
          set_fact:
            status: "{{ dict(keys|zip(values)) }}"
          vars:
            keys: "{{ ansible_play_hosts }}"
            values: "{{ ansible_play_hosts|
                         map('extract', hostvars, ['status','stat','exists'])|
                         list }}"

        - name: Fail. No file exists.
          fail:
            msg: No file exists
          when: status.values()|list is not any

        - name: Set reference to first host with file present.
          set_fact:
            reference: "{{ status|dict2items|
                           selectattr('value')|
                           map(attribute='key')|
                           first }}"

        - name: Fetch file.
          fetch:
            src: /file.txt
            dest: /tmp
          delegate_to: "{{ reference }}"
          run_once: true

        - name: Copy file if not exist
          copy:
            src: "/tmp/{{ reference }}/file.txt"
            dest: /file.txt
          when: not status[inventory_hostname]
But this doesn't check that the existing files are in sync. It would be safer to synchronize all hosts, I think:
- name: Synchronize file
  synchronize:
    src: "/tmp/{{ reference }}/file.txt"
    dest: /file.txt
  when: not status[inventory_hostname]
Q: "FATAL. could not find or access '/tmp/test-multi-01/file.txt' on the Ansible controller. However, folder /tmp/test-multi-03 is present with file.txt in it."
A: There is a problem with the fetch module when the task is delegated to another host. When TASK [Fetch file.] is delegated to test-multi-01, which is localhost in this case (changed: [test-multi-03 -> 127.0.0.1]), the file will be fetched from test-multi-01 but will be stored in /tmp/test-multi-03/file.txt. The conclusion is that the fetch module ignores delegate_to when it comes to creating host-specific directories (not reported yet).
As a workaround, it's possible to set flat: true and store the files in a specific directory. For example, add the variable sync_files_dir with the directory, set flat: true on fetch, and use that directory both to fetch and to copy the file:
- hosts: all
  vars:
    sync_files_dir: /tmp/sync_files
  tasks:
    - name: Get status.
      stat:
        path: /file.txt
      register: status

    - block:
        - name: Create dir for files to be fetched and synced
          file:
            state: directory
            path: "{{ sync_files_dir }}"
          delegate_to: localhost

        - name: Create dictionary status.
          set_fact:
            status: "{{ dict(keys|zip(values)) }}"
          vars:
            keys: "{{ ansible_play_hosts }}"
            values: "{{ ansible_play_hosts|
                         map('extract', hostvars, ['status','stat','exists'])|
                         list }}"

        - debug:
            var: status

        - name: Fail. No file exists.
          fail:
            msg: No file exists
          when: status.values()|list is not any

        - name: Set reference to first host with file present.
          set_fact:
            reference: "{{ status|dict2items|
                           selectattr('value')|
                           map(attribute='key')|
                           first }}"

        - name: Fetch file.
          fetch:
            src: /file.txt
            dest: "{{ sync_files_dir }}/"
            flat: true
          delegate_to: "{{ reference }}"
          run_once: true

        - name: Copy file if not exist
          copy:
            src: "{{ sync_files_dir }}/file.txt"
            dest: /file.txt
          when: not status[inventory_hostname]
We can achieve this by fetching the file from the hosts where it exists; the file(s) will then be available on the control machine. However, if the source file exists on more than one node, there will be no single source of truth.
Consider an inventory:
[my_hosts]
host1
host2
host3
Then the play below can fetch the file and use it to copy to all nodes:
# Fetch the file from remote host if it exists
- hosts: my_hosts
  tasks:
    - stat:
        path: /file.txt
      register: my_file

    - fetch:
        src: /file.txt
        dest: /tmp/
      when: my_file.stat.exists

    - find:
        paths:
          - /tmp
        patterns: file.txt
        recurse: yes
      register: local_file
      delegate_to: localhost

    - copy:
        src: "{{ local_file.files[0].path }}"
        dest: /tmp
If multiple hosts had this file, it would end up under /tmp/{{ ansible_host }}. Since we then have no single source of truth, our best estimate is to use the first file found and apply it to all hosts.
Well, I believe the get_url module is pretty versatile: it allows local file paths or paths from a web server. Try it and let me know.
- name: Download files in all host
  hosts: all
  tasks:
    - name: Download file from a file path
      get_url:
        url: file:///tmp/file.txt
        dest: /tmp/
Edited answer:
(From documentation: For the synchronize module, the “local host” is the host the synchronize task originates on, and the “destination host” is the host synchronize is connecting to)
- name: Check that the file exists
  stat:
    path: /etc/file.txt
  register: stat_result

- name: Copy the file to other hosts by delegating the task to the source host
  synchronize:
    src: path/host
    dest: path/host
  delegate_to: my_source_host
  when: stat_result.stat.exists
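As a concrete sketch of the above (the host name my_source_host and the file path are assumptions, not from the original answer):

```yaml
# Sketch: push /etc/file.txt from my_source_host to every other host
# in the play; synchronize runs rsync from the delegated source host.
- name: Push the file from the source host to the others
  synchronize:
    src: /etc/file.txt
    dest: /etc/file.txt
  delegate_to: my_source_host
  when: inventory_hostname != 'my_source_host'
```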
I am using the unarchive module in Ansible to extract tar.gz files to a folder. After that, I want to copy the extracted folder to another location. Is there an Ansible way of doing it? I tried the below; it does not work:
- name: Extract Files
  unarchive:
    src: "/home/app/{{ item }}"
    dest: "/home/app/"
    mode: 0777
    copy: no
  with_items:
    - "{{ sharef }}"
  delegate_to: localhost
  register: extractedfiles

- name: Copy Files
  copy:
    src: "{{ extractedfiles.results }}"
    dest: "{{ location }}"
  delegate_to: localhost
How do I extract the file names from the previously executed task? extractedfiles.results does not work. Any help would be appreciated.
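One possible approach (a sketch, not verified against your setup): the unarchive module can return the list of files in each archive when list_files is enabled, and the registered results can then be flattened with Jinja2 filters:

```yaml
# Sketch: list_files makes each loop result expose a "files" list
# (paths relative to dest); flatten them all into one copy loop.
# "sharef" and "location" are the variables from the question.
- name: Extract files
  unarchive:
    src: "/home/app/{{ item }}"
    dest: /home/app/
    list_files: true
  with_items: "{{ sharef }}"
  register: extractedfiles

- name: Copy extracted files
  copy:
    src: "/home/app/{{ item }}"
    dest: "{{ location }}"
  with_items: "{{ extractedfiles.results | map(attribute='files') | flatten }}"
```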
What's the right way to copy a group of files from a Windows system to the Ansible controller?
I can find the files, but I don't know how to reference the registered variable data to get the path to hand to fetch:
- win_find: paths="C:\\ADirectory" recurse=no patterns="*.log"
  register: files_to_copy

- name: copy files
  fetch: src="{{ item }}" dest=output
  with_items: files_to_copy.files.path
You need to iterate over a list, and it's files that is the list in the output of win_find, not path.
This should work for you:
- name: copy files
  fetch: src="{{ item.path }}" dest=output
  with_items: "{{ files_to_copy.files }}"
This seems to work
- name: copy files
  fetch:
    src: "{{ item.path }}"
    dest: output/
    flat: yes
  with_items: "{{ files_to_copy.files }}"
- name: Copy files
  win_copy:
    remote_src: yes
    src: "{{ item.path }}"
    dest: \\Xxx\\XXX
  with_items: "{{ files_matched.files }}"
  become: yes
  become_method: runas
I used this to copy matched files across a file share.
How can I copy more than a single file to remote nodes with Ansible in one task?
I've tried duplicating the copy module line in my task to define the files, but it only copies the first file.
You can use the with_fileglob loop for this:
- copy:
    src: "{{ item }}"
    dest: /etc/fooapp/
    owner: root
    mode: 600
  with_fileglob:
    - "/playbooks/files/fooapp/*"
- name: copy multiple items
  copy:
    src: "{{ item.src }}"
    dest: "{{ item.dest }}"
  loop:
    - src: containerizers
      dest: /etc/mesos/containerizers
    - src: another_file
      dest: /etc/somewhere
    - src: dynamic
      dest: "{{ var_path }}"
Since Ansible 2.5, the with_* constructs are no longer recommended; loop syntax should be used instead. A simple practical example:
- name: Copy CA files
  copy:
    src: '{{ item }}'
    dest: '/etc/pki/ca-trust/source/anchors'
    owner: root
    group: root
    mode: 0644
  loop:
    - symantec-private.crt
    - verisignclass3g2.crt
You can use with_together for this purpose:
- name: Copy multiple files to multiple directories
  copy: src={{ item.0 }} dest={{ item.1 }}
  with_together:
    - [ 'file1', 'file2', 'file3' ]
    - [ '/dir1/', '/dir2/', '/dir3/' ]
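The same pairing can be written with the newer loop syntax and the zip filter (a sketch, equivalent to the with_together example above):

```yaml
# Sketch: zip pairs each file with its destination directory;
# item.0 is the file, item.1 the directory.
- name: Copy multiple files to multiple directories
  copy:
    src: "{{ item.0 }}"
    dest: "{{ item.1 }}"
  loop: "{{ ['file1', 'file2', 'file3'] | zip(['/dir1/', '/dir2/', '/dir3/']) | list }}"
```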
If you need more than one destination, you need more than one task: one copy task can copy only from one location (including multiple files) to a single destination on the node.
- copy: src=/file1 dest=/destination/file1
- copy: src=/file2 dest=/destination/file2

# copy each file over that matches the given pattern
- copy: src={{ item }} dest=/destination/
  with_fileglob:
    - /files/*
You can use find, and then copy those files:
---
- hosts: lnx
  tasks:
    - find:
        paths: /appl/scripts/inq
        recurse: true
        patterns: "inq.Linux*"
      register: file_to_copy

    - copy:
        src: "{{ item.path }}"
        dest: /usr/local/sbin/
        owner: root
        mode: 0775
      loop: "{{ file_to_copy.files }}"
Or you can use with_items:
- copy:
    src: "{{ item }}"
    dest: /etc/fooapp/
    owner: root
    mode: 600
  with_items:
    - dest_dir
- name: find inq.Linux*
  find: paths="/appl/scripts/inq" recurse=yes patterns="inq.Linux*"
  register: find_files

- name: set fact
  set_fact:
    all_files: "{{ find_files.files | map(attribute='path') | list }}"
  when: find_files.matched > 0

- name: copy files
  copy:
    src: "{{ item }}"
    dest: /destination/
  with_items: "{{ all_files }}"
  when: find_files.matched > 0
The copy module is the wrong tool for copying many files and/or a directory structure; use the synchronize module instead, which uses rsync as its backend. Mind you, it requires rsync installed on both the controller and the target host. It's really powerful; check the Ansible documentation.
Example: copy files from the controller's build directory (with subdirectories) to the /var/www/html directory on the target host:
- synchronize:
    src: ./my-static-web-page/build/
    dest: /var/www/html
    rsync_opts:
      - "--chmod=D2755,F644" # copy from windows - force permissions
You can loop through a variable containing a list of directories:
- name: Copy files from several directories
  copy:
    src: "{{ item }}"
    dest: "/etc/fooapp/"
    owner: root
    mode: "0600"
  loop: "{{ files }}"
  vars:
    files:
      - "dir1/"
      - "dir2/"
Use the following source code to copy multiple files to your client machine.
- name: Copy data to the client machine
  hosts: hostname
  become_method: sudo
  become_user: root
  become: true
  tasks:
    # Copy twice as sometimes files get skipped (mostly only one file
    # skipped from a folder if the folder does not exist)
    - name: Copy UFO-Server
      copy:
        src: "source files path"
        dest: "destination file path"
        owner: root
        group: root
        mode: 0644
        backup: yes
      ignore_errors: true
Note:
If you are passing multiple paths using a variable, then:
src: "/root/{{ item }}"
If you are passing the path using a variable for different items, then:
src: "/root/{{ item.source_path }}"
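For instance (a sketch; the source_path key and file names are illustrative, not from the original answer):

```yaml
# Sketch: each loop item carries its own source path fragment.
- name: Copy files listed with per-item source paths
  copy:
    src: "/root/{{ item.source_path }}"
    dest: /etc/fooapp/
  loop:
    - source_path: a.conf
    - source_path: b.conf
```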
Copy files from multiple directories to multiple directories with Ansible
I found guenhter's answer helpful but needed to also change the remote files' mode. I don't have enough reputation to post this as a comment, which would be a more appropriate place for it. In the example, I copy two files from two directories into /tmp and /tmp/bin, which I create first, and modify the remote files' mode.
- name: cpldupd
  hosts: test
  remote_user: root
  become: true
  vars:
    rpth: /tmp
  tasks:
    - name: Create '{{ rpth }}/bin'
      file:
        path: '{{ rpth }}/bin'
        state: directory

    - name: Transfer
      copy: src={{ item.src }} dest={{ item.dest }} mode=0775
      with_items:
        - { src: '../utils/cpldupd', dest: '{{ rpth }}/cpldupd' }
        - { src: '../utils/bin/cpldupd', dest: '{{ rpth }}/bin/cpldupd' }
Here is a generic solution for copying files:
...
- name: Find files you want to move
  ansible.builtin.find:
    paths: /path/to/files/
    file_type: file
    excludes: "*.txt" # Whatever pattern you want to exclude
  register: files_output

- name: Copy the files
  ansible.builtin.copy:
    src: "{{ item.path }}"
    dest: /destination/directory/
  loop: "{{ files_output.files }}"
...
This is more powerful than using with_fileglob, as you can match using regexes. Here is this play in action:
$ ls /path/to/files
demo.yaml test.sh ignore.txt
$ ls /destination/directory
file.h
$ ansible-playbook playbook.yaml
...[some output]...
$ ls /destination/directory
file.h demo.yaml test.sh
As you can see from the example above, ignore.txt was not copied to the destination directory because of the excludes pattern in the playbook. Ignoring files like this is not possible when simply using with_fileglob.
Additionally, you can move files from multiple directories with relative ease:
...
- name: Find files you want to move
  ansible.builtin.find:
    paths: /path/to/files/
    # ... the rest of the task
  register: list1

- name: Find more files you want to move
  ansible.builtin.find:
    paths: /different/path/
    # ... the rest of the task
  register: list2

- name: Copy the files
  ansible.builtin.copy:
    src: "{{ item.path }}"
    dest: /destination/directory/
  loop: "{{ list1.files + list2.files }}"
...
Here is a sample Ansible script to copy multiple files on remote hosts:
- name: Copy Multiple Files on remote Hosts
  ansible.windows.win_copy:
    src: "{{ srcPath }}/{{ item }}" # Remember to use {{ item }} as a postfix to the source path
    dest: "{{ destPath }}"
    remote_src: yes # if the source path is available on the remote host
  with_items:
    - abc.txt
    - abc.properties
- hosts: test
  gather_facts: false
  become: true
  vars:
    path: '/home/ansibm/playbooks'
    remote_path: '/home/{{ ansible_ssh_user }}'
    dir: 'yml_files'
  tasks:
    - name: "creating directory for backup file"
      file:
        path: '{{ remote_path }}/{{ dir }}'
        state: directory
        owner: '{{ ansible_ssh_user }}'
        group: '{{ ansible_ssh_user }}'
        mode: 0700

    - name: "copying yml files"
      copy:
        src: '{{ item }}'
        dest: '{{ remote_path }}/{{ dir }}'
        owner: '{{ ansible_ssh_user }}'
        group: '{{ ansible_ssh_user }}'
        mode: 0644
      loop:
        - '{{ path }}/ab.html'
        - '{{ path }}/cp.yml'