I have three hosts available in my inventory file
[controller]
1.1.1.1
2.2.2.2
3.3.3.3
I have a variable in the group_vars folder which specifies the master node:
master=1.1.1.1
sql.conf is available in my home directory (/home/ubuntu/sql.conf) on all 3 controller hosts.
Now, I need to copy a file (test.txt) from the master to the others. Is there any way in Ansible to copy files from one specific server to the others?
I am trying it like this, but couldn't get it to work:
- hosts: all
  sudo: yes
  tasks:
    - name: copy files
      local_action: command rsync -a /home/ubuntu/test.txt {{ master }}:///home/ubuntu/test.txt
One option is to use the fetch module to copy the file from the master node to your local node, and then use the copy module normally to distribute that file to other nodes. Something like:
- hosts: master
  tasks:
    - fetch:
        src: /path/to/myfile.txt
        dest: tmp/

- hosts: all:!master
  tasks:
    - copy:
        src: tmp/master/myfile.txt
        dest: /path/to/myfile.txt
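If copying through the control node is not desirable (for example, for large files), another option is to push directly from the master to the other hosts with the synchronize module and delegate_to. This is only a sketch, assuming the master variable from group_vars resolves to the 1.1.1.1 host in the controller group and that the master can reach the other hosts over SSH:

- hosts: controller
  tasks:
    - name: Push test.txt straight from the master node to every other host
      synchronize:
        src: /home/ubuntu/test.txt
        dest: /home/ubuntu/test.txt
      delegate_to: "{{ master }}"
      when: inventory_hostname != master

With delegate_to, synchronize treats the delegated host (the master) as the source side of the rsync, so the file is copied remote-to-remote without passing through the machine running the playbook.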
I wrote a simple playbook to copy some configuration files to a certain machine.
I need to copy this file to a different host too, for backup. Is it possible to declare a different host in the same playbook?
I need this because my "backup host" can be different, and I derive it from the hostname I use.
I tried both the copy and raw modules, and nothing seems to work.
Here is an example of the playbook:
- name: find file
  find:
    file_type: directory
    paths: /prd/viv/dat/repository/
    patterns: "{{ inventory_hostname }}"
    recurse: yes
  register: find
  delegate_to: localhost

- name: Copy MASTER
  raw: echo xmonit$(echo {{ find.files[0].path }} | cut -d "/" -f7)
  delegate_to: localhost
  register: xmonit

- debug:
    msg: "{{ xmonit.stdout }}"

- name: Copy MASTER raw
  raw: sshpass -p "mypass" scp {{ find.files[0].path }}/master.cfg myuser@{{ xmonit.stdout }}:/prd
  delegate_to: localhost

# - name: Copy MASTER
#   copy:
#     src: "{{ find.files[0].path }}/master.cfg"
#     dest: /prd/cnf/dat/{{ inventory_hostname }}/
Edit: if I use the copy module, the destination remains that of the main host, while the goal is to copy to a third host.
I need to declare a different host for this single task:
- name: Copy MASTER
  copy:
    src: "{{ find.files[0].path }}/master.cfg"
    dest: /prd/cnf/dat/{{ inventory_hostname }}/
As Zeitounator told me in the comments, the copy module is the best way to do this.
Like this, it works for me:
- name: Copy MASTER
  copy:
    src: "{{ find.files[0].path }}/master.cfg"
    dest: /prd/cnf/dat/{{ inventory_hostname }}/
  delegate_to: "{{ xmonit.stdout_lines[0] }}"
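One caveat worth noting (not part of the original answer): delegate_to needs connection details for the host it delegates to. If the derived backup host is not already in your inventory, a hedged sketch of registering it first with add_host could look like this; the ansible_user value is purely illustrative:

- name: Register the derived backup host so delegate_to can connect to it
  add_host:
    name: "{{ xmonit.stdout_lines[0] }}"
    ansible_user: myuser   # hypothetical credentials, adjust to your environment
  changed_when: false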
I am trying to copy a script to a remote machine and I have to preserve the mtime of the file.
I have tried using mode: preserve and mode=preserve as stated in the Ansible documentation,
but it seems to preserve permissions and ownership only.
- name: Transfer executable script
  copy:
    src: /home/avinash/Downloads/freeradius.sh
    dest: /home/ssrnd09/ansible
    mode: preserve
With respect to your question and the comment of Zeitounator, I've set up a test with the synchronize module and the times parameter, which worked as required.
---
- hosts: test.example.com
  become: no
  gather_facts: no
  tasks:

    - name: Synchronize passing in extra rsync options
      synchronize:
        src: test.sh
        dest: "/home/{{ ansible_user }}/test.sh"
        times: yes
Thanks to
How to tell rsync to preserve time stamp on files
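If rsync/synchronize is not an option, a possible workaround (just a sketch, not from the original answer) is to copy the file and then re-apply the source mtime with the file module's modification_time parameter (Ansible 2.7+), reading the original timestamp with stat on the controller:

- name: Copy the script (copy alone does not preserve mtime)
  copy:
    src: /home/avinash/Downloads/freeradius.sh
    dest: /home/ssrnd09/ansible/freeradius.sh
    mode: preserve

- name: Read the source file's mtime on the controller
  stat:
    path: /home/avinash/Downloads/freeradius.sh
  delegate_to: localhost
  register: src_stat

- name: Re-apply the original mtime on the remote copy
  file:
    path: /home/ssrnd09/ansible/freeradius.sh
    # the default modification_time_format is %Y%m%d%H%M.%S
    modification_time: "{{ '%Y%m%d%H%M.%S' | strftime(src_stat.stat.mtime) }}"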
I need to analyze nginx log files from multiple hosts.
First, I want to copy them to a directory on a single host.
For example, I want to copy the nginx error log file from 6 hosts to a directory on a destination host.
The 6 hosts' IPs are 192.168.0.2 - 192.168.0.7. The nginx error log path is /var/log/nginx/nginx_error.log. I want to copy them to the /var/log/nginx_error directory on destination host 192.168.0.10, with every file named after its source host IP. How can I write a playbook for this in Ansible?
[serverB]
192.168.0.10
[serverA]
192.168.0.2
192.168.0.3
192.168.0.4
192.168.0.5
- hosts: serverB
  tasks:
    - name: Copy Remote-To-Remote (from serverA to serverB)
      synchronize: src=/var/log/nginx/nginx_error.log dest=/var/log/nginx_error/
      delegate_to: serverA
The problem is that I don't know how to name the destination file using the source IP address.
I don't have an elegant solution, only a workaround. To name the destination file using the source IP, you can use the ansible_hostname or inventory_hostname variables defined by Ansible.
Make sure that you've created the /tmp/nginx_logs/ directory on the Ansible controller before executing.
---
- name: Copy Remote
  hosts: serverA, serverB
  tasks:

    - name: Copy the files from the source machines (serverA) to the Ansible controller
      synchronize:
        mode: pull
        src: /var/log/nginx/nginx_error.log
        dest: "/tmp/nginx_logs/test_{{ ansible_hostname }}.txt"
      when: "inventory_hostname in groups['serverA']"

    - name: Copy log files to the destination server
      copy:
        src: /tmp/nginx_logs/
        dest: /var/log/nginx_error/
      when: "inventory_hostname in groups['serverB']"
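Since the serverA hosts are listed by IP address in the inventory above, a small variation of the pull task (just a sketch) names each pulled copy after the source IP by using inventory_hostname in dest, which matches the original requirement:

- name: Pull each nginx error log and name it after the source host IP
  synchronize:
    mode: pull
    src: /var/log/nginx/nginx_error.log
    dest: "/tmp/nginx_logs/{{ inventory_hostname }}.log"
  when: "inventory_hostname in groups['serverA']"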
I want to overwrite a file at a remote location using Ansible. No matter whether the content of the zip file has changed or not, every time I run the playbook the file needs to be overwritten on the destination server.
Below is my playbook:
- hosts: localhost
  tasks:
    - name: Check if the file exists to copy to the update servers
      stat:
        path: "/var/lib/abc.zip"
        get_checksum: False
        get_md5: False
      register: win_stat_result

    - debug:
        var: win_stat_result.stat.exists

- hosts: uploads
  tasks:
    - name: Getting VARs
      debug:
        var: hostvars['localhost']['win_stat_result']['stat']['exists']

    - name: Copy files to destination servers
      win_copy:
        src: "/var/lib/abc.zip"
        dest: E:\xyz\data\charts.zip
        force: yes
      when: hostvars['localhost']['win_stat_result']['stat']['exists']
When I run this playbook, it doesn't overwrite the file on the destination because the file already exists. I used force=yes but it didn't work.
Try the Ansible copy module.
The copy module defaults to overwriting an existing file at the path given in the dest parameter (i.e. force defaults to yes). The source file can either come from the remote server you're connected to or from the local machine your playbook runs from. Here's a code snippet:
- name: Overwrite file if it exists; the remote server has the source file because remote_src is set below
  copy:
    src: "/var/lib/abc.zip"
    dest: E:\xyz\data\charts.zip
    remote_src: yes
You can remove the file before copying the new one:
- name: Delete file before copy
  win_file:
    path: E:\xyz\data\charts.zip
    state: absent
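Put together with the original win_copy task, the delete-then-copy sequence could look roughly like this sketch (keeping the paths from the question):

- name: Delete file before copy
  win_file:
    path: E:\xyz\data\charts.zip
    state: absent

- name: Copy the new zip to the destination server
  win_copy:
    src: "/var/lib/abc.zip"
    dest: E:\xyz\data\charts.zip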
I'm running an Ansible playbook for host_a. Some tasks I delegate to host_b.
Now I would like to use the synchronize module to copy a directory from localhost to host_b. But delegate_to is the wrong option here, since this results in copying from host_b to host_a.
Is there a possibility to do that?
- hosts: host_a
  tasks:
    - name: rsync directory from localhost to host_b
      synchronize:
        # files on localhost
        src: files/directory
        dest: /directory/on/host_b
      # delegate_to does not work here
      # delegate_to: host_b
The only solution I can think of is deleting the target directory and then using a recursive copy with the copy module.
I couldn't find anything in the module documentation.
(Using ansible 2.4.2.0)
Doing this task in its own play for host_b is also not really an option because the variables I need for this task depend on host_a.
The easiest solution in this case is to use the rsync command with local_action, i.e.:
- hosts: cache1
  tasks:
    - name: rsync directory from localhost to host_b
      local_action: command rsync -az "{{ playbook_dir }}/files/directory" "{{ hostvars['host_b']['ansible_host'] }}:/directory/on/host_b"
{{ playbook_dir }} helps avoid hardcoding paths on the local system.
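For completeness, the delete-then-copy fallback mentioned in the question could be sketched like this, assuming host_b is defined in the inventory so it can be used as a delegation target:

- name: Remove the stale target directory on host_b
  file:
    path: /directory/on/host_b
    state: absent
  delegate_to: host_b

- name: Recursively copy the directory from the controller to host_b
  copy:
    src: files/directory/    # trailing slash copies the contents of the directory
    dest: /directory/on/host_b/
  delegate_to: host_b

The copy module reads src from the machine running the playbook, so delegating these tasks to host_b pushes the local files there; for large trees this is much slower than rsync.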