I wrote a simple playbook to copy some configuration files to a certain machine.
I need to copy this file to a different host too, for backup. Is it possible to declare a different host in the same playbook?
I need this because my "backup host" can differ, and I derive it from the hostname I use.
I tried both the copy and raw modules and nothing seems to work.
Here is an example playbook:
- name: find file
  find:
    file_type: directory
    paths: /prd/viv/dat/repository/
    patterns: "{{ inventory_hostname }}"
    recurse: yes
  register: find
  delegate_to: localhost

- name: Copy MASTER
  raw: echo xmonit$(echo {{ find.files[0].path }} | cut -d "/" -f7 )
  delegate_to: localhost
  register: xmonit

- debug:
    msg: "{{ xmonit.stdout }}"

- name: Copy MASTER raw
  raw: sshpass -p "mypass" scp {{ find.files[0].path }}/master.cfg myuser@{{ xmonit.stdout }}:/prd
  delegate_to: localhost

#- name: Copy MASTER
#  copy:
#    src: "{{ find.files[0].path }}/master.cfg"
#    dest: /prd/cnf/dat/{{ inventory_hostname }}/
Edit: if I use the copy module, the destination remains that of the main host, while the goal is to copy to a third host. I need to declare a different host for this single task:
- name: Copy MASTER
  copy:
    src: "{{ find.files[0].path }}/master.cfg"
    dest: /prd/cnf/dat/{{ inventory_hostname }}/
As Zeitounator told me in the comments, the copy module is the best way to do this.
Like this it works for me:
- name: Copy MASTER
  copy:
    src: "{{ find.files[0].path }}/master.cfg"
    dest: /prd/cnf/dat/{{ inventory_hostname }}/
  delegate_to: "{{ xmonit.stdout_lines[0] }}"
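If the backup host is not already in the inventory, it can help to register it first with add_host so Ansible has an entry to delegate to. A minimal sketch, assuming the name in xmonit.stdout_lines[0] resolves to a reachable machine with SSH access already configured:

# Assumes xmonit.stdout_lines[0] holds a resolvable hostname
- name: Register the dynamic backup host in the in-memory inventory
  add_host:
    name: "{{ xmonit.stdout_lines[0] }}"

- name: Copy MASTER to the backup host
  copy:
    src: "{{ find.files[0].path }}/master.cfg"
    dest: /prd/cnf/dat/{{ inventory_hostname }}/
  delegate_to: "{{ xmonit.stdout_lines[0] }}"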
I want to read a file with Ansible, find a specific thing in it, and store all the matches in a file on my localhost.
For example, there is a /tmp/test file on every host, and I want to grep for a specific thing in this file and store all the matches in my home directory.
What should I do?
There might be many ways to accomplish this. The choice of Ansible modules (or even tools) can vary.
One approach, using only Ansible, is:
1. Slurp the remote file
2. Write a new file with the filtered content
3. Fetch the file to the control machine
Example:
- hosts: remote_host
  tasks:
    # Slurp the file
    - name: Get contents of file
      slurp:
        src: /tmp/test
      register: testfile

    # Filter the contents to a new file
    - name: Save contents to a variable for looping
      set_fact:
        testfile_contents: "{{ testfile.content | b64decode }}"

    - name: Write a filtered file
      lineinfile:
        path: /tmp/filtered_test
        line: "{{ item }}"
        create: yes
      when: "'TEXT_YOU_WANT' in item"
      with_items: "{{ testfile_contents.split('\n') }}"

    # Fetch the file
    - name: Fetch the filtered file
      fetch:
        src: /tmp/filtered_test
        dest: /tmp/
This will fetch the file to /tmp/<ANSIBLE_HOSTNAME>/tmp/filtered_test.
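If you would rather not get the per-host directory tree, the fetch module also supports flat: yes. A small sketch, embedding the hostname in the destination filename so runs against multiple hosts do not collide:

- name: Fetch the filtered file to a flat path
  fetch:
    src: /tmp/filtered_test
    # flat: yes skips the <hostname>/<full remote path> layout under dest
    dest: "/tmp/filtered_test_{{ inventory_hostname }}"
    flat: yes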
You can use the Ansible fetch module to download files from the remote system to your local system. You can then do the processing locally, as shown in this Ansible CLI example:
REMOTE=[YOUR_REMOTE_SERVER]; \
ansible -m fetch -a "src=/tmp/test dest=/tmp/ansiblefetch/" $REMOTE && \
grep "[WHAT_YOU_ARE_INTERESTED_IN]" /tmp/ansiblefetch/$REMOTE/tmp/test > /home/user/ansible_files/$REMOTE
This snippet runs the ad-hoc version of Ansible, calling the fetch module with the source folder (on the remote) and the destination folder (local) as arguments. Fetch copies the file into [DEST]/[REMOTE_NAME]/[SRC], from which we then grep what we are interested in and write the output to /home/user/ansible_files/$REMOTE.
I want to overwrite a file at a remote location using Ansible. No matter whether the content of the zip file changes or not, every time I run the playbook the file needs to be overwritten on the destination server.
Below is my playbook:
- hosts: localhost
  tasks:
    - name: Check if file exists to copy to update servers
      stat:
        path: /var/lib/abc.zip
        get_checksum: False
        get_md5: False
      register: win_stat_result

    - debug:
        var: win_stat_result.stat.exists

- hosts: uploads
  tasks:
    - name: Getting VARs
      debug:
        var: hostvars['localhost']['win_stat_result']['stat']['exists']

    - name: Copy files to destination servers
      win_copy:
        src: /var/lib/abc.zip
        dest: E:\xyz\data\charts.zip
        force: yes
      when: hostvars['localhost']['win_stat_result']['stat']['exists']
When I run this playbook it doesn't overwrite the file on the destination because the file already exists. I used force=yes but it didn't work.
Try the Ansible copy module.
The copy module defaults to overwriting an existing file that is set to the dest parameter (i.e. force defaults to yes). The source file can either come from the remote server you're connected to or the local machine your playbook runs from. Here's a code snippet:
- name: Overwrite file if it exists; the remote server has the source file because remote_src is set below
  copy:
    src: "/var/lib/abc.zip"
    dest: E:\xyz\data\charts.zip
    remote_src: yes
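Since the destination path in the question (E:\xyz\data\charts.zip) is on a Windows host, note that win_copy is the Windows counterpart, and it also overwrites by default. A sketch mirroring the question's own task, assuming the zip sits on the control machine:

- name: Overwrite the zip on the Windows destination
  win_copy:
    # without remote_src, src is read from the control machine
    src: /var/lib/abc.zip
    dest: E:\xyz\data\charts.zip
    force: yes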
You can remove the file before copying the new one:
- name: Delete file before copy
  win_file:
    path: E:\xyz\data\charts.zip
    state: absent
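Keep in mind that deleting first makes this pair of tasks report changed, and re-transfer the file, on every run, even when the content is identical.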
How would you express this small script as an Ansible playbook?
The files to copy are named [ServerName].[extension], and the destination server is ServerName.
for file in $(ls /var/tmp)
do
  ServerName=$(echo "$file" | awk -F. -v OFS=. 'NF{NF--};1')
  scp /var/tmp/$file $ServerName:/var/tmp/
  scp /var/tmp/pkg.rpm $ServerName:/var/tmp/
  ssh $ServerName "cd /var/tmp; yum -y localinstall pkg.rpm"
done
Thanks for your help.
The idea would be to have something like this (but working, of course):
- name: main loop
  copy:
    src: "{{ item }}"
    dest: "/var/tmp/myfile.json"
- name: Install package
  yum:
    name: "packageToInstall"
    state: present
  delegate_to: "{{ item.split('/')[-1][:-5] }}"
  with_fileglob:
    - "/var/temp/*json"
If you were to write this in YAML, you should use ansible_hostname; Ansible already has a lot of information about the host via setup.
- name: copying files
  copy:
    src: /mine/file.ext
    dest: /etc/{{ ansible_hostname }}.ext
More about copy in the module description. All facts gathered during setup are available here.
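For the loop in the original script, the target host can be derived from each filename with the basename and splitext filters. A sketch, assuming the files on the control machine are named <ServerName>.<extension> and each ServerName is a host Ansible can reach:

- name: Copy each file to the server named in its filename
  copy:
    src: "{{ item }}"
    dest: /var/tmp/
  # basename drops the directory; splitext | first drops the extension
  delegate_to: "{{ item | basename | splitext | first }}"
  with_fileglob:
    - /var/tmp/*.json

The pkg.rpm copy and the yum -y localinstall step could be delegated the same way; pointing the yum module's name at /var/tmp/pkg.rpm with state: present behaves like a localinstall.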
I'm running an Ansible playbook for host_a. Some tasks I delegate to host_b.
Now I would like to use the synchronize module to copy a directory from localhost to host_b. But delegate_to is the wrong option here, since this results in copying from host_b to host_a.
Is there a possibility to do that?
- hosts: host_a
  tasks:
    - name: rsync directory from localhost to host_b
      synchronize:
        # files on localhost
        src: files/directory
        dest: /directory/on/host_b
      # delegate_to does not work here
      # delegate_to: host_b
The only solution I can think of is deleting the target directory and then using a recursive copy with the copy module.
I couldn't find anything in the module documentation.
(Using Ansible 2.4.2.0)
Doing this task in its own play for host_b is also not really an option because the variables I need for this task depend on host_a.
The easiest solution in this case is to use the rsync command with local_action, i.e.:
- hosts: cache1
  tasks:
    - name: rsync directory from localhost to host_b
      local_action: command rsync -az "{{ playbook_dir }}/files/directory" "{{ hostvars['host_b']['ansible_host'] }}:/directory/on/host_b"
{{ playbook_dir }} helps by not hardcoding paths on the local system.
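Note that this assumes ansible_host is defined for host_b in the inventory, and that the control machine can rsync to it over SSH without an interactive password prompt; otherwise the command will hang waiting for input.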
I have three hosts available in my inventory file:
[controller]
1.1.1.1
2.2.2.2
3.3.3.3
I have a variable in the group_vars folder which specifies the master node:
master=1.1.1.1
sql.conf is available in my home directory (/home/ubuntu/sql.conf) on all 3 controller hosts.
Now I need to copy a file (test.txt) from the master to the others. Is there any way in Ansible to copy files from one specific server to the others?
I am trying it like this, but couldn't get it to work:
- hosts: all
  sudo: yes
  tasks:
    - name: copy files
      local_action: command rsync -a /home/ubuntu/test.txt {{ master }}:///home/ubuntu/test.txt
One option is to use the fetch module to copy the file from the master node to your local node, and then use the copy module normally to distribute that file to other nodes. Something like:
- hosts: master
  tasks:
    - fetch:
        src: /path/to/myfile.txt
        dest: tmp/

- hosts: all:!master
  tasks:
    - copy:
        src: tmp/master/path/to/myfile.txt
        dest: /path/to/myfile.txt
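Note that fetch mirrors the full remote path under dest/<inventory_hostname>/, which is why the copy above reads from tmp/master/path/to/myfile.txt. If you prefer a flat layout, fetch's flat: yes option drops the extra directories; a sketch of the same two plays:

- hosts: master
  tasks:
    - fetch:
        src: /path/to/myfile.txt
        # with flat: yes and a trailing slash, the file lands at tmp/myfile.txt
        dest: tmp/
        flat: yes

- hosts: all:!master
  tasks:
    - copy:
        src: tmp/myfile.txt
        dest: /path/to/myfile.txt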