Bulk Copy Using Wildcard in Ansible

I'm trying to copy a set of files in bulk from my source server to the destination server using Ansible, but I'm getting an error. Please help me.
---
- name: Going to copy bulk files
  hosts: test
  vars_prompt:
    - name: copy
      prompt: Enter the Bulk File to Copy
      private: no
  tasks:
    - name: Copy bulk files
      shell: cp /tmp/guru/{{ copy }}* /ansible/sri

The shell module executes a shell command on the destination server, which explains the error message cp: cannot stat '/tmp/guru/a*': No such file or directory: the source files of the cp command do not exist on the destination server.
Ansible provides many modules that are more appropriate to use than executing raw shell commands.
In your case, the copy module is the one you need: it copies files from the source server to the destination server. You can combine it with a with_fileglob loop:
tasks:
  - name: Copy bulk files
    copy:
      src: "{{ item }}"
      dest: /ansible/sri
    with_fileglob: "/tmp/guru/{{ copy }}*"
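One thing worth knowing: with_fileglob expands the pattern on the control machine, not on the target host. A minimal sketch to preview which files the glob will match before actually copying (using the same /tmp/guru path from the question):

```yaml
tasks:
  # The glob is evaluated on the control machine, so this lists the
  # files that the copy task above would pick up as its sources.
  - name: Show which files would be copied
    debug:
      msg: "Would copy {{ item }}"
    with_fileglob: "/tmp/guru/{{ copy }}*"
```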

Related

Ansible can't copy files on remote server, but the command runs correctly if run from command line

I'm trying to move everything under /opt/* to a new location on the remote server. I've tried this using command to run rsync directly, as well as using both the copy and the synchronize Ansible modules. In all cases I get the same error message saying:
"msg": "rsync: link_stat \"/opt/*\" failed: No such file or directory
If I run the command listed in the "cmd" part of the ansible error message directly on my remote server it works without error. I'm not sure why ansible is failing.
Here is the current attempt using synchronize:
- name: move /opt to new partition
  become: true
  synchronize:
    src: /opt/*
    dest: /mnt/opt/*
  delegate_to: "{{ inventory_hostname }}"
You should skip the wildcards; that is a common mistake:
UPDATE
Thanks to the user @Zeitounator, I managed to do it with synchronize.
The advantage of using synchronize instead of copy module is performance, it's much faster if you have a lot of files to copy.
- name: move /opt to new partition
  become: true
  synchronize:
    src: /opt/
    dest: /mnt/opt
  delegate_to: "{{ inventory_hostname }}"
So basically the initial answer was right, but you needed to delete the wildcards "*" and the trailing slash on the dest path.
Also, since this is a move, you should add the deletion of the files left in /opt/ afterwards.
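The cleanup step could be sketched like this (my own assumption about the intent, not part of the original answer; verify the synchronize task succeeded first, and note that this removes and recreates /opt itself):

```yaml
- name: Remove the old /opt contents after the sync
  become: true
  file:
    path: /opt
    state: absent

- name: Recreate the empty /opt directory
  become: true
  file:
    path: /opt
    state: directory
    mode: "0755"
```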

Extracting multi part zip files with Ansible (Example case: WebSphere installation)

For HCL Connections, we still need WebSphere, and I want to automate this complex and slow process with Ansible. WebSphere needs to be manually downloaded as different ZIP files for each component, for example:
├── CIK1VML.zip
├── CIK1WML.zip
└── CIK1XML.zip
The character after CIK1 identifies the part. On the command line, I can unzip them by replacing that part identifier with a question mark:
unzip '/cnx-smb/was/supplements/CIK1?ML.zip' -d /tmp/was-suppl-manual
I'd like to use the unarchive module because it supports features like remote_src, which would be useful for me, so I tried a simple POC playbook:
- hosts: 127.0.0.1
  connection: local
  tasks:
    - name: Unpack test
      become: yes
      unarchive:
        src: "/cnx-smb/was/supplements/CIK1?ML.zip"
        remote_src: no
        dest: "/tmp/was-extracted"
But this doesn't work:
TASK [Unpack test] *************************************************************
Wednesday 10 February 2021 16:17:25 +0000 (0:00:00.637)       0:00:00.651 ****
fatal: [127.0.0.1]: FAILED! => changed=false
  msg: |-
    Could not find or access '/cnx-smb/was/supplements/'CIK1?ML.zip'' on the Ansible Controller.
    If you are using a module and expect the file to exist on the remote, see the remote_src option
I also tried different src paths like /cnx-smb/was/supplements/'CIK1?ML.zip', because the unzip CLI call works only when at least the filename is wrapped in quotes, or alternatively the entire path. Ansible accepts the src only when the file name alone is quoted, and '/cnx-smb/was/supplements/CIK1?ML.zip' seems to be interpreted as a relative path (which obviously fails).
It seems that those multipart zip archives aren't really "multi part" archives like the ones I know from compression formats such as 7zip, where File.partX.7z pieces can only be used together; 7zip validates them and throws an error if e.g. a part is missing.
The situation is different with these zip files. I took a look inside them and noticed that I can extract every single zip file without the others. Every zip file contains a part of the installation archive. It seems that zip itself doesn't divide a large folder into parts; it's IBM who put some folders like disk2 into a separate archive file for whatever reason.
This means I can do the same with ansible: Just extract every single file on its own, but in the same directory:
- hosts: 127.0.0.1
  connection: local
  vars:
    base_dir: /cnx-smb/was/supplements/
  tasks:
    - name: Unpack
      become: yes
      unarchive:
        src: "{{ base_dir }}/{{ item }}"
        remote_src: no
        dest: "/tmp/was-extracted"
      with_items:
        - CIK1VML.zip
        - CIK1WML.zip
        - CIK1XML.zip
Both extracted folders (Ansible + manually using the unzip command with the ? placeholder) were of the same size and contain the same data:
vagrant@ansible:/cnx-repo$ du -hs /tmp/was-extracted/
3.0G    /tmp/was-extracted/
vagrant@ansible:/cnx-repo$ du -hs /tmp/was-suppl-manual
3.0G    /tmp/was-suppl-manual
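Since the part files share a common prefix, the hard-coded three-item list could also be generated with a fileglob loop, which supports the same ? placeholder and is evaluated on the control machine (a sketch using the same paths as above; new parts would be picked up automatically):

```yaml
- hosts: 127.0.0.1
  connection: local
  tasks:
    - name: Unpack all CIK1?ML.zip parts into one directory
      become: yes
      unarchive:
        src: "{{ item }}"
        remote_src: no
        dest: "/tmp/was-extracted"
      # '?' matches exactly one character, so this expands to the V/W/X parts
      with_fileglob: "/cnx-smb/was/supplements/CIK1?ML.zip"
```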

ansible-playbook gather information from a file

I want to read a file with Ansible, find a specific thing, and store all of the matches in a file on my localhost.
For example, there is a /tmp/test file on all hosts, and I want to grep a specific thing in this file and store all of the matches in my home directory.
What should I do?
There might be many ways to accomplish this. The choice of Ansible modules (or even tools) can vary.
One approach is (using only Ansible):
Slurp the remote file
Write new file with filtered content
Fetch the file to Control machine
Example:
- hosts: remote_host
  tasks:
    # Slurp the file
    - name: Get contents of file
      slurp:
        src: /tmp/test
      register: testfile

    # Filter the contents to a new file
    - name: Save contents to a variable for looping
      set_fact:
        testfile_contents: "{{ testfile.content | b64decode }}"

    - name: Write a filtered file
      lineinfile:
        path: /tmp/filtered_test
        line: "{{ item }}"
        create: yes
      when: "'TEXT_YOU_WANT' in item"
      with_items: "{{ testfile_contents.split('\n') }}"

    # Fetch the file
    - name: Fetch the filtered file
      fetch:
        src: /tmp/filtered_test
        dest: /tmp/
This will fetch the file to /tmp/<ANSIBLE_HOSTNAME>/tmp/filtered_test.
You can use the Ansible fetch module to download files from the remote system to your local system. You can then do the processing locally, as shown in this Ansible cli example:
REMOTE=[YOUR_REMOTE_SERVER]; \
ansible -m fetch -a "src=/tmp/test dest=/tmp/ansiblefetch/" $REMOTE && \
grep "[WHAT_YOU_ARE_INTERESTED_IN]" /tmp/ansiblefetch/$REMOTE/tmp/test > /home/user/ansible_files/$REMOTE
This snippet runs the ad-hoc version of Ansible, calling the fetch module with the source path (on the remote) and the destination folder (local) as arguments. Fetch copies the file into a folder [DEST]/[REMOTE_NAME]/[SRC], from which we then grep what we are interested in and write the output to /home/user/ansible_files/$REMOTE.

How to run a Linux-like cp command on the same server, but copy says it does not find the remote server

I am trying to emulate the scenario of copying a local file from one directory to another directory on the same machine, but the Ansible copy command is always looking for a remote server.
code I am using
- name: Configure Create directory
  hosts: 127.0.0.1
  connection: local
  vars:
    customer_folder: "{{ customer }}"
  tasks:
    - file:
        path: /opt/scripts/{ customer_folder }}
        state: directory
    - copy:
        src: /home/centos/absample.txt
        dest: /opt/scripts/{{ customer_folder }}
I am running this playbook like:
ansible-playbook ab_deploy.yml --extra-vars "customer=ab"
So there are two problems I am facing:
It should create a directory called ab under /opt/scripts/, but it creates the folder as { customer_folder }}; it's not taking ab as the name of the directory.
Second, as I read the documentation, copy only works to copy files from the local machine to a remote machine. But what I want is simply to copy from local to local.
How can I achieve this? It might be silly, I am just trying things out.
Please suggest.
I solved it: I used a cp command under the shell module and then it worked.
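For reference, the original playbook would also work without shelling out: the file task's path uses a malformed Jinja2 expression ({ customer_folder }} instead of {{ customer_folder }}), which is why the literal directory name was created, and with connection: local the copy module copies from local to local. A corrected sketch, based on the paths in the question:

```yaml
- name: Configure Create directory
  hosts: 127.0.0.1
  connection: local
  vars:
    customer_folder: "{{ customer }}"
  tasks:
    # Both opening braces must be doubled: {{ customer_folder }}
    - file:
        path: "/opt/scripts/{{ customer_folder }}"
        state: directory
    # With connection: local, src and dest are both on this machine
    - copy:
        src: /home/centos/absample.txt
        dest: "/opt/scripts/{{ customer_folder }}"
```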

How to preserve mtime/ctime of a file while copying file from local to remote using ansible playbook?

I am trying to copy a script to a remote machine and I have to preserve the mtime of the file.
I have tried using mode: preserve and mode=preserve as stated in the Ansible documentation, but it seems to preserve permissions and ownership only.
- name: Transfer executable script
  copy:
    src: /home/avinash/Downloads/freeradius.sh
    dest: /home/ssrnd09/ansible
    mode: preserve
In respect to your question and the comment of Zeitounator, I've set up a test with the synchronize module and the parameter times, which was working as required.
---
- hosts: test.example.com
  become: no
  gather_facts: no
  tasks:
    - name: Synchronize passing in extra rsync options
      synchronize:
        src: test.sh
        dest: "/home/{{ ansible_user }}/test.sh"
        times: yes
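To confirm the timestamps actually survived the transfer, a quick check with the stat module could look like this (my own addition, reusing the dest path from the playbook; stat reports mtime as Unix epoch seconds):

```yaml
- name: Read back file metadata on the remote host
  stat:
    path: "/home/{{ ansible_user }}/test.sh"
  register: remote_script

- name: Show the modification time of the copied file
  debug:
    var: remote_script.stat.mtime
```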
Thanks to: How to tell rsync to preserve time stamp on files
