Modify source file used for the Copy on the target machine - ansible

I'm using the Copy module to transfer a 10G file from my machine to the remote /tmp dir. However, Copy uses an intermediate folder inside home and I need to transfer the file directly to /tmp because /home doesn't have enough space.
Is it possible to control the src path used by the Copy module?
Thanks

In your ansible.cfg, change "remote_tmp" to a location with sufficient space available, or run your playbook as below:
ANSIBLE_REMOTE_TEMP=/dir1/some_dir/large_space/ ansible-playbook copy.yml
Official Documentation for ANSIBLE_REMOTE_TEMP with shell plugin
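For a permanent change, a minimal sketch of the ansible.cfg setting (using the example path from the command above):

```ini
# ansible.cfg
[defaults]
# Staging directory on the managed host; pick any path with enough free space
remote_tmp = /dir1/some_dir/large_space/
```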

Run remote files directly in dockerfile

I am wondering if it is possible to RUN a remote file stored in an NFS share when building an image from a dockerfile.
Currently I am using the COPY command and then the RUN command to execute the files, however many of the files I need to create the image are extremely large.
Is it possible to execute files stored in an NFS share directly in the dockerfile without having to copy them all over?
You can only RUN files inside your container - so it needs to copied to your container.
What you can do is move the COPY commands to the beginning of your Dockerfile so that they are cached and don't need to be copied every time you change a command later in the Dockerfile.
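As a sketch (image and file names hypothetical), moving the large COPY above frequently edited instructions lets the build cache reuse those layers:

```dockerfile
FROM ubuntu:22.04

# Large file copied first: this layer is cached and reused
# as long as big-installer.bin itself does not change
COPY big-installer.bin /tmp/big-installer.bin
RUN /tmp/big-installer.bin --install && rm /tmp/big-installer.bin

# Frequently changing steps go last, so edits here
# do not invalidate the expensive COPY layer above
COPY app/ /opt/app/
CMD ["/opt/app/start.sh"]
```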
You can RUN curl... to grab the remote file and then execute it, sure.
But this will only run at image build time, not during the lifecycle of the container.
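A sketch of the curl approach (URL and script name hypothetical); the download and execution happen once, at build time:

```dockerfile
FROM ubuntu:22.04
RUN apt-get update && apt-get install -y curl
# Fetch the remote script and execute it in one layer,
# so the downloaded file never persists in the final image
RUN curl -fsSL http://fileserver.example/setup.sh -o /tmp/setup.sh \
    && sh /tmp/setup.sh \
    && rm /tmp/setup.sh
```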
You could also mount the NFS volume to your host, then COPY the files.
Otherwise, remote execution is a pretty basic security flaw and shouldn't be possible under any circumstances

how to copy directories from remote machines to localhost?

In ansible playbook, fetch module only copies file from target machine to local.
copy/synchronize modules can copy directories/files from local to target machine.
Then how can directories be copied from remote target machines to localhost?
According to the documentation:
synchronize – A wrapper around rsync
mode(string) Choices: {pull,push}
Specify the direction of the synchronization.
In push mode the localhost or delegate is the source.
In pull mode the remote host in context is the source.
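A minimal task sketch (paths hypothetical) pulling a directory from the remote host to the controller with mode: pull:

```yaml
- name: Copy a directory from the remote host to localhost
  ansible.builtin.synchronize:
    mode: pull
    src: /var/log/myapp/     # path on the remote host
    dest: /tmp/myapp-logs/   # path on the controller (localhost)
```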
This is not specific to ansible but you can use scp:
scp -r me@my-host:/path/to/remote/folder /path/to/local/folder
This will copy the folder from your remote machine to a local folder.

Taking backup of a file in remote location using ansible

I am trying to copy a file (example: /home/abc.jar) from the controller machine (source system) to a remote server. But before copying that file, I want to back up the file that already exists on the remote server (/home/abc.jar) to some directory (example: /home/bck/abc.jar), and then copy the new file over.
How can I write the playbook for this using ansible?
Thank you :)
The docs are your friend:
Ansible Copy Module
Ansible Template Module
Both contain a 'backup' parameter which will automatically take a backup of a file prior to making changes.
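A sketch of the copy task using the backup parameter (paths taken from the question):

```yaml
- name: Deploy abc.jar, backing up the existing remote copy first
  ansible.builtin.copy:
    src: /home/abc.jar    # file on the controller
    dest: /home/abc.jar   # destination on the remote server
    backup: yes           # keeps a timestamped copy of the old file
```

Note that backup: yes stores the backup alongside the destination file with a timestamp suffix; placing it in a specific directory such as /home/bck would need an additional task to move it.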

ansible: how to copy files from a linked directory

I have a symbolically linked folder in Linux and would like to copy the contents of the folder to a folder on a remote machine. So far I have tried the synchronize module, because I am trying to copy the entire tree, folders and files.
When I run the synchronize task, it creates the folder as a symbolic link and the folder is empty. How can I copy the contents of the symbolic link without creating a symbolic link?
you can use
copy_links: yes
with synchronize module.
This copies the item that a symlink points to (the referent), rather than the symlink itself.
synchronize module Ref
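A task sketch (paths hypothetical) with copy_links enabled:

```yaml
- name: Sync the tree behind a symlinked folder to the remote machine
  ansible.builtin.synchronize:
    src: /opt/data-link/   # local symlink to the real directory
    dest: /srv/data/       # destination folder on the remote machine
    copy_links: yes        # follow symlinks and copy their targets
```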

Can lftp execute a command on the downloaded files (as part of the mirroring process)?

This may be asking too much from an already very powerful tool, but is there a chance that lftp mirror can execute a command during the mirroring process (from remote directory to the local machine)?
Specific example: lftp is asked to mirror a remote directory with xml files into a local folder and as soon as each file is downloaded/updated, it converts the file to JSON format using xml2json.
I can think of a solution that relies on monitoring the local copy of the mirrored folder for changes via find and then executing xml2json on the new/updated files, but perhaps there is a simpler way?
You can use xfer:verify and xfer:verify-command settings to run a local command on every transferred file.
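A sketch of an lftp script (host, paths, and wrapper script name hypothetical). xfer:verify-command receives the local file name as its argument and must exit 0 for the transfer to be considered successful, so a small wrapper that runs xml2json and returns 0 works:

```
# mirror.lftp -- run with: lftp -f mirror.lftp
set xfer:verify true
set xfer:verify-command /usr/local/bin/xml2json-wrap.sh
open ftp://user@ftp.example.com
mirror /remote/xml /local/xml
```

If the wrapper exits non-zero, lftp treats the file as failed, so make sure the conversion step does not mask xml2json errors you care about.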
