I am trying to copy a file (for example /home/abc.jar) from the controller machine (the source system) to a remote server. Before copying it, I want to back up the file that already exists on the remote server (/home/abc.jar) to another directory (for example /home/bck/abc.jar), and then copy the new file over.
How can I write an Ansible playbook for this?
Thank you :)
The docs are your friend:
Ansible Copy Module
Ansible Template Module
Both support a 'backup' parameter, which automatically takes a backup of the existing file before making changes.
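For example, a minimal playbook sketch (the group name remote_servers is hypothetical; backup: yes stores a timestamped copy next to the original, and the copy module reports its path as backup_file, so moving it into /home/bck takes an extra task):

- hosts: remote_servers
  tasks:
    - name: Copy abc.jar, backing up the existing remote file first
      copy:
        src: /home/abc.jar        # file on the controller
        dest: /home/abc.jar       # destination on the remote server
        backup: yes               # keeps a timestamped copy of the old file
      register: copy_result

    - name: Move the automatic backup into /home/bck
      command: mv "{{ copy_result.backup_file }}" /home/bck/
      when: copy_result.backup_file is defined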
Related
I need to copy a directory from a remote host (which is not in my inventory/hosts file) to my working server (which is in the inventory). I can't add the source host to the inventory, because I already have a lot of tasks targeting the working host. I just need to do something like: SSH to the source server, copy the directory, and place it on the working server. How can I deal with this? As far as I can see, fetch only copies single files, and it also only works with hosts from the inventory. What I actually need is something like: give an IP -> connect to it -> copy a directory.
P.S.
Working server (host) - the host from the inventory file, on which the Ansible playbook runs
Remote host - another VM, from which I need to copy the directory
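One possible sketch (untested; all hostnames, IPs, and paths below are placeholders): add the outside VM to the in-memory inventory with add_host, then use the synchronize module in push mode, delegated to that VM, so the directory is rsynced straight onto the working server. This assumes rsync is installed on both machines and the outside VM can reach the working server over SSH:

- hosts: working_server
  tasks:
    - name: Add the outside VM to the in-memory inventory
      add_host:
        name: source_vm
        ansible_host: 192.0.2.10      # the IP you were given
        ansible_user: someuser

    - name: Rsync the directory from source_vm onto the working server
      synchronize:
        src: /path/on/source_vm/mydir
        dest: /path/on/working_server/
        mode: push                    # push from the delegated host to the play host
      delegate_to: source_vm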
I have a vault and need to restore one of the folders from it. I initiated the job using the AWS CLI and got the inventory as a JSON file, but I am unable to get the complete folder from the inventory. Can anyone help me restore the folder?
I am able to get a CSV file format to see the archive IDs of the files, but is it possible to retrieve the complete folder, since it shows a separate archive ID for every file in the folder?
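Note that Glacier itself has no folder concept: every file is stored as its own archive, so restoring a "folder" means initiating one archive-retrieval job per archive ID listed in the inventory. A rough sketch with the AWS CLI (the vault name and archive ID are placeholders):

aws glacier initiate-job --account-id - --vault-name my-vault \
    --job-parameters '{"Type": "archive-retrieval", "ArchiveId": "EXAMPLE_ARCHIVE_ID"}'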
I'm using the Copy module to transfer a 10G file from my machine to the remote /tmp directory. However, Copy stages the file in an intermediate folder inside /home, and I need it to go directly to /tmp because /home doesn't have enough space.
Is it possible to control the staging path used by the Copy module?
Thanks
In your ansible.cfg, change "remote_tmp" to a location with sufficient space available, or run your playbook like this:
ANSIBLE_REMOTE_TEMP=/dir1/some_dir/large_space/ ansible-playbook copy.yml
Official Documentation for ANSIBLE_REMOTE_TEMP with shell plugin
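If you prefer the ansible.cfg route, a minimal sketch (the path is just an example):

[defaults]
# Directory on the remote host where Ansible stages files before moving them into place
remote_tmp = /dir1/some_dir/large_space/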
I want to back up my local DynamoDB server. I have installed DynamoDB Local on a Linux machine. Some sites suggest creating a BASH script on Linux and connecting to an S3 bucket, but on a local machine we don't have an S3 bucket.
So I am stuck with my work. Please help me. Thanks
You need to find the database file created by DynamoDB Local. From the docs:
-dbPath value — The directory where DynamoDB will write its database file. If you do not specify this option, the file will be written to the current directory. Note that you cannot specify both -dbPath and -inMemory at once.
The file name would be of the form youraccesskeyid_region.db. If you used the -sharedDb option, the file name would be shared-local-instance.db.
By default, the file is created in the directory from which you started DynamoDB Local. To restore, copy the same file back and, when starting DynamoDB Local, specify the same -dbPath.
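A minimal backup/restore sketch, assuming the -sharedDb file name and an install directory of ~/dynamodb-local (both are assumptions; stop DynamoDB Local before copying):

# Back up: copy the database file somewhere safe
cp ~/dynamodb-local/shared-local-instance.db ~/backups/

# Restore: put the file back and start DynamoDB Local with the same -dbPath
cp ~/backups/shared-local-instance.db ~/dynamodb-local/
cd ~/dynamodb-local
java -Djava.library.path=./DynamoDBLocal_lib -jar DynamoDBLocal.jar -sharedDb -dbPath .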
In my RF (Robot Framework) scripts, I need to modify the hosts file under C:\Windows\System32\drivers\etc to finish my test job, but the scripts fail with ERROR 20047. I tried creating a file in another folder and that worked, so I concluded it only fails in the etc folder, which suggests I don't actually have permission to create/modify the hosts file there. I changed the owner of the etc folder, granted all permissions to the current user, and tried again; it still does not work.
BTW: the RF scripts run under Jython. I find that it works when I use Python to create/modify the hosts file in the etc folder.
Could anyone help me?