I'm using an Ansible playbook to copy files between my host and a server. I have to run the playbook repeatedly to upload updates. At first I was using Ansible's "copy" module, but to improve the performance of synchronizing files and directories I've switched to the "synchronize" module, which makes Ansible use rsync instead of sftp or scp.
With the "copy" module I could set the file's mode on the destination host with the mode option (e.g. mode=644). I want to do the same with "synchronize", but it only has a perms option, which accepts yes or no.
Is there a way to specify the file's mode with "synchronize", without inheriting it from the source?
Thx!
I finally solved it using rsync_opts:
- name: sync file
  synchronize:
    src: file.py
    dest: /home/myuser/file.py
    rsync_opts:
      - "--chmod=F644"
I'm writing a playbook and I want to create a symlink.
While installing Citrix on the Linux system I need to create the symlink with this command:
ln -s /etc/ssl/serts cacerts
Now in the playbook I use it as:
- name: Link
  command: ln -s /etc/ssl/serts cacerts
When I use the format above it works fine, but I want to check whether the link already exists: if not, create it; if it does, skip to the next task.
I could use ignore_errors: yes, but I think there is a better way of doing it.
Thank you very much in advance.
You can use the "file" module (note that src is the path the link points to, and dest is the link to create):
- name: Link
  file:
    src: /etc/ssl/serts
    dest: cacerts
    state: link
It is generally better to use a proper module, which handles failure conditions and check mode. In this case, the task will not fail if the link already exists and is correct.
You may want to give an absolute dest depending on your application.
For more information: https://docs.ansible.com/ansible/latest/modules/file_module.html
I am trying to copy a file (example: /home/abc.jar) from the controller machine to a remote server, but before copying it I want to back up the file that already exists on the remote server (/home/abc.jar) to some directory (example: /home/bck/abc.jar), and then copy the new file.
How can I write the playbook for this in Ansible?
Thank you :)
The docs are your friend:
Ansible Copy Module
Ansible Template Module
Both support a 'backup' parameter, which automatically takes a backup of the file before overwriting it.
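A minimal sketch of such a task, using the paths from the question (note that backup: yes stores the backup next to the original with a timestamp suffix; moving it into a separate directory like /home/bck would need an extra task):

```yaml
- name: Copy the new jar, backing up the existing one first
  copy:
    src: /home/abc.jar    # file on the controller
    dest: /home/abc.jar   # destination on the remote server
    backup: yes           # keeps a timestamped copy of the old file
```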
In one of our Ansible roles we extract a tar.gz archive and then replace one of the extracted files with another one to fix an issue.
The problem is that when we run Ansible again, it re-extracts the archive because the directory contents have changed, naturally marks the task as changed, and then replaces the file again as expected.
So we get two "changed" tasks every time we run the playbook...
How should I handle this to keep the operation idempotent?
Use the exclude option to ignore certain paths; see the documentation.
For example:
- unarchive:
    src: https://example.com/example.zip
    dest: /usr/local/bin
    remote_src: True
    exclude:
      - bad.config
The creates option might also suit you: the unarchive step will be skipped if the specified path already exists on the remote machine.
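A sketch of that variant (the creates path is an assumption; point it at any file the archive is known to extract):

```yaml
- unarchive:
    src: https://example.com/example.zip
    dest: /usr/local/bin
    remote_src: True
    creates: /usr/local/bin/example   # skip the task if this path already exists
```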
I'm working on a playbook to upload a configuration file to remote servers, but the remote servers do not have python installed (which is a requirement for using modules). I have successfully written other playbooks using the raw feature to avoid having to install python on the servers, but I can't find any examples in the Ansible documentation to perform a file upload using bare-bones ssh. Is a non-module based upload possible?
Not sure why you use Ansible this way, but you can run scp in a local task:
- name: remote task
  raw: echo remote

- name: local scp
  local_action: command scp /path/to/localfile {{ inventory_hostname }}:/path/to/remotefile

- name: remote task
  raw: cat /path/to/remotefile
I usually check for and install Python with the raw module, then continue with Ansible's core modules.
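A sketch of that bootstrap step (assumes an apt-based target; swap in your distribution's package manager):

```yaml
- name: Bootstrap Python with raw so later modules can run
  raw: test -e /usr/bin/python3 || (apt-get update && apt-get install -y python3)
```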
This answer may not always be applicable, but as long as you are allowed to put the files on some kind of web server, and curl or wget (or similar) is installed on the remote system, you can use those tools to download your files from a raw task.
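A sketch of that approach (the URL and destination path are placeholders):

```yaml
- name: Download the configuration file with curl via raw
  raw: curl -fsSL https://example.com/myapp.conf -o /etc/myapp/myapp.conf
```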
I've managed to set up a minimal Ansible playbook to execute some scripts on my machines:
- name: Execute CLI on remote servers
  hosts: webserver
  tasks:
    - name: Get metrics
      shell: /home/user1/bin/cli.sh --file=script.cli
The only issue is that this relies on the filesystem to store the scripts. I'd like to store my scripts in a repository (such as git) and pass a reference to one as an argument to shell. Something like:
shell: /home/user1/bin/cli.sh --file=ssh://git@github.com/mylogin/script.cli
Any suggestion is highly appreciated!
Not a very elegant solution, but you can use the Ansible git module (http://docs.ansible.com/ansible/git_module.html) to first clone the repository containing your scripts onto your target machine(s) (webserver), then reference those files from the shell module.
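A sketch of that two-step approach (the repository URL and paths are assumptions):

```yaml
- name: Execute CLI on remote servers
  hosts: webserver
  tasks:
    - name: Clone the scripts repository on the target
      git:
        repo: https://github.com/mylogin/scripts.git
        dest: /home/user1/scripts

    - name: Get metrics using a script from the clone
      shell: /home/user1/bin/cli.sh --file=/home/user1/scripts/script.cli
```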