Below is the folder structure:

playbook
|- groups_Vars
|- host
|- roles
|  |- archive-artifact
|     |- task
|        |- main.yml
|- archive-playbook.yml
myfile
In my main.yml, I need to archive the playbook in playbook.tar.gz.
- archive:
    path: "<earlier path>/playbook/*"
    dest: "<earlier path>/playbook.tar.gz"
    format: gz
The folder that holds the playbook is accessible in the special variable playbook_dir.
Getting the parent directory of a file or directory in Ansible is possible via the dirname filter.
And, as pointed out in the documentation, path can be either a single element or a list of elements, so you could also have myfile included in that list.
So, to archive the playbook directory in the parent folder of the playbook directory, one could do:
- archive:
    path:
      - "{{ playbook_dir }}/*"
      - "{{ playbook_dir | dirname }}/myfile"
    dest: "{{ playbook_dir | dirname }}/playbook.tar.gz"
    format: gz
Quick question: is it possible to create a directory using the file module and register the path of the new directory so that you can use it as a variable?
I would love to do this because I am creating a directory with a timestamp, but when I later try to use that directory to store some data, the lookup plugin fails because the time has changed by then...
Here are the first two tasks, where I create two directories:
- name: Create Directory with timestamp to store the data if it doesn't exist
  when: inventory_hostname in groups['local']
  file:
    path: "{{ store_files_path }}/{{ ansible_date_time.date }}"
    state: directory
    mode: "0755"

- name: Create Directory with timestamp to store data that was run multiple times that day
  when: inventory_hostname in groups['local']
  file:
    path: "{{ store_files_path }}/{{ ansible_date_time.date }}/{{ ansible_date_time.time }}"
    state: directory
    mode: "0755"
I am using this variable in the other tasks to store some data in that directory, which works really well: "{{ store_files_path }}/{{ ansible_date_time.date }}". The issue comes when retrieving a file from that directory using the lookup plugin: it fails because the second directory, whose name is a timestamp, can't be found. The lookup plugin looks for a directory named after the current time at the moment the task executes, which differs from the time when the second task created the directory.
Kindly assist. I have thought of trying the stat feature, but I don't know how to execute that thought.
My lookup task:
- name: Deploy to the server
  when: inventory_hostname in groups['Servers']
  authorized_key:
    user: "{{ hostvars['dummy']['user'] }}"
    state: present
    key: "{{ lookup('file','{{store_files_path}}/{{ansible_date_time.date}}/{{ansible_date_time.time}}/file') }}"
There are a couple of options:
You can keep the public keys on the Ansible control machine and use lookup:
# example: use /home/ansible/.ssh/dummy_id_rsa.pub on the Ansible control machine
- authorized_key:
    user: "{{ hostvars['dummy']['user'] }}"
    key: "{{ lookup('file', '/home/ansible/.ssh/dummy_id_rsa.pub') }}"
    state: "present"
Pass the public key as a string:
vars:
  dummy_pub_key: "ssh-rsa AAAABC12x........... dummy@localhost"
tasks:
  - authorized_key:
      user: "{{ hostvars['dummy']['user'] }}"
      key: "{{ dummy_pub_key }}"
      state: "present"
Update:
Oh, and for your original question on saving the created path into a variable:
- name: Create Directory with timestamp to store the data if it doesn't exist
  when: inventory_hostname in groups['local']
  file:
    path: "{{ store_files_path }}/{{ ansible_date_time.date }}"
    state: directory
    mode: "0755"
  register: newdir_res

- name: show the newly created directory path
  debug:
    msg: "Directory path is {{ newdir_res.path }}"
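With the path registered, the later lookup can reference the recorded directory instead of re-evaluating ansible_date_time, so both tasks agree on the timestamp. A sketch, assuming the registering task ran on localhost (the host name in your local group may differ) and reusing the file name from the question:

```yaml
- name: Deploy to the server
  when: inventory_hostname in groups['Servers']
  authorized_key:
    user: "{{ hostvars['dummy']['user'] }}"
    state: present
    # read the path captured by register on localhost, not the current time
    key: "{{ lookup('file', hostvars['localhost']['newdir_res']['path'] + '/file') }}"
```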
I'm trying to create an Ansible task to save the content of a variable to a new file.
Using Ansible 2.5.13 and Python 2.7.5.
I've already tried copying the content of the variable to the destination path where the file should be created...
- name: Save alert rule example file to Prometheus
  copy:
    content: "{{ alert_rule_config }}"
    dest: /opt/compose/prom/alert_rule.yml
I also tried creating the file before copying the content of the variable:
- name: Create alert rule file
  file:
    path: /opt/compose/prom/alert_rule.yml
    state: touch

- name: Save alert rule example file to Prometheus
  copy:
    content: "{{ alert_rule_config }}"
    dest: /opt/compose/prom/alert_rule.yml
I also tried wrapping the destination path in quotes...
But no matter what, a directory /opt/compose/prom/alert_rule.yml/ is created!
The content of the variable is something like
alert_rule_config:
  groups:
    - name: debug-metrics
      rules:
        - alert: DebugAlert
          expr: test_expression
I expect the file to be created (because it does not exist) and the content of the variable to be saved to the newly created file, but the task fails with
FAILED! => {"changed": false, "msg": "can not use content with a dir as dest"}
I want to avoid issuing a command and would prefer to use an Ansible module.
You need to create the target directory instead of the target file. Otherwise you will get "Destination directory /opt/compose/prom does not exist" with the first option, or "Error, could not touch target: [Errno 2] No such file or directory: '/opt/compose/prom/alert_rule.yml'" with the second.
- name: Create alert rule containing directory
  file:
    path: /opt/compose/prom/
    state: directory

- name: Save alert rule example file to Prometheus
  copy:
    content: "{{ alert_rule_config }}"
    dest: /opt/compose/prom/alert_rule.yml
But as @Calum Halpin says, if you made a mistake during your tests and created a directory /opt/compose/prom/alert_rule.yml/, you need to remove it first.
This will happen if a directory /opt/compose/prom/alert_rule.yml already exists when your tasks are run.
To remove it as part of your tasks add
- file:
    path: /opt/compose/prom/alert_rule.yml
    state: absent
ahead of your other tasks.
I want to overwrite a file on a remote location using Ansible. Whether the content of the zip file has changed or not, every time I run the playbook the file needs to be overwritten on the destination server.
Below is my playbook:
- hosts: localhost
  tasks:
    - name: Checking if file exists to copy to update servers
      stat:
        path: "/var/lib/abc.zip"
        get_checksum: False
        get_md5: False
      register: win_stat_result

    - debug:
        var: win_stat_result.stat.exists

- hosts: uploads
  tasks:
    - name: Getting VARs
      debug:
        var: hostvars['localhost']['win_stat_result']['stat']['exists']

    - name: Copy files to destination servers
      win_copy:
        src: "/var/lib/abc.zip"
        dest: E:\xyz\data\charts.zip
        force: yes
      when: hostvars['localhost']['win_stat_result']['stat']['exists']
When I run this playbook, it doesn't overwrite the file on the destination because the file already exists. I used force: yes but it didn't work.
Try the Ansible copy module.
The copy module defaults to overwriting an existing file at the dest parameter (i.e. force defaults to yes). The source file can come either from the remote server you're connected to or from the local machine your playbook runs on. Here's a code snippet:
- name: Overwrite file if it exists; the remote server has the source file because remote_src is set below
  copy:
    src: "/var/lib/abc.zip"
    dest: E:\xyz\data\charts.zip
    remote_src: yes
You can remove the file before copying the new one:
- name: Delete file before copy
  win_file:
    path: E:\xyz\data\charts.zip
    state: absent
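Putting this together with the win_copy task from the question, the delete-then-copy sequence would look like this (a sketch reusing the question's paths):

```yaml
- name: Delete file before copy
  win_file:
    path: E:\xyz\data\charts.zip
    state: absent

- name: Copy files to destination servers
  win_copy:
    src: "/var/lib/abc.zip"
    dest: E:\xyz\data\charts.zip
```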
I am trying to copy the content of the dist directory to the nginx directory.
- name: copy html file
  copy: src=/home/vagrant/dist/ dest=/usr/share/nginx/html/
But when I execute the playbook it throws an error:
TASK [NGINX : copy html file] **************************************************
fatal: [172.16.8.200]: FAILED! => {"changed": false, "failed": true, "msg": "attempted to take checksum of directory:/home/vagrant/dist/"}
How can I copy a directory that has another directory and a file inside?
You could use the synchronize module. The example from the documentation:
# Synchronize two directories on one remote host.
- synchronize:
    src: /first/absolute/path
    dest: /second/absolute/path
  delegate_to: "{{ inventory_hostname }}"
This has the added benefit that it will be more efficient for large/many files.
EDIT: This solution worked when the question was posted. Ansible later deprecated recursive copying with remote_src.
The Ansible copy module by default copies files/dirs from the control machine to the remote machine. If you want to copy files/dirs within the remote machine and you have Ansible 2.0 or later, set remote_src to yes:
- name: copy html file
  copy: src=/home/vagrant/dist/ dest=/usr/share/nginx/html/ remote_src=yes directory_mode=yes
To copy a directory's content to another directory you CAN use Ansible's copy module:
- name: Copy content of directory 'files'
  copy:
    src: files/ # note the '/' <-- !!!
    dest: /tmp/files/
From the docs about the src parameter:
If (src!) path is a directory, it is copied recursively...
... if path ends with "/", only inside contents of that directory are copied to destination.
... if it does not end with "/", the directory itself with all contents is copied.
Resolved answer:
To copy a directory's content to another directory I use the following:
- name: copy consul_ui files
  command: cp -r /home/{{ user }}/dist/{{ item }} /usr/share/nginx/html
  with_items:
    - "index.html"
    - "static/"
It copies both items to the other directory. In the example, one of the items is a directory and the other is not. It works perfectly.
The simplest solution I've found to copy the contents of a folder without copying the folder itself is to use the following:
- name: Move directory contents
  command: cp -r /<source_path>/. /<dest_path>/
This resolves @surfer190's follow-up question:
Hmmm what if you want to copy the entire contents? I noticed that * doesn't work – surfer190 Jul 23 '16 at 7:29
* is a shell glob: it relies on your shell to enumerate all the files within the folder before running cp, whereas the . directly instructs cp to copy the directory's contents (see https://askubuntu.com/questions/86822/how-can-i-copy-the-contents-of-a-folder-to-another-folder-in-a-different-directo)
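The difference is easy to see with hidden files, which a default shell glob skips but the . form copies. A small demo (the /tmp/glob_demo paths are made up for the illustration):

```shell
# build a source directory with a regular file and a hidden file
rm -rf /tmp/glob_demo
mkdir -p /tmp/glob_demo/src /tmp/glob_demo/glob_dest /tmp/glob_demo/dot_dest
touch /tmp/glob_demo/src/visible.txt /tmp/glob_demo/src/.hidden

# the shell expands * before cp runs, and by default the glob skips dotfiles
cp -r /tmp/glob_demo/src/* /tmp/glob_demo/glob_dest/

# with '.', cp itself reads the directory, so hidden files are copied too
cp -r /tmp/glob_demo/src/. /tmp/glob_demo/dot_dest/
```

After this runs, glob_dest contains only visible.txt while dot_dest also contains .hidden.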
Ansible remote_src does not support recursive copying. See the remote_src description in the Ansible copy docs.
To recursively copy the contents of a folder and make sure the task stays idempotent, I usually do it this way:
- name: get file names to copy
  command: "find /home/vagrant/dist -type f"
  register: files_to_copy

- name: copy files
  copy:
    src: "{{ item }}"
    dest: "/usr/share/nginx/html"
    owner: nginx
    group: nginx
    remote_src: True
    mode: 0644
  with_items:
    - "{{ files_to_copy.stdout_lines }}"
The downside is that the find command still shows up as 'changed'.
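One way to hide that cosmetic change is to mark the read-only find task with changed_when: false, a common pattern for commands that only gather information:

```yaml
- name: get file names to copy
  command: "find /home/vagrant/dist -type f"
  register: files_to_copy
  # a plain find never modifies the system, so report it as unchanged
  changed_when: false
```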
The Ansible doc is quite clear: https://docs.ansible.com/ansible/latest/collections/ansible/builtin/copy_module.html. For the src parameter it says the following:
Local path to a file to copy to the remote server.
This can be absolute or relative.
If path is a directory, it is copied recursively. In this case, if path ends with "/",
only inside contents of that directory are copied to destination. Otherwise, if it
does not end with "/", the directory itself with all contents is copied. This behavior
is similar to the rsync command line tool.
So all you need is to skip the / at the end of your src path:
- name: copy html file
  copy: src=/home/vagrant/dist dest=/usr/share/nginx/html/
I found a workaround for recursive copying from remote to remote:
- name: List files in /usr/share/easy-rsa
  find:
    path: /usr/share/easy-rsa
    recurse: yes
    file_type: any
  register: find_result

- name: Create the directories
  file:
    path: "{{ item.path | regex_replace('/usr/share/easy-rsa','/etc/easy-rsa') }}"
    state: directory
    mode: "{{ item.mode }}"
  with_items:
    - "{{ find_result.files }}"
  when:
    - item.isdir

- name: Copy the files
  copy:
    src: "{{ item.path }}"
    dest: "{{ item.path | regex_replace('/usr/share/easy-rsa','/etc/easy-rsa') }}"
    remote_src: yes
    mode: "{{ item.mode }}"
  with_items:
    - "{{ find_result.files }}"
  when:
    - item.isdir == False
I spent a whole day on this, too, and finally found the solution using the shell module instead of copy: or command:, as below:
- hosts: remote-server-name
  gather_facts: no
  vars:
    src_path: "/path/to/source/"
    des_path: "/path/to/dest/"
  tasks:
    - name: Ansible copy files remote to remote
      shell: 'cp -r {{ src_path }}/. {{ des_path }}'
Strictly note:
1. src_path and des_path end with the / symbol
2. in the shell command, src_path ends with ., which selects all content of the directory
3. I used my remote-server-name both in hosts: and in the Execute Shell section of Jenkins, instead of the remote_src: specifier in the playbook
I guess it is good advice to run the command below in the Execute Shell section in Jenkins:
ansible-playbook copy-payment.yml -i remote-server-name
The below worked for me:

- name: Upload html app directory to Deployment host
  copy: src=/var/lib/jenkins/workspace/Demoapp/html dest=/var/www/ directory_mode=yes
This is the solution I found ideal for copying files from the Ansible server to the remote host.
Copying the .aws config files:
- hosts: localhost
  user: "{{ user }}"
  connection: ssh
  become: yes
  gather_facts: no
  tasks:
    - name: Creation of directory on remote server
      file:
        path: /var/lib/jenkins/.aws
        state: directory
        mode: 0755
      register: result

    - debug:
        var: result

    - name: get file names to copy
      command: "find conf/.aws -type f"
      register: files_to_copy

    - name: copy files
      copy:
        src: "{{ item }}"
        dest: "/var/lib/jenkins/.aws"
        owner: "{{ user }}"
        group: "{{ group }}"
        remote_src: True
        mode: 0644
      with_items:
        - "{{ files_to_copy.stdout_lines }}"
How to copy a directory with subdirectories and files from the Ansible server to a remote host:
- name: copy nmonchart39 directory to {{ inventory_hostname }}
  copy:
    src: /home/ansib.usr.srv/automation/monitoring/nmonchart39
    dest: /var/nmon/data
Where:
- to copy the entire directory: src: /automation/monitoring/nmonchart39
- to copy only the directory contents: src: nmonchart39/
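The two variants side by side, using the paths from the example above:

```yaml
# copies the directory itself, ending up as /var/nmon/data/nmonchart39
- copy:
    src: /home/ansib.usr.srv/automation/monitoring/nmonchart39
    dest: /var/nmon/data

# trailing '/' copies only the contents, directly into /var/nmon/data
- copy:
    src: /home/ansib.usr.srv/automation/monitoring/nmonchart39/
    dest: /var/nmon/data
```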