I am using Ansible to move .js files from my local machine to an EC2 development environment and am having an issue copying the entire folder structure.
I am using the following task to move the files, but only the files directly in the dist folder are getting copied. I need to copy the entire folder, including the child files and folders, to the destination folder.
- name: Copy each file over that matches the given pattern
  copy:
    src: "{{ item }}"
    dest: "/home/admin/microservice/dist"
    owner: "admin"
    group: "admin"
    force: "yes"
    recurse: "true"
    mode: 0755
  with_fileglob:
    - "/Users/myfolder/WebStormProjects/project/microservice/dist/*.js"
I need to copy the entire folder contents from the source to the destination, including subfolders and files. What can I do to fix this task to make this happen?
With the copy module, the solution to your problem would be much more complicated than you think, because:
- you can't match a directory and *.js files in a single globbing operation,
- even if you could, you can't use the same copy operation to copy a file and to create a directory (notice: create a directory, not copy it, as the latter would imply copying it with all of its files).
You'd need to handle the directories and files separately (see an implementation in the first revision of this answer).
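For reference, a rough sketch of that two-step approach, using the filetree lookup, could look like this. The paths and ownership are carried over from your question; this is an illustration, not the exact code from the earlier revision:
- name: Recreate the directory tree under the destination
  file:
    # item.path is relative to the lookup root, so the layout is preserved
    path: "/home/admin/microservice/dist/{{ item.path }}"
    state: directory
    owner: admin
    group: admin
  with_filetree: /Users/myfolder/WebStormProjects/project/microservice/dist/
  when: item.state == 'directory'

- name: Copy only the .js files into the recreated tree
  copy:
    src: "{{ item.src }}"
    dest: "/home/admin/microservice/dist/{{ item.path }}"
    owner: admin
    group: admin
  with_filetree: /Users/myfolder/WebStormProjects/project/microservice/dist/
  when: item.state == 'file' and item.path.endswith('.js')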
With rsync, the solution is much more concise and only requires setting the appropriate filters: --include='*/' --include='*.js' --exclude='*'.
The synchronize task implementing this in Ansible:
- synchronize:
    src: /source/Users/myfolder/WebStormProjects/project/microservice/dist/
    dest: /home/admin/microservice/dist/
    rsync_opts:
      - --include=*/
      - --include=*.js
      - --exclude=*
Note 1: it is important not to add quotes for the filtered values in rsync_opts.
Note 2: you might still need to set the appropriate ownership and permissions.
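For example, a follow-up task along these lines could take care of that; the owner, group, and mode below are assumptions carried over from the question:
- name: Fix ownership and permissions on the synced tree
  file:
    path: /home/admin/microservice/dist
    state: directory
    owner: admin
    group: admin
    mode: 0755       # applied to files as well when recurse is set; adjust if needed
    recurse: yes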
First, using the copy module here is not ideal, because "The copy module recursively copy facility does not scale to lots (>hundreds) of files. For alternative, see synchronize module, which is a wrapper around rsync."
copy module documentation
synchronize module documentation
However, you can do it with the copy module as below:
copy:
  src: "{{ item }}"
  dest: /home/admin/microservice/dist
with_lines: "find /home/admin/microservice/dist -type f -name '*.js'"
Similarly, you can try the synchronize module as below:
synchronize:
  src: "{{ item }}"
  dest: /home/admin/microservice/dist
with_lines: "find /home/admin/microservice/dist -type f -name '*.js'"
If you want to retain the directory layout, you can do it as below:
Step 1: copy the files matching the pattern, along with their parent directory structure, into a temp directory.
Step 2: copy the temp directory to the destination. Afterwards you can delete the temp directory, or do whatever your use case requires.
- name: copy pattern files and directory into a temp directory
  shell: find . -type f -name "*.js" | cpio -pvdmB /temp/dir/
  args:
    chdir: "/Users/myfolder/WebStormProjects/project/microservice/dist/"

- name: Copy the temp directory recursively to destination directory
  copy:
    src: "/temp/dir/"
    dest: "/home/admin/microservice/dist/"
    owner: "admin"
    group: "admin"
    force: "yes"
    mode: 0755
Below is the folder structure:
playbook
|-groups_Vars
|-host
|-roles
  |-archive-artifact
    |-task
      |-main.yml
|-archive-playbook.yml
myfile
In my main.yml, I need to archive the playbook in playbook.tar.gz.
- archive:
    path: "<earlier path>/playbook/*"
    dest: "<earlier path>/playbook.tar.gz"
    format: gz
The folder that holds the playbook is accessible in the special variable playbook_dir.
Getting the parent directory of a file or directory in Ansible is possible via the dirname filter.
And, as pointed out in the documentation, path can be either a single element or a list of elements, so you could also include myfile in that list.
So, to archive the playbook directory in the parent folder of the playbook directory, one could do:
- archive:
    path:
      - "{{ playbook_dir }}/*"
      - "{{ playbook_dir | dirname }}/myfile"
    dest: "{{ playbook_dir | dirname }}/playbook.tar.gz"
    format: gz
I have a .dsx file on the remote server which I wish to rename. I have an Ansible playbook that gets the artefacts from Nexus, zips them and then unzips them to the remote server.
That unzipped file needs to be renamed.
unarchive:
  remote_src: yes
  src: "{{ destinationDir }}/{{ artefactid }}-{{ version }}.tar.gz"
  dest: "{{ destinationDir }}"
The filename which gets unarchived is djp-1.0.2-20200805.123-1.dsx
And I just want djp.dsx.
Actually, the filename I mentioned is just an example. The filename will keep changing every time we do a deployment. Can you please suggest how I can modify the move command then?
Use the mv command to rename the file, just as you would rename a file in your terminal. As discussed in the comments:
1) You have to find the files first, then set_fact a variable from the result; item.path is the file you want to rename: set_fact: fname: "{{ item.path | basename }}"
2) Extract the prefix before the first dash: set_fact: prefix: "{{ fname | regex_replace('(\w+)-.*', '\\1') }}"
3) Rename the file: command: mv ./djp-1.0.2-20200805.123-1.dsx ./{{ prefix }}.dsx
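Putting those steps together, a minimal sketch could look like the following; the destinationDir variable and the *.dsx pattern are assumptions based on the question, and dsx_files is just an illustrative register name:
- name: Find the unarchived .dsx files in the destination directory
  find:
    paths: "{{ destinationDir }}"
    patterns: "*.dsx"
  register: dsx_files

- name: Rename each file to the prefix before the first dash
  command: mv "{{ item.path }}" "{{ destinationDir }}/{{ item.path | basename | regex_replace('(\w+)-.*', '\\1') }}.dsx"
  with_items: "{{ dsx_files.files }}"
Note that the command task is not idempotent: on a second run the original file no longer exists, so you may want to guard it with a when condition or a removes argument.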
I'm trying to specify part of an archive not to extract via Ansible's unarchive module using the exclude option.
I believe the syntax should be roughly as shown here...
- name: Extract files from discovered zip file
  unarchive:
    src: "{{ base_path }}/weblogic-deployment/environments/{{ client_environment }}/discovered_domain.zip"
    dest: "{{ base_path }}/weblogic-deployment/environments/{{ client_environment }}/tmp"
    exclude:
      - ./wlsdeploy/applications/
    remote_src: yes
I have tried many slight variations, but the excluded directory is always extracted. Any suggestions?
The exclude option is expected to be a list of paths to exclude. So most likely it doesn't support directories alone.
List the directory and file entries that you would like to exclude from the unarchive action.
So try something like:
exclude:
  - ./wlsdeploy/applications/*
For some syntax examples, see: unarchive/tasks/main.yml. Also check the source code.
To exclude the folder entirely:
exclude:
  - "wlsdeploy/applications"
To extract the folder, but exclude the contents, you can use a glob:
exclude:
  - "wlsdeploy/applications/*"
As the paths are relative to the archive, you don't need the preceding ./.
If you need to look inside the archive first (to find the paths), you can use less:
less discovered_domain.zip
From the unarchive tests for exclude:
- name: Unpack archive file excluding regular and glob files.
  unarchive:
    src: "{{ remote_tmp_dir }}/unarchive-00.{{item}}"
    dest: "{{ remote_tmp_dir }}/exclude-{{item}}"
    remote_src: yes
    exclude:
      - "exclude/exclude-*.txt"
      - "other/exclude-1.ext"
  with_items:
    - zip
    - tar
I'm doing an Ansible remote copy and have a question about dest: is there any difference if the dest ends with or without a /?
dest: /tmp/dest
dest: /tmp/dest/
I've tried with and without the /, and it looks like both of them do the copy.
- name: copy the properties file to dest
  copy:
    src: /tmp/src/{{ item }}
    dest: /tmp/dest
    remote_src: yes
  with_items:
    - runtime.properties
    - default.properties
If you are copying a directory, it doesn't matter whether or not the target path ends with /. In both cases, Ansible first ensures the target directory exists and then copies the source directory into it. That is, given either:
- copy:
    src: src_dir
    dest: /tmp/dest/
Or:
- copy:
    src: src_dir
    dest: /tmp/dest
In both cases, Ansible will first create /tmp/dest if it does not exist and will then create /tmp/dest/src_dir and populate it with the contents of src_dir.
However, if you are copying a file, the situation is a little different. If the target destination /tmp/dest does not exist, this playbook will create a file named /tmp/dest:
- copy:
    src: src_file
    dest: /tmp/dest
However, if you add a trailing slash to the destination, then Ansible will first create the directory /tmp/dest and then create the file /tmp/dest/src_file:
- copy:
    src: src_file
    dest: /tmp/dest/
If a directory named /tmp/dest already exists, then both of the above examples will do the same thing (create /tmp/dest/src_file).
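If you want the file-copy behaviour to be unambiguous regardless of whether /tmp/dest already exists, one option (just a sketch, not part of the explanation above) is to create the destination directory explicitly before copying:
- name: Make sure the destination directory exists
  file:
    path: /tmp/dest
    state: directory

- name: Copy the file into the now guaranteed directory
  copy:
    src: src_file
    dest: /tmp/dest/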
I want to transfer the contents of a folder unzipped from a source, say myfolder, to a location, say dest_dir, but everything I try moves/copies/creates myfolder itself inside the dest_dir location.
I tried
command: mv src dest_dir
I also tried unarchiving into the dest_dir location using:
unarchive:
  src: /path/to/myfolder
  dest: dest_dir
  copy: no
become: yes
Apparently, for copy module, I found that remote_src does not support recursive copying yet.
What is the correct way to go about this?
Normally, on my system, I would do mv /path/to/myfolder/* dest_dir, but wildcards throw an error with Ansible.
I'm using Ansible 2.3.2.
The reason you can't do it easily in Ansible is that Ansible was not designed to do it.
Just execute the command directly with the shell module. Your requirement is not idempotent anyway:
- shell: mv /path/to/myfolder/* dest_dir
  become: yes
Pay attention to mv defaults; you might want to add -f to prevent it from prompting for confirmation.
Otherwise, play with the synchronize module, but it adds no value for a "move" operation, just complexity.