How to archive multiple folders under one folder using Ansible

I'm using the below ansible-playbook code to archive multiple folders under IBM folder.
Below is my absolute path directory structure:
/app
|-- /IBM
    |-- /test
    |-- /log
    |-- /common
    |-- /api
I wish to build an archive (gz) that contains only the IBM folder, which in turn contains only the common and api folders.
Thus, I wrote the below playbook:
- name: Creating the archive
  archive:
    path:
      - /was/IBM/common
      - /was/IBM/api
    dest: /var/backup/mysetup.tar.gz
    exclude_path:
      - /was/IBM/log
      - /was/IBM/test
    format: gz
This gives me the file mysetup.tar.gz.
I want the mysetup.tar.gz file to have a folder called IBM which should have two folders common and api. Thus, I'm expecting the below in the mysetup.tar.gz
IBM
|-- /common
|-- /api
But, the mysetup.tar.gz has no IBM folder but only the common and api folders.
Can you please guide me as to how I can get the archive to have both the folders inside the IBM folder?

You need to include the whole IBM folder and then exclude the paths you do not want:
- name: Creating the archive
  archive:
    path:
      - /was/IBM
    dest: /var/backup/mysetup.tar.gz
    exclude_path:
      - /was/IBM/log
      - /was/IBM/test
    format: gz
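The approach (archive the parent folder, exclude the subfolders you don't want) can be sanity-checked outside Ansible with plain GNU tar, since the member names keep the prefix of the folder you pass in. A minimal sketch with illustrative paths under /tmp:

```shell
# Build a throwaway copy of the layout from the question.
mkdir -p /tmp/was/IBM/common /tmp/was/IBM/api /tmp/was/IBM/log /tmp/was/IBM/test
touch /tmp/was/IBM/common/a.conf /tmp/was/IBM/api/b.conf

# Archive the IBM folder itself, excluding log and test.
# -C changes into /tmp/was so member names start with IBM/.
tar -czf /tmp/mysetup.tar.gz -C /tmp/was --exclude='IBM/log' --exclude='IBM/test' IBM

# List the members: IBM/common/... and IBM/api/... appear, IBM/log does not.
tar -tzf /tmp/mysetup.tar.gz
```

The same logic is what the archive task above relies on: because the path given is /was/IBM rather than its children, every member in the tarball is prefixed with IBM/.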

Related

TeamCity: How to setup proper trigger file wildcards

Say I have a solution containing the following projects in a folder-tree (this is C# but could in principle be anything):
MySolution
- SharedFolderA
  - File1.cs
  - File2.cs
  - SharedProject.csproj
- SharedFolderB
  - File1.cs
  - File2.cs
  - SharedProject.csproj
- Hosts
  - Host1
    - Program.cs
    - Host1.csproj
  - Host2
    - Program.cs
    - Host2.csproj
Now in TeamCity I wish to make the build of each host a separate build. So I will have a project called MySolution containing the following builds:
Build Host1
Build Host2
Now comes the question: for each build I want to set up a VCS trigger rule that triggers the build if the commits contain changes to any of:
files in the root folder
files in one or more of the shared folders
files in the specific host folder - but not the other host folder
Example: Build of Host2 should trigger if any files have been changed in any of the following folders:
MySolution
SharedFolderA
SharedFolderB
Hosts/Host2
How should I set up the file wildcards?
The trigger rule format is documented in the TeamCity reference, but perhaps consider a different approach, since you have different build/host conditions.
Option 1:
Since you have various hosts with different build conditions, a dependency build with wildcards is (IMHO) recommended, i.e. use the Artifact Dependency feature, which allows wildcards in files and folders.
Option 2: However, if you choose to go down your original build-trigger path, you can try this. The rule syntax is:
+|-:[user=VCS_username;][root=VCS_root_id;][comment=VCS_comment_regexp]:Ant_like_wildcard
Once you have set the trigger up via the UI, you can copy and edit the rules like so (shown here for Build Host1):
// all files, or only .cs files, in the root folder
/root/**
/root/*.cs
// files in one or more of the shared folders
/root/SharedFolderA/**
/root/SharedFolderB/*.cs
// files in the specific host folder, but not the other host folder
/root/Hosts/Host1/**
// the minus sign excludes the path
-/root/Hosts/Host2/**
The patterns are Ant-style: ** matches any subpath, * matches any single name, and ? matches a single character.
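Putting the pieces together for one concrete build from the question (Build Host2; paths are relative to the VCS root and illustrative, assuming the solution sits at the root of the VCS root):

```text
+:*
+:SharedFolderA/**
+:SharedFolderB/**
+:Hosts/Host2/**
```

With include rules like these, a commit touching only Hosts/Host1 does not trigger Build Host2, because once any +: rule is present, only changes matching an include rule fire the trigger; `*` covers files directly in the root folder and `**` covers whole subtrees.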

Serverless.yml AWS Lambda in Windows 10: The symlinked folder is not in final package

My folder structure is like this:
Root
|-- common
|   |-- search
|       |-- elastic_client
|       |   |-- elastic_client.py
|       |-- elastic_delete
|           |-- elastic_delete.py
|           |-- requirements.txt
|           |-- elastic_client (symlink -> ../elastic_client)
|-- function1
    |-- elastic_delete (symlink -> ../common/search/elastic_delete)
    |-- serverless.yml
functions:
  elastic-delete:
    handler: elastic_delete.lambda_handler
    module: elastic_delete
    package:
      include:
        - elastic_delete/**
When I do "sls deploy", the elastic_client folder is not deployed (it is missing from the final .zip file), which means elastic_client.py is not getting packed. This issue occurs only on Windows 10; on Mac I don't see it.
I created symlinks with the command mklink.
I don't have a Windows machine, but typically the way I do this is with the packaging functionality of the framework: https://www.serverless.com/framework/docs/providers/google/guide/packaging/
At least on macOS, I just include the directories I want (relative to where the serverless.yml file is), and they are included in the directory of the deployed package.
Hope that helps.
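One way to apply that idea to the layout in the question is to include the real (non-symlink) path of the shared code, so packaging does not depend on symlink resolution on Windows. This is only a sketch based on the question's folder structure; whether the handler then imports elastic_client correctly depends on how the module path is set up:

```yaml
# Sketch: include the real directory instead of relying on the symlink.
functions:
  elastic-delete:
    handler: elastic_delete.lambda_handler
    module: elastic_delete
    package:
      include:
        - elastic_delete/**
        - common/search/elastic_client/**
```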

Ansible `archive` module to archive without compression

I have to archive a bunch of files, and want to avoid compression to save time. This is a daily operation to archive 1 TB of data, and write it to a different drive, so "time is of the essence".
Looking at the Ansible archive module documentation it's not clear how to build up the target file without compression.
Currently, my Ansible task looks like this:
- name: Create snapshot tarball
  become: true
  archive:
    path: "{{ snapshots_path.stdout_lines }}"
    dest: "{{ backup_location }}{{ short_date.stdout }}_snapshot.tgz"
    owner: "{{ backup_user }}"
    group: "{{ backup_group }}"
Is it possible to speed up this process by telling the module to NOT compress? If yes, how?
Based on this other answer on Super User, tar does not compress files by default; gz, on the other hand, which is the default format of archive, does.
So you could try:
- name: Create snapshot tarball
  become: true
  archive:
    path: "{{ snapshots_path.stdout_lines }}"
    dest: "{{ backup_location }}{{ short_date.stdout }}_snapshot.tar"
    format: tar
    owner: "{{ backup_user }}"
    group: "{{ backup_group }}"
This is also backed up by the manual page of tar:
DESCRIPTION
GNU tar is an archiving program designed to store multiple files in a
single file (an archive), and to manipulate such archives. The
archive can be either a regular file or a device (e.g. a tape drive,
hence the name of the program, which stands for tape archiver), which
can be located either on the local or on a remote machine.
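The distinction is also easy to see in Python's tarfile module, which is what the archive module uses under the hood: mode "w" writes a plain (uncompressed) tar, while "w:gz" adds gzip compression. A minimal sketch with throwaway paths:

```python
import os
import tarfile
import tempfile

# Create a sample file to archive.
tmp = tempfile.mkdtemp()
src = os.path.join(tmp, "data.bin")
with open(src, "wb") as f:
    f.write(b"\x00" * 1024)

plain = os.path.join(tmp, "snapshot.tar")   # format: tar
gz = os.path.join(tmp, "snapshot.tgz")      # format: gz (the default)

# "w" = plain tar, no compression; "w:gz" = gzip-compressed tar.
with tarfile.open(plain, "w") as t:
    t.add(src, arcname="data.bin")
with tarfile.open(gz, "w:gz") as t:
    t.add(src, arcname="data.bin")

print(tarfile.is_tarfile(plain))  # True
```

For 1 TB of daily snapshots, skipping the gzip step avoids CPU-bound compression entirely, which is usually the bottleneck when the destination drive is fast.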

Ansible create a zip file backup on windows host

I want to zip a Windows directory into a zip file, but the archive module is not working. For Windows I see a win_unzip module, but I didn't find a win_zip module.
How do I take a backup of an existing folder on Windows?
- name: Backup existing install folder to zip
  archive:
    path:
      - "{{ installdir }}"
    dest: "{{ stragedir }}\\{{ appname }}.zip"
    format: zip
error:
[WARNING]: FATAL ERROR DURING FILE TRANSFER: Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/ansible/plugins/connection/winrm.py", line 276, in _winrm_exec
    self._winrm_send_input(self.protocol, self.shell_id, command_id, data, eof=is_last)
  File "/usr/lib/python2.7/site-packages/ansible/plugins/connection/winrm.py", line 256, in _winrm_send_input
    protocol.send_message(xmltodict.unparse(rq))
  File "/usr/lib/python2.7/site-packages/winrm/protocol.py", line 256, in send_message
    raise WinRMOperationTimeoutError()
WinRMOperationTimeoutError
thanks
SR
There is currently no Ansible module to do zip archiving on Windows. I've created a simple module that works like win_unzip; as long as PowerShell 4 is installed on the host this should work for you. The code is here: https://github.com/tjkellie/PublicRepo/blob/master/ansible, feel free to use it until an official module is created.
Add the files to your library:
library/          # put the custom module files here
filter_plugins/
roles/
  library/        # or here
And use the module from a playbook like this:
- name: zip a directory
  win_zip:
    src: C:\Users\Someuser\Logs
    dest: C:\Users\Someuser\OldLogs.zip
    creates: C:\Users\Someuser\OldLogs.zip
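If the target hosts happen to have PowerShell 5.0 or later, another option that needs no custom module is the built-in Compress-Archive cmdlet driven from win_shell. A sketch, with the same illustrative paths as above (note this assumes PowerShell 5.0+, which the question's PowerShell 4 hosts would not satisfy):

```yaml
- name: Zip a directory with Compress-Archive (requires PowerShell 5.0+)
  win_shell: >
    Compress-Archive -Path C:\Users\Someuser\Logs
    -DestinationPath C:\Users\Someuser\OldLogs.zip -Force
```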

How do I prevent module.run in saltstack if my file hasn't changed?

In the 2014.7 version of SaltStack, the onchanges requisite is available for states. However, that version isn't available for Windows yet, so that's right out.
And unfortunately salt doesn't use the zipfile module to extract zipfiles. So I'm trying to do this:
/path/to/nginx-1.7.4.zip:
  file.managed:
    - source: http://nginx.org/download/nginx-1.7.4.zip
    - source_hash: sha1=747987a475454d7a31d0da852fb9e4a2e80abe1d

extract_nginx:
  module.run:
    - name: extract.zipfile
    - archive: /path/to/nginx-1.7.4.zip
    - path: /path/to/extract
    - require:
      - file: /path/to/nginx-1.7.4.zip
But this tries to extract the files every time. I don't want it to do that; I only want it to extract the file if the .zip file changes, because once it's been extracted it'll be running (I've got something set up to take care of that). And once it's running, I can't overwrite nginx.exe because Windows is awesome like that.
So how can I extract the file only if it's a newer version of nginx?
I would probably use jinja to test for the existence of a file that you know would only exist if the zip file has been extracted.
{% if not salt['file.exists']('/path/to/extract/known_file.txt') %}
extract_nginx:
  module.run:
    - name: extract.zipfile
    - archive: /path/to/nginx-1.7.4.zip
    - path: /path/to/extract
    - require:
      - file: /path/to/nginx-1.7.4.zip
{% endif %}
This will cause the extract_nginx state to not appear in the final rendered sls file if the zip file has been extracted.
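The extraction step the custom extract.zipfile module performs presumably boils down to Python's standard zipfile module (the question notes Salt itself doesn't use it here). A minimal sketch of that step, using a throwaway zip in place of nginx-1.7.4.zip:

```python
import os
import tempfile
import zipfile

tmp = tempfile.mkdtemp()
archive = os.path.join(tmp, "nginx.zip")
dest = os.path.join(tmp, "extract")

# Build a small zip to stand in for nginx-1.7.4.zip (illustrative content).
with zipfile.ZipFile(archive, "w") as z:
    z.writestr("nginx/README", "hello")

# The extraction the module.run state delegates to:
with zipfile.ZipFile(archive) as z:
    z.extractall(dest)

print(os.path.exists(os.path.join(dest, "nginx", "README")))  # True
```

The jinja guard above then works because a file that only exists after extraction (like a known file inside the extracted tree) doubles as a cheap "already done" marker.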
