Ansible: create a zip file backup on a Windows host

I want to zip a Windows directory into a zip file, but the archive module is not working.
For Windows I see the win_unzip module, but I didn't find a win_zip module.
How do I take a backup of an existing folder on Windows?
    - name: Backup existing install folder to zip
      archive:
        path:
          - "{{ installdir }}"
        dest: "{{ stragedir }}\\{{ appname }}.zip"
        format: zip
Error:

    [WARNING]: FATAL ERROR DURING FILE TRANSFER: Traceback (most recent call last):
      File "/usr/lib/python2.7/site-packages/ansible/plugins/connection/winrm.py", line 276, in _winrm_exec
        self._winrm_send_input(self.protocol, self.shell_id, command_id, data, eof=is_last)
      File "/usr/lib/python2.7/site-packages/ansible/plugins/connection/winrm.py", line 256, in _winrm_send_input
        protocol.send_message(xmltodict.unparse(rq))
      File "/usr/lib/python2.7/site-packages/winrm/protocol.py", line 256, in send_message
        raise WinRMOperationTimeoutError()
    WinRMOperationTimeoutError
Thanks,
SR

There is currently no Ansible module to do zip archiving on Windows. I've created a simple module that works like win_unzip, which I use; as long as PowerShell 4 is installed on the host, this should work for you. The code is here: https://github.com/tjkellie/PublicRepo/blob/master/ansible. Feel free to use it until an official module is created.
Add the files to your library:

    library/          # put the custom module files here
    filter_plugins/
    roles/
        library/      # or here, inside a role
And use the module from a playbook like this:
    - name: zip a directory
      win_zip:
        src: C:\Users\Someuser\Logs
        dest: C:\Users\Someuser\OldLogs.zip
        creates: C:\Users\Someuser\OldLogs.zip
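
If you'd rather not carry a custom module, another workaround is to call PowerShell's built-in Compress-Archive cmdlet (available in PowerShell 5.0 and later) through win_shell. This is a minimal sketch reusing the variables from the question, not an official module; the creates argument keeps the task from re-zipping on every run:

    # Sketch: assumes PowerShell 5+ on the target, where Compress-Archive is built in
    - name: Backup existing install folder to zip via Compress-Archive
      win_shell: "Compress-Archive -Path '{{ installdir }}' -DestinationPath '{{ stragedir }}\\{{ appname }}.zip'"
      args:
        creates: "{{ stragedir }}\\{{ appname }}.zip"  # skip if the backup already exists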

Related

How to archive multiple folders under one folder using Ansible

I'm using the below ansible-playbook code to archive multiple folders under the IBM folder.
Below is my absolute-path directory structure:

    /app
    |-- /IBM
        |-- /test
        |-- /log
        |-- /common
        |-- /api

I wish to build an archive (gz) that has only the IBM folder, containing only the common and api folders. Thus, I wrote the below playbook:
    - name: Creating the archive
      archive:
        path:
          - /was/IBM/common
          - /was/IBM/api
        dest: /var/backup/mysetup.tar.gz
        exclude_path:
          - /was/IBM/log
          - /was/IBM/test
        format: gz
This gives me the file mysetup.tar.gz.
I want the mysetup.tar.gz file to have a folder called IBM, which should contain the two folders common and api. Thus, I'm expecting the below in mysetup.tar.gz:

    IBM
    |-- /common
    |-- /api

But mysetup.tar.gz has no IBM folder, only the common and api folders.
Can you please guide me on how to get the archive to have both folders inside the IBM folder?
You need to include the whole IBM folder and then exclude the paths you do not want:
    - name: Creating the archive
      archive:
        path:
          - /was/IBM
        dest: /var/backup/mysetup.tar.gz
        exclude_path:
          - /was/IBM/log
          - /was/IBM/test
        format: gz
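
To double-check the layout, you can list the archive contents in a follow-up task. This is just an illustrative check (the registered variable name is made up); the output should now show every entry prefixed with IBM/:

    - name: List archive contents (illustrative check)
      command: tar -tzf /var/backup/mysetup.tar.gz
      register: archive_listing  # hypothetical variable name
      changed_when: false

    - name: Show the archived paths
      debug:
        var: archive_listing.stdout_lines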

Ansible `archive` module to archive without compression

I have to archive a bunch of files and want to avoid compression to save time. This is a daily operation that archives 1 TB of data and writes it to a different drive, so "time is of the essence".
Looking at the Ansible archive module documentation, it's not clear how to build the target file without compression.
Currently, my Ansible task looks like this:
    - name: Create snapshot tarball
      become: true
      archive:
        path: "{{ snapshots_path.stdout_lines }}"
        dest: "{{ backup_location }}{{ short_date.stdout }}_snapshot.tgz"
        owner: "{{ backup_user }}"
        group: "{{ backup_group }}"
Is it possible to speed up this process by telling the module to NOT compress? If yes, how?
Based on this other answer on Super User, tar does not compress files by default; gz, on the other hand, which is the default format of archive, does.
So you could try:
    - name: Create snapshot tarball
      become: true
      archive:
        path: "{{ snapshots_path.stdout_lines }}"
        dest: "{{ backup_location }}{{ short_date.stdout }}_snapshot.tar"
        format: tar
        owner: "{{ backup_user }}"
        group: "{{ backup_group }}"
This is also backed up by the manual page of tar:

    DESCRIPTION
        GNU tar is an archiving program designed to store multiple files in a
        single file (an archive), and to manipulate such archives. The
        archive can be either a regular file or a device (e.g. a tape drive,
        hence the name of the program, which stands for tape archiver), which
        can be located either on the local or on a remote machine.

Resolve Local Files by Playbook Directory?

I have the following Ansible role, which simply does the following:

1. Create a temporary directory.
2. Download Goss, a server testing tool, into that temporary directory.
3. Upload a main Goss YAML file for the tests.
4. Upload additional directories for additional included tests.
Here are a couple of places where I'm using it:

- naftulikay.python-dev
- naftulikay.ruby-dev

Specifically, these playbooks upload a local file named goss.yml and a directory goss.d, both adjacent to the playbook.
Unfortunately, it seems that Ansible's lookup logic has changed recently, causing my tests to not work as expected. My role ships with a default goss.yml, and it appears that when I set goss_file: goss.yml within my playbook, it uploads degoss/files/goss.yml instead of the goss.yml adjacent to my playbook.
If I'm passing the name of a file to a role, is there a way to specify that Ansible should look up the file in the context of the playbook or the current working directory?
The actual role logic that is no longer working is this:
    # deploy test files including the main and additional test files
    - name: deploy test files
      copy: src={{ item }} dest={{ degoss_test_root }} mode=0644 directory_mode=0755 setype=user_tmp_t
      with_items: "{{ [goss_file] + goss_addtl_files + goss_addtl_dirs }}"
      changed_when: degoss_changed_when
I am on Ansible 2.3.2.0 and I can reproduce this across distributions (namely CentOS 7, Ubuntu 14.04, and Ubuntu 16.04).
Ansible searches for relative paths in the role's scope first, then in the playbook's scope.
For example, if you want to copy the file test.txt in role r1, the search order is:

    /path/to/playbook/roles/r1/files/test.txt
    /path/to/playbook/roles/r1/test.txt
    /path/to/playbook/roles/r1/tasks/files/test.txt
    /path/to/playbook/roles/r1/tasks/test.txt
    /path/to/playbook/files/test.txt
    /path/to/playbook/test.txt
You can inspect your search_path order by calling ansible with ANSIBLE_DEBUG=1.
To answer your question, you have two options:

1. Use a filename that doesn't exist within the role's scope, like:

       goss_file: local_goss.yml

2. Supply an absolute path. For example:

       goss_file: '{{ playbook_dir }}/goss.yml'
Ansible doesn't apply search logic if the path is absolute.
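
For illustration, a play that forces the role to read the file next to the playbook could look like this (the role name and variable come from the question; the hosts pattern is made up):

    # Hypothetical playbook; "all" is a placeholder host pattern
    - hosts: all
      roles:
        - role: degoss
          goss_file: "{{ playbook_dir }}/goss.yml"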

Ansible win_unzip module takes far too long

At our customer, the Ansible module win_unzip takes far too long to execute. Our code is:
    - name: unzip zip package into C:\server\dlls
      win_unzip:
        src: "{{ app_path }}\\app_dll.zip"
        dest: "{{ app_path }}\\dlls"
        rm: true
This step takes more than 10 minutes. The zip file is copied with win_copy in the step directly before; the code is here:
    - name: copy zip package to C:\server
      win_copy:
        src: "path2zip.zip"
        dest: "{{ app_path }}\\app_dll.zip"
The extraction finishes successfully, but it blocks our pipeline for more than 10 minutes, which isn't acceptable.
We reduced the time needed to unzip the package to nearly zero with the help of the PowerShell cmdlet Expand-Archive. Here is the code:
    - name: unzip zip package into C:\server\dlls
      win_shell: "Expand-Archive {{ app_path }}\\app_dll.zip -DestinationPath {{ app_path }}\\dlls"
Our pipeline is now fast again, but it would be nice to have a fast Ansible win_unzip module!
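
One caveat with the win_shell call above: the unquoted paths break if app_path ever contains spaces, and a re-run fails once the destination files already exist. A slightly hardened sketch of the same command, with quoted parameters and -Force added to allow overwriting:

    - name: unzip zip package into C:\server\dlls (hardened sketch)
      win_shell: "Expand-Archive -Path '{{ app_path }}\\app_dll.zip' -DestinationPath '{{ app_path }}\\dlls' -Force"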

How do I prevent module.run in saltstack if my file hasn't changed?

In the 2014.7 release of SaltStack, the onchanges requisite is available for states. However, that release isn't available for Windows yet, so that's right out.
And unfortunately, Salt doesn't use the zipfile module to extract zip files. So I'm trying to do this:
    /path/to/nginx-1.7.4.zip:
      file.managed:
        - source: http://nginx.org/download/nginx-1.7.4.zip
        - source_hash: sha1=747987a475454d7a31d0da852fb9e4a2e80abe1d

    extract_nginx:
      module.run:
        - name: extract.zipfile
        - archive: /path/to/nginx-1.7.4.zip
        - path: /path/to/extract
        - require:
          - file: /path/to/nginx-1.7.4.zip
But this tries to extract the files every time. I don't want that; I only want it to extract when the .zip file changes, because once it's been extracted, nginx will be running (I've got something set up to take care of that). And once it's running, I can't overwrite nginx.exe, because Windows is awesome like that.
So how can I extract the files only if it's a newer version of nginx?
I would probably use jinja to test for the existence of a file that you know would only exist if the zip file has been extracted.
    {% if not salt['file.exists']('/path/to/extract/known_file.txt') %}
    extract_nginx:
      module.run:
        - name: extract.zipfile
        - archive: /path/to/nginx-1.7.4.zip
        - path: /path/to/extract
        - require:
          - file: /path/to/nginx-1.7.4.zip
    {% endif %}
This will cause the extract_nginx state to not appear in the final rendered sls file if the zip file has been extracted.
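
For reference, on Salt releases where onchanges is available (2014.7 and later), the same intent can be expressed without the Jinja guard. This is a sketch assuming your platform has that release:

    extract_nginx:
      module.run:
        - name: extract.zipfile
        - archive: /path/to/nginx-1.7.4.zip
        - path: /path/to/extract
        - onchanges:
          - file: /path/to/nginx-1.7.4.zip

With onchanges, the extraction runs only when the file.managed state above reports a change to the zip file.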
