Ansible win_unzip module takes far too long - Windows

At our customer's site, the Ansible module win_unzip takes far too long when executed. Our code is:
- name: unzip zip package into C:\server\dlls
  win_unzip:
    src: "{{app_path}}\\app_dll.zip"
    dest: "{{app_path}}\\dlls"
    rm: true
This step takes more than 10 minutes. The zip file is copied with win_copy in the step directly before; the code is here:
- name: copy zip package to C:\server
  win_copy:
    src: "path2zip.zip"
    dest: "{{app_path}}\\app_dll.zip"
The extraction finishes successfully, but it blocks our pipeline for more than 10 minutes, which isn't acceptable.

We reduced the time needed to unzip the package to nearly zero with the help of the PowerShell cmdlet Expand-Archive. Here is the code:
- name: unzip zip package into C:\server\dlls
  win_shell: "Expand-Archive {{app_path}}\\app_dll.zip -DestinationPath {{app_path}}\\dlls"
Our pipeline is now fast again, but it would be nice to have a fast Ansible win_unzip module!
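Note that the Expand-Archive workaround drops the rm: true behavior of the original win_unzip task. A minimal sketch that also deletes the zip after extraction (the -Path and -Force flags and the win_file cleanup task are additions, not part of the original workaround):

```yaml
- name: unzip zip package into C:\server\dlls
  win_shell: "Expand-Archive -Path {{ app_path }}\\app_dll.zip -DestinationPath {{ app_path }}\\dlls -Force"

# replicate rm: true by removing the zip once it has been extracted
- name: remove zip package after extraction
  win_file:
    path: "{{ app_path }}\\app_dll.zip"
    state: absent
```

Keep in mind that win_shell is not idempotent; the task re-extracts on every run unless you guard it with a creates argument or a when condition.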

Related

Ansible "apt" puts unwanted second repository into sources list

On Linux Mint 21 I am trying to install signed packages from external repos.
I have the same problem with 5 different repos.
I can get the PGP key and add the repo to the /etc/apt/sources.list.d/ directory, but when I call apt, it makes another entry in the sources directory (but without the pointer to the key).
This causes the install to fail.
If I remove the second entry, then the package installs correctly.
I have tried several of the parameters to apt, but without success.
Here is an example, to install Chrome:
- name: Add Chrome signing key
  get_url:
    url: https://dl.google.com/linux/linux_signing_key.pub
    dest: /usr/share/keyrings/google-chrome.asc
    mode: '0644'
    force: true

- name: Add Chrome repository
  apt_repository:
    repo: deb [arch=amd64 signed-by=/usr/share/keyrings/google-chrome.asc] https://dl.google.com/linux/chrome/deb/ stable main
    state: present
At this point I correctly have:
/etc/apt/sources.list.d/dl_google_com_linux_chrome_deb.list
which correctly contains:
deb [arch=amd64 signed-by=/usr/share/keyrings/google-chrome.asc] https://dl.google.com/linux/chrome/deb/ stable main
After calling
- name: Add Chrome package
  apt:
    name: "google-chrome-stable"
there is a second list in the sources directory:
/etc/apt/sources.list.d/dl_google_com_linux_chrome_deb.list
/etc/apt/sources.list.d/google-chrome.list
This second list points to the repo, but without the key:
### THIS FILE IS AUTOMATICALLY CONFIGURED ###
# You may comment out this entry, but any other modifications may be lost.
deb [arch=amd64] https://dl.google.com/linux/chrome/deb/ stable main
Like I say, if I remove the second entry, then the package installs correctly.
Question: How do I stop this incorrect list from being added?
Further testing:
I used Ansible to get the key and add the repo to /etc/apt/sources.list.d/ and then I manually called
sudo apt install google-chrome-stable
It correctly installed but then also added the google-chrome.list file.
The same thing happens when I try to install Docker, TeamViewer, VS Code, and 1Password so it isn't just Chrome.
So how do I use Ansible to install signed external packages?
I have experienced such behavior a few times when manually installing deb packages. When the deb package is installed, a sources-list file is automatically created.
I can think of a few possibilities you could test:
Apparently, when Chrome is installed, the file google-chrome.list is created. You could test whether your file survives the installation by naming it google-chrome.list instead of dl_google_com_linux_chrome_deb.list. To do so, add the parameter filename (without the file extension):
- name: Add Chrome repository
  apt_repository:
    repo: deb [arch=amd64 signed-by=/usr/share/keyrings/google-chrome.asc] https://dl.google.com/linux/chrome/deb/ stable main
    state: present
    filename: google-chrome
As long as the existing file is not overwritten, everything should be fine. If it is overwritten, you could try to run the apt_repository task again after the installation.
You could delete the file created by Chrome after the installation. However, given the note in the file ("any other modifications may be lost"), I don't know whether the deletion will persist:
- name: Remove google-chrome.list file
  file:
    path: /etc/apt/sources.list.d/google-chrome.list
    state: absent
You could comment out the deb line in the new file after installation. As the file itself notes, this modification should then hopefully be preserved:
- name: Comment out Chrome's default source.
  lineinfile:
    path: /etc/apt/sources.list.d/google-chrome.list
    regexp: '^(deb .*)$'
    line: '# \g<1>'
    backrefs: yes
However, you should still test the behavior during an update, i.e. whether the file is overwritten or recreated every time. Whether there is a switch to prevent the creation of the sources-list file during installation, I don't know.
Edit: combination of solutions 1 and 3:
- name: Add Chrome repository
  apt_repository:
    repo: deb [arch=amd64 signed-by=/usr/share/keyrings/google-chrome.asc] https://dl.google.com/linux/chrome/deb/ stable main
    state: present
    filename: google-chrome

- name: Add Chrome package
  apt:
    name: "google-chrome-stable"

- name: Comment out Chrome's default source.
  lineinfile:
    path: /etc/apt/sources.list.d/google-chrome.list
    regexp: '^(deb \[arch=amd64\] .*)$'
    line: '# \g<1>'
    backrefs: yes
I have adjusted the regexp so that only the deb line without the signing key matches.

How to archive multiple folders under one folder using Ansible

I'm using the Ansible playbook code below to archive multiple folders under the IBM folder.
Below is my absolute-path directory structure:
/app
|-- /IBM
    |-- /test
    |-- /log
    |-- /common
    |-- /api
I wish to build an archive (gz) that has only the IBM folder, containing only the common and api folders.
Thus, I wrote the below playbook:
- name: Creating the archive
  archive:
    path:
      - /was/IBM/common
      - /was/IBM/api
    dest: /var/backup/mysetup.tar.gz
    exclude_path:
      - /was/IBM/log
      - /was/IBM/test
    format: gz
This gives me the file mysetup.tar.gz.
I want the mysetup.tar.gz file to have a folder called IBM containing the two folders common and api. Thus, I'm expecting the below in mysetup.tar.gz:
IBM
|-- /common
|-- /api
But, the mysetup.tar.gz has no IBM folder but only the common and api folders.
Can you please guide me as to how I can get the archive to have both the folders inside the IBM folder?
You need to include the whole IBM folder and then exclude the paths you do not want:
- name: Creating the archive
  archive:
    path:
      - /was/IBM
    dest: /var/backup/mysetup.tar.gz
    exclude_path:
      - /was/IBM/log
      - /was/IBM/test
    format: gz
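To double-check that the tarball now contains the IBM/ prefix, a quick listing can help (a hypothetical verification step, not part of the original answer; it assumes tar is available on the target host):

```yaml
- name: List archive contents to verify the IBM folder is included
  command: tar -tzf /var/backup/mysetup.tar.gz
  register: archive_listing
  changed_when: false

- name: Show archive entries
  debug:
    var: archive_listing.stdout_lines
```

Every entry in the listing should start with IBM/.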

Ansible `archive` module to archive without compression

I have to archive a bunch of files and want to avoid compression to save time. This is a daily operation that archives 1 TB of data and writes it to a different drive, so "time is of the essence".
Looking at the Ansible archive module documentation, it's not clear how to build the target file without compression.
Currently, my Ansible task looks like this:
- name: Create snapshot tarball
  become: true
  archive:
    path: "{{ snapshots_path.stdout_lines }}"
    dest: "{{backup_location}}{{short_date.stdout}}_snapshot.tgz"
    owner: "{{backup_user}}"
    group: "{{backup_group}}"
Is it possible to speed up this process by telling the module to NOT compress? If yes, how?
Based on this other answer on Super User, tar does not compress files by default; gz, on the other hand, which is the default format of archive, does.
So you could try:
- name: Create snapshot tarball
  become: true
  archive:
    path: "{{ snapshots_path.stdout_lines }}"
    dest: "{{backup_location}}{{short_date.stdout}}_snapshot.tar"
    format: tar
    owner: "{{backup_user}}"
    group: "{{backup_group}}"
This is also backed up by the manual page of tar:
DESCRIPTION
GNU tar is an archiving program designed to store multiple files in a
single file (an archive), and to manipulate such archives. The
archive can be either a regular file or a device (e.g. a tape drive,
hence the name of the program, which stands for tape archiver), which
can be located either on the local or on a remote machine.

Ansible create a zip file backup on windows host

I want to zip a Windows directory into a zip file. The archive module is not working.
For Windows I see a win_unzip module, but I didn't find a win_zip module.
How do I take a backup of an existing folder on Windows?
- name: Backup existing install folder to zip
  archive:
    path:
      - "{{ installdir }}"
    dest: "{{ stragedir }}\\{{ appname }}.zip"
    format: zip
error:
[WARNING]: FATAL ERROR DURING FILE TRANSFER: Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/ansible/plugins/connection/winrm.py", line 276, in _winrm_exec
    self._winrm_send_input(self.protocol, self.shell_id, command_id, data, eof=is_last)
  File "/usr/lib/python2.7/site-packages/ansible/plugins/connection/winrm.py", line 256, in _winrm_send_input
    protocol.send_message(xmltodict.unparse(rq))
  File "/usr/lib/python2.7/site-packages/winrm/protocol.py", line 256, in send_message
    raise WinRMOperationTimeoutError()
WinRMOperationTimeoutError
There is currently no official Ansible module to do zip archiving on Windows. I've created a simple module that acts like win_unzip; as long as PowerShell 4 is installed on the host, this should work for you. The code is here: https://github.com/tjkellie/PublicRepo/blob/master/ansible; feel free to use it until an official module is created.
Add the files to your library:
library/          # put the custom module files here
filter_plugins/
roles/
  library/        # or here
And use the module from a playbook like this:
- name: zip a directory
  win_zip:
    src: C:\Users\Someuser\Logs
    dest: C:\Users\Someuser\OldLogs.zip
    creates: C:\Users\Someuser\OldLogs.zip
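Alternatively, on hosts with PowerShell 5.0 or later you can avoid a custom module entirely and call the built-in Compress-Archive cmdlet, mirroring the Expand-Archive trick from the first question (a sketch using the question's variables; the creates guard keeps the task from re-zipping on every run):

```yaml
- name: Backup existing install folder to zip
  win_shell: "Compress-Archive -Path '{{ installdir }}' -DestinationPath '{{ stragedir }}\\{{ appname }}.zip'"
  args:
    creates: "{{ stragedir }}\\{{ appname }}.zip"
```

Since win_shell runs over WinRM like any other module, this also sidesteps the file-transfer timeout in the traceback above.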

Resolve Local Files by Playbook Directory?

I have the following Ansible role which simply does the following:
Create a temporary directory.
Download Goss, a server testing tool, into that temporary directory.
Upload a main Goss YAML file for the tests.
Upload additional directories for additional included tests.
Here are a couple places where I'm using it:
naftulikay.python-dev
naftulikay.ruby-dev
Specifically, these playbooks upload a local file adjacent to the playbook named goss.yml and a directory goss.d again adjacent to the playbook.
Unfortunately, it seems that Ansible's file-lookup logic has changed recently, causing my tests not to work as expected. My role ships with a default goss.yml, and it appears that when I set goss_file: goss.yml within my playbook, it uploads degoss/files/goss.yml instead of the Goss file adjacent to my playbook.
If I'm passing the name of a file to a role, is there a way to specify that Ansible should look up the file in the context of the playbook or the current working directory?
The actual role logic that is no longer working is this:
# deploy test files including the main and additional test files
- name: deploy test files
  copy: src={{ item }} dest={{ degoss_test_root }} mode=0644 directory_mode=0755 setype=user_tmp_t
  with_items: "{{ [goss_file] + goss_addtl_files + goss_addtl_dirs }}"
  changed_when: degoss_changed_when
I am on Ansible 2.3.2.0 and I can reproduce this across distributions (namely CentOS 7, Ubuntu 14.04, and Ubuntu 16.04).
Ansible searches for relative paths in the role's scope first, then in the playbook's scope.
For example if you want to copy file test.txt in role r1, search order is this:
/path/to/playbook/roles/r1/files/test.txt
/path/to/playbook/roles/r1/test.txt
/path/to/playbook/roles/r1/tasks/files/test.txt
/path/to/playbook/roles/r1/tasks/test.txt
/path/to/playbook/files/test.txt
/path/to/playbook/test.txt
You can inspect your search_path order by calling ansible with ANSIBLE_DEBUG=1.
To answer your question, you have two options:
Use a filename that doesn't exist within the role's scope, e.g.:
goss_file: local_goss.yml
Supply an absolute path, e.g.:
goss_file: '{{ playbook_dir }}/goss.yml'
Ansible doesn't apply search logic if the path is absolute.
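Putting the absolute-path option together, the playbook side could look like this (the role name degoss and the goss_addtl_dirs variable are taken from the question; the hosts value is a placeholder):

```yaml
- hosts: all
  roles:
    - role: degoss
      goss_file: "{{ playbook_dir }}/goss.yml"
      goss_addtl_dirs:
        - "{{ playbook_dir }}/goss.d/"
```

Because the path is already absolute when the copy task runs, the role's files/ directory is never searched.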
