Iterating over stdout - ansible

I am writing a playbook to locate a string pattern in a sequence of files. If I run my utility through the command module it will generate one or more strings on STDOUT. To run this across a number of systems I would like to run the command with_items:
- command: "findstring {{ item }}"
with_items:
- "string1"
- "string2"
register: found
failed_when: found.rc >= 2
And then iterate over the result to post process the info:
- name: Print strings we found
  debug:
    var: "{{ item }}"
  with_items: found.results
Is there something equivalent to loop.index that can be used with "results" in the task above? This would allow me to do something like {{ item[INDEX].stdout }} to get the strings that were generated. I haven't been able to find an answer in the official documentation so I thought I would post here to see what the gurus think.

If you need to iterate over every line from all commands, use:
- debug:
    msg: "Do smth for line {{ item }}"
  with_items: "{{ found | json_query('results[].stdout_lines[]') }}"
This will take every element from found.results, then every element from each element's stdout_lines.
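A roughly equivalent formulation that avoids json_query (and with it the jmespath dependency on the controller) is a plain map/flatten chain; a minimal sketch, assuming the same registered found variable:

- debug:
    msg: "Do smth for line {{ item }}"
  # map() pulls each result's stdout_lines list, flatten merges them into one list
  loop: "{{ found.results | map(attribute='stdout_lines') | flatten }}"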

For me this works in ansible [core 2.11.6]:
- name: Command which outputs multiple lines
  ansible.builtin.command:
    cmd: ls -l /
  register: _ls_cmd
  changed_when: no

- name: Debug to show each line
  ansible.builtin.debug:
    msg: "ITEM: {{ item }}"
  with_items: "{{ _ls_cmd.stdout_lines }}"

Related

Having a problem defining the condition, as it is based on a different loop than the task

I am trying to make a playbook that will check if something exists and, depending on the results, execute a command.
I have simplified the problem, but here is the gist of it:
I have a list, called "list":
list:
  - sample1
  - sample2
  - sample3
  - sample4
I will start by checking if a directory with this name exists.
- name: Status
  shell: ls -l | grep {{ item }} | grep -v grep | wc -l
  loop: "{{ list }}"
  register: status
Then I'll determine whether the folder exists or not (not sure if I need this step...):
- debug:
    msg: "{{ item.item }} exists"
  loop: "{{ status.results }}"
  when: item.stdout != "0"
  register: check

- debug:
    msg: "{{ item.item }} does not exist"
  loop: "{{ status.results }}"
  when: item.stdout == "0"
  register: check
The next step is where I am stuck... I can't really find the right syntax or way to do this. Anyway, I want to check whether my folder exists or not; if it does not, I want to create it.
- name: creation
  shell: mkdir {{ item }}
  loop: "{{ list }}"
  when: check.results.item.stdout != "0"
As it needs to check every result from the list, my condition is based on "check.results" and not on the "list" defined in the loop.
I don't really know whether this can be written like that.
This is a very common misunderstanding. Ansible is about describing the expected state of the remote system and is generally idempotent for most of its modules. Running the same task an infinite number of times will lead to the same result on the target. In other words, don't check if a directory exists to later give the order to create it. Just describe the expected state: the directory must exist.
This can be done in a single task with the ansible.builtin.file module
- name: Make sure needed directories exist
  ansible.builtin.file:
    path: "{{ item }}"
    state: directory
  loop: "{{ list }}"
As the task name in my example suggests, this will create the directories that do not exist (reporting changed) and leave the existing ones alone (reporting ok).
Tip: whenever you are about to use shell, hold on a minute and check the documentation for a module doing the job. In most cases there is one. For example, you can use the ansible.builtin.find module rather than looking for files with the shell.
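A minimal sketch of that tip applied to this question; the /data path is an assumption, since the original shell task ran ls -l in an unspecified working directory:

- name: Find which of the expected directories already exist
  ansible.builtin.find:
    paths: /data            # hypothetical location of the sample directories
    patterns: "{{ list }}"  # match the names from the 'list' variable
    file_type: directory
  register: found_dirs

- name: Show the directories that were found
  ansible.builtin.debug:
    msg: "{{ found_dirs.files | map(attribute='path') | list }}"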
In case you still want to apply the bad solution, the answer was already in the previous version of my answer and is also (again) in your debug task. You just have to apply the same recipe.
- name: creation
  shell: mkdir {{ item.item }}
  loop: "{{ status.results }}"
  when: item.stdout != "0"
Note that this would work with the proper module as well, although it does not really make sense:
- name: Make sure missing directories are created
  ansible.builtin.file:
    path: "{{ item.item }}"
    state: directory
  loop: "{{ status.results }}"
  when: item.stdout != "0"
Both of these examples of course require running your previous task first (which should be refactored to use the find module, as stated earlier).
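Tying the find tip back into this, the creation could also be driven by the find result instead of the shell output. A sketch reusing the hypothetical found_dirs register and /data path from the earlier example; with ansible.builtin.file the filtering is redundant and only illustrates the pattern:

- name: Create only the directories that find did not report
  ansible.builtin.file:
    path: "/data/{{ item }}"
    state: directory
  # difference() drops the names whose directories were already found
  loop: "{{ list | difference(found_dirs.files | map(attribute='path') | map('basename') | list) }}"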

How to grab a value from a war file without extracting it preferably, or extracting the minimum amount

I have a .war file called app.war, which contains a test.properties file which has a line called appName: Blackberry. The test.properties file could be anywhere in the WAR file, no specific directory.
What is the most efficient way for me to find the test.properties file, and then grab the value Blackberry from appName: Blackberry, which is one of the lines in the file?
The file looks like this
mainInfo: deployed
app.Name: Blackberry
testRun: success
I have heard about jar xf app.war, but not sure how to approach it. I am very new to Ansible, any help would be really appreciated :).
Regarding your question "How to grab a value from ...", I've created a simple test logic.
---
- hosts: test
  become: no
  gather_facts: no

  vars:
    WAR_FILE: "app.war"
    PROPERTY_FILE: "test.properties"

  tasks:

  - name: Gather full path of '{{ PROPERTY_FILE }}', if there is any
    shell:
      cmd: zipinfo -1 {{ WAR_FILE }} | grep {{ PROPERTY_FILE }}
    register: result
    check_mode: false
    changed_when: false
    failed_when: result.rc != 0

  # Consider using the unarchive module rather than running 'unzip'
  - name: Gather content of '{{ PROPERTY_FILE }}'
    shell:
      cmd: unzip -qq -c {{ WAR_FILE }} {{ result.stdout }}
      warn: false
    register: properties
    check_mode: false
    changed_when: false
    failed_when: properties.rc != 0

  - name: Show content
    debug:
      msg: "{{ properties.stdout }}"
    check_mode: false
...
Resulting in an output of
TASK [Show content] ****
ok: [test.example.com] =>
  msg: |-
    mainInfo: deployed
    app.Name: Blackberry
    testRun: success
You have to adapt this to your environment and requirements.
Thanks to
View list of files in ZIP archive on Linux
How can I run unzip silently in terminal
How to unpackage and repackage a WAR file
Linux command for extracting WAR file
Please take note that the result properties.stdout_lines is a list of strings which contains the key-value pairs.
- name: Debug how to get value of key
  debug:
    var: item | type_debug
  loop_control:
    label: "{{ ansible_loop.index }}"
    extended: yes
  loop: "{{ properties.stdout_lines }}"
It is recommended to read them into Ansible variables. You will find plenty of examples of how to do that here on SO via a search.
- name: Debug how to get value of key
  debug:
    var: item | from_yaml | type_debug
  loop: "{{ properties.stdout_lines }}"
Thanks to
Ansible - Check variable type
Using filters to manipulate data - Formatting data: YAML and JSON
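To go one step further and actually extract the value, one option is to parse each line as YAML and merge the results into a single dictionary; a sketch (the key app.Name comes from the example output above, the variable name properties_dict is made up):

- name: Combine the property lines into one dictionary
  set_fact:
    properties_dict: "{{ properties_dict | default({}) | combine(item | from_yaml) }}"
  loop: "{{ properties.stdout_lines }}"

- name: Show the value of app.Name
  debug:
    msg: "{{ properties_dict['app.Name'] }}"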

Ansible files lookup in directory on remote hosts

I am trying to execute the scripts from a directory in a particular order, with an interval of 90 seconds between iterations. For instance, I want to start the scripts under /opt/scripts/, which contains SK-1.sh, SK-2.sh, PX.sh, N1.sh and N2.sh. I want to start the SK scripts first, then PX, and finally N.
- name: execute scripts A
  shell: "ls -ltr {{ item }}"
  args:
    chdir: "/opt/scripts/"
  with_fileglob:
    - "/opt/scripts/*SK*.sh"
    - "/opt/scripts/*PX*.sh"
    - "/opt/scripts/*N*.sh"
  loop_control:
    pause: 90
But fileglob looks up those scripts only on the controller host (Ansible Tower), hence it does not work. Is there some kind of filter I can use that works on the remote hosts rather than on the controller host?
I am trying to avoid listing the files in a previous task and registering the items to process, since that turns this into a two-step job.
Q: "Start scripts that are SK and then PX and finally N."
A: The Ansible module find neither keeps the order of the patterns nor sorts the results. You'll have to iterate over the list of globs on your own and sort the results to achieve the required order. For example, create a file with the tasks
shell> cat find_and_execute.yml
- find:
    paths: /tmp
    patterns: "{{ eitem }}"
  register: result

- debug:
    msg: "{{ '%H:%M:%S'|strftime }} Execute {{ item }}"
  loop: "{{ result.files|map(attribute='path')|list|sort }}"
  loop_control:
    pause: 3
and use it in a playbook. For example
shell> cat pb.yml
- hosts: test_11
  tasks:
    - include_tasks: find_and_execute.yml
      loop:
        - 'SK*.sh'
        - 'PX*.sh'
        - 'N*.sh'
      loop_control:
        loop_var: eitem
gives
msg: 18:19:14 Execute /tmp/SK-1.sh
msg: 18:19:17 Execute /tmp/SK-2.sh
msg: 18:19:21 Execute /tmp/PX.sh
msg: 18:19:24 Execute /tmp/N1.sh
msg: 18:19:27 Execute /tmp/N2.sh
Fit the parameters to your needs. For comparison, running a single find with all the patterns at once does not preserve the required order:
- find:
    paths: /tmp
    patterns:
      - 'SK*.sh'
      - 'PX*.sh'
      - 'N*.sh'
  register: result

- debug:
    msg: "{{ result.files|map(attribute='path')|list }}"
gives
msg:
- /tmp/SK-2.sh
- /tmp/SK-1.sh
- /tmp/N2.sh
- /tmp/N1.sh
- /tmp/PX.sh
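To actually run the scripts rather than just print them, the debug task in find_and_execute.yml could be replaced along these lines (a sketch; it assumes the scripts are executable on the remote hosts and reuses the 90-second interval from the question):

- name: Run the matching scripts in sorted order
  ansible.builtin.command:
    cmd: "{{ item }}"
  loop: "{{ result.files|map(attribute='path')|list|sort }}"
  loop_control:
    pause: 90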

Compare two files with Ansible

I am struggling to find out how to compare two files. I have tried several methods, including this one, which errors out with:
FAILED! => {"msg": "The module diff was not found in configured module paths. Additionally, core modules are missing. If this is a
checkout, run 'git pull --rebase' to correct this problem."}
Is this the best practice to compare two files and ensure the contents are the same or is there a better way?
Thanks in advance.
My playbook:
- name: Find out if cluster management protocol is in use
  ios_command:
    commands:
      - show running-config | include ^line vty|transport input
  register: showcmpstatus

- local_action: copy content="{{ showcmpstatus.stdout_lines[0] }}" dest=/poc/files/{{ inventory_hostname }}.result

- local_action: diff /poc/files/{{ inventory_hostname }}.result /poc/files/transport.results
  failed_when: "diff.rc > 1"
  register: diff

- name: debug output
  debug: msg="{{ diff.stdout }}"
Why not use stat to compare the two files?
Just a simple example:
- name: Get cksum of my First file
  stat:
    path: "/poc/files/{{ inventory_hostname }}.result"
  register: myfirstfile

- name: Current SHA1
  set_fact:
    mf1sha1: "{{ myfirstfile.stat.checksum }}"

- name: Get cksum of my Second File (If needed you can jump this)
  stat:
    path: "/poc/files/transport.results"
  register: mysecondfile

- name: Current SHA1
  set_fact:
    mf2sha1: "{{ mysecondfile.stat.checksum }}"

- name: Compilation Changed
  debug:
    msg: "File Compare"
  failed_when: mf2sha1 != mf1sha1
your "diff" task is missing the shell keyword, Ansible thinks you want to use the diff module instead.
also i think diff (as name of the variable to register the tasks result) leads ansible to confusion, change to diff_result or something.
code (example):
tasks:
  - local_action: shell diff /etc/hosts /etc/fstab
    failed_when: "diff_output.rc > 1"
    register: diff_output

  - debug:
      var: diff_output
hope it helps
From Ansible User Guide: https://docs.ansible.com/ansible/latest/user_guide/playbooks_error_handling.html
- name: Fail task when both files are identical
  ansible.builtin.raw: diff foo/file1 bar/file2
  register: diff_cmd
  failed_when: diff_cmd.rc == 0 or diff_cmd.rc >= 2
A slightly shortened version of imjoseangel's answer which avoids setting facts:
vars:
  file_1: cats.txt
  file_2: dogs.txt

tasks:
  - name: register the first file
    stat:
      path: "{{ file_1 }}"
      checksum: sha1
      get_checksum: yes
    register: file_1_checksum

  - name: register the second file
    stat:
      path: "{{ file_2 }}"
      checksum: sha1
      get_checksum: yes
    register: file_2_checksum

  - name: Check if the files are the same
    debug: msg="The {{ file_1 }} and {{ file_2 }} are identical"
    failed_when: file_1_checksum.stat.checksum != file_2_checksum.stat.checksum
    ignore_errors: true
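If an explicit pass/fail is wanted instead of a failing debug task, the same comparison can be expressed with assert (a sketch reusing the registered variables above):

- name: Assert that both files have the same content
  ansible.builtin.assert:
    that:
      - file_1_checksum.stat.checksum == file_2_checksum.stat.checksum
    fail_msg: "{{ file_1 }} and {{ file_2 }} differ"
    success_msg: "{{ file_1 }} and {{ file_2 }} are identical"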

Ansible: find file and loop over paths

Using an Ansible role. I would like to loop over a list of file paths, but I get an error:
template error while templating string: unexpected '/'.
String: {{/home/xyz/download.log}}
This is the main.yml for the "list_log_files" role:
- name: "find logs"
find:
paths: /
patterns: 'download.log'
recurse: yes
register: find_logs
- name: "list log files"
debug: var="{{ item.path }}"
with_items: "{{ find_logs.files }}"
The find returns an array "files", where each element is a dictionary. The dictionary contains a path entry, which is what I am interested in.
I have faced the same issue: I also wanted the list of paths of the found files, in my case to insert a line into each of them. I use a Jinja2 filter:
- name: fetch files
  find: paths=/var/tmp/ patterns='*.log'
  register: find_logs

- name: insert line
  lineinfile: dest={{ item }} line='my line' insertafter=EOF
  with_items: "{{ find_logs.files | map(attribute='path') | list }}"

{{ find_logs.files | map(attribute='path') | list }}
Helpful Link
map()
Applies a filter on a sequence of objects or looks up an attribute.
This is useful when dealing with lists of objects but you are really
only interested in a certain value of it.
The correct syntax for the var argument of the debug module (with the value for your use case) is:
In Ansible notation:
debug: var=item.path
In YAML notation:
debug:
  var: item.path
Ansible modules' usage is fairly well documented and the examples cover most users' needs. This is also true for the debug module, so refer to the examples to check the basic syntax.
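Putting both answers together, the original task could look like this (a sketch; the loop_control label only stops Ansible from printing the whole file dictionary as each item's label):

- name: "list log files"
  debug:
    var: item.path
  loop: "{{ find_logs.files }}"
  loop_control:
    label: "{{ item.path }}"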
