How to get all indices of output array in Ansible

I'm gathering info on SSL certs on servers (looking for expiration date) using the find module.
- name: Find certs on server
find:
path: /etc/ssl/custom/certs
file_type: file
patterns: "*.crt"
recurse: yes
register: find_result
- debug:
var: find_result
The results are:
ok: [server00] => {
"find_result": {
"changed": false,
"examined": 5,
"failed": false,
"files": [
{
"atime": 1622749788.1552677,
"ctime": 1622744497.4393551,
"dev": 2050,
"gid": 0,
"gr_name": "root",
"inode": 19531534,
"isblk": false,
"ischr": false,
"isdir": false,
"isfifo": false,
"isgid": false,
"islnk": false,
"isreg": true,
"issock": false,
"isuid": false,
"mode": "0644",
"mtime": 1622744497.4393551,
"nlink": 1,
"path": "/etc/ssl/custom/certs/somewebsite0.com.crt",
"pw_name": "root",
"rgrp": true,
"roth": true,
"rusr": true,
"size": 1879,
"uid": 0,
"wgrp": false,
"woth": false,
"wusr": true,
"xgrp": false,
"xoth": false,
"xusr": false
},
{
"atime": 1622719627.2477663,
"ctime": 1616545902.3681087,
"dev": 2050,
"gid": 0,
"gr_name": "root",
"inode": 19531253,
"isblk": false,
"ischr": false,
"isdir": false,
"isfifo": false,
"isgid": false,
"islnk": false,
"isreg": true,
"issock": false,
"isuid": false,
"mode": "0644",
"mtime": 1613754568.0,
"nlink": 1,
"path": "/etc/ssl/custom/certs/somewebsite1.com.crt",
"pw_name": "root",
"rgrp": true,
"roth": true,
"rusr": true,
"size": 2081,
"uid": 0,
"wgrp": false,
"woth": false,
"wusr": true,
"xgrp": false,
"xoth": false,
"xusr": false
},
{
"atime": 1622719627.2197664,
"ctime": 1616545902.3721087,
"dev": 2050,
"gid": 0,
"gr_name": "root",
"inode": 19535012,
"isblk": false,
"ischr": false,
"isdir": false,
"isfifo": false,
"isgid": false,
"islnk": false,
"isreg": true,
"issock": false,
"isuid": false,
"mode": "0644",
"mtime": 1601653231.0,
"nlink": 1,
"path": "/etc/ssl/custom/certs/somewebsite2.com.crt",
"pw_name": "root",
"rgrp": true,
"roth": true,
"rusr": true,
"size": 2269,
"uid": 0,
"wgrp": false,
"woth": false,
"wusr": true,
"xgrp": false,
"xoth": false,
"xusr": false
}
],
"matched": 3,
"msg": ""
}
}
I need the path portion of the output ("path": "/etc/ssl/custom/certs/somewebsite1.com.crt"), but if I use find_result.files[0].path it only gives me a single result per host, when I need every *.crt file.
How can I access each index? I'm using the shell module to perform an action on the .crt file, but again it only grabs the first one because of the [0] index, like so:
- name: Check expiration
shell: "cat {{ find_result.files[0].path }} | openssl x509 -noout -enddate"
register: date
- debug:
var: date.stdout_lines
ok: [server00] => {
"date.stdout_lines": [
"notAfter=Apr 2 19:50:38 2018 GMT"
]
}

Here is an example playbook based on it:
- hosts: localhost
tasks:
- name: Find certs on server
find:
path: /etc/ssl/custom/certs
file_type: file
patterns: "*.crt"
recurse: yes
register: find_result
- debug:
var: find_result
- name: Play with the data just to demonstrate
set_fact:
IRGeekSauce_list: "{{ (IRGeekSauce_list | default([])) + [item.path] }}" # <-- add each item's path to a custom list
with_items: '{{ find_result.files }}' # <-- here we get the files as a list.
- name: your list
debug:
msg: '{{ IRGeekSauce_list }}'
- include_tasks: anothertasklist.yml
loop: '{{ IRGeekSauce_list }}'
loop_control:
loop_var: singlepathvariable
And then you have another file, 'anothertasklist.yml', containing just the tasks:
- name: hello
debug:
msg: 'You are now in another playbook'
- name:
debug:
msg: 'Woho: {{ singlepathvariable }}'
- name:
openssl_certificate_info:
howeverthatmoduleworks...
You should also be able to pass find_result.files directly as the loop, and then use the loop variable singlepathvariable (rename it if you like) to pull out the path as {{ singlepathvariable.path }}.
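Applied to the original cert-expiry question, a minimal untested sketch that loops over every matched file directly might look like this (cert_dates is just an arbitrary register name; the openssl command is the asker's check, rewritten to read the file via -in):
- name: Check expiration of every cert
  shell: "openssl x509 -noout -enddate -in {{ item.path }}" # item.path is one matched .crt file
  register: cert_dates
  loop: "{{ find_result.files }}"
- debug:
    msg: "{{ item.item.path }}: {{ item.stdout }}" # item.item is the original loop item
  loop: "{{ cert_dates.results }}"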

Related

Find/list files with specific extension and rename them dynamically using ansible playbook

I'm trying to rename jar files in a path for deployment using Ansible. I have tried multiple ways, but it's still failing.
For example, there are multiple jar files in the path /appdata/tomcat/lib/jars/
We have multiple jar files with version numbers like below:
app_deploy-1.1.1.jar
app_deploy-1.1.2_old_1.jar
app_deploy-1.1.2.jar_before
app_deploy-1.1.2_old_2022.jar
app_deploy-1.1.2_before.jar
I have to move all files with the .jar extension, appending a date via the custom date variable {deploy_date}, like this: app_deploy-1.1.2_before.jar_{deploy_date}
I have method 1 as below:
- name: Get the name of the current jar files
shell: ls -l /appdata/tomcat/lib/jars/ | grep .*.jar
register: jar_files_list
- debug:
msg: "{{ jar_files_list }}"
- name: Get the date
shell: date +%Y%m%d%H%M%S
register: timestamp
when: jar_files_list.stdout != ''
- name: Rename the current jar file
file:
src: /appdata/tomcat/lib/jars/{{ jar_files_list }}
dest: /appdata/tomcat/lib/jars/{{ jar_files_list }}_{{ ansible_date_time.date }}_backup
when: jar_files_list.stdout != ''
I have method 2 as below:
- name: Rename the current jar files
shell: mv /appdata/tomcat/lib/jars/*.jar {{ repo_name }}-*.jar_backup_{{ ansible_date_time.date }}
- name: Move current files to backup directory
shell: mv /appdata/tomcat/lib/jars/{{ repo_name }}-.*.jar /appdata/tomcat/lib/jars//backup_jars/
Neither of these solutions works. Can someone help me with a solution?
The find module can be used to find and generate a list of files located on a remote host based on a pattern:
- name: Get jars in {{ jars_path }}
find:
paths: "{{ jars_path }}"
file_type: file
patterns: '*.jar'
register: jars_list
The output will be a dictionary whose files key contains only the *.jar files in the specified path:
TASK [Get jars in /appdata/tomcat/lib/jars] ************************
ok: [test-001] => {
"changed": false,
"examined": 4,
"files": [
{
"atime": 1661249640.3583002,
"ctime": 1661249640.3583002,
"dev": 64768,
"gid": 0,
"gr_name": "root",
"inode": 8980720,
"isblk": false,
"ischr": false,
"isdir": false,
"isfifo": false,
"isgid": false,
"islnk": false,
"isreg": true,
"issock": false,
"isuid": false,
"mode": "0644",
"mtime": 1661249640.3583002,
"nlink": 1,
"path": "/appdata/tomcat/lib/jars/test1.jar",
"pw_name": "root",
"rgrp": true,
"roth": true,
"rusr": true,
"size": 0,
"uid": 0,
"wgrp": false,
"woth": false,
"wusr": true,
"xgrp": false,
"xoth": false,
"xusr": false
},
{
"atime": 1661249640.3583002,
"ctime": 1661249640.3583002,
"dev": 64768,
"gid": 0,
"gr_name": "root",
"inode": 8980722,
"isblk": false,
"ischr": false,
"isdir": false,
"isfifo": false,
"isgid": false,
"islnk": false,
"isreg": true,
"issock": false,
"isuid": false,
"mode": "0644",
"mtime": 1661249640.3583002,
"nlink": 1,
"path": "/appdata/tomcat/lib/jars/test2.jar",
"pw_name": "root",
"rgrp": true,
"roth": true,
"rusr": true,
"size": 0,
"uid": 0,
"wgrp": false,
"woth": false,
"wusr": true,
"xgrp": false,
"xoth": false,
"xusr": false
},
{
"atime": 1661249640.3583002,
"ctime": 1661249640.3583002,
"dev": 64768,
"gid": 0,
"gr_name": "root",
"inode": 8980726,
"isblk": false,
"ischr": false,
"isdir": false,
"isfifo": false,
"isgid": false,
"islnk": false,
"isreg": true,
"issock": false,
"isuid": false,
"mode": "0644",
"mtime": 1661249640.3583002,
"nlink": 1,
"path": "/appdata/tomcat/lib/jars/test3.jar",
"pw_name": "root",
"rgrp": true,
"roth": true,
"rusr": true,
"size": 0,
"uid": 0,
"wgrp": false,
"woth": false,
"wusr": true,
"xgrp": false,
"xoth": false,
"xusr": false
}
],
"invocation": {
"module_args": {
"age": null,
"age_stamp": "mtime",
"contains": null,
"depth": null,
"excludes": null,
"file_type": "file",
"follow": false,
"get_checksum": false,
"hidden": false,
"paths": [
"/appdata/tomcat/lib/jars"
],
"patterns": [
"*.jar"
],
"read_whole_file": false,
"recurse": false,
"size": null,
"use_regex": false
}
},
"matched": 3,
"msg": "All paths examined",
"skipped_paths": {}
}
"{{ jars_list.files }}" will list the files and .path will provide the full path of the file.
Now the copy module with the option remote_src: yes can be used to rename the files:
- name: Rename jars
copy:
src: "{{ item.path }}"
dest: "{{ item.path }}_{{ deploy_date }}"
remote_src: yes
loop: "{{ jars_list.files }}"
Delete the old files:
- name: Remove old jars
file:
path: "{{ item.path }}"
state: absent
loop: "{{ jars_list.files }}"
The complete playbook:
- hosts: all
vars:
deploy_date: "{{ ansible_date_time.date }}"
jars_path: /appdata/tomcat/lib/jars
tasks:
- name: Get jars in {{ jars_path }}
find:
paths: "{{ jars_path }}"
file_type: file
patterns: '*.jar'
register: jars_list
- name: Rename jars
copy:
src: "{{ item.path }}"
dest: "{{ item.path }}_{{ deploy_date }}"
remote_src: yes
loop: "{{ jars_list.files }}"
- name: Remove old jars
file:
path: "{{ item.path }}"
state: absent
loop: "{{ jars_list.files }}"

How to grab the file path from Ansible's find module

I'm using Ansible for some IaC (infrastructure as code) tasks.
I have a playbook where I'm using the find module recursively to search for readable files.
Here is an example of it:
- name: Application logs with read access
become: true
find:
paths: /
file_type: file
recurse: yes
patterns:
- '*.log'
- '*.config'
register: rapplogs
- set_fact: read_app_logs={{rapplogs.matched}}
- debug: var=read_app_logs
- set_fact: read_log_list={{rapplogs.files}}
- debug: var=read_log_list
run_once: True
failed_when: read_app_logs >= 1
ignore_errors: True
The output of it is like this:
TASK [infra_pt : set_fact] ******************************************************************
ok: [192.168.47.135]
TASK [infra_pt : debug] *********************************************************************
ok: [192.168.47.135] => {
"read_app_logs": "72"
}
TASK [infra_pt : set_fact] ******************************************************************
ok: [192.168.47.135]
TASK [infra_pt : debug] *********************************************************************
fatal: [192.168.47.135]: FAILED! => {
"failed_when_result": true,
"read_log_list": {
"changed": false,
"examined": 210060,
"failed": false,
"files": [
{
"atime": 1558446815.3474104,
"ctime": 1558446815.3474104,
"dev": 64768,
"gid": 0,
"gr_name": "root",
"inode": 2065610,
"isblk": false,
"ischr": false,
"isdir": false,
"isfifo": false,
"isgid": false,
"islnk": false,
"isreg": true,
"issock": false,
"isuid": false,
"mode": "0644",
"mtime": 1558446815.3474104,
"nlink": 1,
"path": "/test2.log",
"pw_name": "root",
"rgrp": true,
"roth": true,
"rusr": true,
"size": 0,
"uid": 0,
"wgrp": false,
"woth": false,
"wusr": true,
"xgrp": false,
"xoth": false,
"xusr": false
},
From the output list I actually want to access only the "mode" and "path" keys. How can this be done? Any idea?
Try json_query
- set_fact:
read_app_logs: "{{ rapplogs.files|json_query('[].{path: path, mode: mode}') }}"
(not tested)
Sure. You can just iterate over the list of matched files and refer to whichever keys are of interest:
- debug:
msg: "mode of {{ item.path }} is {{ item.mode }}"
loop: "{{ read_log_list.files }}"
Which, given your example output, would produce something like this:
TASK [debug] **********************************************************************************
ok: [localhost] => (item={u'islnk': False, u'uid': 0, u'rgrp': True, u'xoth': False, u'rusr': True, u'woth': False, u'nlink': 1, u'issock': False, u'mtime': 1558446815.3474104, u'gr_name': u'root', u'path': u'/test2.log', u'xusr': False, u'atime': 1558446815.3474104, u'inode': 2065610, u'isgid': False, u'size': 0, u'isdir': False, u'wgrp': False, u'ctime': 1558446815.3474104, u'isblk': False, u'xgrp': False, u'isuid': False, u'dev': 64768, u'roth': True, u'isreg': True, u'isfifo': False, u'mode': u'0644', u'pw_name': u'root', u'gid': 0, u'ischr': False, u'wusr': True}) => {
"msg": "mode of /test2.log is 0644"
}
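If the jmespath Python library isn't available on the control node, a similar reduced list of path/mode pairs can be built with a plain set_fact loop (an untested sketch, not from either answer; read_log_summary is just a made-up variable name):
- set_fact:
    read_log_summary: "{{ read_log_summary | default([]) + [{'path': item.path, 'mode': item.mode}] }}"
  loop: "{{ rapplogs.files }}"
- debug:
    var: read_log_summary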

Ansible retrieve multiple array values from results

I'd like to be able to iterate over all of the name values, but I'm not sure how to do so with Ansible. The variable domain is a list, and register is used.
- name: find *.ccfg files in domain(s)
find:
paths: "/tmp/opt/{{ item }}/ccfg"
patterns: "*.ccfg"
recurse: yes
excludes: "Admin.ccfg"
with_items: "{{ domain }}"
register: files
when: ('local' in group_names)
- debug:
msg: "{{ files.results }}"
The number of path values in each array could be anywhere from 1 to 20, each index in the array has multiple values, and some arrays may not have any values at all.
Standard Output:
ok: [127.0.0.1] => {
"msg": [
{
"_ansible_ignore_errors": null,
"_ansible_item_label": "CIE",
"_ansible_item_result": true,
"_ansible_no_log": false,
"_ansible_parsed": true,
"changed": false,
"examined": 3,
"failed": false,
"files": [
{
"atime": 1541632866.4095802,
"ctime": 1541632866.4095802,
"dev": 64768,
"gid": 0,
"gr_name": "root",
"inode": 52174935,
"isblk": false,
"ischr": false,
"isdir": false,
"isfifo": false,
"isgid": false,
"islnk": false,
"isreg": true,
"issock": false,
"isuid": false,
"mode": "0644",
"mtime": 1541632866.4095802,
"nlink": 1,
"path": "/tmp/opt/CIE/ccfg/cie.ccfg",
"pw_name": "root",
"rgrp": true,
"roth": true,
"rusr": true,
"size": 0,
"uid": 0,
"wgrp": false,
"woth": false,
"wusr": true,
"xgrp": false,
"xoth": false,
"xusr": false
}
],
Take a look at the json_query filter:
- debug:
msg: "{{ item }}"
with_items: "{{ files | json_query('results[*].files[*].path') }}"
Official doco: https://docs.ansible.com/ansible/latest/user_guide/playbooks_filters.html#json-query-filter
Shameless plug with more examples: https://parko.id.au/2018/08/16/complex-data-structures-and-the-ansible-json_query-filter
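For reference, the same nested structure can also be walked without jmespath using the subelements filter (an untested sketch, not part of either link above, and it assumes every registered result actually contains a files list):
- debug:
    msg: "{{ item.1.path }}" # item.0 is the per-domain find result, item.1 is one file entry
  loop: "{{ files.results | subelements('files') }}"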

Ansible replace module is not working with find module

I am new to Ansible. I have to find some files and then replace a pattern in all of them, so I am using the find and replace modules as follows.
- name: My Great Playbook
hosts: all
gather_facts: False
accelerate: False
strategy: free
vars:
dbname: "#DBNAME#"
repldbname: "connect to mydb"
tasks:
- block:
- name: finding fl
find:
paths: "/home/username1/temp"
patterns: "*.sql"
file_type: "file"
register: repos
- name: some thing
debug: msg="{{ item }}"
with_items: "{{ repos.files }}"
- name: replacing string
replace:
path: "{{ item }}"
#path: "/home/username1/temp/1.sql"
regexp: ({{ dbname }})
replace: '{{ repldbname }}'
backup: no
unsafe_writes: yes
with_items: "{{ repos.files }}"
I am getting following error as follows
failed: [localhost] (item={u'uid': 575479814, u'woth': True, u'mtime': 1504541305.603901, u'inode': 8433422, u'isgid': False, u'size': 256, u'roth': True, u'isuid': False, u'isreg': True, u'gid': 575144449, u'ischr': False, u'wusr': True, u'xoth': True, u'rusr': True, u'nlink': 1, u'issock': False, u'rgrp': True, u'path': u'/home/username1/temp/1.sql', u'xusr': True, u'atime': 1504541305.604901, u'isdir': False, u'ctime': 1504541305.6059012, u'wgrp': True, u'xgrp': True, u'dev': 64772, u'isblk': False, u'isfifo': False, u'mode': u'0777', u'islnk': False}) => {
"failed": true,
"item": {
"atime": 1504541305.604901,
"ctime": 1504541305.6059012,
"dev": 64772,
"gid": 575144449,
"inode": 8433422,
"isblk": false,
"ischr": false,
"isdir": false,
"isfifo": false,
"isgid": false,
"islnk": false,
"isreg": true,
"issock": false,
"isuid": false,
"mode": "0777",
"mtime": 1504541305.603901,
"nlink": 1,
"path": "/home/username1/temp/1.sql",
"rgrp": true,
"roth": true,
"rusr": true,
"size": 256,
"uid": 575479814,
"wgrp": true,
"woth": true,
"wusr": true,
"xgrp": true,
"xoth": true,
"xusr": true
},
"rc": 257
}
MSG:
Path {'uid': 575479814, 'woth': True, 'mtime': 1504541305.603901, 'inode': 8433422, 'isgid': False, 'size': 256, 'wgrp': True, 'isuid': False, 'isreg': True, 'gid': 575144449, 'ischr': False, 'wusr': True, 'xoth': True, 'islnk': False, 'nlink': 1, 'issock': False, 'rgrp': True, 'path': '/home/username1/temp/1.sql', 'xusr': True, 'atime': 1504541305.604901, 'isdir': False, 'ctime': 1504541305.6059012, 'isblk': False, 'xgrp': True, 'dev': 64772, 'roth': True, 'isfifo': False, 'mode': '0777', 'rusr': True} does not exist !
Please let me know what the issue is here.
Replace:
path: "{{ item }}"
With:
path: "{{ item.path }}"
You are trying to pass a dictionary object to an argument which requires a string value.
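Putting the fix into the failing task, it would look roughly like this (a sketch based on the question's own variables, not tested):
- name: replacing string
  replace:
    path: "{{ item.path }}" # a string path, not the whole find-result dictionary
    regexp: "({{ dbname }})"
    replace: "{{ repldbname }}"
    backup: no
    unsafe_writes: yes
  with_items: "{{ repos.files }}"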

How do I filter on Ansible stat existence flags?

I'm executing a simple stat task (Ansible 2.3.1.0) on the named pipe created by wpa_supplicant:
- stat:
path: "/var/run/wpa_supplicant/{{ item }}"
with_items:
- wifi
register: wpa_stats
sudo: true
The variable contains the following data after execution:
ok: [10.10.23.187] => {
"wpa_stats": {
"changed": false,
"msg": "All items completed",
"results": [
{
"_ansible_item_result": true,
"_ansible_no_log": false,
"_ansible_parsed": true,
"changed": false,
"invocation": {
"module_args": {
"checksum_algorithm": "sha1",
"follow": false,
"get_attributes": true,
"get_checksum": true,
"get_md5": true,
"get_mime": true,
"path": "/var/run/wpa_supplicant/wifi"
}
},
"item": "wifi",
"stat": {
"atime": 1497900522.6306846,
"attr_flags": "",
"attributes": [],
"block_size": 4096,
"blocks": 0,
"charset": "binary",
"ctime": 1497900290.0605242,
"dev": 18,
"device_type": 0,
"executable": true,
"exists": true,
"gid": 0,
"gr_name": "root",
"inode": 796,
"isblk": false,
"ischr": false,
"isdir": false,
"isfifo": false,
"isgid": false,
"islnk": false,
"isreg": false,
"issock": true,
"isuid": false,
"mimetype": "inode/socket",
"mode": "0770",
"mtime": 1497900290.0605242,
"nlink": 1,
"path": "/var/run/wpa_supplicant/wifi",
"pw_name": "root",
"readable": true,
"rgrp": true,
"roth": false,
"rusr": true,
"size": 0,
"uid": 0,
"version": null,
"wgrp": true,
"woth": false,
"writeable": true,
"wusr": true,
"xgrp": true,
"xoth": false,
"xusr": true
}
}
]
}
}
But this filter returns an empty result:
- debug:
msg: "{{ wpa_stats | json_query('results[*].stat[?exists].path') | list }}"
If I remove the [?exists] filter it works fine:
- debug:
msg: "{{ wpa_stats | json_query('results[*].stat.path') | list }}"
I've also tried using ==. JMESPath is installed and I'm querying other values with JSON filters successfully.
What am I missing?
I guess you want to use a pipe expression:
results[*].stat | [?exists].path
From my understanding of JMESPath, in stat[?filter] the filter is applied inside stat (to select elements further down the path), but you want to apply the filter to select/reject stat siblings, so you should stop further projections with a pipe and then filter the elements.
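In task form, the suggested expression would look like this (an untested sketch):
- debug:
    msg: "{{ wpa_stats | json_query('results[*].stat | [?exists].path') }}"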
