Ansible - Iterate Through Folders Containing Files and Output to a File

Hoping someone might be able to provide me an answer with this. I currently have a folder structure like so:
/BASE_DIR
    /FOLDER_A
        - file1.txt
        - file2.txt
    /FOLDER_B
    /FOLDER_C
        - file3.txt
I'm trying to create a playbook that can tell me which folders contain files. My end goal is a flat file with:
FOLDER_A, file1.txt
FOLDER_A, file2.txt
FOLDER_C, file3.txt
This is my playbook currently:
- name: get files from all folders
  shell: cd /BASE_DIR/{{ item.name }} && ls -p | grep -v / | grep .txt | cat
  with_items:
    - name: "FOLDER_A"
    - name: "FOLDER_B"
    - name: "FOLDER_C"
  register: fileitems

- name: combine to have folder name as key, filenames as values
  set_fact:
    folders_with_files: "{{ folders_with_files|default({}) | combine({item.item.name: item.stdout_lines}) }}"
  with_items: "{{ fileitems.results }}"
  when: item.stdout_lines|length > 0

- debug:
    var: folders_with_files
I thought iterating through each folder looking for *.txt and then using combine would make the iteration easy. The debug output does look correct:
ok: [localhost] => {
    "folders_with_files": {
        "FOLDER_A": [
            "file1.txt",
            "file2.txt"
        ],
        "FOLDER_C": [
            "file3.txt"
        ]
    }
}
But even with this output, I don't think I can properly parse it the way I need to. I thought maybe a nested loop could help, but that would mean I would need to know the name of the keys beforehand.
Any help would be appreciated!
Thanks,
T

Go figure: as soon as I post the question, I find my own answer.
I decided to remove the combine and just append to an empty list.
- set_fact:
    folders_with_files: []

- name: get files from each folder
  shell: cd /tmp/{{ item.name }} && ls -p | grep -v / | grep .txt | cat
  with_items:
    - name: "FOLDER_A"
    - name: "FOLDER_B"
    - name: "FOLDER_C"
  register: fileitems

- name: combine to display which folders have files
  set_fact:
    folders_with_files: "{{ folders_with_files + [{'name': item.item.name, 'files': item.stdout_lines}] }}"
  with_items: "{{ fileitems.results }}"
  when: item.stdout_lines|length > 0

- debug:
    var: folders_with_files
My output then became:
ok: [localhost] => {
    "folders_with_files": [
        {
            "files": [
                "file1.txt",
                "file2.txt"
            ],
            "name": "FOLDER_A"
        },
        {
            "files": [
                "file3.txt"
            ],
            "name": "FOLDER_C"
        }
    ]
}
I could then use with_subelements:

- name: echo
  shell: echo "{{ item.0.name }}, {{ item.1 }}" >> /tmp/output.txt
  with_subelements:
    - "{{ folders_with_files }}"
    - files
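For anyone following along, with_subelements is just a nested iteration: item.0 is each outer dict and item.1 is each entry of its files list. A plain-Python sketch (illustrative only, using the sample data from this question) of what the loop expands to:

```python
# Illustrative sketch of what with_subelements iterates over
# (sample data from the question, not a real Ansible structure).
folders_with_files = [
    {"name": "FOLDER_A", "files": ["file1.txt", "file2.txt"]},
    {"name": "FOLDER_C", "files": ["file3.txt"]},
]

lines = []
for folder in folders_with_files:      # item.0
    for filename in folder["files"]:   # item.1
        lines.append(f"{folder['name']}, {filename}")

print("\n".join(lines))
```

Each appended line matches one row of the desired flat file.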

Related

Contents of all files not parsed using grep in Ansible

I am looking to extract the lines containing the word "harddisk" from two different files, file1.txt and file2.txt.
With the Ansible tasks below I am only able to get the contents of file2.txt. Kindly advise.
---
- name: find file name with .txt extension
  shell: find /root -name "file*.txt" -print
  register: files

- debug:
    var: files

- name: find word harddisk in the file contents and output to result.txt on the destination server
  shell: cat "{{ item }}" | grep -i 'harddisk' > /root/test/result.txt
  with_items:
    - "{{ files.stdout_lines }}"
  args:
    executable: /bin/bash
  register: output
Given the list of files

my_files:
  - file1.txt
  - file2.txt
and the content
shell> cat file1.txt
foo
harddisk A
harddisk B
bar
shell> cat file2.txt
foo
harddisk C
harddisk D
bar
the task below greps the lines from the files:

- shell: "cat {{ item }} | grep harddisk"
  register: files
  loop: "{{ my_files }}"
Then, the task below creates the list of the lines:

- set_fact:
    my_list: "{{ files.results|
                 map(attribute='stdout_lines')|
                 list|flatten }}"

- debug:
    var: my_list
gives
"my_list": [
"harddisk A",
"harddisk B",
"harddisk C",
"harddisk D"
]
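The map/flatten chain has a direct Python analogue. This sketch (with a hand-built stand-in for the registered files variable, since we don't have the real one) shows what it computes:

```python
# Illustrative stand-in for the registered 'files' variable.
results = [
    {"item": "file1.txt", "stdout_lines": ["harddisk A", "harddisk B"]},
    {"item": "file2.txt", "stdout_lines": ["harddisk C", "harddisk D"]},
]

# map(attribute='stdout_lines') | flatten
my_list = [line for r in results for line in r["stdout_lines"]]
print(my_list)
```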
To keep the source of the lines, the task below creates a dictionary with the data
- set_fact:
    my_dict: "{{ dict(my_files|
                 zip(files.results|
                     map(attribute='stdout_lines')|
                     list)) }}"

- debug:
    var: my_dict
gives
"my_dict": {
"file1.txt": [
"harddisk A",
"harddisk B"
],
"file2.txt": [
"harddisk C",
"harddisk D"
]
}
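Jinja2's dict and zip here behave like the Python built-ins of the same names, so the whole expression can be sketched (with stand-in data for the grep results) as:

```python
# Illustrative: dict(my_files | zip(...)) in Jinja2 is dict(zip(...)) in Python.
my_files = ["file1.txt", "file2.txt"]
grep_results = [["harddisk A", "harddisk B"], ["harddisk C", "harddisk D"]]

# Pair each filename with the lines grepped from it.
my_dict = dict(zip(my_files, grep_results))
print(my_dict)
```

This relies on files.results coming back in the same order as my_files, which Ansible's loop guarantees.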

Delete all old files, but keep newest 3 files for same file name and different path

Hi, can I ask how to keep the latest 3 files based on file name and path? For instance, I have
fileAbackup1.war, fileAbackup2.war, fileAbackup3.war, fileAbackup4.war
fileBbackup1.war, fileBbackup2.war, fileBbackup3.war, fileBbackup4.war
So I should keep
fileAbackup1.war, fileAbackup2.war, fileAbackup3.war
fileBbackup1.war, fileBbackup2.war, fileBbackup3.war
I know the part about finding files and deleting files older than x days; I have already done that coding. But I need some filter to keep 3 backup files of the same name in different paths.
Updated
Below is my ansible code
- name: Find files
  find:
    paths: "{{ remove_file_path }}"
    use_regex: yes
    patterns:
      - '*.war.*.backup*'
      - '{{ rar_file_name }}.rar.*.bakup*'
      - '.*\..*\.\d+\.\d{4}-\d{2}-\d{2}#\d{2}:\d{2}:\d{2}~$'
    age: 5d
    recurse: yes
  register: removeFile

- name: Delete files
  file:
    path: "{{ item.path }}"
    state: absent
  loop: "{{ removeFile.files }}"
The tasks below
- find:
    paths: dir1
    patterns: '*.war'
  register: result

- set_fact:
    my_files: "{{ result.files|
                  map(attribute='path')|
                  sort|
                  list }}"

- debug:
    var: my_files[0:3]
give
my_files[0:3]:
  - dir1/fileAbackup1.war
  - dir1/fileAbackup2.war
  - dir1/fileAbackup3.war
If you need the filenames only, map the basename filter. For example
- set_fact:
    my_files: "{{ result.files|
                  map(attribute='path')|
                  map('basename')|
                  sort|
                  list }}"

- debug:
    var: my_files[0:3]
give
my_files[0:3]:
  - fileAbackup1.war
  - fileAbackup2.war
  - fileAbackup3.war
Q: "Will it have file fileBbackup1.war, fileBbackup2.war, and fileBbackup3.war as well?"
A: Yes, it will. Create a dictionary of the lists. For example
- set_fact:
    my_files: "{{ my_files|default({})|
                  combine({item: result.files|
                                 map(attribute='path')|
                                 map('basename')|
                                 select('search', item)|
                                 sort|
                                 list}) }}"
  loop:
    - fileAbackup
    - fileBbackup

- debug:
    msg: "{{ item.value[0:3] }}"
  loop: "{{ my_files|dict2items }}"
  loop_control:
    label: "{{ item.key }}"
give
ok: [localhost] => (item=fileAbackup) =>
  msg:
  - fileAbackup1.war
  - fileAbackup2.war
  - fileAbackup3.war
ok: [localhost] => (item=fileBbackup) =>
  msg:
  - fileBbackup1.war
  - fileBbackup2.war
  - fileBbackup3.war
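The combine/select chain above groups basenames by the prefix they match and sorts each group; the [0:3] slice then keeps the first three per group. A standalone Python sketch of the same grouping (sample paths assumed, not taken from a real find result):

```python
# Illustrative sketch of the group-then-slice logic, outside Ansible.
import re
from os.path import basename

paths = [
    "dir1/fileAbackup4.war", "dir1/fileAbackup1.war",
    "dir1/fileAbackup2.war", "dir1/fileAbackup3.war",
    "dir2/fileBbackup2.war", "dir2/fileBbackup1.war",
    "dir2/fileBbackup4.war", "dir2/fileBbackup3.war",
]

keep = {}
for prefix in ("fileAbackup", "fileBbackup"):
    # select('search', item) -> filter by regex; map('basename') + sort
    names = sorted(basename(p) for p in paths if re.search(prefix, p))
    keep[prefix] = names[:3]  # item.value[0:3]: the 3 files to keep per group

print(keep)
```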
This command will list the files to delete, i.e. everything except the 3 you want to keep:
ls -1 *.war | sort -r | head -n -3
Explanation
ls -1 *.war | sort -r displays just the filenames, reverse-sorted
head -n -3 drops the last 3 lines (GNU head; the negative count is not portable to BSD head), leaving only the files to delete
You can pipe this to the command that deletes the files, or run the following:
rm $(ls -1 *.war | sort -r | head -n -3)

Find files in a loop and delete them

I would like to delete files in some subdirectories that match a certain format. However, I am getting the error
'dict object' has no attribute 'files'.
Below is my code. The file pattern is file_name.file_extension.processID.YYYY-MM-DD#HH:MM:SS~
My variables:
fileToFindInAllSubDirecotry:
  - "home/usr/file1"
  - "home/usr/file2"
  - "home/usr/file3/file4"
  - "home/usr/file5"
My playbook role
- name: Find file
  find:
    paths: "{{ item }}"
    use_regex: yes
    patterns:
      - '.*\.\d+\.\d{4}-\d{2}-\d{2}#\d{2}:\d{2}:\d{2}~$'
    age: 1d
    recurse: yes
  register: fileToDelete
  loop: "{{ fileToFindInAllSubDirecotry }}"

- name: Delete file
  file:
    path: "{{ item.path }}"
    state: absent
  loop: "{{ fileToDelete.files }}"
This is the sample file and directory layout:
home
|-------usr
        |-------file1
        |       |-------configFile.xml
        |-------file2
        |       |-------propertiesFile.txt.2012.2020-07-13#23:08:10~
        |-------file3
        |       |-------file4
        |               |-------content.yml.2012.2020-04-04#23:08:10~
        |-------file5
                |-------configFile.xml.2012.2020-03-05#13:08:10~
This is happening because you are running find in a loop, so the registered result is a dictionary holding a list of results.
Something like:
ok: [localhost] => {
    "msg": {
        "changed": false,
        "msg": "All items completed",
        "results": [
            {
                ...
                "files": [ ... ],
                ...
                "item": "/home/usr/file1",
                ...
            },
            {
                ...
                "files": [ ... ],
                ...
                "item": "/home/usr/file2",
                ...
            },
            ...
        ]
    }
}
There are two ways to fix this:
The nicest one: as pointed out by the documentation, the paths parameter of the find module accepts a list of paths. Just pass it your whole fileToFindInAllSubDirecotry variable instead of using a loop; this way your deletion works as-is:
- name: Find file
  find:
    paths: "{{ fileToFindInAllSubDirecotry }}"
    use_regex: yes
    patterns:
      - '.*\.\d+\.\d{4}-\d{2}-\d{2}#\d{2}:\d{2}:\d{2}~$'
    age: 1d
    recurse: yes
  register: fileToDelete

- name: Delete file
  file:
    path: "{{ item.path }}"
    state: absent
  loop: "{{ fileToDelete.files }}"
Alternatively, use json_query to fetch results[*].files, then flatten the resulting list of lists:
- name: Find file
  find:
    paths: "{{ item }}"
    use_regex: yes
    patterns:
      - '.*\.\d+\.\d{4}-\d{2}-\d{2}#\d{2}:\d{2}:\d{2}~$'
    age: 1d
    recurse: yes
  register: fileToDelete
  loop: "{{ fileToFindInAllSubDirecotry }}"

- name: Delete file
  file:
    path: "{{ item.path }}"
    state: absent
  loop: "{{ fileToDelete | json_query('results[*].files') | flatten }}"
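What the json_query('results[*].files') | flatten expression extracts can be sketched in Python, with an abbreviated stand-in for the registered variable (real results carry many more keys):

```python
# Illustrative stand-in for the registered fileToDelete variable.
fileToDelete = {
    "results": [
        {"item": "home/usr/file2",
         "files": [{"path": "home/usr/file2/propertiesFile.txt.2012.2020-07-13#23:08:10~"}]},
        {"item": "home/usr/file5",
         "files": [{"path": "home/usr/file5/configFile.xml.2012.2020-03-05#13:08:10~"}]},
    ]
}

# results[*].files gives a list of lists; flatten merges them.
flat = [f for r in fileToDelete["results"] for f in r["files"]]
to_delete = [f["path"] for f in flat]
print(to_delete)
```

The Delete file task then loops over these flattened file dicts and reads item.path from each.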

How to reduce a list against another list of patterns?

I need to express, using Ansible's built-in filters and tests, logic similar to this shell one-liner:
for path in $(find PATH_TO_DIR); do for pattern in $PATTERNS; do echo $path | grep -v $pattern; done; done
---
- hosts: localhost
  connection: local
  gather_facts: False

  vars:
    paths:
      - "/home/vagrant/.ansible"
      - path-one
      - path-two
      - path-three
      - "/home/vagrant/.ssh"
      - "/home/vagratn/"
    patterns:
      - ".*ssh.*"
      - ".*ansible.*"
      - ".*one.*"

  tasks:
    - name: set empty list
      set_fact:
        files_to_be_removed: [ ]
In the end I would like to have a list like this:
ok: [localhost] => {
    "msg": [
        "path-two",
        "path-three",
        "/home/vagratn/"
    ]
}
With this form I get a list where only the last item from patterns has been applied:
- set_fact:
    files_to_be_removed: |
      {{ paths
         |reject("search", item)
         |list }}
  with_items:
    - "{{ patterns }}"
The tasks below do the job
- set_fact:
    files_to_be_removed: "{{ paths }}"

- set_fact:
    files_to_be_removed: "{{ files_to_be_removed|
                             reject('search', item)|
                             list }}"
  loop: "{{ patterns }}"

- debug:
    var: files_to_be_removed
give
"files_to_be_removed": [
    "path-two",
    "path-three",
    "/home/vagratn/"
]
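The key difference between the broken loop and the working one is what each pass starts from. In plain Python (illustrative, using the vars from the playbook above), the questioner's version recomputes from the original paths on every iteration, so only the last pattern survives, while the accepted version feeds each pass's result into the next:

```python
# Illustrative: why the first form keeps only the last pattern,
# and how the accepted reduction accumulates instead.
import re

paths = ["/home/vagrant/.ansible", "path-one", "path-two", "path-three",
         "/home/vagrant/.ssh", "/home/vagratn/"]
patterns = [".*ssh.*", ".*ansible.*", ".*one.*"]

# Broken version: each pass starts from the original list again,
# so the result reflects only the final pattern.
for pattern in patterns:
    broken = [p for p in paths if not re.search(pattern, p)]

# Working reduction: each pass filters the previous pass's output.
files_to_be_removed = list(paths)
for pattern in patterns:
    files_to_be_removed = [p for p in files_to_be_removed
                           if not re.search(pattern, p)]

print(files_to_be_removed)
```

Note that "/home/vagratn/" survives only because of its typo: it does not match the ".*ansible.*" pattern.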

In Ansible, can we loop through a list of files and match content in each file using the lineinfile module?

I have to find all config.xml files on a given server and produce the list.
Once the file list is registered, I have to check the content of each file on the list using Ansible.
I tried to derive the paths of all config.xml files,
register them and print the list,
then add the registered variable into the lineinfile path.
##Derive Config.xml path
- name: Find the location of xml file
  shell: find {{ wlx_mount_point }} -maxdepth 1 -name {{ xml_file }} | rev | cut -d '/' -f3- | rev
  register: wlx_domain_home
  ignore_errors: yes

- debug:
    msg: "{{ wlx_domain_home.stdout_lines|list }}"

- name: check domain home directory exists
  stat:
    path: "{{ wlx_domain_home |list }}"
  ignore_errors: true

- debug:
    msg: "{{ wlx_domain_home.stdout_lines|list }}"

- name: "Ensure Logging Settings in config.xml"
  lineinfile:
    line: "{{ item.line }}"
    regexp: "{{ item.regexp }}"
    path: "{{ wlx_domain_home.stdout_lines|list }}/config/config.xml"
    state: present
    backrefs: yes
  register: config.xml.Logging
  with_fileglob: "{{ wlx_domain_home.stdout_lines|list }}/config/config.xml"
  with_items:
    - line: "<logger-severity>Info</logger-severity>"
      regexp: "^logger-severity.*"
The expected result is that it looks for the lines in each file, looping through the list. Instead it prints the list and cannot find the content:
"_ansible_ignore_errors": true, "msg": "Destination
[u'/appl/cmpas9/user_projects/pte-ipscasws',
u'/appl/bbb/user_projects/qa-ucxservices_bkp',
u'/appl/app/user_projects/weiss_apps12',
u'appl/view/user_projects/weiss_apps12_oldbkp',
u'appl/voc/user_projects/qa-voc']/config/config.xml does not exist !"
}
This is how I fixed the issue. Now it gives output:
- name: find all weblogic domain paths
  shell: find /tech/appl -type f -name config.xml
  register: wlx_domain_config

- debug:
    var: wlx_domain_config.stdout_lines

- name: "Ensure allow-unencrypted-null-cipher Encryption Settings in config.xml"
  shell: grep -i "" {{ item }}
  with_items: "{{ wlx_domain_config.stdout_lines }}"
  register: allowunencrypted

- debug:
    var: allowunencrypted.stdout_lines