Contents in all files not parsed using grep in Ansible

I am looking to parse the lines containing the word "harddisk" in two different files, file1.txt and file2.txt.
With the Ansible tasks below I am able to get only the contents of file2.txt. Kindly advise.
---
- name: find file names with .txt extension
  shell: find /root -name "file*.txt" -print
  register: files

- debug:
    var: files

- name: find word harddisk in the file contents and output to result.txt on the destination server
  shell: cat "{{ item }}" | grep -i 'harddisk' > /root/test/result.txt
  with_items:
    - "{{ files.stdout_lines }}"
  args:
    executable: /bin/bash
  register: output

The reason only file2.txt's matches survive in the original play is that the redirection '>' truncates /root/test/result.txt on every loop iteration, so each file's output overwrites the previous one's.

Given the list of files

my_files:
  - file1.txt
  - file2.txt

and the content

shell> cat file1.txt
foo
harddisk A
harddisk B
bar

shell> cat file2.txt
foo
harddisk C
harddisk D
bar
The task below greps the lines from the files

- shell: "cat {{ item }} | grep harddisk"
  register: files
  loop: "{{ my_files }}"
Then, the task below creates a flat list of the matching lines

- set_fact:
    my_list: "{{ files.results|
                 map(attribute='stdout_lines')|
                 list|flatten }}"

- debug:
    var: my_list
gives

"my_list": [
    "harddisk A",
    "harddisk B",
    "harddisk C",
    "harddisk D"
]
To keep the source of the lines, the task below creates a dictionary with the data
- set_fact:
    my_dict: "{{ dict(my_files|
                 zip(files.results|
                     map(attribute='stdout_lines')|
                     list)) }}"

- debug:
    var: my_dict
gives

"my_dict": {
    "file1.txt": [
        "harddisk A",
        "harddisk B"
    ],
    "file2.txt": [
        "harddisk C",
        "harddisk D"
    ]
}
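If the goal is still to end up with result.txt on the destination server, a minimal sketch (reusing the my_list fact built above) is to write the whole list once with the copy module instead of redirecting inside the loop:

- name: write all matching lines to result.txt on the managed host
  copy:
    dest: /root/test/result.txt
    content: |
      {{ my_list | join('\n') }}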

Related

Delete all old files, but keep newest 3 files for same file name and different path

Hi, can I know how I can keep the latest 3 files based on file name and path? For instance, I have
fileAbackup1.war, fileAbackup2.war, fileAbackup3.war, fileAbackup4.war
fileBbackup1.war, fileBbackup2.war, fileBbackup3.war, fileBbackup4.war
So I should keep
fileAbackup1.war, fileAbackup2.war, fileAbackup3.war
fileBbackup1.war, fileBbackup2.war, fileBbackup3.war
I know the part about finding and deleting files older than x days; I have already done that coding. But I need some filter to keep 3 backup files of the same name across different paths.
Updated
Below is my Ansible code

- name: Find files
  find:
    paths: "{{ remove_file_path }}"
    use_regex: yes
    patterns:
      - '*.war.*.backup*'
      - '{{ rar_file_name }}.rar.*.bakup*'
      - '.*\..*\.\d+\.\d{4}-\d{2}-\d{2}#\d{2}:\d{2}:\d{2}~$'
    age: 5d
    recurse: yes
  register: removeFile

- name: Delete files
  file:
    path: "{{ item.path }}"
    state: absent
  loop: "{{ removeFile.files }}"
The tasks below
- find:
    paths: dir1
    patterns: '*.war'
  register: result

- set_fact:
    my_files: "{{ result.files|
                  map(attribute='path')|
                  sort|
                  list }}"

- debug:
    var: my_files[0:3]
give
my_files[0:3]:
- dir1/fileAbackup1.war
- dir1/fileAbackup2.war
- dir1/fileAbackup3.war
If you need the filenames only, map the basename filter. For example
- set_fact:
    my_files: "{{ result.files|
                  map(attribute='path')|
                  map('basename')|
                  sort|
                  list }}"

- debug:
    var: my_files[0:3]
give
my_files[0:3]:
- fileAbackup1.war
- fileAbackup2.war
- fileAbackup3.war
Q: "Will it have file fileBbackup1.war, fileBbackup2.war, and fileBbackup3.war as well?"
A: Yes. It will. Create a dictionary of the lists. For example
- set_fact:
    my_files: "{{ my_files|default({})|
                  combine({item: result.files|
                                 map(attribute='path')|
                                 map('basename')|
                                 select('search', item)|
                                 sort|
                                 list}) }}"
  loop:
    - fileAbackup
    - fileBbackup

- debug:
    msg: "{{ item.value[0:3] }}"
  loop: "{{ my_files|dict2items }}"
  loop_control:
    label: "{{ item.key }}"
give
ok: [localhost] => (item=fileAbackup) =>
msg:
- fileAbackup1.war
- fileAbackup2.war
- fileAbackup3.war
ok: [localhost] => (item=fileBbackup) =>
msg:
- fileBbackup1.war
- fileBbackup2.war
- fileBbackup3.war
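To actually remove everything past the three newest backups of each name, a follow-up sketch (assuming the backups sit directly under remove_file_path, since the dictionary above holds basenames only, and that the ascending sort puts the files to keep first) could be:

- set_fact:
    to_delete: "{{ to_delete|default([]) + item.value[3:] }}"
  loop: "{{ my_files|dict2items }}"

- name: Delete all but the first three backups of each name
  file:
    path: "{{ remove_file_path }}/{{ item }}"
    state: absent
  loop: "{{ to_delete }}"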
This command will provide the information you need.

ls -1 *.war | sort -r | head -n -3

Explanation

ls -1 *.war | sort -r performs a reverse sort and displays just the filenames.
head -n -3 drops the last 3 lines of that reversed listing, i.e. the 3 files you want to keep, so only the files to delete are printed.

You can pipe this to the command that deletes the files, or you can run the following:

rm $(ls -1 *.war | sort -r | head -n -3)

How to reduce a list against another list of patterns?

I need to write, with Ansible's built-in filters and tests, the equivalent of this shell one-liner:

for path in $(find PATH_TO_DIR); do for pattern in $PATTERNS; do echo $path | grep -v $pattern; done; done
---
- hosts: localhost
  connection: local
  gather_facts: False

  vars:
    paths:
      - "/home/vagrant/.ansible"
      - path-one
      - path-two
      - path-three
      - "/home/vagrant/.ssh"
      - "/home/vagratn/"
    patterns:
      - ".*ssh.*"
      - ".*ansible.*"
      - ".*one.*"

  tasks:
    - name: set empty list
      set_fact:
        files_to_be_removed: [ ]
In the end I would like to have a list like this:
ok: [localhost] => {
    "msg": [
        "path-two",
        "path-three",
        "/home/vagratn/"
    ]
}
With this form I get a list where only the last item from patterns is applied.

- set_fact:
    files_to_be_removed: |
      {{ paths
         |reject("search", item)
         |list }}
  with_items:
    - "{{ patterns }}"
The tasks below do the job
- set_fact:
    files_to_be_removed: "{{ paths }}"

- set_fact:
    files_to_be_removed: "{{ files_to_be_removed|
                             reject('search', item)|
                             list }}"
  loop: "{{ patterns }}"

- debug:
    var: files_to_be_removed
give
"files_to_be_removed": [
"path-two",
"path-three",
"/home/vagratn/"
]
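As a side note, because the patterns are plain regular expressions, the same result can be had in a single expression by joining them into one alternation. A sketch, assuming the joined patterns form a valid regex:

- set_fact:
    files_to_be_removed: "{{ paths|
                             reject('search', patterns|join('|'))|
                             list }}"

- debug:
    var: files_to_be_removed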

How to read line-by-line in a file on remote machine

first line: /u01/app/oracle/oradata/TEST/
second line: /u02/
How do I read both lines into the same variable, and, using that variable, find out the present working directory through shell commands in Ansible?
You can use the command module to read a file from disk:

- name: Read a file into a variable
  command: cat /path/to/your/file
  register: my_variable

And then do something like below to loop over the lines in the file.

- debug: msg="line: {{ item }}"
  loop: "{{ my_variable.stdout_lines }}"
The task below creates the list of the lines from a file (note that with_lines reads the file on the controller, not on the remote host)

- set_fact:
    lines_list: "{{ lines_list|default([]) + [item] }}"
  with_lines: cat /path/to/file
It's possible to create both a list
"lines_list": [
"/u01/app/oracle/oradata/TEST/",
"/u02/"
]
and a dictionary
"lines_dict": {
"0": "/u01/app/oracle/oradata/TEST/",
"1": "/u02/"
}
with the combine filter
- set_fact:
    lines_dict: "{{ lines_dict|default({})|combine({idx: item}) }}"
  with_lines: cat /path/to/file
  loop_control:
    index_var: idx
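If the file really lives on the remote machine, a hedged alternative is to fetch it with the slurp module and split it on the controller, for example:

- slurp:
    src: /path/to/file
  register: remote_file

- set_fact:
    lines_list: "{{ (remote_file.content|b64decode).splitlines() }}"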
"Present working directory through shell commands in ansible" can be printed from the registered variable. For example
- command: echo $PWD
  register: result

- debug:
    var: result.stdout
(not tested)

Use awk with ansible to run command

I have a playbook below:
- hosts: localhost
  vars:
    folderpath:
      folder1/des
      folder2/sdf
  tasks:
    - name: Create a symlink
      shell: "echo {{folderpath}} | awk -F'/' '{system(\"mkdir \" $1$2 );}'"
      register: result
      #- debug:
      #    msg: "{{ result.stdout }}"
      with_items:
        - " {{folderpath}} "
However, when I run the playbook I get 2 folders created:
1- folder1des (as expected)
2- folder2 (this should ideally be folder2sdf)
I have tried many combinations and still it doesn't want to work. What do I need to make it work properly?
I do not have an Ansible environment at the moment, but the following should work:
- hosts: localhost
  tasks:
    - name: Create a symlink
      shell: "echo {{item}} | awk -F'/' '{system(\"mkdir \" $1$2 );}'"
      register: result
      #- debug:
      #    msg: "{{ result.stdout }}"
      with_items:
        - folder1/des
        - folder2/sdf
Reference: Ansible Loops Example
Explanation:
In your playbook, folderpath is a single multi-line string, not a list, so with_items finds only one object to iterate over and the loop runs only once. Passing a proper list of items to with_items lets it iterate over each entry separately.
Hope this helps!
Maybe
- hosts: localhost
  vars:
    folderpath:
      folder1/des
      folder2/sdf
  tasks:
    - name: Create a symlink
      file:
        state: link
        path: "{{ item | regex_replace('[0-9]/','_') }}"
        src: "{{ item }}"
      with_items: " {{ folderpath }} "
Nothing in your given code creates symlinks. Is that really what you meant to do?
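For what it's worth, since the awk call only concatenates the two path components and runs mkdir, a sketch that avoids the shell entirely (assuming folderpath is declared as a proper YAML list) could be:

- hosts: localhost
  vars:
    folderpath:
      - folder1/des
      - folder2/sdf
  tasks:
    - name: Create a directory named by joining the two path components
      file:
        path: "{{ item | replace('/', '') }}"
        state: directory
      loop: "{{ folderpath }}"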

Ansible - Iterate Thru Folders Containing Files. Output to file

Hoping someone might be able to help me with this. I currently have a folder structure like so:
/BASE_DIR
    /FOLDER_A
        - file1.txt
        - file2.txt
    /FOLDER_B
    /FOLDER_C
        - file3.txt
I'm trying to create a playbook that could tell me which folders contain files. My end goal is to have a flat file with:
FOLDER_A, file1.txt
FOLDER_A, file2.txt
FOLDER_C, file3.txt
This is my playbook currently:
- name: get files from all folders
  shell: cd /BASE_DIR/{{ item.name }} && ls -p | grep -v / | grep .txt | cat
  with_items:
    - name: "FOLDER_A"
    - name: "FOLDER_B"
    - name: "FOLDER_C"
  register: fileitems

- name: combine to have folder name as key, filenames as values
  set_fact:
    folders_with_files: "{{ folders_with_files|default({}) | combine( { item.item.name: item.stdout_lines } ) }}"
  with_items: "{{ fileitems.results }}"
  when: "{{ item.stdout_lines|length }} > 0"

- debug:
    var: folders_with_files
I thought that if I could iterate through each folder looking for *.txt and then use combine, it would be an easy way to do it. That gives:
ok: [localhost] => {
    "folders_with_files": {
        "FOLDER_A": [
            "file1.txt",
            "file2.txt"
        ],
        "FOLDER_C": [
            "file3.txt"
        ]
    }
}
But even with this output, I don't think I can properly parse it the way I need to. I thought maybe a nested loop could help, but that would mean I would need to know the name of the keys beforehand.
Any help would be appreciated!
Thanks,
T
Go figure: as soon as I post the question, I find my own answer...
I decided to remove the combine and just append to an empty list.
- set_fact:
    folders_with_files: []

- name: get all sql from each adapter
  shell: cd /tmp/{{ item.name }} && ls -p | grep -v / | grep .txt | cat
  with_items:
    - name: "FOLDER_A"
    - name: "FOLDER_B"
    - name: "FOLDER_C"
  register: fileitems

- name: combine to display which adapters have files
  set_fact:
    folders_with_files: "{{ folders_with_files + [{ 'name': item.item.name, 'files': item.stdout_lines }] }}"
  with_items: "{{ fileitems.results }}"
  when: item.stdout_lines|length > 0

- debug:
    var: folders_with_files
My output then became:
ok: [localhost] => {
    "folders_with_files": [
        {
            "files": [
                "file1.txt",
                "file2.txt"
            ],
            "name": "FOLDER_A"
        },
        {
            "files": [
                "file3.txt"
            ],
            "name": "FOLDER_C"
        }
    ]
}
I could then use with_subelements:

- name: echo
  shell: echo "{{ item.0.name }}, {{ item.1 }}" >> /tmp/output.txt
  with_subelements:
    - "{{ folders_with_files }}"
    - files
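As an aside, the same flat file can be produced without any shell at all. A sketch using the find module and a templated copy (assuming the two-level layout shown above and that the output file should land on the same host):

- find:
    paths: /BASE_DIR
    patterns: '*.txt'
    recurse: yes
  register: result

- copy:
    dest: /tmp/output.txt
    content: |
      {% for path in result.files | map(attribute='path') | sort %}
      {{ path | dirname | basename }}, {{ path | basename }}
      {% endfor %}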
