Ansible: getting a list of files in folders containing a special file

I am trying to find the files inside folders that contain a 'done' file.
I was trying to do it in pure Ansible but could not make it work, so I am trying to do it using find, ls, and xargs, and then running it inside an Ansible shell command.
Example folder and file structure:
├── a12
│   ├── 1.txt
│   ├── 2.txt
│   └── 3.txt
├── a13
│   ├── 4.txt
│   ├── 5.txt
│   ├── 6.txt
│   └── done
└── a14
    ├── 7.txt
    ├── 8.txt
    ├── 9.txt
    └── done
and I am trying to get:
4.txt
5.txt
6.txt
7.txt
8.txt
9.txt
with the command
find /tmp/test_an/ -type f -name 'done' | xargs dirname | xargs ls -1 | grep -v 'done'
I am getting
/tmp/test_an/a13:
4.txt
5.txt
6.txt
/tmp/test_an/a14:
7.txt
8.txt
9.txt
I can exclude the folder headers using grep, but I am looking for a cleaner/better solution.

My best solution so far:
find /tmp/test_an/ -type f -name 'done' | xargs dirname | xargs -I{} find {} -type f
/tmp/test_an/a13/5.txt
/tmp/test_an/a13/4.txt
/tmp/test_an/a13/done
/tmp/test_an/a13/6.txt
/tmp/test_an/a14/8.txt
/tmp/test_an/a14/9.txt
/tmp/test_an/a14/7.txt
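One cleaner variant (a sketch, assuming GNU findutils and that no directory names contain newlines) pushes the `done` exclusion into the second find call, so no grep filtering is needed afterwards:

```shell
# For every directory holding a 'done' marker, list its other regular files.
find /tmp/test_an/ -type f -name 'done' |
  xargs -r dirname |
  xargs -r -I{} find {} -maxdepth 1 -type f ! -name 'done'
```

If only the bare file names are wanted (4.txt rather than /tmp/test_an/a13/4.txt), GNU find can print them directly by adding -printf '%f\n' in place of the implicit -print.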

Related

Take a list of names from a text file and compare them with a list of directories in Bash

I am trying to take a list of names from a text file and compare them with a list of directories. If there is a match in the directories then move them.
The code below doesn't work but it is essentially what I am trying to achieve.
#!/bin/bash
echo "Starting"
names="names.txt"
while IFS= read -r directory; do
    find 'Folder/' -type d -name '$directory' -print0
done < "$names" | xargs -t mv Folder/ MoveTo/
Example folder structure:
Folder/
  folder1
  folder2
  folder3
  oddfolder
  oddfolder2
MoveTo/
  (empty)
Example text file structure:
folder1
folder2
folder3
Output expectation:
Folder/
  oddfolder
  oddfolder2
MoveTo/
  folder1
  folder2
  folder3
I don't have an issue with spaces or capitalization. If there is a match then I want to move the selected folders to a different folder.
This should work:
$ tree
├── folder
│   ├── f1
│   ├── f2
│   ├── f3
│   ├── f4
│   ├── other1
│   └── other2
├── name.txt
└── newdir
$ cat name.txt
f1
f2
f3
f4
$ while IFS= read -r dir; do
    mv "folder/$dir" newdir/. 2>/dev/null
done < name.txt
$ tree
.
├── folder
│   ├── other1
│   └── other2
├── name.txt
└── newdir
    ├── f1
    ├── f2
    ├── f3
    └── f4
Note that you should also use double quotes (") instead of single quotes (') around variables, so that $directory is actually expanded.
You do not have to execute the find command within the while loop.
The test [[ -d dirname ]] is enough to confirm the existence
of the directory. Would you please try:
#!/bin/bash
names="names.txt"
src="Folder"
dest="MoveTo"
while IFS= read -r dir; do
    [[ -d $src/$dir ]] && mv "$src/$dir" "$dest"
done < "$names"
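A quick demo of that loop (the directory and file names below are invented for the demo, mirroring the question's layout):

```shell
# Set up the example layout from the question.
mkdir -p Folder/folder1 Folder/folder2 Folder/folder3 Folder/oddfolder MoveTo
printf '%s\n' folder1 folder2 folder3 > names.txt

# Move every listed directory that actually exists under Folder/.
while IFS= read -r dir; do
  [[ -d Folder/$dir ]] && mv "Folder/$dir" MoveTo/
done < names.txt

ls MoveTo   # should now contain folder1 folder2 folder3
ls Folder   # oddfolder is left behind
```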

A bash script to rename files from different directories at once

If I have a directory named /all_images, and inside this directory there are a ton of directories, all named dish_num as shown below, and inside each dish directory there is one image named rgb.png: how can I rename all the image files to the name of their directory?
Before
.
├── dish_1
│   └── rgb.png
├── dish_2
│   └── rgb.png
├── dish_3
│   └── rgb.png
├── dish_4
│   └── rgb.png
└── dish_5
    └── rgb.png
After
.
├── dish_1
│   └── dish_1.png
├── dish_2
│   └── dish_2.png
├── dish_3
│   └── dish_3.png
├── dish_4
│   └── dish_4.png
└── dish_5
    └── dish_5.png
WARNING: Make sure you have backups before running code you got someplace on the Internet!
find /all_images -name rgb.png -execdir sh -c 'mv rgb.png "$(basename "$PWD").png"' \;
where
find /all_images will start looking from the directory "/all_images"
-name rgb.png will match anything named "rgb.png"
optionally use -type f to restrict results to files only
-execdir in every directory where you got a hit, execute the following:
sh -c shell script
mv move, or "rename" in this case
rgb.png file named "rgb.png"
"$(basename "$PWD").png" output of "basename $PWD", which is the last section of $PWD (the current directory), with ".png" appended; the quotes protect against spaces in directory names
\; terminating string for the find command
If you want to benefit from your multi-core processor, consider using xargs instead of find -execdir to process files concurrently.
Here is a solution composed of find, xargs, mv, basename and dirname.
find all_images -type f -name rgb.png |
xargs -P0 -I# sh -c 'mv # $(dirname #)/$(basename $(dirname #)).png'
find all_images -type f -name rgb.png prints a list of file paths whose filename is exactly rgb.png.
xargs -P0 -I# CMD... executes CMD in a parallel mode with # replaced by path names from find command. Please refer to man xargs for more information.
-P maxprocs
Parallel mode: run at most maxprocs invocations of utility at once. If maxprocs is set to 0, xargs will run as many processes as possible.
dirname all_images/dash_4/rgb.png becomes all_images/dash_4
basename all_images/dash_4 becomes dash_4
Demo
mkdir all_images && seq 5 |
xargs -I# sh -c 'mkdir all_images/dash_# && touch all_images/dash_#/rgb.png'
tree
find all_images -type f -name rgb.png |
xargs -P0 -I# sh -c 'mv # $(dirname #)/$(basename $(dirname #)).png'
tree
Output
.
└── all_images
    ├── dash_1
    │   └── rgb.png
    ├── dash_2
    │   └── rgb.png
    ├── dash_3
    │   └── rgb.png
    ├── dash_4
    │   └── rgb.png
    └── dash_5
        └── rgb.png
.
└── all_images
    ├── dash_1
    │   └── dash_1.png
    ├── dash_2
    │   └── dash_2.png
    ├── dash_3
    │   └── dash_3.png
    ├── dash_4
    │   └── dash_4.png
    └── dash_5
        └── dash_5.png

6 directories, 5 files

Formatting Unix password store ls output

I just want to automate some process, and I need to be able to format the output from pass ls,
which is the listing command for https://www.passwordstore.org.
Current output looks like so:
➜ ~ pass ls
Password Store
├── README.md
├── folder
│   └── subfolder
│       └── subfolder2
│           ├── key1
│           ├── key2
│           └── key3
└── anotherfolder
    └── subfolder
        └── subfolder2
            └── subfolder3
                ├── key1
                ├── key2
                └── key3
I want the output to look like:
➜ ~ pass ls | some magic sed/grep/replace/etc
folder/subfolder/subfolder2/key1
folder/subfolder/subfolder2/key2
folder/subfolder/subfolder2/key3
anotherfolder/subfolder/subfolder2/subfolder3/key1
anotherfolder/subfolder/subfolder2/subfolder3/key2
anotherfolder/subfolder/subfolder2/subfolder3/key3
I am trying to use sed to do so, but I couldn't replace the increasing spaces/tabs as the subfolder levels get deeper (for example folder/subfolder/subfolder/subfolder/key).
Here is what I am trying so far:
pass ls | sed -e 's/├──[ \t]*/\\/g' | sed -e 's/│   └──[ \t]*/\\/g'
EDIT AFTER COMMENTS:
It seems pass ls is just a simple tree command on the password store directory, so I can run find on my directory to achieve the format I want.
I will try that, and the scope of the question changes to:
What is the proper listing command that can produce the above format?
find . -type d -name '.git' -prune -o -name '*.gpg' -type f -print
Did the trick for me: it excludes all files in the .git directory, and only prints files with the .gpg extension (which are the actual keys).
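To go one step further and get exactly the listing format from the question (relative paths, no leading ./ and no .gpg suffix), GNU find's -printf '%P\n' (the path with the starting point stripped; a GNU-only feature) combined with a small sed substitution should work:

```shell
# Print each key as folder/subfolder/.../keyname, dropping the .gpg suffix.
find . -type d -name '.git' -prune -o -type f -name '*.gpg' -printf '%P\n' |
  sed 's/\.gpg$//'
```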

Bash - Combine files in separated sub folders

So I'm looking for a way to cat the .html files in multiple subfolders, while keeping each merged file in its own subfolder.
Actual situation:
$ Folder1
.
├── Subfolder1
│   ├── File1.html
│   └── File2.html
└── Subfolder2
    ├── File1.html
    └── File2.html
Desired outcome:
$ Folder1
.
├── Subfolder1
│   ├── Mergedfile1.html
│   ├── File1.html
│   └── File2.html
└── Subfolder2
    ├── Mergedfile2.html
    ├── File1.html
    └── File2.html
So far I've come up with this:
find . -type f -name '*.html' -exec cat {} + > Mergedfile.html
But this combines all the files of all the subfolders of Folder1, while I want to keep them separated.
Thanks a lot!
You can loop on all subfolders with a for statement:
for i in Folder1/SubFolder*; do
    cat "$i"/File*.html > MergeFile$(echo "$i" | sed 's,.*\([0-9]\+\)$,\1,').html
done
As told by AK_, you can use find with -exec:
find Folder1/ -mindepth 1 -maxdepth 1 -type d -exec sh -c "rep='{}';cat "'"$rep"'"/*.html > "'"$rep"'"/Mergedfile.html" \;
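The same idea can also be written as a plain bash loop over the immediate subfolders, which avoids the nested quoting; this is a sketch that assumes the Folder1/Subfolder* layout from the question:

```shell
# Merge the .html files of each immediate subfolder into a Mergedfile.html
# that lives inside that same subfolder.
for dir in Folder1/*/; do
  files=( "$dir"*.html )              # all .html files in this subfolder
  [ -e "${files[0]}" ] || continue    # skip subfolders with no .html files
  cat "${files[@]}" > "${dir}Mergedfile.html"
done
```

Note that on a second run Mergedfile.html itself matches the *.html glob, so delete or exclude it first if the merge needs to be repeatable.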

Script to remove oldest files of type in each directory?

Much research has turned up almost-similar questions, yet nothing close enough to give me an idea of how to accomplish part of my task. I'll try to keep this clear and short while explaining the situation and desired result. My structure would be as follows:
-mobile
--Docs
--Downloads
--SomeFile
----this.is.crazy_0.0.1-1_named-silly.txt
----dont.touch.me.pdf
----leave.me.alone.png
----this.is.crazy_0.0.1-2_named-silly.txt
----this.is.crazy_0.0.1-3_named-silly.txt <---- file to keep
--SomeFileA
----this.is.crazy_0.0.1-1_also-silly.txt
----this.is.crazy_0.0.1-2_also-silly.txt
----dont.touch.me.either.pdf
----leave.me.alone.too.png
----this.is.crazy_0.0.1-3_also-silly.txt
----this.is.crazy_0.0.1-11_also-silly.txt <----file to keep
The first part of my script, which finds the .txt files, ignores every directory that is constant in this working directory and prints them to a list (which is a completely ugly hack and most likely a hindrance to the way most would accomplish this task). "SomeFileB" and "SomeFileC" could come along with the same file structure and I'd like to catch them in this script as well.
The idea is to keep the newest .txt file in each directory according to its time stamp which obviously isn't in the filename. The files to keep will continue to change of course. To clarify the question again, how to go about keeping the newest .txt file in each variable directory with variable crazy name, according to timestamp which isn't in the filename? Hopefully I've been clear enough for help. This script should be in bash.
I'm not with the current code right now; as I said, it's ugly, but here's a snippet of what I have:
find /path/to/working/directory -maxdepth 0 -not -path "*Docs*" -not -path "*Downloads*" -name "*.txt" > list
Assuming the question was understood correctly, the task could be expressed as:
Recursively remove all files *.txt except the newest in each respective directory
#!/bin/bash
# Find all directories from the top of the tree
find a -type d | while read -r dir; do
    # skip $dir if it doesn't contain any *.txt files
    ls "$dir"/*.txt &>/dev/null || continue
    # list *.txt by timestamp, skipping the newest file
    ls -t "$dir"/*.txt | awk 'NR>1' | while read -r file; do
        rm "$file"
    done
done
Assuming this directory tree, where a.txt is always the newest:
$ tree -t a
a
├── otherdir
├── b
│   ├── d e
│   │   ├── a.txt
│   │   ├── b.txt
│   │   ├── c.txt
│   │   ├── bar.txt
│   │   └── foo.pdf
│   ├── c
│   │   ├── a.txt
│   │   ├── b.txt
│   │   └── c.txt
│   ├── a.txt
│   ├── b.txt
│   ├── c.txt
│   └── foo.pdf
├── a.txt
├── b.txt
└── c.txt
This is the result after running the script:
$ tree -t a
a
├── b
│   ├── c
│   │   └── a.txt
│   ├── d e
│   │   ├── a.txt
│   │   └── foo.pdf
│   ├── a.txt
│   └── foo.pdf
├── otherdir
└── a.txt
Change rm "$file" to echo rm "$file" to check what would be removed before running it "for real".
