Get contents of all the files in all the folders - bash

There is a directory with some folders like a, b, c...
In every folder there are some text files whose contents I need to get.
I've already tried to write a script like
for i in `ls`; do
    cd $i
    cat *
done
But it doesn't work (I know why, but I don't know how to do it properly).

You shouldn't parse the output of ls. Instead, use the find command to get your files.
If you want to display the contents of all regular files in the current directory and all its subdirectories, use this command:
find . -type f -exec cat {} \;
If you have a lot of subdirectories, you may want to restrict the depth with the -maxdepth option.
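If you would rather keep a loop, a minimal sketch of a fix: glob directories instead of parsing ls, and run the body in a subshell so each cd is undone before the next iteration:
for dir in */; do
    ( cd "$dir" && cat * )    # the subshell keeps the cd local to one iteration
done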

Related

How to copy recursively files with multiple specific extensions in bash

I want to copy all files with specific extensions recursively in bash.
Edit:
I've written the full script. I have a list of names in a CSV file. I iterate through each name in that list, create a directory with that same name somewhere else, then search my source directory for the directory with that name; inside it there are a few files ending in xlsx, tsv, html, and gz, and I'm trying to copy all of them into the newly created directory.
sample_list_filepath=/home/lists/papers
destination_path=/home/ds/samples
source_directories_path=/home/papers_final/new
cat $sample_list_filepath/sample_list.csv | while read line
do
    echo $line
    cd $source_directories_path/$line
    cp -r *.{tsv,xlsx,html,gz} $source_directories_path/$line $destination_path
done
This works, but it copies all the files there, with no discrimination for specific extension.
What is the problem?
An easy way to solve your problem is to use find with a regex:
find src/ -regex '.*\.\(tsv\|xlsx\|gz\|html\)$' -exec cp {} dest/ \;
find looks recursively in the directory you specify (in this example, src/), lets you filter the results with -regex, and applies a command to each match with -exec.
For the regex part:
.*\.
matches the file name up to and including the dot before the extension, and
\(tsv\|xlsx\|gz\|html\)$
checks the extension against the ones you want.
The -exec part is what is done with the files the regex matched:
-exec cp {} dest/ \;
In this case, each match ({} stands for the matched path) is copied to the destination directory.
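Applied to the script in the question, the whole loop might look like the sketch below (paths are the ones from the question; GNU find is assumed for this -regex flavor, and mkdir -p is included on the assumption that the destination directory still needs to be created):
sample_list_filepath=/home/lists/papers
destination_path=/home/ds/samples
source_directories_path=/home/papers_final/new
while IFS= read -r line; do
    echo "$line"
    mkdir -p "$destination_path/$line"
    # copy only files with the wanted extensions, nothing else
    find "$source_directories_path/$line" -maxdepth 1 -type f \
        -regex '.*\.\(tsv\|xlsx\|gz\|html\)$' \
        -exec cp {} "$destination_path/$line" \;
done < "$sample_list_filepath/sample_list.csv"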

Create text document of every file in a directory recursively

Part of a script I currently use runs "ls -FCRlhLoprt" to list every file inside of a root directory recursively into a text document. The problem is that every time I run the script, ls includes that document in its output, so the text document grows each time. I believe I can use -i or --ignore, but how can I use that when ls is using a few variables? I keep getting errors:
ls "$lsopt" "$masroot"/ >> "$masroot"/"$client"_"$jobnum"_"$mas"_drive_contents.txt . #this works
If I try:
ls -FCRlhLoprt --ignore=""$masroot"/"$client"_"$jobnum"_"$mas"_drive_contents.txt"" "$masroot"/ >> "$masroot"/"$client"_"$jobnum"_"$mas"_drive_contents.txt #this does not work
I get errors. Basically, I don't want the output file to be included in the listing the second time I run this command.
Additionally, all I am trying to do is create an easy-to-read document of every file inside a directory, recursively. If there is a better way, please let me know.
To list every file in a directory recursively, the find command does exactly what you want, and admits further programmatic manipulation of the files found if you wish.
Examples:
To list every file under the current directory, recursively:
find ./ -type f
To list files under /etc/ and /usr/share, showing their owners and permissions:
find /etc /usr/share -type f -printf "%-100p %#m %10u %10g\n"
To show line counts of all files recursively, but ignoring subdirectories of .git:
find ./ -type f ! -regex ".*\.git.*" -exec wc -l {} +
To search under $masroot but ignore files generated by past searches, and dump the results into a file:
find "$masroot" -type f ! -regex ".*/[a-zA-Z]+_[0-9]+_.+_drive_contents.txt" | tee "$masroot/${client}_${jobnum}_${mas}_drive_contents.txt"
(Some of that might be slightly different on a Mac. For more information see man find.)
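For completeness, the quoting problem in the original ls command can also be fixed directly. Note that GNU ls --ignore matches entry names (base names) against a shell pattern, not full paths, so only the file name should be passed; a sketch, with the variable names from the question:
outfile="${client}_${jobnum}_${mas}_drive_contents.txt"
# --ignore excludes the output file itself from the recursive listing
ls -FCRlhLoprt --ignore="$outfile" "$masroot"/ >> "$masroot/$outfile"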

Can't rename, No such file or directory

I have a root folder (03_COMPLETE), inside which are 40 subfolders two levels down (all called CHILD_PNG) that contain .png files I want to rename. There are 6 complete folders I have to go through, with tens of thousands of files. All files are currently named like this: 123456_lifestyle.png; I want them renamed to lifestyle_123456.png.
My code:
find . -mindepth 2 -type f -iname '*.png' -print0 | xargs -0 /usr/local/bin/rename -v 's/\/([0-9]+)_([A-Za-z]+[0-9])/\/$2_$1/'\;
If I run this on an individual folder of .png files (without using -mindepth), it renames them. However, if I run it on the root 03_COMPLETE directory to try to do all the renaming at once, I get lines of errors like this:
Can't rename
'/Volumes/COMMON-LIC-PHOTO/RETOUCHING/04_DELIVERY_PNG/Computer1/03_COMPLETE/06052017_NYS5_W_1263_Output/CHILD_PNG/123456_lifestyle.png'
to
'/Volumes/COMMON-LIC-PHOTO/RETOUCHING/04_DELIVERY_PNG/Computer1/03_COMPLETE/NYS5_06052017_W_1263_Output/CHILD_PNG/123456_lifestyle.png':
No such file or directory
I think it might have something to do with the names of the folders one level down (e.g. here NYS5_06052017_W_1263_Output), because it did rename the files in a couple of folders named Bustform_000. Most of the folders, though, start with a number like 06052017.
I can't figure out why this works at the level of a folder of .png files but not on the root folder, and why it renames in a few folders but not in most of them.
Also, what is weird is that the error says it is trying to rename 123456_lifestyle.png to the same file name. Why would it do that? Any ideas?
This might help:
find 03_COMPLETE -type f -print0 | xargs -0 -n 1 rename -n 's|/([^_/]*)_([^_/]*)\.png$|/$2_$1.png|'
Remove -n if the output looks okay.
You could change directory into each of the CHILD_PNG directories and run a single rename in there on all the files so you don't exec a new rename for every single file:
find 03_COMPLETE -type d -name CHILD_PNG -execdir bash -c "cd {}; rename -n '...' *.png" \;
The issue with your original regex is that it also matches directory names of the form "xxxxx_yyyyy" and tries to convert them into "yyyyy_xxxxx", which, of course, don't exist. Since you're interested in changing only the file names, and all of them end with .png, you can use the regex below. Additionally, as you're trying to match a literal '/', you can choose a different delimiter such as '|' to make the regex easier to read:
's|/([0-9]+)_([A-Za-z]+[0-9]*)(\.[Pp][Nn][Gg])|/$2_$1$3|'
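Putting that corrected regex back into a find/xargs pipeline might look like the sketch below (the Perl rename and GNU find are assumed, as in the question; the trailing $ anchors the match to the end of the path so directory components can't match):
find 03_COMPLETE -mindepth 2 -type f -iname '*.png' -print0 |
    xargs -0 rename -n 's|/([0-9]+)_([A-Za-z]+[0-9]*)(\.[Pp][Nn][Gg])$|/$2_$1$3|'
# remove -n once the dry-run output looks right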

Bash: How to control iteration flow/loops?

For going over some recovered data, I am working on a script that recursively goes through folders & files and finally runs file on them, to check if they are likely fully recovered from a certain backup or not. (recovered files play, and are identified as mp3 or other audio, non-working files as ASCII-Text)
For now I would just be satisfied with having it go over my test folder structure, print all folders & corresponding files. (printing them mainly for testing, but also because I would like to log where the script currently is and how far along it is in the end, to verify what has been processed)
I tried using two for loops, one for the folders and one for the files, so that ideally it would take one folder, list the files in it (or potentially delve into subfolders), print below each folder only the files in that folder, and then move on to the next.
Such as:
Folder1
- File 1
- File 2
-- Subfolder
-- File3
-- File4
Folder2
- File5
However, this doesn't seem to work in the ways (such as with for loops) that are normally proposed. I got as far as using "find . -type d" for the directories and "find . -type f" or "find * -type f" (so that it doesn't go into subdirectories). However, when just printing the paths/files to check whether it ran as I wanted, it became obvious that it didn't.
It always seemed to first print all the directories (first loop) and then all the files (second loop). To keep track of what the script is doing, and to make it easier to know what was checked/recovered, I would like it to work in the more orderly fashion explained above.
So did I just do something wrong, or is this maybe a general limitation of the for loop in bash?
Another problem that could be related: Although assigning the output of find to an array seemed to work, it wasn't accessible as an array ...
Example for loop:
for folder in '$(find . -type d)' ; do
    echo $folder
    let foldercounter++
done
Arrays:
folders=("$(find . -type d)")
# As far as I know this should assign the output as an array.
# However, it is not really assigned properly somehow, as
echo "$folders[1]"
# does not work (quotes necessary for spaces)
A find ... -exec ... solution like the one @H.-Dirk Schmitt was referring to might look something like:
find . -type f -exec sh -c '
    case $(file "$1") in
        *Audio file*)
            echo "$1 is an audio file"
            ;;
        *ASCII text*)
            echo "$1 is an ascii text file"
            ;;
    esac
' _ {} ';'
If you want to run file on every file and directory in the current directory, including its subdirectories and so on, you don't need a Bash for-loop, because you can just tell find to run file:
find . -exec file '{}' ';'
(The -exec ... ';' option runs the command ... on every matched file or directory, replacing the argument {} with the path to the file.)
If you only want to run file on regular files (not directories), you can specify -type f:
find . -type f -exec file '{}' ';'
If you (say) want to just print the names of directories, but run the above on regular files, you can use the -or operator to connect one directive that uses -type d and one that uses -type f:
find . -type d -print -or -type f -exec file '{}' ';'
Edited to add: If desired, the effect of the above commands can be achieved in pure Bash (plus the file command, of course), by writing a recursive shell function. For example:
function foo () {
    local file
    for file in "$1"/* ; do
        if [[ -d "$file" ]] ; then
            echo "$file"
            foo "$file"
        else
            file "$file"
        fi
    done
}
foo .
This differs from the find command in that it will sort the files more consistently, and perhaps in gritty details such as handling of dot-files and symbolic links, but is broadly the same, so may be used as a starting-point for further adjustments.
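The array problem mentioned in the question can also be addressed directly. A sketch that builds a real Bash array from find output, one element per path, so names containing spaces survive:
folders=()
while IFS= read -r -d '' dir; do
    folders+=("$dir")
done < <(find . -type d -print0)
echo "${folders[1]}"    # indexing an array element requires the braces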

Bash loop through directory and rename every file

I am horrible at writing bash scripts, but I'm wondering if it's possible to recursively loop through a directory and rename all the files in there to "1.png", "2.png", etc., restarting at 1 for every new folder it enters. Here's a script that works, but only for a single directory.
cd ./directory
cnt=1
for fname in *
do
    mv "$fname" "${cnt}.png"
    cnt=$(( cnt + 1 ))
done
Thanks in advance
EDIT
Can anyone actually write this code out? I have no idea how to write bash, and it's very confusing to me.
Using find is a great idea. You can use find with the following syntax to find all directories inside your directory and apply your script to each of them:
find /directory -type d -exec yourscript.sh {} \;
The -type d parameter means you want to find only directories.
-exec yourscript.sh {} \; starts your script for every directory found and passes it the directory name as a parameter.
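For reference, a sketch of what yourscript.sh itself might contain, so the counter restarts in every directory it is handed (this body is an assumption, not part of the original answer):
#!/usr/bin/env bash
# yourscript.sh -- rename every regular file in the directory passed as $1
# to 1.png, 2.png, ... in glob order (hypothetical helper for the find above)
cd "$1" || exit 1
cnt=1
for fname in *; do
    [ -f "$fname" ] || continue    # skip subdirectories
    mv -- "$fname" "${cnt}.png"
    cnt=$(( cnt + 1 ))
done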
Use find(1) to get a list of files, and then do whatever you like with that list.
