Bash script for listing subdirectories and files in a text file

I need a script that writes the directory and subdirectory contents to a text file.
For example, the script lives in /Mainfolder, and this folder contains four other folders, each with several files.
Now I would like the script to write the path of each file to the text file:
Subfolder1/File1.dat
Subfolder1/File2.dat
Subfolder2/File1.dat
Subfolder3/File1.dat
Subfolder4/File1.dat
Subfolder4/File2.dat
It is important that there is no slash in front of each entry.

Use the find command:
find Mainfolder > outputfile
and if you only want the files listed, do
find Mainfolder -type f > outputfile
If you search the current directory, you can also strip the leading ./ with the %P format directive:
find . -type f -printf '%P\n' > outputfile

If your bash version is new enough (4.0 or later, for globstar), you can do it like this:
#!/bin/bash
shopt -s globstar
echo ** > yourtextfile
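Note that echo writes all the matched names on a single line, separated by spaces. If you want one path per line, a minimal variant of the same idea (my adjustment, not part of the original answer):
#!/bin/bash
shopt -s globstar
printf '%s\n' ** > yourtextfile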

This solution assumes that the subdirectories contain only files -- they do not contain any directory in turn.
find . -type f -print | sed 's|^.*/S|S|'
I have created a single file in each of the four subdirectories. The original output is:
./Subfolder1/File1.dat
./Subfolder4/File4.dat
./Subfolder2/File2.dat
./Subfolder3/File3.dat
The filtered output is:
Subfolder1/File1.dat
Subfolder4/File4.dat
Subfolder2/File2.dat
Subfolder3/File3.dat
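If the subdirectory names do not all begin with S, a variant of the filter that simply strips the leading ./ (my adjustment, not part of the original answer):
find . -type f -print | sed 's|^\./||'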

You can use this find with -exec:
find . -type f -exec bash -c 'f="{}"; echo "${f:2}"' \;
This prints every file under the current directory, with the leading ./ removed from each path.
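As an aside (my variant, not from the original answer), passing the file name as an argument instead of embedding {} inside the bash -c string is a little more robust with unusual file names:
find . -type f -exec bash -c 'echo "${1:2}"' _ {} \;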

Related

How can I diff two directories in bash recursively for only 1 file name?

Currently I am trying this:
diff -r /develop /us-prod
which shows all the differences between the two, but all I really care about here is a file named schema.json. It is guaranteed to exist in every directory, but its contents can differ.
I want to diff these two directories, but only for files named schema.json.
I see that you can use -x to exclude files, but it is hard to say which other files might be in there.
Some files are guaranteed to be there, but others are not. Is there an "include" option rather than an exclude?
You can try this:
find /develop -type f -name schema.json -exec bash -c \
  'diff "$1" "/us-prod${1#/develop}"' _ {} \;
Assuming both directories contain just one schema.json file each (including their subdirectories), would you please try:
diff $(find /develop -type f -name schema.json) $(find /us-prod -type f -name schema.json)
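If there are many schema.json files and you only want to know which pairs differ, a sketch along the lines of the first answer (diff -q just reports whether two files differ):
find /develop -type f -name schema.json -exec bash -c \
  'for f; do diff -q "$f" "/us-prod${f#/develop}"; done' _ {} +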

Find and rename multiple files using a bash script in Linux

As an example, in the directory /home/hel/files/ there are thousands of files and hundreds of directories.
An application saves its output files there with special characters in the file names.
I want to replace these special characters with underscores in all file names, e.g. -:"<>#
I wrote a bash script which simply runs a series of commands to rename the files using the Linux/Unix 'rename' utility.
Example (file name rename.sh):
#!/bin/bash
rename "s/\'/_/g" *
rename 's/[-:"<>#\,&\s\(\)\[\]?!–~%„“;│\´\’\+#]/_/g' *
rename 'y/A-Z/a-z/' *
rename 's/\.(?=[^.]*\.)/_/g' *
rename 's/[_]{2,}/_/g' *
I execute the following find command:
find /home/hel/files/ -maxdepth 1 -type f -execdir /home/hel/scripts/rename.sh {} \+
Now the issue:
This works fine, except that it also renames subdirectories if their names contain the searched characters.
The find command searches just for files and not for directories.
I tried some other find variations like:
find /home/hel/files/ -maxdepth 1 -type f -execdir sh /home/hel/scripts/rename.sh {} \+
find /home/hel/files/ -maxdepth 1 -type f -execdir sh /home/hel/scripts/rename.sh {} +
find /home/hel/files/ -maxdepth 1 -type f -execdir sh /home/hel/scripts/rename.sh {} \;
They are all working, but with the same result.
What is not working:
find /home/hel/files/ -maxdepth 1 -type f -exec sh /home/hel/scripts/rename.sh {} \+
This one is dangerous, because it also renames the directories and files in the current directory from which you call the find command.
Maybe someone has an idea why this happens, or a better solution.
The script rename.sh did not use its command line arguments at all, but instead searched files and directories (!) on its own using the glob *.
Change your script to the following.
#!/bin/bash
rename -d s/\''/_/g;
s/[-:"<>#\,&\s\(\)\[\]?!–~%„“;│\´\’\+#]/_/g;
y/A-Z/a-z/;
s/\.(?=[^.]*\.)/_/g;
s/[_]{2,}/_/g' "$@"
Then use find ... -maxdepth 1 -type f -exec sh .../rename.sh {} +.
Changes Made
Use "$#" instead of * to process the files given as arguments rather than everything in the current directory.
Execute rename only once, as a second rename would not find the files specified with "$@" after they were renamed by the first one.
Use the -d option such that only the basenames are modified. find always puts a path in front of the files, at the very least ./. Without this option rename would change ./filename to mangledPath/newFilename and therefore move the file to another directory.
Note that man rename is a bit misleading
--path, --fullpath
Rename full path: including any directory component. DEFAULT
-d, --filename, --nopath, --nofullpath
Do not rename directory: only rename filename component of path.
For a given path rename -d 's...' some/path/basename just processes the basename and ignores the leading components some/path/. If basename is a directory it will still be renamed despite the -d option.
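Before letting this loose on thousands of files, a dry run can be worth it. A minimal sketch, assuming the Perl File::Rename rename (the variant that accepts -d), whose -n flag only prints the planned renames without performing them; the character class here is a reduced illustrative subset:
find /home/hel/files/ -maxdepth 1 -type f -exec rename -n -d 's/[-:"<>#]/_/g' {} +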

How do I recursively find files with specific names and join using ImageMagick in Terminal?

I have created an ImageMagick command to join images with certain names:
convert -append *A_SLIDER.jpg *B_SLIDER.jpg out.jpg
I have lots of folders with files named *A_SLIDER.jpg and *B_SLIDER.jpg next to each other (only ever one pair in a directory).
I would like to recursively search a directory with many folders and execute the command to join the images.
If it is possible to name the output image based on the input images that would be great e.g.
=> DOGS_A_SLIDER.jpg and DOGS_B_SLIDER.jpg would combine to DOGS_SLIDER.jpg
Something like this, but back up first and try on a sample directory only!
#!/bin/bash
find . -name "*A_SLIDER*" -execdir bash -c ' \
out=$(ls *A_SLIDER*);
out=${out/_A/}; \
convert -append "*A_SLIDER*" "*B_SLIDER*" $out' \;
Find all files containing the letters "A_SLIDER" and go to the containing directory and start bash there. While you are there, get the name of the file, and remove the _A part to form the output filename. Then execute ImageMagick convert with the _A_ and the corresponding _B_ files to form the output file.
Or, a slightly more concise suggestion from @gniourf_gniourf... thank you.
#!/bin/bash
find . -name "*A_SLIDER.jpg" -type f -execdir bash -c 'convert -append "$1" "${1/_A_/_B_}" "${1/_A/}"' _ {} \;
The "find" command will recursively search folders:
$ find . -name "*.jpg" -print
That will display all the filenames. You might instead want "-iname" which does case-insensitive filename matching.
You can add a command line with "-exec", in which "{}" is replaced by the name of the file. You must terminate the command line with "\;":
$ find . -name "*.jpg" -exec ls -l {} \;
You can use sed to edit the name of a file:
$ echo DOGS_A_SLIDER.jpg | sed 's=_.*$=='
DOGS
Can you count on all of your "B" files being named the same as the corresponding "A" files? That is, you will not have "DOGS_A_SLIDER.jpg" and "CATS_A_SLIDER.jpg" in the same directory. If so, something like the following isn't everything you need, but will contribute to your solution:
$ find . -type f -name "*.jpg" -exec sh -c 'echo "$1" | sed "s=_.*=="' _ {} \;
That particular sed script will do the wrong thing if you have any directory names with underscores in them.
"find . -type f" finds regular files; it runs modestly faster than without the -type. Use "-d" to find directories.

How to access all files in all sub directories using shell script?

I want to execute the following command on all files of a directory and also on all files in its subdirectories:
cat filename | col -b > filename
This command removes the control-M (^M) characters from the file and works fine with a single file. Please help.
I tried the command below, but it doesn't work. It works within a single directory but not with subdirectories.
for i in *
do
cat $i | col -b > $i
done
If you only want to hit files matching particular names, you can combine -name tests in find, for example:
find . -type f \( -name '*.gif' -o -name '*.jpeg' \) -exec sed -i 's/^M//' {} \;
To strip the ^M characters from every file under the current directory:
find . -type f -exec sed -i 's/^M//' {} \;
Note, do the ^M by hitting ctrl-V then ctrl-m, not just ^M
If you want to only hit a subset of files, then specify an appropriate regexp and pass that to find as (most likely) -iname - see man find for the options
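If you want to stay with col -b from the question, a sketch of my own (the .tmp suffix is illustrative) that applies it to every file in the tree; it writes to a temporary file first, because cat file | col -b > file truncates the input before col can read it:
find . -type f -exec sh -c '
    for f; do
        col -b < "$f" > "$f.tmp" && mv "$f.tmp" "$f"
    done' _ {} +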

How do I grab the filename of the file containing a certain string when there are hundreds of files?

I have a folder with 200 files in it. We can say that the files are named "abc0" to "abc199". Five of these files contain the string "ez123" but I don't know which ones. My current attempt to find the file names of the files that contain the string is:
#!/bin/sh
while read FILES
do
cat $FILES | egrep "ez123"
done
I have a file that contains the filenames of all files in the directory. So I then execute:
./script < filenames
This verifies for me that files containing the string exist, but I still don't have their names. Are there any ideas concerning the best way to accomplish this?
Thanks
You can try
grep -l "ez123" abc*
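If the files were spread over subdirectories as well (not the case in the question), GNU grep can recurse on its own:
grep -rl "ez123" .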
find /directory -maxdepth 1 -type f -exec fgrep -l 'ez123' \{\} \;
(-maxdepth 1 is only needed if you want to search just that directory and not the whole tree below it, if there is one.)
fgrep is a bit faster than grep. -l lists the matched filenames only.
Try
find -type f -exec grep -qs "ez123" {} \; -print
This uses find to locate all regular files in the current directory (and its subdirectories) and runs grep on each one; {} is replaced by the file name, and -qs tells grep to be silent and just set an exit code. -print then prints the names of the files in which grep found a matching line.
What about:
xargs egrep -l ez123
That reads filenames from stdin and prints out the filenames with matches.
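For example, fed the same list of names used with the script in the question:
xargs egrep -l ez123 < filenames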
