Looping through all files in a given directory [duplicate] - shell

Here is what I'm trying to do:
Give a parameter to a shell script that will run a task on all files of jpg, bmp, tif extension.
Eg: ./doProcess /media/repo/user1/dir5
and all jpg, bmp, tif files in that directory will have a certain task run on them.
What I have now is:
for f in *
do
imagejob "$f" "output/${f%.output}" ;
done
I need help with the for loop to restrict the file types and also have some way of starting under a specified directory instead of current directory.

Use shell expansion rather than ls
for file in *.{jpg,bmp,tif}
do
imagejob "$file" "output/${file%.output}"
done
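Putting the two requirements together (restricting the extensions and starting under the directory given as `$1`), a minimal sketch could look like the following. `imagejob` from the question is hypothetical, so this demo builds a sandbox directory in place of `$1` and only collects the matched names:

```shell
#!/bin/sh
# Sketch: loop over selected extensions under a given directory.
# "imagejob" from the question is hypothetical, so we only collect the
# matched filenames; a real script would run imagejob "$f" instead.
set -eu
dir=$(mktemp -d)          # stand-in for "$1" in the real script
touch "$dir/a.jpg" "$dir/b.bmp" "$dir/c.txt"

cd "$dir"
matched=
for f in *.jpg *.bmp *.tif; do
    [ -e "$f" ] || continue      # skip patterns that matched nothing
    matched="$matched $f"
done
echo "matched:$matched"
```

The `[ -e "$f" ]` guard is the POSIX substitute for Bash's nullglob: without it, an unmatched pattern like `*.tif` stays in the loop as a literal string.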
Also, if you have Bash 4.0+, you can use globstar:
shopt -s globstar
shopt -s nullglob
shopt -s nocaseglob
for file in **/*.{jpg,bmp,tif}
do
# do something with $file
done

for i in `ls $1/*.jpg $1/*.bmp $1/*.tif`; do
imagejob "$i";
done
This is assuming you're using a bashlike shell where $1 is the first argument given to it.
You could also do:
find "$1" -iname "*.jpg" -or -iname "*.bmp" -or -iname "*.tif" \
-exec imagejob \{\} \;

You could use a construct with backticks and ls (or any other command, of course):
for f in `ls *.jpg *.bmp *.tif`; do ...; done

The other solutions here are either Bash-only or recommend the use of ls in spite of it being a common and well-documented antipattern. Here is how to solve this in POSIX sh without ls:
for file in *.jpg *.bmp *.tif; do
... stuff with "$file"
done
If you have a very large number of files, perhaps you also want to look into
find . -maxdepth 1 -type f \( \
-name '*.jpg' -o -name '*.bmp' -o -name '*.tif' \) \
-exec stuff with {} +
which avoids the overhead of sorting the file names alphabetically. The -maxdepth 1 says to not recurse into subdirectories; obviously, take it out or modify it if you do want to recurse into subdirectories.
The -exec ... + predicate of find is a relatively new introduction; if your find is too old, you might want to use -exec ... \; or replace the -exec stuff with {} + with
find ... -print0 |
xargs -r0 stuff with
though, again, the -print0 option and the corresponding -0 option for xargs are GNU extensions.
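As a runnable sanity check of the -exec ... + form described above (in a throwaway directory, with printf standing in for the placeholder "stuff with" command):

```shell
#!/bin/sh
# Demo of find -maxdepth 1 with -exec ... +; printf stands in for the
# placeholder command from the answer.
set -eu
d=$(mktemp -d)
touch "$d/a.jpg" "$d/b.bmp" "$d/c.txt"
mkdir "$d/sub"
touch "$d/sub/deep.jpg"     # must be skipped because of -maxdepth 1

out=$(find "$d" -maxdepth 1 -type f \( \
    -name '*.jpg' -o -name '*.bmp' \) -exec printf '%s\n' {} +)
echo "$out"
```

Only a.jpg and b.bmp are printed; c.txt fails the name tests and sub/deep.jpg is excluded by -maxdepth 1.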

Related

find command - get base name only - NOT with basename command / NOT with printf

Is there any way to get the basename in the command find?
What I don't need:
find /dir1 -type f -printf "%f\n"
find /dir1 -type f -exec basename {} \;
Why you may ask? Because I need to continue using the found file. I basically want something like this:
find . -type f -exec find /home -type l -name "*{}*" \;
And it uses ./file1, not file1, as the argument for -name.
Thanks for your help :)
If you've got Bash version 4.3 or later, try this Shellcheck-clean pure Bash code:
#! /bin/bash -p
shopt -s dotglob globstar nullglob
for path in ./**; do
[[ -L $path ]] && continue
[[ -f $path ]] || continue
base=${path##*/}
for path2 in /home/**/*"$base"*; do
[[ -L $path2 ]] && printf '%s\n' "$path2"
done
done
shopt -s ... enables some Bash settings that are required by the code:
dotglob enables globs to match files and directories whose names begin with a dot (.); find shows such files by default.
globstar enables the use of ** to match paths recursively through directory trees. globstar was introduced in Bash 4.0, but it was dangerous to use before Bash 4.3 (2014) because it followed symlinks when looking for matches.
nullglob makes globs expand to nothing when nothing matches (otherwise they expand to the glob pattern itself, which is almost never useful in programs).
See Removing part of a string (BashFAQ/100 (How do I do string manipulation in bash?)) for an explanation of ${path##*/}. That always works, even in some rare cases where $(basename "$path") doesn't.
See the accepted, and excellent, answer to Why is printf better than echo? for an explanation of why I used printf instead of echo to output the found paths.
This solution works correctly if you've got files that contain pattern characters (?, *, [, ], \) in their names.
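A quick illustration of the ${path##*/} expansion used above, on a hypothetical path (plain POSIX sh):

```shell
#!/bin/sh
# ${path##*/} removes the longest prefix matching "*/", i.e. everything
# up to and including the last slash, leaving just the base name.
path='/home/user/some dir/pic one.jpg'
base=${path##*/}
echo "$base"
```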
Spawn a shell and make the second call to find from there
find /dir1 -type f -exec sh -c '
for p; do
find /dir2 -type l -name "*${p##*/}*"
done' sh {} +
If your files may contain special characters in their names (like [, ?, etc.), you may want to escape them like this to avoid false positives
find /dir1 -type f -exec sh -c '
for p; do
esc=$(printf "%sx\n" "${p##*/}" | sed "s/[?*[\]/\\\&/g")
esc=${esc%x}
find /dir2 -type l -name "*$esc*"
done' sh {} +
You'll have to forward it to another evaluator. There is no way to do that in find.
find . -type f -printf '%f\0' |
xargs -r0I{} find /home -type l -name '*{}*'
This answers your question about merging the functionality of %f and -exec find, and is based on your example. However, your example injects raw filenames as -name patterns, so avoid that and look at the other solutions instead.
Simply spawn a bash shell:
find /dir1 -type f -exec bash -c '
base=$(basename "$1")
echo "$base"
do_something_else "$base"
' bash {} \;
$1 in the bash part is each file filtered by find.

How to randomly name file when using find exec in a bash script?

When it comes to quickly converting a bunch of files and randomly renaming them I use a pretty simple way to do so with a for loop:
for i in *; do convert [...] $i ../output/$RANDOM.jpg; done
Easy as that. The details of what ImageMagick does here aren't important. It works as intended; it's just about how to handle the bash stuff.
In my current case, the folder doesn't only contain photos; it also contains subfolders with other photos themselves. The expected behavior is again that all photos are randomly renamed and the output files are merged into a single folder.
Since I don't know a way to recursively loop with for, I use a find construct here.
find . \( -iname "*.jpg" -or -iname "*.png" \) -exec convert [...] {} ../output/$RANDOM.jpg \;
The problem is that $RANDOM only gets evaluated once, so it stays the same over the whole process and the images get overwritten again and again. So in fact the output folder ends up with only one image, the one that was processed last.
So the question is:
How do I get the $RANDOM variable to change with each new file?
Kind regards!
Throw it into a loop.
find . \( -iname "*.jpg" -or -iname "*.png" \) -type f -print0 |
while IFS= read -r -d '' f
do convert [...] "$f" ../output/$RANDOM.jpg # copied mostly from your find above
done
The -print0 and read -d '' are unnecessary if you never have embedded newlines in your filenames.
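To see why the NUL delimiters matter, here is a sandbox demo with a filename that actually contains a newline (the convert call from the answer is replaced by a simple counter):

```shell
#!/bin/bash
# A filename containing a newline survives find -print0 / read -d ''
# intact; line-based reading would split it into two bogus names.
d=$(mktemp -d)
touch "$d/plain.jpg" "$d/with
newline.jpg"

find "$d" -name '*.jpg' -type f -print0 > "$d/list"
count=0
while IFS= read -r -d '' f; do
    count=$((count + 1))
done < "$d/list"
echo "count=$count"
```

Both files are counted exactly once; a plain `find | while read f` loop would have reported three "names" here.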
Don't use find at all; just use the globstar option.
shopt -s globstar
for f in **/*.jpg **/*.png; do
convert [...] "$f" ../output/$RANDOM.jpg
done
I would go with a shell loop as detailed in the other answers, but it's still useful to know how to run arbitrary shell code like $RANDOM in a find -exec command. You do it by running a shell:
find . \( -iname "*.jpg" -or -iname "*.png" \) \
-exec bash -c 'convert [...] "$1" "../output/$RANDOM.jpg"' _ {} \;
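A sandbox check that each bash -c invocation gets its own $RANDOM value (the convert part is replaced by echoing the value, since the [...] options are elided in the question):

```shell
#!/bin/bash
# Each -exec ... \; spawns a fresh bash, so $RANDOM is re-seeded per
# file; here we print the values instead of running convert.
d=$(mktemp -d)
touch "$d/a.jpg" "$d/b.png"
out=$(find "$d" \( -iname '*.jpg' -o -iname '*.png' \) \
    -exec bash -c 'echo "$RANDOM"' _ {} \;)
echo "$out"
```

One value is printed per file. Note that $RANDOM only has 32768 possible values, so with many files you may still want a collision-proof scheme such as mktemp.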

Bash script to move all png files in folder and its subfolders to another directory?

In ~/Desktop/a/ , I have .png files, and there are also subfolders within this that also have .png files.
I'd like to move all of those .png files to another folder.
This is my code so far. It runs, but nothing is placed into the target folder. What is the problem?
#!/bin/bash
cd ~/Desktop/a/
for f in $(find . -type f -name "*.png")
do
mv $f ~/Desktop/new/
done
I guess that these image filenames may include spaces or other special characters.
find ~/Desktop/a/ -type f -name "*.png" -exec mv "{}" ~/Desktop/new/ \;
or
find ~/Desktop/a/ -type f -name "*.png" -print0 | xargs -0 -I{} mv "{}" ~/Desktop/new/
If your bash is new enough, you can also use globstar:
cd ~/Desktop/a || exit 1
shopt -s globstar
mv -- **/*.png ~/Desktop/new
Or (if there are too many files to fit in a single command line):
shopt -s globstar
for f in ~/Desktop/a/**/*.png; do
mv -- "$f" ~/Desktop/new
done

BASH: find and rename files & directories

I would like to replace :2f with a - in all file/dir names and for some reason the one-liner below is not working, is there any simpler way to achieve this?
Directory name example:
AN :2f EXAMPLE
Command:
for i in $(find /tmp/ \( -iname ".*" -prune -o -iname "*:*" -print \)); do { mv $i $(echo $i | sed 's/\:2f/\-/pg'); }; done
You don't have to parse the output of find:
find . -depth -name '*:2f*' -execdir bash -c 'echo mv "$0" "${0//:2f/-}"' {} \;
We're using -execdir so that the command is executed from within the directory containing the found file. We're also using -depth so that the content of a directory is considered before the directory itself. All this to avoid problems if the :2f string appears in a directory name.
As is, this command is harmless and won't perform any renaming; it'll only show on the terminal what's going to be performed. Remove echo if you're happy with what you see.
This assumes you want to perform the renaming for all files and folders (recursively) in current directory.
-execdir might not be available for your version of find, though.
If your find doesn't support -execdir, you can get along without as so:
find . -depth -name '*:2f*' -exec bash -c 'dn=${0%/*} bn=${0##*/}; echo mv "$dn/$bn" "$dn/${bn//:2f/-}"' {} \;
Here, the trick is to separate the directory part from the filename part—that's what we store in dn (dirname) and bn (basename)—and then only change the :2f in the filename.
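The dirname/basename split can be checked on its own with a sample path (Bash, since ${bn//:2f/-} is a bashism):

```shell
#!/bin/bash
# Splitting a path into directory and file parts, then rewriting only
# the file part, as in the -exec bash -c command above.
p='./AN :2f DIR/photo :2f.jpg'
dn=${p%/*} bn=${p##*/}
new="$dn/${bn//:2f/-}"
echo "$new"
```

The :2f in the directory part is left alone; only the filename component is rewritten, which is exactly why processing depth-first one component at a time is safe.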
Since you have filenames containing space, for will split these up into separate arguments when iterating. Pipe to a while loop instead:
find /tmp/ \( -iname ".*" -prune -o -iname "*:*" -print \) | while read -r i; do
mv "$i" "$(echo "$i" | sed 's/:2f/-/g')"
done
Also quote all the variables and command substitutions.
This will work as long as you don't have any filenames containing newline.

Bash: recursively copy and rename files

I have a lot of files whose names end with '_100.jpg'. They are spread across nested folders/sub-folders. Now I want a trick to recursively copy and rename all of them to have a suffix of '_crop.jpg'. Unfortunately I'm not familiar with bash scripting, so I don't know the exact way to do this. I googled and tried the 'find' command with the '-exec' option but with no luck.
Please help me. Thanks.
find bar -iname "*_100.jpg" -printf 'mv %p %p\n' \
| sed 's/_100\.jpg$/_crop\.jpg/' \
| while read l; do eval $l; done
if you have bash 4
shopt -s globstar
for file in **/*_100.jpg; do
echo mv "$file" "${file/_100.jpg/_crop.jpg}"
done
or using find
find . -type f -iname "*_100.jpg" | while read -r FILE
do
echo mv "${FILE}" "${FILE/_100.jpg/_crop.jpg}"
done
This uses a Perl script that you may have already on your system. It's sometimes called prename instead of rename:
find /dir/to/start -type f -iname "*_100.jpg" -exec rename 's/_100/_crop/' {} \;
You can make the regexes more robust if you need to protect filenames that have "_100" repeated or in parts of the name you don't want changed.
