BASH: find and rename files & directories

I would like to replace :2f with a - in all file/dir names. For some reason the one-liner below is not working; is there a simpler way to achieve this?
Directory name example:
AN :2f EXAMPLE
Command:
for i in $(find /tmp/ \( -iname ".*" -prune -o -iname "*:*" -print \)); do { mv $i $(echo $i | sed 's/\:2f/\-/pg'); }; done

You don't have to parse the output of find:
find . -depth -name '*:2f*' -execdir bash -c 'echo mv "$0" "${0//:2f/-}"' {} \;
We're using -execdir so that the command is executed from within the directory containing the found file. We're also using -depth so that the content of a directory is considered before the directory itself. All this to avoid problems if the :2f string appears in a directory name.
As is, this command is harmless and won't perform any renaming; it'll only show on the terminal what's going to be performed. Remove echo if you're happy with what you see.
This assumes you want to perform the renaming for all files and folders (recursively) in current directory.
-execdir might not be available for your version of find, though.
If your find doesn't support -execdir, you can get along without it, like so:
find . -depth -name '*:2f*' -exec bash -c 'dn=${0%/*} bn=${0##*/}; echo mv "$dn/$bn" "$dn/${bn//:2f/-}"' {} \;
Here, the trick is to separate the directory part from the filename part—that's what we store in dn (dirname) and bn (basename)—and then only change the :2f in the filename.
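For a concrete path such as ./sub/AN :2f EXAMPLE (a made-up layout based on the question's example name), the pieces expand roughly like this:
dn=${0%/*}     # -> ./sub
bn=${0##*/}    # -> AN :2f EXAMPLE
mv "./sub/AN :2f EXAMPLE" "./sub/AN - EXAMPLE"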

Since you have filenames containing spaces, the for loop will split them into separate words when iterating. Pipe to a while loop instead:
find /tmp/ \( -iname ".*" -prune -o -iname "*:*" -print \) | while IFS= read -r i; do
mv "$i" "$(echo "$i" | sed 's/:2f/-/g')"
done
Also quote all the variables and command substitutions.
This will work as long as you don't have any filenames containing newline.
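If newlines in file names are a concern, a null-delimited variant of the same loop should work; this is just a sketch that keeps the question's find expression and assumes GNU find's -print0 together with bash's read -d '':
find /tmp/ \( -iname ".*" -prune -o -iname "*:*" -print0 \) | while IFS= read -r -d '' i; do
mv "$i" "${i//:2f/-}"
done
The bash substitution ${i//:2f/-} replaces the sed call, since piping a name that contains a newline through echo and sed would mangle it anyway.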

Related

Find and rename directories on the basis of specific characters

My directory tree looks like this:
../files/Italy.Pictures.Tom.Canon.2017-April/
../files/Italy.Videos.Marie.Sony.2017-April/
../files/Spain.Pictures.Food.John.iPhone.2018-September/
and so on..
My code:
#!/bin/bash
DIR="/home/user/files/"
find $DIR -depth -name "* *" -execdir rename 's/ /./g' "{}" \; # replace space with dot
find $DIR -depth -iname "*.iphone*" -execdir rename 's/.iphone//ig' "{}" \; # remove 'iPhone' from dir name
find $DIR -depth -iname "*.john*" -execdir rename 's/.john//ig' "{}" \;
find $DIR -depth -iname "*.tom*" -execdir rename 's/.tom//ig' "{}" \;
find $DIR -depth -iname "*-april*" -execdir rename 's/-april//ig' "{}" \;
find $DIR -depth -iname "*-september*" -execdir rename 's/-september//ig' "{}" \;
and more commands like this for all names, month,..
Yes, this works!
But: is this the best way to remove/replace characters in directory names? Any suggestions to make my script more efficient? Maybe by putting all the words that should be removed into a list?
Thanks for your thoughts!
Personally, I'd prefer using a for loop with sed and mv to rename the directories, instead of using find and rename. For example:
#!/usr/bin/env bash
for dir in ./*/; do
newdir=$(sed 's/-.*$//' <<< "$dir" | sed 's/\.\(iphone\|tom\|john\)//gi')
mv "$dir" "$newdir"
done
The first sed removes the month suffix (everything from the dash onwards). The second sed removes the names, and it can be extended by adding further names to the alternation.
I don't know if it's "the best" way to do the job, but I find it quite efficient and easy to maintain. I hope you like this method as well.
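To follow up on the "put all words in a list" idea from the question, the alternation could be built from an array so new names only have to be added in one place. A sketch, assuming GNU sed (for -E and the i flag); the names array is only illustrative:
#!/usr/bin/env bash
names=(iphone tom john sony canon marie)    # words to strip from directory names
pattern=$(IFS='|'; echo "${names[*]}")      # joins to iphone|tom|john|sony|canon|marie
for dir in ./*/; do
newdir=$(sed 's/-.*$//' <<< "$dir" | sed -E "s/\.($pattern)//gi")
mv "$dir" "$newdir"
done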

Removing files with/without spaces in filename

Hello Stack Overflow community,
I'm facing a problem with removing files that contain spaces in the filename. I have this piece of code, which is responsible for deleting files we get from a directory:
for f in $(find $REP -type f -name "$Filtre" -mtime +${DelAvtPurge})
do
rm -f $f
done
I know that single or double quotes work for deleting files with spaces; they work when I try them on the command line, but when I put them around $f in the script it doesn't work at all.
Could anybody help me find a solution for this?
GNU find has -delete for that:
find "$REP" -type f -name "$Filtre" -mtime +"$DelAvtPurge" -delete
With any other find implementation, you can use bulk-exec:
find "$REP" -type f -name "$Filtre" -mtime +"$DelAvtPurge" -exec rm -f {} +
For a dry run, drop -delete from the first command and look at the list of files that would be deleted; for the second, insert echo before rm.
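Spelled out, the two dry-run forms would look like this (a sketch; $REP, $Filtre and $DelAvtPurge come from the question):
find "$REP" -type f -name "$Filtre" -mtime +"$DelAvtPurge"                          # just lists the matches
find "$REP" -type f -name "$Filtre" -mtime +"$DelAvtPurge" -exec echo rm -f {} +    # prints the rm commands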
The other answer has shown how to do this properly. But fundamentally the issue in your command is the lack of quoting, due to the way the shell expands variables:
rm -f $f
needs to become
rm -f "$f"
In fact, always quoting your variables is safe and generally a good idea.
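A quick way to see the difference is to print the arguments a command actually receives; this small illustration is not from the original answer:
f='some file.txt'
printf '<%s>\n' $f      # unquoted: word-split into <some> and <file.txt>
printf '<%s>\n' "$f"    # quoted: a single argument, <some file.txt>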
However, this will not fix your code. Now filenames with spaces will work, but filenames with other valid characters (to wit, newlines) won’t. Try it:
touch foo$'\n'bar
for f in $(find . -maxdepth 1 -name foo\*); do echo "rm -f $f"; done
Output:
rm -f ./foo
rm -f bar
Clearly that won't do. In fact, you mustn't parse the output of find, for this reason. The only way of making this safe, apart from the solution via find -exec, is to use the -print0 option:
find "$REP" -type f -name "$Filtre" -mtime +"$DelAvtPurge" -print0 \
| while IFS= read -r -d '' f; do
rm -f "$f"
done
Using -print0 instead of (implicit) -print causes find to delimit hits by the null character instead of newline. Correspondingly, IFS= read -r -d '' reads a null-character delimited input string, which we do in a loop using while (the -r option prevents read from interpreting backslashes as escape sequences).

Rename all files in directory and (deeply nested) sub-directories

What is the shell command for renaming all files in a directory and sub-directory (recursively)?
I would like to add an underscore to all the files ending with *scss from filename.scss to _filename.scss in all the directories and sub-directories.
I have found answers relating to this, but most if not all require you to know the filename itself. That doesn't work for me, because the filenames differ, there are too many to know by heart or type manually, and some of them are deeply nested in directories.
Edit: I was under the impression that the bash -c bit was somehow necessary for multiple expansion of the found element; anubhava's answer proved me wrong. I am leaving that bit in the answer for now as it worked for the OP.
find . -type f -name '*scss' -exec bash -c 'mv "$1" "_$1"' -- {} \;
find . -- find in current directory (recursively)
-type f -- files
-name '*scss' -- matching the pattern *scss (quoted so the shell doesn't expand it before find sees it)
-exec -- execute for each element found
bash -c '...' -- execute command in a subshell
-- -- end option parsing
{} -- expands to the name of the element found (which becomes the positional parameter for the bash -c command)
\; -- end the -exec command
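Since find . emits paths like ./dir/file.scss, prefixing the whole path (_$1) can point mv at a non-existent _./ directory; a variant that prefixes only the file name part is sketched below (the parameter expansions are my addition):
find . -type f -name '*scss' -exec bash -c 'for f; do mv "$f" "${f%/*}/_${f##*/}"; done' bash {} +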
You can use -execdir option here:
find ./src/components -iname "*.scss" -execdir mv {} _{} \;
You are close to a solution:
find ./src/components -iname "*.scss" -print0 | xargs -0 -n 1 sh -c 'mv "$1" "${1%/*}/_${1##*/}"' sh
In this approach, the "loop" is executed by xargs, and the small sh -c wrapper prefixes the underscore to the file name rather than to the whole path. I prefer this solution over the usage of -exec in find; the syntax is clear to me.
Also, if you want to repeat the command and avoid double-adding the underscore to the already processed files, use a regexp to get only the files not yet processed:
find ./src/components -iregex ".*/[^_][^/]*\.scss" -print0 | xargs -0 -n 1 sh -c 'mv "$1" "${1%/*}/_${1##*/}"' sh
By adding the -print0/-0 options, you also avoid problems with whitespace in file names.
#!/bin/sh
EXTENSION='.scss'
cd YOURDIR
find . -type f | while IFS= read -r LINE; do
FILE="$( basename "$LINE" )"
case "$LINE" in
*"$EXTENSION")
DIRNAME="$( dirname "$LINE" )"
mv -v "$DIRNAME/$FILE" "$DIRNAME/_$FILE"
;;
esac
done

Shell generic equivalent of Bash Substring replacement ${foo/a/b}

Is there a shell-independent equivalence of Bash substring replacement:
foo=Hello
echo ${foo/o/a} # will output "Hella"
Most of the time I can use bash, so that is not a problem; however, when combined with find -exec it does not work. For instance, to rename all .cpp files to .c, I'd like to use:
# does not work
find . -name '*.cpp' -exec mv {} {/.cpp$/.c}
For now, I'm using:
# does work, but longer
while read file; do
mv "$file" "${file/.cpp$/.c}";
done <<< $(find . -name '*.cpp')
Ideally a solution that could be used in scripts is better!
Using find and -exec you can do this:
find . -name '*.cpp' -exec bash -c 'f="$1"; mv "$f" "${f/.cpp/.c}"' - '{}' \;
However, this forks bash -c for each filename, so using xargs or a while loop like this is better for performance reasons:
while IFS= read -d '' -r file; do
mv "$file" "${file/.cpp/.c}"
done < <(find . -name '*.cpp' -print0)
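The xargs route mentioned above would look something like this (a sketch; it batches file names so bash is forked once per batch rather than once per file):
find . -name '*.cpp' -print0 | xargs -0 bash -c 'for f; do mv "$f" "${f%.cpp}.c"; done' bash
Here ${f%.cpp}.c strips the .cpp suffix and appends .c, so the replacement is anchored to the end of the name.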
By the way, an alternative to using bash would be to use rename. If you have the Perl-based version of the rename command, you can do:
find -name '*.cpp' -exec rename 's/\.cpp$/.c/' {} +
The above example assumes that you have GNU findutils, in which case you don't need to pass the current directory since it is the default. If you don't have GNU findutils, you need to pass it explicitly:
find . -name '*.cpp' -exec rename 's/\.cpp$/.c/' {} +

Looping through all files in a given directory [duplicate]

Here is what I'm trying to do:
Give a parameter to a shell script that will run a task on all files of jpg, bmp, tif extension.
Eg: ./doProcess /media/repo/user1/dir5
and all jpg, bmp, tif files in that directory will have a certain task run on them.
What I have now is:
for f in *
do
imagejob "$f" "output/${f%.output}" ;
done
I need help with the for loop to restrict the file types and also have some way of starting under a specified directory instead of current directory.
Use shell expansion rather than ls
for file in *.{jpg,bmp,tif}
do
imagejob "$file" "output/${file%.output}"
done
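To start under the directory given as the script's first argument (as asked in the question) and to avoid a literal *.jpg turning up when a pattern matches nothing, something like the following sketch could work; nullglob and the cd guard are my additions:
#!/bin/bash
shopt -s nullglob          # unmatched patterns expand to nothing instead of themselves
cd "$1" || exit 1          # the directory passed to the script
for file in *.jpg *.bmp *.tif; do
imagejob "$file" "output/${file%.output}"
done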
also if you have bash 4.0+, you can use globstar
shopt -s globstar
shopt -s nullglob
shopt -s nocaseglob
for file in **/*.{jpg,bmp,tif}
do
# do something with $file
done
for i in `ls $1/*.jpg $1/*.bmp $1/*.tif`; do
imagejob "$i";
done
This is assuming you're using a bashlike shell where $1 is the first argument given to it.
You could also do:
find "$1" -iname "*.jpg" -or -iname "*.bmp" -or -iname "*.tif" \
-exec imagejob \{\} \;
You could use a construct with backticks and ls (or any other command, of course):
for f in `ls *.jpg *.bmp *.tif`; do ...; done
The other solutions here are either Bash-only or recommend the use of ls in spite of it being a common and well-documented antipattern. Here is how to solve this in POSIX sh without ls:
for file in *.jpg *.bmp *.tif; do
... stuff with "$file"
done
If you have a very large number of files, perhaps you also want to look into
find . -maxdepth 1 -type f \( \
-name '*.jpg' -o -name '*.bmp' -o -name '*.tif' \) \
-exec stuff with {} +
which avoids the overhead of sorting the file names alphabetically. The -maxdepth 1 says to not recurse into subdirectories; obviously, take it out or modify it if you do want to recurse into subdirectories.
The -exec ... + predicate of find is a relatively new introduction; if your find is too old, you might want to use -exec ... \; or replace the -exec stuff with {} + with
find ... -print0 |
xargs -r0 stuff with
where, however, the -print0 option and the corresponding -0 option for xargs are again GNU extensions.
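Put together for the image task from the question, that fallback might look like this (a sketch; it assumes GNU find/xargs as above and that imagejob takes one file per call):
find "$1" -maxdepth 1 -type f \( -name '*.jpg' -o -name '*.bmp' -o -name '*.tif' \) -print0 |
xargs -r0 -n 1 imagejob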
