uuidgen and $RANDOM don't change in find -exec argument - bash

I want to find all instances of a file in my macOS file system and copy them into a single folder on an external hard disk.
I wrote a simple one-liner in Terminal, but when I execute it, there is only one file in the target folder, overwritten at every occurrence it finds.
It seems that the $RANDOM or $(uuidgen) used in a single command returns only one value, reused for every occurrence {} of the find command.
Is there a way to get a new value for every result of the find command?
Thank you.
find . -iname test.txt -exec cp {} /Volumes/EXT/$(uuidgen) \;
or
find . -iname test.txt -exec cp {} /Volumes/EXT/$RANDOM \;

This should work:
find ... -exec bash -c 'cp "$1" /Volumes/somewhere/$(uuidgen)' _ {} \;
Thanks to dan and pjh for corrections in comments.

find . -iname test.txt -exec bash -c '
for i do
cp "$i" "/Volumes/EXT/$RANDOM"
done' _ {} +
You can use -exec with + to pass multiple files to a single bash loop. You can't use command substitutions (or multiple commands at all) directly in -exec: the calling shell expands $(uuidgen) or $RANDOM once, before find ever runs. (Also note that $RANDOM only ranges over 0-32767, so with many files name collisions and silent overwrites are possible; uuidgen is safer.)

If you've got Bash 4.0 or later, another option is:
shopt -s dotglob
shopt -s globstar
shopt -s nocaseglob
shopt -s nullglob
for testfile in **/test.txt; do
cp -- "$testfile" "/Volumes/EXT/$(uuidgen)"
done
shopt -s dotglob enables globs to match files and directories that begin with . (e.g. .dir/test.txt)
shopt -s globstar enables the use of ** to match paths recursively through directory trees
shopt -s nocaseglob causes globs to match in a case-insensitive fashion (like find option -iname versus -name)
shopt -s nullglob makes globs expand to nothing when nothing matches (otherwise they expand to the glob pattern itself, which is almost never useful in programs)
The -- in cp -- ... prevents paths that begin with hyphens (e.g. -dir/test.txt) from being (mis)treated as options to cp
Note that this code might fail on versions of Bash prior to 4.3, because those versions follow symlinks while expanding ** patterns

Related

Glob for file suffix not matching

I have a Unix command that finds .mov files that have a corresponding .jpg file:
find . -name '*.[Mm][Oo][Vv]' -exec sh -c '
for mov; do
for jpg in "${mov%.*}".[Jj][Pp][Gg]; do
if test -f "$jpg"; then
echo "$mov"
fi
break
done
done' sh {} +
The current code just searches for .jpg (or uppercase) as the file extension, but I need to extend this to also support files that end with ".jpeg".
I modified the code to say:
for jpg in "${mov%.*}".[Jj][Pp][Ee]?[Gg]; do
which I believed should make it possible to have an optional "E or e", but this does not work.
I was able to use this instead
for jpg in "${mov%.*}".[Jj][Pp]*[Gg]; do
which is not very safe, because it will accept a lot more than e and E in that position.
Any ideas how to modify the expression to add the optional e/E to the glob?
The extglob feature suffices for this. Running shopt -s extglob when using bash (not sh) will let you use ?([Ee]) to refer to zero-or-one instances of [Ee].
Even better, while we're setting shopt flags, we can set nocaseglob so you can use *.jp?(e)g, without the explicit character classes. (The find equivalent for this is changing -name to -iname, which the following does in addition).
find . -iname '*.mov' -exec bash -c '
shopt -s extglob nocaseglob
for mov; do
for jpg in "${mov%.*}".jp?(e)g; do
if test -f "$jpg"; then
printf "%s\n" "$mov"
fi
break
done
done' bash {} +
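A quick scratch-directory check of the ?([Ee]) idea (file names invented for the demo): with extglob and nocaseglob set, *.jp?(e)g takes .jpg and .jpeg in any case but rejects other letters in that slot:

```shell
cd "$(mktemp -d)"
touch photo.jpg scan.JPEG oops.jpog
shopt -s extglob nocaseglob nullglob
matches=(*.jp?(e)g)
printf '%s\n' "${matches[@]}"   # photo.jpg and scan.JPEG, not oops.jpog
```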

Strip ./ from filename in find -execdir

Whole story: I am writing a script that will link all files from one directory to another. The new file name will contain the original directory name. At the moment I use find with the -execdir option.
This is how I want to use it:
./linkPictures.sh 2017_wien 2017/10
And it will create a symbolic link 2017_wien_picture.jpg in 2017/10 pointing to a file 2017_wien/picture.jpg.
This is my current script:
#!/bin/bash
UPLOAD="/var/www/wordpress/wp-content/uploads"
SOURCE="$UPLOAD/photo-gallery/$1/"
DEST="$UPLOAD/$2/"
find $SOURCE -type f -execdir echo ln -s {} $DEST/"$1"_{} ";"
It prints:
ln -s ./DSC03278.JPG /var/www/wordpress/wp-content/uploads/2017/10/pokus_./DSC03278.JPG
This is what I want:
ln -s ./DSC03278.JPG /var/www/wordpress/wp-content/uploads/2017/10/pokus_DSC03278.JPG
How can I implement this? I do not know how to incorporate basename to strip the ./.
To run basename on {} you would need to execute a command through sh:
find "$SOURCE" -type f -execdir sh -c "echo ln -s '{}' \"$DEST/${1}_\$(basename \"{}\")\"" ";"
This won't win any speed contests (because of the sh for every file), but it will work.
All the quoting may look a bit crazy, but it's needed to make it safe for files that contain spaces. (Embedding {} inside the script string is still fragile for names that contain quotes; passing {} as an argument, as in the next answer, avoids that.)
You can use this find with bash -c:
find "$SOURCE" -type f -execdir bash -c 'echo ln -s "$3" "$2/${1}_${3#./}"' - "$1" "$DEST" '{}' \;
${3#./} will strip the leading ./ from each entry of find's output.
$1 and $DEST are passed as-is into the bash -c command line as positional parameters ($1 and $2 inside the script), since variables inside the single-quoted script are not expanded by the outer shell.
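The prefix-stripping trick is plain parameter expansion, easy to check in isolation (sample name taken from the question's output):

```shell
# "#" removes the shortest prefix matching the pattern, so the ./
# that -execdir prepends disappears without calling basename.
entry='./DSC03278.JPG'
stripped="${entry#./}"
echo "$stripped"              # DSC03278.JPG
echo "2017_wien_$stripped"    # 2017_wien_DSC03278.JPG
```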
If you have a large number of files to process, I suggest this while loop using process substitution, which avoids spawning a new bash for every file. Paired with -print0 and read -d '', it also handles filenames with whitespace, newlines, and other special characters:
while IFS= read -r -d '' file; do
echo ln -s "$file" "$DEST/${1}_${file#./}"
done < <(find "$SOURCE" -type f -print0)

Delete all files and directories but certain ones using Bash

I'm writing a script that needs to erase everything from a directory except two directories, mysql and temp.
I've tried this:
ls * | grep -v mysql | grep -v temp | xargs rm -rf
but this also keeps all the files that have mysql in their name, which I don't need. It also doesn't delete any other directories.
Any ideas?
You may try:
rm -rf !(mysql|temp)
This uses bash's extended globbing (extglob, a syntax borrowed from ksh):
Glob patterns can also contain pattern lists. A pattern list is a sequence
of one or more patterns separated by |. ... The following list
describes valid sub-patterns.
...
!(pattern-list):
Matches any string that does not match the specified pattern-list.
...
Note: please take time to test it first! Either create a test folder, or simply echo the expansion, as duly noted by @mnagel:
echo !(mysql|temp)
Adding useful information: if the matching is not active, you can enable/disable it with:
shopt extglob # shows extglob status
shopt -s extglob # enables extglob
shopt -u extglob # disables extglob
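A dry run of the negation pattern in a scratch directory (made-up names) makes the semantics concrete: only entries named exactly mysql or temp are excluded, so a file like mysql.conf still matches and would be deleted:

```shell
cd "$(mktemp -d)"
mkdir mysql temp keepers
touch mysql.conf notes.txt
shopt -s extglob
printf '%s\n' !(mysql|temp)   # keepers, mysql.conf, notes.txt
```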
This is usually a job for find. Try the following command (add -rf to the rm if you need a recursive delete):
find . -maxdepth 1 \! \( -name mysql -o -name temp \) -exec rm '{}' \;
(That is, find entries in . but not in subdirectories, that are not [named mysql or named temp], and call rm on them.)
You can use find, ignore mysql and temp, and then rm -rf them.
find . -mindepth 1 -maxdepth 1 ! -iname mysql ! -iname temp -exec rm -rf {} \;
(-mindepth 1 keeps . itself out of the rm -rf; -maxdepth 1 stops find from descending into the directories being kept.)

bash rename files issue?

I know nothing about Linux commands or bash scripts, so please help me.
I have a lot of files in different directories and I want to rename all those files from "name" to "name.xml" using a bash script. Is it possible to do that? I've only found code on the internet that doesn't work, like this:
shopt -s globstar # enable ** globstar/recursivity
for i in **/*.txt; do
echo "$i" "${i/%.txt}.xml";
done
It does not even work.
For this purpose the prename utility comes in handy; it is installed by default on many Linux distributions, usually distributed with the Perl package. You can use it like this:
find . -iname '*.txt' -exec prename 's/\.txt$/.xml/' {} \;
or this faster alternative (with -print0/-0 so names containing whitespace are safe):
find . -iname '*.txt' -print0 | xargs -0 prename 's/\.txt$/.xml/'
Explanation
Move/rename all files, whatever the extension is, in the current directory and below from name to name.xml. You should test using echo before running the real script.
shopt -s globstar # enable ** globstar/recursivity
for i in **; do # **/*.txt will look only for .txt files
[[ -d "$i" ]] && continue # skip directories
echo "$i" "$i.xml"; # replace 'echo' by 'mv' when validated
#echo "$i" "${i/%.txt}.xml"; # replace .txt by .xml
done
Seeing **/*.txt **/*.xml in the output effectively means there are no files matching the given pattern, as by default bash uses the verbatim pattern if no matches are found.
To prevent this issue you'd additionally have to set shopt -s nullglob, so that bash just returns nothing when there is no match at all.
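The fallback is easy to see in an empty scratch directory; nothing here is specific to the rename task:

```shell
cd "$(mktemp -d)"          # empty: no .txt files anywhere
shopt -u nullglob
for f in *.txt; do echo "got: $f"; done   # got: *.txt  (the literal pattern)
shopt -s nullglob
for f in *.txt; do echo "got: $f"; done   # no output: loop body never runs
```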
After verifying the echoed lines look somewhat reasonable you'll have to replace
echo "$i" "${i/%.txt}.xml"
with
mv "$i" "${i/%.txt}.xml"
to rename the files.
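The ${i/%.txt} expansion can be verified on its own (sample paths invented): the /% anchors the pattern to the end of the string, so only a trailing .txt is removed:

```shell
i='docs/report.txt'
echo "${i/%.txt}.xml"    # docs/report.xml
j='archive.txt.txt'
echo "${j/%.txt}.xml"    # archive.txt.xml  (only the final .txt is replaced)
```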
You can use this bash script.
#!/bin/bash
DIRECTORY=/your/base/dir/here
find "$DIRECTORY" -type f -name '*.txt' -print0 |
while IFS= read -r -d '' i; do
mv -- "$i" "$i.xml"
done

command line : ignore particular file while using wildcards

Consider that I have lots of shell scripts in a folder named test. I want to execute all the files in it except one particular file. What do I do? Relocating the file or executing the files one after another manually is not an option. Is there any way I could do this in a single line? Or perhaps by adding something to sh path/to/test/*.sh, which executes all files?
for file in test/*; do
[ "$file" != "test/do-not-run.sh" ] && sh "$file"
done
If you are using bash, you can use extended patterns to skip the undesired script:
shopt -s extglob
for file in test/!(do-not-run).sh; do
sh "$file"
done
for FILE in "$YOURPATH"/*; do
test "$(basename "$FILE")" != "do-not-run.sh" && sh "$FILE";
done
find path/to/test -name "*.sh" \! -name "$pattern_for_unwanted_scripts" -exec sh {} \;
find will recursively run every entry in the directory that ends in .sh (-name "*.sh") and doesn't match the unwanted pattern (\! -name "$pattern_for_unwanted_scripts").
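A dry run of the find-based exclusion in a scratch directory (invented script names, echo standing in for actually running anything):

```shell
cd "$(mktemp -d)"
touch run-me.sh also-run.sh do-not-run.sh notes.txt
find . -name '*.sh' ! -name do-not-run.sh -exec echo sh {} \;
# prints "sh ./run-me.sh" and "sh ./also-run.sh" (order may vary)
```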
In bash, provided you do shopt -s extglob, you can use "extended globbing", which allows !(pattern-list) to match anything except one of the given patterns.
In your case:
shopt -s extglob
for f in !(do-not-run.sh); do if [ "${f##*.}" == "sh" ]; then sh "$f"; fi; done
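The ${f##*.} expansion used for the extension check, verified on sample paths (names invented for the demo):

```shell
# "##" removes the longest prefix matching "*.", leaving the extension.
f='test/deploy.sh'
echo "${f##*.}"    # sh
g='test/README'
echo "${g##*.}"    # test/README  (no dot, so nothing is removed)
```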
