I have a Unix command that finds .mov files which have a corresponding .jpg file:
find . -name '*.[Mm][Oo][Vv]' -exec sh -c '
for mov; do
for jpg in "${mov%.*}".[Jj][Pp][Gg]; do
if test -f "$jpg"; then
echo "$mov"
fi
break
done
done' sh {} +
The current code only looks for a .jpg (or uppercase) file extension, but I need to extend it to also match files that end with ".jpeg".
I modified the code to say:
for jpg in "${mov%.*}".[Jj][Pp][Ee]?[Gg]; do
which I believed would allow an optional "E or e", but this does not work.
I was able to use this instead
for jpg in "${mov%.*}".[Jj][Pp]*[Gg]; do
which is not very safe because it will accept a lot more than e and E in that position.
Any ideas how to modify the expression to add the optional e/E to the pattern?
The extglob feature suffices for this. Running shopt -s extglob when using bash (not sh) lets you use ?([Ee]) to match zero or one instance of [Ee].
Even better, while we're setting shopt flags, we can also set nocaseglob, so you can write *.jp?(e)g without the explicit character classes. (The find equivalent is changing -name to -iname, which the following does as well.)
find . -iname '*.mov' -exec bash -c '
shopt -s extglob nocaseglob
for mov; do
for jpg in "${mov%.*}".jp?(e)g; do
if test -f "$jpg"; then
printf "%s\n" "$mov"
fi
break
done
done' bash {} +
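As a quick sanity check of what ?( ) accepts, here's a throwaway demo (the file names are invented for illustration):

```shell
# Demo of the ?(e) extglob: matches zero or one "e" between "jp" and "g".
shopt -s extglob nocaseglob
cd "$(mktemp -d)"
touch clip.jpg scene.JPEG thumb.jpeeg
for f in *.jp?(e)g; do echo "$f"; done
# matches clip.jpg and scene.JPEG, but not thumb.jpeeg (two e's)
```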
I want to find all instances of a file in my macOS file system and copy them into a single folder on an external hard disk.
I wrote a simple line of code in the terminal, but when I execute it, there is only one file in the target folder, which gets replaced at every occurrence find encounters.
It seems that $RANDOM or $(uuidgen) used in a single command returns only one value, used for every occurrence {} of the find command.
Is there a way to get a new value for every result of the find command?
Thank you.
find . -iname test.txt -exec cp {} /Volumes/EXT/$(uuidgen) \;
or
find . -iname test.txt -exec cp {} /Volumes/EXT/$RANDOM \;
This should work:
find ... -exec bash -c 'cp "$1" /Volumes/somewhere/$(uuidgen)' _ {} \;
Thanks to dan and pjh for corrections in comments.
find . -iname test.txt -exec bash -c '
for i do
cp "$i" "/Volumes/EXT/$RANDOM"
done' _ {} +
You can use -exec with + to pass multiple files to a single bash loop. You can't use command substitutions (or multiple commands at all) directly in -exec: your interactive shell expands $(uuidgen) or $RANDOM once, before find even runs, which is why every copy ended up with the same name.
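The one-expansion problem is easy to demonstrate by echoing instead of copying; $RANDOM here behaves just like $(uuidgen) in the question:

```shell
# $RANDOM is expanded once by the calling shell before find runs,
# so every matched file gets the very same destination name.
d=$(mktemp -d)
mkdir "$d/a" "$d/b"
touch "$d/a/test.txt" "$d/b/test.txt"
find "$d" -name test.txt -exec echo cp {} /Volumes/EXT/$RANDOM \;
# both echoed lines end in the same number
```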
If you've got Bash 4.0 or later, another option is:
shopt -s dotglob
shopt -s globstar
shopt -s nocaseglob
shopt -s nullglob
for testfile in **/test.txt; do
cp -- "$testfile" "/Volumes/EXT/$(uuidgen)"
done
shopt -s dotglob enables globs to match files and directories that begin with . (e.g. .dir/test.txt)
shopt -s globstar enables the use of ** to match paths recursively through directory trees
shopt -s nocaseglob causes globs to match in a case-insensitive fashion (like find option -iname versus -name)
shopt -s nullglob makes globs expand to nothing when nothing matches (otherwise they expand to the glob pattern itself, which is almost never useful in programs)
The -- in cp -- ... prevents paths that begin with hyphens (e.g. -dir/test.txt) from being (mis)treated as options to cp
Note that this code might fail on versions of Bash prior to 4.3, because symlinks are followed while expanding ** patterns
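nullglob in particular is worth a quick demonstration, because the default behavior tends to surprise people:

```shell
cd "$(mktemp -d)"        # empty directory, so nothing can match
shopt -u nullglob
printf '%s\n' *.txt      # prints the literal pattern: *.txt
shopt -s nullglob
printf '%s\n' *.txt      # prints only an empty line (no arguments left)
```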
Whole story: I am writing a script that will link all files from one directory to another. The new file name will contain the original directory name. At the moment I am using find with the -execdir option.
This is how I want to use it:
./linkPictures.sh 2017_wien 2017/10
And it will create a symbolic link 2017_wien_picture.jpg in 2017/10 pointing to a file 2017_wien/picture.jpg.
This is my current script:
#!/bin/bash
UPLOAD="/var/www/wordpress/wp-content/uploads"
SOURCE="$UPLOAD/photo-gallery/$1/"
DEST="$UPLOAD/$2/"
find $SOURCE -type f -execdir echo ln -s {} $DEST/"$1"_{} ";"
It prints:
ln -s ./DSC03278.JPG /var/www/wordpress/wp-content/uploads/2017/10/pokus_./DSC03278.JPG
This is what I want:
ln -s ./DSC03278.JPG /var/www/wordpress/wp-content/uploads/2017/10/pokus_DSC03278.JPG
How can I implement this? I do not know how to incorporate basename into it to strip the leading ./.
To run basename on {} you would need to execute a command through sh:
find "$SOURCE" -type f -execdir sh -c 'echo ln -s "$1" "$2/${3}_$(basename "$1")"' sh {} "$DEST" "$1" ";"
This won't win any speed contests (because of the sh spawned for every file), but it will work.
Passing {} and the variables as arguments to sh, instead of splicing them into the command string, keeps the command safe for file names that contain spaces or quotes.
You can use this find with bash -c:
find "$SOURCE" -type f -execdir bash -c 'echo ln -s "$3" "$2/${1}_${3#./}"' - "$1" "$DEST" '{}' \;
${3#./} strips the leading ./ from each entry of find's output.
$1 (the directory prefix) and $DEST are passed as-is on the bash -c command line.
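As a stand-alone sketch of how positional parameters reach bash -c (the first argument after the script fills $0, which is why a placeholder like - or _ is passed there):

```shell
# $0 is '-', $1 is 'prefix', $2 is './DSC03278.JPG'
bash -c 'echo "${1}_${2#./}"' - prefix ./DSC03278.JPG
# prints: prefix_DSC03278.JPG
```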
If you have a large number of files to process, I suggest this while loop with process substitution, which executes faster since it doesn't spawn a new bash for every file. It also handles filenames with whitespace and other special characters:
while IFS= read -r -d '' file; do
echo ln -s "$file" "$DEST/${1}_${file#./}"
done < <(find "$SOURCE" -type f -print0)
I know nothing about Linux commands or bash scripts, so please help me.
I have a lot of files in different directories, and I want to rename all those files from "name" to "name.xml" using a bash script. Is it possible to do that? I have only found snippets on the internet like this:
shopt -s globstar # enable ** globstar/recursivity
for i in **/*.txt; do
echo "$i" "${i/%.txt}.xml";
done
it does not even work.
The prename utility comes in handy here; it is installed by default on many Linux distributions, usually shipped with the Perl package. You can use it like this:
find . -iname '*.txt' -exec prename 's/\.txt$/.xml/' {} \;
or this much faster alternative:
find . -iname '*.txt' -print0 | xargs -0 prename 's/\.txt$/.xml/'
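If prename isn't installed (it is not part of POSIX), a similar rename can be sketched in plain sh with find; this version assumes you want to turn any-case .txt into .xml:

```shell
# Rename every .txt (any case) under the current directory to .xml,
# stripping the old extension with a suffix pattern.
find . -iname '*.txt' -exec sh -c '
for f; do mv -- "$f" "${f%.[Tt][Xx][Tt]}.xml"; done' sh {} +
```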
Explanation
Move/rename all files, whatever the extension is, in the current directory and below from name to name.xml. You should test using echo before running the real script.
shopt -s globstar # enable ** globstar/recursivity
for i in **; do # **/*.txt will look only for .txt files
[[ -d "$i" ]] && continue # skip directories
echo "$i" "$i.xml"; # replace 'echo' by 'mv' when validated
#echo "$i" "${i/%.txt}.xml"; # replace .txt by .xml
done
Output showing **/*.txt **/*.xml effectively means there are no files matching the given pattern, as by default bash passes the pattern through verbatim when nothing matches.
To prevent this issue you'd have to additionally set shopt -s nullglob to have bash just return nothing when there is no match at all.
After verifying the echoed lines look somewhat reasonable you'll have to replace
echo "$i" "${i/%.txt}.xml"
with
mv "$i" "${i/%.txt}.xml"
to rename the files.
You can use this bash script.
#!/bin/bash
DIRECTORY=/your/base/dir/here
find "$DIRECTORY" -type f -name '*.txt' -exec sh -c '
for f; do mv -- "$f" "$f.xml"; done' sh {} +
Consider that I have lots of shell scripts in a folder named test, and I want to execute all the files in it except one particular file. What do I do? Relocating the file or executing the files one after another manually is not an option. Is there any way I could do this in a single line? Or perhaps something I can add to sh path/to/test/*.sh, which executes all files?
for file in test/*; do
[ "$file" != "test/do-not-run.sh" ] && sh "$file"
done
If you are using bash, you can use extended patterns to skip the undesired script:
shopt -s extglob
for file in test/!(do-not-run).sh; do
sh "$file"
done
for FILE in "$YOURPATH"/*; do
test "${FILE##*/}" != "do-not-run.sh" && sh "$FILE";
done
find path/to/test -name "*.sh" \! -name "$pattern_for_unwanted_scripts" -exec {} \;
Find will recursively execute all entries in the directory which end in .sh (-name "*.sh") and don't match the unwanted pattern (\! -name "$pattern_for_unwanted_scripts"). Note that this requires the scripts to be executable, since -exec runs them directly.
In bash, provided you do shopt -s extglob, you can use "extended globbing", which allows !(pattern-list) to match anything except one of the given patterns.
In your case:
shopt -s extglob
for f in !(do-not-run.sh); do if [[ "${f##*.}" == "sh" ]]; then sh "$f"; fi; done
I have written an executable in c++, which is designed to take input from a file, and output to stdout (which I would like to redirect to a single file). The issue is, I want to run this on all of the files in a folder, and the find command that I am using is not cooperating. The command that I am using is:
find -name files/* -exec ./stagger < {} \;
From looking at examples, it is my understanding that {} replaces the file name. However, I am getting the error:
-bash: {}: No such file or directory
I am assuming that once this is ironed out, in order to get all of the results into one file, I could simply use the pattern Command >> outputfile.txt.
Thank you for any help, and let me know if the question can be clarified.
The problem you are having is that the redirection is processed by your shell before the find command ever runs. You can work around this by spawning another bash process in the -exec call:
find files/* -exec bash -c '/path/to/stagger < "$1"' -- {} \;
The < operator is interpreted as a redirect by the shell prior to running the command. The shell tries redirecting input from a file named {} to find's stdin, and an error occurs if the file doesn't exist.
The argument to -name is unquoted and contains a glob character. The shell applies pathname expansion and gives nonsensical arguments to find.
Filenames can't contain slashes. The argument to -name can't work even if it were quoted. If GNU find is available, -path can be used to specify a glob pattern files/*, but this doesn't mean "files in directories named files", for that you need -regex. Portable solutions are harder.
You need to specify one or more paths for find to start from.
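A minimal reproduction of the failure and of the bash -c fix, with wc -c standing in for the ./stagger binary from the question:

```shell
cd "$(mktemp -d)"
mkdir files
printf data > files/a.txt
# find files -type f -exec wc -c < {} \;   # fails: the shell tries to
#                                          # open a file literally named {}
find files -type f -exec bash -c 'wc -c < "$1"' _ {} \;
# the redirect now happens inside the inner bash, after {} has been
# replaced with files/a.txt, so this prints the byte count 4
```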
Assuming what you really wanted is to have a shell perform the redirect, here's a way with GNU find.
find . -type f -regex '.*files/[^/]*$' -exec sh -c 'for x; do ./stagger <"$x"; done' -- {} +
This is probably the best portable way using find (-depth and -prune won't work for this):
find . -type d -name files -exec sh -c 'for x; do for y in "$x"/*; do [ -f "$y" ] && ./stagger <"$y"; done; done' -- {} +
If you're using Bash, this problem is a very good candidate for just using a globstar pattern instead of find.
#!/usr/bin/env bash
shopt -s extglob globstar nullglob
for x in **/files/*; do
[[ -f "$x" ]] && ./stagger <"$x"
done
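To collect everything into one output file, as the question intends with >>, redirect the whole loop once instead of appending per file; cat stands in for ./stagger here:

```shell
#!/usr/bin/env bash
shopt -s globstar nullglob
for x in **/files/*; do
  [[ -f "$x" ]] && cat "$x"    # substitute ./stagger <"$x" as needed
done > outputfile.txt
```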
Escaping the less-than symbol, by the way, does not work:
find files/* -exec ./stagger \< {} \;
find runs the program directly, without a shell, so the escaped \< and the filename are passed to ./stagger as literal arguments; no redirection takes place. A shell (as in the bash -c answers above) is needed for that.