I have a ton of screenshots on my desktop (it's the default location where they're saved) with titles of the form "Screen Shot 2020-10-11 at 7.08.12 PM.png" and I'd like to use a bash script with regex on the "Screen Shot" bit to move any such file from my desktop into a folder I created called "screenshots".
Right now, I'm playing around with find . -regex Screen*.*?.png but it's not quite working (it gives find: Screen Shot 2020-10-11 at 7.11.09 PM.png: unknown primary or operator).
I'm also not sure how I'd even use the output once it does find all the correct files to move them to a folder. Could you somehow iterate over all files in a folder using for i in seq 1 100 or something of the like?
You don't actually even need -regex here:
find . -maxdepth 1 -type f -name 'Screen Shot*png' -exec echo mv {} screenshots \;
You can run this command safely as it will not do anything but print what it would do. Remove echo to actually run mv.
All options used are documented in man find but in short:
-type f makes find look only for files, not directories. This is useful in case you have a directory that matches -name - we don't want to touch it.
-maxdepth 1 makes find look only at files in the top-level directory - it's useful here because you might already have some files matching -name inside the screenshots directory, and we want to leave them alone.
-name accepts a shell pattern, not a regex. We could of course use -regex here, but I prefer -name because shell patterns are shorter and easier to use here.
{} is a placeholder that will be replaced with the name of each found file.
\; is a literal semicolon, escaped to prevent it from being interpreted by the shell; it marks the end of the command given to -exec.
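With the file names from the question, the dry run prints one mv line per screenshot, something like:
mv ./Screen Shot 2020-10-11 at 7.08.12 PM.png screenshots
mv ./Screen Shot 2020-10-11 at 7.11.09 PM.png screenshots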
Taking the regex at face value (probably a mistake), you should use single quotes around the regex:
find . -regex 'Screen*.*?.png'
This prevents the shell from expanding it, leaving that to find. Then to move the files to the ~/screenshots directory (change the name to match the directory you want to use), if you have GNU mv, you can use:
find . -regex 'Screen*.*?.png' -exec mv -t ~/screenshots {} +
This executes a single mv command to move many files to the target directory, reducing the number of times mv is executed. It might still be executed multiple times, but far fewer times than the alternative.
If you don't have GNU mv with the (very useful) -t option, then you should use:
find . -regex 'Screen*.*?.png' -exec mv {} ~/screenshots ';'
This executes one mv command for each file found, but is more portable.
The primary problem you ran into was that the shell was expanding what you wrote into a list of file names, and then find didn't understand what you meant. Using the quotes prevents the shell from expanding the 'regex'. You can add an echo to the other commands before the mv to see what would be executed.
However, I'm not sure whether you know what your regex matches. It isn't clear that the regex given is a valid regex for find — though it mostly works as a PCRE (Perl-compatible) regular expression. By default, GNU find uses GNU Emacs regular expressions, but you can control the dialect of regular expression it uses. The options available include Emacs, POSIX Awk, POSIX Basic Regular Expressions (BRE), POSIX egrep, and POSIX Extended Regular Expressions (ERE). It doesn't include PCRE. What you supply is more like a shell glob, and the -name operator handles globbing names.
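If you really do want a regex, one possibility (GNU find only, since it relies on -regextype, which BSD/macOS find lacks) is to pick an explicit dialect and remember that -regex matches against the whole path, not just the basename:
find . -regextype posix-extended -regex '\./Screen Shot .*\.png'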
It's quite probable that you should be using the -name operator, using a command along the lines of:
find . -name 'Screen Shot *.png' -exec mv -t ~/screenshots {} +
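As mentioned above, putting echo in front of mv gives a safe preview; using the ';' terminator here prints one line per file, and you drop the echo once the output looks right:
find . -name 'Screen Shot *.png' -exec echo mv -t ~/screenshots {} ';'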
Related
I want to list all .jpg files in all subdirectories using ls.
For the same directory this works fine:
ls *.jpg
However, when using the -R for recursiveness:
ls -R *.jpg
I get:
zsh: no matches found: *.jpg
Why does this not work?
Note: I know it can be done using find or grep but I want to know why the above does not work.
The program ls is not designed to handle patterns by itself.
When you run ls -R *.jpg, the pattern *.jpg is not passed to ls directly. The shell replaces it with a list of all files that match the pattern. (Only if there is no file with a matching name will ls see the literal name *.jpg and fail to find a file of that name.)
Since you are using zsh (with the default setting setopt nomatch), it prints an error message instead of passing the pattern to ls.
If there are matching files, e.g. A.jpg, B.jpg, C.jpg, the command
ls *.jpg
will be run by the shell as
ls A.jpg B.jpg C.jpg
In contrast to this, find is designed to handle patterns with its -name test. When using find you should make sure the pattern is not replaced by the shell, e.g. by using -name '*.jpg' or -name \*.jpg. Otherwise you might get unexpected results or an error if there are matching files in the current directory.
Edit:
As shown in Martin Tournoij's answer you could use the recursive glob pattern ls **/*.jpg, but this is also handled by the shell not by ls, so you don't need option -R. In zsh this recursive pattern ** is enabled by default, in bash you need to enable it with shopt -s globstar.
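For example, in bash the recursive pattern has to be switched on first, while zsh has it enabled out of the box:
shopt -s globstar      # bash only; not needed in zsh
ls **/*.jpg            # lists .jpg files here and in all subdirectories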
The shell first expands any glob patterns, and then runs the command. So from ls's perspective, ls *.jpg is exactly the same as if you had typed ls one.jpg two.jpg. The -R flag to ls only makes sense if you use it on a directory, which you're not doing here.
This is also why mv *.jpg *.png doesn't work as expected on Unix systems, since mv never sees those patterns but just the filenames it expanded to (it does on e.g. Windows, where the globbing is done by the program rather than the shell; there are advantages and disadvantages to both approaches).
* matches any characters except a /, so *.jpg only expands to names in the current directory. **/ is similar, but also matches /, so **/*.jpg expands to matching names in any subdirectory. This is supported by both bash and zsh.
So ls **/*.jpg will do what you want; you don't need to use find or grep. In zsh especially, you rarely need to use find, since globbing is so much more powerful than in the standard Bourne shell or bash.
In zsh you can also use setopt glob_star_short and then **.jpg will work as well, which is a shortcut for **/*.jpg.
I'm running macOS and looking for a way to quickly sort thousands of jpg files. I need to create folders based on part of filenames and then move those files into it.
Simply, I want to put these files:
x_not_relevant_part_of_name.jpg
x_not_relevant_part_of_name.jpg
y_not_relevant_part_of_name.jpg
y_not_relevant_part_of_name.jpg
Into these folders:
x
y
Keep in mind that length of "x" and "y" part of name may be different.
Is there an automatic solution for that in macOS?
I've tried using Automator and Terminal, but I'm not a programmer so I haven't done well.
I would back up the files first to somewhere safe in case it all goes wrong. Then I would install homebrew and then install rename with:
brew install rename
Then you can do what you want with this:
rename --dry-run -p 's|(^[^_]*)|$1/$1|' *.jpg
If that looks correct, remove the --dry-run and run it again.
Let's look at that command.
--dry-run means just say what the command would do without actually doing anything
-p means create any intermediate paths (i.e. directories) as necessary
's|...|' I will explain in a moment
*.jpg means to run the command on all JPG files.
The funny bit in single quotes is actually a substitution, in its simplest form it is s|a|b| which means substitute thing a with b. In this particular case, the a is caret (^) which means start of filename and then [^_]* means any number of things that are not underscores. As I have surrounded that with parentheses, I can refer back to it in the b part as $1 since it is the first thing in parentheses in a. The b part means "whatever was before the underscore" followed by a slash and "whatever was before the underscore again".
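A quick way to check the substitution itself, using plain perl on one of the question's names rather than rename (illustrative only):
echo 'x_not_relevant_part_of_name.jpg' | perl -pe 's|(^[^_]*)|$1/$1|'
# prints: x/x_not_relevant_part_of_name.jpg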
Using find with bash Parameter Substitution in Terminal would likely work:
find . -type f -name "*jpg" -maxdepth 1 -exec bash -c 'mkdir -p "${0%%_*}"' {} \; \
-exec bash -c 'mv "$0" "${0%%_*}"' {} \;
This uses bash Parameter Substitution with find to create directories (if they don't already exist) named after the prefix of any filename matching jpg - the characters before the first underscore (_) - and then moves the matching files into the appropriate directory. To use the command, simply cd into the directory you would like to organize. Keep in mind that without the -maxdepth option, running the command multiple times can keep producing more folders; -maxdepth limits the depth at which the command operates.
${parameter%word}
${parameter%%word}
The word is expanded to produce a pattern just as in filename expansion. If the pattern matches a trailing portion of the expanded value of parameter, then the result of the expansion is the value of parameter with the shortest matching pattern (the ‘%’ case) or the longest matching pattern (the ‘%%’ case) deleted.
↳ GNU Bash : Shell Parameter Expansion
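For a concrete feel of what this expansion does to one of the question's file names (bash assumed):
f='x_not_relevant_part_of_name.jpg'
echo "${f%%_*}"   # -> x (longest trailing match of _* removed)
echo "${f%_*}"    # -> x_not_relevant_part_of (shortest trailing match removed)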
This code searches and recursively copies the files after the above date.
#!/bin/bash
directory=~/somefolder
DAYSAGO=8
for ((a=0; a <= DAYSAGO ; a++))
do
find $directory -mtime $a -type f | while read file;
do
cp "$file" -t ~/The\ other\ folder/
done
done
Try the following:
#!/usr/bin/env bash
directory=~/'somefolder'
DAYSAGO=8
find "$directory" -mtime -$(( DAYSAGO + 1 )) -type f -exec cp -t ~/'The other folder'/ {} +
Using - to prefix the -mtime argument applies less-than logic to the argument value. All find tests that take numeric arguments support this logic (and its counterpart, +, for more-than logic). Tip of the hat to miracle173.
Since the desired logic is <= $DAYSAGO, 1 is added using an arithmetic expansion ($(( ... ))), to achieve the desired logic (needless to say, $DAYSAGO could be redefined with less-than logic in mind, to 9, so as to make the arithmetic expansion unnecessary).
Using -exec with the + terminator invokes the specified command with (typically) all matching filenames at once, which is much more efficient than piping to a shell loop.
{} is the placeholder for the list of matching filenames, and note that with + it must be the last argument before the + terminator (by contrast, with the invoke-once-for-each-matching-file terminator \;, the {} can be placed anywhere).
Note that the command above therefore only works with cp implementations that support the -t option, which allows placing the target directory first, notably, GNU cp (BSD/OSX cp and the POSIX specification, by contrast, do NOT support -t).
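If you are stuck with a cp that lacks -t (e.g. BSD/macOS), one possible workaround - a sketch, not part of the original answer - is a small sh -c wrapper, which keeps the efficiency of the + terminator while letting the target directory come last:
find "$directory" -mtime -$(( DAYSAGO + 1 )) -type f -exec sh -c 'cp "$@" ~/"The other folder"/' sh {} +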
Also note the changes in quoting:
directory=~/'somefolder': single-quoting literal somefolder - while not strictly necessary in this particular case - ensures that the enclosed name works even if it contains embedded spaces or other shell metacharacters.
Note, however, that the ~/ part must remain unquoted for the ~ to expand to the current user's home dir.
"$directory": double-quoting the variable reference ensures that its value is not interpreted further by the shell, making it safe to use paths with embedded whitespace and other shell metacharacters.
~/'The other folder'/ provides a more legible alternative to ~/The\ other\ folder/ (and is also easier to type), demonstrating the same mix of unquoted and quoted parts as above.
You don't need the while loop at all. Using it as you are exposes you to problems with some corner cases like filenames containing newlines and other whitespace. Just use the -exec primary.
find "$directory" -mtime "$a" -type f -exec cp {} -t ~/The\ other\ folder/ \;
UPDATE: use mklement0's answer, though; it's more efficient.
I have searched looking for the right solution. I have found some close examples, e.g. Bash script to replace spaces in file names.
But what I'm looking for is how to replace multiple .dots in current DIRECTORY/SUBDIRECTORY names, then replace multiple .dots in FILENAMES excluding the *.extension, "recursively".
This example is close but not right:
find . -maxdepth 2 -name "*.*" -execdir rename 's/./-/g' "{}" \;
another example but not right either:
for f in *; do mv "$f" "${f//./-}"; done
So
dir.dir.dir/dir.dir.dir/file.file.file.ext
Would become
dir-dir-dir/dir-dir-dir/file-file-file.ext
You can assign to a variable and pipe like this:
x="dir.dir.dir/dir.dir.dir/file.file.file.ext"
echo "$(echo "${x%.*}" | sed 's/\./-/g').${x##*.}"
Result: dir-dir-dir/dir-dir-dir/file-file-file.ext
You have to escape . in regular expressions (such as the ones used by rename), because otherwise it has the special meaning of "any single character". So the replacement statement should at least be s/\./-/g.
You should not quote {} in find commands.
You will need two find commands, since you want to replace all dots in directory names, but keep the last dot in filenames (see the sketch after this list).
You are searching for filenames which contain spaces (* *). Is that intentional?
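A hedged sketch of those two passes, using find with bash parameter expansion (in the spirit of the find/bash -c approach shown earlier in this thread) instead of rename; it assumes there are no hidden dot-prefixed names you care about, and it is untested, so try it on a copy first:
# 1) Directories, depth-first, replacing every dot in the name:
find . -mindepth 1 -depth -type d -name '*.*' -execdir bash -c 'n=${1#./}; mv -- "$n" "${n//./-}"' _ {} \;
# 2) Files, replacing every dot except the last (the extension dot is kept):
find . -type f -name '*.*.*' -execdir bash -c 'n=${1#./}; b=${n%.*}; mv -- "$n" "${b//./-}.${n##*.}"' _ {} \;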
I often use find to run the same script on a bunch of files. So for example if I want to run process.py on all the .png files in dir, I would do:
find dir -name '*.png' -execdir process.py \{\} \;
The picket fence thing at the end is annoying; is there any way around it?
Depending on your keyboard layout, " might be more convenient to use than ' or \. The {} does not need escaping, as far as the Unix & Linux Stack Exchange site knows. Nobody could identify a shell that would need {} escaped, and the examples in the man page do not escape the braces.
find dir -name "*.png" -execdir process.py {} ";"
Jonathan Leffler has a solution with + in the end, which is not identical in semantics, but often usable.
Use:
find dir -name '*.png' -execdir process.py {} +
The {} don't need escaping; they only have special meaning to the shell in rather limited circumstances. (In particular, echo {} echoes the braces, whereas echo {a,b,c} echoes a b c.) The + does not need escaping either. It tells find to 'play at being xargs'. That is, it will run the command with as many file names as it reasonably can for each execution.
Note that using -exec or -execdir automatically and comprehensively deals with the problem of spaces (and other awkward characters — newlines, backspaces, form feeds, anyone?) in file names. Piping names with -print into xargs runs foul of problems here. GNU find plus GNU xargs provides the -print0 option to find and the -0 option to xargs to get around issues with odd characters in file names.
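For comparison, the -print0/-0 pairing mentioned above looks like this (note that, unlike -execdir, it runs process.py from the current directory and hands it paths relative to dir):
find dir -name '*.png' -print0 | xargs -0 process.py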
If you must execute the script once per file, then you need an escaped semi-colon at the end; there is no easy way around that (unless you count: SC=";"; find ... {} $SC, which I don't).
The only issue I see is the -execdir which runs the script in the sub-directory. You'll have to check that it behaves sanely when there are different files in different directories, and you'll need to be sure that {} translates to 'the file name relative to the directory it is found in' when used with -execdir (as otherwise, the file won't be locatable via the name that is given to the script, in general). All of this should 'just work' as the options wouldn't be meaningfully usable if they didn't.
Personally, I'd rather use just plain -exec, but there's probably a good reason why you chose -execdir.
You can single-quote them instead, as '{}' ';' - one way or another you have to prevent the shell itself from interpreting these characters. xargs will work as well if the number of files found is small.
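For example, the fully quoted form of the command reads:
find dir -name '*.png' -execdir process.py '{}' ';'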