(mogrify) ln -s creating copies of files - bash

Running the following script:
for i in $(find dir -name "*.jpg"); do
ln -s $i
done
incredibly makes symbolic links for 90% of the files and makes a copy of the remaining 10%. How's that possible?
Edit: what happens afterwards is relevant:
Those are links to images that I rotate through mogrify e.g.
mogrify -rotate 90 link_to_image
It seems that mogrify applied to a link silently deletes the link and makes a copy of the image in its place; a debatable choice, but that's how it behaves.

Skip the first paragraph if you are only interested in how to process files with spaces in their names.
At first it was not clear what the root of the problem was, and our assumption was that the spaces in the filenames were the culprit: that files containing them were not processed correctly.
The real problem was mogrify, which, applied to the created links, processed them and replaced them with real files.
It had nothing to do with spaces in filenames.
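If the goal is to rotate the original images that the links point to, one workaround is to resolve each link before handing it to mogrify. A minimal sketch, assuming GNU readlink -f is available and the links sit in the current directory:
for link in *.jpg; do
    target=$(readlink -f -- "$link")    # resolve the symlink to the actual image file
    mogrify -rotate 90 "$target"        # rotate the original in place; the link stays a link
done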
Processing of files with spaces in their names
That happens because of the spaces in the file names.
You can write something like this:
find dir -name \*.jpg | while IFS= read -r i
do
    ln -s "$i"
done
(IFS= is used here to avoid stripping leading spaces, thanks to @Alfe for the tip; -r keeps read from mangling backslashes.)
Or use xargs.
If it is possible that names contain "\n", it's better to use -print0:
find dir -name \*.jpg -print0 | xargs -0 -n1 ln -s
Of course, you can use other methods as well, for example:
find dir -name '*.jpg' -exec ln -s "{}" \;
ln -s "$(find dir -name '*.jpg')" .
(The last form is only reliable when find matches a single file.)

(Imagemagick) mogrify applied on a link deletes the link and makes a copy of the image

Try with single quotes:
find dir -name '*.jpg' -exec ln -s "{}" \;

Related

How do I delete all the MP4 files with a file name not ending with -converted?

I converted/compressed several MP4 files from several folders using VLC.
The names of the converted/compressed files end with -converted, for example 2. bubble sort-converted.mp4.
It's really cumbersome to go into each folder and delete all the original files and leave the converted files.
Using some zsh/bash command I'd like to recursively delete all the original files and leave the converted files.
For example I'll delete 3 - sorting/2. bubble sort.mp4 and will leave 3 - sorting/2. bubble sort-converted.mp4.
TL;DR:
In simple words: delete all files with the .mp4 extension whose names don't end with -converted, using some zsh/bash command.
Also, if there is some way to rename the converted files to the original names after deleting the originals, that would be a plus.
Thank you!
find can be used with a logical expression to match the desired files and delete them.
In your case the following can be used to verify whether it matches the files you want to delete. It finds all files that don't have converted in their names but do end in .mp4.
find . -type f -not \( -name '*converted*' \) -a -name "*.mp4"
Once you are satisfied with the file list result, add -delete to do the actual delete.
find . -type f -not \( -name '*converted*' \) -a -name "*.mp4" -delete
Give this a try (it uses -print0, GNU grep's -z and xargs -0 so that names with spaces survive the pipeline):
find . -name '*.mp4' -print0 | grep -zv 'converted' | xargs -0 rm -f
The pure zsh solution (requires EXTENDED_GLOB):
setopt extendedglob
rm -f **/(^*-converted).mp4(.)
^ ................. negates
*-converted ....... pattern
(.) ............... regular files
Using GNU parallel (in case of many files):
parallel --no-notice rm -f ::: **/(^*-converted).mp4(.)
This will work even if your file names contain ', " or space:
find . -name '*.mp4' |
grep -v 'converted' |
parallel -X rm -f
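The question also asks about renaming the converted files back to the original names afterwards. A minimal sketch in bash, assuming GNU find and that the originals have already been deleted:
find . -type f -name '*-converted.mp4' -print0 |
while IFS= read -r -d '' f; do
    mv -n -- "$f" "${f%-converted.mp4}.mp4"    # e.g. "2. bubble sort-converted.mp4" -> "2. bubble sort.mp4"
done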

Moving multiple files in subdirectories (and/or splitting strings by multichar delimiter) [bash]

So basically, I have a folder with a bunch of subfolders all with over 100 files in them. I want to take all of the mp3 files (really generic extension since I'll have to do this with jpg, etc.) and move them to a new folder in the original directory. So basically the file structure looks like this:
/.../dir/recup1/file1.mp3
/.../dir/recup2/file2.mp3
... etc.
and I want it to look like this:
/.../dir/music/file1.mp3
/.../dir/music/file2.mp3
... etc.
I figured I would use a bash script that looked along these lines:
#!/bin/bash
STR=`find ./ -type f -name \*.mp3`
FILES=(echo $STR | tr ".mp3 " "\n")
for x in $FILES
do
echo "> [$x]"
done
I just have it echo for now, but eventually I would want to use mv to get it to the correct folder. Obviously this doesn't work though because tr sees each character as a delimiter, so if you guys have a better idea I'd appreciate it.
(FYI, I'm running netbook Ubuntu, so if there's a GUI way akin to Windows' search, I would not be against using it)
If the music folder exists, then the following should work:
find /path/to/search -type f -iname "*.mp3" -exec mv {} /path/to/music \;
A -exec command must be terminated with a ; (so you usually need to type \; or ';' to avoid interpretation by the shell) or a +. The difference is that with ;, the command is called once per file; with +, it is called as few times as possible (usually once, but there is a maximum length for a command line, so it might be split up), with all file names passed at once.
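For example, a sketch of the + form, assuming GNU mv (its -t option names the target directory before the file list):
# one mv invocation handles as many files as fit on a command line
find /path/to/search -type f -iname '*.mp3' -exec mv -t /path/to/music {} +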
You can do it like this:
find /some/dir -type f -iname '*.mp3' -exec mv \{\} /where/to/move/ \;
The \{\} part will be replaced by the found file name/path. The \; part sets the end for the -exec part, it can't be left out.
If you want to print what was found, just add a -print flag like:
find /some/dir -type f -iname '*.mp3' -print -exec mv \{\} /where/to/move/ \;
HTH

Looping through all files in a given directory [duplicate]

Here is what I'm trying to do:
Give a parameter to a shell script that will run a task on all files of jpg, bmp, tif extension.
Eg: ./doProcess /media/repo/user1/dir5
and all jpg, bmp, tif files in that directory will have a certain task run on them.
What I have now is:
for f in *
do
imagejob "$f" "output/${f%.output}" ;
done
I need help with the for loop to restrict the file types and also have some way of starting under a specified directory instead of current directory.
Use shell expansion rather than ls
for file in *.{jpg,bmp,tif}
do
imagejob "$file" "output/${file%.output}"
done
Also, if you have bash 4.0+, you can use globstar:
shopt -s globstar
shopt -s nullglob
shopt -s nocaseglob
for file in **/*.{jpg,bmp,tif}
do
# do something with $file
done
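Putting that together as the ./doProcess script from the question, as a sketch (requires bash 4 for globstar; the imagejob call and the output/ naming are placeholders carried over from the question):
#!/bin/bash
# Process every jpg/bmp/tif under the directory given as the first argument,
# e.g. ./doProcess /media/repo/user1/dir5
shopt -s globstar nullglob nocaseglob
dir=${1:-.}
for file in "$dir"/**/*.{jpg,bmp,tif}; do
    imagejob "$file" "output/${file##*/}"    # second argument: adjust to whatever imagejob expects
done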
for i in `ls $1/*.jpg $1/*.bmp $1/*.tif`; do
imagejob "$i";
done
This is assuming you're using a bashlike shell where $1 is the first argument given to it.
You could also do:
find "$1" -iname "*.jpg" -or -iname "*.bmp" -or -iname "*.tif" \
-exec imagejob \{\} \;
You could use a construct with backticks and ls (or any other command, of course):
for f in `ls *.jpg *.bmp *.tif`; do ...; done
The other solutions here are either Bash-only or recommend the use of ls in spite of it being a common and well-documented antipattern. Here is how to solve this in POSIX sh without ls:
for file in *.jpg *.bmp *.tif; do
... stuff with "$file"
done
If you have a very large number of files, perhaps you also want to look into
find . -maxdepth 1 -type f \( \
-name '*.jpg' -o -name '*.bmp' -o -name '*.tif' \) \
-exec stuff with {} +
which avoids the overhead of sorting the file names alphabetically. The -maxdepth 1 says to not recurse into subdirectories; obviously, take it out or modify it if you do want to recurse into subdirectories.
The -exec ... + predicate of find is a relatively new introduction; if your find is too old, you might want to use -exec ... \; or replace the -exec stuff with {} + with
find ... -print0 |
xargs -r0 stuff with
where however again the -print0 option and the corresponding -0 option for xargs are a GNU extension.
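A concrete instance of the find-based approach, reusing the question's imagejob (a sketch: the output/ naming is an assumption, and the inner sh loop is one way to pass two arguments per file):
find "$1" -maxdepth 1 -type f \( \
    -name '*.jpg' -o -name '*.bmp' -o -name '*.tif' \) \
    -exec sh -c 'for f; do imagejob "$f" "output/${f##*/}"; done' sh {} +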

How to consolidate selected files from multiple sub-directories into one directory

I know this is probably elementary to unix people, but I haven't found a straightforward answer online.
I have a directory with sub-directories. Some of these sub-dirs have .mov files in them. I want to consolidate all the movs to a single directory. I don't need to worry about file naming conflicts because the files are from a digital camera and it names the files incrementally, but divides them into daily folders.
What is the Unix-fu for grabbing all these files and copying (or even better, moving them) to a directory in my home folder?
Thanks.
How about this?
find "$SOURCE_DIRECTORY" -type f -name '*.mov' -exec mv '{}' "$TARGET_DIRECTORY" ';'
If the source and target directories do not overlap this should work fine.
EDIT:
BTW, if you have mixed-case extensions (x.mov, y.Mov, Z.MOV) as is the case with many cameras, this would be better. It uses -iname which is case-insensitive when matching:
find "$SOURCE_DIRECTORY" -type f -iname '*.mov' -exec mv '{}' "$TARGET_DIRECTORY" ';'
Make sure to replace the $SOURCE_DIRECTORY and $TARGET_DIRECTORY variables with the actual directories and that they do not overlap (i.e. the target being somewhere under the source)
EDIT 2:
PS: I just noticed that khachik caught this one with his edit
mv `find . -name "*.mov" | xargs` OUTPUTDIR/
Update after thkala's comment:
find . -iname "*.mov" | while IFS= read -r line; do mv "$line" OUTPUTDIR/; done
If you need to cope with weird filenames (spaces, special characters), try this:
$ cd <source parent directory>
$ find -name '*.mov' -print0 | xargs -0 echo mv -v -t <target directory>
Remove the "echo" above to actually do the move, rather than print what would happen.
"mv -v" gives verbose output, "mv -t ..." specifies the target directory (possibly GNU-specific).
"-print0" and "-0" are extensions to cope with weird filenames. On non-GNU systems you might need to remove those options, which will result in newline-separated data. This will still work on filenames with spaces, but not filenames with newlines (yes, it's possible).

Bash: recursively copy and rename files

I have a lot of files whose names end with '_100.jpg'. They are spread across nested folders/sub-folders. Now I want a trick to recursively copy and rename all of them to have a suffix of '_crop.jpg'. Unfortunately I'm not familiar with bash scripting, so I don't know the exact way to do this. I googled and tried the 'find' command with the '-exec' option, but with no luck.
Please help me. Thanks.
find bar -iname "*_100.jpg" -printf 'mv "%p" "%p"\n' \
| sed 's/_100\.jpg"$/_crop.jpg"/' \
| while read -r l; do eval "$l"; done
If you have bash 4:
shopt -s globstar
for file in **/*_100.jpg; do
    echo mv "$file" "${file/_100.jpg/_crop.jpg}"
done
Or using find:
find . -type f -iname "*_100.jpg" | while read -r FILE
do
    echo mv "${FILE}" "${FILE/_100.jpg/_crop.jpg}"
done
This uses a Perl script that you may have already on your system. It's sometimes called prename instead of rename:
find /dir/to/start -type f -iname "*_100.jpg" -exec rename 's/_100/_crop/' {} \;
You can make the regexes more robust if you need to protect filenames that have "_100" repeated or in parts of the name you don't want changed.
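For example, a sketch that anchors the substitution to the end of the file name (the /i flag keeps it case-insensitive, matching -iname):
find /dir/to/start -type f -iname "*_100.jpg" -exec rename 's/_100\.jpg$/_crop.jpg/i' {} \;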
