write xmp data to all jpeg files in a folder - bash

Can someone point me in the right direction? I need this command
exiftool -tagsfromfile XYZ.xmp -all:all XYZ.jpg
to work for hundreds of JPEGs. I have a folder with hundreds of JPEGs and XMPs that share the same base name but have different extensions (.xmp and .jpg). What would be an elegant way to go through all of them and replace XYZ with the actual filename?
I want / need to do this in a shell on OSX.
Do I need something like a for loop, or is there a direct way in the shell?
Thank you so much in advance!

Your command will be
exiftool -r --ext xmp -tagsfromfile %d%f.xmp -all:all /path/to/files/
See Metadata Sidecar Files example #15.
The -r (-recurse) option allows recursion into subdirectories. Remove it if recursion is not desired.
The --ext (-extension) option, with the double dash meaning "exclude this extension", is used to prevent the copying from the XMP files back onto themselves.
The %d variable is the directory of the file currently being processed. The %f variable is the base filename without the extension of that file. Then xmp is used as the extension. The result creates a file path to a corresponding XMP file in the same directory for every image file found. This will work for any writable file found (see FAQ #16).
This command creates backup files. Add -overwrite_original to suppress the creation of backup files.
You do not want to loop exiftool as shown in the other answers. Exiftool's biggest performance hit is the startup time and looping it will increase the processing time. This is Exiftool Common Mistake #3.
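Putting those notes together, a non-recursive, no-backup variant (a sketch, using the same /path/to/files/ placeholder) would be:
exiftool --ext xmp -tagsfromfile %d%f.xmp -all:all -overwrite_original /path/to/files/
For a file such as /path/to/files/IMG_001.jpg, %d%f.xmp expands to /path/to/files/IMG_001.xmp.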

Solved it by doing this:
#!/bin/bash
for f in "$1"/*.jpg; do
    if [ -f "$f" ]; then
        echo "Processing $f file..."
        FILENAME="${f%.*}"   # path without the .jpg extension
        echo "$FILENAME"
        # exiftool -tagsfromfile "$FILENAME".xmp -all:all "$FILENAME".jpg
        exiftool -overwrite_original_in_place -ext jpg -tagsFromFile "$FILENAME".xmp -@ xmp2exif.args -@ xmp2iptc.args '-all:all' '-FileCreateDate<XMP-photoshop:DateCreated' '-FileModifyDate<XMP-photoshop:DateCreated' "$FILENAME".jpg
    else
        echo "Warning: Some problem with \"$f\""
    fi
done

An elegant and easy way, IMHO, is to use GNU Parallel to do them all in parallel:
parallel --dry-run exiftool -tagsfromfile {.}.xmp -all:all {} ::: *.jpg
If that looks correct, remove --dry-run and run again to do it for real.
{} just means "the current file"
{.} just means "the current file without its extension"
::: is just a separator followed by the names of the files you want GNU Parallel to process
You can install GNU Parallel on macOS with homebrew:
brew install parallel

Related

Continuously Scan Directory and Perform Script on New Items

First, please forgive me and go easy on me if this question seems simple; the first time I posted a question (about another subject, a few months ago) I didn't provide enough information. My apologies.
I'm trying to scan my incoming media folder for new audio files and convert them to my preferred format into another folder, without removing the originals.
I've written the script below, and while it seems to work for one-offs, I can't get it to create the destination directory name based on the source directory name, and I can't figure out how to keep it looping ("scanning") for new media without reprocessing what it has already processed.
I hope this makes sense...
#! /bin/bash
srcExt=$1
destExt=$2
srcDir=$3
destDir=$4
opts=$5
# Creating the directory name - not currently working
# dirName="$(basename "$srcDir")"
# mkdir "$destDir"/"$dirName"
for filename in "$srcDir"/*.flac; do
basePath=${filename%.*}
baseName=${basePath##*/}
ffmpeg -i "$filename" $opts "$destDir"/"$baseName"."$destExt"
done
for filename in "$srcDir"/*.mp3; do
basePath=${filename%.*}
baseName=${basePath##*/}
ffmpeg -i "$filename" $opts "$destDir"/"$baseName"."$destExt"
done
There are different ways of doing this; the easiest might just be to look at the "modification date" of each file and see whether it has changed recently, something like:
#! /bin/bash
srcExt=$1
destExt=$2
srcDir=$3
destDir=$4
opts=$5
# Creating the directory name - not currently working
# dirName="$(basename "$srcDir")"
# mkdir "$destDir"/"$dirName"
for filename in $(find "$srcDir" \( -name '*.mp3' -o -name '*.flac' \) -mmin -10); do
    basePath=${filename%.*}
    baseName=${basePath##*/}
    ffmpeg -i "$filename" $opts "$destDir"/"$baseName"."$destExt"
done
Consider using mkdir -p which will a) create all necessary intermediate directories, and b) not complain if they already exist.
If you want the new items to be processed immediately when they arrive, look at inotify, or fswatch on macOS. In general, if it is less urgent, schedule your job to run every 10 minutes under cron, maybe prefixing it with nice so as not to be a CPU "hog".
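For the cron route, a minimal crontab entry might look like this (the script name and all paths are hypothetical placeholders; edit with crontab -e):
# run the conversion script every 10 minutes, at low CPU priority
*/10 * * * * nice /path/to/convert-media.sh flac m4a /path/to/incoming /path/to/converted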
Decide which files to generate by changing directory to the source directory and iterating over all files. For each file, work out what the corresponding output file should be according to your rules, test if it already exists, if not create it.
Don't repeat all your for loop code like that, just do:
cd "$srcDir"
for filename in *.flac *.mp3 ; do
GENERATE OUTPUT FILENAME
if [ ! -f "$outputfilename" ] ; then
mkdir -p SOMETHING
ffmpeg -i "$filename" ... "$outputfilename"
fi
done
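One way of filling in those placeholders (a sketch, assuming the output simply goes into $destDir with the extension swapped to $destExt, as in the original script, and that $destDir is an absolute path since we cd into $srcDir):
cd "$srcDir" || exit 1
for filename in *.flac *.mp3; do
    [ -f "$filename" ] || continue                       # skip patterns that matched nothing
    outputfilename="$destDir/${filename%.*}.$destExt"    # same base name, new extension
    if [ ! -f "$outputfilename" ]; then
        mkdir -p "$destDir"                               # create the destination if needed
        ffmpeg -i "$filename" $opts "$outputfilename"
    fi
done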

shell script to delete certain files and rename others

Currently I've a lot of .mp4 files with these names:
103160626309temp1ep10.mp4
103160626309temp1ep10.mp4.mp4
148999555452temp1ep6.mp4
148999555452temp1ep6.mp4.mp4
6802547045temp1ep5.mp4
6802547045temp1ep5.mp4.mp4
335587012366temp1ep4.mp4
335587012366temp1ep4.mp4.mp4
...
I must delete all the files with a single .mp4 and rename the .mp4.mp4 files to .mp4. Do you have an idea of how I can do it?
I think about using:
for i in ./*.mp4; do
...
and
for i in ./*.mp4.mp4; do
...
But I'm afraid of making a mistake; I can't afford to lose or damage the files.
Can you help me solve this issue?
Thank you.
You don't need a separate deletion step if you just rename the .mp4.mp4 files over the .mp4 files.
(This assumes that for every .mp4 file there is a corresponding .mp4.mp4 file. If that is not the case, see the "multiple steps" instructions below.)
There are several ways to do that:
Plain shell:
for file in *.mp4.mp4; do
    mv -- "$file" "${file%.mp4}"
done
(Use echo instead of mv to see what it's going to do before you run it "for real".)
If you have the perl version of rename (sometimes also known as prename or perl-rename):
rename -f 's/\.mp4\.mp4\z/.mp4/' *.mp4.mp4
(Use rename -n for a dry run.)
If you have the util-linux version of rename:
rename .mp4.mp4 .mp4 *.mp4.mp4
Beware; this will simply replace the first occurrence of .mp4.mp4 in the filenames, but hopefully that'll always be at the end of the filename.
(Again, use rename -n for a dry run.)
If you have mmv:
mmv -d '*.mp4.mp4' '#1.mp4'
If you want to do things in multiple steps:
Create a separate directory for the files you want to keep:
mkdir to-be-kept
Move all .mp4.mp4 files into that directory:
mv *.mp4.mp4 to-be-kept/
Delete all remaining .mp4 files:
rm *.mp4
Or, if you want to keep them, move them somewhere else:
mv *.mp4 some/other/directory
Move the .mp4.mp4 files back:
mv to-be-kept/* .
rmdir to-be-kept
Use one of the above recipes to do the renaming.
You don't need rename -f (use plain rename instead) or mmv -d (use plain mmv instead) because in this case there's no need to overwrite existing files.
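If you first want to check the assumption that every plain .mp4 has a corresponding .mp4.mp4 (a read-only sketch; it changes nothing and only reports files the rename-over approach would leave behind):
for file in *.mp4; do
    case "$file" in *.mp4.mp4) continue ;; esac            # skip the doubled ones
    [ -f "$file.mp4" ] || echo "No $file.mp4 found; $file would be left behind"
done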

Batch convert PNGs to individual PDFs while maintaining deep folder hierarchy in bash

I've found a solution that claims to handle a single folder, but I have a deep folder hierarchy of sheet music that I'd like to batch convert from PNG to PDF. What would a solution look like?
I will run into a further problem down the line, which may complicate things. Maybe I should write a script? (I'm a total n00b fyi)
The "further problem" is that some of my sheet music spans more than one page, so if the script can parse filenames that include "1of2" and "2of2" to be turned into a single pdf, that'd be neat.
What are my options here?
Thank you so much.
Updated Answer
As an alternative, the following should be faster (as it does the conversions in parallel) and also able to handle larger numbers of files:
find . -name \*.png -print0 | parallel -0 convert {} {.}.pdf
It uses GNU Parallel which is readily available on Linux/Unix and which can be simply installed on OSX with homebrew using:
brew install parallel
Original Answer (as accepted)
If you have bash version 4 or better, you can use recursive globbing (**) to recurse into directories and do your job very simply:
First enable it with:
shopt -s globstar
Then recursively convert PNGs to PDFs:
mogrify -format pdf **/*.png
You can loop over PNG files in a folder hierarchy and process each one as follows (the PDF is written next to its source PNG, so the hierarchy is preserved):
find /path/to/your/files -name '*.png' |
while read -r f; do
    g="${f%.png}.pdf"     # same path, with a .pdf extension
    your_conversion_program <"$f" >"$g"
done
To merge PDFs, you could use pdftk. You need to find all PDF files that have "1of2" and "2of2" in their names, and run pdftk on those:
find /path/to/your/files -name '*1of2*.pdf' |
while read -r f1; do
    f2=${f1/1of2/2of2}                              # name of second file
    ([ -f "$f1" ] && [ -f "$f2" ]) || continue      # check both exist
    g=${f1/1of2/}                                   # name of output file (the "1of2" is dropped)
    [ -f "$g" ] && continue                         # if output exists, skip
    pdftk "$f1" "$f2" output "$g"
done
See:
bash string substitution
Regarding a deep folder hierarchy, you may use find with the -exec option.
First you find all the PNGs in every subfolder and convert them to PDF:
find ./ -name '*.png' -exec convert {} {}.pdf \;
You'll get new PDF files with the extension ".png.pdf" (image.png would be converted to image.png.pdf, for example).
To correct the extensions you may run find again, this time with the Perl rename after the -exec option:
find ./ -name '*.png.pdf' -exec rename 's/\.png\.pdf$/.pdf/' {} \;
If you want to delete the source PNG files, you may use this command, which deletes all files with the ".png" extension recursively in every subfolder:
find ./ -name '*.png' -exec rm {} \;
If I understand correctly:
you want to concatenate all your PNG files from a deep folder structure into one single PDF.
So:
ensure your PNGs are ordered as you want within your folders;
be aware that you can pass the output of a command (say, a search command ;) ) to convert as its input, and tell convert to output a single PDF.
The general syntax of convert is:
convert 1.png 2.png ... global_png.pdf
The following command:
convert `find . -name '*.png' -print` global_png.pdf
searches for PNG files in the folders below the current directory,
passes the output of the find command to convert as its arguments (this is done by backquoting the find command),
and convert then does its work and outputs a single PDF file.
(This very simple command line works only with filenames that contain no spaces; don't forget to quote the wildcard and to backquote the find command ;) )
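If some filenames do contain spaces, a safer sketch (assuming bash 4 or better for globstar, as in the accepted answer) collects the files into an array first:
shopt -s globstar nullglob        # ** recurses into subfolders; nullglob drops unmatched patterns
files=(**/*.png)                  # glob expansion keeps spaces intact and sorts by path
convert "${files[@]}" global_png.pdf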
[edit] Careful...
Be sure of what you are doing:
if you delete your PNG files, you will simply lose your original sources...
That might be a very bad practice...
Using convert without a -quality output option could create an enormous PDF file, and you might have to re-convert with -quality "60", for instance...
So keep your original sources until you no longer need them.

bash: moving files to original directory based on filename?

I've got a bunch of subdirectories with a couple thousand PNG files that will be sent through Photoshop, creating PSD files. Photoshop can only output those to a single folder, and I want to move each one back to their original directory - so the new file foo_bar_0005.psd should go to where foo_bar_0005.png already is. Every filename only exists once.
Can somebody help me with this? I'm on OSX.
You might start from this minimal script:
#!/bin/bash
search_dir="search/png/from/this/directory/"
psd_dir="path/to/psd/directory/"
for psd_file in "$psd_dir"*.psd; do
    file_name="$(basename "$psd_file" .psd)"
    png_dir="$(dirname "$(find "$search_dir" -name "$file_name.png")")"
    mv "$psd_file" "$png_dir"
done
But note that this script doesn't include any error handling, e.g. for file collisions, files that are not found, etc.
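A slightly more defensive sketch (using the same placeholder search_dir and psd_dir variables) skips PSDs with no matching PNG and refuses to overwrite existing files:
for psd_file in "$psd_dir"*.psd; do
    file_name="$(basename "$psd_file" .psd)"
    png_path="$(find "$search_dir" -name "$file_name.png" | head -n 1)"   # first match only
    if [ -z "$png_path" ]; then
        echo "No matching PNG for $psd_file, skipping" >&2
        continue
    fi
    mv -n "$psd_file" "$(dirname "$png_path")/"                           # -n: never overwrite
done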
Each file found by this find is piped to a Bash command that does the PSD conversion and then moves the .psd back to the original .png directory.
psd_dir=/psd_dir/
export psd_dir
find . -type f -name '*.png' | xargs -L 1 bash -c 'n=${1##*/}; echo photoshop "$1" && echo mv "${psd_dir}${n%.png}.psd" "${1%/*}/"; echo' _
The echo commands are here to give you a preview of the result.
Remove them to launch the real photoshop command.

Bash Use Origin File as Destination File

I'm working with the ImageMagick command-line tools, which use commands of the form:
<command> <input filename> <stuff> <output filename>
I'm trying to do the following command:
<command> x.png <stuff> x.png
but for every file in a directory. I tried:
<command> *.png <stuff> *.png
But that didn't work. What's the correct way to perform such a command on every file in a directory?
Many Unix command-line tools, such as awk, sed and grep, are stream-based or line-oriented, which means they process files one line at a time. When using these, it is necessary to write the output to an intermediate file and then rename that (with mv) back over the original input file. The reason is that you may be writing over the input file before you have read it all, so you would clobber your input. In that case, @sjsam's answer is absolutely correct - especially since he is careful to use && in between so that the mv is not done if the command is not successful.
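For example, with a line-oriented tool such as sed (input.txt and the foo/bar substitution are purely hypothetical):
# sed must not write over a file it is still reading, so write to a
# temporary file first and rename it back over the original afterwards
sed 's/foo/bar/g' input.txt > input.tmp && mv input.tmp input.txt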
In the specific case of ImageMagick, however, and its convert, compare and composite commands, this is NOT the case. These programs are file-oriented, not line-oriented, so they read the entire input file into memory before writing any output. The reason is that they often need to know whether there are any transparent pixels in an image, or how many colours it contains, before they can start processing, and they cannot know this until they have read the entire file. As such, it is perfectly safe, and in fact idiomatic, to use the same input filename as output filename, like this:
convert image.jpg ... -crop 100x100 -colors 16 ... image.jpg
In answer to your question, when you have multiple files to process, the preferred method is to use the mogrify tool (also part of the ImageMagick suite) which is specifically intended for processing multiple files. So, you would do this:
mogrify -crop 100x100 -colors 16 *.jpg
which would overwrite your input files with the results - which is what you asked for. If you did not want it to overwrite your input files, you would add a -path telling ImageMagick the path to an output directory like this:
mogrify -path /path/to/thumbnails -thumbnail 100x100 -colors 16 *.jpg
If you had too many files for the shell (or Windows COMMAND.EXE/CMD.EXE) to expand and pass to mogrify, you would let ImageMagick expand the glob (the asterisk) internally like this, and be able to deal with unlimited numbers of files:
mogrify -crop 100x100 '*.jpg'
The point is that not only is the mogrify syntax concise, and portable across Windows/OSX/Linux, it also means you only need to create a single process to do all your images, rather than having to write a FOR loop and create and execute a convert process and a mv process for each and every one of potentially thousands of files - so it is much more efficient and less error-prone.
For a single file, do it like this:
<command> x.png <stuff> temp.png && mv temp.png x.png
For a set of files, do it like this:
#!/bin/bash
find "$(pwd)" -name '*.png' | while read -r line
do
    <command> "$line" <stuff> temp.png && mv temp.png "$line"
done
Save the script as processpng in the folder containing the PNG files. Make it executable and run it:
./processpng
A general version of the above script, which takes the path to the folder containing the files as an argument, is below:
#!/bin/bash
find "$1" -name '*.png' | while read -r line
do
    <command> "$line" <stuff> temp.png && mv temp.png "$line"
done
Save the script as processpng anywhere on the computer. Make it executable and run it like:
./processpng /path/to/your/png/folder
Edit:
Incorporating @anishsane's solution, a compact way of achieving the same results would be:
#!/bin/bash
find "$1" -name '*.png' -exec bash -c "<command> {} <stuff> temp.png && mv temp.png {}" \;
In the context of find ... -exec:
{} indicates (contains) the result(s) from the find expression. Note that empty curly braces {} have no special meaning to the shell, so we can get away without escaping them.
Note: Emphasis mine.
Reference: This AskUbuntu Question.
