Convert files in directories that are not always there - bash

I've got a script that runs when a torrent download finishes to check whether there are FLAC audio files and, if so, convert them to MP3. Until today I've used:
for file in "$torrentpath"/"$torrentname"/*.flac
do
ffmpeg -i "$file" -qscale:a 0 "${file[#]/%flac/mp3}"
done
But I realised that when a torrent contains sub-directories, the script is useless. I've tried messing around for the past few days with "find" and "if" and other approaches, but I can't really see the answer. I know it's there.
The script should just test if there are sub-dirs and execute ffmpeg on those, otherwise directly go with the conversion.
Any little hint will be appreciated.

To handle arbitrary subdirectories in bash, enable globstar:
shopt -s globstar nullglob
for file in "$torrentpath/$torrentname"/**/*.flac
do ...
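Filled in with the conversion command from the question, that looks like this (a sketch; nullglob keeps the loop from running on a literal *.flac pattern when nothing matches):
shopt -s globstar nullglob
for file in "$torrentpath/$torrentname"/**/*.flac; do
    # convert each FLAC found at any depth, writing the MP3 next to it
    ffmpeg -i "$file" -qscale:a 0 "${file%.flac}.mp3"
done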

find "$torrentpath"/"$torrentname" -name '*.flac' -print | while read file; do
ffmpeg -i "$file" -qscale:a 0 "${file[#]/%flac/mp3}"
done
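If filenames might contain newlines as well, a null-delimited variant (a sketch) is safer:
find "$torrentpath/$torrentname" -type f -name '*.flac' -print0 |
while IFS= read -r -d '' file; do
    # -nostdin stops ffmpeg from reading the rest of the piped file list
    ffmpeg -nostdin -i "$file" -qscale:a 0 "${file%.flac}.mp3"
done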

Related

Continuously Scan Directory and Perform Script on New Items

First, please forgive me and go easy on me if this question seems easy; the first time I posted a question a few months ago, on another subject, I didn't provide enough information. My apologies.
I'm trying to scan my incoming media folder for new audio files and convert them to my preferred format in another folder, without removing the originals.
I've written the script below and, while it seems to work for one-offs, I can't seem to get it to create the destination directory name based on the source directory name. I also can't figure out how to keep it looping, "scanning" for new media to arrive, without re-processing what it has already processed.
I hope this makes sense...
#! /bin/bash
srcExt=$1
destExt=$2
srcDir=$3
destDir=$4
opts=$5
# Creating the directory name - not currently working
# dirName="$(basename "$srcDir")"
# mkdir "$destDir"/"$dirName"
for filename in "$srcDir"/*.flac; do
basePath=${filename%.*}
baseName=${basePath##*/}
ffmpeg -i "$filename" $opts "$destDir"/"$baseName"."$destExt"
done
for filename in "$srcDir"/*.mp3; do
basePath=${filename%.*}
baseName=${basePath##*/}
ffmpeg -i "$filename" $opts "$destDir"/"$baseName"."$destExt"
done
There are different ways of doing this; the easiest might be to look at the "modification date" of each file and see whether it changed recently, something like:
#! /bin/bash
srcExt=$1
destExt=$2
srcDir=$3
destDir=$4
opts=$5
# Creating the directory name - not currently working
# dirName="$(basename "$srcDir")"
# mkdir "$destDir"/"$dirName"
while IFS= read -r -d '' filename; do
basePath=${filename%.*}
baseName=${basePath##*/}
ffmpeg -i "$filename" $opts "$destDir"/"$baseName"."$destExt"
done < <(find "$srcDir" \( -name '*.mp3' -o -name '*.flac' \) -mmin -10 -print0)
Consider using mkdir -p which will a) create all necessary intermediate directories, and b) not complain if they already exist.
If you want the new items to be processed as soon as they arrive, look at inotify (or fswatch on macOS). In general, if it is less urgent, schedule your job to run every 10 minutes under cron, maybe prefixing it with nice so as not to be a CPU "hog".
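For the "immediately" route, a minimal sketch with inotifywait (from the inotify-tools package), reusing the positional parameters from the question's script:
# watch the source tree and convert each file as soon as it is fully written
inotifywait -m -r -e close_write --format '%w%f' "$srcDir" |
while IFS= read -r filename; do
    case $filename in
        *.flac|*.mp3)
            baseName=${filename##*/}
            # -nostdin keeps ffmpeg from consuming the event stream on stdin
            ffmpeg -nostdin -i "$filename" $opts "$destDir/${baseName%.*}.$destExt"
            ;;
    esac
done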
Decide which files to generate by changing to the source directory and iterating over all files. For each file, work out what the corresponding output file should be according to your rules; test whether it already exists, and if not, create it.
Don't repeat all your for-loop code like that; just do:
cd "$srcDir"
for filename in *.flac *.mp3 ; do
GENERATE OUTPUT FILENAME
if [ ! -f "$outputfilename" ] ; then
mkdir -p SOMETHING
ffmpeg -i "$filename" ... "$outputfilename"
fi
done
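A concrete version of that skeleton, keeping the question's positional parameters and its commented-out dirName idea, might look like this (a sketch; note that $destDir needs to be an absolute path because of the cd):
#!/bin/bash
srcExt=$1      # kept for compatibility with the original invocation, unused below
destExt=$2
srcDir=$3
destDir=$4     # should be an absolute path, since the script cds into $srcDir
opts=$5
dirName=$(basename "$srcDir")

cd "$srcDir" || exit 1
for filename in *.flac *.mp3; do
    [ -e "$filename" ] || continue                    # nothing matched this glob
    outputfilename="$destDir/$dirName/${filename%.*}.$destExt"
    if [ ! -f "$outputfilename" ]; then               # skip files already converted
        mkdir -p "$destDir/$dirName"                  # creates parents, no error if it exists
        ffmpeg -i "$filename" $opts "$outputfilename"
    fi
done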

Bash: loop ffmpeg command through sets of subfolders and direct it to files in the folders for processing

I am playing around with embedding captions into mp4 video files, and want to find a way to do this across large sets of directories with the .mp4 and .srt files in them. Each pair of .mp4 and .srt files will be subfoldered together in their own directory, and the basename should be the same between the two. Example:
Video1
  Video1.mp4
  Video1.srt
Video2
  Video2.mp4
  Video2.srt
I’ve tried several things but I’m a novice at this and only write very simple bash scripts for much more straightforward processes. For this I need to figure out how to write the bash script to run an ffmpeg command in every subfolder that will grab the mp4 and srt file and output a new mp4 of the merged data. The basic ffmpeg command to do this is:
ffmpeg -i filename.mp4 -i filename.srt -c copy -c:s mov_text output.mp4
I’ve tried to add:
for dir in ./*/; do ffmpeg -i *.mp4 -i *.srt -c copy -c:s move_text "$file".mp4
…and several variations of this, but ffmpeg always stops with a “*.mp4: No such file or directory” error. Then I tried to add "for file in..." after the "for dir in" statement but didn't have any positive results. The following is closest to what I need - it at least goes to each folder and processes the files - but it does them independently and doesn't combine the mp4 and srt source files as the ffmpeg command should. It outputs a video.mp4.mp4 and video.srt.mp4, and fails to combine them in either case.
for dir in ./**/*;
do ffmpeg -i "$dir" -i "$dir" -c copy -c:s mov_text "$dir".mp4
I tried "$dir".mp4 and "$dir".srt but that just results in an error. I tried to pull just directory names:
for dir in ./**/*;
do ffmpeg -i "$(basename $dir)" -i "$(basename $dir)" -c copy -c:s mov_text "$dir".mp4
and my attempts using "$(basename $dir).extension" have resulted in errors - it looks for video.mp4.mp4 or video.srt.mp4. Any tips as to what to add to get this process to work or another approach entirely would be greatly appreciated! I figure it's a simple bash thing I'm just ignorant of, but certainly need to learn how to do! Thanks!
Run this within the dir containing Video1/, Video2/...
#!/bin/bash -e
shopt -s globstar
for v in ./**/*.mp4; do
s=${v%.*}.srt
if [ -f "$s" ]; then
ffmpeg -i "$v" -i "$s" -c copy -c:s mov_text "${v##*/}"
fi
done
./**/*.mp4 expands to ./Video1/Video1.mp4 ./Video2/Video2.mp4 ...,
${v%.*} removes the extension (./Video1/Video1.mp4 > ./Video1/Video1),
[ -f "$s" ] checks if $s (i.e. ./Video1/Video1.srt) exists,
${v##*/} extracts the basename of $v (./Video1/Video1.mp4 > Video1.mp4).
So the final structure of . will be like:
Video1.mp4 # subbed
Video1
  Video1.mp4
  Video1.srt
Video2.mp4 # subbed
Video2
  Video2.mp4
  Video2.srt
As a tweak to the excellent answer by ogizismail, the below is an approach that works with versions of bash too old to support globstar:
while IFS= read -r -d '' v; do
s=${v%.mp4}.srt
[[ -e $s ]] && ffmpeg -i "$v" -i "$s" -c copy -c:s mov_text "${v##*/}"
done < <(find . -mindepth 2 -name '*.mp4' -printf '%P\0')
The general technique is discussed in Using Find. Using -mindepth 2 stops it from finding your already-subbed output files.
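If you would rather have each merged file written inside its own subfolder instead of the current directory, here is a small variation on the above (a sketch; the .subbed suffix is an arbitrary choice so that later runs can skip the outputs):
while IFS= read -r -d '' v; do
    s=${v%.mp4}.srt
    # write e.g. ./Video1/Video1.subbed.mp4 next to its sources
    [[ -e $s ]] && ffmpeg -i "$v" -i "$s" -c copy -c:s mov_text "${v%.mp4}.subbed.mp4"
done < <(find . -mindepth 2 -name '*.mp4' ! -name '*.subbed.mp4' -print0)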

Bash script that lists files in a directory doesn't work

I made a bash script because I need to convert a lot of files in a directory from .MOV to .mp4 format.
I created this script for the purpose:
#!/bin/bash
touch .lista
ls -1 "$1" | grep -i .MOV > .lista
list=`pwd`/.lista
cd "$1"
while read -r line;
do filename=${line%????}
ffmpeg -i "$line" -vcodec copy -acodec copy "$filename.mp4"; done < $list
rm .lista
This script is supposed to convert each .MOV file in the directory indicated by $1, but it doesn't work: it converts only one file and then terminates. I can't understand why. What's wrong with it?
It's better to simply loop using globs:
for file in "$1"/*.MOV; do
ffmpeg -i "$file" ... "${file%.*}.mp4"
done
Why you shouldn't parse the output of ls.
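With the stream-copy options from the question filled in, the loop becomes (a sketch):
for file in "$1"/*.MOV; do
    # copy the streams into an MP4 container without re-encoding
    ffmpeg -i "$file" -vcodec copy -acodec copy "${file%.*}.mp4"
done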
Do them all fast and succinctly in parallel with GNU Parallel like this:
parallel --dry-run ffmpeg -i {} -vcodec copy -acodec copy {.}.mp4 ::: movies/*MOV
Sample Output
ffmpeg -i movies/a.MOV -vcodec copy -acodec copy movies/a.mp4
ffmpeg -i movies/b.MOV -vcodec copy -acodec copy movies/b.mp4
If that looks good, do it again but without --dry-run.
Note how easily GNU Parallel takes care of all the loops, all the quoting and changing the extension for you.
Your code works for me; I cannot see any error. But I can suggest a better approach. Don't use ls to get the filenames; it is not a good idea. Also, you can avoid changing directories.
#!/bin/bash
for line in $(find "$1" -maxdepth 1 -type f -iname "*.mov")
do
ffmpeg -i "$line" -vcodec copy -acodec copy "${line%????}.mp4"
done
You don't need to start by touching the file. In fact, you don't need a file at all: you can use a for loop to iterate over the files returned by find directly. With find, I'm already selecting all the files in the specified folder that have the expected extension.
Here I add a one-liner that should avoid problems with spaces:
find "$1" -maxdepth 1 -type f -iname "*.mov" -print0 | xargs -0 -n 1 -I{} bash -c "F={}; ffmpeg -i \"\$F\" -vcodec copy -acodec copy \"\${F%.*}\".mp4"

Converting *.wma to *.mp3 by SHELL-script with mplayer, lame, and find

I want to convert my older *.wma files into *.mp3. For that purpose I found a short script that does the conversion using mplayer + lame (found here: https://askubuntu.com/questions/508625/python-v2-7-requires-to-install-plugins-to-play-media-files-of-the-following-t).
This works fine in a single directory. Now I want to improve it so that it works with 'find': the idea is that find locates a *.wma file and then calls the script to convert that file to *.mp3.
Here is the script:
FILENAME=$1
FILEPATH="$(dirname $1)"
BASENAME="$(basename $1)"
mplayer -vo null -vc dummy -af resample=44100 -ao pcm:waveheader "$FILENAME"
lame -m j -h --vbr-new -b 320 audiodump.wav -o "`basename "$FILENAME" .wma`.mp3"
echo "Path: $FILEPATH" # just to see if its correct
echo "File: $BASENAME" # just to see if its correct
rm -f audiodump.wav
rm -f "$FILENAME"
At the moment I'm dealing with the issue that the script puts the converted *.mp3 in the directory the console is working in (e.g. /home/user/ instead of /home/user/files/, where the *.wma comes from).
What can I do to make the script put the new *.mp3 into the same directory as the *.wma?
If I try to use 'mv' within the script, I get into trouble with embedded spaces in the *.wma filenames.
Thanks for any hints. I thought about setting IFS to tab or newline, but I wonder if there is a better way to deal with this.
Here's something that uses ffmpeg for the conversion (after using ffprobe to figure out what the bit_rate should be). It's based on what I found in https://askubuntu.com/questions/508278/how-to-use-ffmpeg-to-convert-wma-to-mp3-recursively-importing-from-txt-file. But I didn't have access to avprobe, so I had to hunt for an alternative.
First navigate to the directory with all your files and run the following from your shell:
find . -type f | grep wma$ > wma-files.txt
Once that's done, you can put this into a script and run it:
#!/usr/bin/env bash
readarray -t files < wma-files.txt
ffprobe=<your_path_here>/ffprobe
ffmpeg=<your_path_here>/ffmpeg
for file in "${files[#]}"; do
out=${file%.wma}.mp3
bit_rate=`$ffprobe -v error -show_entries format=bit_rate -of default=noprint_wrappers=1:nokey=1 "$file"`
$ffmpeg -i "$file" -vn -ar 44100 -ac 2 -ab "$bit_rate" -f mp3 "$out"
done
This will save the mp3 files alongside the wma ones.
The problem is that basename strips both the .wma extension and the path leading to the file, and you only want the .wma stripped.
So the answer is not to use basename, and instead to do the .wma stripping yourself (with parameter expansion).
outfile=${FILENAME%.wma}
lame -m j -h --vbr-new -b 320 audiodump.wav -o "$outfile.mp3"
(Note that I used lowercase $outfile. Generally $ALL_CAPS variables are reserved for the shell/terminal/environment and should be avoided in scripts.)
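Assembled into the whole per-file script (a sketch that keeps the question's mplayer/lame options; the temporary audiodump.wav is still written to whatever directory the command is started from):
#!/bin/bash
filename=$1
outfile=${filename%.wma}.mp3    # keeps the directory part, so the mp3 lands next to the wma

mplayer -vo null -vc dummy -af resample=44100 -ao pcm:waveheader "$filename"
lame -m j -h --vbr-new -b 320 audiodump.wav -o "$outfile"
rm -f audiodump.wav
rm -f "$filename"
It can then be invoked once per file, e.g. find /home/user/files -name '*.wma' -exec ./wma2mp3.sh {} \; (the script name is hypothetical).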

Bash while loop wait until task has completed

I have a bash script that I created to process videos from within a folder and its subfolders:
find . -type f -name '*.mkv' | while read file;
do
ffmpeg -i $file ...
done
The problem: instead of the while loop waiting for ffmpeg to complete, it continues to iterate through the loop. The end result is that files don't get processed. I need a way to make the current while-loop iteration wait until ffmpeg is complete before continuing to the next, or alternatively a way to queue these items.
Edit: The solution when iterating over a set of files is to pass the -nostdin parameter to ffmpeg. Hope this helps anyone else who has a similar issue.
Also, file --> $file was a copy/paste typo.
I realize I posted this a while ago, but I found the solution. Thanks for all of the responses. Providing the -nostdin parameter to ffmpeg does the trick: it processes only the current file before moving on to the next one.
ffmpeg's -nostdin option stops it from trying to read user input from standard input; without it, ffmpeg swallows the rest of the file list that the while loop is reading, so only the first file gets processed.
ffmpeg -i <filename> ... -nostdin
The best part about the above is that you can still control the verbosity in case an error shows up in the output:
ffmpeg -i <filename> ... -nostdin -loglevel panic
Or, if you would rather send the output to a report file, do it this way:
# Custom log name (optional). Helpful when multiple files are involved.
# FFREPORT=file=./${filename}-$(date +%H.%M.%S).log
ffmpeg -i <filename> ... -nostdin -report
You can also use a combination of the two. Thanks @Barmar for the solution!
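Put together with the loop from the question, that looks something like this (a sketch; the output options are only illustrative, substitute your own):
find . -type f -name '*.mkv' -print0 | while IFS= read -r -d '' file; do
    # -nostdin keeps ffmpeg from swallowing the rest of the piped file list
    ffmpeg -nostdin -i "$file" -c:v libx264 -c:a aac "${file%.mkv}.mp4"
done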
I think that this is as simple as you missing the $ before file.
find . -type f -name '*.mkv' | while read file;
do
ffmpeg -i $file ...
done
Does this work for you?
find . -type f -name '*.mkv' -exec ffmpeg -i {} \;
I'm a vicious answer snatcher. I found one:
ffmpeg -i $file &
wait $!
Thanks to puchu, here: apply ffmpeg to many files
