Why doesn't this work:
vim -d <(ffmpeg -i vid1.mp4 2>&1) <(ffmpeg -i vid2.mp4 2>&1)
and how can I get it to work?
Currently it just clears my screen completely and leaves my terminal frozen, unresponsive to everything: Ctrl-C, Ctrl-D, and Ctrl-Z. I have to quit my terminal every time.
You must use ffprobe (comes with ffmpeg) if you want an output suitable for diffing:
$ vim -d <(ffprobe -i vid1.mp4 2>&1) <(ffprobe -i vid2.mp4 2>&1)
Why?
ffmpeg is a media converter that prints a lot of things while processing, including some information about the source file. Using it without providing an output file/stream/whatever just to get information about the source is not how it is supposed to be used and, well… it doesn't work correctly anyway: you get your information, but the terminal may be left in a weird state and the command returns a non-zero status.
By using ffmpeg, you are essentially relying on a side-effect of using the wrong tool incorrectly.
ffprobe, on the other hand, exists specifically for getting information on the source file.
By using ffprobe, you are relying on the expected outcome of using the right tool correctly.
That said, ffprobe probably shares a lot of code with ffmpeg, because it also writes its report to stderr: you still need that 2>&1 hack to make its output Vim-friendly. Oh well…
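Why the 2>&1 matters can be shown without ffmpeg at all. Here's a minimal sketch using a stand-in function that, like ffprobe's report, writes to stderr (the function name and message are made up):

```shell
#!/bin/bash
# Stand-in for ffprobe: its human-readable report goes to stderr.
info() { echo "stream info for $1" >&2; }

# Without 2>&1 the process substitution carries only stdout: nothing.
without=$(cat <(info vid1.mp4 2>/dev/null))

# With 2>&1 stderr is folded into the pipe, so Vim (or cat) can read it.
with=$(cat <(info vid1.mp4 2>&1))

echo "without=[$without]"
echo "with=[$with]"
```

Vim's -d reads whatever comes out of the <(…) FIFOs, so anything left on stderr simply never reaches it.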
Related
I want to convert a list of flac files to mp3 using ffmpeg.
I have written the list of files to convert in a file.
Here is my script
#!/bin/bash
while read -r line
do
    ffmpeg -i "$line" -ab 320k "${line%.flac}.mp3"
done < flac_list
It works, however when a filename contains a single quote, it does not work.
And here begins my escaping nightmare.
I have tried dozens of combinations without finding one that works.
Could someone help?
Thanks to @chepner, adding the -nostdin flag to the ffmpeg invocation solves the issue.
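For the record, the quoting in the loop itself was never the problem: "$line" and the ${line%.flac} expansion treat the value as opaque data, single quotes included. A minimal check (the filename is made up):

```shell
#!/bin/bash
# Parameter expansion never re-parses quotes inside a variable's value.
line="Don't Stop Me Now.flac"
out="${line%.flac}.mp3"
echo "$out"    # -> Don't Stop Me Now.mp3
```

The actual culprit was ffmpeg reading from the loop's redirected stdin, which -nostdin switches off.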
I have found, too, that ffmpeg sometimes has surprising problems when it encounters certain characters in a filename. I stumbled over this while converting m4a to mp3 with a script. I didn't know that an innocent single quote was one of them.
What you can do - aside from reporting a bug to ffmpeg - is test whether your filename contains an unwanted character, and either rename the file or create a symbolic link to it under a "good" name, then undo these changes when your conversion is done.
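A minimal sketch of that symlink workaround; the function name is made up and the echo is a stand-in for the actual ffmpeg call:

```shell
#!/bin/bash
# Hypothetical helper: route "bad" names through a quote-free symlink.
convert_safely() {
    local src="$1"
    case "$src" in
        *"'"*)
            local safe
            safe="$(mktemp -u)-safe.flac"   # temporary quote-free name
            ln -s "$PWD/$src" "$safe"
            echo "converting via $safe"     # stand-in for the ffmpeg call
            rm "$safe"                      # undo the change afterwards
            ;;
        *)
            echo "converting $src directly"
            ;;
    esac
}

convert_safely "plain.flac"
convert_safely "it's got a quote.flac"
```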
I am using a small program, written in bash by someone else, that runs from cron on my Synology NAS; basically it searches for subtitles for my movie collection and converts their encoding to UTF-8 if needed.
In general, the main bash script calls other sub-scripts, and unfortunately it doesn't work 100% as it should. During my investigation I have narrowed the problem down to this specific function in one of the sub-scripts:
subs_getCharset_SO() {
    local file="$1"
    local charset=
    local et=

    tools_isDetected "file" || return $G_RETFAIL

    et=$(file \
        --brief \
        --mime-encoding \
        --exclude apptype \
        --exclude tokens \
        --exclude cdf \
        --exclude compress \
        --exclude elf \
        --exclude soft \
        --exclude tar \
        "$file" | wrappers_lcase_SO) || {
        return $G_RETFAIL
    }

    case "$et" in
        *utf*) charset="UTF8";;
        *iso*) charset="ISO-8859-2";;
        us-ascii) charset="US-ASCII";;
        csascii) charset="CSASCII";;
        *ascii*) charset="ASCII";;
        *) charset="WINDOWS-1250";;
    esac

    echo "$charset"
}
It turns out that running the file command on any movie file always ends in a Segmentation fault. I reproduced it by running this command manually in the terminal:
admin@Synek:/volume1/video/Filmy/Ghostland.2018$ file --brief --mime-encoding Ghostland.2018.txt
The output is:
utf-8
Segmentation fault
So the main problem, I think, is that the output of the file command is not assigned to the et variable. Ideally I would like to capture the first line of the output and assign it to et, or at least redirect the output to a file. So far I have tried some solutions that I found on the web:
admin@Synek:/volume1/video/Filmy/Ghostland.2018$ { file --brief --mime-encoding ./Ghostland.2018.txt; } 2> log
which prints just the line I need in the terminal and omits the Segmentation fault message:
utf-8
Running:
admin@Synek:/volume1/video/Filmy/Ghostland.2018$ cat log
Gives:
Segmentation fault
But I just can't find a way to capture that first line, the one printed before the Segmentation fault, into a file.
Any help appreciated!
When stdout is connected to a TTY, GNU libc (like most implementations) configures line-buffering by default, so output written with the standard C library is printed whenever a full line is complete (since it's assumed that a human is watching and wants to see results as soon as they're available, even if that makes overall execution take longer). By contrast, when stdout is a FIFO or a file, a larger output buffer is used for better efficiency.
Because a SIGSEGV doesn't allow a program to flush its buffers, that means that data still in the buffer at the time of the failure is lost.
On a system with GNU coreutils, you can configure unbuffered or line-buffered stdout (by default, programs can still override it) using the tool stdbuf:
result=$(stdbuf -o0 file --brief --mime-encoding ./Ghostland.2018.txt)
...or, on systems without GNU coreutils but with expect installed, you can use the tool unbuffer:
result=$(unbuffer file --brief --mime-encoding ./Ghostland.2018.txt)
See BashFAQ #9 for more background on buffering and its control from the shell.
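The loss is easy to reproduce without the broken file binary. The sketch below uses a small python3 program as a stand-in for any process that prints one line and then dies of SIGSEGV; note that stdbuf only influences C-stdio programs (like file), so python3's own -u flag plays the role of stdbuf -o0 here:

```shell
#!/bin/bash
# Stand-in crasher: print one line, then deliver SIGSEGV to ourselves.
crasher='import os, signal, sys
sys.stdout.write("utf-8\n")
os.kill(os.getpid(), signal.SIGSEGV)'

# stdout is a pipe inside $(...), so it is block-buffered: the line
# is still sitting in the buffer when the process dies, and is lost.
lost=$(python3 -c "$crasher" 2>/dev/null)

# Unbuffered stdout (the role stdbuf -o0 plays for file): the line
# reaches the pipe before the crash.
kept=$(python3 -u -c "$crasher" 2>/dev/null)

echo "lost=[$lost] kept=[$kept]"    # -> lost=[] kept=[utf-8]
```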
So, I made a script for Cygwin that uses Windows builds of ImageMagick and FFmpeg, but I am not sure whether the results here also apply to bash on Linux. I have some cartoon video files, and I'm using Waifu2x to enhance and upscale the frames to 4K, then ImageMagick to pipe them to FFmpeg, which also resizes to 3840x2160 in case the resolution is slightly different. Here's a small script I wrote for this example to simplify how it outputs to FFmpeg, as the real script is extremely lengthy and complex.
#!/bin/bash
fun() {
    convert out.png JPG:- | tee "$outfile"
}
fun | ffmpeg -f image2pipe -r 60 -i - -c:v libx265 -movflags +faststart "$outputfile"
Now, what I noticed is that if FFmpeg fails to encode, the function continues but fails to output to $outfile. I want it to still write to that file when encoding fails, since I also write all the images to a cache folder for FFmpeg to run through in that case - but I want to write to both the FFmpeg pipe and the file at the same time. What seems to be happening is that tee refuses to write to the file if it can't write to the pipe. I'm not sure if this behavior is intended, and/or whether it also happens in bash on Linux. How can I get it to write to the file even when it can't write to the pipe, writing to both at once rather than writing to the file and reading it back into the pipe?
Have you tried tee with the -p option? It makes tee keep writing even if it can't write to its standard output, which in your case means it should cope when ffmpeg fails.
fun() {
    convert out.png JPG:- | tee -p "$outfile"
}
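A way to see the difference outside the script (GNU tee assumed): feed tee more data than a pipe buffer holds while the reader quits early. Without -p, tee dies of SIGPIPE and the file is cut short; with -p, the file still gets everything:

```shell
#!/bin/bash
n=100000                          # enough lines to overflow the pipe buffer
f_plain=$(mktemp)
f_p=$(mktemp)

# The reader exits after one line, so tee's later writes hit a closed pipe.
seq "$n" | tee "$f_plain" 2>/dev/null | head -n 1 >/dev/null
seq "$n" | tee -p "$f_p"  2>/dev/null | head -n 1 >/dev/null

plain_lines=$(wc -l < "$f_plain")
p_lines=$(wc -l < "$f_p")
echo "plain: $plain_lines lines; with -p: $p_lines lines"
rm -f "$f_plain" "$f_p"
```

The -p run always records all 100000 lines; the plain run stops wherever the SIGPIPE happened to land.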
I am trying to use youtube-dl to get the urls of some videos and then pipe the resulting urls into the input of my script. So in my terminal I do
youtube-dl --ignore-config -iga ~/Desktop/youtube/videolist.txt | myscript.sh
In my script I define things as
command='ffmpeg'
inputArgs='-i'
outputArgs='-c:v libx264 -preset ultrafast -qp 0'
directory="${HOME}/Desktop/Videos/"
output="video${count}"
extension='mp4'
I test it with echo to make sure everything appears in the correct order.
echo "${command}" "${inputArgs}" "${input}" "${outputArgs}" \
"${directory}""${output}${count}"."${extension}"
And the output from that looks correct. But when I try to run the same thing without the preceding echo command, i.e.,
"${command}" "${inputArgs}" "${input}" "${outputArgs}" \
"${directory}""${output}${count}"."${extension}"
I get an error message that says
At least one output file must be specified.
So it seems pretty obvious to me that I'm doing something wrong when attempting to execute it.
I have tried:
quoting the entire line as a whole
quoting different sections together
using the exec command in front of everything
No matter what I do, an error occurs at some point in the process. I know it's something simple I'm doing wrong. Would someone please enlighten me as to what that might be?
I feel very strongly that the . shouldn't just be in the middle of everything like that, but I really don't know.
Again, everything looks as it should when I run echo before the string of shell parameters.
If more of the script I'm using is needed to understand what I'm talking about, that is not a problem.
The problem is that, because you put it in quotes, "${outputArgs}" expands to a single argument. It doesn't get split up into separate arguments, so ffmpeg sees it as a single -c option with a really long stream specifier. The next argument, the output file, is then interpreted as the codec instead.
To fix the problem simply remove the quotes:
"$command" $inputArgs "$input" $outputArgs "$directory$output$count.$extension"
I removed the curly braces ({}) just to save space. There's nothing wrong with using them if you prefer.
You tried to rely on a single string to convey multiple arguments. You probably want to use an array in all cases like this. An array is easy to use, more versatile (it works with arbitrary strings), and you don't have to walk on as many eggshells to avoid quirks and security holes, unlike when leaving the quotes off.
command='ffmpeg'
inputArgs=(-i)
outputArgs=(-c:v libx264 -preset ultrafast -qp 0
-metadata 'title=another * string'
-metadata 'artist=; echo "Just a string'
-metadata "comment=Processed by my ${command} script.")
directory="${HOME}/Desktop/Videos/"
output="video${count}"
extension='mp4'
outputArgs+=(-metadata "track=${count}")
When expanding an array, the reference must have the {} around it. When used in the form "${name[@]}", it behaves as if you had typed the contents on the line directly.
printf '%q\n' is a more useful way of examining a command than echo, as it shows clearly what belongs to which separate argument. You can also expand an array into another array:
whole_thing=("${command}" "${inputArgs[@]}" "${input}"
             "${outputArgs[@]}"
             "${directory}/${output}"."${extension}")

# This will output lines you can paste
# into bash to recreate the array:
printf 'whole_thing=()\n'
printf 'whole_thing+=(%q)\n' "${whole_thing[@]}"

# This will run the command:
"${whole_thing[@]}"
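A compact way to see the string-versus-array difference, using a printf format that brackets each argument the command would receive (variable names here are made up):

```shell
#!/bin/bash
args_str='-c:v libx264 -preset ultrafast'
args_arr=(-c:v libx264 -preset ultrafast)

str_view=$(printf '<%s>' "$args_str")       # quoted string: one argument
arr_view=$(printf '<%s>' "${args_arr[@]}")  # array: four arguments

echo "$str_view"   # -> <-c:v libx264 -preset ultrafast>
echo "$arr_view"   # -> <-c:v><libx264><-preset><ultrafast>
```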
There seems to be a bug in my bash script, and after a long time I managed to reduce it to this test case:
find . -maxdepth 1 | while read blah
do
    echo "$blah"
    ffmpeg -loglevel error -i ./test.jpg -f null /dev/null
done
the output from this is
/test.jpg
/test.mp4
/test.sh
if I remove the ffmpeg invocation, the output becomes this (what I expected):
./test.jpg
./test.mp4
./test.sh
This seems to occur only when the ffmpeg decoder is actually used, as ffmpeg -version doesn't trigger the problem. Why would ffmpeg affect an unrelated string in this way?
I'm at my wit's end; any help would be appreciated.
FFmpeg is eating your standard input. Add -nostdin to the invocation inside the loop so it leaves the loop's input alone:

find . -maxdepth 1 | while read -r blah
do
    echo "$blah"
    ffmpeg -nostdin -loglevel error -i ./test.jpg -f null /dev/null
done
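What -nostdin prevents can be shown with plain cat standing in for ffmpeg: any command that reads stdin inside the loop swallows the remaining lines of the list the loop is iterating over.

```shell
#!/bin/bash
list=$(mktemp)
printf 'one\ntwo\nthree\n' > "$list"

greedy=0
while IFS= read -r line; do
    cat > /dev/null                 # stand-in for ffmpeg: eats stdin
    greedy=$((greedy + 1))
done < "$list"

fixed=0
while IFS= read -r line; do
    cat < /dev/null > /dev/null     # stand-in for ffmpeg -nostdin
    fixed=$((fixed + 1))
done < "$list"

rm -f "$list"
echo "greedy=$greedy fixed=$fixed"  # -> greedy=1 fixed=3
```

In the original question the symptom is subtler: ffmpeg only consumed part of the buffered input, so the leading "." of each subsequent filename disappeared.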