spaces in ffmpeg bash script filename variables - bash

I am using OS X Yosemite. I want to make a simple bash script that lets me transcode a video. Everything works well as long as the file is not located in a directory whose path contains spaces.
Here is my script:
#!/bin/sh
ffmpeg -i “$1” -c:v ffv1 -level 3 -g 1 -c:a copy “$1.mkv”
Initially, I did not have the double quotes around the variable, but I read some Stack Overflow answers that suggested them as a solution. I'd rather not have to alter the path or add backslashes, etc. I want to just run:
./script.sh filename.mov
Here's the error I get:
Kierans-iMac:~ bla$ ./firstscript.command "/Users/bla/Desktop/untitled\ folder\ 2/v210.mov.mkv"
ffmpeg version 2.7.2 Copyright (c) 2000-2015 the FFmpeg developers
built with Apple LLVM version 6.1.0 (clang-602.0.53) (based on LLVM 3.6.0svn)
configuration: --prefix=/usr/local/Cellar/ffmpeg/2.7.2_1 --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-opencl --enable-libx264 --enable-libmp3lame --enable-libvo-aacenc --enable-libxvid --enable-libfreetype --enable-libfaac --enable-libass --enable-ffplay --enable-libopenjpeg --disable-decoder=jpeg2000 --extra-cflags='-I/usr/local/Cellar/openjpeg/1.5.2_1/include/openjpeg-1.5 ' --enable-nonfree --enable-vda
libavutil 54. 27.100 / 54. 27.100
libavcodec 56. 41.100 / 56. 41.100
libavformat 56. 36.100 / 56. 36.100
libavdevice 56. 4.100 / 56. 4.100
libavfilter 5. 16.101 / 5. 16.101
libavresample 2. 1. 0 / 2. 1. 0
libswscale 3. 1.101 / 3. 1.101
libswresample 1. 2.100 / 1. 2.100
libpostproc 53. 3.100 / 53. 3.100
“/Users/bla/Desktop/untitled\: No such file or directory

You are quoting the spaces twice:
./firstscript.command "/Users/bla/Desktop/untitled\ folder\ 2/v210.mov.mkv"
should just be
./firstscript.command "/Users/bla/Desktop/untitled folder 2/v210.mov.mkv"
A quoted string "foo" is equivalent to \f\o\o: every character inside the quotes is already escaped. The backslashes in your original are therefore passed along as literal backslash characters, producing a path that does not exist on disk.
Inside the script, you still need to quote the parameter expansion:
ffmpeg -i "$1" -c:v ffv1 -level 3 -g 1 -c:a copy "$1.mkv"
Note that you have to use regular ASCII quotes ("), not the "smart" quotes (“) that appear in your script and in the error message.
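Putting it together, the script with plain ASCII quotes looks like this (a minimal sketch of the same two-line script):
#!/bin/sh
# Double-quote the parameter expansion so paths with spaces survive word splitting.
ffmpeg -i "$1" -c:v ffv1 -level 3 -g 1 -c:a copy "$1.mkv"
Call it with the path quoted but not backslash-escaped, e.g. ./script.sh "/Users/bla/Desktop/untitled folder 2/v210.mov". If you would rather not end up with names like v210.mov.mkv, writing "${1%.*}.mkv" instead strips the original extension before appending .mkv.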

Related

Compiled FFmpeg not accepting -c:v and -c:a

I compiled FFmpeg with libsrt, following the online compile guide (https://trac.ffmpeg.org/wiki/CompilationGuide/Ubuntu) and a guide on how to compile ffmpeg with libsrt enabled.
It seems to compile correctly.
ffmpeg version N-96575-g843c24a Copyright (c) 2000-2020 the FFmpeg developers
built with gcc 7 (Ubuntu 7.4.0-1ubuntu1~18.04.1)
configuration: --prefix=/home/ubuntu/ffmpeg_build --pkg-config-flags=--static --extra-cflags=-I/home/ubuntu/ffmpeg_build/include --extra-ldflags=-L/home/ubuntu/ffmpeg_build/lib --extra-libs='-lpthread -lm' --bindir=/home/ubuntu/bin --enable-gpl --enable-libaom --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-libsrt --enable-nonfree
libavutil 56. 38.100 / 56. 38.100
libavcodec 58. 67.100 / 58. 67.100
libavformat 58. 37.100 / 58. 37.100
libavdevice 58. 9.103 / 58. 9.103
libavfilter 7. 72.100 / 7. 72.100
libswscale 5. 6.100 / 5. 6.100
libswresample 3. 6.100 / 3. 6.100
libpostproc 55. 6.100 / 55. 6.100
But when I run this command to convert an incoming SRT stream to HLS, it doesn't recognize the -c:a option. When I switch the order, it complains that it doesn't recognize -c:v.
ffmpeg -re -i srt://0.0.0.0:25000?pkt_size=1316&mode=listener -c:a copy -c:v copy -strict -f hls -hls_time 4 -hls_playlist_type event stream.m3u8
~$ ffmpeg -re -i srt://0.0.0.0:25000?pkt_size=1316&mode=listener -c:a copy -c:v copy -strict -f hls -hls_time 4 -hls_playlist_type event stream.m3u8
[2] 9930
ffmpeg version N-96575-g843c24a Copyright (c) 2000-2020 the FFmpeg developers
built with gcc 7 (Ubuntu 7.4.0-1ubuntu1~18.04.1)
configuration: --prefix=/home/ubuntu/ffmpeg_build --pkg-config-flags=--static --extra-cflags=-I/home/ubuntu/ffmpeg_build/include --extra-ldflags=-L/home/ubuntu/ffmpeg_build/lib --extra-libs='-lpthread -lm' --bindir=/home/ubuntu/bin --enable-gpl --enable-libaom --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-libsrt --enable-nonfree
libavutil 56. 38.100 / 56. 38.100
libavcodec 58. 67.100 / 58. 67.100
libavformat 58. 37.100 / 58. 37.100
libavdevice 58. 9.103 / 58. 9.103
libavfilter 7. 72.100 / 7. 72.100
libswscale 5. 6.100 / 5. 6.100
libswresample 3. 6.100 / 3. 6.100
libpostproc 55. 6.100 / 55. 6.100
-c:a: command not found
[2]+ Stopped ffmpeg -re -i srt://0.0.0.0:25000?pkt_size=1316
I have searched for this issue, but I could not find anything similar.
Does anyone know what I have missed in the setup?
Everything was compiled manually by following the guide; this is the final command I ran to compile FFmpeg:
cd ~/ffmpeg_sources && \
wget -O ffmpeg-snapshot.tar.bz2 https://ffmpeg.org/releases/ffmpeg-snapshot.tar.bz2 && \
tar xjvf ffmpeg-snapshot.tar.bz2 && \
cd ffmpeg && \
PATH="$HOME/bin:$PATH" PKG_CONFIG_PATH="$HOME/ffmpeg_build/lib/pkgconfig" ./configure \
--prefix="$HOME/ffmpeg_build" \
--pkg-config-flags="--static" \
--extra-cflags="-I$HOME/ffmpeg_build/include" \
--extra-ldflags="-L$HOME/ffmpeg_build/lib" \
--extra-libs="-lpthread -lm" \
--bindir="$HOME/bin" \
--enable-gpl \
--enable-libaom \
--enable-libass \
--enable-libfdk-aac \
--enable-libfreetype \
--enable-libmp3lame \
--enable-libopus \
--enable-libvorbis \
--enable-libvpx \
--enable-libx264 \
--enable-libx265 \
--enable-libsrt \
--enable-nonfree && \
PATH="$HOME/bin:$PATH" make && \
make install && \
hash -r
This has nothing to do with ffmpeg. The unquoted & is the shell's background operator, so the shell cuts the URL off after pkt_size=1316, runs that truncated ffmpeg command in the background, and executes everything from mode=listener onward as a separate command. That is why the shell, not ffmpeg, reports -c:a: command not found.
Enclose the full URL in quotes to avoid this.
"srt://0.0.0.0:25000?pkt_size=1316&mode=listener"

How can I merge two .g729 files into one .wav file with ffmpeg?

How can I merge two .g729 files into one .wav file with ffmpeg?
I used the command
ffmpeg -filter_complex [0\:a][1\:a]amerge\=inputs\=2[aout] -map [aout]
The result is that the two input files are reported as .wav rather than .g729, and the output is reported as .wav.
Here is the FFmpeg version information:
ffmpeg version git-2015-07-01-06a0d5e Copyright (c) 2000-2015 the FFmpeg developers
built with gcc 4.3 (SUSE Linux)
configuration: --prefix=/home/cartella/Encoder_Build --extra-cflags=-I/home/cartella/Encoder_Build/include --extra-ldflags=-L/home/cartella/Encoder_Build/lib --bindir=/home/cartella/Encoder --pkg-config-flags=--static --enable-gpl --enable-nonfree --enable-libfdk_aac --enable-libfreetype --enable-libmp3lame --enable-libopus --disable-yasm
libavutil 54. 27.100 / 54. 27.100
libavcodec 56. 46.100 / 56. 46.100
libavformat 56. 40.100 / 56. 40.100
libavdevice 56. 4.100 / 56. 4.100
libavfilter 5. 19.100 / 5. 19.100
libswscale 3. 1.101 / 3. 1.101
libswresample 1. 2.100 / 1. 2.100
libpostproc 53. 3.100 / 53. 3.100
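For reference, a complete amerge invocation needs the inputs and an output file spelled out. A minimal sketch of the command the question seems to be aiming for (the filenames are hypothetical, and -f g729 is only needed if ffmpeg does not detect the raw G.729 bitstream by itself):
ffmpeg -f g729 -i first.g729 -f g729 -i second.g729 -filter_complex "[0:a][1:a]amerge=inputs=2[aout]" -map "[aout]" output.wav
Note that amerge combines the two inputs into a single multichannel stream; if the goal is to play them one after the other, the concat filter or demuxer is the usual tool instead.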

Spaces in filename for ffmpeg in bash [duplicate]

This question already has answers here:
When to wrap quotes around a shell variable?
(5 answers)
Closed 6 years ago.
I am trying to write a simple transcoding script for VHS backups with ffmpeg, but I am failing to handle spaces in filenames.
I build the ffmpeg command in the script and echo it; when I copy and paste the echoed command it works, but it does not work directly from the script.
Does anyone have an idea what is wrong with my script?
Script:
#!/bin/bash
# VHStoMP4Backup Script
INPUT=$1
OUTPUT="/Volumes/Data/oliver/Video/Encodiert/${2}"
command="ffmpeg \
-i \"$INPUT\" \
-vcodec copy \
-acodec copy \
\"$OUTPUT\""
if [ ! -z "$1" ] && [ ! -z "$2" ] ;
then
echo ${command}$'\n'
${command}
else
echo "missing parameters"
echo "Usage: script INPUT_FILENAME OUTPUT_FILENAME"
fi
exit
Script calling:
./VHStoMP4Backup.sh /Volumes/Data/oliver/Video/RAW\ Aufnahmen/Ewelina\ -\ Kasette\ 1.dv ewe.mp4
Commandline output
olivers-mac-pro:Desktop oliver$ ./VHStoMP4Backup.sh /Volumes/Data/oliver/Video/RAW\ Aufnahmen/Ewelina\ -\ Kasette\ 1.dv ewe.mp4
ffmpeg -i "/Volumes/Data/oliver/Video/RAW Aufnahmen/Ewelina - Kasette 1.dv" -vcodec copy -acodec copy "/Volumes/Data/oliver/Video/Encodiert/ewe.mp4"
ffmpeg version git-2016-04-16-60517c3 Copyright (c) 2000-2016 the FFmpeg developers
built with Apple LLVM version 5.1 (clang-503.0.40) (based on LLVM 3.4svn)
configuration: --prefix=/usr/local/Cellar/ffmpeg/HEAD --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-opencl --enable-libx264 --enable-libmp3lame --enable-libxvid --enable-libfreetype --enable-libvorbis --enable-libvpx --enable-librtmp --enable-libfaac --enable-libass --enable-libssh --enable-libspeex --enable-libfdk-aac --enable-openssl --enable-libopus --enable-libvidstab --enable-libx265 --enable-nonfree --enable-vda
libavutil 55. 22.100 / 55. 22.100
libavcodec 57. 34.102 / 57. 34.102
libavformat 57. 34.101 / 57. 34.101
libavdevice 57. 0.101 / 57. 0.101
libavfilter 6. 42.100 / 6. 42.100
libavresample 3. 0. 0 / 3. 0. 0
libswscale 4. 1.100 / 4. 1.100
libswresample 2. 0.101 / 2. 0.101
libpostproc 54. 0.100 / 54. 0.100
"/Volumes/Data/oliver/Video/RAW: No such file or directory
Never store a command and its arguments in a regular variable, expecting to execute the command simply by expanding the variable.
Use an array to store the arguments, then expand the array when you call the actual command.
if [ $# -lt 2 ]; then
    echo "missing parameters"
    echo "Usage: script INPUT_FILENAME OUTPUT_FILENAME"
else
    INPUT=$1
    OUTPUT="/Volumes/Data/oliver/Video/Encodiert/${2}"
    args=( -i "$INPUT" -vcodec copy -acodec copy "$OUTPUT" )
    ffmpeg "${args[@]}"
fi
You need to do a little more work to log the command properly, but that is a small price to pay for safe, correct code.
printf 'ffmpeg'
printf ' %q' "${args[@]}"
printf '\n'
(The logged command won't look exactly like you expect, but it can be used as a valid command line to run the same command. In particular, the %q specifier tends to escape characters individually with a backslash instead of putting longer strings in quotes.)
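For example, with the path from the question, the logged line would come out roughly as follows (note the backslash-escaped spaces produced by %q):
ffmpeg -i /Volumes/Data/oliver/Video/RAW\ Aufnahmen/Ewelina\ -\ Kasette\ 1.dv -vcodec copy -acodec copy /Volumes/Data/oliver/Video/Encodiert/ewe.mp4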

FFmpeg for marking time video based on a reference date

I am trying to mark a timestamp in a video using drawtext filter.
FFmpeg easily marks timestamps based on localtime, gmtime or even PTS. However, I want to assign a reference time (start time) for the timestamp in order to represent the time the video was recorded (not encoded).
Reading the documentation, I found that the basetime option can be used for this purpose. However, it seems that it is not working, or I am missing something.
The command line I am using is:
ffmpeg -y -i input.mp4 -filter_complex drawtext="fontfile=/tmp/UbuntuMono-B.ttf: fontsize=36: fontcolor=yellow: box=1: boxcolor=black#0.4: text='Wall Clock Time\: %{gmtime\:%Y-%m-%d %T}': basetime=1456007118" output.mp4
By using basetime=1456007118, I expected the start time to be set to '02/20/2016 20:25:18', since 1456007118 is the Unix timestamp for that date and time:
date -d '02/20/2016 20:25:18' +"%s" # format MM/DD/YYYY hh:mm:ss
1456007118
However, no error is issued by FFmpeg, and the video is marked with the current GMT time, ignoring the basetime option.
Any hint?
Thanks.
Complete information about FFmpeg version and output is:
ffmpeg -y -i /home/denio/Videos/Interstellar_2014_Trailer_4_5.1-1080p-HDTN.mp4 -filter_complex drawtext="fontfile=/tmp/UbuntuMono-B.ttf: fontsize=36: fontcolor=yellow: box=1: boxcolor=black#0.4: text='Wall Clock Time\: %{gmtime\:%Y-%m-%d %T}': basetime=1470226363" /tmp/x.mp4
ffmpeg version 3.1.1 Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 5.3.1 (Ubuntu 5.3.1-14ubuntu2.1) 20160413
configuration: --enable-libxavs --enable-bzlib --enable-libfaac --enable-libfreetype --enable-libfontconfig --enable-libmp3lame --enable-libschroedinger --enable-libspeex --enable-libvorbis --enable-libx264 --enable-libx265 --enable-libxvid --enable-zlib --enable-x11grab --enable-static --enable-pthreads --enable-gpl --enable-nonfree --enable-version3 --disable-ffserver --enable-libgsm --enable-librtmp --enable-libvpx --enable-libschroedinger --enable-libopencore-amrnb --enable-libopenjpeg
libavutil 55. 28.100 / 55. 28.100
libavcodec 57. 48.101 / 57. 48.101
libavformat 57. 41.100 / 57. 41.100
libavdevice 57. 0.101 / 57. 0.101
libavfilter 6. 47.100 / 6. 47.100
libswscale 4. 1.100 / 4. 1.100
libswresample 2. 1.100 / 2. 1.100
libpostproc 54. 0.100 / 54. 0.100
...
...
I see basetime in the source code, but not in the web documentation, so I'm not sure how it's supposed to work.
You can instead use the pts function.
ffmpeg -y -i input.mp4 -vf "drawtext=fontfile=/tmp/UbuntuMono-B.ttf:
fontsize=36:fontcolor=yellow:
box=1:boxcolor=black#0.4:
text='Wall Clock Time\: %{pts\:gmtime\:1456007118}'"
output.mp4
You may need to reset PTS (setpts=PTS-STARTPTS) before the drawtext.
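If the input's timestamps do not start at zero, a sketch of the combined chain looks like this (same drawtext options as above, with the PTS reset folded in front):
ffmpeg -y -i input.mp4 -vf "setpts=PTS-STARTPTS,drawtext=fontfile=/tmp/UbuntuMono-B.ttf:fontsize=36:fontcolor=yellow:box=1:boxcolor=black#0.4:text='Wall Clock Time\: %{pts\:gmtime\:1456007118}'" output.mp4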

FFMPEG Video encoding error

I am trying to encode a video using ffmpeg. I have saved each frame as a separate image, and I am joining the images into a video with ffmpeg, which I compiled from source.
This is the command I used and the errors I keep running into!
---:/media/New Volume/temp$ ffmpeg -f image2 -i image%1d.png -vcodec libx264 \
-preset ultrafast -crf 15 output.mp4
ffmpeg version git-2012-03-24-2571506 Copyright (c) 2000-2012 the FFmpeg developers
built on Mar 24 2012 03:47:02 with gcc 4.6.1
configuration: --enable-gpl --enable-libfaac --enable-libmp3lame \
--enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libtheora \
--enable-libvorbis --enable-libx264 --enable-nonfree --enable-version3 \
--enable-x11grab
libavutil 51. 44.100 / 51. 44.100
libavcodec 54. 12.100 / 54. 12.100
libavformat 54. 2.100 / 54. 2.100
libavdevice 53. 4.100 / 53. 4.100
libavfilter 2. 65.102 / 2. 65.102
libswscale 2. 1.100 / 2. 1.100
libswresample 0. 7.100 / 0. 7.100
libpostproc 52. 0.100 / 52. 0.100
// ERRORS:
[png # 0xa4d5120] Missing png signature
[image2 # 0xa4ceb00] decoding for stream 0 failed
[image2 # 0xa4ceb00] Could not find codec parameters (Video: png)
image%1d.png: could not find codec parameters
It's having an error reading your image frames.
Do they open correctly with other software? Are they named image1.png, image2.png, etc.?
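A quick way to check from the shell (a suggestion beyond the original answer; file only inspects the header, it does not modify anything):
file image1.png
# A real frame should report something like "PNG image data, 1920 x 1080, ...";
# any other output means the file does not start with a PNG signature, which
# matches the "Missing png signature" error above.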
