FFMPEG Overlay Not Found For Putting Watermark in Video

I am trying to put a logo in an RTMP stream using ffmpeg. My ffmpeg version is 4.3.1. Currently my complex filter is:
ffmpeg -re -i 'video.mp4' -filter_complex "tpad=start_duration=10:stop_duration=15:start_mode=add:color=black:stop_mode=add" -f flv rtmp://example.com/a/stream
And it works! But when I add :overlay=0:0 at the end:
ffmpeg -re -i 'video.mp4' -i image.jpeg -filter_complex "tpad=start_duration=10:stop_duration=15:start_mode=add:color=black:stop_mode=add:overlay=0:0" -f flv rtmp://example.com/a/stream
I get the errors:
[Parsed_tpad_0 @ 0x555bc5d99f40] Option 'overlay' not found
[AVFilterGraph @ 0x555bc5e7a980] Error initializing filter 'tpad' with args 'start_duration=10:stop_duration=15:start_mode=add:color=black:stop_mode=add:overlay=0:0'
Error initializing complex filters.
Option not found
What might I be doing wrong?

overlay is a separate filter, not an option of tpad, and it requires two inputs while you are only giving it one. Filters in the same linear chain are separated by commas (,) and distinct linear chains of filters are separated by semicolons (;). See the FFmpeg Filtering Introduction. Use:
ffmpeg -re -i 'video.mp4' -i image.jpeg -filter_complex "tpad=start_duration=10:stop_duration=15:start_mode=add:color=black:stop_mode=add[bg];[bg][1]overlay" -f flv rtmp://example.com/a/stream
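If you want the logo somewhere other than the top-left corner, overlay takes x:y expressions. A sketch placing it 10 px in from the top-right corner (same inputs as above; the offsets are just an example):
ffmpeg -re -i 'video.mp4' -i image.jpeg -filter_complex "tpad=start_duration=10:stop_duration=15:start_mode=add:color=black:stop_mode=add[bg];[bg][1]overlay=W-w-10:10" -f flv rtmp://example.com/a/stream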

Related

Merging two audio files over a silent track

I'm using FFmpeg and I want to use a silent track as a template. I want to take the audio from two WebM files and concatenate them, but the second audio should start later, with silence between the two. How would I do that?
This is what I currently have:
ffmpeg -i W1.webm -itsoffset 10 -i W2.webm -f lavfi -t 600 -i anullsrc=cl=stereo -filter_complex '[0:1][1:1][2:1] amerge=inputs=3' output.webm
Furthermore, I want to end the output at the end of the second audio stream.
No need to use amerge; the concat filter will work.
ffmpeg -i W1.webm -i W2.webm -filter_complex '[1:a]adelay=10s|10s[a1];[0:a][a1]concat=n=2:v=0:a=1' -ac 2 output.webm
Use ffmpeg 4.2 or newer; the s (seconds) suffix for adelay is not available in older versions.
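If you are stuck on an older build, a sketch of the same graph using adelay's default millisecond units instead of the 10s suffix:
ffmpeg -i W1.webm -i W2.webm -filter_complex '[1:a]adelay=10000|10000[a1];[0:a][a1]concat=n=2:v=0:a=1' -ac 2 output.webm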

How to use multiple palettegen arguments in FFMPEG

I am creating a palette with FFMPEG using the following:
ffmpeg -i movie.mov -vf palettegen=max_colors=5 palette.gif
I want to use additional features such as reserve_transparent, as mentioned in the FFmpeg palettegen docs, but I'm having trouble getting it to work:
ffmpeg -i movie.mov -vf palettegen=max_colors=5,reserve_transparent=1 palette.gif
What am I doing wrong?
Options for a filter are separated by a colon, so
ffmpeg -i movie.mov -vf palettegen=max_colors=5:reserve_transparent=1 palette.gif
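Further options are chained the same way; a sketch adding stats_mode as well (the chosen value is just an example):
ffmpeg -i movie.mov -vf palettegen=max_colors=5:reserve_transparent=1:stats_mode=diff palette.gif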

FFMPEG - Filter volume has an unconnected output

I have the following FFMPEG command:
ffmpeg -i ./master_video.mp4 -i ./temp/temp1.mp4 -i ./temp/temp2.mp4 -y -filter_complex [0:v]setpts=PTS-STARTPTS[v0];[1:a]asetpts=PTS-STARTPTS,volume=0.1[aud1];[1:v]setpts=PTS-STARTPTS+5/TB,fade=t=in:st=5:d=1:alpha=1,fade=t=out:st=14:d=1:alpha=1[v1];[2:a]asetpts=PTS-STARTPTS,volume=0.1[aud2];[2:v]setpts=PTS-STARTPTS+10/TB,fade=t=in:st=10:d=1:alpha=1,fade=t=out:st=19:d=1:alpha=1[v2];[v0][v1]overlay=eof_action=pass[out1];[out1][v2]overlay=eof_action=pass[out2] -map [out2] -map [aud1][aud2] temp.mp4
But when I run it, I receive the following error:
error: ffmpeg exited with code 1: Filter volume has an unconnected output
Any ideas why that error is occurring?
If you wish to mix the audio outputs, it needs to be done within the filtergraph.
Use
ffmpeg -y -i ./master_video.mp4 -i ./temp/temp1.mp4 -i ./temp/temp2.mp4 -filter_complex
"[0:v]setpts=PTS-STARTPTS[v0];
[1:a]asetpts=PTS-STARTPTS,volume=0.1[aud1];
[1:v]setpts=PTS-STARTPTS+5/TB,fade=t=in:st=5:d=1:alpha=1,fade=t=out:st=14:d=1:alpha=1[v1];
[2:a]asetpts=PTS-STARTPTS,volume=0.1[aud2];
[2:v]setpts=PTS-STARTPTS+10/TB,fade=t=in:st=10:d=1:alpha=1,fade=t=out:st=19:d=1:alpha=1[v2];
[v0][v1]overlay=eof_action=pass[out1];
[out1][v2]overlay=eof_action=pass[vout];
[aud1][aud2]amix[aout]"
-map [vout] -map [aout] temp.mp4
Note that any audio from the master video is ignored, as it would have been if your original command had worked. Also, the audio and video from the temp videos are no longer synchronized since the setpts expressions are different.
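If you do want the master video's audio in the mix as well, a sketch of the only filtergraph lines that would change (assuming the master video actually has an audio stream):
[0:a]asetpts=PTS-STARTPTS[aud0];
...
[aud0][aud1][aud2]amix=inputs=3[aout]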

How to output GIF with same size as input video

I am following "How do I convert a video to GIF using ffmpeg, with reasonable quality?"
It gives this example:
ffmpeg -i input.flv -i palette.png -filter_complex "fps=10,scale=320:-1:flags=lanczos[x];[x][1:v]paletteuse" output.gif
However, I want the GIF output to be the same size as the video, not 320 as specified there, so I removed scale=320:-1, leaving:
ffmpeg -i input.flv -i palette.png -filter_complex "fps=10,flags=lanczos[x];[x][1:v]paletteuse" output.gif
When I run that I get:
No such filter: 'flags'
Error initializing complex filters.
If I remove:
-filter_complex "fps=10,flags=lanczos[x];[x][1:v]paletteuse"
Then it works, but the quality of the output is bad. So it seems that I must use scale for those palette flags to work. How can I get ffmpeg to output a GIF the same size as the input video?
Omit the scale filter. By default, the output uses the same width and height as the input; the :flags=lanczos part was an option of the scale filter. So your command will look like:
ffmpeg -i in.flv -i palette.png -filter_complex "fps=10[x];[x][1:v]paletteuse" out.gif
I have figured it out:
ffmpeg -i video.mkv -y -i palette.png -filter_complex "fps=10,scale=iw:ih:flags=lanczos[x];[x][1:v]paletteuse" output_mkv.gif
scale=iw:ih does the trick; the output is the same size as the input video.
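For reference, the palette can also be generated and applied in a single pass with split; a sketch, not from the answers above:
ffmpeg -i video.mkv -filter_complex "fps=10,split[a][b];[a]palettegen[p];[b][p]paletteuse" output.gif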

FFmpeg output to MacOSX screen?

How do I route ffmpeg output to the screen on MacOSX?
If I type:
ffmpeg -i input.mp4 -i logo.png -filter_complex overlay output.mp4
The output file contains the input video with the logo overlaid on top of it.
If I type:
ffplay -i input.mp4 -i logo.png -filter_complex overlay
Then it throws the error:
Argument 'logo.png' provided as input filename, but 'input.mp4' was already specified.
...but typing:
ffplay -filters
displays a list of filters including:
T.C overlay VV->V Overlay a video source on top of the input.
Clearly, I'm missing something obvious.
How do I route ffmpeg output to the screen, and where can I find a list of which filters and options work in ffmpeg but not in ffplay?
With lots of help from Moritz and Carl on the ffmpeg-user mailing list, I have an incantation that works in both ffmpeg and ffplay:
ffplay -i input.mp4 -vf movie=logo.png,[in]overlay
and:
ffmpeg -i input.mp4 -vf movie=logo.png,[in]overlay test.mp4
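Another way to preview exactly what ffmpeg produces is to pipe its output into ffplay; a sketch using the nut container over a pipe:
ffmpeg -i input.mp4 -i logo.png -filter_complex overlay -f nut - | ffplay -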
