FFmpeg | Option loop not found

The -loop option is not working with a GIF image.
When I use a PNG image the command works fine, but with an animated GIF the error Option loop not found is thrown.
In my example I'm trying to create a video of a specific duration from an input image.
ffmpeg -loop 1 -t 5 -i 15324210315b56e3a78abe5.png -i watermark.png -filter_complex "[0]scale=trunc(iw/2)*2:trunc(ih/2)*2[v];[v][1]overlay=x=(W-w-10):y=(H-h-10)" output.mp4
The command below is not working:
ffmpeg -loop 1 -t 5 -i 15323488345b55c9a2b2908.gif -i watermark.png -filter_complex "[0]scale=trunc(iw/2)*2:trunc(ih/2)*2[v];[v][1]overlay=x=(W-w-10):y=(H-h-10)" output.mp4

GIFs are handled by a separate demuxer, not the generic image sequence demuxer, so the image demuxer's -loop option does not apply. The gif demuxer has its own looping option, -ignore_loop. See the command below.
ffmpeg -ignore_loop 0 -t 5 -i 15323488345b55c9a2b2908.gif ...

A Python f-string that builds the same GIF-to-video ffmpeg command line:
f"/opt/ffmpeglib/ffmpeg -ignore_loop 0 -i {lambda_file_path} -c:v libx264 -t 10 -pix_fmt yuv420p {lambda_output_file_path}"

Related

I want to pipe these two ffmpeg commands to convert a video to grayscale frames

Please, I want to pipe these two commands.
ffmpeg -i input.flv -vf fps=1 out%d.png | ffmpeg -i input -vf format=gray output
If you just need frames, try this:
ffmpeg -i input.flv -r 1 -pix_fmt gray out%d.png
There is no need to call ffmpeg twice:
-r sets the output frame rate (1 frame/sec), dropping excess frames
-pix_fmt sets the output pixel format
[edit]
Try this to output both grayscale video and images:
ffmpeg -i input.flv \
-filter_complex "format=gray,split[v0][v1]" \
-map "[v0]" -r 1 out%d.png \
-map "[v1]" output.mp4

FFmpeg PNG overlayed on background image to video

I'm trying to overlay an image on a background image and make a video from it with a certain duration.
I found something in an old 2011 thread, but FFmpeg doesn't seem to recognize '-loop_input', so I guess it's an outdated option.
ffmpeg -loop_input -f image2 -i background.png -r 25 -vframes 250 -an -vcodec png test.mov
How do I make this work in the current ffmpeg version?
Use the -loop option for the image demuxer:
ffmpeg -loop 1 -i background.png -frames:v 250 -c:v png test.mov
But because you are going from PNG to PNG you can stream copy it:
ffmpeg -loop 1 -i background.png -frames:v 250 -c:v copy test2.mov
Default frame rate is 25, so I removed the -r 25. If you want to set frame rate with image inputs then use the image demuxer -framerate input option, such as ffmpeg -loop 1 -framerate 24 -i background.png ...
Your input has no audio, so I removed -an.
-f image2 is not needed: ffmpeg will automatically determine the proper demuxer.

animation between images using FFmpeg

Hi, I am new to FFmpeg.
I have made a video from a slideshow of sequential images (img001.jpg, img002.jpg, img003.jpg, ...) using the following command on Ubuntu 14.04:
ffmpeg -framerate 1/5 -i img%03d.jpg -c:v libx264 -r 30 -pix_fmt yuv420p -vf scale=320:240 out.mp4
But now I want to add animations like fade-in and fade-out between the sequential images when generating the video.
Can anybody help me with how to do this? I have searched a lot but could not find anything.
The best way to do this is to create an intermediate clip for each image and then concatenate them all into a video. For example, say you have 5 images; you would run this for each of the images to create the intermediate MPEGs with a fade-in at the beginning and a fade-out at the end.
ffmpeg -y -loop 1 -i image -vf "fade=t=in:st=0:d=0.5,fade=t=out:st=4.5:d=0.5" -c:v mpeg2video -t 5 -q:v 1 image-1.mpeg
where -t is the duration, or time, of each image. Once you have all of these MPEGs, you use ffmpeg's concat filter to combine them all into an mp4.
ffmpeg -y -i image-1.mpeg -i image-2.mpeg -i image-3.mpeg -i image-4.mpeg -i image-5.mpeg -filter_complex '[0:v][1:v][2:v][3:v][4:v] concat=n=5:v=1 [v]' -map '[v]' -c:v libx264 -s 1280x720 -aspect 16:9 -q:v 1 -pix_fmt yuv420p output.mp4
This gives you the desired video and is the simplest and highest quality solution with ffmpeg. Let me know if you have any questions about how the above command works.
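If you have more than a handful of images, the per-image step is easy to script. Below is a minimal Python sketch assuming the 5-image, 5-seconds-per-image setup from the example above; it simply generates the intermediate clips and then runs the concat step:

import subprocess

images = [f"img{i:03d}.jpg" for i in range(1, 6)]  # img001.jpg .. img005.jpg
duration = 5.0  # seconds per image, as in the example
fade = 0.5      # fade-in/fade-out length in seconds

# Step 1: one intermediate clip per image, fading in at the start
# and out at the end.
clips = []
for i, image in enumerate(images, start=1):
    clip = f"image-{i}.mpeg"
    subprocess.run([
        "ffmpeg", "-y", "-loop", "1", "-i", image,
        "-vf", f"fade=t=in:st=0:d={fade},fade=t=out:st={duration - fade}:d={fade}",
        "-c:v", "mpeg2video", "-t", str(duration), "-q:v", "1", clip,
    ], check=True)
    clips.append(clip)

# Step 2: concatenate the intermediate clips with the concat filter.
inputs = []
for clip in clips:
    inputs += ["-i", clip]
pads = "".join(f"[{i}:v]" for i in range(len(clips)))
subprocess.run([
    "ffmpeg", "-y", *inputs,
    "-filter_complex", f"{pads} concat=n={len(clips)}:v=1 [v]",
    "-map", "[v]", "-c:v", "libx264", "-pix_fmt", "yuv420p", "output.mp4",
], check=True)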

Ffmpeg video overlay

I am trying to create a video output from multiple video cameras.
Following the example given here Presenting more than 2 videos using FFmpeg
and other similar examples.
but I'm getting the error
Output pad "default" for the filter "src" of type "buffer" not connected to any destination
when I run
ffmpeg -i /dev/video1 -i /dev/video0 -filter_complex "[0:0]pad=iw*2:ih[a];[a][1:0]overlay=w[b];[b][2:0]overlay=w:h" -shortest output.mp4
I'm not really sure what this means or how to fix it.
Any help would be greatly appreciated!
Thanks.
When using the pad filter, you have to specify the size of the output frame and where you want to place the input frame:
[0:0]pad=iw*2:ih:0:0
Tested under Windows 7 with two files of the same size:
ffmpeg -i out.avi -i out.avi -filter_complex "[0:0]pad=iw*2:ih:0:0[a];[a][1:0]overlay=w" -shortest output.mp4
And with a webcam capture (vfwcap) and a still picture (as I have only one webcam). BTW, you can see how to scale one of the sources to fit the target (just in case your sources have different resolutions):
ffmpeg -y -f vfwcap -r 10 -i 0 -loop 1 -i photo.jpg -filter_complex "[0:0]pad=iw*2:ih:0:0[a];[1:0]scale=640:480[b];[a][b]overlay=w" -shortest output.mp4
Under Linux:
ffmpeg -i /dev/video1 -i /dev/video0 -filter_complex "[0:0]pad=iw*2:ih:0:0[a];[a][1:0]overlay=w" -shortest output.mp4
If it doesn't work, test a simple recording of video1 and then of video0, and check their properties (type, resolution, fps).
ffmpeg -i /dev/video1 -shortest output1.mp4
ffmpeg -i output1.mp4
If you still have issues, update your question with the ffmpeg console output (as text) for the video1 and video0 captures, and also for the call with the overlay.
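If you'd rather check those properties programmatically, here is a minimal sketch using ffprobe's JSON output (assuming ffprobe is installed alongside ffmpeg; the file name is just the example above):

import json
import subprocess

def stream_properties(path):
    """Print codec type, resolution, and frame rate for each stream."""
    result = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", path],
        capture_output=True, text=True, check=True,
    )
    for stream in json.loads(result.stdout)["streams"]:
        print(stream.get("codec_type"),
              stream.get("width"), stream.get("height"),
              stream.get("r_frame_rate"))

stream_properties("output1.mp4")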

Overlay animated images with transparency over a static background image using ffmpeg?

I'm looking to create a video using a set of png images that have transparency merged with a static background.
After doing a lot of digging, it seems like it's definitely possible by using the filters library.
My initial command for making the video, without including the background, is:
ffmpeg -y -qscale 1 -r 1 -b 9600 -loop -i bg.png -i frame_%d.png -s hd720 testvid.mp4
Using -vf I can apply the background as overlay:
ffmpeg -y -qscale 1 -r 1 -b 9600 -i frame_%d.png -vf "movie=bg.png [wm];[in][wm] overlay=0:0 [out]" -s hd720 testvid.mp4
However, the problem is it's overlaying the background over the input. According to libavfilter I can split the input and play with its content. I'm wondering if I can somehow change the overlay order?
Any help greatly appreciated!
UPDATE 1:
I'm trying to make the following filter work but I'm getting the movie without the background:
ffmpeg -y -qscale 1 -r 1 -b 9600 -i frame_%d.png -vf "movie=bg.png [bg]; [in] split [T1], fifo, [bg] overlay=0:0, [T2] overlay=0:0 [out]; [T1] fifo [T2]" -s hd720 testvid.mp4
UPDATE 2:
Got the video making using the -vf option. Just piped the input, split it, applied the image over it, and overlaid the two split feeds! Probably not the most efficient way... but it worked!
ffmpeg -y -r 1 -b 9600 -i frame_%d.png -vf "movie=bg.png, scale=1280:720:0:0 [bg]; [in] format=rgb32, split [T1], fifo, [bg] overlay=0:0, [T2] overlay=0:0 [out]; [T1] fifo [T2]" -s hd720 testvid.mp4
The overlay order is controlled by the order of the inputs; from the ffmpeg docs:
[...] takes two inputs and one output, the first input is the "main" video on which the second input is overlayed.
Your second command thus becomes:
ffmpeg -y -loop 1 -qscale 1 -r 1 -b 9600 -i frame_%d.png -vf "movie=bg.png [wm];[wm][in] overlay=0:0" -s hd720 testvid.mp4
With the latest versions of ffmpeg the new -filter_complex command makes the same process even simpler:
ffmpeg -loop 1 -i bg.png -i frame_%d.png -filter_complex overlay -shortest testvid.mp4
A complete working example:
The source of our transparent input images (apologies for dancing):
Exploded to frames with ImageMagick:
convert dancingbanana.gif -define png:color-type=6 over.png
(Setting png:color-type=6 (RGB-Matte) is crucial because ffmpeg doesn't handle indexed transparency correctly.) Inputs are named over-0.png, over-1.png, over-2.png, etc.
Our background image (scaled to banana):
Using ffmpeg version N-40511-g66337bf (a git build from yesterday), we do:
ffmpeg -loop 1 -i bg.png -r 5 -i over-%d.png -filter_complex overlay -shortest out.avi
-loop loops the background image input so that we don't just have one frame, crucial!
-r slows down the dancing banana a bit, optional.
-filter_complex is a very recently added ffmpeg feature making handling of multiple inputs easier.
-shortest ends encoding when the shortest input ends, which is necessary as looping the background means that that input will never end.
Using a slightly less cutting-edge build, ffmpeg version 0.10.2.git-d3d5e84:
ffmpeg -loop 1 -r 5 -i back.png -vf 'movie=over-%d.png [over], [in][over] overlay' -frames:v 8 out.avi
movie doesn't allow rate setting, so we slow down the background instead which gives the same effect. Because the overlaid movie isn't a proper input, we can't use -shortest and instead explicitly set the number of frames to output to how many overlaid input frames we have.
The final result (output as a gif for embedding):
For future reference, as of 17/02/2015 the command line is:
ffmpeg -loop 1 -i images/background.png -i images/video_overlay%04d.png -filter_complex overlay=shortest=1 testvid.mp4
Thanks to llogan, who took the time to reply here: https://trac.ffmpeg.org/ticket/4315#comment:1
