This is for you FFMPEG gurus!
I have a video that I take a screenshot image of. This works fine:
ffmpeg -i sourceMovie.mp4 -ss 0 -vframes 1 destImage.jpg
But I was hoping to also scale down the image to 150 px wide, in one fell swoop. Apparently, I should add
scale=150:-1
But where and how do I insert that in the command?
I have tried everything; nothing works ...
scale is a VideoFilter, so you use "-vf":
ffmpeg -i sourceMovie.mp4 -ss 0 -vframes 1 -vf "scale=150:-1" destImage.jpg
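As a side note, if you ever grab the frame from deeper into the file, putting -ss before -i makes ffmpeg seek to that point instead of decoding everything up to it, which is much faster on long videos. A sketch with an arbitrary 10-second offset:
ffmpeg -ss 10 -i sourceMovie.mp4 -vframes 1 -vf "scale=150:-1" destImage.jpg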
Good morning, I hope you are well.
Please, I need help with the following...
I'm moving an image vertically, and I see that the background of the video is green.
How can I change this background color?
Example code:
ffmpeg -loop 1 -t 24 -i "image.jpg" -filter_complex "nullsrc=size=640x360[background];[background][0:v]overlay=shortest=1:y='min(0,-(t)*26)'" -qscale 1 -y out.mpg
Video Result: https://youtu.be/98rKLVO56wA
I hope you can help me,
Thank you very much,
Greetings,
Hugo
Use
ffmpeg -loop 1 -t 24 -i "image.jpg" -filter_complex "color=000000:s=640x360[bg];[bg][0]overlay=shortest=1:y='min(0,-(t)*26)'" -qscale 1 -y out.mpg
The color filter takes RGB values as hex RRGGBB
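The color source also accepts named colors, so for, say, a white background, something like this should work:
ffmpeg -loop 1 -t 24 -i "image.jpg" -filter_complex "color=white:s=640x360[bg];[bg][0]overlay=shortest=1:y='min(0,-(t)*26)'" -qscale 1 -y out.mpg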
I want to generate an image from a video, but first I want to scale it to a certain width/height and then crop it to a set square size. The problem is that my new version of ffmpeg doesn't seem to work with scaling first.
ffmpeg version 2.8.6-1ubuntu2
fails:
ffmpeg -y -i input.mp4 -an -ss 5 -s 150x150 -vf scale=-1:150,crop=150:150 -vframes 1 output-small.jpg
Invalid too big or non positive size for width '150' or height '150'
works:
ffmpeg -y -i input.mp4 -an -ss 5 -s 150x150 -vf crop=150:150,scale=-1:150 -vframes 1 output-small.jpg
However, I cannot settle for the second command because I am generating images that could be larger than the original size (I'm creating a few different sizes for each image), therefore scale MUST come first. Does anybody have any idea what changed or what I am doing wrong here?
This may be happening because your video is portrait, and so the scaled image has width smaller than 150px. Hence the crop fails.
Also, you should skip the -s option, otherwise you're triggering two scaler executions.
Try
ffmpeg -y -i input.mp4 -ss 5 -vf scale='if(gt(iw,ih),-1,150)':'if(gt(iw,ih),150,-1)',crop=150:150 -vframes 1 output-small.jpg
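If your build is new enough to have the force_original_aspect_ratio option on scale, a simpler equivalent (untested sketch) is:
ffmpeg -y -i input.mp4 -ss 5 -vf "scale=150:150:force_original_aspect_ratio=increase,crop=150:150" -vframes 1 output-small.jpg
This scales the shorter side to 150 and lets crop trim the longer one.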
I need to add a watermark for the first 3 seconds of the video using ffmpeg. Here's what I've got right now:
ffmpeg -y -i '255871.mov' -qscale:v 0 -qscale:a 0 -vf '[in] transpose=1 [out];movie=watermark.png , select=lte(t\,3) [bg]; [out][bg] overlay=x=20:y=main_h-60 [out]' output.mp4
It rotates the video to the right and adds a watermark at the bottom of the video for the first 3 seconds. The problem is the watermark is visible during the whole video.
I thought that select didn't work at all, so I tried the following command:
ffmpeg -y -i '255871.mov' -qscale:v 0 -qscale:a 0 -vf '[in] transpose=1 [out];movie=watermark.png , select=0 [bg]; [out][bg] overlay=x=20:y=main_h-60 [out]' output.mp4
The watermark is not visible. This is correct and proves that the select filter works as expected. As I understand it, this is how ffmpeg works: it leaves the last frame of the shortest video visible.
How can I force ffmpeg to stop showing the watermark after N seconds?
I have to answer it myself. The ffmpeg mailing list helped me solve the issue.
The main idea is to convert the existing watermark into a video using the Apple Animation codec (it supports transparency) and fade out the last frame of the created video using the fade filter.
Example:
ffmpeg -loop 1 -i watermark.png -t 3 -c qtrle -vf 'fade=out:73:1:alpha=1' watermark.mov
ffmpeg -y -i '255871.mov' -qscale:v 0 -qscale:a 0 -vf '[in] transpose=1 [out];movie=watermark.mov [bg]; [out][bg] overlay=x=20:y=main_h-60 [out]' output.mp4
The fade-out is required because ffmpeg uses the last frame of the overlaid video for the rest of the video. This filter makes that last frame fully transparent via the alpha=1 parameter. In fact it should be fade=out:74:1:alpha=1, but that didn't work for me; I don't know why.
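For what it's worth, newer ffmpeg builds support timeline editing on overlay, which avoids the intermediate .mov entirely. A sketch, assuming your build has the enable option:
ffmpeg -y -i '255871.mov' -i watermark.png -filter_complex "[0:v]transpose=1[v];[v][1:v]overlay=x=20:y=main_h-60:enable='lte(t,3)'" output.mp4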
I'm looking to create a video using a set of PNG images that have transparency, merged with a static background.
After doing a lot of digging, it seems like it's definitely possible using the filters library.
My initial video-making command, without including the background, is:
ffmpeg -y -qscale 1 -r 1 -b 9600 -i frame_%d.png -s hd720 testvid.mp4
Using -vf I can apply the background as overlay:
ffmpeg -y -qscale 1 -r 1 -b 9600 -i frame_%d.png -vf "movie=bg.png [wm];[in][wm] overlay=0:0 [out]" -s hd720 testvid.mp4
However, the problem is it's overlaying the background over the input. According to libavfilter I can split the input and play with its content. I'm wondering if I can somehow change the overlay order?
Any help greatly appreciated!
UPDATE 1:
I'm trying to make the following filter work but I'm getting the movie without the background:
ffmpeg -y -qscale 1 -r 1 -b 9600 -i frame_%d.png -vf "movie=bg.png [bg]; [in] split [T1], fifo, [bg] overlay=0:0, [T2] overlay=0:0 [out]; [T1] fifo [T2]" -s hd720 testvid.mp4
UPDATE 2:
Got the video working using the -vf option. I just piped the input, split it, applied the image over it, and overlaid the two split feeds! Probably not the most efficient way... but it worked!
ffmpeg -y -r 1 -b 9600 -i frame_%d.png -vf "movie=bg.png, scale=1280:720:0:0 [bg]; [in] format=rgb32, split [T1], fifo, [bg] overlay=0:0, [T2] overlay=0:0 [out]; [T1] fifo [T2]" -s hd720 testvid.mp4
The overlay order is controlled by the order of the inputs; from the ffmpeg docs:
[...] takes two inputs and one output, the first input is the "main" video on which the second input is overlayed.
Your second command thus becomes:
ffmpeg -y -loop 1 -qscale 1 -r 1 -b 9600 -i frame_%d.png -vf "movie=bg.png [wm];[wm][in] overlay=0:0" -s hd720 testvid.mp4
With the latest versions of ffmpeg the new -filter_complex option makes the same process even simpler:
ffmpeg -loop 1 -i bg.png -i frame_%d.png -filter_complex overlay -shortest testvid.mp4
A complete working example:
The source of our transparent input images (apologies for dancing):
Exploded to frames with ImageMagick:
convert dancingbanana.gif -define png:color-type=6 over.png
(Setting png:color-type=6 (RGB-Matte) is crucial because ffmpeg doesn't handle indexed transparency correctly.) Inputs are named over-0.png, over-1.png, over-2.png, etc.
Our background image (scaled to banana):
Using ffmpeg version N-40511-g66337bf (a git build from yesterday), we do:
ffmpeg -loop 1 -i bg.png -r 5 -i over-%d.png -filter_complex overlay -shortest out.avi
-loop loops the background image input so that we don't just have one frame, crucial!
-r slows down the dancing banana a bit, optional.
-filter_complex is a very recently added ffmpeg feature making handling of multiple inputs easier.
-shortest ends encoding when the shortest input ends, which is necessary as looping the background means that that input will never end.
Using a slightly less cutting-edge build, ffmpeg version 0.10.2.git-d3d5e84:
ffmpeg -loop 1 -r 5 -i back.png -vf 'movie=over-%d.png [over], [in][over] overlay' -frames:v 8 out.avi
movie doesn't allow rate setting, so we slow down the background instead which gives the same effect. Because the overlaid movie isn't a proper input, we can't use -shortest and instead explicitly set the number of frames to output to how many overlaid input frames we have.
The final result (output as a gif for embedding):
For future reference, as of 17/02/2015 the command line is:
ffmpeg -loop 1 -i images/background.png -i images/video_overlay%04d.png -filter_complex overlay=shortest=1 testvid.mp4
Thanks to llogan, who took the time to reply here: https://trac.ffmpeg.org/ticket/4315#comment:1
Does anyone know the trick?
And how do I install ffmpeg? yum install mpeg only returns this:
======================================================================================== Matched: mpeg ========================================================================================
libiec61883.i386 : Streaming library for IEEE1394
libiec61883.x86_64 : Streaming library for IEEE1394
qffmpeg-devel.i386 : Development package for qffmpeg
qffmpeg-devel.x86_64 : Development package for qffmpeg
qffmpeg-libs.i386 : Libraries for qffmpeg
qffmpeg-libs.x86_64 : Libraries for qffmpeg
I've cobbled together this command line from various answers, and it works great for me to get the absolute first frame out of a video. I use it to save a thumbnail screenshot for the video.
ffmpeg -i inputfile.mkv -vf "select=eq(n\,0)" -q:v 3 output_image.jpg
Explanation:
The select filter, -vf "select=eq(n\,0)", selects only frame #0.
-q:v allows you to set the quality of the output JPEG, between 1 and 31. The lower the number, the higher the quality. 2-5 works well; I use 3.
Note: This will get you an image with the same size as the video. To get a thumbnail, you can use the scale filter to get a thumbnail to fit whatever width you need, like so:
ffmpeg -i inputfile.mkv -vf "select=eq(n\,0),scale=320:-2" -q:v 3 output_image.jpg
The above command will give you a thumbnail JPEG scaled to a width of 320, with the height calculated to match the aspect ratio. (Note that the two filters go in a single -vf chain; a second -vf option would override the first.)
It's on the manpage:
* You can extract images from a video, or create a video from many
images:
For extracting images from a video:
ffmpeg -i foo.avi -r 1 -s WxH -f image2 foo-%03d.jpeg
This will extract one video frame per second from the video and will
output them in files named foo-001.jpeg, foo-002.jpeg, etc. Images
will be rescaled to fit the new WxH values.
If you want to extract just a limited number of frames, you can use
the above command in combination with the -vframes or -t option, or in
combination with -ss to start extracting from a certain point in time.
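For example (not from the manpage, just a sketch combining the options it mentions), this grabs five frames starting ten seconds in:
ffmpeg -ss 10 -i foo.avi -vframes 5 -f image2 foo-%03d.jpeg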
But of course you have to install it first. I'm on Debian and don't use yum.
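If you are on a yum-based distro, ffmpeg typically isn't in the stock repositories; it usually comes from a third-party repo such as RPM Fusion, after which a plain install should work (assuming your release is covered):
yum install ffmpeg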
[update for the other question]
i=1
for avi in *.avi; do
  ffmpeg -i "$avi" -vframes 1 -f image2 /tmp/$i.jpg; i=$((i+1))
done
Tested and works.
[update for yet another question...]
for flv in *.flv; do
  ffmpeg -i "$flv" -vframes 1 -f image2 "${flv%%.flv}.jpg"
done
An easy-to-grok solution that works for me is:
ffmpeg -i <input> -vframes 1 <output>.jpeg
Note that I do get an error "[swscaler @ 0x111652000] deprecated pixel format used, make sure you did set range correctly", but according to a little reading (see for example https://stackoverflow.com/a/43038480/1241736) that can safely be ignored.
It works for me:
ffmpeg -i sample-mp4-file.mp4 -ss 1 -vframes 1 output.jpg