I'm trying to overlay a 15-second video transition on the beginning of an image sequence (a PNG sequence with an alpha channel to reveal the image below), which I can do fine with the overlay filter. But I want to hold the first frame of the image sequence for 5 seconds before playing the animation. I've tried trim/select, but I can't seem to get it to a duration of 5 seconds, and I also can't seem to concat it back with the other video to do the transition. So my questions are:
How do I get the first frame and hold it for 5 seconds? The method below works, but it doesn't seem like the cleanest option:
-framerate 30 -t 60.0 -i input1.%04d.jpg -framerate 30 -t 15.0 -i transition1_%03d.png -filter_complex "color=c=red:d=5:s=480x270:r=30[bg]; [bg][1:v]overlay[transhold]; [0:v][transhold]overlay=repeatlast=0[out]"
How can I concat that with the original before I overlay it on the main video? I can do it with two overlays, with the start of the actual transition offset by the length of the hold, using the command below, but it seems a bit clunky.
-framerate 30 -t 60.0 -i input1.%04d.jpg -framerate 30 -t 15.0 -i transition1_%03d.png -filter_complex "color=c=red:d=5:s=480x270:r=30[bg]; [1:v]split[trans][transhold]; [trans]setpts=PTS+5/TB[trans];[transhold]select=eq(n\,0)[transhold];[bg][transhold]overlay[transhold]; [0:v][transhold]overlay=repeatlast=0[tmp1]; [tmp1][trans]overlay[out]"
This is all part of a larger command where I'm compiling four HD images into a 4K feed, each with its own transition, so the cleaner I can be the better. I'd also like to be able to vary the duration of the hold for the different HD inputs. If I need to, I could bring in the first image as a separate input, but I would still need to concat them. I thought there must be a way to do this with filters...
This was answered in another post:
https://video.stackexchange.com/questions/23551/ffmpeg-extract-first-frame-and-hold-for-5-seconds
-framerate 30 -t 60.0 -i input1.%04d.jpg
-framerate 30 -t 15.0 -i transition1_%03d.png
-filter_complex
"[1]loop=149:1:0[trans];
[0][trans]overlay=eof_action=pass" out.mp4
The first frame of the second input is repeated 149 times, so that there are 150 instances (30 fps x 5 s). The 0 at the end of loop is the starting index of the frame(s) to loop. The middle 1 is the number of frames to loop, starting at the index given in the third argument.
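Since the hold length is just the loop count (fps x hold seconds, minus 1), varying it per input only means changing that one number. A minimal sketch, assuming 30 fps throughout, for a hypothetical 3-second hold (90 frames, so loop=89):
-framerate 30 -t 60.0 -i input1.%04d.jpg
-framerate 30 -t 15.0 -i transition1_%03d.png
-filter_complex
"[1]loop=89:1:0[trans];
[0][trans]overlay=eof_action=pass" out.mp4
In the larger 4K command, each HD input would get its own loop=N:1:0 chain, with N chosen per transition.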
Related
I need to create a video from a list of images, in such a way that the final video is 24 fps, but each image stays on screen for 3 frames before the next image is shown (and I don't want to change the fps; I really need 3 identical frames).
For now I'm using:
ffmpeg -framerate 24 -pattern_type glob -i "build/*.jpg" "$#"
but each image stays for only one frame.
You need to combine the input and output frame rates to achieve this:
ffmpeg -framerate 8 -pattern_type glob -i "build/*.jpg" -r 24 "$#"
The input -framerate 8 shows each image for 1/8 = 0.125 seconds, and the output -r 24 sets the output frame rate to 24 fps, so each input frame is used for 24/8 = 3 output frames.
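The same pattern generalizes: set the input -framerate to the output fps divided by the number of frames each image should occupy. For example, a hypothetical variant where each image should last 2 frames at 24 fps (output.mp4 here just stands in for the "$#" arguments above):
ffmpeg -framerate 12 -pattern_type glob -i "build/*.jpg" -r 24 output.mp4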
I am creating a dashcam library where video files are written constantly to 2 buffers. When an event happens, the most recent buffer is returned. Everything works fine, except that when I try to customize the FPS I see inconsistent behavior.
This is the ffmpeg command I use:
ffmpeg -y -i rtsp://admin:admin@192.168.1.200 -f segment -segment_time 3 -segment_wrap 2 out_%d.mp4
This works as expected and constantly spits out 2 three-second files, out_0.mp4 and out_1.mp4. The default FPS of the streaming device is 100. When I add the fps parameter like so:
ffmpeg -y -i rtsp://admin:admin@192.168.1.200 -f segment -segment_time 3 -segment_wrap 2 -r 60 out_%d.mp4
I see that one or both of the files are 4 s long and all the frames are the same. When I drop the FPS to 30, the files are at least 8 s long.
What am I doing wrong? How can I ensure that the dumped video files are valid and as long as specified by -segment_time?
The segment muxer, by default, only splits at keyframes. The default keyframe interval is around 250 frames.
Add -g X where X is segment_time * fps to set an appropriate interval.
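For the 60 fps case above, that works out to -g 180 (3 s x 60 fps). A sketch of the full command (untested, with the same placeholder credentials as above):
ffmpeg -y -i rtsp://admin:admin@192.168.1.200 -r 60 -g 180 -f segment -segment_time 3 -segment_wrap 2 out_%d.mp4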
I am trying to extract the first frame of every second from a video. I have tried different ways to achieve this, but I failed. Here are the commands I tried.
ffmpeg -i input.ts -s 400x222 -q:v 3 -start_number 0 -vf fps=1 %d.jpg
Later, I try to extract the frames of a particular second again using the command below. Here I'm extracting the frames of the 210th second.
ffmpeg -ss 210 -i input.ts -s 300x250 -t 1 -start_number 0 images.%d.jpg
I want to extract only the starting frame of the second. Let's say from 0.001 of that particular second.
When I compare the frame of the 210th second extracted by my first command with the first frame extracted by the second command, they are completely different.
For later use, to prevent conflicts, I want to extract only the very first frame of each second of the original input video. I tried the command that Stack Overflow experts suggested in the past here, but when I run it, it only extracts the starting frame (only 1 frame).
How can I extract the very first frame of every second of the video?
Try this command:
ffmpeg -i input.mp4 -vf "select=between(mod(n\, 25)\, 0\, 0), setpts=N/25/TB" output-%04d.png
You have to pass your video's frame rate (fps) into between(mod(n\, 25)\, 0\, 0); my video's fps is 25, so I pass 25.
If your fps is 60, then you have to pass 60: between(mod(n\, 60)\, 0\, 0).
Also, if you want the first 5 frames of every second, use between(mod(n\, 25)\, 0\, 4); it will give you the first 5 frames of every second.
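Putting that together for a hypothetical 60 fps source, the full command might look like this (the 60 in setpts just keeps the timestamps consistent with the assumed source rate):
ffmpeg -i input.mp4 -vf "select=between(mod(n\, 60)\, 0\, 0), setpts=N/60/TB" output-%04d.png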
I wish to extract 10 consecutive frames every 2 seconds.
(this is because I wish to choose "best one" from the "nearby offset").
I know how to extract a frame each x seconds:
ffmpeg -i /tmp/V.MP4 -vf fps=1 %02d.jpg
I know how to extract 10 frames from some starting offset:
ffmpeg -ss 20.0 -i /tmp/V.MP4 -vframes 10 %02d.jpg
I have 2 issues:
How do I find the offset of each output image? I could try to calculate it (using the video fps, which is 29.97 in my case), but that sounds like a bad idea - the data is right there in the video for ffmpeg to grab...
Is there an efficient way to merge the two commands into one, thereby getting 10 consecutive frames every x seconds?
Use
ffmpeg -i source -vf select='eq(n,0)+if(mod(trunc(t)+1,2)*(trunc(t)-trunc(prev_t)),st(1,n),lt(n,ld(1)+10))' -vsync 0 -frame_pts 1 %d.jpg
How do I find the offset for each output image?
See what the frame_pts values mean at "ffmpeg output images filename with time position".
this is because I wish to choose "best one" from the "nearby offset"
The thumbnail filter can sort of do this.
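For reference, a minimal thumbnail-filter sketch (untested): the filter looks at frames in batches and keeps the most representative one from each batch, so assuming roughly 30 fps, a batch size of 60 yields about one picked frame every 2 seconds:
ffmpeg -i /tmp/V.MP4 -vf thumbnail=60 -vsync vfr thumb%03d.jpg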
I have 60 images, 000001.jpeg to 000060.jpeg. This is my command:
"-f image2 -i E:\\REC\\Temp\\%06d.jpeg -r 12 E:\\REC\\Video\\" + label1.Text + ".wmv"
The output is about 3 seconds long, but I expect to get one minute. How do I set the duration of each frame so that one image = one second, and 60 images = one minute?
Default frame rate for inputs is 25, so in your example ffmpeg is dropping frames to go from 25 to 12 fps (the console output will confirm this).
You can declare just an input frame rate and the output will inherit this same frame rate:
ffmpeg -framerate 1 -i %06d.jpeg output.wmv
Or both an input and an output frame rate:
ffmpeg -framerate 1 -i %06d.jpeg -r 25 output.wmv
This second example is recommended if Windows Media Player has issues with 1 fps content. You will have to experiment to see which example works best for you.
See the image2 demuxer documentation for more information.
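If you want to verify the result, ffprobe can print the container duration; with 60 input images and -framerate 1, either variant should report roughly 60 seconds (sketch, using output.wmv from the examples above):
ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1 output.wmv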