How to mix images and movies with ffmpeg

I have a bunch of H.264-encoded MP4 files (of about 10-15 seconds each) and I want to mix them with another bunch of JPEGs (each of which should be displayed for x seconds).
So I've set up the concat.txt file:
file slide_1.jpg
duration 3
file movie_1.mp4
file slide_2.jpg
duration 5
file movie_2.mp4
and I am trying to run
yes | scripts/ffmpeg -f concat -i concat.txt -vcodec copy -c:a copy final.mp4
which generates a movie that is over 6 hours long (6:48:34) and in which I can only see the 1st picture.
How do I fix this?

As LordNeckbeard said, the slides should first be converted to movies.
So in my case I convert each slide to a movie like this (slide 1 becomes a 3-second clip):
yes | scripts/ffmpeg -loop 1 -r 25 -i slide_1.jpg -t 00:00:03 -vcodec libx264 -pix_fmt yuv420p -an slide_1.mp4
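If there are many slides, a small shell loop can apply the same conversion to each one. A sketch, assuming the slides are named slide_1.jpg, slide_2.jpg, ... and should all become 3-second clips (adjust -t per slide if the durations differ):
# convert every slide_N.jpg into a 3-second slide_N.mp4
for f in slide_*.jpg; do
  ffmpeg -y -loop 1 -r 25 -i "$f" -t 00:00:03 -vcodec libx264 -pix_fmt yuv420p -an "${f%.jpg}.mp4"
done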
Then the concat file looks like this:
file slide_1.mp4
file movie_1.mp4
file slide_2.mp4
file movie_2.mp4
and the concatenation command is:
yes | scripts/ffmpeg -f concat -i concat.txt -vcodec copy -c:a copy final.mp4
Note that all the movie pieces must have the same width and height, otherwise the stream-copy concat won't join them cleanly.
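If a slide's dimensions don't match the movie clips, it can be scaled and padded while it is converted. A sketch assuming a hypothetical target size of 1280x720 (use your movies' actual resolution):
# scale to fit inside 1280x720, then pad with black to exactly 1280x720
ffmpeg -y -loop 1 -r 25 -i slide_1.jpg -t 00:00:03 \
  -vf "scale=1280:720:force_original_aspect_ratio=decrease,pad=1280:720:(ow-iw)/2:(oh-ih)/2" \
  -vcodec libx264 -pix_fmt yuv420p -an slide_1.mp4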

Related

FFmpeg remove 2 sec from middle of video and concat the parts. Single line solution

I have a video file that is 22 seconds long.
I want to remove the segment from 10 seconds to 12 seconds.
Then return a concatenated video file of seconds 1-10 and 12-22.
I want to do this in a single FFmpeg command.
This is the easy way (source: https://www.labnol.org/internet/useful-ffmpeg-commands/28490/):
ffmpeg -i input.mp4 -ss 00:00:00.0 -codec copy -t 10 output_1.mp4
and
ffmpeg -i input.mp4 -ss 00:00:12.0 -codec copy -t 10 output_2.mp4
then create an input file with all the source file names and run
ffmpeg -f concat -i file-list.txt -c copy output.mp4
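where file-list.txt just lists the two trimmed pieces in concat demuxer syntax (names matching the commands above):
file 'output_1.mp4'
file 'output_2.mp4'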
But I'm looking for a one line solution
Any help would be appreciated.
For exact trimming, you'll have to re-encode.
Use
ffmpeg -i input.mp4 -vf select='not(between(t,10,12))',setpts=N/FRAME_RATE/TB -af aselect='not(between(t,10,12))',asetpts=N/SR/TB out.mp4
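Here select/aselect drop the video frames and audio samples whose timestamps fall between 10 and 12 seconds, and setpts/asetpts regenerate the timestamps so no gap is left. To confirm the result is about 20 seconds long, the duration can be checked with ffprobe (the same command used later on this page):
ffprobe out.mp4 -show_entries format=duration -v 0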

FFMPEG - concat demuxer with duration filter issue

I am trying to generate a video from images using the ffmpeg concat demuxer. I am creating a text file with the image file paths. Since images can have different durations, I am using the duration directive to specify the duration of each image. A sample text file looks like this:
file 1.jpg
duration 3
file 2.jpg
duration 3
file 3.jpg
duration 5
1.jpg and 2.jpg are both displayed for the specified 3 seconds each, but 3.jpg only shows for about 2 seconds.
FFMPEG command:
ffmpeg -f concat -i D:/textfile.txt -y -r 10 -crf 22 -threads 2 -preset veryfast D:/video.mp4
Use
ffmpeg -f concat -i textfile -y -vf fps=10 -crf 22 -threads 2 -preset veryfast video.mp4
where textfile is (the last file is listed a second time so that its duration directive is honored):
file 1.jpg
duration 3
file 2.jpg
duration 3
file 3.jpg
duration 5
file 3.jpg
My usage
I'm facing a similar issue, with different ffmpeg command options, while concatenating a list of JPG images into an MP4 video.
FFmpeg version:
ffmpeg version 5.0.1-full_build-www.gyan.dev Copyright (c) 2000-2022 the FFmpeg developers
built with gcc 11.2.0 (Rev7, Built by MSYS2 project)
Command:
ffmpeg -f concat -i "input.txt" -c:v libx264 -r 30 -pix_fmt yuv420p "output.mp4"
input.txt:
file '1.jpg'
duration 5
file '2.jpg'
duration 5
Reference: https://shotstack.io/learn/use-ffmpeg-to-convert-images-to-video/
The problem I'm facing
The resulting video only shows the 1st image for 5 seconds, but at the end of the video I can see roughly one frame of the 2nd image.
From what I have tested so far, the issue only happens when I have 1 or 2 images. For 3 or more images, I get the expected result. For 3 images and duration 5 applied to all 3 files, the output duration is 14.967000 seconds (close to the expected duration of 15 seconds).
Findings
I can see from the FFmpeg defect ticket below that this is a known bug in the concat demuxer.
https://trac.ffmpeg.org/ticket/6128
Resolution
For anyone running into this kind of weird issue, whether it is a misuse or a bug, all we can do until it is fixed in a future release is work around it.
As Gyan commented below the question, you need to add one more file entry to the end of the input text file.
I tried that, but I was not able to get the expected duration, just as the OP replied to Gyan.
So instead, I make the input file like this and convert the images to videos one at a time:
file '1.jpg'
duration 5
file '1.jpg'
I get an output duration of 5.034000 seconds.
Then I repeat the same process for the 2nd image and concatenate the two videos with another ffmpeg command.
ffmpeg -safe 0 -f concat -i "concat_input.txt" -c copy "output.mp4"
concat_input.txt:
file '1.mp4'
file '2.mp4'
The output duration is 10.068000 seconds, very close to what I expect.
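For reference, the whole workaround can be scripted as a small shell loop. A sketch; the image names 1.jpg and 2.jpg, the 5-second duration, and the intermediate file names are just illustrative:
: > concat_input.txt                  # start with an empty concat list
for i in 1 2; do
  # per-image list: the image is repeated so its duration directive is honored
  printf "file '%s.jpg'\nduration 5\nfile '%s.jpg'\n" "$i" "$i" > "img_$i.txt"
  ffmpeg -y -f concat -i "img_$i.txt" -c:v libx264 -r 30 -pix_fmt yuv420p "$i.mp4"
  echo "file '$i.mp4'" >> concat_input.txt
done
ffmpeg -y -safe 0 -f concat -i concat_input.txt -c copy output.mp4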
Other info
The command to check video duration is:
ffprobe output.mp4 -show_entries format=duration -v 0

create video from images but repeat the first and the last frame 100 times

I have images in a specific order in a directory.
The order of the images is as follows:
frame2_0000.jpeg
frame2_0001.jpeg
frame2_0002.jpeg
frame2_0003.jpeg
..., etc.
I generate the video with the following command
"ffmpeg -y -r 23 -i location_of_image_folder/frame2_%04d.jpeg -c:v libx264 -s 1280*1024 -movflags faststart location_of_output_location.mp4"
Now I want to create a video such that the first frame is repeated 100 times to create the video and the last frame is repeated 100 times.
What strategy should I employ here?
Create a text file list containing all your image file names, but duplicating the first and last 100 times:
1.jpeg
1.jpeg
1.jpeg
......
2.jpeg
3.jpeg
....
Then:
cat $(cat list.txt) | ffmpeg -y -f image2pipe -framerate 25/1 -i - -c:v libx264 -pix_fmt yuv420p out.mp4
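A sketch for generating such a list, assuming the images match frame2_*.jpeg and the glob sorts them in the right order; the first and last names are written an extra 100 times each:
files=(location_of_image_folder/frame2_*.jpeg)
first="${files[0]}"
last="${files[${#files[@]}-1]}"
{
  for _ in $(seq 100); do echo "$first"; done   # repeat the first frame
  printf '%s\n' "${files[@]}"                   # all frames in order
  for _ in $(seq 100); do echo "$last"; done    # repeat the last frame
} > list.txt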
Try
ffmpeg -framerate 23 -loop 1 -i location_of_image_folder/frame2_0000.jpeg \
  -framerate 23 -start_number 1 -i location_of_image_folder/frame2_%04d.jpeg \
  -framerate 23 -loop 1 -i location_of_image_folder/frame2_1000.jpeg \
  -filter_complex "[0]trim=start_frame=0:end_frame=100[pre];[2]trim=start_frame=0:end_frame=99[post];[pre][1][post]concat=n=3,scale=1280:1024,setsar=1" \
  -c:v libx264 -movflags faststart location_of_output_location.mp4
I've assumed the first image is frame2_0000.jpeg and the last is frame2_1000.jpeg; change these accordingly. You'll also have to set start_number to the number of the file that comes right after the first image file.
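If you don't know the number of the last frame offhand, a quick way to find the file (assuming the zero-padded names sort correctly) is:
ls location_of_image_folder/frame2_*.jpeg | tail -n 1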

FFMPEG (windows7) can't get the output video to show more than 3 out of 10 jpgs

I have 10 jpg files (image0.jpg, image1.jpg, image2.jpg, ..., image9.jpg) and one .mp3, and I'm trying to create a video, but I can't get it to show more than the first 3 images in the output.
I played with the output -r option; for example, if I change it to 30, all of the images show up, but they go by so fast that the whole video plays for under a second.
This is my code:
ffmpeg -i image%d.jpg -i audio.mp3 -r 1 -c:v libx264 -tune stillimage -c:a aac -strict experimental -b:a 192k -r 1/5 -pix_fmt yuv420p -shortest out.mp4
What am I doing wrong?
The image file demuxer by default uses a frame rate of 25 fps if you do not tell it otherwise. Since you used -r 1/5 as an output option the frame rate will be converted resulting in duplicated or, as in your case, dropped frames to compensate. To change this use -framerate as an input option (this is a private option of the image file demuxer):
ffmpeg -framerate 1/5 -i image%d.jpg output
Some crappy players may not like a "non-standard" frame rate, so you can add an output frame rate to change it while keeping the "timing" of the input:
ffmpeg -framerate 1/5 -i image%d.jpg -r 25 output
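Applied to the question's inputs, a sketch could look like this (it keeps most of the options from the original command; -shortest stops the output when the shorter of the two streams ends):
ffmpeg -framerate 1/5 -i image%d.jpg -i audio.mp3 \
  -c:v libx264 -tune stillimage -r 25 -pix_fmt yuv420p \
  -c:a aac -b:a 192k -shortest out.mp4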

How to create a video from a series of images with varying image durations?

I'd like to programmatically create a video file that is composed of a series of images. However, I'd also like to be able to specify a duration for each image. I often see ffmpeg examples suggested for similar tasks, but they always assume the same duration for each image. Is there an efficient way to accomplish this? (An inefficient solution might be setting the frame rate to something high and repeatedly copying each image until it matches the intended duration)
I will be dynamically generating each of the images as well, so if there is way to encode the image data into video frames without writing each image to disk, that's even better. This, however, is not a requirement.
Edit: To be clear, I don't necessarily need to use ffmpeg. Other free command-line tools are fine, as are video-processing libraries. I'm just looking for a good solution.
I was able to solve the exact same problem with the following commands.
vframes is set to the number of seconds multiplied by the fps.
In the example, the first video has 100 frames (100 frames / 25 fps = 4 seconds) and the second one has 200 frames (8 seconds).
ffmpeg -f image2 -loop 1 -r 25 -i a.jpg -vframes 100 -vcodec mpeg4 a.avi
ffmpeg -f image2 -loop 1 -r 25 -i b.jpg -vframes 200 -vcodec mpeg4 b.avi
mencoder -ovc copy -o out.avi a.avi b.avi
The mencoder step is just like the one in d33pika's answer.
You can use the concat demuxer to manually order images and to provide a specific duration for each image.
ffmpeg -f concat -i input.txt -vsync vfr -pix_fmt yuv420p output.mp4
Your input.txt should look like this.
file '/path/to/dog.png'
duration 5
file '/path/to/cat.png'
duration 1
file '/path/to/rat.png'
duration 3
file '/path/to/tapeworm.png'
duration 2
file '/path/to/tapeworm.png'
You can write this txt file dynamically according to your needs and execute the command.
For more info refer to https://trac.ffmpeg.org/wiki/Slideshow
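A sketch of writing input.txt dynamically from image/duration pairs (the paths and durations are the hypothetical ones from above; the last file is repeated so its final duration directive is honored, and -safe 0 is added here because the list uses absolute paths):
{
  last=""
  while read -r img dur; do
    printf "file '%s'\nduration %s\n" "$img" "$dur"
    last="$img"
  done <<'EOF'
/path/to/dog.png 5
/path/to/cat.png 1
/path/to/rat.png 3
/path/to/tapeworm.png 2
EOF
  printf "file '%s'\n" "$last"   # repeat the last entry
} > input.txt
ffmpeg -f concat -safe 0 -i input.txt -vsync vfr -pix_fmt yuv420p output.mp4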
It seems like there is no way to have different durations for different images using ffmpeg. I would create separate videos for each of the images and then concat them using mencoder like this:
ffmpeg -f image2 -loop 1 -i a.jpg -vframes 30 -vcodec libx264 -r 1 a.mp4
ffmpeg -f image2 -loop 1 -i b.jpg -vframes 10 -vcodec libx264 -r 1 b.mp4
mencoder -ovc copy -o out.mp4 a.mp4 b.mp4
mencoder needs all the input videos for the concat operation to have the same resolution, framerate, and codec.
Here a.mp4 has 30 frames and lasts 30 seconds, and b.mp4 has 10 frames and lasts 10 seconds.
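If mencoder isn't available, the same pre-built clips can also be joined with ffmpeg's concat demuxer, as in the answer above (a sketch; the clips still need matching resolution, framerate, and codec for stream copy):
printf "file '%s'\n" a.mp4 b.mp4 > list.txt
ffmpeg -f concat -i list.txt -c copy out.mp4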
