I'm currently using this command:
ffmpeg -f concat -safe 0 -i info.txt -s 1280x720 -crf 24 output.mp4
to join all the videos in a folder. Before running the command, I enter a "file" line for each video into info.txt. This works perfectly, but I would like a fade effect (crossfade) between all the videos. How is this possible? I tried adding the following argument, which I found in an old post online, but it didn't work:
-filter_complex xfade=offset=4.5:duration=1
If anyone has a simple way of doing it, please let me know. I'm using the latest FFmpeg, so all features should be available.
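The reason that argument does nothing here is that xfade is a two-input filter, while the concat demuxer has already joined everything into a single stream before any filtering happens. A minimal sketch for just two clips (a.mp4 and b.mp4 are assumed names, not from the question):

```shell
# Hypothetical two-clip crossfade: the fade starts 4.5 s into a.mp4 and
# lasts 1 s, so offset must not exceed (duration of a.mp4 - 1 s).
# Both inputs must share resolution and pixel format.
ffmpeg -i a.mp4 -i b.mp4 -filter_complex \
  "[0:v][1:v]xfade=transition=fade:offset=4.5:duration=1[v]" \
  -map "[v]" -s 1280x720 -crf 24 output.mp4
```

For a whole folder you would chain one xfade per adjacent pair, with each offset measured on the accumulated timeline, and use acrossfade if the audio should blend as well.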
I'm currently trying to learn everything related to videos and encountered a problem that I need help with.
The question is: how can I save the difference between 2 videos to a separate file with ffmpeg?
For example here is the ffplay command I'm trying with:
(Source: https://superuser.com/questions/854543/how-to-compare-the-difference-between-2-videos-color-in-ffmpeg)
ffplay -f lavfi "movie=left.mp4,setpts=PTS-STARTPTS,split=3[a0][a1][a2];
movie=right.mp4,setpts=PTS-STARTPTS,split[b0][b1];
[a0][b0]blend=c0_mode=difference[y];
[a1]lutyuv=y=val:u=128:v=128[uv];
[y][uv]mergeplanes=0x001112:yuv420p,pad=2*iw:ih:0:0[down];
[a2][b1]hstack[up];[up][down]vstack"
In this case I would want to have the bottom left video saved to a new file.
Can someone help me put together the right ffmpeg filter and explain ffmpeg's processing?
Your modified command:
ffmpeg -i left.mp4 -i right.mp4 -filter_complex "[0]split[a0][a1];[a0][1]blend=c0_mode=difference[y];[a1]lutyuv=y=val:u=128:v=128[uv];[y][uv]mergeplanes=0x001112:yuv420p[v]" -map "[v]" output.mp4
Note that an input stream can feed only one filter input, so left.mp4 is first duplicated with split, just as in the ffplay version. blend then computes the per-pixel difference on the luma plane, lutyuv keeps the luma of the second copy while setting both chroma planes to neutral gray (128), and mergeplanes=0x001112 assembles the output from plane 0 of its first input ([y]) and planes 1 and 2 of its second ([uv]).
See the documentation for the blend, lutyuv, and mergeplanes filters.
I would like to create a thumbnail for an HLS stream. I am already doing it with mp4 files, like this:
ffmpeg -y -ss 00:00:10.000 -i file.mp4 -vframes 1 -vf scale=256:144 out.jpg
And it works great.
But when I try it with an HLS live stream, it just spams
Opening 'liveX.ts' for reading.
even though the 10th second is in 'live1.ts'.
Any solution to this? Also, if the duration is not in the stream, I would like it to just report an error.
I know this is an old question, but I was working with FFmpeg today to see how this could be done with a live stream, and I discovered that it can be done pretty easily.
Here is what I use:
ffmpeg.exe -y -i http://username:password@[hls feed ip address]/[path.m3u8] -s 800x450 -vframes 1 -f image2 -updatefirst 1 MyThumbnail.jpg
This is similar to the way you get a thumbnail from an rtsp stream, but seems to work faster.
I hope this helps someone.
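One caveat if you copy this on a recent build: if I'm not mistaken, the image2 option spelled -updatefirst in older releases is now spelled -update. An assumed modern equivalent (the bracketed host and path are the same placeholders as above):

```shell
# -update 1 tells the image2 muxer to keep overwriting a single output
# file instead of writing a numbered image sequence.
ffmpeg -y -i "http://username:password@[hls feed ip address]/[path.m3u8]" \
  -s 800x450 -vframes 1 -f image2 -update 1 MyThumbnail.jpg
```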
From the shell, when I specify a sequence of images via %d in the input filename, FFmpeg insists "No such file or directory", despite evidence to the contrary. Looking online, I haven't managed to find any references to generating video from a sequence of images with FFmpeg where %d is not used, yet it seems to fail here.
My images should be identified by FFmpeg from img%06d.gif. Issuing ls img[0-9][0-9][0-9][0-9][0-9][0-9].gif succeeds in the very same directory where I issue the FFmpeg command.
The command I use is:
ffmpeg -i img%06d.gif -c:v libx264 -r 30 -pix_fmt yuv720p test.mp4
What could possibly be going wrong?
The following definitely works:
ffmpeg -i images%06d.png -c:v libx264 -r 30 test.mp4 -y
However, it doesn't work with GIF pictures.
You can losslessly convert your pictures to PNG and then run the above command line.
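If converting first is acceptable, a loop along these lines (assuming the img%06d.gif naming from the question) should do it:

```shell
# Convert each GIF losslessly to PNG, then encode the PNG sequence.
for f in img??????.gif; do
  ffmpeg -y -i "$f" "${f%.gif}.png"
done
ffmpeg -i img%06d.png -c:v libx264 -r 30 -pix_fmt yuv420p test.mp4
```

(Note that the question's -pix_fmt yuv720p is presumably a typo for yuv420p; there is no yuv720p pixel format.)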
I have a folder that has about 10,000 separate jpegs in it, and I want to take all of these and convert them into one single mp4 video. When I do
ffmpeg -r 1 -pattern_type glob -i '/media/e/serv01/Dorgem/camera_history/$f_date/*.jpg' -c:v libx264 /media/e/serv01/Dorgem/camera_history/$f_date/$f_date.mp4
from the terminal it works fine, but once I put it into a bash script I get an error that it can't find *.jpg
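A likely cause, assuming the script uses the line exactly as written: single quotes prevent the shell from expanding $f_date, so ffmpeg looks under a literal $f_date directory. Double quotes expand the variable while still passing the * through untouched for ffmpeg's own -pattern_type glob matching:

```shell
f_date="2013-11-01"  # example value; the real script presumably sets this
# Inside double quotes the shell expands $f_date but does not glob-expand *,
# so ffmpeg receives the pattern intact.
echo "/media/e/serv01/Dorgem/camera_history/$f_date/*.jpg"
# → /media/e/serv01/Dorgem/camera_history/2013-11-01/*.jpg
```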
I tried executing ffmpeg on Ubuntu 13.10 to do exactly that, but it started complaining with all sorts of errors. It seems ffmpeg is deprecated there in favour of avconv. Here is how I compiled my JPEGs into an .mp4 using avconv:
avconv -r 30 -i line-%06d.jpg -qscale 2 -r 30 out.mp4
What I was trying to do is create a video of panning across a huge JPEG. If anyone's interested, details are here:
Video panning across your family tree chart (.jpg to .mp4)
Novice user of ffmpeg here, but I'm going through whatever docs I can find online.
For a current project I will need to composite 2 videos together to create a .flv file.
Does anyone know the commands to do this?
This works for me:
ffmpeg -i background_file.l -i file_to_overlay.flv -filter_complex overlay=0:0 -acodec aac -strict -2 out.flv
See http://ffmpeg.org/ffmpeg.html#overlay-1 for more details.
You can also add a scaler to the filter chain to scale things appropriately.
Run ffmpeg -filters to see the available filters.
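To illustrate the scaling suggestion (the file names and sizes here are placeholders, not from the original answer):

```shell
# Shrink the overlay to 320 px wide (height auto, preserving aspect ratio)
# before compositing it 10 px in from the top-left corner.
ffmpeg -i background.mp4 -i file_to_overlay.flv -filter_complex \
  "[1:v]scale=320:-1[ovr];[0:v][ovr]overlay=10:10" \
  -acodec aac -strict -2 out.flv
```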