Hi everyone!
I'm trying to prepend a splash screen that fades out after 2 seconds to a video using FFmpeg.
I'm using the following command:
ffmpeg -loop 1 -framerate 2 -t 2 -i image.png \
-i video.mp4 \
-filter_complex "[0:v]fade=t=in:st=0:d=0.500000,fade=t=out:st=4.500000:d=0.500000,setsar=1; \
[0:0] [1:0] concat=n=2:v=1:a=0" \
-c:v libx264 -crf 23 output.mp4
but it generates a video whose reported duration is correct, yet it stops playing after just 2 seconds, exactly the splash screen's duration.
Since I don't have much experience with FFmpeg and got this command from the internet, I don't know where the problem is...
Use
ffmpeg -i video.mp4 -loop 1 -t 2 -i image.png \
-filter_complex \
"[1]fade=t=in:st=0:d=0.500000,fade=t=out:st=1.500000:d=0.500000,setsar=1[i]; \
[i][0]concat=n=2:v=1:a=0" \
-c:v libx264 -crf 23 output.mp4
The image should be the same resolution as the video. It will fade in for 0.5 seconds, remain for 1 second, then fade out for 0.5 seconds.
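If the image is not already the same size, here is a variant of the same command that scales it first (a sketch assuming the video is 1920x1080; adjust the scale values to your video's actual resolution, since concat requires matching frame sizes):
ffmpeg -i video.mp4 -loop 1 -t 2 -i image.png \
-filter_complex \
"[1]scale=1920:1080,setsar=1,fade=t=in:st=0:d=0.5,fade=t=out:st=1.5:d=0.5[i]; \
 [i][0]concat=n=2:v=1:a=0" \
-c:v libx264 -crf 23 output.mp4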
I'm using the following to stream an image to YouTube:
ffmpeg -threads:v 2 -threads:a 8 -filter_threads 2 -thread_queue_size 1080 \
-loop 1 -re -i ./image.png \
-i ./track.mp3 \
-pix_fmt yuv420p -c:v libx264 -qp:v 19 -profile:v high -rc:v cbr_ld_hq -level:v 4.2 -r:v 60 -g:v 120 -bf:v 3 -refs:v 16 -preset fast -f flv rtmp://a.rtmp.youtube.com/live2/xxx
The looping of the image (to keep it streaming continuously) works, but the sound does not loop.
Remember that FFmpeg input options are applied per input. So -loop 1 is only specified for the -i image.png input, and -i ./track.mp3 has no input options defined. To loop the audio track, you need to use the -stream_loop input option, like this:
ffmpeg -threads:v 2 -threads:a 8 -filter_threads 2 -thread_queue_size 1080 \
-loop 1 -re -i ./image.png \
-stream_loop -1 -i ./track.mp3 \
...
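For reference, here is a stripped-down sketch of the looping behaviour (the encoder settings here are placeholders, not your full option set): -stream_loop -1 repeats the audio input indefinitely, while a positive value repeats it that many times.
# Sketch with placeholder output options: both inputs loop until the stream is stopped
ffmpeg -loop 1 -re -i ./image.png \
-stream_loop -1 -i ./track.mp3 \
-c:v libx264 -pix_fmt yuv420p -c:a aac \
-f flv rtmp://a.rtmp.youtube.com/live2/xxx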
I am trying to make a 15-second video where the background layer is a video made up of 2 images; the first command below creates that 15-second video from the 2 images.
I chose a small framerate so it renders an mp4 quickly. I then overlay a webm video (which has transparency) over the images. The final video seems to keep the framerate of 2, but I would rather keep the 24 fps framerate of the webm video.
Is this possible? And is it also possible to turn the two commands below into one?
ffmpeg -loop 1 -framerate 2 -t 11 -i image1.png -loop 1 -framerate 2 -t 4 -i image2.png -filter_complex "[0][1]concat=n=2" backgroundvideo.mp4;
ffmpeg -i backgroundvideo.mp4 -c:v libvpx-vp9 -i overlayvideo.webm -filter_complex overlay newvid.mp4
You can use the fps filter to adjust the background's framerate:
ffmpeg \
-loop 1 -framerate 2 -t 11 -i image1.png \
-loop 1 -framerate 2 -t 4 -i image2.png \
-c:v libvpx-vp9 -i overlayvideo.webm \
-filter_complex '[0][1]concat,fps=24[bg];[bg][2]overlay' \
backgroundvideo.mp4
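To verify that the final file is really 24 fps, you can check it with ffprobe (using the output name from the command above):
ffprobe -v error -select_streams v:0 \
-show_entries stream=r_frame_rate \
-of default=noprint_wrappers=1:nokey=1 backgroundvideo.mp4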
I am new to ffmpeg. I am doing zoom & pan with a fade in/out transition on an image to make a video. I used a script I found, but that was for 4 images and I want it for only a single image, so I tried this command:
ffmpeg -y -loop 1 -i 1.jpg -filter_complex "\
[0:v]setpts=PTS-STARTPTS,scale=w='if(gte(iw/ih,1280/720),-1,1280)':h='if(gte(iw/ih,1280/720),720,-1)',crop=1280:720,setsar=sar=1/1,format=rgba,split=2[stream1out1][stream1out2];\
[stream1out1]trim=duration=1,select=lte(n\,30),split=2[stream1in][stream1out];\
[stream1out2]trim=duration=2,select=lte(n\,60)[stream1];\
[stream1in]fade=t=in:s=0:n=30[stream1fadein];\
[stream1out]fade=t=out:s=0:n=30[stream1fadeout];\
[stream1fadein][stream1][stream1fadeout]concat=n=3:v=1:a=0,scale=1280*5:-1,zoompan=z='min(pzoom+0.002,2)':d=1:x='iw/2-(iw/zoom/2)':s=1280x720 ,format=yuv420p[video]" -map [video] -vsync 2 -async 1 -rc-lookahead 0 -g 0 -profile:v main -level 42 -c:v libx264 -r 30 df.mp4
It works fine, but it generates a 4-second video, so I'm confused about how to set the video duration in this command.
That command is a lot more complicated than it needs to be.
Use
ffmpeg -y -i 1.jpg \
-vf "scale=w='if(gte(iw/ih,1280/720),-1,1280*5)':h='if(gte(iw/ih,1280/720),720*5,-1)',\
crop=1280*5:720*5,setsar=1,\
zoompan=z='min(zoom+0.002,2)':d=X:x='iw/2-(iw/zoom/2)':s=1280x720,\
fade=in:s=0:n=25,fade=out:s=X-25:n=25,format=yuv420p" \
-c:v libx264 -profile:v main df.mp4
Replace X in zoompan and in the fade out with the total number of frames you want. The stream fps is 25, so the duration is X/25 seconds.
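For example, here is a filled-in version for a 5-second clip: X = 125 frames at 25 fps, so the fade out starts at frame 125-25 = 100.
ffmpeg -y -i 1.jpg \
-vf "scale=w='if(gte(iw/ih,1280/720),-1,1280*5)':h='if(gte(iw/ih,1280/720),720*5,-1)',\
crop=1280*5:720*5,setsar=1,\
zoompan=z='min(zoom+0.002,2)':d=125:x='iw/2-(iw/zoom/2)':s=1280x720,\
fade=in:s=0:n=25,fade=out:s=100:n=25,format=yuv420p" \
-c:v libx264 -profile:v main df.mp4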
I need your help. I stream to Twitch with this command:
ffmpeg -i input.mp4 -i image.jpg -filter_complex 'overlay=x=10:y=10' -s \
1920x1200 -framerate 15 -c:v libx264 -preset ultrafast -pix_fmt yuv420p \
-threads 0 -f flv 'rtmp://'
How is it possible to change the image.jpg picture to another picture at a variable time? I don't want to restart the FFmpeg command.
Add the -f image2 -loop 1 input options for the image input, then atomically replace image.jpg when desired, such as by using mv.
Basic example:
ffmpeg -i input.mp4 -f image2 -loop 1 -i image.jpg -filter_complex overlay output.mp4
Streaming example:
ffmpeg -re -i input.mp4 -f image2 -loop 1 -i image.jpg -filter_complex "overlay,format=yuv420p" -c:v libx264 -preset fast -g 50 -b:v 4000k -maxrate 4000k -bufsize 8000k -f flv 'rtmp://'
To answer the "variable time" part of your question, use a cron job to run a script that updates the overlay image at a specified interval, e.g. every 5 minutes. For example, you can create a folder of various overlays, select one randomly every 5 minutes, and copy it to image.jpg. FFmpeg will then render the new image to your stream.
It is important to use -f image2 -loop 1 -thread_queue_size 512 -i image.jpg, especially when rendering other image formats.
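As a sketch of the cron-driven swap (the overlays/ folder and temp file name are hypothetical), the script run every 5 minutes could look like this; copying to a temp file on the same filesystem first keeps the final mv atomic, so ffmpeg never reads a half-written file:
#!/bin/sh
# Pick a random overlay and swap it in atomically
cp "$(ls overlays/*.jpg | shuf -n 1)" image.jpg.tmp
mv image.jpg.tmp image.jpg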
I am using this command to combine 2 files (the overlay file over the original file):
ffmpeg -r 60 \
-i originalfile.webm -i overlayfile.mov \
-filter_complex " \
[0:v]setpts=PTS-STARTPTS[base]; \
[1:v]setpts=PTS-STARTPTS+0.5/TB, \
format=yuva420p,colorchannelmixer=aa=0.7[overlay]; \
[base][overlay]overlay=x=(W-w)/2:y=0[v]" -map "[v]" -map 0:a -c:a copy -c:v libvpx-vp9 -lossless 1 -threads 4 -quality realtime -speed 8 -tile-columns 6 -frame-parallel 1 -vsync 1 -shortest resultfile.webm
Encoding speed is not bad and the output quality is good, but after some time the video picture can freeze for several seconds, then it plays fine again, and then it can freeze again.
How could I optimize this command to get fast encoding with the highest possible quality (matching the original file) without the picture freezing?
Thank you
To avoid retiming the webm and to crop 10% of the overlay from the top and bottom, run:
ffmpeg \
-i originalfile.webm -i overlayfile.mov \
-filter_complex " \
[0:v]setpts=PTS-STARTPTS[base]; \
[1:v]crop=iw:0.80*ih,setpts=PTS-STARTPTS+0.5/TB, \
format=yuva420p,colorchannelmixer=aa=0.7[overlay]; \
[base][overlay]overlay=x=(W-w)/2:y=0[v]" \
-map "[v]" -map 0:a -c:a copy -c:v libvpx-vp9 -lossless 1 -threads 4 -quality realtime \
-speed 8 -tile-columns 6 -frame-parallel 1 -vsync 2 -shortest resultfile.webm
The crop filter centers the crop window by default, so when cropping to 80%, the top and bottom 10% will get cut off.
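Equivalently, you can write the offsets out explicitly; since the default x and y centre the crop window, y works out to (ih - 0.80*ih)/2 = 0.10*ih:
crop=iw:0.80*ih:0:0.10*ih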