Can I dynamically crop two input videos and then stack them using ffmpeg? - ffmpeg

I want to make an effect of switching back and forth between two videos.
I tried to dynamically crop the two input videos and then stack them using ffmpeg:
ffmpeg -i input1.mp4 -i input2.mp4 -filter_complex \
"[0:v]crop=iw:'2+(mod(n,ih))':0:0[a];[1:v]crop=iw:'ih-2-(mod(n,ih))':0:'2+(mod(n,ih))'[b];[a][b]vstack=inputs=2[v]" \
-map "[v]" output.mp4
I offset by 2 pixels to prevent the crop height from ever being zero.
But the output video is not what I want. It seems '(mod(n,ih))' is zero all the time.
I don't know what's wrong with it.
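A likely explanation, going by the crop filter's documentation: the w and h expressions are evaluated only once when the filter is configured, while x and y are re-evaluated for every frame, so a frame-dependent height collapses to a constant. As a minimal sketch of what does work per frame (a fixed-size crop whose position animates; the filenames here are placeholders):
ffmpeg -i input1.mp4 -vf "crop=iw:ih/2:0:'mod(n,ih/2)'" panned.mp4
The per-frame variable n is usable in the position expressions, just not in the size expressions.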

Related

FFmpeg/lavfi: Is it possible to fade out an image overlay that was loaded with a movie= filter rather than an -i parameter?

I've tried something like:
movie='overlaylogo.png',fade=out:st=5:d=1[logo],[in][logo]overlay='0:0'[out]
But it seems to stick at 100% opacity; adding in a loop=loop-1 and an fps=24 filter after the movie= load seems to have little effect. Is there some step required to convert an image into a video for fade to apply over?
The key is to keep your stream alive so the fade filter can do its job. An image input dies immediately after it sends out its frame.
With -i we'd do:
ffmpeg -i input.mp4 -loop 1 -i logo.png \
-filter_complex "[1:v]fade=out:st=5:d=1:alpha=1,[0:v]overlay=0:0:shortest=1[out]" \
-map "[out]" -map 0:a output.mp4
The -loop 1 image2 input option keeps the stream alive. The same thing must be done with the movie source filter. It has a loop option, but with a caveat: "Note that when the movie is looped the source timestamps are not changed, so it will generate non monotonically increasing timestamps." So you need to add a setpts filter to establish monotonically increasing timestamps yourself:
ffmpeg -i input.mp4 \
-vf "movie=logo.png:loop=0,setpts=N/FR/TB,fade=out:st=5:d=1:alpha=1,[in]overlay=0:0:shortest=1[out]" \
output.mp4
P.S., loop=-1 (not 1) should also work (and it likely won't need the setpts filter to fix the timestamp).
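If that P.S. means the loop filter applied after the movie source (rather than the movie source's own loop option), a sketch might look like the following; treat it as untested, and note that size=1, which tells loop to cache and repeat a single frame, is my assumption since a still image yields exactly one frame:
ffmpeg -i input.mp4 \
-vf "movie=logo.png,loop=loop=-1:size=1,fade=out:st=5:d=1:alpha=1,[in]overlay=0:0:shortest=1[out]" \
output.mp4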

How can I resize an overlay image with ffmpeg?

I am trying to add an overlay to a video using the following command
ffmpeg -y -i "$videoPath" -i "$overlayPath" -filter_complex "[0:v] [1:v] overlay=$overlayPosition" -pix_fmt yuv420p -c:a copy "$outputPath"
However, I would like to be able to resize the overlay I am about to apply to some arbitrary resolution (no care for keeping proportions). Although I followed a couple of similar solutions from SO (like FFMPEG - How to resize an image overlay?), I am not quite sure about the meaning of the parameters or what I need to add in my case.
I would need to add something like (?)
[1:v]scale=360:360[z] [1:v]overlay=$overlayPosition[z]
This doesn't seem to work, so I'm not sure what I should be aiming for.
I would appreciate any assistance, perhaps with some explanation.
Thanks!
You have found all the parts. Let's bring them together:
ffmpeg -i "$videoPath" -i "$overlayPath" -filter_complex "[1:v]scale=360:360[z];[0:v][z]overlay[out]" -map "[out]" -map "0:a" "$outputPath"
For explanation:
We're executing two filters here within the "filter_complex" parameter, separated by a semicolon ";".
First we scale the second video input ([1:v]) to a new resolution and store the output under the label "z" (you can use any name here).
Second we bring the first input video ([0:v]) and the scaled overlay ([z]) together and store the output under the label "out".
Now it's time to tell ffmpeg what it should pack into our output file:
-map "[out]" (for the video)
-map "0:a" (for the audio of the first input file)

FFMPEG - crop and pad a video (keep 3840x1920 but with black borders)

I am trying to crop a video so that I can remove a chunk of the content from the sides of a 360-degree video file using FFmpeg.
I used the following command and it does part of the job:
ffmpeg -i testVideo.mp4 -vf crop=3072:1920:384:0,pad=3840:1920:384:0 output.mp4
This will remove the sides of the video, and initially that was exactly what I wanted (A). Now I'm wondering if it is possible to crop in the same way but keep the top third of the video. As such, A is what I have and B is what I want:
I thought I could simply do this:
ffmpeg -i testVideo.mp4 -vf crop=3072:1920:384:640,pad=3840:1920:384:640 output.mp4
But that doesn't seem to work.
Any input would be very helpful.
Use the drawbox filter to fill the cropped portions with the default colour, black.
ffmpeg -i testVideo.mp4 -vf "drawbox=w=384:h=1280:x=0:y=640:t=fill,drawbox=w=384:h=1280:x=3840-384:y=640:t=fill" -c:a copy output.mp4
The first filter acts on the left side, and the second on the right.
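For reference, the crop+pad attempt in the question fails because pad requires the placed input to fit inside the output area, and a 1920-pixel-tall crop positioned at y=640 cannot fit in a 1920-pixel-tall canvas. If you'd rather not hard-code the numbers, drawbox also accepts expressions (iw and ih are the input width and height); a sketch with the same geometry as above:
ffmpeg -i testVideo.mp4 -vf "drawbox=w=iw/10:h=ih*2/3:x=0:y=ih/3:t=fill,drawbox=w=iw/10:h=ih*2/3:x=iw-iw/10:y=ih/3:t=fill" -c:a copy output.mp4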

Where did I make a mistake? - FFmpeg (Linux) basic problem

I just started learning FFmpeg. I have the code below, but it's doing nothing.
fmpeg -i videoplayback.mp4 -filter_complex "[1:v]trim=start=0:end=1,setpts=PTS-STARTPTS,scale=480x360,setsar=sar=16/9[intro1];
[1:v]trim=start=1:end=123.39,setpts=PTS-STARTPTS,scale=480x360,setsar=sar=16/9[main1];
[1:v]trim=start=123.39:end=124.39,setpts=PTS-STARTPTS,scale=480x360,setsar=sar=16/9[end1];
[intro1]format=pix_fmts=yuva420p, fade=t=in:st=0:d=1:alpha=1[intro1];
[end1]format=pix_fmts=yuva420p, fade=t=in:st=0:d=1:alpha=1[end1];
[intro1][main1][end1][output];
[a:1][audio]; -vcodec libx264 -map "[output]" -map"[audio]" "output.mp4"
fmpeg should be ffmpeg.
You only have one input so [1:v] should be [0:v] (it starts counting from 0).
No need for alpha for fading because you are not overlapping or blending frames.
Ending fade needs to be a fade out (not fade in).
You can't re-use filter output labels within the filtergraph.
Some of your filterchains can be combined.
Some of your labels are not associated with a filter (it appears you forgot to use the concat filter).
You can add scale and setsar at the end instead of using them for each segment.
Replace the last ; with ".
You didn't map the audio properly.
Stream copy (re-mux) the audio.
Example:
ffmpeg -i videoplayback.mp4 -filter_complex \
"[0:v]trim=end=1,setpts=PTS-STARTPTS,fade=t=in:d=1[intro]; \
 [0:v]trim=start=1:end=123.39,setpts=PTS-STARTPTS[main]; \
 [0:v]trim=start=123.39,setpts=PTS-STARTPTS,fade=t=out:d=1[end]; \
 [intro][main][end]concat=n=3:v=1:a=0,scale=480x360,setsar=16/9[v]" \
-map "[v]" -map 0:a -c:a copy output.mp4

I can't overlay and center a video on top of an image with ffmpeg. The output is 0 seconds long

I have an mp4 that I want to overlay on top of a jpeg. The command I'm using is:
ffmpeg -y -i background.jpg -i video.mp4 -filter_complex "overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/2" -codec:a copy output.mp4
But for some reason the output is 0 seconds long, although the thumbnail does show the first frame of the video centred on the image properly.
I have tried using -t 4 to set the output's length to 4 seconds, but that does not work.
I am doing this on Windows.
You need to loop the image. Since it then loops indefinitely, you must use the shortest option in overlay so the output ends when video.mp4 ends.
ffmpeg -loop 1 -i background.jpg -i video.mp4 -filter_complex \
"overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/2:shortest=1" \
-codec:a copy -movflags +faststart output.mp4
See overlay documentation for more info.
Well, you should loop the image for the video's duration. To do that, add -loop 1 before the input image; the image will then have an infinite duration. To control it, specify -shortest before the output file, which trims all the streams to the shortest duration among them. Alternatively, you can use -t to trim the image duration to the video length. This will do what you want.
Hope this helps!
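For reference, a sketch of the -shortest variant described above, using the same centring expression as the first answer and assuming video.mp4 carries an audio stream:
ffmpeg -loop 1 -i background.jpg -i video.mp4 -filter_complex "overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/2" -shortest -codec:a copy output.mp4
Here -shortest stops the muxer when the shortest output stream (the audio from video.mp4) ends, so the overlay filter itself doesn't need shortest=1.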
