I want to take an input video and concatenate it with itself in reverse (play it forwards, then backwards) using a filter_complex. So far I have split this over three commands: reversing, concatenating, and streaming, but I want to do it in one command. The following streams the video twice, but I don't know how to reverse the second copy. How does one reverse the second copy of the video without changing the first one?
ffmpeg -re -stream_loop -1 -i self_recording.mkv -filter_complex \
"[0:v]split=2[r1][r2];[0:v][r1][0:v][r2] concat=n=4:v=1[v]" \
-map "[v]" -f v4l2 /dev/video2
Video Prep:
ffmpeg -i "$video" -i "$watermark" \
-filter_complex "[0]split=2[fr][rv];[rv]reverse[rv]; \
[fr][rv]concat=n=2:v=1:a=0,$memestr,$rotatestr[jt]; \
[1][jt]scale2ref[i][m];[m][i]overlay=format=auto,format=yuv420p[v]" \
-map "[v]" -g 30 stream.mp4
Stream:
ffmpeg -stream_loop -1 -re -i "stream.mp4" -c copy -f v4l2 "/dev/video$output"
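For a single command with no intermediate stream.mp4, the split/reverse/concat chain can feed v4l2 directly. A minimal sketch, reusing the filenames above; note that reverse must buffer the whole clip in memory before it emits anything, so this plays the clip once (an infinite -stream_loop does not combine well with reverse, which is why the two-step approach pre-renders the file):
ffmpeg -re -i self_recording.mkv -filter_complex \
"[0:v]split=2[fwd][bwd];[bwd]reverse[rev];[fwd][rev]concat=n=2:v=1[v]" \
-map "[v]" -f v4l2 /dev/video2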
I'm trying to achieve what I asked in the title, but FFmpeg produces a 0-byte file.
Here is the code [not working]:
ffmpeg -i input.mkv -codec copy -i watermark.png -filter_complex '[0:v] scale=1920:-1,setsar=1:1; [1:v] overlay=0:0' output.mp4
If possible, I want to do the whole thing in just one command.
Thanks
Try this:
ffmpeg -i input.mkv -i watermark.png \
-filter_complex '[0:v] scale=1920:-1,setsar=1:1[v]; [v][1:v]overlay=0:0[out]' \
-map "[out]" -an output.mp4
-codec copy - can't be used if you want to apply filters; filtering requires re-encoding.
-an disables audio (see the variant below if you want to keep it).
Filtergraph: you were missing the [v] label needed to connect the two chains.
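If you'd rather keep the audio than drop it with -an, a sketch assuming input.mkv has an audio stream (copied without re-encoding):
ffmpeg -i input.mkv -i watermark.png \
-filter_complex '[0:v] scale=1920:-1,setsar=1:1[v]; [v][1:v]overlay=0:0[out]' \
-map "[out]" -map 0:a -c:a copy output.mp4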
ffmpeg -i foo.mp4 -filter_complex "fade=d=0.5, reverse, fade=d=0.5, reverse" output.mp4
can be used to fade foo.mp4 in and out (we do not care about audio), according to https://video.stackexchange.com/questions/19867/how-to-fade-in-out-a-video-audio-clip-with-unknown-duration
That works, but only in the simple case of one input video and one output. Now, how can I apply the fade in/out effect in the following more complex situation? I'm trying to concat a.jpg (a picture) with bar.mp4, and I want only the bar.mp4 portion to fade in and out.
ffmpeg -loop 1 -t 2 -framerate 1 -i a.jpg -f lavfi -t 2 -i anullsrc -r 24 -i bar.mp4 -filter_complex "[0][1][2:v][2:a] concat=n=2:v=1:a=1 [vpre][a];[vpre]fps=24[v]" -map "[v]" -map "[a]" out.mp4 -y
Of course, I could first create a temporary temp.mp4 from bar.mp4 by running the first command, then input temp.mp4 in my second command. This involves an extra step and extra encoding. Could anyone help fix the commands or suggest something even better?
Apply the fade chain to the bar.mp4 video stream before the concat:
ffmpeg -loop 1 -t 2 -framerate 24 -i a.jpg -f lavfi -t 2 -i anullsrc -r 24 -i bar.mp4 -filter_complex "[2:v]fade=d=0.5, reverse, fade=d=0.5, reverse[v2];[0][1][v2][2:a] concat=n=2:v=1:a=1 [v][a]" -map "[v]" -map "[a]" out.mp4 -y
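If you also want the bar.mp4 audio to fade in and out, the same reverse trick works on the audio with afade/areverse (a sketch; like reverse, areverse buffers the whole stream in memory):
ffmpeg -loop 1 -t 2 -framerate 24 -i a.jpg -f lavfi -t 2 -i anullsrc -r 24 -i bar.mp4 -filter_complex "[2:v]fade=d=0.5, reverse, fade=d=0.5, reverse[v2];[2:a]afade=d=0.5, areverse, afade=d=0.5, areverse[a2];[0][1][v2][a2] concat=n=2:v=1:a=1 [v][a]" -map "[v]" -map "[a]" out.mp4 -y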
I am trying to use FFmpeg to splice a few videos together and output one combined video.
I managed to concatenate the video streams with this command:
ffmpeg.exe -i 1.mov -i 2.mov -filter_complex "[0:v]scale=1920:1080[v0];[1:v]scale=1920:1080[v1];[v0][v1] concat=n=2:v=1[v]" -map "[v]" out.mp4
Also, I can add dummy audio to a video with this command:
ffmpeg.exe -i 1.mov -f lavfi -i aevalsrc=0 -shortest out.mov
The above commands work perfectly; however, 2.mov has an audio stream while 1.mov does not.
Is there any way to add dummy audio to 1.mov and then combine the video and audio streams from 1.mov and 2.mov in one go, so that the combined output plays sound during the 2.mov clip?
Use
ffmpeg.exe -i 1.mov -i 2.mov -f lavfi -t 1 -i anullsrc -filter_complex "[0:v]scale=1920:1080[v0];[1:v]scale=1920:1080[v1];[v0][2:a][v1][1:a] concat=n=2:v=1:a=1[v][a]" -map "[v]" -map "[a]" out.mp4
-f lavfi -t 1 -i anullsrc adds a silent 1 second audio input, which is used as a counterpart to the video input from 1.mov. The concat filter will pad the audio to match the video duration of 1.mov.
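If 2.mov also lacked audio, the same silent source could feed both segments by splitting it (a sketch, assuming neither input has an audio stream):
ffmpeg.exe -i 1.mov -i 2.mov -f lavfi -t 1 -i anullsrc -filter_complex "[0:v]scale=1920:1080[v0];[1:v]scale=1920:1080[v1];[2:a]asplit[a0][a1];[v0][a0][v1][a1] concat=n=2:v=1:a=1[v][a]" -map "[v]" -map "[a]" out.mp4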
I want to add multiple image-file sequences in a single ffmpeg command. Below is my command; the video gets created, but only the first image sequence is used and the second is ignored.
ffmpeg -y -i input.mp4 -start_number 0000001 -i 1/%07d.png -i 2/%07d.png -filter_complex "[0][1]overlay=x=10:y=10:enable='between(t,0,3)'[v1];[v1][2]overlay=x=10:y=10:enable='between(t,3.8561422222222,6.9761777777778)'[v2]" -map "[v2]" -map 0:a out.mp4
Now the problem is that FFmpeg wants consecutively numbered images, which I don't have; I just have images starting from 0000001.png in each folder. How can I accomplish this without renaming all my images?
Try the glob pattern to deal with the non-consecutive numbering, and offset the PTS of the second sequence with setpts so its frames aren't consumed before the overlay window starts:
ffmpeg -y -i input.mp4 -pattern_type glob -i "1/*.png" -pattern_type glob -i "2/*.png" -filter_complex "[0][1]overlay=x=10:y=10:enable='between(t,0,3)'[v1];[2]setpts=PTS+3.856/TB[fg];[v1][fg]overlay=x=10:y=10:enable='between(t,3.8561422222222,6.9761777777778)'[v2]" -map "[v2]" -map 0:a out.mp4
Can you pipe the images to -f image2pipe? With both folders cat'ed into a single piped stream there are only two inputs, so split the piped stream and feed it to both overlays:
cat $(find 1 2 -name '*.png' -print | sort) | ffmpeg -y -i input.mp4 \
-f image2pipe -vcodec png -i - \
-filter_complex "[1]split[i1][i2];[0][i1]overlay=x=10:y=10:enable='between(t,0,3)'[v1];[v1][i2]overlay=x=10:y=10:enable='between(t,3.8561422222222,6.9761777777778)'[v2]" \
-map "[v2]" -map 0:a out.mp4
I'm using this command to put an image in front of a video in ffmpeg:
ffmpeg -re -i input -i myimage -filter_complex "overlay=10:10" -f mpegts udp://127.0.0.1:port
I need to put another image at a different position on the same video; is there any way to do it?
You would chain two overlay filters, feeding the result of the first into the second:
ffmpeg -re -i input -i image1 -i image2 \
-filter_complex "[0][1]overlay=10:10[a];[a][2]overlay=W-10:10" \
-f mpegts udp://127.0.0.1:port
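Each additional image just adds another input and another overlay in the chain. A sketch with a hypothetical third image (image3, placed bottom-left):
ffmpeg -re -i input -i image1 -i image2 -i image3 \
-filter_complex "[0][1]overlay=10:10[a];[a][2]overlay=W-w-10:10[b];[b][3]overlay=10:H-h-10" \
-f mpegts udp://127.0.0.1:port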