ffmpeg - how to pan left on an image

I have a command that works well to pan right on an image:
ffmpeg -nostdin -loop 1 -i image0.jpg -filter_complex "[0:v]crop=ih:ih:iw/2*t/20:0,trim=duration=5,scale=-2:720" -c:a copy -pix_fmt yuv420p output.mp4
but I can't find the equivalent for panning left. I tried something like this:
ffmpeg -nostdin -loop 1 -i image0.jpg -filter_complex "[0:v]crop=ih:ih:iw/2-t*20:0,trim=duration=5,scale=-2:720" -c:a copy -pix_fmt yuv420p output.mp4
but it's not giving good results

Use
ffmpeg -nostdin -loop 1 -i image0.jpg -filter_complex "[0:v]crop=ih:ih:iw-ih-(iw-ih)*t/20:0,trim=duration=5,scale=-2:720" -c:a copy -pix_fmt yuv420p output.mp4
You have to set an initial offset (mine is iw-ih, the rightmost position for an ih-wide crop window) and then subtract an expression that advances with time, here (iw-ih)*t/20.
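If you want the pan to cover the full width within the 5-second clip rather than only part of it, divide by the clip duration instead of 20 (a variation of the command above, not from the original answer):
ffmpeg -nostdin -loop 1 -i image0.jpg -filter_complex "[0:v]crop=ih:ih:iw-ih-(iw-ih)*t/5:0,trim=duration=5,scale=-2:720" -c:a copy -pix_fmt yuv420p output.mp4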

Related

Multiple sounds + watermark overlay not working with ffmpeg

I have a problem with an ffmpeg command.
I want to add the same sound several times to the final video and then add a watermark on top.
When I run the full command, it doesn't work correctly: the sound is only played once (the first reference):
ffmpeg -i "assets/frame%05d.png" -i "assets/sound.mp3" -loop 1 -i
"assets/watermark.png" -filter_complex
"[1:a]adelay=1000|1000[s1];[1:a]adelay=3000|3000[s2];[s1][s2]amix=2[a];[0:v][2:v]overlay=shortest=1[outv]"
-map "[outv]" -map "[a]" -c:v libx264 -pix_fmt yuv420p -preset ultrafast -y "result.mp4"
When I don't add the watermark, it works correctly:
ffmpeg -i "assets/frame%05d.png" -i "assets/sound.mp3" -filter_complex
"[1:a]adelay=1000|1000[s1];[1:a]adelay=3000|3000[s2];[s1][s2]amix=2[a]"
-map 0:v -map "[a]" -c:v libx264 -pix_fmt yuv420p -preset ultrafast -y "result.mp4"
Try this:
ffmpeg -i "assets/frame%05d.png" \
-stream_loop -1 -i "assets/sound.mp3" \
-loop 1 -i "assets/watermark.png" \
-filter_complex "[1:a]adelay=1000|1000[s1];\
[1:a]adelay=3000|3000[s2];\
[s1][s2]amix=2[a];\
[0:v][2:v]overlay=shortest=1[outv]" \
-map "[outv]" -map "[a]" -shortest \
-c:v libx264 -pix_fmt yuv420p -preset ultrafast -y "result.mp4"
The -stream_loop -1 input option loops the audio input indefinitely, and the -shortest output option stops the audio when the video is done.
P.S.: I think an aecho filter could replace the second adelay and the amix filter.
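For reference, a rough sketch of that idea (untested, and the levels won't match amix exactly, since amix scales each input down): delay the sound to 1 s, then let aecho add a full-volume copy 2 s later, landing at 3 s.
[1:a]adelay=1000|1000,aecho=1:1:2000:1[a]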
I found that the real problem was that the frames making up the video were not all encoded correctly, which is what caused the problem with the sounds.
ffmpeg can't read indexed PNGs correctly.
You therefore have to set this option (for those like me who use Imagick):
$imagick->setOption('png:color-type', 6);
$imagick->writeImage('frame00000.png');
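If you generate the frames some other way, the same fix should also work from the ImageMagick command line (a sketch, assuming the frames live in assets/):
mogrify -define png:color-type=6 assets/frame*.png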

FFMPEG - How to resize an image overlay?

I need to resize input 3 (logo.gif) to 360x360, but using scale=360:360 just made my video quality really bad. Here's my code:
ffmpeg -y -hide_banner -safe 0 -f concat -i "concat.txt" -i "overlay.png" -i "audio.mp3" -ignore_loop 0 -i "logo.gif" -filter_complex "[0]scale=3840x2160,zoompan=z='if(lte(zoom,1.0),1.25,max(1.001,zoom-0.0012))':x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)':fps=20:d=200:s=1920x1080[p];[p][1]overlay, scale=1920:1080, drawtext=fontfile=Heathergreen.otf:text='TITLE':fontcolor=black:fontsize=62:x=135:y=940, drawtext=fontfile=voxbox.ttf:text='TEXT':fontcolor=white:fontsize=70:x=120:y=885[v];[2:a]showwaves=mode=cline:s=178x56:r=20:scale=sqrt:colors=0x222222,colorkey=0x000000:0.01:0.1,format=yuva420p[w];[v][3]overlay=20:500[z];[z][w]overlay=108:740[outv]" -map "[outv]" -map 2:a -pix_fmt yuv420p -c:v libx264 -c:a aac -preset veryfast -shortest -movflags faststart -fflags genpts -r 20 "output.mp4"
UPDATE: I've simply resized the image and used that as input rather than resizing during the encode. It works fine, but if anyone has an answer to this I'd be curious to know where I was going wrong.
Instead of [v][3]overlay=20:500[z] you would use [3]scale=360:360[3v];[v][3v]overlay=20:500[z]. Your GIF should be square-shaped to begin with, to avoid distorting it.
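If the GIF isn't square and you don't want to distort it, scale can instead be told to fit it inside the 360x360 box (a variation, not from the original answer):
[3]scale=360:360:force_original_aspect_ratio=decrease[3v];[v][3v]overlay=20:500[z]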

FFMPEG: Combine "Create video from images" + scale to x + add audio + overlay logo

I'm working on a webcam project for generating timelapse videos of the sunset.
I'm using a Raspberry Pi with gphoto2 and a DSLR to capture the images.
At the end of the day the images should be turned into a video, with audio and a logo overlay.
It should be scaled to 1920 pixels wide.
I had a nice two-step solution and it worked.
Producing the timelapse video and scaling it:
ffmpeg -y -framerate 25 -start_number 0000001 -i /var/www/html/webcam/2020-01-05_bilder/%7d.jpg -vf scale=1920:-1 -pix_fmt yuv420p /var/www/html/webcam/2020-01-05-tag-output-1920.mp4
Taking the output of (1), adding an overlay logo and audio:
ffmpeg -y -i '/var/www/html/webcam/2020-01-05-tag-output-1920.mp4' \
-i '/var/www/html/webcam-scripts/graphics/logo.png' \
-i '/var/www/html/webcam-scripts/sounds/chill_time_5.mp3' \
-shortest -filter_complex '[1][0]scale2ref=h=ow/mdar:w=iw/6[#A logo][liebfrauen]; [#A logo]format=argb,colorchannelmixer=aa=0.95[#B logo transparent]; [liebfrauen][#B logo transparent] overlay=(main_w-w)-(main_w*0.05):(main_h-h)-(main_h*0.01)' \
-c:v libx264 -crf 18 -preset slow -pix_fmt yuv420p -c:a aac -strict -2 \
'/var/www/html/webcam/2020-01-05-tag-1920.mp4'
I tried to combine both actions, but I get an error:
ffmpeg -y -framerate 25 -start_number 0000001 -i '/var/www/html/webcam/2020-01-05_bilder/%7d.jpg' -vf scale=1920:-1 -pix_fmt yuv420p -i '/var/www/html/webcam-scripts/graphics/logo.png' -i '/var/www/html/webcam-scripts/sounds/chill_time_5.mp3' -shortest -filter_complex '[1][0]scale2ref=h=ow/mdar:w=iw/6[#A logo][liebfrauen]; [#A logo]format=argb,colorchannelmixer=aa=0.95[#B logo transparent]; [liebfrauen][#B logo transparent] overlay=(main_w-w)-(main_w*0.05):(main_h-h)-(main_h*0.01)' -c:v libx264 -crf 18 -preset slow -pix_fmt yuv420p -c:a aac -strict -2 '/var/www/html/webcam/2020-01-05-tag-1920.mp4'
Error: Filtergraph 'scale=720:-1' was specified through the -vf/-af/-filter option for output stream 0:0, which is fed from a complex filtergraph.
-vf/-af/-filter and -filter_complex cannot be used together for the same stream.
Isn't it possible to combine these inputs and scale them? Or... where is my misunderstanding?
Don't mix -vf and -filter_complex. Do all filtering in one filtergraph.
ffmpeg -y -framerate 25 -i '/var/www/html/webcam/2020-01-05_bilder/%7d.jpg' -i '/var/www/html/webcam-scripts/graphics/logo.png' -i '/var/www/html/webcam-scripts/sounds/chill_time_5.mp3' -filter_complex '[0]scale=1920:-2[v0];[1][v0]scale2ref=h=ow/mdar:w=iw/6[#A logo][liebfrauen]; [#A logo]format=argb,colorchannelmixer=aa=0.95[#B logo transparent]; [liebfrauen][#B logo transparent] overlay=(main_w-w)-(main_w*0.05):(main_h-h)-(main_h*0.01),format=yuv420p' -c:v libx264 -crf 18 -preset slow -c:a aac -shortest '/var/www/html/webcam/2020-01-05-tag-1920.mp4'
No need for -strict -2; it does nothing for modern ffmpeg.
I replaced -pix_fmt yuv420p with format=yuv420p at the end of the filtergraph so everything stays in one place.
-start_number 0000001 is not needed because 1 is the default.
Note also the scale=1920:-2 (instead of -1), which keeps the height divisible by 2 as required by libx264 with yuv420p.

Adding overlay image to video on specific position ffmpeg

I want to add one or more resized images anywhere over the video using ffmpeg. It works for some positions, but it does not place the images at the exact position I want. I have tested it on the console, and it is embedded in PHP with dynamic variables.
ffmpeg -y -i vid_1561454052.mp4 -i Penguins.jpg \
-filter_complex "[0:v][1:v] overlay=221:127:enable='between(t,0,5)'" \
-pix_fmt yuv420p -aspect 16:9 -c:a copy vid_1562740969.mp4
Please Help me out...
Use the loop option:
ffmpeg -y -i vid_1561454052.mp4 -loop 1 -i Penguins.jpg \
-filter_complex "[0:v][1:v] overlay=221:127:enable='between(t,0,5)'" \
-pix_fmt yuv420p -aspect 16:9 -c:a copy vid_1562740969.mp4
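If the fixed coordinates still don't land where you expect, overlay also accepts expressions based on the video and image dimensions, e.g. centering the image (a sketch, not part of the original answer):
overlay=x=(main_w-overlay_w)/2:y=(main_h-overlay_h)/2:enable='between(t,0,5)'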

Correct syntax for ffmpeg filter combination?

I'm playing with ffmpeg to generate a pretty video out of an mp3 + jpg.
I've managed to generate a video that takes a jpg as a background, and adds a waveform complex filter on top of it (and removes the black bg as an overlay).
This works:
ffmpeg -y -i 1.mp3 -loop 1 -i 1.jpg -filter_complex "[0:a]showwaves=s=1280x720:mode=cline,colorkey=0x000000:0.01:0.1,format=yuva420p[v];[1:v][v]overlay[outv]" -map "[outv]" -pix_fmt yuv420p -map 0:a -c:v libx264 -c:a copy -shortest output.mp4
I've been trying to add text somewhere in the generated video too. I'm trying the drawtext filter. I can't get this to work however, so it seems I don't understand the syntax, or how to combine filters.
This doesn't work:
ffmpeg -y -i 1.mp3 -loop 1 -i 1.jpg -filter_complex "[0:a]showwaves=s=1280x720:mode=line,colorkey=0x000000:0.01:0.1,format=yuva420p[v];[1:v][v]overlay[outv]" -filter_complex "[v]drawtext=text='My custom text test':fontcolor=White#0.5: fontsize=30:font=Arvo:x=(w-text_w)/5:y=(h-text_h)/5[out]" -map "[outv]" -pix_fmt yuv420p -map 0:a -c:v libx264 -c:a copy -shortest output.mp4
Would love some pointers!
Filters operating in series should be chained together:
ffmpeg -y -i 1.mp3 -loop 1 -i 1.jpg \
-filter_complex "[0:a]showwaves=s=1280x720:mode=line,colorkey=0x000000:0.01:0.1,format=yuva420p[v];[1:v][v]overlay,drawtext=text='My custom text test':fontcolor=White@0.5:fontsize=30:font=Arvo:x=(w-text_w)/5:y=(h-text_h)/5[outv]" \
-map "[outv]" -pix_fmt yuv420p -map 0:a -c:v libx264 -c:a copy -shortest output.mp4
(You applied the drawtext onto the output of showwaves; it can be applied directly to the overlay output. Note also that an alpha value on a named color is written White@0.5, not White#0.5.)
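If you prefer to keep the steps on separate labels, the same graph can also be written inside a single -filter_complex with an explicit label for the overlay output (equivalent to the chained version above):
-filter_complex "[0:a]showwaves=s=1280x720:mode=line,colorkey=0x000000:0.01:0.1,format=yuva420p[v];[1:v][v]overlay[tmp];[tmp]drawtext=text='My custom text test':fontcolor=White@0.5:fontsize=30:font=Arvo:x=(w-text_w)/5:y=(h-text_h)/5[outv]"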
