ffmpeg showwaves color

I'm trying to produce a video that has a lime-green waveform overlaid on top of a background image. Unfortunately, there is a grey color in the lines.
How can I make the grey parts lime green as well?
And if possible, I would like to make the lines thicker as well.
Here is my ffmpeg command:
ffmpeg -i input.aac -i background.jpg -filter_complex "[0:a]aformat=sample_fmts=s16:sample_rates=4410:channel_layouts=mono,showwaves=size=300x200:mode=p2p:rate=10:colors=#68b847[fg];[1:v][fg]overlay=130:150,format=yuv420p[v]" -map "[v]" -map 0:a -c:v libx264 -r 10 -c:a copy -r 10 -movflags +faststart output.mp4

Try this:
ffmpeg -i input.aac -i background.jpg -filter_complex "[0:a]aformat=channel_layouts=mono,showwaves=s=1920x1080:mode=p2p:r=30:colors=#68b847[v];[1:v][v]overlay=format=auto:x=(W-w)/2:y=(H-h)/2,format=yuv420p[outv]" -map "[outv]" -map 0:a -c:v libvpx-vp9 -c:a copy -shortest output.mp4
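As for making the lines thicker: showwaves doesn't have a line-width option as far as I can tell. One workaround (an untested sketch, not part of the answer above, shown on top of the original command rather than the one just given) is to render the waveform at half the target size and upscale it with nearest-neighbour scaling, so each 1-pixel line becomes 2 pixels wide:
ffmpeg -i input.aac -i background.jpg -filter_complex "[0:a]aformat=channel_layouts=mono,showwaves=size=150x100:mode=p2p:rate=10:colors=#68b847,scale=300:200:flags=neighbor[fg];[1:v][fg]overlay=130:150,format=yuv420p[v]" -map "[v]" -map 0:a -c:v libx264 -c:a copy -movflags +faststart output.mp4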

Related

FFMPEG - Merge 2 Files (video_video), with TC and watermark

I need to merge two video files, add a watermark, and burn a timecode into the video.
I found this (by llogan):
ffmpeg -i video.mp4 -i audio.mp3 -i watermark.png -filter_complex "[0:v:0]drawtext=fontfile=/usr/share/fonts/TTF/DejaVuSansMono.ttf:timecode='01\:23\:45\:00':r=25:x=(w-text_w)/2:y=h-text_h-20:fontsize=20:fontcolor=white:box=1:boxborderw=4:boxcolor=black[bg];[1][bg]overlay=W-w-10:H-h-12:format=auto[v]" -map "[v]" -map 1:a -shortest output.mp4
But I can't apply it to two videos, because of the map. Can someone help me, please? My last attempt was:
ffmpeg -i [video1] -i [video2] -i [image-overlay] -filter_complex "[0:v:0]drawtext=fontfile=/Windows/Fonts/arial.ttf: timecode='00\:00\:00\:00': r=25: x=(w-tw)/2: y=h-(2*lh): fontcolor=0xccFFFF#1: fontsize=85: box=1: boxcolor=0x000000#0.2[bg];concat=n=2:v=1:a=1[vv][a];[vv][2:v]overlay=0:0[v];[vv][bg]overlay=0:0" -map "[v]" -map "[a]" -c:v libx264 -b 2000k -preset fast -c:a aac [output file]
Assuming you want to draw the timecode on video1 only, concatenate video1 with video2, and add the image overlay on top of the combined video:
ffmpeg -i [video1] -i [video2] -i [image-overlay] -filter_complex "[0:v:0]drawtext=fontfile=/Windows/Fonts/arial.ttf: timecode='00\:00\:00\:00': r=25: x=(w-tw)/2: y=h-(2*lh): fontcolor=0xccFFFF#1: fontsize=85: box=1: boxcolor=0x000000#0.2[tc];[tc][0:a][1:v][1:a]concat=n=2:v=1:a=1[cat][a];[cat][2:v]overlay=0:0[v]" -map "[v]" -map "[a]" -c:v libx264 -b:v 2000k -preset fast -c:a aac [output file]
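One caveat not covered above: the concat filter needs both video segments to have the same resolution, so if video2 doesn't match video1 you'll want to scale it before the concat. For example (a sketch; 1920:1080 is just a placeholder for video1's resolution), insert a scale step and feed the scaled stream into concat:
[1:v]scale=1920:1080,setsar=1[v1];[tc][0:a][v1][1:a]concat=n=2:v=1:a=1[cat][a]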

How to overlay a video onto another video as background at a specific time in ffmpeg?

MY CODE
-i "A.mov" -i "B.mp4" -filter_complex "[1:v]setpts=PTS+5/TB[a];[1:v][a]overlay=enable=gte(t\,5):eof_action=pass,format=yuv420p[out]" -map "[out]" -map 0:a? -c:v libx264 -crf 18 -c:a copy "f.mov" -y
This overlays B.mp4 in front of A.mov, but I need B.mp4 as the background.
Thank you.
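One possible direction (an untested sketch, not a verified answer): overlay always draws its second input on top of its first, so to get B.mp4 behind A.mov, feed the delayed B stream in as the first overlay input and A.mov as the second. A.mov needs to be smaller than B.mp4 (or have transparency), and the first five seconds, before the delayed B produces any frames, may need separate handling:
ffmpeg -i "A.mov" -i "B.mp4" -filter_complex "[1:v]setpts=PTS+5/TB[bg];[bg][0:v]overlay=eof_action=pass,format=yuv420p[out]" -map "[out]" -map 0:a? -c:v libx264 -crf 18 -c:a copy "f.mov" -y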

How can I overlay a video on top of a background image?

I have a command which creates waves on top of a background image:
ffmpeg -y -i "Assets/Screens/new.png" -i "Temp/video.mp4" -i "Temp/audio.mp3" -filter_complex "[2:a]showwaves=mode=cline:s=255x81:scale=sqrt:colors=0x222222,colorkey=0x000000:0.01:0.1,format=yuva420p[w];[v][w]overlay=240:594,scale=1920:1080[outv]" -map "[outv]" -map 2:a -movflags +faststart -c:v libx264 -c:a aac -preset veryfast -shortest "output.mp4"
How can I also overlay a 1024x576px video at position 756:252?
I have already included the video file as the second (1:v) input, but I can't seem to get the filters to play nicely.
Thanks for any help.
Overlay the image and the video first, then feed the overlaid output into your existing filter chain.
The part [1:v][0:v]overlay=756:252[t] combines video.mp4 and new.png into the intermediate video stream [t].
[t][w]overlay=240:594 replaces your [v][w]overlay=240:594.
Complete command:
ffmpeg -y -i "Assets/Screens/new.png" -i "Temp/video.mp4" -i "Temp/audio.mp3" -filter_complex "[1:v][0:v]overlay=756:252[t];[2:a]showwaves=mode=cline:s=255x81:scale=sqrt:colors=0x222222,colorkey=0x000000:0.01:0.1,format=yuva420p[w];[t][w]overlay=240:594,scale=1920:1080[outv]" -map "[outv]" -map 2:a -movflags +faststart -c:v libx264 -c:a aac -preset veryfast -shortest "output.mp4"
I hope I got it right, testing is difficult without having the inputs...
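If the placement comes out the other way around (the PNG drawn on top of the video instead of the video sitting at 756:252 on the background image), try swapping the inputs of that first overlay, i.e. [0:v][1:v]overlay=756:252[t]; overlay always draws its second input on top of its first.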

FFMPEG map multiple audio input files to 1 single image file in order to create multiple video output files

I am attempting to have multiple output files in ffmpeg map to multiple inputs. However, instead of each input mapping to a unique output, the first input maps to the first video and the remaining videos never get created at all. I will describe exactly what I am trying to achieve below.
I need to:
create multiple video output files
from multiple audio input files
which all use the same one common image file to create the video
I will post my command below; any help would be greatly appreciated. Thanks.
-y -i input1.mp3 -i input2.mp3 -f image2 -loop 1 -r 2 -i imagefile.png -shortest -c:a aac -c:v mpeg4 -crf 18 -preset veryfast -movflags faststart -map 0 output1.mp4 -map1 output2.mp4
Basic command
ffmpeg -i input1.mp3 -i input2.mp3 -loop 1 -framerate 10 -i imagefile.png -map 2:v -map 0:a -vf format=yuv420p -shortest output1.mp4 -map 2:v -map 1:a -vf format=yuv420p -shortest output2.mp4
The video is filtered and encoded once per output.
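If you specifically want the codecs and flags from your original attempt (mpeg4 video, AAC audio, faststart), the same structure would look something like this (an assumption about your requirements, so adjust as needed):
ffmpeg -i input1.mp3 -i input2.mp3 -loop 1 -framerate 10 -i imagefile.png -map 2:v -map 0:a -vf format=yuv420p -c:v mpeg4 -c:a aac -movflags +faststart -shortest output1.mp4 -map 2:v -map 1:a -vf format=yuv420p -c:v mpeg4 -c:a aac -movflags +faststart -shortest output2.mp4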
With the split filter
ffmpeg -i input1.mp3 -i input2.mp3 -loop 1 -framerate 10 -i imagefile.png -filter_complex "[2:v]format=yuv420p,split=outputs=2[v0][v1]" -map "[v0]" -map 0:a -shortest -movflags +faststart output1.mp4 -map "[v1]" -map 1:a -shortest -movflags +faststart output2.mp4
The video is filtered once total, and encoded separately for each output.
Pipe
ffmpeg -y -v error -loop 1 -framerate 10 -i imagefile.png -filter_complex "[0:v]format=yuv420p[v]" -map "[v]" -c:v libx264 -f nut - | ffmpeg -y -i - -i input1.mp3 -i input2.mp3 -map 0:v -map 1:a -c:v copy -shortest -movflags +faststart output1.mp4 -map 0:v -map 2:a -c:v copy -shortest -movflags +faststart output2.mp4
The video is filtered and encoded only once and each output stream copies it.
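If you end up with many audio files, another option (a sketch, assuming the files are named input*.mp3 and one output per file is wanted) is to loop over them in the shell and run the basic command once per file:
for a in input*.mp3; do
  ffmpeg -y -i "$a" -loop 1 -framerate 10 -i imagefile.png -map 1:v -map 0:a -vf format=yuv420p -c:v libx264 -c:a aac -shortest -movflags +faststart "${a%.mp3}.mp4"
done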

Correct syntax for ffmpeg filter combination?

I'm playing with ffmpeg to generate a pretty video out of an mp3 + jpg.
I've managed to generate a video that takes a jpg as a background and adds a waveform complex filter on top of it (removing the waveform's black background before overlaying it).
This works:
ffmpeg -y -i 1.mp3 -loop 1 -i 1.jpg -filter_complex "[0:a]showwaves=s=1280x720:mode=cline,colorkey=0x000000:0.01:0.1,format=yuva420p[v];[1:v][v]overlay[outv]" -map "[outv]" -pix_fmt yuv420p -map 0:a -c:v libx264 -c:a copy -shortest output.mp4
I've been trying to add text somewhere in the generated video too, using the drawtext filter, but I can't get it to work, so it seems I don't understand the syntax or how to combine filters.
This doesn't work:
ffmpeg -y -i 1.mp3 -loop 1 -i 1.jpg -filter_complex "[0:a]showwaves=s=1280x720:mode=line,colorkey=0x000000:0.01:0.1,format=yuva420p[v];[1:v][v]overlay[outv]" -filter_complex "[v]drawtext=text='My custom text test':fontcolor=White#0.5: fontsize=30:font=Arvo:x=(w-text_w)/5:y=(h-text_h)/5[out]" -map "[outv]" -pix_fmt yuv420p -map 0:a -c:v libx264 -c:a copy -shortest output.mp4
Would love some pointers!
Filters operating in series should be chained together:
ffmpeg -y -i 1.mp3 -loop 1 -i 1.jpg -filter_complex "[0:a]showwaves=s=1280x720:mode=line,colorkey=0x000000:0.01:0.1,format=yuva420p[v];[1:v][v]overlay,drawtext=text='My custom text test':fontcolor=White#0.5:fontsize=30:font=Arvo:x=(w-text_w)/5:y=(h-text_h)/5[outv]" -map "[outv]" -pix_fmt yuv420p -map 0:a -c:v libx264 -c:a copy -shortest output.mp4
(You applied the drawtext to the output of showwaves; it can be applied directly to the overlay output instead.)
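One more note in case it trips anyone up: font=Arvo only works if your ffmpeg build includes fontconfig and the Arvo font is installed; otherwise point drawtext at the font file directly, e.g. fontfile=/path/to/Arvo-Regular.ttf (the path here is just an example).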
