I want to synchronously play an audio (.wav) file and a video that is provided to me in raw RGB format.
The .rgb file contains all the RGB images of the video frames, concatenated. How can I combine the .rgb file and the audio using ffmpeg to get an output video that can be played in VLC?
Input 1 : audio.wav
Input 2 : allimages.rgb
Output : A video file which can be played in vlc player.
I was looking at the ffmpeg documentation but couldn't find anything about raw RGB input. It would be a great help if you could provide the ffmpeg command for doing the above.
Thanks
The closest I got is the command below, but I see green and pink colors in the video when I play it. I think I am missing something in the ffmpeg command. Can anyone tell me what is wrong with it, and help improve the video quality and remove the green and pink tint?
ffmpeg -s 480x270 -r 15 -pix_fmt gbrp -i /Users/sandeep/Downloads/Videos/input.rgb -c:v libx264 -y output.mp4
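A sketch of a command that may fix this, assuming the .rgb file holds packed 24-bit RGB frames: raw video needs its container format declared with `-f rawvideo`, and the planar `gbrp` format in the command above is a likely cause of the swapped colors. The output also needs a player-friendly pixel format for VLC:

```shell
# Tell ffmpeg the input is raw video: packed 24-bit RGB, 480x270, 15 fps.
# If the colors are still swapped, try -pix_fmt bgr24 on the input instead.
ffmpeg -f rawvideo -pix_fmt rgb24 -s 480x270 -r 15 -i allimages.rgb \
       -i audio.wav \
       -c:v libx264 -pix_fmt yuv420p \
       -c:a aac -shortest \
       -y output.mp4
```

`-pix_fmt yuv420p` on the output keeps the H.264 stream playable in common players, and `-shortest` ends the file at the shorter of the two inputs so audio and video stay aligned.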
I am creating a video with ffmpeg by stringing together a bunch of PNG files. The resulting video has horizontal lines running across it and the colors are not accurate. Here's the command I used:
ffmpeg -framerate 1 -i img%04d.png -pix_fmt yuv420p timer.mp4
I am attaching an example of one of the input PNG files and a frame from the video. Can anyone tell what's wrong?
input file
video frame
Using ffmpeg, I created a video from a list of PNG images to use as a Zoom virtual background. However, when I try to upload it to Zoom, it says "Unsupported format. Please upload a different file." Here is the command that I used:
ffmpeg -framerate 1 -i img%04d.png output.mp4
I get the same error if I try to output a .mov file. Am I missing some option in the ffmpeg command?
PNGs store pixel color data as RGB values. Videos store color data as YUV. However, when converting an RGB input, ffmpeg chooses a YUV format that preserves full signal fidelity but is incompatible with most players. You have to set a compatible pixel format with reduced chroma resolution, i.e. yuv420p. Also, a framerate of 1 isn't supported by some players, so duplicate frames to increase the output framerate.
ffmpeg -framerate 1 -i img%04d.png -r 5 -pix_fmt yuv420p output.mp4
I want to compose 2 videos into 1 video by putting side by side.
I also hope to set start/stop time for each video.
Final video should be H264/AAC codec and mp4 format.
I attached sample videos.
https://www.dropbox.com/s/e5eouyrrqsy44ts/1.webm?dl=0
https://www.dropbox.com/s/u0zqie0icxamt3q/2.webm?dl=0
I used the following ffmpeg command.
ffmpeg -i 1.webm -i 2.webm -filter_complex "[0:v][1:v]hstack" output.mp4
When I run this command in Terminal on Mac OS X 10.11, it gives me the following error:
Input 1 height 480 does not match input 0 height.
The videos are from a smartphone, so their orientation is not correct.
Please help me to make composed video with FFmpeg.
For this set of videos, you need
ffmpeg -i 1.webm -i 2.webm -filter_complex "[0:v]scale=480:640,setsar=1[l];[1:v]scale=480:640,setsar=1[r];[l][r]hstack;[0][1]amix" -vsync 0 output.mp4
The writing application hasn't written the stream attributes correctly; the videos should be tagged as 480x640. It's not just a missing rotation tag, since the frame content itself is portrait.
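The question also asks for per-input start/stop times, which the command above doesn't cover. A sketch using `trim`/`atrim` (the 2–5 second window is a placeholder; set the start and end for each input as needed):

```shell
# Cut each input to its own window, reset timestamps, then stack and mix.
ffmpeg -i 1.webm -i 2.webm -filter_complex \
"[0:v]trim=2:5,setpts=PTS-STARTPTS,scale=480:640,setsar=1[l];\
[1:v]trim=2:5,setpts=PTS-STARTPTS,scale=480:640,setsar=1[r];\
[l][r]hstack[v];\
[0:a]atrim=2:5,asetpts=PTS-STARTPTS[a0];\
[1:a]atrim=2:5,asetpts=PTS-STARTPTS[a1];\
[a0][a1]amix[a]" \
-map "[v]" -map "[a]" -c:v libx264 -c:a aac output.mp4
```

`setpts=PTS-STARTPTS` (and its audio counterpart) restarts each trimmed segment at time zero so the stacked streams line up.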
I have some FLV videos with alpha channels, and I want to convert each of them to PNG images using ffmpeg but keep the transparency.
So far, I've tried this:
ffmpeg -i input.flv -an -y %d.png
But this outputs the PNG files with black background.
Is there any way to do this?
Alternate acceptable solution: If I can output the images and give the alpha channel a certain color of my choice. I can then remove it later via imagemagick and convert that color to transparency.
I know it's quite late for an answer, but I was searching for a similar solution and found this:
ffmpeg -i video.flv -r 25 -vcodec png -pix_fmt rgb32 %d.png
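On current ffmpeg builds, a simpler sketch that should also keep the transparency, assuming the FLV stream actually carries an alpha plane (e.g. VP6 with alpha):

```shell
# Request an RGBA pixel format so the PNG encoder writes the alpha channel.
ffmpeg -i input.flv -an -pix_fmt rgba %d.png
```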