rawvideo and rgb32 values passed to FFmpeg

I'm converting a file to PNG format using this call:
ffmpeg.exe -vframes 1 -vcodec rawvideo -f rawvideo -pix_fmt rgb32 -s <width>x<height> -i infile -f image2 -vcodec png out.png
I want to use a converter that can be linked or compiled into a closed-source commercial product, unlike FFmpeg, so I need to understand the format of the input file I'm passing in.
So, what does rawvideo mean to FFmpeg?
Is FFmpeg determining what type of raw format the input file has, or does rawvideo denote something distinct?
What does rgb32 mean here?
The size of the input file is a little more than (width * height * 8) bytes.

Normally a video file contains a video stream (whose format is specified using -vcodec), embedded in a media container (e.g. mp4, mkv, wav, etc.). The -f option is used to specify the container format. -f rawvideo is basically a dummy setting that tells ffmpeg that your video is not in any container.
-vcodec rawvideo means that the video data within the container is not compressed. However, there are many ways uncompressed video could be stored, so it is necessary to specify the -pix_fmt option. In your case, -pix_fmt rgb32 says that each pixel in your raw video data uses 32 bits (1 byte each for red, green, and blue, and the remaining byte ignored).
For more information on what the options mean, see the ffmpeg documentation.
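A quick way to sanity-check a raw input file against -pix_fmt rgb32 is to compare its size with the expected bytes per frame. A minimal Python sketch (the 640x480 dimensions are placeholders, not from the question):

```python
def rgb32_frame_size(width: int, height: int) -> int:
    """Bytes in one raw rgb32 frame: 4 bytes per pixel
    (one each for red, green, blue, plus one padding/alpha byte)."""
    return width * height * 4

# A raw rgb32 file holding exactly one frame should be width * height * 4 bytes.
print(rgb32_frame_size(640, 480))  # 1228800
```

If the file size is not a multiple of this value, the -s dimensions or the pixel format assumption is wrong.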

Related

Converting images to video keeping GOP 1 using ffmpeg

I have a sequence of PNG images with incrementing integer filenames starting from 1, which I need to convert to a video with GOP 1 using ffmpeg. I used the following command to convert the images to video and then used ffplay to seek to a particular frame, but the displayed frame doesn't match the frame being sought. Any help?
ffmpeg -i image%03d.png -c:v libx264 -g 1 -pix_fmt yuv420p out.mp4

FFmpeg raw h.264 set pts value

I am currently using ffmpeg to convert a custom container media format to mp4. It is straightforward to dump all the h.264 frames to one file and the aac audio to another. Then I can combine the two and create an mp4 file with ffmpeg.
The problem is that the video source isn't always perfect. From time to time frames are dropped or arrive late, etc. This causes an A/V sync issue, since the PTS is generated at a constant rate by ffmpeg. The source format I am using has the PTS value, but I can't figure out a way to pass it to ffmpeg along with the raw h.264 frames.
I suppose it would be possible to create a demuxer for the custom format, but it seems like a lot of effort. I looked into ffmpeg's .nut container format, thinking that I might be able to convert from the custom container to .nut first. Unfortunately it seems more complex than it looks on the surface.
It seems like there should be an easy way to pass a frame and its PTS value to ffmpeg, but I haven't come across it yet. Any help would be appreciated.
Here is the ffmpeg command I am using
ffmpeg -f s16le -ac 1 -ar 48k -i source.audio -framerate 20 -i source.video -c:a aac -b:a 64k -r 20 -c:v h264_nvenc -rc:v vbr_hq -cq:v 19 -n out.mp4

is it possible to send ffmpeg images by using pipe?

I want to send images as input to ffmpeg, and I want ffmpeg to output video to a stream (WebRTC format).
I found some information suggesting this is possible. I believe that ffmpeg can receive images from a pipe; does anyone know how this can be done?
"I want to send images as input to FFmpeg... I believe that FFmpeg can receive images from a pipe, does anyone know how this can be done?"
Yes, it's possible to send FFmpeg images by using a pipe. Use the standardInput to send frames. The frame data must be uncompressed pixel values (e.g. 24-bit RGB format) in a byte array that holds enough bytes (width x height x 3) to write a full frame.
Normally (in Command or Terminal window) you set input and output as:
ffmpeg -i inputvid.mp4 outputvid.mp4.
But for pipes you must first specify the incoming input's width/height and frame rate, etc. Then also add the incoming input filename as -i - (where using a blank - means FFmpeg watches the standardInput connection for incoming raw pixel data).
You must put your frame data into some Bitmap object and send the bitmap values as a byte array. Each send will be encoded as a new video frame. Example pseudo-code:
public function makeVideoFrame ( frame_BMP:Bitmap ) : void
{
    //# Encodes the byte array of a Bitmap object as an FFmpeg video frame
    if ( myProcess.running == true )
    {
        Frame_Bytes = frame_BMP.getBytes(); //# read pixel values into a byte array
        myProcess.standardInput.writeBytes( Frame_Bytes ); //# send data to FFmpeg to encode a new frame
        Frame_Bytes.clear(); //# empty the byte array for re-use with the next frame
    }
}
Anytime you update your bitmap with new pixel information, you can write it out as a new video frame by passing that bitmap to the above function, e.g. makeVideoFrame( my_new_frame_BMP );.
Your pipe's Process must start with these arguments:
-y -f rawvideo -pix_fmt argb -s 800x600 -r 25 -i - ....etc
Where...
-f rawvideo -pix_fmt argb means accept uncompressed 32-bit ARGB pixel data.
-s 800x600 gives an example input width and height, and -r 25 sets the frame rate, meaning FFmpeg must encode this many images for each second of output video.
The full setup looks like this:
-y -f rawvideo -pix_fmt argb -s 800x600 -r 25 -i - -c:v libx264 -profile:v baseline -level:v 3 -b:v 2500 -an out_vid.h264
If you get blocky video output try setting two output files...
-y -f rawvideo -pix_fmt argb -s 800x600 -r 25 -i - -c:v libx264 -profile:v baseline -level:v 3 -b:v 2500 -an out_tempData.h264 out_vid.h264
This will output a test h264 video file which you can later put inside an MP4 container. The audio track -i someTrack.mp3 is optional.
-i myH264vid.h264 -i someTrack.mp3 outputVid.mp4
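The same pipe approach can be sketched in Python with subprocess. This is a minimal illustration, not the answer's original code: it assumes ffmpeg is on the PATH, and the frame source is hypothetical. The key points are the -i - argument and that each write to stdin must be exactly one frame's worth of bytes (width x height x 4 for argb):

```python
import subprocess

WIDTH, HEIGHT, FPS = 800, 600, 25
FRAME_BYTES = WIDTH * HEIGHT * 4  # argb: 4 bytes per pixel

def build_pipe_command(out_path: str) -> list[str]:
    """Build the ffmpeg argument list; '-i -' makes ffmpeg read raw frames from stdin."""
    return ["ffmpeg", "-y",
            "-f", "rawvideo", "-pix_fmt", "argb",
            "-s", f"{WIDTH}x{HEIGHT}", "-r", str(FPS),
            "-i", "-",
            "-c:v", "libx264", "-profile:v", "baseline",
            "-an", out_path]

# Launching the encode (requires ffmpeg installed; 'frames' is a hypothetical
# iterable of bytes objects, each exactly FRAME_BYTES long):
# proc = subprocess.Popen(build_pipe_command("out_vid.h264"), stdin=subprocess.PIPE)
# for frame in frames:
#     proc.stdin.write(frame)
# proc.stdin.close()
# proc.wait()
```

If a write is shorter or longer than one full frame, ffmpeg's framing drifts and the output video is corrupted, so buffer each frame completely before writing.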

Using FFmpeg to losslessly convert YUV to another format for editing in Adobe Premiere

I have a raw YUV video file that I want to do some basic editing on in Adobe CS6 Premiere, but it won't recognize the file. I thought to use ffmpeg to convert it to something Premiere would take in, but I want this to be lossless because afterwards I will need it in YUV format again. I thought of avi, mov, and prores, but I can't seem to figure out the proper ffmpeg command line and how to ensure it is lossless.
Thanks for your help.
Yes, this is possible. It is normal that you can't open that raw video file, since it is just raw data in one giant file, without any headers. So Adobe Premiere doesn't know what the frame size is, what the frame rate is, etc.
First make sure you downloaded the FFmpeg command line tool. Then after installing you can start converting by running a command with parameters. There are some parameters you have to fill in yourself before starting to convert:
Which YUV pixel format are you using? The most common format is planar 8-bit YUV 4:2:0 (yuv420p). You can run ffmpeg -pix_fmts to get a list of all available formats.
What is the frame rate? In my example I will use -r 25 (25 fps).
Which encoder do you want to use? The libx264 (H.264) encoder is a great one for lossless compression.
What is your frame size? In my example I will use -s 1920x1080.
Then we get this command to do your compression.
ffmpeg -f rawvideo -vcodec rawvideo -s 1920x1080 -r 25 -pix_fmt yuv420p -i inputfile.yuv -c:v libx264 -preset ultrafast -qp 0 output.mp4
A little explanation of all other parameters:
With -f rawvideo you tell ffmpeg the input is raw video without any container
With -vcodec rawvideo you declare the input data as uncompressed video
With -i inputfile.yuv you set your input file
With -c:v libx264 you set the encoder to encode the video to libx264.
The -preset ultrafast setting only speeds up the encoding, so your file size will be bigger than if you set it to veryslow.
With -qp 0 you set lossless quality: 0 is the best, 51 is the worst quality.
Then output.mp4 is your new container to store your data in.
After you are done in Adobe Premiere, you can convert it back to a YUV file by inverting almost all the parameters. FFmpeg recognizes what's inside the mp4 container, so you don't need to provide format parameters for the input.
ffmpeg -i input.mp4 -f rawvideo -vcodec rawvideo -pix_fmt yuv420p -s 1920x1080 -r 25 rawvideo.yuv
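Because a raw YUV file has no headers, its size is the only sanity check you have on the -s and -pix_fmt values. For yuv420p each frame is 1.5 bytes per pixel (a full-resolution Y plane plus two quarter-resolution chroma planes). A small Python sketch illustrating the arithmetic, using the 1920x1080 example above:

```python
def yuv420p_frame_size(width: int, height: int) -> int:
    """yuv420p: Y plane (w*h) + U plane (w*h/4) + V plane (w*h/4) = 1.5 bytes/pixel."""
    return width * height * 3 // 2

def frame_count(file_size: int, width: int, height: int) -> int:
    """Number of frames in a raw yuv420p file; the size must be a whole number of frames."""
    fsize = yuv420p_frame_size(width, height)
    if file_size % fsize != 0:
        raise ValueError("file size is not a whole number of frames; check -s / -pix_fmt")
    return file_size // fsize

print(yuv420p_frame_size(1920, 1080))  # 3110400 bytes per frame
```

If frame_count raises, the round trip used the wrong dimensions or pixel format, and the re-exported .yuv will look scrambled.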

Convert a YUV video stream to mp4

How can I record YUV video and encode it to mp4 using the H.264 codec for a Mac application?
Please suggest any link on this.
FFmpeg can encode YUV to mp4 (H.264) via the libx264 encoder, but you have to specify the exact YUV pixel format of your source video; there are several YUV formats.
This command converts raw video with the yuv420p pixel format to an H.264 video in an MP4 container.
# Converts the raw yuv420p data to a MPEG-4 video
ffmpeg -f rawvideo -pix_fmt yuv420p -video_size 1280x720 -framerate 25 -i 'in' -f mp4 'out'
The list below shows the YUV pixel formats that can be decoded by ffmpeg.
$ ffmpeg -pix_fmts 2>&1 | grep yuv
yuv420p
yuv422p
yuv444p
yuv410p
yuv411p
yuvj420p
yuvj422p
yuvj444p
yuv440p
yuvj440p
yuva420p
yuv420p16le
yuv420p16be
yuv422p16le
yuv422p16be
yuv444p16le
yuv444p16be
yuv420p9be
yuv420p9le
yuv420p10be
yuv420p10le
yuv422p10be
yuv422p10le
yuv444p9be
yuv444p9le
yuv444p10be
yuv444p10le
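The formats in this list differ mainly in chroma subsampling (420/422/444) and bit depth (8/9/10/16 bits, le/be byte order). The raw frame size follows from those two properties; the sketch below hand-derives bits per pixel for a few of the formats (values computed from the subsampling ratios, not queried from any ffmpeg API):

```python
# Bits per pixel for a few planar formats from the list above.
BITS_PER_PIXEL = {
    "yuv420p": 12,      # 4:2:0, 8-bit: Y + U/4 + V/4
    "yuv422p": 16,      # 4:2:2, 8-bit: Y + U/2 + V/2
    "yuv444p": 24,      # 4:4:4, 8-bit: full-resolution chroma
    "yuv420p10le": 24,  # 4:2:0, 10-bit samples stored in 16-bit words
}

def raw_frame_bytes(pix_fmt: str, width: int, height: int) -> int:
    """Bytes needed for one raw frame of the given pixel format."""
    return width * height * BITS_PER_PIXEL[pix_fmt] // 8

print(raw_frame_bytes("yuv422p", 1280, 720))  # 1843200
```

Picking the wrong -pix_fmt therefore shifts every subsequent frame boundary, which is why the answer stresses knowing the exact source format.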
This is the easiest way to convert video formats using the macOS command line (any version). First download this compressed file and unpack it to your Movies folder:
https://drive.google.com/file/d/0B3NlLwMD4yd9QU0yVGJyU1NiUDA/view?usp=sharing
You will then have a MMedia_Converter directory with two apps: MMedia_Convert and Android_Converter. These are my own open-source macOS applications, based on previous work by the FFmpeg group and the HandBrake group in France. Both are fully compliant, compiled Mac applications, and you'll have to do nothing but extract them to your Movies folder.
You also have there, 3 folders: clip_in, clip_out and scripts.
You must put the videos you want to be converted in the clip_in folder.
The converted output videos, will be generated automatically in the clip_out folder.
In addition you have 2 bash scripts, which you must move to your macOS Desktop.
Once these scripts are on the Desktop, edit them with TextEdit and replace my user name with your Mac user name.
In my case, I use one script to generate thumbnails only, and the other to generate thumbnails too and to automatically convert videos from any format to whatever I choose.
"Whatever" means that if you want to convert mpeg to mkv, you have to declare it in the line DEST_EXT=mkv (or whatever known video format you want).
Hope this will help you all.
Best Regards, Tomás Hernández
