Chroma Subsampling with ffmpeg

I want to create an .mp4 output, but it doesn't work.
I'm using ffmpeg. My input video is a raw video and I want to have a raw video .mp4 at the end.
The command I use:
ffmpeg.exe -i input.y4m -c:v rawvideo -vf format=yuv420p output.y4m
Can anyone help me out? :)

The correct syntax is:
ffmpeg.exe -i input.y4m -pix_fmt yuv420p output.y4m
Test sample:
Build a synthetic .y4m video file in YUV444 format:
ffmpeg -f lavfi -i testsrc=rate=10:size=160x120 -t 5 -pix_fmt yuv444p input1.y4m
Convert from YUV444 to YUV420:
ffmpeg -i input1.y4m -pix_fmt yuv420p output1.y4m
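To confirm the conversion, you can inspect the output pixel format with ffprobe (a quick sketch; ffprobe ships alongside ffmpeg):
ffprobe -v error -select_streams v:0 -show_entries stream=pix_fmt -of default=noprint_wrappers=1 output1.y4m
This should print pix_fmt=yuv420p.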
You can also create a raw video AVI:
ffmpeg -i input1.y4m -c:v rawvideo -pix_fmt yuv420p output1.avi
But you can't create a raw video MP4.
The following command:
ffmpeg -i input1.y4m -c:v rawvideo -pix_fmt yuv420p output1.mp4
returns this error message:
[mp4 @ 00000140d77eb9c0] Could not find tag for codec rawvideo in stream #0, codec not currently supported in container
Could not write header for output file #0 (incorrect codec parameters ?): Invalid argument
Error initializing output stream 0:0 --
The error occurs because the MP4 container doesn't support raw video.
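If you need an MP4 that preserves the pixels exactly, a common workaround (a sketch, not part of the answer above) is lossless H.264, which the MP4 container does support:
ffmpeg -i input1.y4m -c:v libx264 -qp 0 -pix_fmt yuv420p output1.mp4
Note that the yuv444p to yuv420p chroma subsampling is itself lossy; -qp 0 only makes the encoding step lossless.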

Related

How to save decoded raw rgba frames using ffmpeg?

I need to transcode an mp4 video to raw video frames stored in a file.
I'm trying this command but it fails with Unable to find a suitable output format for 'test.rgba':
ffmpeg -i test.mp4 -vcodec rawvideo -pix_fmt rgba -an test.rgba
Use this:
ffmpeg -i test.mp4 -vf format=rgba -f rawvideo test.rgba
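Because the raw file has no header, a player has to be told the pixel format, frame size, and rate to read it back. A sketch, assuming the source is 1280x720 at 25 fps (substitute your video's actual values):
ffplay -f rawvideo -pixel_format rgba -video_size 1280x720 -framerate 25 test.rgba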

FFMPEG - Convert UInt16 Data to .264

At the moment, I'm trying to use FFmpeg to convert raw uint16 data from an infrared camera to MP4, or at least to .h264.
My current command for ffmpeg is here:
ffmpeg -f rawvideo -pix_fmt gray16be -s:v 140x110 -r 30 -i binaryMarianData.bin -c:v libx264 -f rawvideo -pix_fmt yuv420p output.264
But my output file doesn't really look good :(
[Image: a frame of my input (it's a nose)]
[Image: a frame of my output]
Here is my Input File: http://fileshare.link/91a43a238e0de75b/binaryMarianData.bin
Update 1: Little Endian
Hey guys, it would be great if it were possible to get the video output in little-endian byte order.
[Image: a frame shown in ImageJ, with the settings used to display it]
Unfortunately, my output doesn't look like this.
[Image: my output frame, little endian]
This is the command I used to convert the raw file:
ffmpeg -f rawvideo -pixel_format gray16le -video_size 110x140 -framerate 30 -i binaryMarianData.bin -vf transpose=clock -c:v libx264 -pix_fmt yuv420p output.264
It is sideways, so the stride has to be corrected and the image rotated.
ffmpeg -f rawvideo -pixel_format gray16be -video_size 110x140 -framerate 30 -i binaryMarianData.bin -vf transpose=clock -c:v libx264 -pix_fmt yuv420p output.264
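If the goal is an .mp4 rather than a bare .264 stream, the Annex B output can be remuxed into MP4 without re-encoding (a sketch; -framerate 30 matches the capture rate used above):
ffmpeg -framerate 30 -i output.264 -c:v copy output.mp4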

ffmpeg convert TS video file to raw rgb32 video file

I have a transport stream file containing H.264 video and would like to extract the video stream to a file containing raw uncompressed RGB32 video frames. So the H.264 video would need to be decoded and converted to RGB32 frames that would be dumped into a file.
Is there a ffmpeg command that would do this, or any other method?
Using the FFmpeg command line:
ffmpeg -i in.ts -pix_fmt rgba -c:v rawvideo -map 0:v -f rawvideo in-rgba.raw
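Since the .raw output carries no header, you'll need the stream's geometry to read it back later. One way to get it (a sketch using ffprobe, which ships with ffmpeg):
ffprobe -v error -select_streams v:0 -show_entries stream=width,height,r_frame_rate -of csv=p=0 in.ts
Each RGBA frame in the output file then occupies width x height x 4 bytes.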

How to extract ('demux') an mp4 video to get the encoded data (or elementary stream) using ffmpeg?

How do I demux an mp4 video to get the video (h264 format) elementary stream using the ffmpeg command line?
Thanks
ffmpeg -i input.mp4 -vcodec copy -bsf h264_mp4toannexb -an -f h264 output.h264
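The h264_mp4toannexb bitstream filter is needed because MP4 stores NAL units with length prefixes, while a raw elementary stream uses Annex B start codes. As a quick sanity check, ffplay can play the extracted Annex B stream directly:
ffplay output.h264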

FFMPEG: how to save input camera stream into the file with the SAME codec format?

I have a camera-like device that produces a video stream and passes it into my Windows-based machine via USB port.
Using the command:
ffmpeg -y -f vfwcap -i list
I see that (as expected) FFmpeg finds the input stream as stream #0.
Using the command:
ffmpeg -y -f vfwcap -r 25 -i 0 c:\out.mp4
I can successfully save the input stream into the file.
From the log I see:
Stream #0:0: Video: rawvideo (UYVY / 0x59565955), uyvy422, 240x320, 25 tbr, 1k tbn, 25 tbc
No pixel format specified, yuv422p for H.264 encoding chosen.
So, my input format is transcoded to yuv422p.
My question:
How can I cause FFmpeg to save my input video stream into out.mp4 WITHOUT transcoding - actually, to copy input stream to output file as close as possible, with the same format?
How can I cause ffmpeg to save my input video stream into out.mp4 WITHOUT transcoding
You cannot. You can stream copy the rawvideo from vfwcap, but the MP4 container format does not support rawvideo. You have several options:
Use a different output container format.
Stream copy to rawvideo then encode.
Use a lossless encoder (and optionally re-encode it after capturing).
Use a different output container format
This meets your requirement of saving your input without re-encoding.
ffmpeg -f vfwcap -i 0 -codec:v copy rawvideo.nut
rawvideo creates huge file sizes.
Stream copy to rawvideo then encode
This is the same as above, but the rawvideo is then encoded to a more common format.
ffmpeg -f vfwcap -i 0 -codec:v copy rawvideo.nut
ffmpeg -i rawvideo.nut -codec:v libx264 -crf 23 -preset medium -pix_fmt yuv420p -movflags +faststart output.mp4
See the FFmpeg and x264 Encoding Guide for more information about -crf, -preset, and additional detailed information on creating H.264 video.
-pix_fmt yuv420p will use a pixel format that is compatible with dumb players like QuickTime. Refer to colorspace and chroma subsampling for more info.
-movflags +faststart relocates the moov atom which allows the video to begin playback before it is completely downloaded by the client. Useful if you are hosting the video and users will view it in their browser.
Use a lossless encoder
Using huffyuv:
ffmpeg -f vfwcap -i 0 -codec:v huffyuv lossless.mkv
Using lossless H.264:
ffmpeg -f vfwcap -i 0 -codec:v libx264 -qp 0 lossless.mp4
Lossless files can be huge, but not as big as rawvideo.
Re-encoding the lossless output is the same as re-encoding the rawvideo.
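For example, re-encoding the huffyuv capture mirrors the rawvideo re-encode shown above (a sketch using the same settings):
ffmpeg -i lossless.mkv -c:v libx264 -crf 23 -preset medium -pix_fmt yuv420p -movflags +faststart output.mp4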
