How to save decoded raw rgba frames using ffmpeg? - ffmpeg

I need to transcode an mp4 video into raw video frames stored in a file.
I'm trying this command, but it fails with "Unable to find a suitable output format for 'test.rgba'":
ffmpeg -i test.mp4 -vcodec rawvideo -pix_fmt rgba -an test.rgba

The .rgba extension doesn't map to any known muxer, so the rawvideo output format has to be forced explicitly:
ffmpeg -i test.mp4 -vf format=rgba -f rawvideo test.rgba
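The output file is just a headerless dump of packed RGBA frames, so whatever reads it back needs the geometry supplied on the command line. For example, to preview it with ffplay (the 1280x720 size and 25 fps here are placeholders for your actual stream parameters):
ffplay -f rawvideo -pixel_format rgba -video_size 1280x720 -framerate 25 test.rgba
Each frame occupies width x height x 4 bytes.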

Related

Chroma Subsampling with ffmpeg

I want to create an .mp4 output, but it doesn't work...
I'm using ffmpeg. My input is a raw video and I want to end up with a raw video .mp4.
The command I use:
ffmpeg.exe -i input.y4m -c:v rawvideo -vf format=yuv420p output.y4m
Can anyone help me out? :)
The correct syntax is: ffmpeg.exe -i input.y4m -pix_fmt yuv420p output.y4m
Test sample:
Build a synthetic y4m video file in YUV444 format:
ffmpeg -f lavfi -i testsrc=rate=10:size=160x120 -t 5 -pix_fmt yuv444p input1.y4m
Convert from YUV444 to YUV420:
ffmpeg -i input1.y4m -pix_fmt yuv420p output1.y4m
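If you want to confirm that the subsampling really changed, ffprobe can report the pixel format of each file (a quick check, not part of the original answer):
ffprobe -v error -select_streams v:0 -show_entries stream=pix_fmt -of default=nw=1 output1.y4m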
You can also create AVI raw video:
ffmpeg -i input1.y4m -c:v rawvideo -pix_fmt yuv420p output1.avi
But you can't create raw mp4 video.
The following command: ffmpeg -i input1.y4m -c:v rawvideo -pix_fmt yuv420p output1.mp4 returns an error message:
[mp4 @ 00000140d77eb9c0] Could not find tag for codec rawvideo in stream #0, codec not currently supported in container
Could not write header for output file #0 (incorrect codec parameters ?): Invalid argument
Error initializing output stream 0:0 --
The error occurs because the mp4 container doesn't support raw (uncompressed) video.
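If the real goal is just an MP4 that preserves the pixels exactly, a common workaround (a sketch, not part of the original answer) is lossless H.264 rather than literal rawvideo:
ffmpeg -i input1.y4m -c:v libx264 -qp 0 -pix_fmt yuv420p output1.mp4
-qp 0 makes x264 operate losslessly, though the result is still an encoded stream, not raw frames.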

FFMPEG - Convert UInt16 Data to .264

At the moment I'm trying to use FFmpeg to convert raw uint16 data from an infrared camera to MP4, or at least to .h264.
My current ffmpeg command is:
ffmpeg -f rawvideo -pix_fmt gray16be -s:v 140x110 -r 30 -i binaryMarianData.bin -c:v libx264 -f rawvideo -pix_fmt yuv420p output.264
But my output file doesn't really look right :(
(Frame of my input; it's a nose)
(Frame of my output)
Here is my Input File: http://fileshare.link/91a43a238e0de75b/binaryMarianData.bin
Update 1: Little Endian
Hey guys, it would be great if it were possible to get the video output in little-endian byte order.
This is a frame shown in ImageJ with the following settings:
(ImageJ settings used for the frame above)
Unfortunately my output doesn't look like this.
(Output frame, little endian)
This is the command I used to convert the RAW file:
ffmpeg -f rawvideo -pixel_format gray16le -video_size 110x140 -framerate 30 -i binaryMarianData.bin -vf transpose=clock -c:v libx264 -pix_fmt yuv420p output.264
It is sideways, so the stride has to be corrected and the image rotated.
ffmpeg -f rawvideo -pixel_format gray16be -video_size 110x140 -framerate 30 -i binaryMarianData.bin -vf transpose=clock -c:v libx264 -pix_fmt yuv420p output.264
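If you then want the elementary stream in a playable container without re-encoding, remuxing it is enough (assuming the 30 fps rate used above):
ffmpeg -framerate 30 -i output.264 -c:v copy output.mp4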

ffmpeg convert TS video file to raw rgb32 video file

I have a transport stream file containing H.264 video and would like to extract the video stream to a file containing raw uncompressed RGB32 video frames. So the H.264 video would need to be decoded and converted to RGB32 frames that would be dumped into a file.
Is there a ffmpeg command that would do this, or any other method?
Using the FFmpeg command line:
ffmpeg -i in.ts -pix_fmt rgba -c:v rawvideo -map 0:v -f rawvideo in-rgba.raw
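The resulting .raw file has no header, so you'll need the source resolution and frame rate to interpret it later; ffprobe can report those up front (a helper step, not part of the original answer):
ffprobe -v error -select_streams v:0 -show_entries stream=width,height,r_frame_rate -of csv=p=0 in.ts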

How to extract ('demux') from 'mp4' video to get the encoded data (or elementary stream) using ffmpeg?

How do I demux an mp4 video to get the video (H.264) elementary stream using the ffmpeg command line?
Thanks
ffmpeg -i input.mp4 -vcodec copy -bsf h264_mp4toannexb -an -f h264 output.h264
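On reasonably recent ffmpeg builds the raw H.264 muxer inserts the annex-b bitstream filter automatically, so the shorter form below should behave the same; treat that as an assumption to check against your version:
ffmpeg -i input.mp4 -c:v copy -an output.h264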

convert avi to yuv using gstreamer

I want to convert an avi file to a yuv file. I tried using
ffmpeg -i <input.avi> <output.yuv>
but it doesn't get converted. Can anybody suggest how to do the conversion using gstreamer?
This is how you solve it using ffmpeg:
ffmpeg -i in.avi -vcodec rawvideo -pix_fmt yuv420p out.yuv
This converts any input to 4:2:0 planar YUV.
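A raw .yuv file carries no size or frame-rate information, so if other tools need to consume it, it can be easier to write a Y4M file, which keeps that metadata in its header (a sketch, not from the original answer):
ffmpeg -i in.avi -pix_fmt yuv420p out.y4m
To preview the headerless .yuv itself, the geometry has to be supplied by hand (640x480 is an assumed example):
ffplay -f rawvideo -pixel_format yuv420p -video_size 640x480 out.yuv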
