I have a raw .h264 video with resolution 1920x1080 and I want to resize to 1280x720 in the same format (h264) using ffmpeg. I found examples on doing that but to mp4 (https://askubuntu.com/questions/690015/how-can-i-convert-264-file-to-mp4) but I actually want to resize to .h264 format (no container)
Use the scale filter:
ffmpeg -i input.h264 -vf scale=1280:720 output.h264
If ffmpeg assumes an incorrect frame rate (check the console output), add the -framerate input option:
ffmpeg -framerate 24 -i input.h264 -vf scale=1280:720 output.h264
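Note that scaling forces a re-encode, so you may also want to control the quality of that re-encode. A sketch assuming libx264, which is the encoder ffmpeg normally picks for a .h264 output:
ffmpeg -framerate 24 -i input.h264 -vf scale=1280:720 -c:v libx264 -crf 23 output.h264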
I'm able to play a local YUV file through VLC (as expected)
.\vlc.exe --demux rawvideo --rawvid-fps 25 --rawvid-width 480 --rawvid-height 360 --rawvid-chroma I420 out.yuv
Also FFPLAY is playing well
.\ffplay.exe -f rawvideo -pixel_format yuv420p -video_size 480x360 out.yuv
Conversion to YUV was done with FFMPEG
.\ffmpeg.exe -i "video.mp4" -c:v rawvideo -pixel_format yuv420p out.yuv
However, I'm not able to stream it and play it over the local network.
I know, it sounds crazy :) but I plan to use VLC as a monitor/debugger for some YUV data.
The original MP4 file is streamed/played fine with VLC (test on the same computer)
.\vlc.exe "video.mp4" --sout="#std{access=http, mux=ts, dst=:55555/}"
.\vlc.exe http://192.168.0.174:55555/
So the big question!
How can I stream the YUV format so that VLC recognises and plays it as well?
Nothing I have tried with VLC is playable.
Thanks for any hints!
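A sketch of one possible route, assuming ffmpeg is available on the sending machine: headerless YUV carries no size or framerate information and a TS mux generally cannot carry raw video, so encode it on the fly, stream the result, and open the stream in VLC like any other network source:
ffmpeg -re -f rawvideo -pixel_format yuv420p -video_size 480x360 -framerate 25 -i out.yuv -c:v libx264 -preset ultrafast -tune zerolatency -f mpegts udp://192.168.0.174:55555
.\vlc.exe udp://@:55555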
I need to transcode an mp4 video to raw video frames stored in a file.
I'm trying this command but it fails with Unable to find a suitable output format for 'test.rgba':
ffmpeg -i test.mp4 -vcodec rawvideo -pix_fmt rgba -an test.rgba
The problem is that ffmpeg can't guess an output format from the .rgba extension, so you have to force the rawvideo muxer with -f rawvideo:
ffmpeg -i test.mp4 -vf format=rgba -f rawvideo test.rgba
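If you prefer the flags from the original attempt, this should be equivalent; -an drops the audio as before, and -f rawvideo is still the piece that was missing:
ffmpeg -i test.mp4 -c:v rawvideo -pix_fmt rgba -an -f rawvideo test.rgba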
I have a question about avconv (or ffmpeg) usage.
My goal is to capture video from a webcam and save it to a file.
I also don't want to use too much CPU processing (I don't want avconv to scale or re-encode the stream).
So I was thinking of using the compressed MJPEG video stream from the webcam and saving it directly to a file.
My webcam is a Microsoft LifeCam HD 3000 and its capabilities are:
ffmpeg -f v4l2 -list_formats all -i /dev/video0
Raw: yuyv422 : YUV 4:2:2 (YUYV) : 640x480 1280x720 960x544 800x448 640x360 424x240 352x288 320x240 800x600 176x144 160x120 1280x800
Compressed: mjpeg : MJPEG : 640x480 1280x720 960x544 800x448 640x360 800x600 416x240 352x288 176x144 320x240 160x120
What would be the avconv command to save the compressed stream directly, without avconv doing any scaling or re-encoding?
For now, I am using this command:
avconv -f video4linux2 -r 30 -s 320x240 -i /dev/video0 test.avi
I'm not sure that this command is CPU-efficient, since I don't specify anywhere that it should use the webcam's compressed MJPEG capability.
Does avconv take care of configuring the webcam before it starts recording the file? Or does it always work off the raw stream and do the scaling and encoding on that raw stream?
Thanks for your answer
Reading the actual documentation™ is the closest thing to magic you'll get in real life:
video4linux2, v4l2
input_format
Set the preferred pixel format (for raw video) or a codec name. This option allows one to select the input format, when several are available.
video_size
Set the video frame size. The argument must be a string in the form WIDTHxHEIGHT or a valid size abbreviation.
The command uses -c:v copy to copy the received encoding without touching it, thereby achieving the lowest resource use:
ffmpeg -f video4linux2 -input_format mjpeg -video_size 640x480 -i /dev/video0 -c:v copy <output>
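For example, to record the camera's 1280x720 MJPEG mode into a file (the Matroska container here is just one choice; AVI would also hold MJPEG):
ffmpeg -f video4linux2 -input_format mjpeg -video_size 1280x720 -i /dev/video0 -c:v copy output.mkv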
I have a camera-like device that produces a video stream and passes it to my Windows-based machine via a USB port.
Using the command:
ffmpeg -y -f vfwcap -i list
I see that (as expected) FFmpeg finds the input stream as stream #0.
Using the command:
ffmpeg -y -f vfwcap -r 25 -i 0 c:\out.mp4
I can successfully save the input stream into the file.
From the log I see:
Stream #0:0: Video: rawvideo (UYVY / 0x59565955), uyvy422, 240x320, 25 tbr, 1k tbn, 25 tbc
No pixel format specified, yuv422p for H.264 encoding chosen.
So, my input format is transcoded to yuv422p.
My question:
How can I cause FFmpeg to save my input video stream into out.mp4 WITHOUT transcoding - actually, to copy the input stream to the output file as closely as possible, in the same format?
How can I cause ffmpeg to save my input videostream into out.mp4 WITHOUT transcoding
You cannot. You can stream copy the rawvideo from vfwcap, but the MP4 container format does not support rawvideo. You have several options:
Use a different output container format.
Stream copy to rawvideo then encode.
Use a lossless encoder (and optionally re-encode it after capturing).
Use a different output container format
This meets your requirement of saving your input without re-encoding.
ffmpeg -f vfwcap -i 0 -codec:v copy rawvideo.nut
rawvideo creates huge file sizes.
Stream copy to rawvideo then encode
This is the same as above, but the rawvideo is then encoded to a more common format.
ffmpeg -f vfwcap -i 0 -codec:v copy rawvideo.nut
ffmpeg -i rawvideo.nut -codec:v libx264 -crf 23 -preset medium -pix_fmt yuv420p -movflags +faststart output.mp4
See the FFmpeg and x264 Encoding Guide for more information about -crf, -preset, and additional detailed information on creating H.264 video.
-pix_fmt yuv420p will use a pixel format that is compatible with dumb players like QuickTime. Refer to colorspace and chroma subsampling for more info.
-movflags +faststart relocates the moov atom which allows the video to begin playback before it is completely downloaded by the client. Useful if you are hosting the video and users will view it in their browser.
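If you already have a finished MP4 and only want to move the moov atom to the front, a stream-copy remux is enough (a minimal sketch; the output name is arbitrary):
ffmpeg -i output.mp4 -c copy -movflags +faststart faststart.mp4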
Use a lossless encoder
Using huffyuv:
ffmpeg -f vfwcap -i 0 -codec:v huffyuv lossless.mkv
Using lossless H.264:
ffmpeg -f vfwcap -i 0 -codec:v libx264 -qp 0 lossless.mp4
Lossless files can be huge, but not as big as rawvideo.
Re-encoding the lossless output is the same as re-encoding the rawvideo.
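For example, the huffyuv capture above could be turned into a broadly compatible MP4 with the same options as before (adjust -crf and -preset to taste):
ffmpeg -i lossless.mkv -c:v libx264 -crf 23 -preset medium -pix_fmt yuv420p -movflags +faststart output.mp4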
I have a raw YUV video file that I want to do some basic editing on in Adobe Premiere CS6, but it won't recognize the file. I thought of using ffmpeg to convert it to something Premiere would accept, but I want this to be lossless because afterwards I will need it in YUV format again. I thought of AVI, MOV, and ProRes, but I can't seem to figure out the proper ffmpeg command line and how to ensure it is lossless.
Thanks for your help.
Yes, this is possible. It is normal that you can't open that raw video file, since it is just raw data in one giant file, without any headers. So Adobe Premiere doesn't know what the frame size is, what the framerate is, and so on.
First make sure you have downloaded the FFmpeg command-line tool. After installing it you can start converting by running a command with the right parameters. There are a few parameters you have to fill in yourself before you start converting:
Which YUV pixel format are you using? The most common format is YUV 4:2:0 planar 8-bit (yuv420p). You can run ffmpeg -pix_fmts to get a list of all available formats.
What is the framerate? In my example I will use -r 25 fps.
What encoder do you want to use? The libx264 (H.264) encoder is a great one for lossless compression.
What is your frame size? In my example I will use -s 1920x1080
Then we get this command to do your compression.
ffmpeg -f rawvideo -vcodec rawvideo -s 1920x1080 -r 25 -pix_fmt yuv420p -i inputfile.yuv -c:v libx264 -preset ultrafast -qp 0 output.mp4
A little explanation of all other parameters:
With -f rawvideo you tell ffmpeg that the input is raw video without a container
With -vcodec rawvideo you tell ffmpeg that the input data is uncompressed
With -i inputfile.yuv you set your input file
With -c:v libx264 you set the encoder to libx264 (H.264).
The -preset ultrafast setting only speeds up the compression, so your file will be bigger than with -preset veryslow.
With -qp 0 you set lossless quality (0 is best, 51 is worst).
Then output.mp4 is your new container to store your data in.
After you are done in Adobe Premiere, you can convert it back to a YUV file by reversing almost all of the parameters. FFmpeg recognizes what's inside the mp4 container, so you don't need to provide parameters for the input.
ffmpeg -i input.mp4 -f rawvideo -vcodec rawvideo -pix_fmt yuv420p -s 1920x1080 -r 25 rawvideo.yuv
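If you want to verify that the MP4 really is lossless before taking it into Premiere, one way is to compare per-frame hashes of the raw input and the decoded MP4 with ffmpeg's framemd5 muxer; if every hash value matches, the decoded frames are bit-identical (the timestamp columns may differ because the two inputs use different timebases):
ffmpeg -f rawvideo -vcodec rawvideo -s 1920x1080 -r 25 -pix_fmt yuv420p -i inputfile.yuv -f framemd5 original.framemd5
ffmpeg -i output.mp4 -f framemd5 encoded.framemd5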