I need some help transcoding a file with ffmpeg.
I'm trying to transcode a progressive file to an interlaced DNxHD 185 using different filters.
The goal is to achieve better interlacing for a couple of problematic shots, e.g. zooms or drone footage.
Can someone help with the proper command?
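The kind of command I imagine needing looks roughly like this, but I'm not sure the filter chain is right (tinterlace/fieldorder and the interlaced-encoding flags are guesses on my part, and this assumes a 50p source so pairs of frames can be woven into fields):
ffmpeg -i input.mov -vf "tinterlace=interleave_top,fieldorder=tff" -pix_fmt yuv422p -c:v dnxhd -b:v 185M -flags +ildct+ilme -c:a copy output.mov
As far as I understand, the bitrate/resolution/field-rate combination has to match one of the DNxHD profiles, so the 185M may need adjusting for other frame sizes or rates.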
Best,
Moritz
I am trying to generate a video with libavformat/libavcodec from a bunch of images that are in memory.
Can someone point me in the right direction, please?
Thanks in advance.
First, the basics of creating a video from images with FFmpeg are explained here.
If you simply want to change/force the format and codec of your video, here is a good start.
For the raw FFmpeg documentation you can use Video and Audio Format Conversion, the Codec Documentation, the Format Documentation, and the image2 demuxer documentation (this demuxer handles images as input).
If you just want to take images and make a simple video out of them, just look at the first two links. FFmpeg's documentation gives you powerful tools, but don't use them if you don't need them.
A sample command to create a video from images is:
ffmpeg -i image-%03d.png video.mp4
This will take all the files in the sequence, starting at image-000.png and counting up until the numbering breaks, and make a video out of them.
You can force the format with the extension of the output file. To force the video codec use -c:v followed by a codec name available in the codec documentation.
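For instance, here is a sketch that forces H.264 output (assuming libx264 is available in your build; the frame rate and pixel format are just common choices for broad player compatibility, not requirements):
ffmpeg -framerate 25 -i image-%03d.png -c:v libx264 -pix_fmt yuv420p video.mp4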
I think the problem is relatively common, but I can't find a good solution to it.
Well, I need to create a preview of the video while it uploads. It can be a very big video, so I decided to use only the first 10 MB of the file for frame extraction with ffmpeg.
The command line looks like this:
ffmpeg -ss 00:00:00 -i "src.mp4" -frames:v 1 -q:v 2 "preview.jpg"
It works fine for lots of video files, but for mp4 it always fails with the error message "moov atom not found". I guess the mp4 format is not streamable, and the cut-off file looks like a broken video to FFmpeg.
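From what I've read, re-muxing a file like this is supposed to move the moov atom to the front:
ffmpeg -i "src.mp4" -c copy -movflags +faststart "src-faststart.mp4"
but of course I can't re-mux the users' files before they upload them.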
But there should be a solution to this. Could you help me?
I'm looking for a script that can convert a video into two formats for my website:
MP4 and WebM.
I also want it to create a JPEG of the first frame and output everything at 640x360.
I'm a beginner with ffmpeg, so I don't really know where to start. This is what I have for the moment, but it doesn't work:
ffmpeg -i /tmp/video.off /tmp/video.webm /tmp/video.mp4
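From the examples I've found, I'm guessing it would need to be something more like this for each file (untested, and the codec names and scale filter are my guesses):
ffmpeg -i /tmp/video.off -vf scale=640:360 -c:v libx264 /tmp/video.mp4
ffmpeg -i /tmp/video.off -vf scale=640:360 -c:v libvpx -c:a libvorbis /tmp/video.webm
ffmpeg -i /tmp/video.off -vf scale=640:360 -frames:v 1 /tmp/poster.jpg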
The ideal situation would be a drag-and-drop conversion tool, but a folder-based one would do the trick too.
Thank you
Mencoder has a lovely option for converting an mjpeg file into an avi file with an 'MJPG' codec that plays in VLC.
The command line to do this is:
mencoder filename.mjpeg -oac copy -ovc copy -o outputfile.avi -speed 0.3
where 0.3 is the ratio of the desired playback framerate to the default 25 fps. All this does is copy the mjpeg data, put an avi header on top, and append what seems to be an index of the frame positions at the end of the file.
I want to replicate this in my own code, but I can't find documentation anywhere. What is the exact format of the index section? The header has extra filler bytes in it for some reason - what's that about?
Anyone know where I can find documentation? Both mencoder and vlc seem to have this codec built in.
After much work, study and fiddling around with HxD and RiffPad, I finally figured it out. It would take a long blog entry to explain it all, but basically there isn't really an 'MJPG' codec out there - mjpg just uses a few tricks and unusual parts of the avi standard to produce an indexed file.
The key is to place a '00dc' tag and an Int32 length, 8 bytes in total, in front of each JPEG open tag. If you want the avi to be random access, then you also need an index at the end which points to each of the '00dc' tag positions.
VLC will play this natively. If you have ffmpeg installed, then Windows Media Player uses that to decode these types of mjpg files.
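Incidentally, ffmpeg's own AVI muxer writes the same kind of '00dc' chunks plus an idx1 index when you stream-copy the MJPEG data, so a file produced like this can be a handy reference to compare against your own output in a hex editor (the -framerate value is just my attempt to mirror mencoder's -speed 0.3, i.e. 25 x 0.3 = 7.5 fps; drop it if your build's mjpeg demuxer doesn't accept it):
ffmpeg -framerate 7.5 -i filename.mjpeg -c:v copy reference.avi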
I have a C# program generating JPEG images in real time, and I need to (continuously) build a video from those images and stream it (also in real time).
I've used ffmpeg to transcode an input video source and stream it; doesn't ffmpeg have an option to take a set of images (which are continuously being generated) as input and make the video out of them?
Cheers
Actually, I used VLC for the streaming...
Actually, I just found out that I could do:
ffmpeg -f image2 -i img%d.jpg /tmp/a.mpg
But I need to tell ffmpeg to keep going; I mean, if it doesn't find another image, ffmpeg should wait for another one to be generated... is this possible?
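From what I can tell, one way around this might be to feed the JPEGs into ffmpeg's stdin through a pipe instead of letting it scan files on disk, so it simply blocks until my program writes the next frame. Something along these lines (the frame rate, output codec and UDP target are placeholders I haven't verified):
ffmpeg -f image2pipe -framerate 25 -c:v mjpeg -i - -c:v libx264 -f mpegts udp://127.0.0.1:1234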