FFmpeg can't find libvorbis under Windows

I am trying to use FFmpeg to generate MKV video. By default, FFmpeg will use h264 and libvorbis for that container. But when I use doc/examples/muxing.c from the ffmpeg source folder, there is always an error:
[libvorbis # 002e52a0] Specified sample format s16 is invalid or not supported
Could not open audio codec: Error number -22 occurred
I used the Zeranoe FFmpeg build and it showed this error. I also tried to compile ffmpeg from source under MinGW, enabling libvorbis with the following configuration:
$ ./configure --prefix=/mingw --enable-libvpx --enable-libvorbis --enable-shared --enable-static
Before running make, I also installed libvorbis, libogg, yasm, etc. But the error is still there.
If I use ffmpeg.exe to convert a video to WebM format, it works. The command is like the following:
ffmpeg -i test.h264 -vcodec libvpx -acodec libvorbis output.webm
The generated output.webm can be played by Firefox and other players, so I think the compiled ffmpeg library is OK. But why can't I generate a WebM/MKV file from the muxing.c code?

As can be seen in libvorbisenc.c, the libvorbis encoder supports only the AV_SAMPLE_FMT_FLTP (planar float) input format.
You have to convert the audio data first, for example with the libswresample library that ships with ffmpeg.
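A rough sketch of what that conversion could look like with the libswresample API of that era, assuming stereo interleaved S16 input (the helper name make_s16_to_fltp is just for illustration):

#include <libswresample/swresample.h>
#include <libavutil/opt.h>
#include <libavutil/channel_layout.h>
#include <libavutil/samplefmt.h>

/* Build a resampler that turns interleaved S16 stereo into planar float
 * (FLTP), the only input format the libvorbis encoder accepts. */
static SwrContext *make_s16_to_fltp(int sample_rate)
{
    SwrContext *swr = swr_alloc();
    av_opt_set_int(swr, "in_channel_layout",  AV_CH_LAYOUT_STEREO, 0);
    av_opt_set_int(swr, "out_channel_layout", AV_CH_LAYOUT_STEREO, 0);
    av_opt_set_int(swr, "in_sample_rate",     sample_rate, 0);
    av_opt_set_int(swr, "out_sample_rate",    sample_rate, 0);
    av_opt_set_sample_fmt(swr, "in_sample_fmt",  AV_SAMPLE_FMT_S16,  0);
    av_opt_set_sample_fmt(swr, "out_sample_fmt", AV_SAMPLE_FMT_FLTP, 0);
    swr_init(swr);
    return swr;
}

Each block of S16 samples then goes through swr_convert() into the planar-float frame before it is handed to the encoder, e.g. swr_convert(swr, frame->data, frame->nb_samples, (const uint8_t **)&s16_samples, nb_samples).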

Try compiling the libvorbis package with the following:
LDFLAGS="-static" \
LIBS="-logg" \
./configure \
--prefix=$INSTALL_PREFIX \
--with-gnu-ld \
--with-pic \
--enable-static \
--disable-shared \
--disable-oggtest
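After installing libvorbis that way, rebuild ffmpeg against it. Depending on the ffmpeg version, configure finds libvorbis either through pkg-config or by direct linking, so (assuming $INSTALL_PREFIX is the prefix used above) pointing both at the install location is the safe option:
PKG_CONFIG_PATH=$INSTALL_PREFIX/lib/pkgconfig \
./configure --prefix=/mingw --enable-libvpx --enable-libvorbis --enable-shared --enable-static \
  --extra-cflags=-I$INSTALL_PREFIX/include --extra-ldflags=-L$INSTALL_PREFIX/lib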

How to use hardware acceleration for ffmpeg on the M1 Max?

Since there aren't M1 builds available from ffmpeg.org, I had to compile my own. Obviously, I'd like to get the best possible performance.
Does ffmpeg use "hardware-accelerated H.264" on the M1 Max?
Is there anything I need to do, like compiler flags, to get it?
Any switch at run time?
How can I verify that it's being used?
To compile ffmpeg, I just did the basics:
./configure --prefix=/tmp/ff --enable-gpl --enable-nonfree --enable-libx264
make
make install
For x264, I just did
./configure --prefix=/tmp/ff
make
make install
To run:
ffmpeg -i random.wmv -c:v libx264 -preset ultrafast -c:a aac output-ultra.mp4
Anything else I should be doing?
It looks like what I wanted was VideoToolbox.
Usage is documented here; basically:
To use H.264/HEVC hardware encoding on macOS, just use the encoder -c:v h264_videotoolbox
Example:
ffmpeg -i random.wmv -c:v h264_videotoolbox -c:a aac junk-vt.mp4
It seems to be slightly faster than the "ultrafast" software preset, and the files are much smaller.
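To check that the encoder is present in your build, and to verify it is actually picked up at run time, something like this works:
ffmpeg -encoders | grep videotoolbox
It should list h264_videotoolbox (and hevc_videotoolbox), and during a transcode the "Stream mapping:" lines that ffmpeg prints name the encoder in use.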
h264_videotoolbox is useless on the M1 Pro for me; I don't see the GPU doing any work. Transcoding the same file, h264_videotoolbox only reaches about 6.x speed, while -vcodec h264 on the CPU reaches 12.x.
FFmpeg 5.1.2, macOS 13.1

ffmpeg hwaccel no decoder surfaces left

Recently I natively compiled the latest version of ffmpeg, 4.3, on Windows 10 amd64.
Environment: CUDA 11.0, NASM, VS2019, MSYS2 with mingw64.
I also used the patch https://trac.ffmpeg.org/attachment/ticket/9019/0001-Patch-for-ticket-9019-CUDA-Compile-Broken-Using-MSVC.patch
Compile features were:
--enable-nonfree --enable-cuda-nvcc --enable-libnpp --enable-gpl --enable-libx264 --enable-cuda-llvm --enable-nvenc
--toolchain=msvc --extra-cflags=-I../nv_sdk --extra-ldflags=-libpath:../nv_sdk
I tested ffmpeg for CUDA acceleration. The CPU is an AMD 3500X; the GPU is an RTX 2060 Ultra.
I issued this command:
.\ffmpeg -hwaccel cuvid -i .\a.wmv -c:v hevc_nvenc -bf 4 -preset slow -c:a aac -b:a 256k myvideo.mp4
But I received this error:
[wmv3 # 000002632DFC5180] No decoder surfaces left
Error while decoding stream #0:0: Cannot allocate memory
[hevc_nvenc # 00000263300B1740] Failed locking bitstream buffer: out of memory (10):
video encoding failed: Cannot allocate memory
I'm not sure where I've gone wrong here.
Try adding -extra_hw_frames N to your input and increase N until the error ceases. I just needed 8 myself.
I encountered this same problem on version 4.4 as well. It was reported against 4.1, but only in some cases. Someone suggested the -extra_hw_frames N workaround on https://trac.ffmpeg.org/ticket/7562 and it worked for me.
I also had the same problem as the OP and followed the advice from user "Moby Disk" to use -extra_hw_frames N.
Here is what I used which worked for me:
ffmpeg -y -vsync 0 -hwaccel cuda -hwaccel_output_format cuda -extra_hw_frames 8 -i video_sample.mp4 -c:a copy -c:v h264_nvenc -b:v 5M output.mp4
The GPU I have is:
MSI Nvidia GeForce GT 710 2GB 2GD3
Setting the output format to auto worked for me. The -extra_hw_frames option caused an initialisation error for me (but it still converted the video):
ffmpeg -hwaccel cuda -hwaccel_output_format auto

FMS FLV (Speex) to mp3/mp4/aac/wav

I'm trying to decode an FLV's audio to a playable format. I attempted to use this SO post, "FMS FLV to mp3", as an example, but my FLV's audio is encoded in Speex.
I have compiled ffmpeg with --enable-libspeex on a Fedora 15 machine.
I believe this can be done with ffmpeg but I'm having a hard time figuring out how to do it.
Any thoughts? Thanks
Your ffmpeg needs to be configured with --enable-libspeex to support Speex decoding. Since you did not provide your OS, I cannot give any more specific instructions. Once you have a build of ffmpeg that can decode Speex, the simplest command would be:
ffmpeg -i input.flv output.wav
While re-encoding the FLV file (Speex to MP3), if you get a sample-rate error, try this:
ffmpeg -i c:\in.flv -acodec libmp3lame -ar 44100 -vcodec copy c:\out.flv
It does not matter what your input is. As long as you have the decoder and encoder enabled in your ffmpeg, it will do it.
ffmpeg -i inputfile.flv -acodec libmp3lame any_other_parameters_you_want -vcodec copy out.flv
will do the trick.
Run ffmpeg -codecs to see the codecs supported and ffmpeg -formats to see the formats supported in your install.
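For example, something like
ffmpeg -codecs | grep -i speex
should show a Speex entry if your build can decode it.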

FFmpeg/libavcodec failing for asv1 conversion using PHPVideoToolkit

It seems that I cannot get libavcodec running with my install of ffmpeg. I say "it seems" because the searching I have done based on the following error message has led me to that conclusion:
PHPVideoToolkit Error: Execute error. Output for file "/home/clrock/public_html/drupal-7.14/sites/default/files/img/videos/original/StoryboardMovie.mp4" was found, but the file contained no data. Please check the available codecs compiled with FFmpeg can support this type of conversion. You can check the encode decode availability by inspecting the output array from PHPVideoToolkit::getFFmpegInfo().
The ffmpeg command is
/usr/local/bin/ffmpeg \
-i '/home/clrock/public_html/drupal-7.14/sites/default/files/img/videos/original/StoryboardMovie.mp4' \
-strict experimental -vcodec 'asv1' -s '640x480' -acodec 'aac' -ac '2' \
/tmp/1343067407-500d950fbd290.3gp
I cannot seem to find out how to get ffmpeg to configure with libavcodec. All of the necessary files seem to be there in /usr/src/ffmpeg-0.7.12/libavcodec.
ffmpeg can make mp4 and flv files fine; only when using asv1 does it hang up.
I only needed to change the output settings. For WebM:
vcodec=libvpx acodec=vorbis
For MP4:
vcodec=libx264 acodec=libmp3lame
With MP4, I couldn't use the libx264-hq preset, as something was missing. I'm not sure what, but it's good to know I don't need it.
Thanks.
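For reference, those settings correspond roughly to plain ffmpeg invocations like the following (file names are placeholders, and PHPVideoToolkit adds further options of its own):
ffmpeg -i input.mp4 -vcodec libvpx -acodec vorbis -strict experimental -s 640x480 output.webm
ffmpeg -i input.mp4 -vcodec libx264 -acodec libmp3lame -s 640x480 output.mp4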

transcode and segment with ffmpeg

It appears that ffmpeg now has a segmenter in it, or at least there is a command line option
-f segment
in the documentation.
Does this mean I can use ffmpeg to transcode a video into H.264 in real time and deliver segmented, iOS-compatible .m3u8 streams using ffmpeg alone? If so, what would the command look like to transcode an arbitrary video file into a segmented, iOS-compatible H.264/AAC 640x480 stream?
Absolutely - you can use -f segment to chop the video into pieces and serve it to iOS devices. ffmpeg will create .ts segment files, and you can serve those with any web server.
Working example (with disabled sound) - ffmpeg version N-39494-g41a097a:
./ffmpeg -v 9 -loglevel 99 -re -i sourcefile.avi -an \
-c:v libx264 -b:v 128k -vpre ipod320 \
-flags -global_header -map 0 -f segment -segment_time 4 \
-segment_list test.m3u8 -segment_format mpegts stream%05d.ts
Tips:
make sure you compile ffmpeg from the most recent git checkout
compile with libx264 codec
-map 0 is needed
How I compiled FFmpeg, with extra RTMP support to get feeds from Flash Media Server:
export PKG_CONFIG_PATH="/usr/lib/pkgconfig/:../rtmpdump-2.3/librtmp"
./configure --enable-librtmp --enable-libx264 \
--libdir='../x264/:/usr/local/lib:../rtmpdump-2.3' \
--enable-gpl --enable-pthreads --enable-libvpx \
--disable-ffplay --disable-ffserver --disable-shared --enable-debug
This is found in the ffmpeg documentation: https://ffmpeg.org/ffmpeg-formats.html#segment_002c-stream_005fsegment_002c-ssegment
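As a side note, newer ffmpeg builds also include a dedicated hls muxer that writes the .m3u8 playlist and the .ts segments in one step; an (untested) sketch in the same spirit as the command above:
ffmpeg -i sourcefile.avi -c:v libx264 -c:a aac -f hls -hls_time 4 -hls_list_size 0 stream.m3u8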
