How to select configure parameters for custom needs? - ffmpeg

I need to compile a static FFmpeg on macOS and add the build to an Xcode project. The full build downloaded from the official website works, but it is huge, and I only need a few formats for conversion. So I need to compile it myself.
I've tried compiling and it works, but I am not sure how to select the configure parameters.
For instance, I need to convert ogg, flac, opus, and webm files to mp3 with the minimum binary size. My configure parameters:
./configure --enable-ffmpeg --enable-small --enable-static --enable-protocol=file,http,https --enable-libvorbis \
--enable-libopus --disable-ffplay --disable-ffprobe --enable-demuxer=mp3,mp4,webm_dash_manifest,opus,flac,ogg \
--enable-decoder=mp3*,vp*,mpeg4*,opus,flac --enable-libmp3lame --disable-autodetect --disable-network --enable-pthreads
But it doesn't seem to work; I can't convert files. The error is dyld: Library not loaded: /usr/local/opt/lame/lib/libmp3lame.0.dylib, even though I used the --enable-static parameter.
So what should I do? If I need to support converting a format, which aspects do I need to care about? Thanks

--enable-static applies to the ffmpeg libraries themselves, not to their dependencies: the dyld error shows your binary still loads the shared libmp3lame from Homebrew (/usr/local/opt/lame). You need to download and compile lame as a static library as well, as sketched below.
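A minimal sketch, assuming the lame 3.100 release and the default /usr/local prefix (adjust the version and prefix to your setup):

curl -LO https://downloads.sourceforge.net/project/lame/lame/3.100/lame-3.100.tar.gz
tar xzf lame-3.100.tar.gz
cd lame-3.100
# Build only the static archive so ffmpeg cannot pick up a dylib.
./configure --enable-static --disable-shared --prefix=/usr/local
make
sudo make install

After relinking ffmpeg, otool -L ./ffmpeg should no longer list any libmp3lame dylib.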

Related

Hardware acceleration RPI4 FFmpeg

I'm trying to get hardware acceleration working on my Raspberry Pi 4 (64-bit). I'm using FFmpeg, and AFAIK hwaccel can be reached by using OpenMAX or V4L2-M2M.
With '--enable-omx' and '--enable-omx-rpi' set for FFmpeg, the build fails with the error 'OMX_Core.h not found'. If I provide the omx headers manually, it compiles, but FFmpeg encoding fails due to missing libraries: bcm_host.so and libopenmaxil.so.
I have tried reverting to userland with DISABLE_VC4GRAPHICS = "1"; it produced bcm_host.so, but not libopenmaxil.so. I have tried different combinations of virtual providers and graphics settings, but without success.
Is it possible to access omx hardware acceleration on the RPI4-64?
Steps to reproduce the issue:
1. Download the latest Poky distro, meta-openembedded, meta-raspberrypi
2. Enable omx, omx-rpi support for FFmpeg
3. Link headers for FFmpeg
4. Build and try to use h264_omx
How do I get the missing library libopenmaxil.so and everything else I need for hwaccel?
poky master: commit 5d47cdf448b6cff5bb7cc5b0ba0426b8235ec478
meta-openembedded master: commit daa50331352c1f75da3a8ef6458ae3ddf94ef863
meta-raspberrypi master: commit 8d163dd
BTW, when using V4L2-M2M I'm getting green shadows on the resulting video. Can someone point me in the right direction?
You have to provide some extra flags to point ffmpeg to the right header and library locations, both at compile time and at run time.
This is what I used to cross-compile ffmpeg for AArch64:
./configure \
--arch="${HOST_ARCH}" \
--target-os="linux" \
--prefix="/usr/local" \
--sysroot="${RPI_SYSROOT}" \
--enable-cross-compile \
--cross-prefix="${HOST_TRIPLE}-" \
--toolchain=hardened \
--enable-gpl --enable-nonfree \
--enable-avresample \
--enable-libvpx --enable-libx264 --enable-libxvid \
--enable-omx --enable-omx-rpi --enable-mmal --enable-neon \
--enable-shared \
--disable-static \
--disable-doc \
--extra-cflags="$(pkg-config --cflags mmal) \
-I${RPI_SYSROOT}/usr/local/include \
-I${RPI_SYSROOT}/opt/vc/include/IL" \
--extra-ldflags="$(pkg-config --libs-only-L mmal) \
-Wl,-rpath-link,${RPI_SYSROOT}/opt/vc/lib \
-Wl,-rpath,/opt/vc/lib"
Note that pkg-config is configured for cross-compilation as well: it looks in the Raspberry Pi sysroot, not in the build machine's root. This is done by setting the right environment variables before running configure, as sketched below.
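A sketch of that environment, assuming the same ${RPI_SYSROOT} as above (adjust the pkgconfig directories to your sysroot layout):

export PKG_CONFIG_SYSROOT_DIR="${RPI_SYSROOT}"
export PKG_CONFIG_LIBDIR="${RPI_SYSROOT}/usr/lib/pkgconfig:${RPI_SYSROOT}/usr/local/lib/pkgconfig:${RPI_SYSROOT}/usr/share/pkgconfig"
unset PKG_CONFIG_PATH   # keep build-machine paths out of the search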
The -I flags specify the include paths, and the -L flags returned by pkg-config --libs-only-L are the library paths. -Wl passes a comma-separated list of arguments to the linker: -rpath-link is used to find shared libraries required by other shared libraries at link time, and -rpath is used to find the libraries at run time. This is required because the userland libraries are in a nonstandard location; ld will not search /opt/vc/lib by default.
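To verify that the rpath and library references ended up in the binary, readelf from the cross toolchain can dump the dynamic section (${HOST_TRIPLE} as in the configure call above; the output binary name is assumed):

${HOST_TRIPLE}-readelf -d ffmpeg | grep -E 'RPATH|RUNPATH|NEEDED'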
You can find the toolchains, Dockerfiles and install scripts I used on my GitHub: https://github.com/tttapa/RPi-Cpp-Toolchain/tree/master/toolchain/docker/rpi3/aarch64/aarch64-cross-build
The userland script is here: https://github.com/tttapa/RPi-Cpp-Toolchain/blob/76ac03741bc7b7da106ae89884c7bada96768a07/toolchain/docker/rpi3/aarch64/aarch64-cross-build/install-scripts/userland.sh
And the ffmpeg script is here: https://github.com/tttapa/RPi-Cpp-Toolchain/blob/76ac03741bc7b7da106ae89884c7bada96768a07/toolchain/docker/rpi3/aarch64/aarch64-cross-build/install-scripts/ffmpeg.sh
There's some more documentation about the compilation process and the files used in the repository here (though not specifically about ffmpeg).

ffmpeg build on mac with videotoolbox enabled becomes unportable

If I configure ffmpeg this way:
./configure --disable-everything --enable-static --disable-shared \
--enable-gpl --enable-nonfree --enable-encoder=h264_videotoolbox,aac \
--enable-muxer=mp4 --enable-protocol=file --enable-libfdk-aac \
--enable-videotoolbox --disable-autodetect
it works for my purposes (it lets me encode h264 video with aac audio through VideoToolbox, Apple's hardware codec framework), but if I send it to any computer other than the one it was built on, it fails with something like this:
dyld: Symbol not found: _kCVImageBufferTransferFunction_ITU_R_2100_HLG
Referenced from: /Users/admin/Downloads/./ffmpeg
Expected in: /System/Library/Frameworks/CoreVideo.framework/Versions/A/CoreVideo
in /Users/admin/Downloads/./ffmpeg
Abort trap: 6
If I rebuild it this way:
./configure --disable-everything --enable-static --disable-shared \
--enable-gpl --enable-nonfree --enable-encoder=aac \
--enable-muxer=mp4 --enable-protocol=file --enable-libfdk-aac \
--disable-autodetect
so with everything else kept but videotoolbox removed, it runs successfully on any other computer. Apparently, for videotoolbox to work, ffmpeg needs to carry along something that it currently doesn't...
I am actually building a C++ app with ffmpeg's static libraries, but explaining what I do there would be a very long story, and the error message produced is exactly the same when I run it on different machines, so I'd rather illustrate it with the ffmpeg console utility itself.
What configure switches do I need to make the ffmpeg build portable, please?
The problem turned out to be my macOS version (10.14): the symbol in the error has only been available since 10.13, so the binary didn't run on the earlier versions I tried. Fixed by rebuilding ffmpeg on 10.10.
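An alternative to keeping an old build machine around may be to set a deployment target so that availability-guarded symbols like the one above are weak-linked rather than hard-required. A sketch, untested for this particular case and assuming a 10.10 target:

./configure --disable-everything --enable-static --disable-shared \
--enable-gpl --enable-nonfree --enable-encoder=h264_videotoolbox,aac \
--enable-muxer=mp4 --enable-protocol=file --enable-libfdk-aac \
--enable-videotoolbox --disable-autodetect \
--extra-cflags="-mmacosx-version-min=10.10" \
--extra-ldflags="-mmacosx-version-min=10.10"

Whether ffmpeg then avoids touching that symbol at run time on older systems is a separate question, so test on the oldest macOS you need to support.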

https support for ffmpeg on CentOS?

I have installed ffmpeg on CentOS, but when I feed it an https URL like
ffmpeg -i https://s3-us-west-2.amazonaws.com/bucket/check.mp4 video.mp4
this error comes up:
https protocol not found, recompile FFmpeg with openssl, gnutls, or securetransport enabled.
I know I have to enable --enable-openssl, but when I do this:
PKG_CONFIG_PATH="/ffmpeg_build/lib/pkgconfig" \
./configure --prefix="$HOME/ffmpeg_build" --extra-cflags="-I$HOME/ffmpeg_build/include" --extra-ldflags="-L$HOME/ffmpeg_build/lib -ldl" --bindir="$HOME/bin" --pkg-config-flags="--static" --enable-gpl --enable-nonfree --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-openssl
it gives me an error like:
ERROR: opus not found
What should I do to enable https? Please help
The guide halfelf referred to, FFmpeg Wiki: Compile ffmpeg on CentOS, just got some cleanups, so try again. It will probably be easiest to start over:
1. Install gnutls-devel or openssl-devel.
2. Remove old junk: rm -rf ~/ffmpeg_build ~/ffmpeg_sources ~/bin/{ffmpeg,ffprobe,ffserver,lame,vsyasm,x264,yasm,ytasm}
3. Re-run the guide.
4. Stop at the FFmpeg section, add --enable-gnutls or --enable-openssl to the ./configure line (see the sketch below), then continue following the guide.
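A minimal sketch of step 4, assuming openssl-devel and the paths used in the guide (keep the --enable-lib* flags from your original line; they are omitted here for brevity):

sudo yum install openssl-devel
cd ~/ffmpeg_sources/ffmpeg
PKG_CONFIG_PATH="$HOME/ffmpeg_build/lib/pkgconfig" ./configure \
  --prefix="$HOME/ffmpeg_build" \
  --extra-cflags="-I$HOME/ffmpeg_build/include" \
  --extra-ldflags="-L$HOME/ffmpeg_build/lib -ldl" \
  --bindir="$HOME/bin" \
  --pkg-config-flags="--static" \
  --enable-gpl --enable-nonfree \
  --enable-openssl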
Or forget compiling and just download a static build of ffmpeg: it has HTTPS support.

ERROR: libmp3lame >= 3.98.3 not found

I am installing the ffmpeg utility, but I am facing a libmp3lame >= 3.98.3 not found error. I am able to find lame-3.99.5-1.el6.rf.x86_64.rpm and lame-libs-3.98.4-1.el6.nux.x86_64.rpm, but installing these does not solve the problem. I am not able to find a libmp3lame rpm to install.
Can anyone help me here?
[root@sdp-dev-03:/opt/ffmpeg] # ./configure --prefix="$HOME/ffmpeg_build" --extra-cflags="-I$HOME/ffmpeg_build/include" --extra-ldflags="-L$HOME/ffmpeg_build/lib" --bindir="$HOME/bin" --extra-libs=-ldl --enable-version3 --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libvpx --enable-libfaac --enable-libmp3lame --enable-libtheora --enable-libvorbis --enable-libx264 --enable-libvo-aacenc --enable-libxvid --disable-ffplay --enable-gpl --enable-postproc --enable-nonfree --enable-avfilter --enable-pthreads
ERROR: libmp3lame >= 3.98.3 not found
If you think configure made a mistake, make sure you are using the latest
version from Git. If the latest version fails, report the problem to the
ffmpeg-user@ffmpeg.org mailing list or IRC #ffmpeg on irc.freenode.net.
Include the log file "config.log" produced by configure as this will help
solve the problem.
What worked for me was building lame from source. Download lame from here: https://sourceforge.net/projects/lame/files/lame/3.99/, then extract and install:
tar -zxvf lame-3.99.5.tar.gz
cd lame-3.99.5
./configure
make
sudo make install
Check to see where libmp3lame.a is:
locate libmp3lame.a
It's probably in /usr/local/lib.
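If locate is unavailable or its database is stale, a find over the default prefix works too (a sketch, assuming the /usr/local install above):

sudo find /usr/local -name 'libmp3lame.*'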
Now when you go to configure ffmpeg, try adding that path to the end of your ./configure string. For me it made the difference, e.g.:
--extra-ldflags=-L/usr/local/lib
For configure troubleshooting, see ffbuild/config.log in the ffmpeg source directory.
In my case it had missing references to libm (math library) functions, even though -lm was set in host_extralibs.
As a quick fix, add -lm to the libmp3lame check in the configure script:
enabled libmp3lame && require "libmp3lame >= 3.98.3" lame/lame.h lame_set_VBR_quality -lmp3lame -lm
I just experienced this problem. I had lame v3.99.5 installed, but ffmpeg configure was giving ERROR: libmp3lame >= 3.98.3 not found.
In addition to --extra-ldflags, I had to specify --extra-cflags. So, the configure line was:
./configure [...] --enable-libmp3lame [...] --extra-ldflags=-L/usr/local/lib --extra-cflags=-I/usr/local/include
On Ubuntu 16.04:
sudo apt-get install yasm libmp3lame-dev
Then configure ffmpeg to build from source with libmp3lame:
./configure --enable-gpl --enable-libmp3lame --enable-shared
In my case the solution for ffmpeg/3.1.3 (based on https://github.com/Homebrew/legacy-homebrew/issues/44489) was to add:
--host-ldflags=-L/usr/local/lib
to the configure string.
This is my way: install X11, then go to the ffmpeg path and point pkg-config at the library directories in the Terminal:
export PKG_CONFIG_PATH=/usr/local/lib/pkgconfig:/usr/lib/pkgconfig:/usr/X11/lib/pkgconfig
then run ./configure with your options.

How to compile ffmpeg with librtmp on macOS?

I've tried installing ffmpeg on Mac OS X Lion through Homebrew like this:
brew install --use-clang ffmpeg --with-tools --with-ffplay --enable-librtmp
but the compile flags ended up as
configuration: --disable-debug --prefix=/usr/local/Cellar/ffmpeg/0.6.2 --enable-shared --enable-pthreads --enable-nonfree --enable-gpl --disable-indev=jack --enable-libx264 --enable-libfaac --enable-libmp3lame --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libxvid --enable-libfaad
What's the correct way to install ffmpeg with librtmp? I also want to run libavfilter filters like blackdetect.
A direct install always failed with RTMP_Socket() missing, so I was hoping for a Homebrew solution.
You can include librtmp in the build by adding the --with-rtmpdump flag (after installing RTMPDump). These are the commands I used:
brew install rtmpdump
brew install ffmpeg --with-x265 --with-rtmpdump
Now I have rtmp, rtmpe, rtmps, rtmpt and rtmpte as available protocols! :)
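A quick way to confirm, using ffmpeg's built-in protocol listing:

ffmpeg -protocols | grep rtmp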
Finally able to compile! I think the key might have been running
export PKG_CONFIG_PATH=$PKG_CONFIG_PATH:/usr/local/lib/pkgconfig
just before configure, but I'm not sure, since I was getting different errors depending on the flags. The script I ended up running (modified from other sources) is on GitHub: https://gist.github.com/2863964
