1) I need gcc-4.1 for Matlab mex usage, but I can't get it installed fully with apt-get install:
The following packages have unmet dependencies:
libstdc++6-4.1-dev : Depends: gcc-4.1-base (= 4.1.2-27ubuntu1) but 4.1.2-29ubuntu1 is to be installed
Depends: g++-4.1 (= 4.1.2-27ubuntu1) but it is not going to be installed
E: Broken packages
2) I now only have gcc-4.1-base and -multilib installed. When compiling a mex file:
/usr/bin/ld: cannot find -lstdc++
collect2: ld returned 1 exit status
Something is wrong with libstdc++6-4.1-dev.
So, is there any easier fix than compiling it myself?
Thanks
I assume you use the x64 version of Ubuntu and your Matlab is also 64-bit. There are two ways that may solve the problem you mention in 2):
Open mexopts.sh (located in the yourhome/.matlab/MATLAB VERSION/ directory) and comment out the CLIBS="$CLIBS -lstdc++" line in the glnxa64 section.
Check whether libstdc++.so exists in the /usr/lib directory. If not, create a symbolic link /usr/lib/libstdc++.so pointing to MATLABROOT/sys/os/glnxa64/libstdc++.so.6.0.xx (xx is a number that may change with the Matlab version).
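For reference, a minimal sketch of both fixes (the CLIBS line and the 6.0.xx suffix vary between Matlab versions, so treat these as placeholders and adjust them to what you actually see on your system):
# fix 1: in yourhome/.matlab/MATLAB VERSION/mexopts.sh, glnxa64 section, comment out the line
#   CLIBS="$CLIBS -lstdc++"
# fix 2: point /usr/lib/libstdc++.so at Matlab's own copy
sudo ln -s MATLABROOT/sys/os/glnxa64/libstdc++.so.6.0.xx /usr/lib/libstdc++.so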
I wouldn't compile it myself. I remember how long that takes (it's one of the longest parts of building any Linux system)...
So I presume you don't have a fully functional GCC right now? I got this to install from apt-get in Ubuntu 10.10 x64...
Okay, so you have broken dependencies, eh? I know this is not elegant, but try downloading the deb files manually (http://packages.ubuntu.com/maverick/gcc-4.1 for 10.10 or http://packages.ubuntu.com/lucid/gcc-4.1 for 10.04), save them to a folder, cd into the folder from Terminal, and run this for each package:
dpkg -i package.deb
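If you want to install everything you downloaded in one go, something like this usually works too (dpkg unpacks all the .deb files in the current folder before configuring them; if it still complains about dependency ordering, running sudo apt-get -f install afterwards typically sorts it out):
sudo dpkg -i *.deb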
There is a more elegant way to do this, but I just don't know it...
Related
It installed about 70% of the dependencies it needed to get ffmpeg going, but it got stuck at installing 'doxygen'.
This is the error I got:
==> Installing dependencies for ffmpeg: doxygen, little-cms2, openjpeg, opus, rust, libgit2, cargo-c, rav1e, flac, libsndfile, libsamplerate, rubberband, sdl2, swig, llvm, snappy, speex, srt, leptonica, libb2, lz4, libarchive, tesseract, theora, x264, x265, xvid, docbook, boost, source-highlight, asciidoc, docbook-xsl, libyaml, ruby, asciidoctor, gnu-getopt, xmlto, libsodium, zeromq and zimg
==> Installing ffmpeg dependency: doxygen
==> cmake ..
==> make
Last 15 lines from /Users/macbook/Library/Logs/Homebrew/doxygen/02.make:
In file included from /tmp/doxygen-20220723-61533-5m5mdv/doxygen-1.9.4/src/outputlist.h:25:
/tmp/doxygen-20220723-61533-5m5mdv/doxygen-1.9.4/src/searchindex.h:29:10: fatal error: 'variant' file not found
#include <variant>
^~~~~~~~~
1 error generated.
make[2]: *** [src/CMakeFiles/doxymain.dir/__/generated_src/code.cpp.o] Error 1
make[1]: *** [src/CMakeFiles/doxymain.dir/all] Error 2
make[1]: *** Waiting for unfinished jobs....
[ 46%] Linking CXX static library ../lib/libvhdlparser.a
cd /tmp/doxygen-20220723-61533-5m5mdv/doxygen-1.9.4/build/vhdlparser && /usr/local/Cellar/cmake/3.23.2/bin/cmake -P CMakeFiles/vhdlparser.dir/cmake_clean_target.cmake
cd /tmp/doxygen-20220723-61533-5m5mdv/doxygen-1.9.4/build/vhdlparser && /usr/local/Cellar/cmake/3.23.2/bin/cmake -E cmake_link_script CMakeFiles/vhdlparser.dir/link.txt --verbose=1
/usr/bin/ar qc ../lib/libvhdlparser.a CMakeFiles/vhdlparser.dir/CharStream.cc.o CMakeFiles/vhdlparser.dir/ParseException.cc.o CMakeFiles/vhdlparser.dir/Token.cc.o CMakeFiles/vhdlparser.dir/TokenMgrError.cc.o CMakeFiles/vhdlparser.dir/__/generated_src/VhdlParser_adj.cc.o CMakeFiles/vhdlparser.dir/VhdlParserTokenManager.cc.o
/usr/bin/ranlib ../lib/libvhdlparser.a
[ 46%] Built target vhdlparser
make: *** [all] Error 2
Do not report this issue to Homebrew/brew or Homebrew/core!
Error: You are using macOS 10.12.
We (and Apple) do not provide support for this old version.
You will encounter build failures with some formulae.
Please create pull requests instead of asking for help on Homebrew's GitHub,
Twitter or any other official channels. You are responsible for resolving
any issues you experience while you are running this
old version.
Can someone please help me? I am not sure what I am supposed to do.
Looks like we're in the same boat. Fighting the "good fight", as it were. We should probably both just bail to linux or even windows, because god knows they are going to keep making this hard for us....
However, probably like yourself, a combination of inertia and "good reasons" keeps us staying the course ;)
Basically, the issue is described correctly by the others here (lack of C++17 support), and in my case (and likely yours as well) it arises because gcc is a symlink to clang (9.0 if you are using the latest version compatible with our OS).
Telling brew to use real gcc (I assume you have compiled it already, and if not, gcc 11.3.0 is a dependency of ffmpeg anyhow, so go ahead and build it with brew install gcc) can be done like this:
HOMEBREW_CC=gcc-11 HOMEBREW_CXX=g++-11 brew install doxygen
However, the above didn't work for me because, although it solved the C++17 issue, it exposed another problem :(
It seems that for some reason the minimum macOS version number is messed up (I think because the SDK headers for 10.13 are installed with the latest version of Xcode compatible with our OS), and so the compilation fails. Although there is almost certainly a better/cleaner way to deal with this issue, I solved it by manually compiling and installing doxygen after editing the code.
In the doxygen/filesystem/filesystem.hpp file around line 4491 you need to change the line
#if __MAC_OS_X_VERSION_MIN_REQUIRED < 101300
to
#if __MAC_OS_X_VERSION_MIN_REQUIRED < 111400
Then compile doxygen yourself: extract the source from the brew cache, cd into the cmake build folder, and run
cmake .. -DCMAKE_INSTALL_PREFIX=/usr/local/Cellar/doxygen/1.9.4 -DCMAKE_INSTALL_LIBDIR=lib -DCMAKE_BUILD_TYPE=Release -DCMAKE_FIND_FRAMEWORK=LAST -DCMAKE_VERBOSE_MAKEFILE=ON -Wno-dev -DBUILD_TESTING=OFF -DCMAKE_C_COMPILER=gcc-11 -DCMAKE_CXX_COMPILER=g++-11
make
make install
brew link doxygen
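(If you are not sure where the brew cache lives, brew --cache doxygen prints the path of the downloaded source tarball, which is where I extracted the source from:)
brew --cache doxygen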
Then continue with brew install ffmpeg. I had trouble with nettle as well, and needed to compile and install that one manually too, because the version of libcrypto packaged with macOS (LibreSSL) is incompatible. I had to follow the steps shown in "brew info openssl@1.1" to make sure the compiler used the OpenSSL libcrypto that IS compatible, and I couldn't figure out how to make brew do this for me.
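(For reference, the relevant part of brew info openssl@1.1 on my setup boiled down to exporting flags along these lines; the /usr/local/opt paths are the usual Homebrew locations and may differ on your machine:)
export LDFLAGS="-L/usr/local/opt/openssl@1.1/lib"
export CPPFLAGS="-I/usr/local/opt/openssl@1.1/include"
export PKG_CONFIG_PATH="/usr/local/opt/openssl@1.1/lib/pkgconfig"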
Fun fun fun. I am not all that hopeful that the rest of the compilation will go without issue, but it is chugging away again now. I get the distinct impression that this is SO not worth the hassle/trouble.
Good luck!
** EDIT **
I'm still slogging through it, and it is taking a LONG time. But as I encounter more issues I will try and detail them here with the hope that each issue I encounter is reasonably easy to overcome.
llvm failed to compile with lots of missing header errors. Directing brew to use gcc-11, like I did with doxygen, got it to compile further (27%) but it still failed (it looks like it is being passed clang-specific options as a result of the brew cmake config... I'm still working on this one...)
Conclusion - I gave up. It isn't worth the time in my view.
Instead, I installed MacPorts and used "sudo port install ffmpeg-upstream" to install ffmpeg-5. Unless you have some dying need to compile it yourself, I recommend you do this as well.
I've spent quite some time getting around this issue as well, and I ended up with another solution: I merely edited the formula to suit my needs:
brew edit doxygen
Once there, spot the "def install" block, then edit the file so that it looks like this:
fails_with :clang

def install
  inreplace "CMakeLists.txt", "MACOS_VERSION_MIN 10.14", "MACOS_VERSION_MIN 10.11"
Save, exit.
In short, just add the "fails_with" line and the "inreplace" line. The first one causes brew not to use clang (so you need to have a gcc copy somewhere); this solves the C++17 issue. The second one patches the CMakeLists.txt file to allow cmake to do its magic. Once done, "brew install doxygen" should succeed.
When attempting to compile RNNLib, I got an error at NetcdfDataset.hpp:26:24 saying that netcdfcpp.h could not be found. I looked around and found a bug report from 2011 suggesting this was a bug, but it claimed to have been fixed. I have tried everything I can think of, including rebuilding NetCDF (a dependency of RNNLib) with various different flags, and have been unable to fix this. Can anyone give me a hand?
I had some trouble on a virtual machine building rnnlib.
I had to install the C and C++ version of NetCDF to get it to work.
The C version can be installed via sudo apt-get install libnetcdf-dev
I had to install the C++ version by building it.
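In case it helps, this is roughly what building it looked like for me (a sketch; the tarball URL is the legacy C++ one mentioned in another answer below, so adjust the version to whatever you actually need):
wget ftp://ftp.unidata.ucar.edu/pub/netcdf/netcdf-cxx-4.2.tar.gz
tar xzf netcdf-cxx-4.2.tar.gz
cd netcdf-cxx-4.2
./configure
make
sudo make install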
Hope it will help. It's quite a difficult lib to install.
Maybe this helps someone: you can avoid some of the pain by installing packages from APT and get the correct version mentioned by user3620756, which contains the netcdfcpp.h header file. This happens through a legacy package, available on Ubuntu 16.04 (Xenial universe, see the APT repository).
First install libnetcdf for C, then install libnetcdf-cxx-legacy-dev, which depends on libnetcdf-c++4 and installs the required C++ libraries along the way:
sudo apt install libnetcdf-dev libnetcdf-cxx-legacy-dev
The newest version doesn't have this netcdfcpp.h file anymore.
I had to use ftp://ftp.unidata.ucar.edu/pub/netcdf/netcdf-cxx-4.2.tar.gz to get it working.
I have also followed the same process and it worked for me
"The newest version doesn't have this netcdfcpp.h file anymore. I had to use ftp://ftp.unidata.ucar.edu/pub/netcdf/netcdf-cxx-4.2.tar.gz to get it working."
After downloading the archive, I had to build it from inside the netcdf folder. I used the simple commands:
./configure
make
sudo make install
But in the file named "NetcdfDataset.hpp" I had to give the complete path to the netcdfcpp.h file. In my case the include line is:
#include "/Volumes/Macintosh_HD_2/WordSpottingProj/trunk/CODE C++/rnnlib_source_forge_version/netcdf-cxx-4.2/cxx/netcdfcpp.h"
I had this problem in the context of trying to use a makefile that called for netcdfcpp.h:
$ make -f makefile_MAC
c++ -O2 -o burn7.x burn7.cpp -I/opt/local/include -L/opt/local/lib -lm -lnetcdf_c++
burn7.cpp:31:10: fatal error: 'netcdfcpp.h' file not found
#include <netcdfcpp.h>
^
1 error generated.
make: *** [burn7.x] Error 1
I'm on a Mac, so I used Homebrew to install the NetCDF package, but version 4.3.3.1 didn't appear to have netcdfcpp.h:
brew install homebrew/science/netcdf
However, I found that installing it with an additional flag resulted in this version being included:
brew install homebrew/science/netcdf --with-cxx-compat
I assume the same is true of other installation/compilation methods, and not that this file has been removed from versions after 4.2, as other answers state. Maybe it was a default option before and now it isn't?
I am trying to compile libgphoto2 with libxml2 support, following the guidelines here. Everything is OK until I try to run ./configure:
./configure --prefix=/tmp/gphoto2/local --with-libxml2=yes
That appears to me to be the correct syntax; however, I get this output:
LIBXML2 to support Olympus ..: no
I have checked this in 2 different systems (LinuxMint 11 x64 and Ubuntu 13.04), and I have found the same problem.
Can anyone give me a clue or solution?
Is there any problem with the syntax?
Is there a common problem with the configure --with-PACKAGE[=yes] option?
Is there a common problem with LIBXML2 used in compilation?
Thanks for any help!
This problem appears on Debian Wheezy (Linux debian 3.2.0-4-amd64 #1 SMP Debian 3.2.41-2 x86_64 GNU/Linux) with the latest libgphoto2 release, 2.5.2.
The libxml2-dev package is installed:
Package: libxml2-dev
State: installed
Automatically installed: no
Multi-Arch: same
Version: 2.8.0+dfsg1-7+nmu1
I'm not totally familiar with configure scripts, but the configure.ac file has the line:
AC_CHECK_HEADER(libxml/parser.h,[
which I assume looks for libxml/parser.h.
The libxml2-dev package delivers the file
/usr/include/libxml2/libxml/parser.h
so it looks like libgphoto2 expects the libxml2 headers in a different place.
I tried various solutions, but only the following worked:
as root, I symlinked libxml2 to the place libgphoto2 was looking:
ln -s /usr/include/libxml2/libxml /usr/include/libxml
After compiling libgphoto2 and gphoto2, this enabled gphoto2 to talk to my Olympus E-510.
A bug has been raised on the gphoto SourceForge site (https://sourceforge.net/p/gphoto/bugs/953/) and a patch fix has been provided.
Just found another way. Thanks for your help.
After digging in the config.log file created by ./configure, I found the libxml2 error (which I had wrongly supposed would stop the configure script):
conftest.c:75:27: fatal error: libxml/parser.h: No such file or directory
But I knew libxml2 was there, even though configure couldn't find it! So I checked and found it under
/usr/lib
And I found somewhere else that the libxml2 package comes with a script (xml2-config) that gives library linking information and more:
$ xml2-config --cflags
-I/usr/include/libxml2
Then I just needed to add that output to the CFLAGS environment variable when configuring:
$ CFLAGS="-I/usr/include/libxml2" ./configure --prefix=/tmp/gphoto2/local --with-libxml2=yes
And everything else was just ok!
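A slightly more hands-off variant (just a suggestion, in case the include path differs on your system) is to let xml2-config fill the flag in for you:
CFLAGS="$(xml2-config --cflags)" ./configure --prefix=/tmp/gphoto2/local --with-libxml2=yes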
Usually, a --with-some-package=yes option checks for the existence of header files for some-package on your system. If it doesn't find the required header files, then it still outputs "no" to the terminal. Have you installed your distribution's libxml2-devel (or similarly named) package?
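On Debian/Ubuntu (the systems mentioned in the question) that package is called libxml2-dev, so something like this should cover it:
sudo apt-get install libxml2-dev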
I was following an online tutorial to install some Python modules using homebrew and one step was to install gfortran with brew install gfortran. Later on, I tried using another third-party installation script to install some Python modules and after the fact I realized that part of what the script did was download and run http://r.research.att.com/tools/gcc-42-5666.3-darwin11.pkg. I don't know that much about gfortran, but looking at the brew formula for gfortran it appears that brew uses a different version from the att.com one. Will that lead to problems in the future? I did brew uninstall gfortran and brew install gfortran again, and so far it seems like things are the same (I tried recompiling the old code that I had compiled before), but I am not sure what all the att.com pkg did. (I have OS X 10.8.2 and XCode 4.2 if that matters).
It depends on where the other gfortran installer puts things. In general, no, it won't conflict.* Homebrew is designed to be compatible with third-party gfortran installs - it defines dependencies on a generic "fortran" compiler, and not on the specific gfortran Homebrew formula. Homebrew stays under /usr/local, sticking the main install in /usr/local/Cellar and symlinking judiciously into /usr/local/lib. Other installers that install to /usr/local will just prevent Homebrew from linking its own compiler in, but will work with other formulae that use a fortran compiler. (Assuming the gfortran build options are compatible.)
That particular gcc-42 installer you linked to installs to /usr, not to /usr/local.** And its binaries are suffixed with "-4.2"; that is, it installs cpp-4.2, g++-4.2, gfortran-4.2, and so on. So a) there's no direct conflict with the Homebrew gfortran, and b) it won't directly shadow the /usr/local/bin/gfortran installed by Homebrew.
Which compiler gets picked up by things you build, with both of these gfortrans installed, will depend on how the build works, but most builds look for plain gfortran and so will find the Homebrew one, unless you explicitly direct them to the AT&T-provided one. You don't specify how you're building things, but since you're using brew, I'm assuming it's via brew or command-line tools, in which case I think they're all probably seeing the Homebrew gfortran and ignoring this one. Look at the verbose output from their build processes to find out for sure.
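A quick sanity check, independent of any particular build system, is to see which gfortran a plain invocation resolves to:
which gfortran
gfortran --version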
*Depending, of course, on what you mean by "conflict".
**Side note: AFAIK there's no easy way to figure this out by looking at the package or an installation manager. I just did a find /usr before and after running that installer, and did a diff on the output:
$ diff usr_before_any_installs.txt usr_after_att_install.txt | grep '^[<>]'
> /usr/bin/c++-4.2
> /usr/bin/cpp-4.2
> /usr/bin/g++-4.2
> /usr/bin/gcc-4.2
> /usr/bin/gfortran-4.2
> /usr/bin/i686-apple-darwin11-cpp-4.2.1
> /usr/bin/i686-apple-darwin11-g++-4.2.1
> /usr/bin/i686-apple-darwin11-gcc-4.2.1
> /usr/bin/i686-apple-darwin11-gfortran-4.2.1
> /usr/include/gcc
> /usr/include/gcc/darwin
> /usr/include/gcc/darwin/4.2
...
I have a virtual Debian system which I use to develop. Today I wanted to try llvm/clang. After installing clang I can't compile my old c-projects (with gcc).
This is the error:
/usr/bin/ld: cannot find crt1.o: No such file or directory
/usr/bin/ld: cannot find crti.o: No such file or directory
collect2: ld returned 1 exit status
I uninstalled clang and it still did not work. Does anyone have any idea how I can fix this?
Debian / Ubuntu
The problem is that you likely only have gcc for your current architecture, and that's 64-bit. You need the 32-bit support files. To get them, install:
sudo apt install gcc-multilib
What helped me was to create a symbolic link:
sudo ln -s /usr/lib/x86_64-linux-gnu /usr/lib64
It seems that while you were playing with llvm/clang, you (or the package manager) removed the previously existing standard C library development package (eglibc on Debian), or maybe you never had it installed in the first place. You need to reinstall it now that you have reverted back to gcc.
You can do so like this on Debian:
aptitude install libc-dev
Ubuntu:
apt-get install libc-dev
On Ubuntu, if you don't have libc-dev (I cannot find it on packages.ubuntu.com), you can try installing libc6-dev directly.
Or on Redhat like systems:
yum install glibc-devel
NB: Although you were briefly answered in the comments, here is an answer just so there is one on record, in case someone encounters this and looks for an answer but not in the comments, or finds the comment not explicit enough.
This is a bug reported on Launchpad, but there is a workaround:
Run this to see where these files are located
$ find /usr/ -name crti*
/usr/lib/x86_64-linux-gnu/crti.o
then add this path to LIBRARY_PATH variable
$ export LIBRARY_PATH=/usr/lib/x86_64-linux-gnu:$LIBRARY_PATH
After reading the http://wiki.debian.org/Multiarch/LibraryPathOverview page that jeremiah posted, I found the gcc flag that works without the symlink:
gcc -B/usr/lib/x86_64-linux-gnu hello.c
So, you can just add -B/usr/lib/x86_64-linux-gnu to the CFLAGS variable in your Makefile.
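For example, a minimal Makefile fragment (the path is the one from this answer; substitute whatever directory holds crt1.o on your system):
CFLAGS += -B/usr/lib/x86_64-linux-gnu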
If you're using Debian's Testing version, called 'wheezy', then you may have been bitten by the move to multiarch. More about Debian's multiarch here: http://wiki.debian.org/Multiarch
Basically, what is happening is various architecture specific libraries are being moved from traditional places in the file system to new architecture specific places. This is why /usr/bin/ld is confused.
You will find crt1.o in both /usr/lib64/ and /usr/lib/i386-linux-gnu/ now, and you'll need to tell your toolchain about that. Here is some documentation on how to do that: http://wiki.debian.org/Multiarch/LibraryPathOverview
Note that merely creating a symlink will only give you one architecture and you'd be essentially disabling multiarch. While this may be what you want it might not be the optimal solution.
To get gcc 4.8 on 64-bit RHEL 7 to compile 32-bit programs, you'll need to do two things.
Make sure all the 32-bit gcc 4.8 development tools are completely installed:
sudo yum install glibc-devel.i686 libgcc.i686 libstdc++-devel.i686 ncurses-devel.i686
Compile programs using the -m32 flag
gcc pgm.c -m32 -o pgm
Stolen from here: How to Compile 32-bit Apps on 64-bit RHEL? I only had to do step 1.
As explained in crti.o file missing, it's better to use gcc -print-search-dirs to find out all the search paths. Then create a link, as explained above with sudo ln -s, pointing to the location of crt1.o.
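Concretely, that looks something like this (the x86_64 directory is an assumption based on the other answers here; use whichever directory your own output actually shows):
gcc -print-search-dirs | grep ^libraries
sudo ln -s /usr/lib/x86_64-linux-gnu /usr/lib64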
This worked for me with Ubuntu 16.04
$ export LIBRARY_PATH=/usr/lib/x86_64-linux-gnu
If you are building GCC itself from source, configuring it with
./configure --disable-multilib
works as well.
On Alpine Linux that would mean that you need musl-dev:
apk add musl-dev
Although in my case the messages were:
/usr/lib/gcc/x86_64-alpine-linux-musl/11.2.1/../../../../x86_64-alpine-linux-musl/bin/ld: cannot find Scrt1.o: No such file or directory
/usr/lib/gcc/x86_64-alpine-linux-musl/11.2.1/../../../../x86_64-alpine-linux-musl/bin/ld: cannot find crti.o: No such file or directory
/usr/lib/gcc/x86_64-alpine-linux-musl/11.2.1/../../../../x86_64-alpine-linux-musl/bin/ld: cannot find -lssp_nonshared: No such file or directory
collect2: error: ld returned 1 exit status
Which are also caused by missing musl-dev.
Ran into this on CentOS 5.4. Noticed that lib64 contained the crt*.o files, but lib did not. Installed glibc-devel through yum, which installed the i386 bits, and this resolved my issue.
I also got the same compilation error when I was cross-compiling with i686-cm-linux-gcc.
The compilation option below solved my problem:
$ i686-cm-linux-gcc a.c --sysroot=/opt/toolchain/i686-cm-linux-gcc
Note: the sysroot should point to the compiler directory where usr/include is available.
In my case the toolchain is installed in the /opt/toolchain/i686-cm-linux-gcc directory, and usr/include is available in that same directory.
I solved it as follows:
1) Locate the crt1.o and crti.o files, e.g. with find /usr/ -name crt1.o
On my computer they were under /usr/lib/i386-linux-gnu
2) Add that path to the LIBRARY_PATH environment variable (to see its current value, type the env command in the Terminal):
$ LIBRARY_PATH=/usr/lib/i386-linux-gnu:$LIBRARY_PATH
$ export LIBRARY_PATH
I had the same problem today; I solved it by installing the recommended packages:
libc6-dev-mipsel-cross, libc-dev-mipsel-cross
This worked:
sudo apt-get install libc6-dev-mipsel-cross
One magic command:
sudo apt install build-essential
Fixed everything for me even on Raspberry Pi.
In my case, the crti.o error was caused by the execution path configuration in Matlab.
For instance, you cannot run a file if you have not set the path of its directory beforehand.
To do this: File > Set Path, add your directory, and save.
use gcc -B lib_path_containing_crt?.o
In my case (Ubuntu 16.04) I had no crti.o at all:
$ find /usr/ -name crti*
So I installed the libc6-dev developer package:
sudo apt-get install libc6-dev