Boost dependency in catkin after installing Boost from source

I'm using Ubuntu 20.04 and have installed Boost 1.81.0 from source, but when I try to build a ROS package with CMake, the Boost dependency is resolved from /usr/lib/x86_64-linux-gnu/ with the default Boost version 1.71.0 instead of the installed version 1.81.0 in /usr/local/lib/.
For example, when I build this package I get the following error:
Errors
<< abb_robot_cpp_utilities:
make /home/user/ros1_ws/pacbot_ws/logs/abb_robot_cpp_utilities/build.make.000.log
make[2]: *** No rule to make target
'/usr/lib/x86_64-linux-gnu/libboost_chrono.so.1.71.0',
needed by '/home/user/ros1_ws/pacbot_ws/devel/.private/abb_robot_cpp_utilities/lib/libabb_robot_cpp_utilities.so'.
Stop.
I have /usr/local/lib/ in my library path (export LD_LIBRARY_PATH=/lib:/usr/lib:/usr/local/lib in my ~/.bashrc) and I have already run sudo ldconfig, but neither solved the problem. Can you please tell me how to get the package to compile correctly? Thanks in advance.

Thanks to @Tsyvarev's comments, the solution in my case was to:
Remove all libboost* libraries and Boost folders.
Install Boost 1.71.0 from source, in addition to the sudo apt install libboost-* packages, to fix the /usr/lib/x86_64-linux-gnu/ problem.
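If keeping the source-installed Boost is preferred, another approach (a minimal sketch, not what the accepted fix above does) is to point CMake's FindBoost at the /usr/local prefix explicitly when building the workspace; BOOST_ROOT and Boost_NO_SYSTEM_PATHS are standard FindBoost variables, and the paths below assume Boost 1.81.0 went into /usr/local:
# Prefer the Boost installed under /usr/local over the distribution copy
catkin_make --cmake-args -DBOOST_ROOT=/usr/local -DBoost_NO_SYSTEM_PATHS=ON
# With catkin_tools the equivalent would be:
# catkin config --cmake-args -DBOOST_ROOT=/usr/local -DBoost_NO_SYSTEM_PATHS=ON
# Make sure the runtime linker also sees the new libraries
export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH
Note that every package in the dependency chain has to be rebuilt against the same Boost version, otherwise the mixed-version problem from the question comes back.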

Related

How to make easy_install see libraries installed by homebrew?

My use case: I installed graphviz using Homebrew. It installs its headers and dynamic libraries into /usr/local/Cellar/graphviz, which is great. Then I tried installing pygraphviz with easy_install, but got the following error:
pygraphviz/graphviz_wrap.c:2954:10: fatal error: 'graphviz/cgraph.h' file not found
#include "graphviz/cgraph.h"
I can add the include path manually, but then the linker complains about not finding the dynamic libraries. Again, I can specify the paths manually, but it seems like it should Just Work (TM). So why doesn't easy_install play nicely with Homebrew-installed libraries?
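One common workaround, not specific to graphviz, is to pass the Homebrew prefix to the compiler and linker just for that install; a sketch, assuming brew --prefix resolves the graphviz keg and that pygraphviz's build honors CFLAGS/LDFLAGS:
# Resolve Homebrew's stable prefix for graphviz instead of a versioned Cellar path
GV_PREFIX="$(brew --prefix graphviz)"
# Hand the header and library locations to the build for this one command
CFLAGS="-I${GV_PREFIX}/include" LDFLAGS="-L${GV_PREFIX}/lib" easy_install pygraphviz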

gnu/libtool (libltdl) installed but not found by configure script

I am trying to install Guile locally on a system. It requires GNU Libtool. While installing its dependencies, the "make check" command reported errors while building GNU Libtool, but if I skipped that step and simply ran "make" followed by "make install", it installed successfully. I was able to install the rest of the dependencies without any problem. However, when I run the following command, I get the error below:
Command:
../configure --with-libltdl-prefix=$PREFIX/libtool --with-libgmp-prefix=$PREFIX/gmp --with-libunistring-prefix=$PREFIX/libunistring --with-libiconv-prefix=$PREFIX/libiconv --with-libreadline-prefix=$PREFIX/libreadline --with-libintl-prefix=$PREFIX/gettext --prefix=$PREFIX/guile
Error:
configure: error: GNU libltdl (Libtool) not found, see README.
$PREFIX is defined, and I have installed the libltdl library in the libtool folder. When I look through the include and lib subdirectories of the libtool folder, I can find the libltdl folders and .so files.
So I am unsure why the configure script cannot find the locally installed version of libtool. I would be grateful if someone could point out the problem with the command and how to remedy this error.
I had a similar issue when trying to compile bind9 using distcc under Raspbian. I had previously installed the libtool package, but I was also missing the libtool-bin package. Installing it solved my issue.
Try
apt list libtool* --installed
and see if both show up.
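If libltdl really is installed under the non-standard $PREFIX, another thing worth trying (a sketch based on the command in the question; whether configure picks it up depends on the macros it uses) is to hand the header and library locations to configure directly:
# Point the preprocessor and linker at the locally installed libltdl
CPPFLAGS="-I$PREFIX/libtool/include" \
LDFLAGS="-L$PREFIX/libtool/lib" \
../configure --with-libltdl-prefix=$PREFIX/libtool --prefix=$PREFIX/guile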

Error during RNNLib configuration: netcdfcpp.h cannot be found

When attempting to compile RNNLib, I got an error at NetcdfDataset.hpp:26:24 saying that netcdfcpp.h could not be found. I looked around and found a 2011 bug report about this, which claimed it had been fixed. I have tried everything I can think of, including rebuilding NetCDF (a dependency of RNNLib) with various different flags, and have been unable to fix this. Can anyone give me a hand?
I had some trouble on a virtual machine building rnnlib.
I had to install the C and C++ version of NetCDF to get it to work.
The C version can be installed via sudo apt-get install libnetcdf-dev
I had to install the C++ version by building it.
Hope this helps. It's quite a difficult library to install.
Maybe this helps someone: you can avoid some of the pain by installing packages from APT and still get the correct version mentioned by user3620756, which contains the netcdfcpp.h header file. This works through a legacy package available on Ubuntu 16.04 (Xenial universe, see the APT repository).
First install libnetcdf for C, then install libnetcdf-cxx-legacy-dev, which depends on libnetcdf-c++4 and installs the required C++ libraries along the way:
sudo apt install libnetcdf-dev libnetcdf-cxx-legacy-dev
The newest version doesn't have this netcdfcpp.h file anymore.
I had to use ftp://ftp.unidata.ucar.edu/pub/netcdf/netcdf-cxx-4.2.tar.gz to get it working.
I also followed the same process and it worked for me:
"The newest version doesn't have this netcdfcpp.h file anymore. I had to use ftp://ftp.unidata.ucar.edu/pub/netcdf/netcdf-cxx-4.2.tar.gz to get it working."
After downloading the archive, I had to build it from inside the netcdf folder. I used these simple commands:
./configure
make
sudo make install
But in the file "NetcdfDataset.hpp" I had to give the complete path to the netcdfcpp.h file. In my case the include line is:
#include "/Volumes/Macintosh_HD_2/WordSpottingProj/trunk/CODE C++/rnnlib_source_forge_version/netcdf-cxx-4.2/cxx/netcdfcpp.h"
I had this problem in the context of trying to use a makefile that called for netcdfcpp.h:
$ make -f makefile_MAC
c++ -O2 -o burn7.x burn7.cpp -I/opt/local/include -L/opt/local/lib -lm -lnetcdf_c++
burn7.cpp:31:10: fatal error: 'netcdfcpp.h' file not found
#include <netcdfcpp.h>
^
1 error generated.
make: *** [burn7.x] Error 1
I'm on a Mac, so I used Homebrew to install the NetCDF package, but version 4.3.3.1 didn't appear to have netcdfcpp.h:
brew install homebrew/science/netcdf
However, I found that installing it with an additional flag resulted in this version being included:
brew install homebrew/science/netcdf --with-cxx-compat
I assume the same is true of other installation/compilation methods, and not that this file has been removed from versions since 4.2 as other answers state. Maybe it was a default option before and now it isn't?
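On older Homebrew versions that still accept per-formula install options, it is possible to check which options a formula offers before reinstalling; a short sketch, assuming such a Homebrew version:
# List the build options the netcdf formula advertises
brew options netcdf
# Reinstall with the legacy C++ compatibility layer if that option is offered
brew reinstall netcdf --with-cxx-compat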

automake configure ignoring option --with-libxml2=yes

I am trying to compile libgphoto2 with libxml2 support, following the guidelines here. Everything is fine until I try to run ./configure:
./configure --prefix=/tmp/gphoto2/local --with-libxml2=yes
That looks like correct syntax to me; however, the output contains:
LIBXML2 to support Olympus ..: no
I have checked this on two different systems (Linux Mint 11 x64 and Ubuntu 13.04) and found the same problem.
Can anyone give me a clue or solution?
Is there any problem with the syntax?
Is there a common problem with the configure --with-PACKAGE[=yes] option?
Is there a common problem with LIBXML2 used in compilation?
Thanks for any help!
This problem also appears on Debian Wheezy (Linux debian 3.2.0-4-amd64 #1 SMP Debian 3.2.41-2 x86_64 GNU/Linux) with the latest libgphoto2 release, 2.5.2.
The libxml2-dev package is installed:
Package: libxml2-dev
State: installed
Automatically installed: no
Multi-Arch: same
Version: 2.8.0+dfsg1-7+nmu1
I'm not totally familiar with configure scripts, but the configure.ac file has the line:
AC_CHECK_HEADER(libxml/parser.h,[
which I assume looks for libxml/parser.h. The libxml2-dev package delivers the file
/usr/include/libxml2/libxml/parser.h
so it looks like libgphoto2 expects the libxml2 headers in a different place.
I tried various solutions, but only the following worked: as root, I symlinked libxml2 to the place where libgphoto2 was looking:
ln -s /usr/include/libxml2/libxml /usr/include/libxml
After compiling libgphoto2 and gphoto2, this enabled gphoto2 to talk to my Olympus E-510.
A bug was raised on the gphoto SourceForge site (https://sourceforge.net/p/gphoto/bugs/953/) and a patch has been provided.
I just found another way, thanks to your help.
After digging in the config.log file created by ./configure, I found the libxml2 error (which I had wrongly assumed would stop the configure script):
conftest.c:75:27: fatal error: libxml/parser.h: No such file or directory
But I knew the library was installed, so I checked and found it under
/usr/lib
I also found elsewhere that the libxml2 package comes with a script (xml2-config) that reports library and include information:
$ xml2-config --cflags
-I/usr/include/libxml2
Then I just needed to add that output to the CFLAGS environment variable when configuring:
$ CFLAGS="-I/usr/include/libxml2" ./configure --prefix=/tmp/gphoto2/local --with-libxml2=yes
And everything else was just ok!
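The same idea can be written without hard-coding the include path at all, by letting xml2-config fill in the flags; a sketch, assuming xml2-config is on the PATH:
# Let libxml2's helper script supply the correct include and linker flags
CFLAGS="$(xml2-config --cflags)" LIBS="$(xml2-config --libs)" \
./configure --prefix=/tmp/gphoto2/local --with-libxml2=yes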
Usually, a --with-some-package=yes option checks for the existence of header files for some-package on your system. If it doesn't find the required header files, then it still outputs "no" to the terminal. Have you installed your distribution's libxml2-devel (or similarly named) package?

How to fix linker error "cannot find crt1.o"?

I have a virtual Debian system which I use for development. Today I wanted to try llvm/clang. After installing clang, I can no longer compile my old C projects (with gcc).
This is the error:
/usr/bin/ld: cannot find crt1.o: No such file or directory
/usr/bin/ld: cannot find crti.o: No such file or directory
collect2: ld returned 1 exit status
I uninstalled clang and it still did not work. Does anyone have any idea how I can fix this?
Debian / Ubuntu
The problem is that you likely only have gcc for your current architecture, which is 64-bit. You need the 32-bit support files. To get them, install:
sudo apt install gcc-multilib
What helped me was to create a symbolic link:
sudo ln -s /usr/lib/x86_64-linux-gnu /usr/lib64
It seems that while you were playing with llvm/clang, you (or the package manager) removed the previously existing standard C library development package (eglibc on Debian), or maybe you never had it installed in the first place. Now that you have reverted to gcc, you need to reinstall it.
You can do so like this on Debian:
aptitude show libc-dev
Ubuntu:
apt-get install libc-dev
On Ubuntu, if you don't have libc-dev (I cannot find it on packages.ubuntu.com), you can try installing libc6-dev directly.
Or on Redhat like systems:
yum install glibc-devel
NB: Although you were briefly answered in the comments, here is an answer on record for anyone who encounters this problem later and doesn't look through the comments, or for whom the comment is not explicit enough.
This is a bug reported on Launchpad, but there is a workaround.
Run this to see where these files are located:
$ find /usr/ -name crti*
/usr/lib/x86_64-linux-gnu/crti.o
then add this path to LIBRARY_PATH variable
$ export LIBRARY_PATH=/usr/lib/x86_64-linux-gnu:$LIBRARY_PATH
After reading http://wiki.debian.org/Multiarch/LibraryPathOverview, which jeremiah posted, I found the gcc flag that works without the symlink:
gcc -B/usr/lib/x86_64-linux-gnu hello.c
So, you can just add -B/usr/lib/x86_64-linux-gnu to the CFLAGS variable in your Makefile.
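If you would rather not edit the Makefile, the same flag can usually be injected from the command line, assuming the Makefile uses the conventional $(CFLAGS) variable for both compiling and linking:
# Pass the multiarch directory to the gcc driver for this build only
make CFLAGS="-B/usr/lib/x86_64-linux-gnu"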
If you're using Debian's Testing version, called 'wheezy', then you may have been bitten by the move to multiarch. More about Debian's multiarch here: http://wiki.debian.org/Multiarch
Basically, what is happening is various architecture specific libraries are being moved from traditional places in the file system to new architecture specific places. This is why /usr/bin/ld is confused.
You will find crt1.o in both /usr/lib64/ and /usr/lib/i386-linux-gnu/ now, and you'll need to tell your toolchain about that. Here is some documentation on how to do that: http://wiki.debian.org/Multiarch/LibraryPathOverview
Note that merely creating a symlink will only give you one architecture and you'd be essentially disabling multiarch. While this may be what you want it might not be the optimal solution.
To get RHEL 7 64-bit to compile gcc 4.8 32-bit programs, you'll need to do two things.
Make sure all the 32-bit gcc 4.8 development tools are completely installed:
sudo yum install glibc-devel.i686 libgcc.i686 libstdc++-devel.i686 ncurses-devel.i686
Compile programs using the -m32 flag
gcc pgm.c -m32 -o pgm
Taken from here: How to Compile 32-bit Apps on 64-bit RHEL? (I only had to do step 1.)
As explained in "crti.o file missing", it's better to use "gcc -print-search-dirs" to find out all the search paths, then create a link as explained above ("sudo ln -s") pointing to the location of crt1.o.
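A quick way to check whether the directory containing crt1.o is already on gcc's library search path (the grep/tr pipeline is just for readability):
# Print gcc's library search directories, one per line
gcc -print-search-dirs | grep '^libraries' | tr ':' '\n'
# Compare against where the CRT files actually live
find /usr/lib -name 'crt1.o' 2>/dev/null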
This worked for me on Ubuntu 16.04:
$ export LIBRARY_PATH=/usr/lib/x86_64-linux-gnu
./configure --disable-multilib
works for this as well.
On Alpine Linux that would mean that you need musl-dev:
apk add musl-dev
Although in my case the messages were:
/usr/lib/gcc/x86_64-alpine-linux-musl/11.2.1/../../../../x86_64-alpine-linux-musl/bin/ld: cannot find Scrt1.o: No such file or directory
/usr/lib/gcc/x86_64-alpine-linux-musl/11.2.1/../../../../x86_64-alpine-linux-musl/bin/ld: cannot find crti.o: No such file or directory
/usr/lib/gcc/x86_64-alpine-linux-musl/11.2.1/../../../../x86_64-alpine-linux-musl/bin/ld: cannot find -lssp_nonshared: No such file or directory
collect2: error: ld returned 1 exit status
Which are also caused by missing musl-dev.
I ran into this on CentOS 5.4. I noticed that lib64 contained the crt*.o files, but lib did not. I installed glibc-devel through yum, which installed the i386 bits, and this resolved my issue.
I got the same compilation error when cross-compiling with i686-cm-linux-gcc.
The compilation option below solved my problem:
$ i686-cm-linux-gcc a.c --sysroot=/opt/toolchain/i686-cm-linux-gcc
Note: the sysroot should point to the compiler directory where usr/include is available.
In my case the toolchain is installed in the /opt/toolchain/i686-cm-linux-gcc directory, and usr/include is available in the same directory.
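As a quick sanity check that the sysroot has the expected layout (paths taken from the answer above; adjust to your toolchain):
# usr/include and usr/lib must exist inside the sysroot for the CRT files to be found
ls /opt/toolchain/i686-cm-linux-gcc/usr/include
ls /opt/toolchain/i686-cm-linux-gcc/usr/lib | grep crt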
I solved it as follows:
1) Locate the crt1.o and crti.o files, for example with find /usr/ -name crt1.o
On my machine they were in /usr/lib/i386-linux-gnu
2) Add that path to the PATH (and also LIBRARY_PATH) environment variable (to see the variable names, type the env command in the terminal):
$ PATH=/usr/lib/i386-linux-gnu:$PATH
$ export PATH
I had the same problem today; I solved it by installing the recommended packages:
libc6-dev-mipsel-cross and libc-dev-mipsel-cross
This worked:
sudo apt-get install libc6-dev-mipsel-cross
One magic command:
sudo apt install build-essential
It fixed everything for me, even on a Raspberry Pi.
In my case, the crti.o error was caused by the execution path configuration in MATLAB.
For instance, you cannot run a file if you have not set the path of your execution directory first.
To do this: File > Set Path, add your directory, and save.
Use gcc -B lib_path_containing_crt?.o
In my case (Ubuntu 16.04) I had no crti.o at all:
$ find /usr/ -name crti*
So I installed the libc6-dev developer package:
sudo apt-get install libc6-dev
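After installing the package, it may be worth re-running the earlier search to confirm that the startup files are now present (same find command as above, widened to all CRT objects):
# crt1.o, crti.o and crtn.o should now show up under the multiarch directory
find /usr/ -name 'crt*.o' 2>/dev/null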
