How to make libtool point to user installed library? - makefile

I am trying to install a rubygem which keeps on trying to read a library which is not available.
grep: /usr/lib64/libgdbm.la: No such file or directory
/bin/sed: can't read /usr/lib64/libgdbm.la: No such file or directory
libtool: link: `/usr/lib64/libgdbm.la' is not a valid libtool archive
In order to work around this, I installed my own libgdbm and provided its path in the makefile's LDFLAGS, but to no avail.
Any help is much appreciated.

This rubygem seems to do dirty stuff, since any clean library search (-L or pkg-config) would have resulted in a message like "library/package gdbm not found". And especially the grep-and-sed procedure on the la file seems really dirty. Make sure Santa knows that the author of this gem gets no presents this year.
The gem probably has the path to the libtool archive hardcoded. First of all, try to grep for /usr/lib64/libgdbm.la in the Makefile of the gem. Change the hardcoded path, and make sure the installation script has no write permissions on any system directories, because it seems to run wild with seds.
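Concretely, something along these lines might do it; the extension directory ext/gdbm and the replacement prefix /home/me/local are assumptions, so substitute wherever the gem unpacked its sources and wherever you installed your own libgdbm:
grep -rn '/usr/lib64/libgdbm.la' .    # find every place the path is hardcoded
sed -i 's|/usr/lib64/libgdbm.la|/home/me/local/lib/libgdbm.la|g' ext/gdbm/Makefile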

Libtool requires intimate knowledge of your compiler suite and operating system in order to be able to create shared libraries and link against them properly. When you install the libtool distribution, a system-specific libtool script is installed into your binary directory.
However, when you distribute libtool with your own packages, you do not always know the compiler suite and operating system that are used to compile your package.
For this reason, libtool must be configured before it can be used. This idea should be familiar to anybody who has used a GNU configure script. configure runs a number of tests for system features, then generates the Makefiles (and possibly a config.h header file), after which you can run make and build the package.
Libtool adds its own tests to your configure script in order to generate a libtool script for the installer's host machine. For this you can use the LT_INIT macro in configure.ac (a minimal sketch follows the commands below).
So in short, if your package has a configure script, regenerate and run it before running make:
make distclean   # clean up all the previously generated files
autoconf         # or autoreconf, which also runs aclocal, to regenerate configure from configure.ac (or the older configure.in)
automake         # to generate a new Makefile.in from Makefile.am
./configure      # to generate a new Makefile and libtool script
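For reference, a minimal configure.ac that enables libtool support might look like the sketch below; the package name, version, and the rest of the contents are placeholders, not anything taken from the gem in question.
AC_INIT([mylib], [1.0])
AM_INIT_AUTOMAKE([foreign])
LT_INIT
AC_PROG_CC
AC_CONFIG_FILES([Makefile])
AC_OUTPUT
Once LT_INIT is in place, autoreconf --install pulls in ltmain.sh and the libtool macros, and the generated configure will produce a libtool script tailored to the host machine.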

Related

How to build a package from source?

I'm working on a Windows 7 computer at work and want to use the libpostal package. Unfortunately, it's apparently not available for Windows, so I'm trying to configure it through Cygwin and I'm SO close. The last step is to install snappy from Google. Again, not available on Windows...
My assumption (based on nothing) is that I can just download the tarball and build it from source, right? I tried that, and I think it worked? But a) I don't know how to tell, and b) if it did, I don't know how to tell ./configure in libpostal to find it.
In order to build it from source, I downloaded the tarball and saved it in the folder that Cygwin reads as my home, which is C:\cygwin64\home\brittenb\. From there, I ran bash autogen.sh, which created the ./configure that I needed. So I ran that and while some responses to the checks were no, it seemed to run fine. I then ran make and make install. Nothing seemed out of place, so my assumption is that it did what it was supposed to do. I just have no idea where to go from here.
Here is the output from ls after I run everything:
aclocal.m4 snappy.cc
AUTHORS snappy.h
autogen.sh snappy.lo
autom4te.cache snappy.o
ChangeLog snappy.pc
compile snappy.pc.in
config.guess snappy_unittest.cc
config.h snappy_unittest.exe
config.h.in snappy_unittest-snappy_unittest.o
config.log snappy_unittest-snappy-test.o
config.status snappy-c.cc
config.sub snappy-c.h
configure snappy-c.lo
configure.ac snappy-c.o
COPYING snappy-internal.h
depcomp snappy-sinksource.cc
format_description.txt snappy-sinksource.h
framing_format.txt snappy-sinksource.lo
INSTALL snappy-sinksource.o
install-sh snappy-stubs-internal.cc
libsnappy.la snappy-stubs-internal.h
libtool snappy-stubs-internal.lo
ltmain.sh snappy-stubs-internal.o
m4 snappy-stubs-public.h
Makefile snappy-stubs-public.h.in
Makefile.am snappy-test.cc
Makefile.in snappy-test.h
missing stamp-h1
NEWS testdata
README test-driver
ls /usr/local/bin shows nothing, but ls /usr/local/include shows:
snappy.h snappy-c.h snappy-sinksource.h snappy-stubs-public.h
So... my question: did it work? Why does ./configure in libpostal say it can't find snappy? Thanks in advance.
The snappy dependency has been removed as of release 1.0.0. I made changes to the source, makefiles, and configuration so that it will build on MinGW.
Get it in my repository:
https://github.com/BenK10/libpostal_windows
Note that this is not the complete source, since not everything had to be changed. I would suggest merging my changes with the official libpostal distribution to make sure you've got everything. Also, there are some extra DLLEXPORTs in some source files that I haven't removed yet, and the part of the Makefile that builds executables like address_parser.exe was removed, because some porting is necessary to build those programs on Windows. You can write your own using the DLL you'll get in the Windows build and the original source as a reference.
Check the return code from make install ($?). If it is zero, make install succeeded.
snappy looks like a library, so maybe it doesn't install anything in /usr/local/bin. The library is probably installed into /usr/local/lib.
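To sanity-check the snappy install and then point libpostal at it, something like the following should work from the Cygwin shell; /path/to/libpostal is a placeholder, and CPPFLAGS/LDFLAGS are simply the generic Autotools way of adding search paths (libpostal may or may not offer a dedicated option for this):
echo $?                                   # run right after make install: 0 means it succeeded
ls /usr/local/lib | grep -i snappy        # the installed library files (libsnappy.la, libsnappy.a, ...) should be listed
cd /path/to/libpostal
./configure CPPFLAGS="-I/usr/local/include" LDFLAGS="-L/usr/local/lib"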

Libtool installation issue with make install

I use the following Autotools steps to install my packages:
./configure
make
make install prefix=/my/path
However, I get the following libtool warnings when using Autotools to install my software package: "libtool: warning: remember to run 'libtool --finish /usr/local/lib'" and "libtool: warning: 'lib/my.la' has not been installed in '/usr/local/lib'". If I change to the following commands, the problem disappears:
./configure
make prefix=/my/path
make install prefix=/my/path
It looks like the first method doesn't pass the prefix on to libtool correctly. How can I avoid this problem?
Among the information that libtool archives record about the libraries they describe is the expected installation location. That information is recorded when the library is created. You can then install to a different location, but libtool will complain. Often, libtool's warning is harmless.
In order to avoid such a warning, you need to tell libtool the same installation location at build time that you do at install time. You present one way to do that in the question, but if you're using a standard Autotools build system then it is better to specify the installation prefix to configure:
./configure --prefix=/my/path
make
make install
Alternatively, if you're installing into a staging area, such as for building an RPM, then use DESTDIR at install time. libtool will still warn, but you'll avoid messing up anything else:
./configure
make
make install DESTDIR=/staging/area
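If the libraries really do get moved to their final prefix afterwards, the step the warning asks for is libtool's finish mode, run against the directory that ends up holding the .la files; a sketch, assuming they land in /my/path/lib:
sudo libtool --finish /my/path/lib    # runs the platform's finish commands (e.g. ldconfig on GNU/Linux) for that directory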

Confused about configure script and Makefile.in

I'm currently learning how to use the autoconf/automake toolchain. I seem to have a general understanding of the workflow here - basically you have a configure.ac script which generates an executable configure file. The generated configure script is then executed by the end user to generate Makefiles, so the program can be built/installed.
So the installation for a typical end-user is basically:
./configure
make
make install
make clean
Okay, now here's where I'm confused:
As a developer, I've noticed that the auto-generated configure script sometimes won't run, and will error with:
config.status: error: cannot find input file: `somedir/Makefile.in'
This confuses me, because I thought the configure script is supposed to generate the Makefile.in. So Googling around for some answers, I've discovered that this can be fixed with an autogen.sh script, which basically "resets" the state of the autoconf environment. A typical autogen.sh script would be something like:
aclocal \
&& automake --add-missing \
&& autoconf
Okay fine. But as an end-user who's downloaded countless tarballs throughout my life, I've never had to use an autogen.sh script. All I did was uncompress the tarball, and do the usual configure/make/make install/make clean routine.
But as a developer who's now using autoconf, it seems that configure doesn't actually run unless you run autogen.sh first. So I find this very confusing, because I thought the end-user shouldn't have to run autogen.sh.
So why do I have to run autogen.sh first - in order for the configure script to find Makefile.in? Why doesn't the configure script simply generate it?
In order to really understand the autotools utilities you have to remember where they come from: they come from an open source world where there are (a) developers who are working from a source code repository (CVS, Git, etc.) and creating a tar file or similar containing source code and putting that tar file up on a download site, and (b) end-users who are getting the source code tar file, compiling that source code on their system and using the resulting binary. Obviously the folks in group (a) also compile the code and use the resulting binary, but the folks in group (b) don't have or need, often, all the tools for development that the folks in group (a) need.
So the use of the tools is geared towards this split, where the people in group (b) don't have access to autoconf, automake, etc.
When using autoconf, people generally check in the configure.ac file (input to autoconf) into source control but do not check in the output of autoconf, the configure script (some projects do check in the configure script of course: it's up to you).
When using automake, people generally check in the Makefile.am file (input to automake) but do not check in the output of automake: Makefile.in.
The configure script basically looks at your system for various optional elements that the package may or may not need, where they can be found, etc. Once it finds this information, it can use it to convert various XXX.in files (typically, but not solely, Makefile.in) into XXX files (for example, Makefile).
So the steps generally go like this: write configure.ac and Makefile.am and check them in. To build the project from source code control checkout, run autoconf to generate configure from configure.ac. Run automake to generate Makefile.in from Makefile.am. Run configure to generate Makefile from Makefile.in. Run make to build the product.
When you want to release the source code (if you're developing an open source product that makes source code releases) you run autoconf and automake, then bundle up the source code with the configure and Makefile.in files, so that people building your source code release just need make and a compiler and don't need any autotools.
Because the order of running autoconf and automake (and libtool if you use it) can be tricky there are scripts like autogen.sh and autoreconf, etc. which are checked into source control to be used by developers building from source control, but these are not needed/used by people building from the source code release tar file etc.
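To make the developer-side sequence concrete, here is a rough sketch of a build from a fresh checkout; autoreconf bundles the individual tool invocations mentioned above:
autoreconf --install    # runs aclocal, autoconf, automake --add-missing (and libtoolize if the project uses libtool)
./configure
make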
Autoconf and automake are often used together but you can use autoconf without automake, if you want to write your own Makefile.in.
For this error:
config.status: error: cannot find input file: `somedir/Makefile.in'
In the directory where configure.ac is located, add a line to the Makefile.am naming the subdirectory somedir:
SUBDIRS = somedir
Inside somedir, put a Makefile.am with all the description, then run automake --add-missing.
A better description can be found in section 7.1 "Recursing subdirectories" of the Automake manual:
https://www.gnu.org/software/automake/manual/automake.html
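As a minimal sketch of the two pieces involved: somedir is the directory from the error message, and everything else here is placeholder content.
# top-level Makefile.am (next to configure.ac)
SUBDIRS = somedir
# configure.ac must also list the Makefile to be generated in that directory
AC_CONFIG_FILES([Makefile somedir/Makefile])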

autoconf without install files

I have started recently to work with automake and autoconf and I am a little confused about how to distribute the code.
Usually when I get code that builds with a configure script, the only things I get are the configure file and the code itself with the Makefile.am and so on. Usually I do
./configure
make
sudo make install
and that's all. But when I generate my configure from a configure.ac file, it tosses out lots of files that I thought were just temporary, and if I give the code to a partner and he runs configure, it doesn't work: he either needs to rerun autoreconf or have all these files (namely install-sh, config.sub, ...).
Is there something that I am missing? How can I distribute the code easily and cleanly?
I have searched a lot, but I don't think I am searching for the right thing, because I cannot find anything useful.
Thanks a lot in advance!
Automake provides a make dist target. This automatically creates a .tar.gz from your project. This archive is set up in such a way that the recipient can simply extract it and run the usual ./configure && make && make install invocation.
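For example (the tarball name comes from the package name and version in your configure.ac, so myproject-1.0 below is just a placeholder):
make dist                                   # creates myproject-1.0.tar.gz
tar xzf myproject-1.0.tar.gz                # the recipient only needs to unpack it...
cd myproject-1.0
./configure && make && sudo make install    # ...and build as usual, no Autotools required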
It is generally not recommended to check the files generated by Autotools into your repository. This is because they are derived objects. You wouldn't check in your .o files!
Usually, it is a good idea to provide an autogen.sh script that carries out any actions required to re-create the Autotools build infrastructure in a fresh version control checkout. Often, it can be as simple as:
#!/bin/sh
autoreconf -i
Then make it executable with chmod +x, and the instructions for compiling from a clean checkout become ./autogen.sh && ./configure && make.

Build shared libraries in ATLAS

I've read the entire ATLAS installation guide, and it says all you need to build shared (.so) libraries is to pass the --shared flag to the configure script. However, when I build, the only .so files that appear in my lib folder are libsatlas.so and libtatlas.so, though the guide says that there should be six others:
libatlas.so, libcblas.so, libf77blas.so, liblapack.so, libptcblas.so, libptf77blas.so
After installation some of the tests fail because these libraries are missing. Furthermore, FFPACK wants these libraries during installation.
Has anyone encountered this? What am I doing incorrectly?
In my experience, it's a lot more complex than that, see our EasyBuild implementation of the ATLAS build procedure at https://github.com/hpcugent/easybuild-easyblocks/blob/master/easybuild/easyblocks/a/atlas.py .
We needed to:
enable the -fPIC compiler option
run 'make shared cshared ptshared cptshared' in the 'lib' directory
We're not even using --shared for configure, probably because it doesn't do much.
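Roughly, putting those steps together gives a sketch like the following; the build directory and --prefix are placeholders, and the lib-directory targets are the ones listed above (-Fa alg -fPIC is ATLAS's mechanism for appending a flag to every compiler it uses):
mkdir build && cd build
../configure --prefix=/opt/atlas -Fa alg -fPIC   # add -fPIC so the objects can go into shared libs
make build                                       # build the static libraries first
cd lib && make shared cshared ptshared cptshared # then build the serial/threaded shared libs
cd .. && make install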
If you want to build ATLAS (and whatever you will be linking it with) without headaches, look into EasyBuild.
(disclaimer: I'm a developer for EasyBuild)
First, if you have incorrectly specified the --force-tids flag for configure, then the parallel libs won't build. To check this you can run make ptcheck. I have a question regarding the specification of this flag here.
Then, if I examine my resulting ATLAS Makefile, it says "... only when atlas is built to one lib", and indeed only two "fat" libs are constructed: libsatlas.so and libtatlas.so.
I guess you can either link FFPACK against those libs or change the resulting ATLAS Makefile to contain the targets you need (which won't be too hard since the static libs are available).
I had to manually create links to the .so.3 files.
So the versioned library files existed, but not the files CMake was looking for.
Running
sudo ln -s libatlas.so.3 libatlas.so
sudo ln -s libcblas.so.3 libcblas.so
sudo ln -s liblapack_atlas.so.3 liblapack_atlas.so
(I didn't build the cblas, atlas or lapack but installed them with apt-get. Wondering why the links were not automatically created).
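For what it's worth, on Debian/Ubuntu the unversioned .so development links are normally shipped by the -dev packages, so installing those is an alternative to creating the links by hand (the package names below are the usual Ubuntu ones and may differ between releases):
sudo apt-get install libatlas-base-dev liblapack-dev
sudo ldconfig                                  # refresh the dynamic linker cache
ldconfig -p | grep -E 'atlas|cblas|lapack'     # check what the linker can now find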
