A common issue automake complains about is caused by lines like the following in various Makefile.am files:
Makefile.am:
SUBDIRS = deployment transport/http/util transport/http/common engine transport
The intent of this line is to force the build order so that transport/http/util and transport/http/common are built before the engine directory, and the rest of transport is built after engine.
This line causes the following error when running automake under MinGW:
Makefile.am:1: directory should not contain `/'
This is caused by an old version of automake (1.7 and older, at least); newer versions accept multi-level paths as values for SUBDIRS.
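If upgrading automake is not an option, one hedged workaround is to keep every SUBDIRS entry a single level deep and let the intermediate directories recurse through their own Makefile.am files (the directory names below mirror the example above). Note that this loses the ordering intent described earlier, since engine can no longer be built in between the http subdirectories and the rest of transport, so upgrading automake remains the cleaner fix.

# top-level Makefile.am: only direct subdirectories
SUBDIRS = deployment transport engine

# transport/Makefile.am: build http before the current directory
SUBDIRS = http .

# transport/http/Makefile.am: build util and common before the current directory
SUBDIRS = util common .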
Related
I am building binaries for our custom board (iMX7) using Yocto-morty. I need some libraries such as UTF-32.so, UTF-16.so, and UTF-7.so from the glibc package for Bluetooth file transfer. But these libraries are not available in the rootfs; the only files available under /usr/lib/gconv are gconv-modules and ISO8859-1.so. So I am trying to add these libraries with a new bbappend file, glibc_2.24.bbappend, with the following content:
FILES_${PN} += "${libdir}/gconv/*"
do_install_locale_append() {
    cp -r ${dest}${libdir}/gconv ${D}${libdir}/
}
But it results in the following error:
ERROR: glibc-2.24-r0 do_populate_sysroot: The recipe glibc is trying to install files into a shared area when those files already exist. Those files and their manifest location are:
build_dir/tmp/sysroots/esomimx7d/usr/lib/gconv/ISO-2022-CN.so Matched in b'manifest-esomimx7d-glibc-locale.populate_sysroot'
build_dir/tmp/sysroots/esomimx7d/usr/lib/gconv/ARMSCII-8.so Matched in b'manifest-esomimx7d-glibc-locale.populate_sysroot'
......
Then I tried to remove glibc-locale from the image, but due to some dependency issues I could not do that.
Could anyone help me to add the above mentioned libraries to the rootfs?
The error is telling you the answer to your problem. Those files are part of the glibc-locale recipe, so you just need to install the right packages into the rootfs.
$ oe-pkgdata-util find-path \*/UTF-7.so
glibc-gconv-utf-7: /usr/lib/gconv/UTF-7.so
So you need to add glibc-gconv-utf-7 (or -utf-32, etc) to your image.
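For example, in your image recipe or local.conf (morty-style override syntax; the utf-16 and utf-32 package names are assumed to follow the same pattern as the oe-pkgdata-util output above, so verify them the same way):

IMAGE_INSTALL_append = " glibc-gconv-utf-7 glibc-gconv-utf-16 glibc-gconv-utf-32"

Note the leading space inside the quotes: _append does not insert a separator automatically.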
You can remove them and compile again; it will work.
rm build_dir/tmp/sysroots/esomimx7d/usr/lib/gconv/ISO-2022-CN.so
rm build_dir/tmp/sysroots/esomimx7d/usr/lib/gconv/ARMSCII-8.so
This is only a workaround; we need a permanent solution.
These files belong to glibc-locale, so you need to install the required packages.
$ oe-pkgdata-util find-path \*/UTF-7.so
glibc-gconv-utf-7: /usr/lib/gconv/UTF-7.so
Add glibc-gconv-utf-7 (or -utf-32, etc.) to your image recipe (e.g. core-image-minimal).
I'm trying to ensure that some non-source files are generated when make dist is executed. The files are an info file and an index which is constructed from the info file. I find that the files are generated when I execute make install but not when I execute make dist.
Here is the Makefile.am. (This is share/logic/Makefile.am in the Maxima project, if anybody cares.)
all-local: info
info: logic.info logic-index.lisp
logic.info: logic.texi
	makeinfo --force logic.texi
logic-index.lisp: logic.info
	perl ../../doc/info/build_index.pl $< > $@
Somehow I got the idea that the target all-local could cause the info and index to be rebuilt. That works OK for make install -- I guess all-local is a target for that. But all-local is not, it appears, a target for make dist. What other target could I use to ensure that the logic.info and logic-index.lisp are rebuilt for make dist as well as make install?
I have searched the web, and SO, and tried some random things, but so far I've come up empty-handed. Thanks in advance for your help.
I'm working with GNU make 3.81, GNU automake 1.14.1, and GNU autoconf 2.69, on Ubuntu 14.04.
You can force something to be built and included in the package by using EXTRA_DIST, so in your case:
EXTRA_DIST = logic-index.lisp
This will cause the file to be always included in the distribution tarball.
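Putting the pieces together, the Makefile.am from the question might look like this (a sketch: the rules are copied from the question and only the EXTRA_DIST line is new; whether logic.info should also be listed depends on whether the prebuilt info file is meant to ship in the tarball):

EXTRA_DIST = logic-index.lisp

all-local: info
info: logic.info logic-index.lisp
logic.info: logic.texi
	makeinfo --force logic.texi
logic-index.lisp: logic.info
	perl ../../doc/info/build_index.pl $< > $@

Because distributed files are prerequisites of the distdir target, make dist will now run the two rules above before packing the tarball.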
I'm trying to build a project that supports easily building 'stripped-down' distributions with undesired features removed. The project uses automake and is structured with potentially removable features in their own feature.am files that are included in the top-level Makefile.am file as
include feature.am
The problem is, if you remove a feature (and its feature.am file), autoreconf fails with
automake: error: cannot open < feature.am: No such file or directory
Is there a way to simply ignore this error and continue? I tried using
-include feature.am
like GNU make does, but this ends up simply copying that line into the Makefile.in file (and thus the Makefile), rather than having automake read it.
This is not possible because of the way automake works (which has next to nothing to do with how make includes files by itself).
You can use AM_CONDITIONAL and autoconf's AC_ARG_ENABLE to build or not build the components that you don't want built. If you would like to have separate tarballs, that's more complicated and I would suggest you just ship separate packages for those features.
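A minimal sketch of that approach (the feature name, program, and source file are placeholders; the feature.am file itself must always exist, because automake's include is processed unconditionally at automake time, so only its contents are made conditional):

# configure.ac
AC_ARG_ENABLE([feature],
  [AS_HELP_STRING([--disable-feature], [build without the optional feature])],
  [], [enable_feature=yes])
AM_CONDITIONAL([ENABLE_FEATURE], [test "x$enable_feature" = xyes])

# feature.am (always shipped; contents guarded by the conditional)
if ENABLE_FEATURE
bin_PROGRAMS += feature-tool
feature_tool_SOURCES = feature.c
endif

With this, ./configure --disable-feature leaves the feature out of the build while autoreconf still succeeds.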
I have access to a large IBM Power8 machine (running Ubuntu), and would like to build Bazel on it. However, when I try to do it as their installation instructions suggest, I get:
me@machine:~/bazel-0.1.5$ ./compile.sh
INFO: You can skip this first step by providing a path to the bazel binary as second argument:
INFO: ./compile.sh compile /path/to/bazel
🍃 Building Bazel from scratch.
Compiling Java stubs for protocol buffers...
third_party/protobuf/protoc-linux-x86_32.exe -Isrc/main/protobuf/ --java_out=/tmp/bazel.T9C83cNa/src src/main/protobuf/android_studio_ide_info.proto
scripts/bootstrap/buildenv.sh: line 63: third_party/protobuf/protoc-linux-x86_32.exe: cannot execute binary file: Exec format error
pv@sardonis:~/bazel-0.1.5$ ^C
Clearly, part of the problem is that the script tries to run the 32-bit x86 protoc binary. I tried the following things, to no avail.
Replacing third_party/protobuf/protoc-linux-x86_32.exe with a copy of third_party/protobuf/protoc-linux-x86_64.exe. This gave the same error.
Replacing third_party/protobuf/protoc-linux-x86_32.exe with a symbolic link to /usr/local/bin/protoc, which came with my distribution (version libprotoc 3.0.0 according to protoc --version). However, this gave a large number of errors: http://pastebin.com/HN0MQiC4
Following the instructions on http://www.cnblogs.com/rodenpark/p/5007744.html to compile Protobuf from source and then building Bazel with the modifications from http://www.cnblogs.com/rodenpark/p/5007846.html, but this resulted in a similarly large number of errors: http://pastebin.com/KjkseaGx for reference.
So, I'm out of inspiration. How can I compile Bazel on the IBM Power8 machines?
(PS: I've posted this as a part of resolving installing TensorFlow on the IBM power8, so it's not a duplicate question, just one aspect in order to solve it stepwise.)
The version of protobuf you're using must match the protobuf runtime that is checked in. In this case, that's protobuf-java-3.0.0-beta-1.jar [1], so you have to use the compiler version 3.0.0-beta-1.
(I work on Bazel.)
[1] https://github.com/bazelbuild/bazel/tree/master/third_party/protobuf
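A hedged sketch of one way to get a matching compiler on the Power machine: build protoc from the protobuf sources at the corresponding tag and substitute it for the bundled x86 binary (the tag name and the exact --version output are assumptions; the usual autotools prerequisites must already be installed):

$ git clone https://github.com/google/protobuf.git
$ cd protobuf
$ git checkout v3.0.0-beta-1
$ ./autogen.sh && ./configure && make
$ ./src/protoc --version    # expect a 3.0.0-beta-1 libprotoc
$ cp ./src/protoc ~/bazel-0.1.5/third_party/protobuf/protoc-linux-x86_32.exe

This is essentially the second attempt from the question, but with a compiler that matches the checked-in protobuf-java-3.0.0-beta-1.jar runtime.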
I've read the entire ATLAS installation guide, and it says all you need to build shared (.so) libraries is to pass the --shared flag to the configure script. However, when I build, the only .so files that appear in my lib folder are libsatlas.so and libtatlas.so, though the guide says that there should be six others:
libatlas.so, libcblas.so, libf77blas.so, liblapack.so, libptcblas.so, libptf77blas.so
After installation some of the tests fail because these libraries are missing. Furthermore, FFPACK wants these libraries during installation.
Has anyone encountered this? What am I doing incorrectly?
In my experience, it's a lot more complex than that; see our EasyBuild implementation of the ATLAS build procedure at https://github.com/hpcugent/easybuild-easyblocks/blob/master/easybuild/easyblocks/a/atlas.py.
We needed to:
enable the -fPIC compiler option
run 'make shared cshared ptshared cptshared' in the 'lib' directory
We're not even using --shared for configure, probably because it doesn't do much.
If you want to build ATLAS (and whatever you will be linking it with) without headaches, look into EasyBuild.
(disclaimer: I'm a developer for EasyBuild)
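Outside of EasyBuild, those two steps translate roughly into something like the following (a sketch; -Fa alg -fPIC is ATLAS's way of appending a flag to all compilers, but check configure --help for your ATLAS version, and the usual tuning options are omitted here):

mkdir build && cd build
/path/to/ATLAS/configure -Fa alg -fPIC --shared
make build
cd lib
make shared cshared ptshared cptshared

The four targets on the last line cover the serial/threaded and Fortran/C-interface combinations of the shared libraries.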
First, if you have incorrectly specified the --force-tids flag for configure, then the parallel libs won't build. To check this you can run make ptcheck. I have a question regarding the specification of this flag here.
Then, if I examine my resulting ATLAS Makefile, it says "... only when atlas is built to one lib", and indeed only two "fat" libs are constructed: libsatlas.so and libtatlas.so.
I guess you can either link FFPACK against those libs or change the resulting ATLAS Makefile to contain the targets you need (which won't be too hard since the static libs are available).
I had to manually create links to the .so.3 files.
So the versioned library files existed, but not the files CMake was looking for.
Running:
sudo ln -s libatlas.so.3 libatlas.so
sudo ln -s libcblas.so.3 libcblas.so
sudo ln -s liblapack_atlas.so.3 liblapack_atlas.so   # link name assumed, following the pattern above
(I didn't build cblas, atlas, or lapack myself but installed them with apt-get. I wonder why the links were not created automatically.)
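To see which versioned libraries are actually present before creating such links, something like this can help (the atlas-base path is an assumption about where the Ubuntu packages install ATLAS; adjust for your system):

ldconfig -p | grep -E 'atlas|cblas|lapack_atlas'
ls /usr/lib/atlas-base/ 2>/dev/null

Typically it is the corresponding -dev packages (e.g. libatlas-base-dev) that provide the unversioned .so symlinks, which would explain why they were missing after installing only the runtime packages.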