Link a specific version of a C library into a Go program

I'm developing a utility in Go that requires a recent version of sqlite. I'm only targeting one specific architecture: x64 Linux. I develop the utility on Mac OS X, using the go-sqlite3 driver, and I build it with GNU Make + Glide. To cross-compile on my Mac I pass the target arch flags to make.
The package repositories on the Linux platforms I'm targeting usually ship quite old versions of sqlite that lack features my utility needs.
I could manually compile and install the required version of sqlite on every platform I target, but that is quite cumbersome. Is there a good way to either statically link a specific version of sqlite or somehow bundle the utility with a specific version of the sqlite dynamic library?
Even though I mention sqlite a lot, the question generalizes to other libraries: how do you bundle a Go app with a specific version of a C library when the target platform may only have an outdated version installed?
Also: how should development of the utility be organized so that other devs don't need to manually compile and install a specific version of the library? The preference is a Makefile that builds all the binaries for the required target platform. I could simply copy the source of the specific library version (e.g. sqlite) into my utility's repo, but I wonder whether there is a better option - maybe I can use Glide dependencies for that purpose and build the library I need alongside my other dependencies.
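One commonly suggested route, sketched below under assumptions about the toolchain: the go-sqlite3 driver bundles the SQLite amalgamation and compiles it via cgo, so the resulting binary carries the SQLite version shipped with the driver rather than relying on the system library. Cross-compiling that from macOS then only needs a Linux cross C compiler and static-linking flags; the compiler and binary names below are placeholders for whatever your setup provides.

# Sketch: cross-compile from macOS to linux/amd64 with SQLite compiled in via cgo.
# Assumes a Linux cross C compiler is installed (e.g. a musl cross toolchain from Homebrew);
# x86_64-linux-musl-gcc and myutil are placeholder names.
CGO_ENABLED=1 GOOS=linux GOARCH=amd64 CC=x86_64-linux-musl-gcc \
    go build -ldflags '-linkmode external -extldflags "-static"' -o myutil .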

Related

macOS: Should I ship system libs if my app dynamically links to them?

My app uses a couple of third-party dynamic libs, which I include in my app package. But what about the system libs that my own code and third-party libs depend on?
Examples include
/System/Library/Frameworks/CoreFoundation.framework/Versions/A/CoreFoundation
/usr/lib/libSystem.B.dylib
Should I bundle these and update their reference paths in all caller binaries of the app package?
If I must, should I create app binaries for each macOS version?
My thinking is that I shouldn't force my users to install Xcode, but I'm not sure whether those system frameworks and dynamic libs are bundled with the OS or installed through Xcode ...
The system libs are guaranteed to run on any system your application supports (and naturally they are installed on any Mac, since the system itself uses them). Therefore, no, you don't have to bundle these libs. Instead, you have to decide carefully which OS versions your code supports and, from that, which version of a given lib it can rely on.
When a piece of code needs to behave differently for different versions of the lib at runtime, you can use the following form to adapt your code to the current environment:
if #available(macOS 10.14, *) {
    // code path for macOS 10.14 and later
} else {
    // code path for earlier systems
}
You must not bundle these. Anything in /System or /usr/lib is part of the operating system, not Xcode. They are not intended to be copied to other systems.

Compile and use a custom version of libstdc++ as a regular user

I have to compile and run a modern program on a cluster with an outdated OS. The program employs some C++11 features and STL templates. The cluster's compiler toolchain (g++ 4.4.7) supports almost all of the C++11 features, but some important STL templates/classes are missing.
To make it work I'll have to either:
modify the program's source code, or
compile a newer version of the STL library on the cluster and link against it instead of the system-wide STL libs.
The first route seems sub-optimal, because the program is under active development by our lab; it would mean patching it by hand daily, or maintaining a separate branch and regularly merging from trunk, or dropping support for C++11 features altogether.
So, is it possible to build a libstdc++ version that is newer than the one installed system-wide, and link against it? And if it is possible, how could it be done?
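For reference, the approach that is usually suggested for this situation, sketched under the assumption that a recent GCC can be built into a private prefix (all paths and versions below are placeholders), is to install the newer GCC somewhere in your home directory and bake its library directory into the binary's rpath so the newer libstdc++ is found at runtime:

# Sketch: build a newer GCC (and thus a newer libstdc++) into a private prefix; paths are placeholders.
../gcc-4.9.4/configure --prefix=$HOME/toolchains/gcc-4.9 --disable-multilib
make && make install
# Compile with the new g++ and embed its library directory in the rpath,
# so the newer libstdc++.so is found ahead of the system one at runtime.
$HOME/toolchains/gcc-4.9/bin/g++ -std=c++11 \
    -Wl,-rpath,$HOME/toolchains/gcc-4.9/lib64 -o myprog myprog.cpp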

Tutorial on building whole toolchain on CentOS

I am working on CentOS 6 machines, which have very old GCC/glibc versions. I want to build the whole glibc, binutils, gcc toolchain with the latest, or at least very recent, versions in order to get C++11 support in recent GCC, ld.gold in recent binutils, and possibly improvements in recent glibc.
I want to put the whole toolchain in a separate directory and not touch any existing system files. I also want to build GCC with --with-sysroot so that, when using it, I don't need to specify -I/some/directory/include and -L/some/directory/lib or any other extra parameters, and the generated executables will automatically use the new ld-linux-xxxxx loader, which will in turn find the new libc.so.
Does anyone know of a tutorial for this task?
The compiler is very dependent on glibc; even if you manage to build the compiler in a chrooted system or equivalent, you will also need to build all the libraries required by the programs you compile with this new compiler.
The best you can do is use a fresh system (a VM or similar) or upgrade your existing one.
You can download a recent prebuilt toolchain from OpenEmbedded or Yocto.
That way you don't have to install any packages on your current system.
Just download the toolchain, source its environment script, and you are ready to check the C++11 support.
The location to download the toolchain:
http://downloads.yoctoproject.org/releases/yocto/yocto-1.7/toolchain/ (select the 32-bit or 64-bit toolchain depending on your machine)
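The workflow then looks roughly like the sketch below; the installer and environment-script names are placeholders, since they vary by release and target architecture.

# Sketch: install the downloaded toolchain and source its environment script
# (file names are placeholders; use the ones from the download page).
sh poky-glibc-x86_64-<image>-<arch>-toolchain-1.7.sh     # installs under /opt/poky/1.7 by default
source /opt/poky/1.7/environment-setup-<arch>-poky-linux
$CXX -std=c++11 -o test test.cpp                          # $CXX now points at the cross g++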
If you need the latest toolchain, you'd better migrate to Fedora.
If you can't or won't, the best bet is to get the pieces as source RPMs for CentOS and Fedora, unpack them, and fix up the CentOS packages by pilfering the sources and patches from Fedora. Take care that the result doesn't override the system packages: correct the versions and configure them to install elsewhere (don't mess up your system too much; /usr/local comes to mind). The pieces are at least binutils and gcc.
I do not know why you need this. If it is needed in order to compile for another computer, I would suggest using a virtual machine running the same OS as the target; that is much easier.

Are Boost header libraries installed by default in Debian Squeeze?

I recently installed Debian Squeeze on my machine with C++ programming practice as one of the main goals. I use Boost libraries regularly in my projects. On OS X and Windows, I had to install the Boost header libraries manually before using them. However, regarding Linux, the front page of the Boost website mentions
Popular Linux and Unix distributions such as Fedora, Debian, and NetBSD include pre-built Boost packages.
I mainly use the header-only libraries, not the pre-built packages, in my current projects. So my question is: are the header libraries installed by default anywhere on Debian, or do I have to install them? I have already looked in /usr/include and it doesn't seem to have any Boost directory. I have googled as well as looked up related discussions on SO, but didn't get a clear answer. If I do have to install the header libraries, is there an apt-get way of doing it, or do I simply untar them and place them in a convenient location (/usr/local/include)?
Second, if I need to place the Boost headers manually (say in /usr/local/include/), should their version match the pre-installed packages, for compatibility with any future projects that use both the binaries (libboost-*) and the header files?
I am fairly new to programming on a Linux platform. Although, I can make things work using patch-and-match (and googling), I am looking for guidance on long term best practices.
Thanks.
Saying a GNU/Linux distribution "includes" a package such as Boost doesn't mean it is installed automatically, it means the package is available for installation, using your system's package management tool. The package might be tailored for your distribution, so it integrates well with the rest of the OS, or it might just be identical to the upstream version and the benefit is just that it's already built for you and convenient to install from within the OS.
There is plenty of documentation on Debian's package management tools:
http://wiki.debian.org/PackageManagement
http://www.debian.org/doc/manuals/debian-faq/ch-pkgtools.en.html
http://www.debian.org/doc/manuals/debian-reference/ch02.en.html
So yes, you want apt-get (or the equivalent with another of Debian's tools) to install Boost in /usr/include; that will be much easier than installing it manually. If you later decide to install a newer Boost manually, keep that installation entirely separate from the system packages, so its libraries and headers don't conflict with them. If it's a single-user machine and you don't need the packages to be available to other users, you can just install them in your home directory rather than /usr/local/ (which requires superuser access, and you should do as little as possible as the root user).
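Concretely, something along these lines should pull in the headers (package names are from memory and may differ slightly between Debian releases):

# Boost headers only (they end up under /usr/include/boost):
sudo apt-get install libboost-dev
# Headers plus all the prebuilt Boost libraries:
sudo apt-get install libboost-all-dev
# To see which Boost packages your release offers:
apt-cache search libboost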

Can I use OpenFrameworks on OS X without having to use XCode?

I can't stand Xcode, but I really love OpenFrameworks, and I know it works on Linux and Win32, so I don't see why it should be Xcode-dependent. If I need to have Xcode installed, that's fine; I just don't want to use it at all.
Xcode internally uses gcc/llvm. In fact, from the command line you can cd into a directory that contains an openFrameworks project and just type xcodebuild. But this won't allow you to edit the project file, add new source code, etc.
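For example (the project directory below is a placeholder), building an existing example from Terminal might look like this:

# Sketch: build an existing Xcode project from the command line (directory name is a placeholder).
cd openFrameworks/apps/myApps/emptyExample
xcodebuild -configuration Release    # picks up the .xcodeproj in the current directory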
The Linux makefiles could be adapted to work on OS X as well; they already contain a lot of the information needed to find the correct source files, library paths, etc. However, Linux lets us install many more components as shared system libraries, while on OS X we link most of the libs statically, so a number of extra library paths would need to be added. Probably the biggest gotcha is that everything has to be compiled 32-bit, which means passing -arch i386 everywhere, so you can't just install dependent libs using Homebrew or MacPorts. We are in the process of transitioning to 64-bit, but there are still some QuickTime calls that require us to stick with 32-bit, mainly around accessing legacy video capture devices that a lot of us still use for computer vision.
As @cdelacroix points out, we only maintain Xcode project files on OS X. This is mainly due to the lack of a decent alternative. There is a version of Code::Blocks for OS X, but it is not very well supported, has some issues with native GUI rendering, and tends to lag behind the other platforms. Xcode is also the easiest way to install a toolchain on OS X, so for most users installing Xcode is necessary.
If you do get a makefile-based build system working, and would be interested in maintaining it medium to long term, please consider contributing it to the GitHub repository; it would be gladly accepted.
As of March 2013, openFrameworks has official makefile support for compiling the library itself. However, at the time of this writing, the changes haven't yet been merged into the stable release. You'll need to clone the Git repository and switch to the development branch.
git clone https://github.com/openframeworks/openFrameworks
cd openFrameworks && git checkout develop
cd libs/openFrameworksCompiled/project
make
As far as I can tell, we still need to use the unofficial solutions for compiling apps against the library.
You need Xcode, or at least a set of compilers (more information is available here), but otherwise, no, you can edit/work with the code in whatever editor or environment you want.
Here's a link to a makefile which will compile an OpenFrameworks application on OS X:
https://gist.github.com/labe-me/1190981
Place the makefile in the app's directory and run make. Tested on OS X 10.6, but I haven't tried it with addons yet.
As @mipadi said, there is no requirement to actually use Xcode; you can do pretty much everything you do in Xcode with make or cake or any build system of your choice. All you have to do is find the right set of command-line options to pass to the usual tools (compiler, linker, strip, etc.), and sometimes the easiest way is to... look in the Xcode build window to see how it is doing stuff (expand the lines with the small button on the right of each line).
For example, you can link against your framework of choice with ld -framework Framework -FPathToFramework foo.o, or against your dynamic library with ld -lLib -LPathToDylib foo.o. You might have to learn about @rpath, @loader_path and install_name_tool to ship a self-contained packaged application.
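As a rough illustration of those tools (library and app names below are made up), bundling a dylib inside an app package typically involves something like:

# Sketch: ship a dylib inside the app bundle and point the executable at it via @rpath
# (libfoo.dylib and MyApp are placeholder names).
cp libfoo.dylib MyApp.app/Contents/Frameworks/
install_name_tool -id @rpath/libfoo.dylib MyApp.app/Contents/Frameworks/libfoo.dylib
install_name_tool -change /usr/local/lib/libfoo.dylib @rpath/libfoo.dylib MyApp.app/Contents/MacOS/MyApp
install_name_tool -add_rpath @loader_path/../Frameworks MyApp.app/Contents/MacOS/MyApp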
As for OpenFrameworks, the "requirement" for Xcode is that the authors decided to maintain only Xcode project files. I just looked at how they do it: they ship source code plus Xcode project files that build static libraries, so it will be even simpler for you (although you will need to link the library dependencies by hand). You will have to choose between compiling everything from source in your build system (if you want more customization power without touching Xcode), or just producing the two static libraries (openFrameworks.a and openFrameworksDebug.a) with Xcode once, then using them in your build system (recommended until you really need continuous customization).
