Packaging Go 1.5 applications for Debian-like systems

The original question, Packaging Go application for Debian, was asked over three years ago, when Go only produced statically linked executables.
Now, with the new Go 1.5 release, whose release notes say:
Changes to the linker enable distributing Go packages as shared libraries to link into Go programs, and building Go packages into archives or shared libraries that may be linked into or loaded by C programs
I headed to the Debian GoPackaging wiki, only to find that it has not been updated for Go 1.5 yet (perhaps the wiki is unmaintained, perhaps not), but:
I just want to know: if I want to package a tool that depends on nothing but the official Go packages, how should I take advantage of shared libraries instead of building my app into a humongous statically linked executable?
I know the official answer may take some time, but I can wait.
Thanks to Braiam's answer, I moved a bit further.
I followed the blog up to the step head -100 debian/**/*, but my output is completely different from the blog's. Mine is just the following. Does anyone know why, and how to fix it?
$ head -100 debian/**/*
3.0 (quilt)
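(One likely explanation, in case it helps, assuming the blog author used zsh: in bash, ** does not recurse unless the globstar shell option is set, so debian/**/* can end up matching only debian/source/format. Something along these lines should reproduce the blog's output:
$ shopt -s globstar
$ head -100 debian/**/*
)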
I tried to run gbp buildpackage --git-pbuilder but ran into an error:
gbp:info: Building with (cowbuilder) for sid
Base directory /var/cache/pbuilder/base.cow does not exist
gbp:error: 'git-pbuilder' failed: it exited with 1
I checked and verified that I already have cowbuilder & pbuilder installed:
ii cowbuilder amd64 pbuilder running on cowdancer
ii pbuilder all personal package builder for Debian packages
What's wrong? This pbuilder thing is new to me, as I normally build Debian/Ubuntu packages with Docker.

The easiest method is to use dh-make-golang and verify/correct the autogenerated files. Otherwise you must follow the Debian Packaging Guide, create the debian/ directory yourself, and edit the control and rules files to match the examples.
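For what it's worth, a rough sketch of that workflow (the import path github.com/you/yourtool is a placeholder, and the exact dh-make-golang invocation may differ between versions). The base.cow error above usually just means the cowbuilder base chroot has never been created:
$ DIST=sid git-pbuilder create             # creates /var/cache/pbuilder/base.cow (may ask for sudo)
$ dh-make-golang github.com/you/yourtool   # generates the debian/ skeleton to review and edit
$ gbp buildpackage --git-pbuilder          # builds the package inside that chroot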

Related

How to install CBC for Pyomo locally on Windows machine?

My goal is to connect the open-source CBC solver with Pyomo in Spyder. I am working on a Windows 10 machine and it is not an option for me to use the NEOS server due to company policy.
I have downloaded the binaries from Bintray (https://bintray.com/coin-or/download/Cbc#files), which include a cbc.exe file. However, when trying to run it, several errors come up stating that I am missing files (among others libbz2-1.dll and zlib1.dll). I do not know much about Linux or software development, but after a lot of time on Google I understand that these are used for unpacking data, among other things. I found all files except zlib1.dll in a developer chat on the same subject, and zlib1.dll I found on another page. However, when running I now get the error: "The application was unable to start correctly (0xc000007b)".
I have also tried downloading MSYS2 MinGW and followed instructions from CBC. I don’t know if I require this or if it is only for developers.
Can anyone tell me what to do? I suspect other people than myself want to use CBC in Pyomo as an alternative to GLPK.
If you already have the .exe file, make sure it is in your current working folder (set as the working directory in Spyder; simply opening your file is not enough) and call it using the SolverFactory function:
opt = SolverFactory("cbc.exe")
results = opt.solve(model)
It works for me.
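As a sanity check before involving Pyomo at all (the path below is a placeholder): open a plain command prompt, cd into the folder holding cbc.exe and its DLLs, and start the solver directly. If the DLLs are missing it will fail there too; if it works you get Cbc's interactive prompt, which you can leave by typing quit.
cd C:\path\to\cbc\bin
cbc.exe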
You will find some general information here, where I outlined some approaches.
While this was targeted at Clp, it also applies to Cbc.
As I observed too, it's a bit strange that some libs (zlib) are not statically linked, although that is certainly doable. But as mentioned in that thread, this should not be the case any more (see the note about which files are fully statically linked), so your observation is odd (and you did not say which file you downloaded).
So I would try one of the following (in this order):
Try again with your source, but stick to the master versions (see the first link), as the maintainer only guarantees fully static builds for those!
Use the builds from AMPL
(tested and works for me; generally recommended in terms of quality/stability of builds)
Use the builds from coin-or/pulp, another modelling tool for Python
(tested and works for me; see the sketch after this list)
Compile from source using mingw64
(or use any build and provide the external DLLs for zlib and co yourself -> hard to debug)
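If you go the pulp route, a minimal sketch (the path/attribute details may differ between pulp versions): pip-installing pulp also drops a pre-built cbc.exe inside the installed package, and you can point Pyomo's SolverFactory at that path.
pip install pulp
python -c "import pulp; print(pulp.PULP_CBC_CMD().path)"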
Of course I completely ignored other potential issues:
license stuff (what exactly is part of those builds)
not sure whether a company can afford, legally speaking, to use binaries it did not build itself
version compatibility with Python (does every version of Cbc work?)
Cbc version + configuration (a modern version? compiled with multi-threading?)
...

go module didn't download cgo soft links of dynamic library correctly

The environment is Ubuntu 16.04 64-bit, go version go1.12 linux/amd64.
I am trying to switch my Go project from GOPATH to Go modules. One of the packages my project imports uses cgo to call ffmpeg; the package ships several dynamic ffmpeg libraries, for example libavcodec.so, libavcodec.so.57 and libavcodec.so.57.107.100, where the first two files are soft links.
The problem is that when I go build my project, Go modules only download libavcodec.so.57.107.100; they don't download the two soft link files.
I tried to go get the package and successfully got all the libraries, including the soft link files.
I expected Go modules to download all the C dynamic library files, including the soft links, but I didn't get the soft link files.
Update: I submitted an issue on GitHub, and it seems this behaviour is intentional; see issue #32050.
Go (in modules mode as well as in GOPATH mode) is concerned only with Go source code and will download only Go packages (which might contain C code) but it never installs shared libraries on your system or does other installation work (like creating symlinks).
There is no way you can convince or force the go tool to do what you think it should do.
Install the required shared object files and the necessary symlinks by any other means you find convenient on your system.
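In practice, for the ffmpeg case above, that usually means installing the distribution's ffmpeg development packages (Ubuntu package names shown below; they ship both the versioned libraries and the unversioned .so symlinks cgo needs at build time), or creating the symlink yourself if you ship the versioned .so on your own (the /usr/local/lib path is just an example):
sudo apt-get install libavcodec-dev libavformat-dev libavutil-dev libswscale-dev
sudo ln -s /usr/local/lib/libavcodec.so.57.107.100 /usr/local/lib/libavcodec.so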

Building GCC with MPFR, GMP and MPC

Of course we all know building GCC version >= 4.1.x requires the supplementary packages MPFR, GMP and MPC to be present.
There's a few ways to handle these GCC dependencies:
1) Download and build each supporting package separately and then tell make where the binaries are located during GCC build time.
2) Download each supporting package, untar it and move the source into your GCC source directory; make will then automatically build each of the packages when needed.
(Executing the gcc-src/contrib/download_prerequisites script does the same as option 2)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Is there an advantage to either method? Does pre-compiling the binaries provide something I'm missing by taking the "easy route" and just dumping the package's source into my GCC build directory and letting make figure it out?
I've seen it done more frequently in various build scripts by pre-compiling each package to a binary, and then telling make where they are located during gcc compilation. Is this the "preferred" way to do it? Why?
To add context, I'm mainly building cross-compilers targeting various ARM platforms.
For most use cases I believe that option 2 is just as good as option 1. However, I can see a few situations in which one would want to do it manually.
A package maintainer wants to build separately as they want separate packages for mpfr et al.
Someone who wants to pass different configure arguments/CFLAGS to each of the packages.
A GCC developer who wants to keep their source and build trees small as they don't make any changes to MPFR/GMP/etc.
I haven't done too much work with the (rather ugly) GCC build system, but I haven't seen any obvious differences in how the binaries are built.
I'm not the biggest authority on this though, so YMMV; I may be wrong.
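For completeness, a rough sketch of option 2 for an ARM cross-compiler (target triplet, prefix and configure flags are placeholders; a real cross toolchain also needs binutils and a C library):
cd gcc-src
./contrib/download_prerequisites      # fetches GMP/MPFR/MPC and links them into the source tree
mkdir ../gcc-build && cd ../gcc-build
../gcc-src/configure --target=arm-none-eabi --prefix=/opt/cross --enable-languages=c,c++
make && make install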

wxWidgets and Golang

I want to develop a program in Go with a multi-OS GUI in wxWidgets; my dev environment is WinXP x86.
I wanted to use the wxWidgets Go wrapper wxGo, but the documentation is very succinct and the project seems to have been dead for two years.
I encountered some errors with go get github.com/JeroenD/wxGo and go install github.com/JeroenD/wxGo.
Result of go get github.com/JeroenD/wxGo:
package github.com/JeroenD/wxGo
imports github.com/JeroenD/wxGo
imports github.com/JeroenD/wxGo: no Go source files in C:\Documents and Settings\dell\Mes documents\gopath\src\github.com\JeroenD\wxGo
Result of go install github.com/JeroenD/wxGo:
can't load package: package github.com/JeroenD/wxGo: no Go source files in C:\Documents and Settings\dell\Mes documents\gopath\src\github.com\JeroenD\wxGo
I tried to follow the Building.txt doc from JeroenD's GitHub. The first time, I downloaded and installed wxWidgets; the sample code compiled (with MinGW), but I was not able to compile the wxWidgets library from source. The second time, I downloaded and installed wxPack, which ships the libs/DLLs already compiled. Here, the problem is that I can't compile the wxWidgets samples.
As stated in the Building.txt doc from JeroenD's GitHub, I installed SWIGWIN, but did not compile it from source, since SWIG now supports Go (according to SWIG's documentation). But now I don't know what to do with a wrapper (wxGo), the wxWidgets DLLs/libs, and SWIG. I think I read that SWIG needs *.i files to generate bindings from one language to another, but I can't find any in my wxWidgets folder; perhaps I have to take these files from another wxWidgets binding (wxLua and wxPython have them in their repos).
My goal is just to get the wxWidgets lib working with Go so I can write a multi-OS GUI in wxWidgets. I'm a bit surprised that nobody has posted problems with JeroenD's package, or asked about getting wxWidgets working with Go.
In case you ask: building a GTK GUI with go-gtk is not a solution, as my project needs a multi-OS GUI that looks OS-native (if you know of a multi-OS GUI library that looks native and is simpler to use, please tell me).
According to the Building.txt file:
To build the wxGo library:
cd wx
make install
So, despite this being a Go package, it doesn't seem to use any actual Go code (if you look in github.com/JeroenD/wxGo you won't see any .go files).
I think, in cases like this, you need to use git (instead of go get) and make install instead of go install.
On windows you may want to get the Git for Windows installer to make this process a bit simpler.
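Roughly, assuming the layout described in Building.txt (run from a MinGW/MSYS shell on Windows; the repository layout may have changed since):
git clone https://github.com/JeroenD/wxGo.git
cd wxGo/wx
make install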
Once it's built it looks like you can use
import "wx"
as usual in your go code (minimal example at https://github.com/JeroenD/wxGo/blob/master/example/minimal/minimal.go)
The wxGO wxWidgets wrapper for Go is live here: wxGO MultiOS
QML may be a better choice; it spares you from dealing with the wxWidgets or Qt internals.

missing zlib.dll

I am building a win32 executable. The compiler is the latest version of MinGW. The library dependencies are GLUT and libpng.
I first tested on a Windows 7 machine and had to obtain libpng3.dll and freeglut32.dll. However, on XP, I additionally had to acquire zlib1.dll.
The XP machine was a VM with a fresh install, so I suspect a fresh Windows 7 machine may also lack zlib1.dll.
My question is: how do I go about finding out which DLLs I need to distribute? How do I know, a priori, which dynamic libraries are needed for my program to run on a particular system? I suppose this is what installer programs are for... I'm guessing that what the installer does is look through the system to find out which dependencies are unsatisfied, and then provide them. That way, if I were to distribute my program, I could check whether the user's machine already has zlib1.dll, and skip installing it if it's already in the system directory. However, I never found a document that told me specifically "libpng requires zlib", so until I tested the executable on a machine lacking zlib, I was unaware of this dependency. How can I create my dependency list without having a fresh install of each version of every operating system to test on?
One idea I have is to decompile the executable, or through some method examine the linking process, to find all the libraries that are being linked at runtime. The problem now becomes figuring out which of these are supposed to already be there, and which of them I could be expected to provide in the distribution.
edit: Okay, I looked, and the installation of libpng I downloaded did provide zlib1.dll inside its bin directory. So not including it is pretty much my fault. In any case, Daniel's answer is definitive.
Dependency Walker shows all dependencies of your program.
The correct answer to this question, in my view, is to start at the source rather than to reverse engineer the solution with Dependency Walker, awesome and useful tool though it undoubtedly is.
The problem with Dependency Walker is that it only tells you what one particular run of the program requires on the OS on which you run it. If you have any dynamic loading dependencies in your app then you would only pick those up if you made sure you profiled the app with Dep. Walker and forced it through those dynamic loads.
My preferred approach to this problem is to start with your own source code and analyse and understand what it depends upon. It's often easy enough to do so because you know it well.
You need to understand what the deployment requirements of your compiler are. You usually have the option of linking statically or dynamically to the C++ runtime. Obviously, a dynamic link results in a deployment requirement.
You will also likely link to 3rd party code. One example would be Windows components. These typically don't need deployment; you can take them as already being in place. Sometimes that's not true, e.g. GDI+ on Windows 2000.
Sometimes you will link statically to 3rd party code (again easy), but if you link dynamically then that implies a deployment requirement.
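If you'd rather stay on the command line, MinGW's objdump can list the DLLs an executable links against at load time (it won't show DLLs loaded later via LoadLibrary); myprogram.exe is a placeholder name:
objdump -p myprogram.exe | grep "DLL Name"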
