I have a finished program that I would like to distribute to a colleague, but I can find no de facto toolchain, resource, guide, or even an opinion on the best approach to cross-compilation. I know there is rarely a cookie-cutter solution, but I am still surprised by the lack of information in this regard.
I have begun trying to cross-compile all of the libraries I use (and their dependencies), but of course it is not going smoothly. For reference, I decided to go with the basic instructions here, and I have followed them exactly as written.
Conveniently enough, I needed the png/jpeg/zlib libraries that are used as examples in the link above. I was also able to successfully cross-compile libtiff and leptonica. With that momentum, I moved on to one of the more beastly libraries: Tesseract. At this point, during the ./configure ... step, it says that it cannot find leptonica. I don't understand, as the pertinent leptonica files are installed right where I (and the link above) want them to be. The rub could be rooted in the way Tesseract is built; it differs slightly from the libraries I built previously, going autogen.sh -> configure -> make -> make install. I have no idea how to mitigate this.
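For reference, the kind of invocation I have been attempting looks roughly like this; the install prefix and host triplet below are only illustrative, so adjust them to wherever your cross-compiled libraries actually landed:

# all paths and the host triplet are illustrative
export PKG_CONFIG_PATH=/usr/local/i586-mingw32/lib/pkgconfig
export CPPFLAGS="-I/usr/local/i586-mingw32/include"
export LDFLAGS="-L/usr/local/i586-mingw32/lib"
./autogen.sh
./configure --host=i586-mingw32msvc --prefix=/usr/local/i586-mingw32
make
make install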
So my question: should I continue bothering with this route? The other libraries I use are OpenCV and ImageMagick. If I should, can someone please ease my pain? If I should NOT continue down this route, what is the easiest way to switch to my (very old) Windows XP computer and package up this software there? I doubt some of the software I use would even support development on that platform.
I want the latest version of GCC for Windows.
Right now the latest version is 9.2, but for Windows via MinGW it is just 8.1...
I have tried to build it from source for Windows 10, including under WSL, but have not found out exactly how to do it. I do not want to use it via Cygwin or another emulation layer, just natively on Windows, like Clang and MSVC.
Note: I have the latest version of Windows 10 with WSL.
The latest version of the GCC compiler (9.2.0), combined with the latest MinGW-w64 (7.0.0) headers and libraries, can be found in the standalone build at http://winlibs.com/
Oh the pain, getting a working GCC for Windows.
build your own?
Building is a fun experience, or a no-fun experience, depending on how you look at it.
I've spent literally weeks building GCC, successfully and unsuccessfully (native and cross). Follow the instructions to the letter, and it works. And then, another day, it doesn't (with a slightly different sub-sub-release or revision, or the tiniest change that is entirely "harmless", or, to the best of your knowledge, no change at all), and you never get it to build again.
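"The instructions", for what it's worth, boil down to roughly the out-of-tree build below; the prefix, source directory name, language list, and job count are all illustrative, and the details shift between releases, which is exactly the problem:

# run from next to the unpacked GCC source tree; exact steps vary by release
cd gcc-9.2.0 && ./contrib/download_prerequisites && cd ..
mkdir gcc-build && cd gcc-build
../gcc-9.2.0/configure --prefix=$HOME/gcc-9.2 --enable-languages=c,c++ --disable-multilib
make -j4
make install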
Save everything you've done (copy the console output), keep the build tree, and repeat the build (paste the text) six months later after first doing an svn update. It compiles fine for 15-20 minutes, then fails. Start from scratch, spend a day or two until it works, and you cannot tell why it works now.
Or use a build script from someone who offers binary builds (the assumption being that it must work, otherwise where do the binaries come from?). The build script more or less does exactly what you've done by hand anyway, and it works, or maybe it doesn't. If you are only interested in having a compiler that works for compiling under Windows, and not in spending your life fiddling around, that's not a lot of fun.
use a pre-built binary?
There exist several binary distros from a variety of sources.
Although downloading binaries is of course always a bit risky (even when you scan everything before running it, malware scanners are nowhere near perfect, or even good or halfway reliable), compilers are particularly high-risk. That's because compilers are a very interesting target for malware distributors: they get free redistribution with everything you build.
I've actually seen GCC builds with malware built-in on apparently harmless sites (forgot the name, but one such example was a site offering GCC builds for several architectures, which looked very nice).
Now... there exists a distro which has supported GCC 9.2 for some time, built by someone under the pseudonym "nuwen".
It turns out that "nuwen" is actually Stephan T. Lavavej, so... chances are that this is a distro you actually want to use (I'm using it anyway). It's unlikely that you will be able to build one yourself that's substantially better (that one also comes with a lot of useful support libs), and it's unlikely that it is harmful.
https://nuwen.net/mingw.html
Note that MSYS2 will also allow you to install a very recent GCC (9.1 or 9.2, not sure) via pacman, very quickly and very trouble-free. MSYS2 is nice insofar as you get a 95% working Unix-like environment with 95% of the tools.
And 95% of the time, it works fine in every practical respect. Until, one day, it doesn't, usually related to some configure script messing up pathnames, or something with environment variables. Or something else very subtle. For example, it is very much possible to successfully build GCC with MSYS2 (I've done it), and it works "perfectly fine" until some weeks later you discover that something doesn't work in your custom build, so some old project of yours suddenly doesn't build any more when it did with the old stock compiler.
Those are probably issues that one could fix, if determined (I'm too lazy, however; for me a compiler is something that simply must work).
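For completeness, getting the MSYS2-packaged GCC mentioned above is roughly a one-liner; the package name below is the 64-bit MinGW-w64 flavor and may drift over time, so check pacman's search output:

# from an MSYS2 shell; a 32-bit flavor also exists
pacman -S mingw-w64-x86_64-gcc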
There are two well-known distributions of the GCC bundle for Windows. The first one is by equation.com
http://www.equation.com/servlet/equation.cmd?fa=fortran
and the second one is by winlibs.com
http://winlibs.com/
I want to install a driver for ROS (Robot Operating System), and I have two options: the binary install, or compiling and installing from source. I would like to know which installation is better, and what the advantages and disadvantages of each one are.
Source: a.k.a. source code, usually in some sort of tarball or zip file. This is raw programming-language code. You need some sort of compiler (javac for Java, gcc/g++ for C/C++, etc.) to create the executable that your computer then runs.
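As a trivial illustration of that step (the file and program names here are made up):

# compile a single C source file into an executable
gcc -o hello hello.c
./hello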
Advantages:
You can see what the source code is, which means...
You can edit the resulting program to behave differently.
Depending on what you're doing, when you compile, you can enable certain optimizations that will work on your machine and ONLY your machine (or one EXACTLY like it). For instance, for some sort of graphics rendering software, you could compile it to enable GPU support, which would increase the rendering speed (see the sketch after this list).
You can create a version of the application for a different OS/chipset (see Binary below).
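As a minimal sketch of the machine-specific case (the flags below are generic GCC options; a GPU switch like the one mentioned above would instead be a project-specific ./configure option, if the project offers one at all):

# tune code generation for the CPU doing the compiling; the resulting binary may not run on other machines
gcc -O3 -march=native -o renderer renderer.c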
Disadvantages:
You have to have your compiler installed
You need to manually install all required libraries, which frequently also need to be compiled (and THEIR libraries need to be installed, etc.) This can easily turn a quick 30-second command into a multi-hour project.
There are any number of things that could go wrong, and if you're not familiar with what the various errors mean, finding support online could be quite difficult.
Binary: This is the actual program that runs, i.e. the executable that gets created when you compile from source. Binaries typically have all the necessary libraries built into them, or install/deploy them as needed (depending on how the application was written).
Advantages:
It's ready-to-run. If you have a binary designed for your processor and operating system, then chances are you can run the program and everything will work the first time.
Less configuration. You don't have to set up a whole bunch of configuration options to use the program; it just uses a generic default configuration.
If something goes wrong, it should be a little easier to find help online, since the binary is pre-compiled and other people may be using it, which means you are running the EXACT same program as them, not one optimized for your system.
Disadvantages:
You can't see/edit the source code, so you can't get optimizations or tweak it for your specific application. Additionally, you don't really know what the program is going to do, so there could be nasty surprises waiting for you (this is why antivirus is useful... although LESS necessary on a Linux system).
Your system must be compatible with the binary. For instance, you can't run a 64-bit application on a 32-bit operating system, and you can't run an Intel binary for OS X on an older PowerPC-based G5 Mac (you can usually check what a binary was built for with the file command; see the sketch after this list).
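A quick way to check is the file command (a minimal sketch; the exact output wording varies by platform):

# show what architecture and format a binary was built for
file /usr/bin/ls
# typical output on 64-bit Linux: ELF 64-bit LSB executable, x86-64, ...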
In summary, which one is "better" is up to you. Only you can decide which one will be necessary for whatever it is you're trying to do. In most cases, using the binary is going to be just fine, and give you the least trouble. Sometimes, though, it is nice to have the source available, if only as documentation.
In Wheezy there is a source package for gcc-3.3 which only builds libstdc++5. Close examination shows that the generation of debian/control (from control.m4) can be modified so that the full package is built, which is my goal (legacy project that needs to be built with libstdc++5-dev:i386 and so on, but I want to build it on 64-bit Wheezy).
The question: how do I find out how to tell dpkg-buildpackage to enable building the rest of the package? Or should I just download the source package from the archived Lenny release?
Or is it impossible altogether?
Thanks in advance for any pointers.
Yes, well.
The preferred way would be to port your legacy project to build with a current g++ (4.8).
This would allow your project to run on any Wheezy system (and hopefully on future releases like Jessie...).
If this is not an option, you should first try to download the source package from your target release (Wheezy), modify debian/control to build all the packages you need, and build them.
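The mechanics would look roughly like the standard Debian source-package workflow below; this is only a sketch, since exactly how gcc-3.3 regenerates debian/control from control.m4 is package-specific (check debian/rules for the real target):

apt-get source gcc-3.3
sudo apt-get build-dep gcc-3.3
cd gcc-3.3-*
# edit debian/control (or the control.m4 it is generated from) to re-enable the binary packages you need
dpkg-buildpackage -b -us -uc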
Chances are high that gcc-3.3 and friends are disabled only to guarantee that nobody uses obsolete software anymore (so the Debian people don't have to worry about maintaining gcc-0.1 through 6.66).
As a last resort you could try to get the source package from Lenny and build that.
Chances are high that this will be quite complicated and bug-ridden.
If you plan to still be using your legacy project two years from now, you might be better off starting to port it to recent libraries now.
I recently had to do something like this. What I did was install an old (32-bit) distro as a VM guest (which included gcc-3.4), just to make sure that the project built and worked in "the past" before making changes for current build tools.
I did this mainly because you can be pretty sure that the build tools and environment worked back then, because everyone needed them. Not as many people are using the old tools now, so it's less clear that things will still work. But it could turn out alright.
It's not exactly clear from your question whether you want a 64-bit or a 32-bit version of this legacy software. If you want a 64-bit version, there might be fewer issues if you first port to a modern 32-bit environment, and then to a modern 64-bit environment. At least you'll be able to identify where the bugs are.
Hobbyist and newbie here, so no laughing ;)
I have been developing some toy programs on my Mac for a long time, and everything is nice and straightforward.
I was trying to port one of my existing projects to Windows (as a way to get started in developing for Windows), but I am stuck trying to build, under Windows (and MinGW), the libraries I have come to love in a Unix environment.
At the risk of revealing my naïvety, could someone just run through how to build and install a library on Windows (including any special software required)?
For example, an install readme might look like this:
Do this to install:
./configure
make
make install
Obviously on Windows that pukes...so what are the analogous steps on Windows?
You have a couple of options with regard to building Unix-style libraries on Windows:
Google for a pre-built binary distribution of the library in question made for Windows.
If the library's authors have bothered to support it, you can try installing Cygwin to get a POSIX-like build environment on Windows (a sketch of the usual steps follows after this list).
Some libraries, like OpenSSL, have a set of build instructions for Windows that involve installing ActiveState Perl and then running the appropriate configure script manually.
Where the authors of the library have made no special effort, you are pretty much stuck: create a static library project in the dev environment, add the library's files to it, create (or move) the headers that the ./configure step usually creates or moves, and build it yourself.
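Under Cygwin (or MSYS/MinGW), the steps usually end up looking almost exactly like the Unix ones, give or take the prefix; a sketch, assuming the library has no Windows-specific quirks:

# from a Cygwin or MSYS shell, inside the unpacked source tree
./configure --prefix=/usr/local    # a MinGW build might use something like --prefix=/mingw instead
make
make install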
It is a tragic state of affairs that doing this "simple" task is so hard. Developers seem to take one of two lessons away from this:
Microsoft is the devil. Microsoft hates developers. Hates open source software. And is ###$. Compiling libraries and software from source via a standardized ./configure && make install process is the one true way.
Microsoft is the one true way. The Microsoft ecosystem of pre-built .lib files and headers is perfection, and people who build everything from source using arcane Perl scripts and install into a standardized filesystem are mentally defective.
Your pick :P
I've googled the hell out of it, and it seems like there is no way to install gcc on OS X without installing Xcode (which takes at least 1.5 GB of space). All I need is gcc and none of the other junk that comes with Xcode. And at this point, I'll take any other kind of C compiler.
I know I could simply install Xcode, but that is beside the point, since I have neither my original installation disc nor a fast internet connection.
So... does anyone have any suggestions?
EDIT: Sorry if I was unclear, but I need the headers as well. I'm currently installing gcc4 via fink and it's downloading the shared libraries as well. I'll update on the progress.
EDIT 2: OK, so I successfully installed gcc using fink. BUT it's pretty much useless: "error: C compiler cannot create executables". After googling around, I found that not having Apple's Developer Tools installed is the cause of the error, probably because I need all the libraries, headers, etc. that are only available through Xcode.
Check out the Command Line Tools for Xcode from Apple. It's official support from Apple for installing only the command-line tools.
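On recent OS X versions (10.9 Mavericks and later, if I remember right; older releases use a separate .dmg download from Apple's developer site) you can trigger the install from a terminal:

# pops up Apple's installer for just the command line tools (compiler, headers, make, ...)
xcode-select --install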
Try the osx-gcc-installer on github.
I've been doing this sort of thing for a long time, and I've concluded it's simply never worth it. :-(
The reason is that no one expects you to do such things, so there are assumptions all over the system that "everything" is there. You might not run into this today - or worse, you might not even realize later that this is the cause of your issues.
Instead of wasting your valuable time on things like this, which don't actually produce any working code you can use, follow the approved method: run the download overnight, and spend your time instead on planning and writing the top-level code (you shouldn't need a compiler for that anyway!).
I'm fairly certain that this is not possible. However, I'm also not sure whether you need the whole developer suite to get the developer tools installed. Quite a few tools get installed along with Xcode that might be optional. Still, I think you're out of luck and will need to bite the bullet and use wget or DownThemAll or some other download manager that will let you slowly download the developer tools in chunks.
Whenever I install OS X I install the developer tools as a rule, just because it opens up the world of available software tremendously. Perhaps you should consider doing this in the future as well.
The first thing you want to try is called Pacifist. What Pacifist lets you do is open a large package (such as Xcode) and access parts of it directly. I'm pretty sure you'll be able to find a smaller package inside the Xcode package that has just gcc.
HOWEVER, it's not clear to me that this is the best route. If you are planning to do Cocoa or Carbon development, I strongly suggest installing the entire package, because you will need all the documentation and headers. If you're only planning on doing command-line work, you may still find you need to poke around inside Xcode to identify all the packages you will need: libraries, headers, man pages, and so on.
All in all, you're probably still better off installing the whole thing. If HD space is really tight (because you're on a tiny old iMac, for example), then look at tools like Monolingual, which removes all the international support from the various OS X applications; that can easily reduce the size of an application by 50%.
There's fink and MacPorts, if you want an easy installer/updater.
Install the GCC package from the Packages directory in Xcode's disk image and you'll have just GCC. Note that you won't have the autotools or other standard build tools, for which you will have to install more packages from that folder.
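You can install an individual package from the mounted disk image either by double-clicking it or from the terminal; the .pkg name below is purely illustrative, so look at what is actually in the Packages folder:

# install a single package from the mounted Xcode disk image (package name is illustrative)
sudo installer -pkg /Volumes/Xcode/Packages/gcc4.2.pkg -target /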
I found this while googling around; it appears to install gcc without Xcode.
Install the Command Line Tools separately.
Refer to:
http://osxdaily.com/2014/02/12/install-command-line-tools-mac-os-x
http://osxdaily.com/2012/07/06/install-gcc-without-xcode-in-mac-os-x/
Yes, I could do it with port, but you at least need to accept the Xcode license.