I'm an embedded software engineer working with IA-32 type processors. We are looking for a compiler toolchain, preferably free.
We used to use Mentor Graphics CodeBench Lite but it's no longer available.
We have looked at other GCC distributions, but none of them has a bare-metal implementation of a standard C library - none except newlib, which we can't use because of GPL and LGPL licensing issues. We're an OEM, and our customers (and we) have proprietary code.
Any suggestions welcome.
Sourcery's "lite" GPL tools are still available; it's just that Mentor likes to play hide-the-link.
If you want a lightweight C library with non-GPL licensing, you might look at Bionic from Android.
However, your concern may be mistaken. IANAL, but most C library licenses have a linking exception of some sort, which you may want to research with the help of your lawyers - their utility as system libraries would be extremely limited without one.
And actually, a quick search of the newlib licensing page (which is complicated) seems to show that more of it is under BSD-style licenses than under GPL-style ones, though care would be needed to sort it all out.
Mentor may no longer be providing a Lite edition of the IA-32 bare-metal toolchain, but I'm pretty sure it's still supported in the commercial editions, and a basic license is not that expensive.
As Chris says, the Newlib licensing page is a bit complicated -- but the gist of it is that basically all of it that you need for a bare-metal system is BSD licensed; IIRC, the parts that are GPL-licensed are clearly-delineated system-specific pieces that reference things in the Linux kernel or the like (and thus have to be GPL-licensed), and those aren't included in the bare-metal builds. I think they're even all in one or two distinct directories that you can just delete. Obviously you should do the analysis for yourself, but that's the result you should expect to find.
A shortcut that may be useful: The download page for the most recent version of CodeBench Lite for IA-32 ELF that was produced is on this page. If you download the source tarball from there, you'll get the Newlib sources that were used to build that, and there's also a .sh file in the package indicating how it was configured and built. You'll note that in the documentation (licenses are in the back of the Getting Started Guide) the Newlib binaries are simply listed as BSD-licensed, so this should show you how Mentor got a compiled library that fits that licensing description.
(Disclaimer: I used to work for Mentor until recently.)
I've installed GPS 6.1.1 (20150118) hosted on i686-pc-mingw32, with GNAT GPL 2015 (20150428-49).
It successfully compiles Hello World, but even the release executable is huge, since it includes a statically compiled, unoptimized runtime, and (more importantly) as far as I understand the Ada runtime is licensed under the GPL and cannot be statically linked into a closed-source executable.
How can I configure GPS / gcc to link the runtime dynamically?
This is very close to this question, and the same answer applies.
However, in case you’d prefer to edit your project properties in GPS via the Project > Properties dialog:
go to the Switches tab (on the left hand side)
go to the Binder tab (at the top)
tick the Shared GNAT run time checkbox.
While you’re there, tick the Store call stack in exceptions checkbox too; this can help debugging an unhandled exception (the binder switch is -E).
I will let someone else answer the specific question, which is (IMO) a good one.
Also good are related questions of minimising the runtime size where the full featured runtime is not necessary, as for "Hello World". Comparing the executable size with the memory installed on your platform, one might conclude this is a case of premature optimization. But for bare-bones executables, e.g. on embedded microcontrollers, it is certainly worth asking.
However, there is another implicit question:
How do I divorce my executable from a GPL-encumbered runtime?
and I will answer this.
Historically, the Gnat RTS was not always so encumbered. At one time it featured the "Gnat Modified" GPL (GMGPL), in which the runtime files carried additional permissions beyond the GPL, allowing you to link those components of the RTS with an executable without burdening your executable with the GPL - effectively allowing you to release such an executable under a closed-source license (provided none of its other components were pure GPL).
The Gnat GPL compiler comes with a pure GPL runtime (completely legally) to distinguish it from commercial offerings from the same authors - who have the right to put food on their own table, and their commercial products have an excellent reputation and first class support.
However, there is another fork of the older Gnat compiler, offered by the FSF as part of mainstream GCC, which is kept up to date with modern Ada developments including Ada 2012. In some respects it is ahead of Gnat GPL - in the underlying gcc version, for example - while in some respects it is behind, as newer Ada features take longer to make it into the FSF branch. But the point here is that it inherited the GMGPL license, and then the very similar "Runtime Exception" in GPLv3. The linked "Rationale and FAQ" should let you determine if this satisfies your needs.
If so, you can compile gcc (including Gnat) from source to meet your needs. This is not a trivial project, however! So for most common platforms, you can find pre-built binaries of the FSF Gnat compiler from the imaginatively named getadanow.com.
Disclaimer: I am only pointing out this option. As always with licensing issues, don't take the word of "random guy on the internet" but study the actual licenses of the compiler and RTS you are using and take appropriate legal advice.
I am coding against the WinAPI in MinGW.
One thing I still have not fully understood is the VC redistributable.
I've got a whole pack of questions about it:
Some say that such programs will need msvcrt.dll.
Is the same library needed for both C++ and C compilation?
Is this available on all of my clients' target systems?
Must I redistribute it? Can I redistribute it?
Can I easily get rid of this external dependency?
Is there another compiler that will allow me not to carry such an unpleasant external dependency? (I vaguely remember hearing that something is wrong with it - that it is probably not a core system library, or that it is not free to use and redistribute.)
Something seems wrong here, because I would like to produce small, dependency-free EXEs that only call the system WinAPI, and if I use something like the C standard library functions I would prefer them compiled in economically and statically, with no third-party dependencies.
MSVCRT.DLL contains mostly the C runtime, and MinGW can only use the C part. C++ binary code cannot be used across compilers generally.
It depends on your "target". It has been available since Windows 2000.
No. No. It is Microsoft-proprietary code, and every Windows version ships a slightly different version of it.
No. I am not aware of a mature alternative C run-time DLL.
You do not need to worry about the dependency, as it is available everywhere. (Do notice that it is not really a great run-time, esp. regarding multi-byte characters.)
Microsoft compilers can link with "static" libraries so that the resulting executable depends only on system DLLs like kernel32.dll, user32.dll, etc. MinGW cannot do this (yet).
EDIT: A concise description of the MSVCRT.DLL problem is here.
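If what you ultimately want are the small, WinAPI-only executables described in the question, one workaround (separate from statically linking a CRT, which MinGW can't do) is to build with no C runtime at all and call only Win32 functions. The following is a minimal sketch under some assumptions: the file name and entry-point name are made up, and the exact linker flags and symbol decoration differ between 32-bit and 64-bit MinGW.

    /* tiny.c - hypothetical sketch of a CRT-free, WinAPI-only program.
     * Assumed build line (32-bit MinGW; adjust the entry symbol on 64-bit):
     *   gcc tiny.c -o tiny.exe -nostdlib -lkernel32 -Wl,-e,_start
     */
    #include <windows.h>

    void start(void)                    /* custom entry point; no CRT startup code runs */
    {
        static const char msg[] = "Hello from the WinAPI only\r\n";
        DWORD written;

        HANDLE out = GetStdHandle(STD_OUTPUT_HANDLE);
        WriteFile(out, msg, sizeof msg - 1, &written, NULL);
        ExitProcess(0);                 /* must exit explicitly; there is no CRT to return to */
    }

The price is that nothing from the C library is available (no printf, no malloc), and the compiler may still emit calls to helpers such as memcpy for some constructs, so this approach only suits very small, self-contained tools.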
According to the MS White-paper here:
http://www.microsoft.com/en-gb/download/details.aspx?id=13350
you can redistribute certain parts of the Visual Studio components.
Some software, such as the Microsoft .NET Framework, can be distributed. Components of software products included in MSDN subscriptions that can be distributed (either within an application or as separate files) without royalty are identified in the REDIST.TXT file associated with the product. Components that can be distributed to non-Microsoft platforms are identified in the OTHER-DIST.TXT file associated with the product. Code identified as distributable that has the extension .lib cannot be directly distributed; it must be linked into the application. However, the resulting output can be distributed.
You may also:
Modify and distribute source code and objects for code marked as “sample” or “Code Snippet”.
Distribute the unmodified output of Microsoft Merge Modules for use with an application's .msi file.
Distribute the MDAC_TYP.EXE file containing core data access components (such as the Microsoft SQL Server OLE DB provider and ODBC driver).
Distribute the object version of C++ libraries (Microsoft Foundation Classes, Active Template Libraries, and C runtimes).
MS also produces a redistributable package specifically for the purpose of developers: http://www.microsoft.com/en-gb/download/details.aspx?id=40784
So, to answer your questions:
Yes. Although it is "purely C", it contains fundamental functions that are used by C++ code as well, such as file I/O, time and date functions, math functions, and so on.
Within reason. See link above.
No, yes. As described above: you may choose to just tell customers "you need to download and install this package", but the license should allow you to distribute it free of charge with your product.
Depends on what you call "easily" and exactly what parts of the library your code uses. Some functions may be easy to replace, others not so - but it's not easy in the sense of "yes, just go do http://www.example.com/msvcrt.dll-plugin-replacement" - it would require coming up with some replacement code. The reason MinGW DOESN'T come with its own C library is that it's not entirely trivial to write a replacement for ALL of the windows functionality that you may need here...
See above - if it were easy, someone would have done it. There MAY be some compilers out there that come with their own library, but probably not a free-of-charge and free-to-distribute one (I'm not aware of any product that doesn't rely on MSVCRT.DLL, but it's not impossible that one exists).
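To see where the dependency actually comes from, here is a small sketch; the file name is made up, and it assumes a stock MinGW toolchain in its default configuration, where standard-library calls like these are ultimately serviced by MSVCRT.DLL.

    /* deps.c - ordinary C standard library usage; with a typical MinGW
     * build, calls like these resolve to MSVCRT.DLL at run time. */
    #include <stdio.h>
    #include <math.h>
    #include <time.h>

    int main(void)
    {
        time_t now = time(NULL);              /* time and date functions */
        printf("sqrt(2) = %f at %ld\n",       /* file I/O via stdio */
               sqrt(2.0), (long)now);         /* math functions */
        return 0;
    }

Dumping the resulting executable's PE import table (for example with objdump -p) should show msvcrt.dll listed among the imported DLLs, which is exactly the dependency being discussed.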
I'm new to programming Linux kernel modules, and many getting started guides on the topic include little information about how to build a kernel module which will run on many versions and CPU platforms of Linux. Most of the guides I've seen simply state things like, "Linux doesn't ensure any ABI/API compatibility between versions." However, other OSes do provide these guarantees for major versions, and the guides are mostly targeting 2.7 (which is a bit old now).
I was wondering if there is any kind of ABI/API compatibility now, or if there are any standard ways to deal with versioning other than isolating the kernel-dependent bits of my code into files with a ton of preprocessor directives. (Also, are there any standard preprocessor symbols I should be using in the second case?)
There isn't a stable ABI for the kernel and most likely never will be because it'd make Linux suck. The reasons for not having one are all pretty much documented in that link.
The best way to deal with this is to get your driver merged upstream where it'll be maintained by other kernel developers.
As to being cross-platform, that pretty much comes free with the Linux kernel as long as you only use the standard, platform-independent functions provided in the API.
Linux, the yin and the yang. Tangrs' answer is good; it answers your question. However, there are the Linux compat projects. See the backports wiki. Basically, there are libraries that provide shim functionality for newer Linux ABIs, which you can link your code against. The KERNEL_VERSION macro that Eugene notes is inspected in a compat.h, and the appropriate compat-2.6.38.h, etc. are included, where each version has either macros and/or library functions to provide a forward API.
This lets the Linux Wifi group write code for the bleeding edge kernel, while still making it possible to compile on older kernel versions.
I guess this answers the question,
if there are any standard ways to deal with versioning?
The compat library is not a panacea, but at least it is there and under development.
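As for the "standard preprocessor symbols" part of the question: LINUX_VERSION_CODE and the KERNEL_VERSION() macro from linux/version.h are the usual ones, and they are what compat headers like the above are built around. Below is a minimal, self-contained sketch (the module and message names are made up) of how a compile-time version check typically looks.

    /* compat_demo.c - sketch of compile-time kernel version handling.
     * LINUX_VERSION_CODE describes the kernel headers being built against;
     * KERNEL_VERSION(a, b, c) constructs a comparable value. */
    #include <linux/version.h>
    #include <linux/kernel.h>
    #include <linux/module.h>
    #include <linux/init.h>

    static int __init compat_demo_init(void)
    {
    #if LINUX_VERSION_CODE >= KERNEL_VERSION(4, 0, 0)
        pr_info("compat_demo: built against a 4.x or newer tree\n");
    #else
        pr_info("compat_demo: built against a pre-4.0 tree\n");
    #endif
        return 0;
    }

    static void __exit compat_demo_exit(void)
    {
    }

    module_init(compat_demo_init);
    module_exit(compat_demo_exit);
    MODULE_LICENSE("GPL");

Out-of-tree drivers usually keep such checks in one compat header rather than scattering them through the code, which is essentially what the backports/compat libraries do for you.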
Open source: there are many mutations, and they all have a different plan.
Can anyone tell me what GCC is, and what its advantages are over other compilers like Turbo C and Visual C?
The GNU Compiler Collection is an open source (GPL) compiler. It's found on a wide variety of systems, ranging from GNU/Linux to every flavor of Unix, to Windows.
GCC contains support for many languages (C, C++, Fortran, to name but a few). It's highly portable, and widely used, and tends to produce good code. It can also be used as a cross-compiler (compiling for a system other than the one running GCC).
It's the default compiler choice for most Unix-type systems because most vendors don't bother to write their own compilers anymore - GCC is just too good for general use.
Under Windows, Microsoft's own dev tools are often preferred because they get support for new technologies quicker.
In high-performance programming environments (and some embedded environments) you may want a compiler that's more highly tuned to the chip/system in question.
The GNU Compiler Collection is the set of compilers used on GNU/Linux systems. I don't know that they compete with Turbo C or Visual C, which I think only run on DOS/Windows systems.
The main advantage to a user is that GCC can be installed on (and is sometimes distributed with) nearly every GNU/Linux system and can be used to build packages that are distributed as source.
I'm sure there are advantages that programmers would recognize, but maybe that's a topic for stackoverflow.com.
[Edit]
Now that this question has been migrated, see Michael Kohne's answer for some advantages to programmers.
Big advantage of gcc over Turbo C and Visual C: it's free!*
And it's ubiquitous, especially on the various *nix environments. You can use it on Windows via either cygwin or MinGW. It compiles a truly staggering number of languages (C, C++, Ada, Java, Fortran, Objective-C), and supplies an intermediate language for Haskell.
It has been used for industrial-strength projects for decades now, so you're pretty safe with it.
*(Though, in all fairness, Microsoft does offer Visual C++ Express for free, though it is not open source.)
The real advantage of GCC over Turbo C and Visual C is its availability on platforms other than Windows, and its ability to build cross-platform binaries: you can set up a build toolchain to produce Windows binaries on a Linux box with GCC, or a similar toolchain to build ARM binaries on an Intel box, which is definitely nice since you might not have as much power in your ARM device as you have in your development rig. With the Visual C and Turbo C compilers that's close to impossible.
A nice bonus to all that: GCC is open source and free.
Whereas GCC started as the GNU C Compiler, it is now the GNU Compiler Collection, as it has compilers for C, C++, Java, and Fortran, among others. Why would you want to use it? Because it's better. Better, because:
Once you have written your code using the GCC compilers, you are assured that your code will work on a lot of other OSs/platforms/architectures. You will be able to do a lot more in and with your programs than you would do in Turbo C, which is pretty much tied to Windows.
GCC is used by all of those projects out there in the wild. Having some experience with GCC is definitely a great plus when you are moving to some serious programming.
And it's just good karma :)
The GCC collection is open source. This allows it to be ported to MANY different platforms rather quickly, because so many people work on it and have seen how it works. Commercial compilers are usually closed-source, and as such only one company or consortium can do the porting, which can be time-consuming.
I've found a company that provides GCC-based toolchains for ARM and MIPS, but I don't know how they differ from vanilla GCC. Of course, they bring other software pieces such as Eclipse, but looking only at GCC and Binutils, are they different or just the same?
One big difference between a pre-compiled toolchain (like those provided by Code Sourcery, MontaVista, Wind River, etc) and one built from source is convenience. Building a toolchain from scratch, especially for cross-compiling purposes, is tedious and can be a complete pain. Also, the newest versions of glibc (or uClibc), gcc, and binutils aren’t always compatible as they're developed independently. There are open source tools to make this process easier (like crosstool-NG), but having a proven toolchain that’s been optimized for a certain platform can save a lot of time and headaches. This is especially true at the beginning of a new project. It also helps to have technical support when things go screwy. Of course…you have to pay for it most of the time.
That being said, compiling your own toolchain will most likely save you money and can allow more flexibility down the road. MontaVista, as far as I know, doesn’t include support for older platforms in their newest toolchain releases. For example, if you bought MontaVista Pro 4.X and it included a toolchain with gcc 3.3.X, that’s the toolchain you’re most likely going to be stuck with for the life of your project. Upgrading to a toolchain with gcc 4.X most likely wouldn’t be an option.
Hope that helps.