GCC, binutils, and the minimal supported version of glibc?

How can I find out what the minimal required version of glibc is for GCC or binutils?
Regards.

binutils generally doesn't have a minimal glibc requirement because it contains very little glibc-specific code. It's merely a collection of low-level tools like an assembler, a linker, and objdump, all of which are built on code included in binutils itself.
gcc is a different beast -- it needs to know intimate details about the C library's capabilities. For the specific version of gcc you have, consult the INSTALL/index.html file (and in particular the Prerequisites page) for the requirements.
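As a practical aside, if you want to check which glibc a given toolchain actually builds against, here is a minimal sketch using glibc's own version macros and versioning API (glibc-specific, so it will not build against other C libraries):

    #include <cstdio>
    #include <gnu/libc-version.h>  // gnu_get_libc_version(), glibc-specific

    int main() {
        // Version of the glibc headers this program was compiled against.
        std::printf("compiled against glibc %d.%d\n", __GLIBC__, __GLIBC_MINOR__);
        // Version of the glibc actually loaded at run time.
        std::printf("running against glibc %s\n", gnu_get_libc_version());
    }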

Related

Compiling RISC-V code using the GCC compiler

I am trying to compile RISC-V C code into a hex or binary file. I found out that the GNU GCC compiler might help me.
On the GCC wiki page, under Architectures, there is a list of available targets, where we can find:
Additional processors have been supported by GCC versions maintained separately from the FSF version:
..., ..., ..., RISC-V
Does this mean that it is not supported? That I have to install something else? If yes, what, and how can I do that?
Thanks for any help.

C++11 standard library feature support across compilers

I need to verify whether a specific standard library feature is implemented, and since which version.
For example: std::reference_wrapper
Compilers I need to verify: gcc, clang, msvc
MSVC
I was able to find https://msdn.microsoft.com/en-us/library/bb982605(v=vs.100).aspx, so std::reference_wrapper has been implemented since version 10.0.
clang
Their web page http://libcxx.llvm.org/ says the library is 100% complete. Is it possible to find out in which version each feature was implemented?
gcc
I found http://en.cppreference.com/w/cpp/compiler_support (language support).
Also https://gcc.gnu.org/onlinedocs/gcc-4.6.4/libstdc++/manual/manual/status.html#status.iso.200x
- it seems std::reference_wrapper is implemented there.
But for 4.8.5, for example, https://gcc.gnu.org/onlinedocs/gcc-4.8.5/libstdc++/manual/manual/status.html#status.iso.2011 says:
"This page describes the C++11 support in mainline GCC SVN, not in any particular release."
I'm confused. Can someone clarify that for me?
http://en.cppreference.com/w/cpp/compiler_support is probably your best bet for finding compiler support versions. From there, you'd need to drill down into the standard library release notes for specific implementation versions and details.
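If you also want a build-time check of which standard library you are actually compiling against, here is a minimal sketch (my own, not from any of the pages above) that uses std::reference_wrapper and prints the implementation's version macros; it simply fails to compile where the feature is missing:

    #include <functional>  // std::reference_wrapper
    #include <iostream>

    int main() {
        int x = 42;
        std::reference_wrapper<int> r(x);  // fails to compile if unsupported

    #if defined(_LIBCPP_VERSION)
        std::cout << "libc++ version: " << _LIBCPP_VERSION << '\n';
    #elif defined(__GLIBCXX__)
        std::cout << "libstdc++ date stamp: " << __GLIBCXX__ << '\n';
    #elif defined(_MSC_VER)
        std::cout << "MSVC compiler version: " << _MSC_VER << '\n';
    #endif
        std::cout << r.get() << '\n';  // prints 42
    }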

C++ libs from the Ubuntu 16.04 repo - compiler options

Ubuntu 16.04 comes with GCC 5.4 as the default compiler, which does support C++11, but C++11 is not enabled by default in that particular version of GCC.
My intent is to use some of the binary libraries (not header-only) from their repository (e.g. Boost). In my projects I will enable C++11.
How were the C++ libraries from the repository compiled? Is it possible to use them with C++11 enabled? I know that C++ libraries can be called from different languages (Java, Python, C#, etc.) by hiding all the C++ stuff behind a plain C interface (a sketch of that technique follows below). With Boost that is not the case. If a certain function returns a string or a vector or anything else from the STL, then it is a problem: AFAIK the binary representation of STL objects depends on compiler flags (e.g. -std=c++11).
Thank you.
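For reference, the "plain C interface" technique mentioned above looks roughly like this minimal sketch (all names here are hypothetical):

    // widget.h -- the public C interface; no C++ types cross the boundary.
    #ifdef __cplusplus
    extern "C" {
    #endif
    typedef struct widget widget;            /* opaque handle */
    widget* widget_create(void);
    void    widget_set_name(widget* w, const char* name);
    void    widget_destroy(widget* w);
    #ifdef __cplusplus
    }
    #endif

    // widget.cpp -- the C++ internals stay hidden behind the interface.
    #include <string>
    struct widget { std::string name; };     // never exposed to callers

    extern "C" widget* widget_create(void) { return new widget; }
    extern "C" void widget_set_name(widget* w, const char* name) { w->name = name; }
    extern "C" void widget_destroy(widget* w) { delete w; }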
Which exact libraries are you talking about?
If you are talking about the standard library: libstdc++ is part of GCC. It is always okay to link against it no matter which standard you compile at. GCC also decided to include ABI tags, so that code compiled as C++11 can remain ABI compatible with code compiled as pre-C++11. See for instance TC's really nice answer to a question I asked here:
Is this simple C++ program using <locale> correct?
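As a small illustration of those ABI tags: libstdc++ exposes a macro that selects between its old and new (C++11-conforming) ABIs for types like std::string, and you can print it to see which one a translation unit uses. A minimal, libstdc++-specific sketch:

    #include <iostream>
    #include <string>

    int main() {
        // Defined by libstdc++ headers: 1 selects the new C++11-conforming
        // std::string/std::list ABI, 0 the old reference-counted one.
        std::cout << "_GLIBCXX_USE_CXX11_ABI = " << _GLIBCXX_USE_CXX11_ABI << '\n';
    }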
If by
How were the C++ libraries from the repository compiled?
you mean how all of the C++ libraries in the Ubuntu repositories are compiled, the answer is: it may be different for each one.
For instance, if you want to use libfreetype6-dev or libsdl2-dev: these are C libraries, and they will be okay to link to no matter what standard you target.
If you want to use libsilly-dev from CEGUI, that is a C++ library, and it is usually best to use the exact same compiler for your project and the C++ lib you are linking to. If it appears in the Ubuntu repository, you can assume it was built with the default g++ version that Ubuntu is shipping. If you need to use a different compiler, it's probably best to build the C++ lib yourself -- in general, C++ is not ABI stable across different compilers, or even across different versions of the same compiler.
If you want to use compiled Boost libraries, it's probably best to use the libs they give you with the compiler they give you. If you only use header-only Boost, then the compiler doesn't matter, since you don't actually have to link with anything they built. You then have more flexibility with respect to compilers.
Often, if you need to use C++ libraries, it's best to integrate their build system into yours, so that everything can easily be rebuilt from source and you only have to configure the compiler once. (At least in my experience.) This can save a lot of time when you decide to upgrade compilers later. If you use cmake, this is often feasible, but sometimes it can be hard, especially if you have a lot of C++ dependencies. If you don't use cmake -- well, many libraries use cmake, and it won't be that easy to integrate them this way. cmake is still kind of a pain anyway, so this might not be such a loss.

Questions about necessary steps for building GCC

I have a few questions about the process of building GCC that I was hoping someone could explain to me.
Why is it necessary to unset C_INCLUDE_PATH, CPLUS_INCLUDE_PATH, LD_LIBRARY_PATH, and LIBRARY_PATH?
Why does GCC require MPFR, MPC, and GMP to build? And if old versions (as downloaded with download_prerequisites) are used for the build and newer versions are installed later, which will be used by a compiled program?
Why does GCC require MPFR, MPC, and GMP to build?
I can answer this part. MPFR and MPC are necessary to perform floating-point operations at compile time. In theory MPFR could also be used to parse decimal constants in the source code (GCC developers have said several times that since they depend on MPFR now, they might as well use it for that, but to my knowledge GCC's decimal-to-floating-point conversion still relies on its own code in real.c). Using MPFR also makes it possible to build cross-compilers hosted on a machine that doesn't have floating-point (or has floating-point with different characteristics than the target architecture).
GMP is just a dependency of the other two.
It didn't use to be like this; the dependency on MPFR is a relatively recent change (say, a couple of years).
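To make that concrete, here is a minimal sketch (assuming optimization is enabled) of the kind of folding this enables; GCC can evaluate such constant-argument calls at compile time via MPFR, so the result matches the target's floating-point semantics rather than the host's:

    #include <cmath>

    // With optimization, GCC folds these constant-argument math calls at
    // compile time, using MPFR so cross-compilation stays target-accurate.
    double folded() {
        return std::sin(0.5) + std::sqrt(2.0);
    }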
And if old versions (as downloaded with download_prerequisites) are used for the build and newer versions are installed later, which will be used by a compiled program?
The GMP, MPFR, and MPC libraries are used at compile time only. Any program that has already been compiled was compiled with the versions of these libraries that the compiler used at that time. From the point of view of a compiled program, nothing changes if you update the libraries afterwards.
While I'm here, I think I can explain the other thing as well:
Why is it necessary to unset C_INCLUDE_PATH, CPLUS_INCLUDE_PATH, LD_LIBRARY_PATH, and LIBRARY_PATH?
Because the build process uses these variables for its own purposes, and having them set will mess it up.

Is there an advantage to upgrade Binutils from 2.16.1 to 2.19? Why?

In the PSPSDK (homebrew) we are using binutils 2.16.1 to assemble and link code for the PlayStation Portable; however, that release is getting quite outdated (three versions have superseded it). The community and I have been updating GCC and newlib to the latest stable versions, and everything seems to work with the old binutils.
Will GCC produce better code with binutils 2.19? Why?
Will binutils 2.19 produce better elf files and libs than 2.16.1? Why?
binutils 2.19 has a new ELF linker called gold, which is multi-threaded, written in modern C++, and quite a bit faster than the usual ld linker. I'm not sure, however, about the work involved in adopting it.
Other than that, new versions are generally a good idea: performance improvements and bug fixes are likely to have been included. I would certainly try it, and if something goes wrong you can still step back.
In general, you don't need to upgrade binutils unless you run into some bug fixed in a later binutils version, or need new features (such as linker build-ids).
In particular, GCC code generation is largely independent of binutils (except for constructs like __thread, which require a certain level of support from binutils).
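For illustration, __thread is GCC's thread-local storage extension; emitting it requires the assembler and linker to understand TLS relocations, which is one of the few places where code generation leans on binutils. A minimal sketch:

    #include <iostream>
    #include <thread>

    __thread int counter = 0;  // each thread gets its own copy (GCC extension)

    void bump() {
        ++counter;  // touches only the calling thread's copy
        std::cout << "counter = " << counter << '\n';  // prints 1 in each thread
    }

    int main() {
        std::thread t1(bump), t2(bump);
        t1.join();
        t2.join();
        bump();  // main's copy was still 0 before this call; prints 1
    }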
