Cross-compiling for a target with a lower version of libstdc++/libgcc - gcc

I want to use the latest GCC compiler, but the target PC only has a libstdc++/libc suitable for GCC 4.8.
Is there any way to tell the compiler to link against the older ABI?

I've managed to run my application, built with a newer compiler (GCC 6), when I used the -std=gnu++11 flag. It seems that explicitly specifying the standard version makes the compiler link against no more than the ABI version that standard requires.
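
For reference, GCC 5 and later also have a dual-ABI switch in libstdc++ that controls whether the old (GCC 4.8-compatible) or the new std::string/std::list layout is used. Below is a minimal sketch to check which one a build selects, assuming only the documented _GLIBCXX_USE_CXX11_ABI macro; note that selecting the old layout does not by itself remove dependencies on newer GLIBCXX_*/CXXABI_* symbol versions:

    // Sketch: report which libstdc++ string ABI this translation unit uses.
    // Building with -D_GLIBCXX_USE_CXX11_ABI=0 selects the old, GCC 4.8-compatible
    // std::string/std::list layout.
    #include <iostream>

    int main() {
    #if defined(_GLIBCXX_USE_CXX11_ABI) && _GLIBCXX_USE_CXX11_ABI
        std::cout << "new (C++11) libstdc++ ABI\n";
    #else
        std::cout << "old (pre-GCC-5) libstdc++ ABI\n";
    #endif
        return 0;
    }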

Related

Targeting an older version of libstdc++ with recent GCC when cross-compiling to an embedded-Linux ARM device

We need to find a cross-compilation toolchain for an ARM embedded Linux target that satisfies the following criteria:
Kernel 3.17
GLIBC 2.18
A recent version of GCC, required to compile some third-party code
Those requirements led me to generate a custom cross-compilation toolchain using crosstool-ng. I selected the minimum kernel version and the minimum glibc version, and it seemed to work well until I tried to compile code containing C++.
Because the new GCC uses a more recent libstdc++ than the one available on the target, the executable won't run, and we get an error like this:
/usr/lib/libstdc++.so.6: version `CXXABI_1.3.9' not found
The code compiled fine with an older version of GCC.
Looking at the configuration options for crosstool-ng, I didn't find anything that would let me set a minimum libstdc++ version the way I could for glibc.
Is there a way to target an older libstdc++ version without downgrading GCC?
Can I use the headers and libstdc++.so files from the target to replace the ones GCC is using when cross-compiling?
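
As a quick sanity check for that last idea, a small sketch using only standard GCC/libstdc++ macros can show which libstdc++ headers the cross-compiler is actually picking up (on the target itself, running strings /usr/lib/libstdc++.so.6 | grep CXXABI lists the ABI versions its runtime provides):

    // Sketch: print the header-side version stamps the cross-compiler sees,
    // so you can tell whether it is using its own libstdc++ headers or ones
    // copied over from the target sysroot.
    #include <cstdio>

    int main() {
    #ifdef __GLIBCXX__
        std::printf("__GLIBCXX__      = %d\n", __GLIBCXX__);      // date stamp, e.g. 20150623
    #endif
    #ifdef _GLIBCXX_RELEASE
        std::printf("_GLIBCXX_RELEASE = %d\n", _GLIBCXX_RELEASE); // GCC major release (GCC 7 and later only)
    #endif
        std::printf("__GNUC__         = %d.%d\n", __GNUC__, __GNUC_MINOR__);
        return 0;
    }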

AVX and newer intrinsics with GCC on Mac; what assembler would one need?

I have been tweaking GCC 6.3.0 to get it to use the libc++ runtime instead of libstdc++ so that g++ can be used without worrying about C++ runtime incompatibilities:
https://github.com/RJVB/macstrop/tree/master/lang/gcc6
The tweak works: I can build and run KDE software using g++ against Qt5 and KF5 frameworks (and everything else) built with various Clang versions.
What doesn't work is generating code that uses AVX and presumably most or all newer intrinsic instructions.
This is not a new issue; it has come up here before and is answered, for instance, in: How to use AVX/pclmulqdq on Mac OS X
Evidently one can configure GCC to call the script linked in that answer instead of the actual as executable.
But can GCC not be configured to use another assembler altogether, like nasm, and would that solve this issue?
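
For context, here is a minimal sketch of the kind of code that runs into this; the compiler handles it fine, but the old GNU as that Apple ships rejects the AVX encodings. The build line in the comment is an assumption (g++-6 as the compiler name), and -Wa,-q is the commonly cited workaround of forwarding -q to Apple's as so that it delegates to the Clang integrated assembler:

    // Sketch: a tiny AVX intrinsics program; it is the assembler, not the
    // compiler, that chokes on the AVX instructions with the stock GNU as.
    // Possible invocation (assumed names/flags):  g++-6 -mavx -Wa,-q avx_test.cpp
    #include <immintrin.h>
    #include <cstdio>

    int main() {
        __m256 a = _mm256_set1_ps(1.5f);
        __m256 b = _mm256_set1_ps(2.0f);
        __m256 c = _mm256_add_ps(a, b);   // needs an assembler that accepts AVX

        float out[8];
        _mm256_storeu_ps(out, c);
        std::printf("%f\n", out[0]);      // expected: 3.500000
        return 0;
    }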

g++ (tdm-1) 4.7.1 doesn't support all C++11 features

g++ (tdm-1) 4.7.1, which comes with Code::Blocks for Windows, is supposed to support all C++11 features, but std::stoi(str) isn't recognized, and the same goes for other C++11 functions (the <string> header is included).
Do I need to look for another compiler?
This is due to missing C library functions in MinGW; see the last few comments on https://gcc.gnu.org/bugzilla/show_bug.cgi?id=37522
I made some improvements, so it is supported in MinGW GCC 4.9 and later; you could just upgrade to a later TDM build.
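
If upgrading is not immediately possible, one stopgap is to wrap the C library conversion routines yourself. A rough sketch, where the helper name fallback_stoi is made up for illustration and strtol is the underlying conversion (which MinGW's C runtime does provide):

    // Sketch: a stand-in for std::stoi on toolchains where it is missing.
    #include <cerrno>
    #include <climits>
    #include <cstdlib>
    #include <stdexcept>
    #include <string>

    int fallback_stoi(const std::string& s, std::size_t* pos = 0, int base = 10) {
        errno = 0;
        char* end = 0;
        long v = std::strtol(s.c_str(), &end, base);
        if (end == s.c_str())
            throw std::invalid_argument("fallback_stoi: no conversion");
        if (errno == ERANGE || v > INT_MAX || v < INT_MIN)
            throw std::out_of_range("fallback_stoi: out of range");
        if (pos)
            *pos = static_cast<std::size_t>(end - s.c_str());
        return static_cast<int>(v);
    }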

Is codecvt not supported by Clang or GCC?

I can't even get the basic codecvt example from cppreference.com to compile on GCC 4.9 or Clang 3.4, e.g.:
http://goo.gl/HZ5GLH
http://coliru.stacked-crooked.com/a/345d6d89949ac1e6
According to the libstdc++ manual, section 22.4.1, it is missing support for codecvt, even in the latest version.
libc++, on the other hand, has complete support for C++11 and C++14 features, so you should use it with Clang by specifying the -stdlib=libc++ compiler flag (make sure you have it installed).
Edit: As of current versions of libstdc++, codecvt is now supported.
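
For reference, here is a small <codecvt> round trip in the spirit of the cppreference example (not a verbatim copy of it); build with -std=c++11, plus -stdlib=libc++ on Clang if you follow the advice above:

    // Sketch: UTF-8 <-> UTF-32 round trip through std::wstring_convert.
    #include <codecvt>
    #include <iostream>
    #include <locale>
    #include <string>

    int main() {
        std::wstring_convert<std::codecvt_utf8<char32_t>, char32_t> conv;

        std::string utf8 = "z\xC3\x9F\xE6\xB0\xB4";            // "zß水" as UTF-8 bytes
        std::u32string utf32 = conv.from_bytes(utf8);          // UTF-8 -> UTF-32
        std::cout << "code points: " << utf32.size() << '\n';  // expected: 3

        std::string back = conv.to_bytes(utf32);               // UTF-32 -> UTF-8
        std::cout << "round trip ok: " << (back == utf8) << '\n';
        return 0;
    }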

Forcing G++ (GCC) to a specific libstdc++ version (GLIBCXX_*)

I'm trying to build a binary with GCC 4.9.0 that is backwards-compatible with older versions of libstdc++. According to GCC's ABI Policy and Guidelines and Options Controlling C++ Dialect, the command-line option -fabi-version should do the trick; however, no matter which version I set, I still get imports of symbols from a version newer than desired, like this:
$ objdump -T binary | grep GLIBCXX_3.4.20
00000000 DF *UND* 00000000 GLIBCXX_3.4.20 _ZSt24__throw_out_of_range_fmtPKcz
I've tried -fabi-version=1 to -fabi-version=5 (ABI version 5 corresponds to GCC 4.6, which is guaranteed to be present on the target system), but those imports keep winding up in the resulting files.
How do I fix this? Going back to an old GCC version is not an option for me, for other reasons.
the command line option -fabi-version should do the trick
No, that's completely unrelated to what you want. That option affects the code generated by the compiler; it does not mean you can link to an older version of libstdc++ (which is what you would need in order to stop depending on symbols in the newer libstdc++).
You cannot link to an older libstdc++ with a new GCC. The version of libstdc++ is tightly coupled to the version of GCC, so if you want to link to an older libstdc++ then you need to compile with an older GCC.
You cannot tell libstdc++ not to use the new symbols; it depends on them because it needs them. Use an older libstdc++.
Going back to an old GCC version is not an option to me for other reasons.
Then you're screwed.
You either need to use an older GCC, or not link dynamically to libstdc++.so.
On Red Hat Enterprise Linux or CentOS you would have the option of using a newer GCC from the Developer Toolset, which avoids linking to the new libstdc++.so, but that is only compatible with the system GCC, which is GCC 4.4 for RHEL6 or GCC 4.7 for RHEL7. You can't use it to be compatible with GCC 4.6.
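
As a sketch of the second route (not linking dynamically to libstdc++.so): the build and check commands in the comments are illustrative (the source file name is made up), while -static-libstdc++ and -static-libgcc are standard GCC driver options.

    // Sketch: on GCC 4.9, an out-of-range std::vector::at() is one of the
    // things that can pull in the GLIBCXX_3.4.20 symbol
    // _ZSt24__throw_out_of_range_fmtPKcz seen in the question.
    // Possible build/check (illustrative):
    //   g++ -std=c++11 -static-libstdc++ -static-libgcc range.cpp -o range
    //   objdump -T range | grep GLIBCXX   # the 3.4.20 import should no longer appear
    #include <iostream>
    #include <stdexcept>
    #include <vector>

    int main() {
        std::vector<int> v(3, 42);
        try {
            std::cout << v.at(10) << '\n';               // deliberately out of range
        } catch (const std::out_of_range& e) {
            std::cout << "caught: " << e.what() << '\n';
        }
        return 0;
    }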
