On Windows, using CodeLite (with the gcc compiler), I wrote a simple program and wanted to build it into a DLL.
After I built the project, no DLL came out.
Then I ported the program to Dev-C++, built it, and the DLL came out successfully.
Why didn't CodeLite work?
Is it because I chose gcc?
If anyone else is having issues with creating/linking libraries with CodeLite on Windows, ensure that the output filename does not contain the .so extension. By default, CodeLite on Windows uses the Unix shared object (.so) extension, so change it to dynamic-link library (.dll) in your project options.
<_<
I spent a very annoying three hours playing with configurations and pulling my hair out over why the linker (ld) could not find my compiled libraries. Time for some much needed sleep... zzzz
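For reference, once the output extension is set to .dll, a minimal sketch along these lines should produce a working library with MinGW gcc (the file name, function name, and exact command here are only illustrative):

/* mylib.c - hypothetical example of an exported function */
__declspec(dllexport) int add_numbers(int a, int b)
{
    return a + b;   /* trivial body, just so the DLL exports something */
}

built with something like: gcc -shared -o mylib.dll mylib.c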
CodeLite uses gcc/g++ as its compiler.
Most likely, the gcc/g++ it found won't work as you expect on your Windows OS.
Use MinGW or Cygwin; they will work as you expect if you stick to gcc/g++.
I am building Go code that uses CGo heavily and this code must be compiled into a shared or static library (static is highly preferred). (code for reference)
It all works just fine on Linux and Mac, but on Windows it fails at the linker stage, either saying that all four modes (c-shared, shared, c-archive, archive) are not available, or, if I invoke go tool link -shared manually, complaining about missing Windows-specific instructions.
My understanding is that all I need to build a usable lib.a is to compile everything I will use into object files (*.o) and then run them through ar to produce a usable static library.
Now the question is whether I can completely skip Go's linker and create the .a manually from the prepared .o files.
How would I go about doing that, if it is even possible?
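To make that concrete, the manual step I have in mind would be roughly (names are placeholders):

ar rcs libmylib.a file1.o file2.o

whereas the supported route, where the buildmode is available, would be:

go build -buildmode=c-archive -o libmylib.a ./mypkg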
It looks like gcc on Windows is unable to automatically discover the necessary shared libraries. The problem was caused by gcc and not by Go.
Although, for compiling Go, I had to use a self-compiled master tip, as the current release (1.6.2) does not support shared/static libraries on windows/amd64.
Manually feeding gcc each shared library (ntdll, winmm, etc.) in its default location (C:\Windows\SysWOW64) fixed the problem.
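For anyone hitting the same thing, the link step ended up looking roughly like this (the file names and the set of DLLs are illustrative, not the exact command):

gcc main.c libmylib.a -o app.exe C:\Windows\SysWOW64\ntdll.dll C:\Windows\SysWOW64\winmm.dll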
I have installed MinGW on my Windows 7 machine, using instructions from here. Basically I used the GUI installer assistant called mingw-get-setup.exe. The installation manager allowed me to select a package called mingw32-libpthreadgc, which installs bin/pthreadGC2.dll and bin/pthreadGCE2.dll.
To my knowledge that is not sufficient to compile code depending on the pthread library. E.g. trying to compile a file with a header include like #include "pthread.h" results, to no surprise, in a "file not found" compile error. I can't find that header in my MinGW directory. If I use includes/headers from elsewhere, I'm afraid they might not match the DLLs' interface. How is this meant to work?
(Furthermore, I'd like to use Code::Blocks as the IDE. How would I set up a simple "HelloWorld"-like pthread program to get it all to work? There seem to be a lot of conflicting messages out there on how to set it up. Use "-pthread" vs. "-lpthread"? Set it in compiler and linker settings, right? Copy-paste the DLLs? What else?)
I had a similar problem; https://www.sourceware.org/pthreads-win32/ did the job for me.
I used it in combination with mingw32. It also has a nice README file.
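For the "HelloWorld" part of the question, here is a minimal sketch, assuming the pthread header and import library from pthreads-win32 (or MinGW's winpthreads) are installed where the compiler can find them:

#include <stdio.h>
#include <pthread.h>

/* thread entry point: just prints a message */
static void *worker(void *arg)
{
    printf("hello from the thread\n");
    return NULL;
}

int main(void)
{
    pthread_t t;
    pthread_create(&t, NULL, worker, NULL);  /* start the thread */
    pthread_join(t, NULL);                   /* wait for it to finish */
    return 0;
}

Compile with something like gcc hello.c -o hello -pthread (or -lpthread if your toolchain does not accept -pthread). In Code::Blocks that means adding the flag to both the compiler's and the linker's "other options", and keeping the runtime DLL (e.g. pthreadGC2.dll) next to the executable or on the PATH.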
I'm working on a project focusing on the MIPS32 arch (little endian). The vendor gave me a GNU toolchain to compile my project targeting their embedded Linux version and everything works just fine. It's a GCC+Linux+uClibc toolchain.
However, recently I've needed to add some features to my uClibc build, so I've tried to replicate the vendor's toolchain on my own box.
Everything worked fine with the help of crosstool-ng, but when I try to compile my project, I get strange linker warnings all over the place:
warning: linking abicalls files with non-abicalls files
From what I've researched, these are pretty serious warnings. Analyzing my object files with readelf gives me almost identical output. There is no .abicall section anywhere in those files. This holds true both for my project's object files and for my toolchain's.
What could be wrong here? I don't even know where to start debugging this.
I am attempting to build a multiplatform desktop application using wxWidgets. As the IDE I am using CodeLite. Version info:
CodeLite: Revision 5770
wxWidgets: 2.9.4
OS: Windows 7
Compiler: g++
The problem is, after compiling, trying to start the program gives an error telling me wxbase294u_gcc_cl.dll is missing. I thought maybe it's a debug-library thing, so I set the build configuration to release, but I still get the same error.
My understanding was that wxWidgets builds natively for the OS, so it shouldn't depend on such a large DLL. The DLL exists in the libraries installed by CodeLite, but the system does not seem to pick it up.
Am I supposed to build using VC++? I'm not sure how to set that in the build settings.
I've been a web-app guy for a long time and am new to cross-platform development, so help me out if I'm going the wrong way.
Thanks in advance.
The wxWidgets library code must be linked to your application code. This can be done in two ways: (a) using static libraries, which are linked into your application executable when it is built, or (b) using DLLs, which are linked to your application when it runs.
From your question, you have built your application to use DLLs.
You have two options to fix this problem. The easiest will be to copy the required DLLs into your application folder.
You can also change the CodeLite options to use wxWidgets static libraries; you will need a CodeLite expert to help you do that.
I took a quick look at the CodeLite webpage. It does look like CodeLite uses wxWidgets DLLs by default. To change this, you will need to build wxWidgets the "DIY" way as described here, but set the SHARED make option to 0.
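If I remember correctly, for a MinGW build that is done from wxWidgets' build\msw directory with something along the lines of:

mingw32-make -f makefile.gcc BUILD=release SHARED=0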
I'm trying to migrate a project which uses Boost (particularly boost::thread and boost::asio) to VxWorks.
I can't get Boost to compile using the VxWorks GNU compiler. I figured that this wasn't going to be an issue, as I'd seen patches on the Boost Trac that purport to make this possible, and since the VxWorks compiler is part of the GNU toolchain I should be able to follow the directions in the Boost docs for cross-compilation.
I'm building on Windows for a PPC VxWorks target.
I changed the user-config.jam file as specified in the Boost docs, and used the target-os=linux option to bjam, but bjam appears to hang before it can compile. Closer inspection of the commands issued by bjam (by invoking it with the -n option) reveals that it's trying to compile with boost::thread's win32 files. This can't be right, as VxWorks uses pthreads.
My bjam command: .\bjam --with-thread toolset=gcc-ppc target-os=linux
gcc-ppc is set in user-config.jam to point to the g++ppc VxWorks cross-compiler.
What am I doing wrong? I believe I have followed the docs to the letter.
If it's #including win32 headers instead of the pthread ones, there could be a discrepancy between the set of macros your compiler is defining and the macros the boost headers are checking for. I had a problem like that with the smart pointer headers, which in an older version of boost would check for __ppc but my compiler defined __ppc__ (or vice versa, can't remember).
touch empty.cpp
ccppc -dD -E empty.cpp
That will show you what macros are predefined by your compiler.
I never tried to compile boost for VxWorks, since I only needed a few of the headers.
Try also adding
threadapi=pthread
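With that, the full invocation from your question would look something like:

.\bjam --with-thread toolset=gcc-ppc target-os=linux threadapi=pthread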
The documentation you mention is for Boost.Build (a standalone build tool), and the above flag is specific to the Boost.Thread library. What do you mean by "hang"? Because the Boost libraries are huge, it sometimes takes a lot of time to scan dependencies prior to building.
If it actually hangs, can you catch bjam in a debugger and produce a backtrace? Also, a log of any output will help.