How do I compile .rc files when building with CL.exe? - visual-studio

I'm running cl for my build phase. How do I compile embedded resources into the final executable?

Run rc.exe to compile the .rc script into a .res file, then pass that .res to the linker to get it embedded in the final image.
There are precious few reasons not to just let the IDE take care of this, by the way. It does a lot of other stuff automagically, like getting the proper manifest embedded, creating a debug build that helps you diagnose uninitialized variables and stack corruption, and supporting Edit and Continue. But yes, you can do this too if you know all the command line switches. The best way to find them is to build some sample projects with the IDE and look at the build log files to see what's being used.
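For illustration, a minimal sketch of the whole sequence from a Developer Command Prompt (the file names app.rc, main.cpp and app.exe are placeholders, not anything from your project):

rem compile the resource script into a binary .res file
rc /fo app.res app.rc
rem compile the C++ sources without linking
cl /c /EHsc main.cpp
rem pass the .res to the linker so the resources get embedded in the executable
link main.obj app.res /OUT:app.exe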

Related

Cannot build EmulationStation (VS2015) from CMake solution file

I'm having difficulty trying to compile an open-source framework (EmulationStation) in VS2015 on Windows. I've never used any of these tools before, apart from Visual Studio, so please forgive me if I'm making some obvious mistakes.
The guide says I need to do it like this:
Boost (you'll need to compile yourself or get the pre-compiled binaries)
Eigen3 (header-only library)
FreeImage
FreeType2 (you'll need to compile)
SDL2
cURL (you'll need to compile or get the pre-compiled DLL version)
(Remember to copy necessary .DLLs into the same folder as the executable: probably FreeImage.dll, freetype6.dll, SDL2.dll, libcurl.dll, and zlib1.dll. Exact list depends on if you built your libraries in "static" mode or not.)
CMake (this is used for generating the Visual Studio project)
(If you don't know how to use CMake, here are some hints: run cmake-gui and point it at your EmulationStation folder. Point the "build" directory somewhere - I use EmulationStation/build. Click configure, choose "Visual Studio [year] Project", fill in red fields as they appear and keep clicking Configure (you may need to check "Advanced"), then click Generate.)
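If you prefer the command line over cmake-gui, the equivalent is roughly the following (the build directory is just an example and the generator name has to match your Visual Studio version):

cd EmulationStation
mkdir build
cd build
rem generate a Visual Studio 2015 solution into the build directory
cmake -G "Visual Studio 14 2015" ..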
This is how my CMake configuration looks (it says "Generating done"):
I get a lot of compilation errors in Visual Studio when trying to build, though:
1) Cannot open include file: 'curl/curl.h': No such file or directory (compiling source file C:\Users\retropie\Documents\GitHub\EmulationStation\es-app\src\guis\GuiMetaDataEd.cpp) emulationstation C:\Users\retropie\Documents\GitHub\EmulationStation\es-core\src\HttpReq.h
Where do I get this header file from?
2) 'round': redefinition; different exception specifications (compiling source file C:\Users\retropie\Documents\GitHub\EmulationStation\es-app\src\guis\GuiMenu.cpp) emulationstation C:\Users\retropie\Documents\GitHub\EmulationStation\es-core\src\Util.h 18
I have a lot of these errors with round. Am I missing a reference to a library?
Another screenshot of some of the errors from VS2015:
Hope someone can point me in the right direction.
I am currently in the same boat as you, trying to get ES building under MSVS2015.
I am also very green, so hopefully others chime in as well.
Regarding the 'round' errors: apparently the MS compiler has no knowledge of these. This issue, and some others, has been fixed in the newer ES fork by Herdinger.
As this is currently the most active ES branch out there, and has the explicit goal of consolidating at least some of the backlog of PRs from the original Aloshi git, I would suggest you use this one.
In issue #4, there is some more information on building in recent VS versions. There is also a link for the precompiled cURL libs, including the header.
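Once you have them, you point CMake at the header and the import library when configuring. Just as a sketch, with placeholder paths for wherever you unpacked the cURL package:

rem re-run the configure step with the cURL locations filled in
cmake -DCURL_INCLUDE_DIR=C:/libs/curl/include -DCURL_LIBRARY=C:/libs/curl/lib/libcurl.lib ..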
Having gone that far, I am sad to say that I still do not have a successful build as of yet. Compiling is no problem; however, linking gives me a LNK2005 error.
Hope this helps a bit. Let me know how you fare.

Error Linking Boost Libraries With Quantlib

I am trying to build Quantlib using Boost Libraries.
I followed the instructions here: and also on the Quantlib website.
I downloaded and unzipped boost_1_57_0 into C:\program files
I then used the Visual Studio 2013 x64 Native prompt to go to the boost directory and ran
bootstrap.bat
and then
b2 --toolset=msvc --build-type=complete architecture=x86 address-model=64 stage
Then I opened Quantlib_vc12.sln in Visual Studio 2013.
Picked "Release" and "x64", opened "Quantlib" in Property Manager and set the VC++ Directories.
In the include directories I added C:\Programm Files\boost_1_57_0
In the Library Directories I added C:\Program Files\boost_1_57_0\stage\lib
Then I went to the Solution Explorer and right clicked and chose build.
I got one LNK1104 error.
LNK1104: cannot open file 'libboost_unit_test_framework-vc120-mt-1_57.lib
Please see attached screenshot:
I have no idea how to fix this and I would really appreciate some help. I had successfully installed this at work using an admin account, but was not able to access Quantlib using my user account. I have since deleted and attempted the installation at least 15 times, but it's not working. I am worried that all these attempts at installing may have messed something else up, like some registry setting (I have no idea how that works, I only know to be afraid). Please help! Thanks.
UPDATE: I still get the same error after adding the BOOST_AUTO_LINK_NOMANGLE define to the project.
UPDATE2: I am getting these messages on the screen while running b2 to build boost. Is this an error I need to fix?
This is exactly what I warned you about in another related question/answer. What's happening here is that the boost headers you are including in QuantLib are (through macros) detecting that you're using MSVC, detecting the version, then automatically linking the required library files to build QuantLib using #pragma comment(lib, ...). So even though there are no external DLLs or libs specified in the project's linker settings, they're still being linked by these pragma statements.
So when these macros detect your compiler and so on, they dynamically build a string name of what they think the required libraries would be named on your system. Remember, when you built boost, you specified the -layout option. This sets the naming layout of your boost libraries. Well, by default that layout is something like this:
LIB_LIBRARY_NAME_COMPILER_VERSION_SingleOrMultiThreaded_BOOST_VERSION.LIB
Which in practice looks like this:
libboost_unit_test_framework-vc120-mt-1_57.lib
This is boost "mangling" the name of your library to be as descriptive as possible about how the libraries were built, so that you know just by glancing at the file name. What we do with -layout=system is tell the boost build system NOT to mangle the names, but to name them according to the option we gave to "layout". Since we chose layout=system, boost is going to name our libraries like this:
LIB_LIBRARY_NAME.LIB
Which in practice will produce:
libboost_unit_test_framework.lib
So when we start using boost after doing this (this only happens with MSVC), these dynamically generated linker statements don't know or care what -layout option you built boost with. They will attempt to link in the required libraries using the fully mangled naming format, which is why you get the error:
cannot open file 'libboost_unit_test_framework-vc120-mt-1_57.lib
.. because you don't have a file named that! That's the mangled name! You have a file named libboost_unit_test_framework.lib. See the difference! So, you need to tell these stupid macros to stop mangling the library names when auto-linking required libraries. You do that by adding the following preprocessor definition to your Quantlib project:
BOOST_AUTO_LINK_NOMANGLE
You add that in Project Settings -> C/C++ -> Preprocessor -> Preprocessor definitions.
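If you ever build from the command line with cl.exe instead of through the IDE, the equivalent (just a sketch; the source file name is a placeholder) is passing the define directly:

rem same effect as the project-level preprocessor definition
cl /c /EHsc /DBOOST_AUTO_LINK_NOMANGLE quantlib_user.cpp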
If you'd rather avoid this headache and don't mind the long and (imo ugly) mangled names that boost gives its libraries, you can build boost omitting the -layout option; it will default to this mangled naming convention and you shouldn't get stuck on this error at all anymore. I personally put in the effort to keep nice short/clean library names, but it's all about preference.
Edit
Since you still have the same error after fixing the NO_MANGLE problem, the only possible reason you're getting this particular link error is that the file the linker is complaining about is not stored in any of the directories supplied to the linker.
Verify the folders/paths you provide to the linker, and make sure the file the linker is looking for actually exists in one of those directories. You have to provide directories to the linker because you're telling it "you can look in all of these places for the libraries my project needs". If you specify none, it's got nowhere to look. :(
Example:
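From a command prompt you can quickly check whether the file the linker wants is actually where you think it is (the path below just assumes the location from your question):

rem list the unit test framework libraries in the boost stage directory
dir "C:\Program Files\boost_1_57_0\stage\lib\libboost_unit_test_framework*.lib"

If nothing shows up there, the linker will not find it either, no matter which directories you add.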

How to install and use open source library on Windows?

I'd like to use an open-source library on Windows (for example Aquila, following http://aquila-dsp.org/articles/iteration-over-wave-file-data-revisited/). But I don't understand anything about "build systems"... Everyone just says things like "unzip the tar, run configure, make, make install" on Linux, but I want to use these libraries on Windows. I have several questions.
i) Why do I have to "install" what is just source code? Why can't I just copy the header files to the working directory and write #include ".\aquila\global.h"?
ii) What are configure and make/make install? I can't understand them. I just know that configuring open source code on Windows needs "CMake", and that it is a configuration tool... but what does it actually do?
iii) Even though I've run cmake, mingw32-make, and mingw32-make install, my compiler says "undefined reference to ...". What does this mean and what should I do about it?
You don't need to install the source code itself. You do need to install the libraries that get built from that source code and that your code is going to use.
configure is the standard name for the script that does build configuration for the software about to be built. The usual way it is run (and how you will see it mentioned) is ./configure.
make is a build management tool (as the tag here on SO will tell you). One of the most common mechanisms for building code on Linux (etc.) is the autotools suite, which uses the aforementioned configure script to generate build configuration information for generated makefiles, which make then uses to build the software. make is also how you run the default build target defined in a makefile (which is often the all target and which usually builds the appropriate library/binary/etc.).
make install is a specific, secondary, invocation of the make tool on the install target which (generally) installs the (in this case previously) built code into an appropriate location (in the autotools/configure universe the default location is generally under /usr/local).
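On Linux, the canonical sequence for those three steps typically looks like this:

./configure      # generate makefiles configured for this system
make             # build the default target (usually "all")
make install     # copy the built headers/libraries to the install prefix (e.g. /usr/local)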
cmake is, again as the SO tag says, a build system that generates configuration files for other build tools (make, VS, etc.). This allows the developers to create the build configuration once and build on multiple platforms/etc. (at least in theory).
If running cmake worked correctly, then it should have generated the correct information for whatever target system you told it to use (make or VS or whatever). Assuming that was make, it should have allowed mingw32-make to build the software correctly (assuming additionally that mingw32-make is not a distinct cmake target from make). If that is not working correctly then something is still missing from your system (and cmake probably should have caught that).
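For a MinGW-based setup, the whole flow for a library like Aquila would look roughly like this (directory names are only examples):

cd aquila
mkdir build
cd build
rem generate MinGW makefiles rather than a Visual Studio solution
cmake -G "MinGW Makefiles" ..
rem build, then optionally install to the configured install prefix
mingw32-make
mingw32-make install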
But to give any more detail you will need to give more detail about what errors you are actually getting and from what command.
(Oh, and on Windows, especially if you plan on building your software with VS (or some other non-mingw32-make tool), the chances of you needing to run mingw32-make install are incredibly small.)
For Windows, use CMake (or the latest Ninja).
The process is not simple or straightforward, but it is achievable. You need to write a CMake configuration.
The build process is not simple and straightforward; that's one reason languages like Java exist (but that's another topic).
Rely on CMake to build the library, and you will get the open-source library built for Windows.
You can distribute this as a library for Windows systems, or distribute and integrate it with your own software that includes the open-source library; in either case, you have to build it for Windows.
Writing CMake helps; it also makes it easier to build for other platforms.
Now the question comes: is there any other way except CMake for a Windows build?
Would you love the flavor of writing everything directly in assembly?
If the answer is obviously no, you will have to write CMake and generate a .sln for MSVC and other compilers.
Just fix the errors as they come; read the FAQ and documentation before building an open-source library, and fix the errors as they turn up.
It is like handling burning iron, but it pays off if you're working on something meaningful. Most server libraries are open source (e.g. the age-old Apache httpd). So think about what you're doing.
There are not many useful open-source libraries that you can use in a project without building them, but this is the way to use open-source libraries.

How to compile OpenCV 2.4.5 with Visual Studio 2010 and OpenCL support?

I'm having a lot of trouble compiling OpenCV 2.4.5 with GPU support. With some effort I managed to get CUDA support up and running, but now I am stuck on OpenCL. Here is the problem:
At some point during the compilation, the file kernels.cpp is generated, containing all kernel functions as strings. From what I understand, they are converted automatically from the .cl files with the cl2cpp.cmake script.
What I don't understand is that one file is excluded from the build: nonfree_surf.cl (which is on my disk, alongside all the other .cl files) is not included, either in the Visual Studio project or in the automatically generated kernels.cpp. This leads to an undefined symbol error at link time.
I have tried manually adding nonfree_surf.cl to the Visual Studio project. This does not change anything. In the CMakeLists.txt for the ocl module, all the .cl files seem to be added automatically with the line:
file(GLOB CL_FILES "${CMAKE_CURRENT_SOURCE_DIR}/src/kernels/*.cl")
I have tried manually adding nonfree_surf to CL_FILES, with no effect.
I have very little knowledge of CMake, so I don't understand what is going on. Can somebody give me a clue how to find the reason for this behavior, namely:
Why are all .cl files added to my VS project, except nonfree_surf.cl ?
How can I correct this ?
Maybe I can execute cl2cpp.cmake script manually ? If so, how ?
Managed it by manually running the script:
cmake -DOUTPUT_PATH=c:/opencv/kernels2.cpp -DCL_DIR:PATH=c:/opencv/modules/ocl/src/kernels -P "c:\opencv\modules\ocl\cl2cpp.cmake"
For some reason (probably the same one that causes nonfree_surf not to be processed), not every necessary function is processed this way, so I just copy-pasted the nonfree_surf string into kernels.cpp and proceeded with the build.
If anyone needs the binaries, since they are a pain to compile, here they are:
Opencv 2.4.5 binaries compiled with VS2010 x86 (WIN32) including ocl and gpu library.

Non-reproducible exe hang of MATLAB Compiler output executable

I have the following problem:
I have a MATLAB program in the form of a set of *.m files. It is later compiled into an executable and used. The problem is that occasionally the resulting executable just hangs, and this behavior cannot be reproduced when debugging/running the *.m files from the IDE (even using the same input data).
To figure out what happens I intended to:
compile (somehow) *.m files into C/C++
compile C/C++ as debug to get .exe and .pdb
And later, when the .exe hangs, just attach the Visual Studio debugger to the hung .exe to check where it loops/waits.
Unfortunately, MATLAB Compiler (as I was told today) does not produce C/C++ code before creating the executable. I was misled by the -g option of mcc, which according to the documentation is supposed to do the following:
-g Generate Debugging Information
Include debugging symbol information for the C/C++ code generated by MATLAB Compiler.
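For concreteness, the kind of invocation I am talking about (myprogram.m is just a placeholder) would be:

rem build a standalone executable and include debugging information
mcc -m -g myprogram.m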
It looks exactly like the thing I want to achieve.
I would appreciate it if someone could explain that discrepancy to me, or suggest how to achieve what I am trying to do (if it can be done at all).
It is not possible to create debuggable code with MATLAB Compiler, because the deployed code uses the MCR (the MATLAB runtime, essentially a virtual machine).
See this question : Is there any way to debug compiled components using Matlab Debugger?
Since you don't have errors, but rather an infinite loop, the best solution in that case would be to add screen outputs, and hopefully you will trace the bug.
