CMake - Visual Studio project dependencies

In a few words
My question is: is there a way to generate a Visual Studio solution that contains the project for which I'm writing the CMake files, plus another project for which the CMake files are already available?
I was thinking about using add_subdirectory, but I'm not sure this is the best solution, and the meaning of the binary_dir parameter is not really clear to me.
In more detail
I have a Visual Studio project (B) which depends on another one (A).
I've written the CMakeLists.txt for A and I'm able to generate the VS project and compile it.
I've written a CMakeLists.txt for B. I'm able to configure the VS project for B so that it can access the headers and the libraries of A. Everything compiles.
A
- CMakeLists.txt
- src
- *.cpp
B
- CMakeLists.txt
- src
- *.cpp
Build
- Build_A
- Build_B
However, when I change something in A from within the B project (which is possible because Visual Studio lets me access A's headers from project B) and I try to compile, A is not recompiled.
Building project B compiles only B.
I was able to improve the situation by using add_subdirectory. In B's CMakeLists.txt I included the following line:
add_subdirectory(PATH_TO_A A)
However, with this solution the compilation files of A are duplicated: they are generated in the out-of-source build directory Build_A when I compile from project A, and in Build_B\A when I compile from project B.
What's the best practice for including project A in the solution generated from B's CMakeLists.txt while avoiding duplicated compilation files?
P.S.
If possible I would like to avoid recompiling everything twice. That is, if I have already compiled A and then need to compile B for the first time, I would like to avoid recompiling A.

In principle using add_subdirectory is a good way to solve this.
A CMake project that can be compiled on its own can also be compiled as a subproject, if it follows a few rules. First of all, any change to the global state will of course also affect all other projects being built from the same root. Aggressive changes to the global build environment are likely to disturb other projects and should be avoided.
Additionally, a project that is pulled in like this must not assume that its CMakeLists.txt is the root of the build. For instance, the CMAKE_BINARY_DIR variable always points to the top-level binary directory, while PROJECT_BINARY_DIR points to the binary directory of the closest enclosing project() command. A library that assumes it is always built at the top level may use the two interchangeably; a library that can be built both on its own and as a sub-project must be aware of the difference.
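A minimal sketch of how the two variables diverge when A is pulled in from B (the directory names are illustrative, not taken from the question):

```cmake
# Hypothetical top-level CMakeLists.txt for B
cmake_minimum_required(VERSION 3.15)
project(B)

# Pull A in as a sub-project; binary_dir is the second argument,
# so A's build files land under <B's build dir>/A
add_subdirectory(path/to/A A)

# Inside A's own CMakeLists.txt (after its project(A) call):
#   CMAKE_BINARY_DIR   -> <B's build dir>      (the top level)
#   PROJECT_BINARY_DIR -> <B's build dir>/A    (A's binary dir)
# A portable sub-project therefore writes its outputs relative to
# PROJECT_BINARY_DIR, never to CMAKE_BINARY_DIR.
```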
Note that add_subdirectory introduces a tight coupling between the projects and only works well if both involved projects behave accordingly.
A more loosely coupled alternative would be to include the dependency as an ExternalProject, but from what you wrote in the question (the project should automatically recompile when the source of the dependency changes), add_subdirectory seems the better match for you.
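For completeness, the ExternalProject alternative could look roughly like this in B's CMakeLists.txt (a sketch only; the paths and target name are assumptions):

```cmake
include(ExternalProject)

# Build A in its own, isolated build tree at build time
ExternalProject_Add(A_external
    SOURCE_DIR      ${CMAKE_CURRENT_SOURCE_DIR}/../A
    BINARY_DIR      ${CMAKE_BINARY_DIR}/A_build
    INSTALL_COMMAND ""   # skip the install step
)
# Note: an ExternalProject is built at build time, not configure time,
# so B sees no CMake targets from A and must link against A's output
# libraries by explicit path.
```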

Related

Avoiding build twice when using a shared project together with build generated code

I have a visual studio solution with multiple projects. One generates code files as part of pre-build (grpc classes via Grpc.Tools). There is also a shared project that extends the partial classes built as part of that pre-build.
However, sometimes, for one reason or another - like compiling the client half of this (the client uses the shared project to extend its own classes) - compilation will error out because the shared project can't find the generated classes yet. Presumably they don't exist yet. It's easily fixed by compiling the project twice.
Is there something I can do in this scenario? Is it possible to somehow move validating/compiling the shared project "further down" the compilation pipeline? Or even just set that particular project to try and compile twice if there's an error? Or is this the kind of thing that realistically I should just live with given what I'm doing - I haven't found any other references to this problem. It's not that big of an issue and it wouldn't happen very often, but I'd like to handle it reasonably if I can.
Edit
If I wasn't clear, this is a shared project, as in a .shproj, a project that is not compiled separately. The project that references it includes it and builds it all together as one.
If project B depends on project A, then project A must be built before project B. Visual Studio is smart enough to figure out the build order this way. Incidentally, this is also one of the reasons (among many) why circular dependencies simply cannot work.
I suspect that your projects are currently not linked via a dependency, as this issue wouldn't occur if there were such a link. Perhaps your second project is accessing the first project's files via the file system? That's just a guess though.
You can use this "A before B which depends on A" behavior of the build process to your advantage. Have project B (i.e. the project you need to go second) add project A (i.e. the project you need to go first) as a dependency. This forces VS to build them in the appropriate order.
Some remarks:
I am unsure whether VS is able to omit dependencies that you add but do not actually depend on (i.e. you never reference their content). I can't find any confirmation on this point (but absence of proof is not proof of absence!). Even if that is the case, it could easily be worked around by having a dummy class in B that actually references and uses something from project A.
Keep in mind that during a regular build, VS does not rebuild projects that have not changed since the last build. If this is an issue for you (unsure if it is, you didn't add enough context), make sure to always rebuild or clean to make sure that a new build will be triggered.
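For a regular (non-.shproj) project, the dependency link described above shows up in the project file as a ProjectReference; a sketch, with illustrative names:

```xml
<!-- In B.csproj: tells MSBuild/VS to build A before B -->
<ItemGroup>
  <ProjectReference Include="..\ProjectA\ProjectA.csproj" />
</ItemGroup>
```

Alternatively, a build-order-only dependency (with no assembly reference) can be configured per solution via Project > Build Dependencies > Project Dependencies, which is stored in the .sln file rather than the project file.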
However, sometimes for one reason or another - like compiling the client half of this (client uses the shared project to extend its own classes), compilation will error because the shared project can't find the generated classes yet. Presumably they don't exist. It's fixed easily by compiling the project twice.
That it happens only sometimes and can be fixed by "trying again" points at one thing: you've got a race condition. A race condition during compilation, however, is not something I've heard of or encountered before.
I have a few possible culprits. But in the end, race conditions are notoriously hard to debug:
- Maybe the compiler that deals with the shared project returns before it is finished - which should be impossible - or
- Maybe something causes the main projects to compile before the shared project's files are ready.
- Maybe a third-party tool - like a virus scanner or automatic backup tool - interferes?
- Maybe the shared project's compiled files are hosted on a network drive, and there is sometimes just the slightest delay between "compiled" and "visible to all other computers on the network"?
Usually the proper mechanisms for dependent compilation should deal with such issues. That indicates that what you have there is probably not the most stable setup.

How to Add a Static Library to a VS 2015 Fortran Project?

How do I add a Static Library to a VS 2015 Fortran Project?
I've searched for the answer to this question online, but the solutions I've found (linked below) don't seem to work for me.
How to link a .LIB in a MS Visual Studio / Intel Fortran project?
https://software.intel.com/en-us/forums/intel-visual-fortran-compiler-for-windows/topic/393843
I'm using VS 2015 and Intel Fortran 2017.
I have created a static library from my Utilities project and I would like to be able to use the 'Utilities.lib' file in a different project (PhysicsCore) without having all of the source included.
I've tried dragging and dropping the 'Utilities.lib' file into the PhysicsCore Project. I've tried adding existing file and adding 'Utilities.lib'. I've tried adding the lib file and all of the '.mod' and '.obj' files. I've tried going under properties -> librarian -> additional dependencies. All of these end with the PhysicsCore project failing to compile due to missing procedures and modules.
I have gotten it to work one way that isn't very helpful. I have added a new project to the solution and then added in all of the '.obj' and '.mod' files and the '.lib' file. Changed the solution settings to not rebuild that project. And then finally added that non-building project as a dependency of the PhysicsCore project.
I feel like I must just be missing something small.
EDIT: years later. I finally came across the issue. If the library had consisted only of .f90 files, everything would have worked fine, but I'm using modules, which require the .mod files. Everything was doing what it was supposed to as far as I can tell; it just didn't behave the way I expected it to.
There are several ways:
Drag the .lib into the project as a source file. You say this didn't work, but it always has when I have done it.
In the Linker project properties, add the full path to the .lib to Linker > Input > Additional Dependencies, or just add the .lib name there and add the directory path to Linker > General > Additional Library Directories.
If the parent project is also Fortran, right click on the parent project, select Build Dependencies > Project Dependencies. Check the box for the library project. (This does not work if the parent project is not Fortran.)
I would generally recommend #3, as this will also make the .mod files from the library project visible to the parent project. If you choose one of the other methods, you then also have to make any include or .mod files visible by adding the directory path to the project property Fortran > General > Additional Include Directories.
If you need more help with this, I suggest asking in https://software.intel.com/en-us/forums/intel-visual-fortran-compiler-for-windows

Linked project references aren't being copied to target folder

I have 2 c++ projects in a solution.
ExecB (an executable) depends on ProjA (a dll).
So in ExecB's properties I add ProjA as a reference, and select Copy Local = true.
The problem is, ProjA's DLL isn't being copied to ExecB's output folder, so the executable obviously doesn't run.
Any suggestions ?
For C++ projects, the Visual Studio template/wizard sets the Output Directory to a subfolder of the solution: $(SolutionDir)$(Configuration)\. This is so the DLL Search Path works well for the developer. It even works if you have added projects to the solution from outside the solution folder hierarchy; The build will put all binaries into the output folder for that solution.
If this isn't working, check the Output Directory property on all platform/configuration combinations of all your projects. Also make sure that the Build Configuration Manager shows that your selected solution build is building all the projects appropriate for the solution platform/configuration.
The Copy Local in Project References that you are trying applies only to referenced .NET assemblies. The docs are ambiguous and too terse on that. (Most often undistinguished use of "assembly" means .NET assembly rather than WinSxS assembly.)

How do I use classes from another project in the same solution in Visual Studio 2010?

I'm working on a object-oriented Windows API wrapper library, written in C++, and I have two projects inside the solution:
The actual library project;
A "test" project, where I write code that uses the library for testing purposes.
My goal is to be able to include and use the library header files on the test project, as if it was an actual project that uses the library.
I solved the file inclusion problem by adding "$(SolutionDir)" to the test project's additional include directories (is there a cleaner way?), but I'm struggling to get the test project to link. I get unresolved external symbol errors, which I assume is because the linker can't find the DLL.
I'm completely lost here. I have set up project-to-project references, so that the test project is dependent on the library project, but that did not solve the linking problem. I couldn't find any option in either project's properties that seemed to be relevant to my problem.
Is there a way I can simply hit "Build Solution" and then run the executable?
In your project's properties > Linker > Input, there are a number of settings you can specify for the linker, such as additional dependencies to link with (put the .lib generated by your other project there), and, under Linker > General, the paths to search for said libraries.
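In the .vcxproj those settings correspond roughly to this fragment (a sketch; the library name and directory are assumptions):

```xml
<ItemDefinitionGroup>
  <Link>
    <!-- Linker > General > Additional Library Directories -->
    <AdditionalLibraryDirectories>$(SolutionDir)$(Configuration)\;%(AdditionalLibraryDirectories)</AdditionalLibraryDirectories>
    <!-- Linker > Input > Additional Dependencies -->
    <AdditionalDependencies>MyLibrary.lib;%(AdditionalDependencies)</AdditionalDependencies>
  </Link>
</ItemDefinitionGroup>
```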

Visual Studio 2008 Unnecessary Project Building

I have a C# project which includes one exe and 11 library files. The exe references all the libraries, and lib1 may reference lib2, lib3, lib4, etc.
If I make a change to a class in lib1 and build the solution, I would assume that only lib1 and the exe need to be rebuilt. However, all the DLLs and the exe are built whenever I want to run the solution.
Is there a way that I can stop the dependencies from being built if they have not been changed?
Is the key this phrase? "However, all dll's and the exe are being built if I want to run the solution"
Visual Studio will always try to build everything when you run a single project, even if that project doesn't depend on everything. This choice can be changed, however. Go to Tools|Options|Projects and Solutions|Build and Run and check the box "Only build startup projects and dependencies on Run". Then when you hit F5, VS will only build your startup project and the DLLs it depends on.
I just "fixed" the same problem with my VS project. Visual Studio always did a rebuild, even if I didn't change anything. My solution: one .cs file had a timestamp in the future (year 2015; this was my fault). I opened the file, saved it, and my problem was solved!
I am not sure if there is a way to avoid dependencies from being built. You can find some info here like setting copylocal to false and putting the dlls in a common directory.
Optimizing Visual Studio solution build - where to put DLL files?
We had a similar problem at work. In post-build events we were manually embedding manifests into the outputs in the bin directory. Visual Studio was copying project references from the obj dir (which weren't modified). The timestamp difference triggered unnecessary rebuilds.
If your post-build events modify project outputs then either modify the outputs in the bin and obj dir OR copy the modified outputs in the bin dir on top of those in the obj dir.
You can uncheck the build option for specific projects in your solution configuration (screenshot omitted; source: microsoft.com).
You can also create your own solution configurations to build specific project configurations (screenshot omitted; source: microsoft.com).
We actually had this problem on my current project, in our scenario even running unit tests (without any code changes) was causing a recompile. Check your build configuration's "Platform".
If you are using "Any CPU" then for some reason it rebuilds all projects regardless of changes. Try using processor specific builds, i.e. x86 or x64 (use the platform which is specific to the machine architecture of your machine). Worked for us for x86 builds.
(screenshot omitted; source: episerver.com)
Now, after I say this, some propeller-head is going to come along and contradict me, but there is no way to do what you want to do from Visual Studio. There is a way of doing it outside of VS, but first, I have a question:
Why on earth would you want to do this? Maybe you're trying to save CPU cycles or compile time, but if you do what you're suggesting you will suddenly find yourself in a marvelous position to shoot yourself in the foot. If you have a library 1 that depends upon library 2, and only library 2 changes, you may think you're OK to build only the changed library, but one of these days you are going to make a change to library 2 that breaks library 1, and without a rebuild of library 1 you will not catch it at compile time. So in my humble opinion, DON'T DO IT.
The reason this won't work in VS2005 and 2008 is because VS uses MSBuild. MSBuild runs against project files, and it will examine the project's references and build all referenced projects first, if their source has changed, before building the target project. You can test this yourself by running MSBuild from the command line against one project that has not changed but with a referenced project that has changed. Example:
msbuild ClassLibrary4.csproj
where ClassLibrary4 has not changed, but it references ClassLibrary5, which has changed. MSBuild will build lib 5 first, before it builds 4, even though you didn't mention 5.
The only way to get around all these failsafes is to use the compiler directly instead of going through MSBuild. Ugly, ugly, but that's it. You will basically be reduced to re-implementing MSBuild in some form in order to do what you want to do.
It isn't worth it.
Check out the following site for more detailed information on when a project is built as well as the differences between build and rebuild.
I had this problem too, and noticed these warning messages when building on Windows 7 x64, VS2008 SP1:
cl : Command line warning D9038 : /ZI is not supported on this platform; enabling /Zi instead
cl : Command line warning D9007 : '/Gm' requires '/Zi'; option ignored
I changed my project properties to:
C/C++ -> General -> Debug Information Format = /Zi
C/C++ -> Code Generation -> Enable Minimal Build = No
After rebuilding I switched them both back and dependencies work fine again. But prior to that no amount of cleaning, rebuilding, or completely deleting the output directory would fix it.
I don't think there's a way for you to do it out of the box in VS. You need this add-in:
http://workspacewhiz.com/
It's not free but you can evaluate it before you buy.
Yes, exclude the non-changing bits from the solution. I say this with a caveat, as you can compile in a way where a change in the build number of the changed lib can cause the non-built pieces to break. This should not be the case as long as you do not break the interface, but it is quite common because most devs do not understand interfaces in the .NET world. It comes from not having to write IDL. :-)
As for X projects in a solution: no, you can't stop them from building, as the system sees that a dependency has changed.
BTW, you should look at your project and figure out why your UI project (assume it is UI) references the same library as everything else. A good Dependency Model will show the class(es) that should be broken out as data objects or domain objects (I have made an assumption that the common dependency is some sort of data object or domain object, of course, but that is quite common). If the common dependency is not a domain/data object, then I would rethink my architecture in most cases. In general, you should be able to create a path from UI to data without common dependencies other than non-behavioral objects.
Not sure of an awesome way to handle this, but in the past if I had a project or two that kept getting rebuilt, and assuming I wouldn't be working in them, I would turn the build process off for them.
Right click on the sln, select configuration manager and uncheck the check boxes. Not perfect, but works when Visual Studio isn't behaving.
If you continue to experience this problem, it may be due to a missing or out of date calculated dependency (like a header) that is listed in your project, but does not exist.
This happens to me especially common after migrating to a new version (for example: from 2012 to 2013) because VS may have recalculated dependencies in the conversion, or you are migrating to a new location.
A quick check is to double-click every file in offending project from solution explorer. If you discover a file does not exist, that is your problem.
Failing a simple missing file: You may have a more complicated build date relationship between source and target. You can use a utility to find out what front-end test is triggering the build. To get that information you can enable verbose CPS logging. See: Andrew Arnott - Enable C++ and Javascript project system tracing (http://blogs.msdn.com/b/vsproject/archive/2009/07/21/enable-c-project-system-logging.aspx). I use the DebugView option. Invaluable tool when you need it.
(this is a C# specific question, but a different post was merged as identical)
