MSBuild not copying compiled binaries to <app>\bin - visual-studio

I have a build process (let's call it the "engine") that has been using a command line call to Visual Studio's devenv.exe to build a project. I have known for some time that VS is just building with MSBuild, so I finally got around to updating the engine to use MSBuild directly. However, I'm finding a strange anomaly with MSBuild.
For the sake of discussion, there are projects A, B, C, and D. Project A is the main project I'm building, a web app, that depends (through project references) on the other three projects. When built manually in VS, A\bin is populated with assemblies. When built in the engine with devenv.exe, A\bin is again populated with the expected binaries. When built in the engine using MSBuild, A\bin contains nothing. However, B\Release\bin, C\Release\bin and D\Release\bin contain their binaries just as they did with the former two build methods.
This happens with just a single project as well. The problem doesn't appear to be related to dependent projects.
I have attempted to explicitly set the MSBuild OutDir property, but it doesn't appear to have any effect.
I have run builds with diagnostic output on and can't see anything obvious (granted, there is a LOT there so it's possible I have yet to find something significant).
I've also been trying to figure out how to see the command line call to MSBuild that VS is making when run from devenv.exe but I can't seem to find it.
I have looked at several other SO posts (here and here) but they aren't the same problem.
Anyone have an idea of what this could be or where else I could look for an answer or more diagnostic information?
EDIT 1: The arguments pattern used for the call to MSBuild looks like this:
/nologo /target:Compile /property:Configuration=%%BUILDCONFIG%% /maxcpucount
/property:OutDir=%%OUTDIR%%\bin\ /verbosity:diag /detailedsummary "%%PROJPATH%%"
The lower half of that shows my attempt to force the output directory, as well as the increased verbosity to show more details of the process. The build engine code replaces the "%%TOKEN%%" items with the appropriate values for the project being built.
EDIT 2: After more research and looking into the suggestions provided, I've decided to abandon the effort to use msbuild instead of devenv. It seems there is a lot more going on under the hood of devenv in preparation for its own call to msbuild, and I could likely break something else if I don't fully understand that entry point into msbuild. I did try to see if the call to msbuild from devenv is logged, but it doesn't seem to be. I've considered building a dummy msbuild app that just dumps the command line passed to it and temporarily swapping it in for the real msbuild to generate that diagnostic information, but that's more effort than it's worth at this point. The performance gain isn't great enough to be worth pursuing further for now.

I would look at the Output path on the Build tab of your project properties. There are more than a few differences between building with MSBuild and building with Visual Studio (even from the command line). It could be that you have A configured differently than B, C, and D, and syncing A to the rest will make it work. Also, if you plan to build the projects individually rather than as a solution, make sure you don't use solution-level macros that won't be available to the project file on its own.

You are supposed to set OutputPath instead of OutDir.
Since you already used /verbosity:diag, why not redirect the output to a text file and carefully analyze where csc.exe (or whichever compiler is in use) puts the binaries? That's quite simple, and an informative way to learn how MSBuild works under the hood.
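For example, keeping the question's %%TOKEN%% placeholders, the two suggestions combined might look something like this (the exact property values are illustrative):
msbuild "%%PROJPATH%%" /nologo /property:Configuration=%%BUILDCONFIG%% /property:OutputPath=%%OUTDIR%%\bin\ /verbosity:diag > build.log
Then search build.log for the csc.exe invocation (or the CopyFilesToOutputDirectory target) to see exactly where the binaries end up.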


How to incorporate an MSBuild custom task into a Visual Studio solution?

I have a project where there are files in a particular non-standard textual format. When these files are touched/modified, I want to run a certain custom compiler on them to generate XML, which is part of the output of the whole solution.
I'm considering creating an MSBuild task to do this. It will take the non-standard file names as input and output the requisite XML files. The task will then be used in the other projects in the solution.
I want new developers on this project to have minimal setup. That means I want to be able to take a clean copy of my solution directly from source control and have the build first build the custom task, then apply it as necessary to the other projects in the solution.
I'm concerned that the build output of the project that builds the custom task needs to copy its output assembly to some known location so that the other projects can refer to it. What is the proper way of going about doing this?
You're about to walk into a mess here, because Visual Studio is going to lock the custom task assembly when it's first used, thereby causing any further builds in Visual Studio (i.e. Build > Build Solution) to fail.
As @stijn commented, you should override the Build target and use another method of building the assembly with the custom task, e.g. using the Csc task or spawning another MSBuild.exe process (see the answer to the linked question).
The way I decided to go, though, was to create a separate solution, e.g. "Build Tools", containing the custom task assembly (among other tools), and require that it be built before anything else. I personally find the notion of checking in prebuilt binaries of this source very unpalatable. If developers didn't want to build the Build Tools solution, they could copy the output from a nightly build.
Unfortunately there isn't an easy way of getting around "hardcoding" a known (relative) location. Using $(SolutionDir) usually works - just not if you try to run MSBuild on the project directly, instead of the solution (VS is a bit more intelligent when you open a project by itself).
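As a rough sketch of that ordering (the solution names here are hypothetical), the build script just compiles the tools solution before the product solution:
msbuild BuildTools.sln /p:Configuration=Release
msbuild Product.sln /p:Configuration=Release
The first call drops the custom task assembly at the known relative location; the projects in the second solution can then pick it up from there with a UsingTask declaration.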

Is there a way I can set up a batch file or script I can run to compile my .NET project, so I can edit in Notepad++ alone?

I know I will miss so much of Visual Studio, but I am getting really sick of it crashing all the time and being slow, PLUS it is always changing things in my repository that I don't want to change, so I want to just edit with Notepad++. However, now I will have to load up VS just to build things. Is there a way I can build from the command line and make a script for it and whatnot? Will it show the compile errors?
Please don't try to troubleshoot VS for me, I am just asking what is in the question and the rest was just given for context and so nobody was like 'Y U NO RIKE VIZAL STUDIA?'.
build: C:\Windows\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe "PATH TO YOUR SOLUTION FILE"
help: C:\Windows\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe /help
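A minimal wrapper script (the solution path and configuration are placeholders) could look like this:
@echo off
set MSBUILD=C:\Windows\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe
%MSBUILD% "C:\path\to\YourSolution.sln" /p:Configuration=Debug
if errorlevel 1 echo Build failed - see the errors above.
pause
Compile errors and warnings are written to the console, the same ones you would see in the VS Error List.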
What you need is to call csc.exe, the C# compiler, directly (it's what Visual Studio's build system invokes anyway). If you have ever worked with GCC, it is quite similar, except that options are passed Windows-style with / instead of --, and there are no object files and no separate linking step. The MSDN library has documentation: http://msdn.microsoft.com/en-us/library/78f4aasd.aspx.
Generally, you'd need something like:
csc /target:exe /out:Something.exe *.cs
plus any /reference options you would add in Visual Studio.
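For instance (the assembly names are just examples), a project that uses ADO.NET and a third-party library might be compiled with:
csc /target:exe /out:Something.exe /reference:System.Data.dll /reference:libs\ThirdParty.dll *.cs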
If your project is large, it may be cumbersome to maintain a .bat file to do the compilation, so a build tool like NAnt may be needed; it is quite similar to Ant for Java.
This is, of course, if you want to eliminate Visual Studio entirely. Otherwise, Snowbear's solution of invoking MSBuild.exe will work just as well.

How do I debug into an ILMerged assembly?

Summary
I want to alter the build process of a 2-assembly solution, such that a call to ILMerge is invoked, and the build results in a single assembly. Further I would like to be able to debug into the resultant assembly.
Preparation - A simple example
New Solution - ClassLibrary1
Create a static function 'GetMessage' in Class1 which returns the string "Hello world"
Create a new console app which references the ClassLibrary.
Output GetMessage from main() via the console.
You now have a 2 assembly app which outputs "Hello World" to the console.
So what next..?
I would like to alter the Console app build process to include a post-build step which uses ILMerge to merge the ClassLibrary assembly into the Console assembly.
After this step I should be able to:
Run the Console app directly with no ClassLibrary1.dll present
Run the Console app via F5 (or F11) in VS and be able to debug into each of the 2 projects.
Limited Success
I read this blogpost and managed to achieve the merge I was after with a post-build command of...
"$(ProjectDir)ILMerge.bat" "$(TargetDir)" $(ProjectName)
...and an ILMerge.bat file which read...
CD %1
Copy %2.exe temp.exe
ILMerge.exe /out:%2.exe temp.exe ClassLibrary1.dll
Del temp.exe
Del ClassLibrary1.*
This works fairly well, and does in fact produce an exe which runs outside the VS environment as required. However it does not appear to produce symbols (.pdb file) which VS is able to use in order to debug into the code.
I think this is the last piece of the puzzle.
Does anyone know how I can make this work?
FWIW I am running VS2010 on an x64 Win7 x64 machine.
Update: Why do I want to do this?
It's been asked: 'Do I really need to ILMerge during the debug scenario?'
The assemblies of my solution will need to coexist in the same folder as those of other solutions (some of which I will likely develop)
Some of these solutions will share dependencies on different versions of some assemblies.
So Solution1 might be made up of Console1 and ClassLibrary1.dll(v1) and Solution2 might be made up of Console2 and Classlibrary1.dll(v2).
Rather than register everything in the GAC, I thought I could ILMerge the correct version of a dependency into the primary assembly of the solution to avoid a collision.
However this currently renders it impossible to debug the solution, which I need to do in place in conjunction with the other solutions which will be present.
Does this sound complicated? That's because it is.. :D
I'm sorry you're having problems. I didn't follow your exact steps, but I created a console application, A.exe, that called a method in a dll, B.dll. I built both assemblies in Debug mode (so that they had PDB files). I then merged them like this:
ilmerge /out:foo.exe A.exe B.dll
(Actually A and B were in another directory so my command line was a little more complicated, but that shouldn't make a difference.) After ILMerge completed, there were two files in the current directory: foo.exe and foo.pdb. I then typed:
devenv foo.exe
This opened up Visual Studio and then I hit "F10" to start the debugger. I was able to step into the Main method in the executable and then used "F11" to step into the method that had originally been in B.dll. The debugging experience was just the same as it had been in the original Visual Studio solution with the two assemblies.
If you are still having problems, please feel free to put your entire solution into a zip file and send it to me (mbarnett at microsoft dot com) and I can try it out.
I would suggest that you only ILMerge release builds of your assemblies. I can't imagine any benefit you'd get from merging debug assemblies.
I tried to do something like this and found that you should not rename anything, neither before nor after the merge. Moving stuff to a separate directory is fine. If you do not rename anything, it works.
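Applied to the post-build batch file from the question, a variant that merges into a subfolder instead of renaming the target (the merged folder name is arbitrary) would look roughly like:
CD %1
if not exist merged mkdir merged
ILMerge.exe /out:merged\%2.exe %2.exe ClassLibrary1.dll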
If you are still using ILMerge, like I am, there is also another solution to the debugging issues. At least for me, debugging started working after NOT using the portable PDB format. My project is .NET Standard 2.0 and I was running it under .NET Framework.
I don't think ILMerge can do it. OTOH, SmartAssembly from Red Gate (not free) can do it, at least according to its feature list.
And yes, I do agree with Mike to only use ILMerge for release versions.

Visual Studio 2008 Unnecessary Project Building

I have a C# project which includes one exe and 11 library files. The exe references all the libraries, and lib1 may reference lib2, lib3, lib4, etc.
If I make a change to a class in lib1 and build the solution, I assumed that only lib1 and the exe would need to be rebuilt. However, all the DLLs and the exe are being built whenever I want to run the solution.
Is there a way that I can stop the dependencies from being built if they have not been changed?
Is the key this phrase? "However, all dll's and the exe are being built if I want to run the solution"
Visual Studio will always try to build everything when you run a single project, even if that project doesn't depend on everything. This choice can be changed, however. Go to Tools|Options|Projects and Solutions|Build and Run and check the box "Only build startup projects and dependencies on Run". Then when you hit F5, VS will only build your startup project and the DLLs it depends on.
I just "fixed" the same problem with my VS project. Visual Studio did always a rebuild, even if didn't change anything. My Solution: One cs-File had a future timestamp (Year 2015, this was my fault). I opened the file, saved it and my problem was solved!!!
I am not sure if there is a way to avoid the dependencies being built. You can find some info in the question linked below, like setting CopyLocal to false and putting the DLLs in a common directory:
Optimizing Visual Studio solution build - where to put DLL files?
We had a similar problem at work. In post-build events we were manually embedding manifests into the outputs in the bin directory. Visual Studio was copying project references from the obj dir (which weren't modified). The timestamp difference triggered unnecessary rebuilds.
If your post-build events modify project outputs then either modify the outputs in the bin and obj dir OR copy the modified outputs in the bin dir on top of those in the obj dir.
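For example, a post-build event along these lines (using the standard VS build macros; the exact obj subfolder layout can differ by project type) keeps the two copies identical:
copy /y "$(TargetPath)" "$(ProjectDir)obj\$(ConfigurationName)\"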
You can uncheck the Build option for specific projects in your solution configuration (via Build > Configuration Manager).
You can also create your own solution configurations to build specific project configurations...
We actually had this problem on my current project, in our scenario even running unit tests (without any code changes) was causing a recompile. Check your build configuration's "Platform".
If you are using "Any CPU" then for some reason it rebuilds all projects regardless of changes. Try using processor specific builds, i.e. x86 or x64 (use the platform which is specific to the machine architecture of your machine). Worked for us for x86 builds.
Now, after I say this, some propeller-head is going to come along and contradict me, but there is no way to do what you want to do from Visual Studio. There is a way of doing it outside of VS, but first, I have a question:
Why on earth would you want to do this? Maybe you're trying to save CPU cycles, or save compile time, but if you do what you're suggesting you will suddenly find yourself in a marvelous position to shoot yourself in the foot. If you have a library 1 that depends upon library 2, and only library 2 changes, you may think you're OK to only build the changed library, but one of these days you are going to make a change to library 2 that will break library 1, and without a build of library 1 you will not catch it in the compilation. So in my humble opinion, DON'T DO IT.
The reason this won't work in VS2005 and 2008 is that VS uses MSBuild. MSBuild runs against project files, and it will examine the project's references and build all referenced projects first, if their source has changed, before building the target project. You can test this yourself by running MSBuild from the command line against one project that has not changed but which references a project that has changed. Example:
msbuild ClassLibrary4.csproj
where ClassLibrary4 has not changed, but it references ClassLibrary5, which has changed. MSBuild will build lib 5 first, before it builds 4, even though you didn't mention 5.
The only way to get around all these failsafes is to use the compiler directly instead of going through MSBuild. Ugly, ugly, but that's it. You will basically be reduced to re-implementing MSBuild in some form in order to do what you want to do.
It isn't worth it.
Check out the following site for more detailed information on when a project is built as well as the differences between build and rebuild.
I had this problem too, and noticed these warning messages when building on Windows 7 x64, VS2008 SP1:
cl : Command line warning D9038 : /ZI is not supported on this platform; enabling /Zi instead
cl : Command line warning D9007 : '/Gm' requires '/Zi'; option ignored
I changed my project properties to:
C/C++ -> General -> Debug Information Format = /Zi
C/C++ -> Code Generation -> Enable Minimal Build = No
After rebuilding I switched them both back and dependencies work fine again. But prior to that no amount of cleaning, rebuilding, or completely deleting the output directory would fix it.
I don't think there's a way for you to do it out of the box in VS. You need this add-in:
http://workspacewhiz.com/
It's not free but you can evaluate it before you buy.
Yes, exclude the non-changing bits from the solution. I say this with a caveat, as you can compile in a way where a change in build number for the changed lib can cause the non-built pieces to break. This should not be the case as long as you do not break the interface, but it is quite common, because most devs do not understand interface in the .NET world. It comes from not having to write IDL. :-)
As for X projects in a solution, NO, you can't stop them from building, as the system sees that a dependency has changed.
BTW, you should look at your project and figure out why your UI project (assume it is UI) references the same library as everything else. A good Dependency Model will show the class(es) that should be broken out as data objects or domain objects (I have made an assumption that the common dependency is some sort of data object or domain object, of course, but that is quite common). If the common dependency is not a domain/data object, then I would rethink my architecture in most cases. In general, you should be able to create a path from UI to data without common dependencies other than non-behavioral objects.
Not sure of an awesome way to handle this, but in the past if I had a project or two that kept getting rebuilt, and assuming I wouldn't be working in them, I would turn the build process off for them.
Right click on the sln, select configuration manager and uncheck the check boxes. Not perfect, but works when Visual Studio isn't behaving.
If you continue to experience this problem, it may be due to a missing or out of date calculated dependency (like a header) that is listed in your project, but does not exist.
This happens especially commonly after migrating to a new version (for example, from 2012 to 2013), because VS may have recalculated dependencies in the conversion, or you are migrating to a new location.
A quick check is to double-click every file in the offending project from Solution Explorer. If you discover that a file does not exist, that is your problem.
Failing a simple missing file: You may have a more complicated build date relationship between source and target. You can use a utility to find out what front-end test is triggering the build. To get that information you can enable verbose CPS logging. See: Andrew Arnott - Enable C++ and Javascript project system tracing (http://blogs.msdn.com/b/vsproject/archive/2009/07/21/enable-c-project-system-logging.aspx). I use the DebugView option. Invaluable tool when you need it.
(this is a C# specific question, but a different post was merged as identical)

Good techniques for using Makefiles in Visual Studio?

I know the ideal way to build projects is without requiring IDE-based project files, since it theoretically causes all sorts of trouble with automation and whatnot. But I've yet to work on a project that compiles on Windows that doesn't depend on the Visual Studio project (OK, obviously some open source stuff gets done with Cygwin, but I'm being general here).
On the other hand, if we just use VS to run a makefile, we lose all the benefits of the compile options window, and it becomes a pain to maintain the external makefile.
So how do people that use VS actually handle external makefiles? I have yet to find a painless system to do this...
Or, in reality, do most people not do this, even though it's preached as good practice?
Take a look at MSBuild!
MSBuild can work with the sln/csproj files from VS, so for simple projects you can just call them directly.
If you need more control, wrap the projects in your own build process, add your own tasks, etc. - it is very extensible!
(I wanted to add a sample, but this editor totally messed up the XML... sorry)
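For the simple case it really is just a plain command-line call (the solution name and configuration are placeholders):
msbuild MySolution.sln /t:Rebuild /p:Configuration=Release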
Ideally perhaps, in practice no.
Makefiles would be my preference as the build master; however, the developers spend all their time inside the Visual Studio IDE, and when they make a change, it's to the vcproj file, not the makefile. So if I'm doing the global builds with makefiles, they're too easily put out of sync with the project/solution files in play by 8 or 10 others.
The only way I can stay in step with the whole team is to run devenv.exe on the solution files directly in my build process scripts.
There are very few makefiles in my builds, where there are they are in the pre-build or custom build sections or a separate utility project.
One possibility is to use CMake - you describe with a script how your project is to be built, and CMake generates the Visual Studio solution/project files for you.
And if you need to build your project from the command line, or in a continuous integration tool, you use CMake to generate a Makefile for NMake.
And if your project is a cross-platform one, you can run CMake to generate the makefiles for the toolchain of your choice.
A simple CMake script looks like this:
project(hello)
add_executable(hello hello.cpp)
Compare these two lines with a makefile, or with the steps needed to set up a simple project in your favorite IDE.
In a nutshell, CMake not only makes your project cross-platform, it also makes it cross-IDE. If you'd like to just test your project with Eclipse, KDevelop, or Code::Blocks, just run CMake to generate the corresponding project files.
Well, in practice it is not always so easy, but the CMake idea just rocks.
For example, if you consider using CMake with Visual Studio, some tweaking is required to obtain the familiar VS project feeling; the main obstacle is organizing your header and source files, but it is possible - check the CMake wiki (and by writing a short script you might even simplify this task).
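As an illustration (the generator names depend on your CMake version and installed toolchains), the same source tree can produce either a VS solution or an NMake makefile:
cmake -G "Visual Studio 9 2008" path\to\source
cmake -G "NMake Makefiles" path\to\source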
We use a NAnt script, which at the compile step calls MSBuild. Using NAnt allows us to perform both pre- and post-build tasks, such as setting version numbers to match source control revision numbers, collating code coverage information, assembling and zipping deployment sources. But still, at the heart of it, it's MSBuild that's actually doing the compiling.
You can integrate a NAnt build as a custom tool into the IDE, so that it can be used both on a build or continuous integration server and by the developers in the same way.
Personally, I use Rake to call msbuild on my solution or project. For regular development I use the IDE and all the benefits that provides.
Rake is set up so that I can just compile, compile and run tests or compile run tests and create deployable artifacts.
Once you have a build script it is really easy to start doing things like setting up continuous integration and using it to automate deployment.
You can also use most build tools from within the IDE if you follow these steps to set it up.
We use devenv.exe (the same exe that launches the IDE) to build our projects from the build scripts (or the command line). When you specify the /Build option, the IDE is not displayed and everything is written back to the console (or to the log file if you specify the /Out option).
See http://msdn.microsoft.com/en-us/library/xee0c8y7(VS.80).aspx for more information
Example:
devenv.exe [solution-file-name] /Build [project-name] /Rebuild "Release|Win32" /Out solution.log
where "Release|Win32" is the configuration as defined in the solution and solution.log is the file that gets the compiler output (which is quite handy when you need to figure out what went wrong in the compile)
We have a program that parses the vcproj files and generates makefile fragments from that. (These include the list of files and the #defines, and there is some limited support for custom build steps.) These fragments are then included by a master makefile which does the usual GNU make stuff.
(This is all for one of the systems we target; its tools have no native support for Visual Studio.)
This didn't require a huge amount of work. A day to set it up, then maybe a day or two in total to beat out some problems that weren't obvious immediately. And it works fairly well: the compiler settings are controlled by the master makefile (no more fiddling with those tiny text boxes), and yet anybody can add new files and defines to the build in the usual way.
That said, the combinatorial problems inherent to Visual Studio's treatment of build configurations remain.
Why would you want a project that "compiles on Windows but doesn't depend on the Visual Studio project"? You already have a solution file - you can just use it with a console build.
I'd advise you to use msbuild in conjunction with a makefile, NAnt, or even a simple batch file, if your build system is not as convoluted as ours...
Is there something I'm missing?
How about this code?
public TRunner CleanOutput()
{
    ScriptExecutionEnvironment.LogTaskStarted("Cleaning solution outputs");

    // Delete each project's output folder and its obj\<Configuration> folder.
    solution.ForEachProject(
        delegate (VSProjectInfo projectInfo)
        {
            string projectOutputPath = GetProjectOutputPath(projectInfo.ProjectName);
            if (projectOutputPath == null)
                return;

            projectOutputPath = Path.Combine(projectInfo.ProjectDirectoryPath, projectOutputPath);
            DeleteDirectory(projectOutputPath, false);

            string projectObjPath = String.Format(
                CultureInfo.InvariantCulture,
                @"{0}\obj\{1}",
                projectInfo.ProjectName,
                buildConfiguration);
            projectObjPath = Path.Combine(productRootDir, projectObjPath);
            DeleteDirectory(projectObjPath, false);
        });

    ScriptExecutionEnvironment.LogTaskFinished();
    return ReturnThisTRunner();
}

public TRunner CompileSolution()
{
    ScriptExecutionEnvironment.LogTaskStarted("Compiling the solution");

    // Run msbuild.exe directly on the solution, passing configuration and platform.
    ProgramRunner
        .AddArgument(MakePathFromRootDir(productId) + ".sln")
        .AddArgument("/p:Configuration={0}", buildConfiguration)
        .AddArgument("/p:Platform=Any CPU")
        .AddArgument("/consoleloggerparameters:NoSummary")
        .Run(@"C:\Windows\Microsoft.NET\Framework\v3.5\msbuild.exe");

    ScriptExecutionEnvironment.LogTaskFinished();
    return ReturnThisTRunner();
}
You can find the rest of it here: http://code.google.com/p/projectpilot/source/browse/trunk/Flubu/Builds/BuildRunner.cs
I haven't tried it myself yet, but Microsoft has a Make implementation called NMake which seems to have a Visual Studio integration:
NMake
Creating NMake Projects
Visual Studio, since VS2005, uses "msbuild" to define and run builds. When you fiddle with project settings in the Visual Studio designer - let's say you turn XML doc generation on or off, or you add a new dependency, or you add a new project or assembly reference - Visual Studio will update the .csproj (or .vbproj, etc.) file, which is an msbuild file.
Like Java's Ant, or NAnt before it, msbuild uses an XML schema to describe the project and build. It is run by VS when you do an "F6" build, and you can also run it from the command line, without ever opening VS or running devenv.exe.
So, use the VS tool for development and command-line msbuild for automated builds - same build, and same project structure.
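For instance (the project and property values are illustrative), the same XML-doc switch you would toggle in the designer is just an MSBuild property you can set from the command line:
msbuild MyProject.csproj /p:Configuration=Release /p:DocumentationFile=bin\Release\MyProject.xml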
