I have a project where there are files in a particular non-standard textual format. When these files are touched/modified, I want to run a certain custom compiler on them to generate XML, which is part of the output of the whole solution.
I'm considering creating an MSBuild task to do this. It will take the non-standard file names as input and output the requisite XML files. The task will then be used in the other projects in the solution.
I want new developers on this project to have minimal setup. That means I want to be able to take a clean copy of my solution directly from source control and have the build first build the custom task, then apply it as necessary to the other projects in the solution.
My concern is that the project that builds the custom task needs to copy its output assembly to some known location so that the other projects can refer to it. What is the proper way of going about this?
You're about to walk into a mess here, because Visual Studio will lock the custom task assembly the first time it's used, causing any further builds in Visual Studio (i.e. Build > Build Solution) to fail.
As @stijn commented, you should override the Build target and use another method of building the assembly containing the custom task, e.g. using the Csc task or spawning another MSBuild.exe process (see the answer to the linked question).
The way I decided to go, though, was to create a separate solution, e.g. "Build Tools", containing the custom task assembly (among other tools), and to require that it be built before anything else. I personally find the notion of checking in prebuilt binaries of this source very unpalatable. If developers didn't want to build the Build Tools solution, they could copy the output from some nightly build.
Unfortunately there isn't an easy way of getting around "hardcoding" a known (relative) location. Using $(SolutionDir) usually works - just not if you try to run MSBuild on the project directly, instead of the solution (VS is a bit more intelligent when you open a project by itself).
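To make that concrete, the consuming projects typically end up with something along these lines - the task name, its parameters, the CustomFormatFile item type, and the Tools output folder below are all invented for illustration, not the actual assembly in question:

<!-- hypothetical task and item names; only $(SolutionDir)-relative AssemblyFile is the point -->
<UsingTask TaskName="GenerateXmlFromCustomFormat"
           AssemblyFile="$(SolutionDir)Tools\CustomCompiler.Tasks.dll" />

<Target Name="BeforeBuild">
  <GenerateXmlFromCustomFormat Inputs="@(CustomFormatFile)"
                               OutputDirectory="$(OutputPath)" />
</Target>

The BeforeBuild override is the standard hook in a .csproj; the important part is that the AssemblyFile path is expressed relative to $(SolutionDir), which is exactly the "known location" the answer warns about.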
We have a few hundred Visual Studio project files that I need to assemble into a solution for building. We currently have a custom Ruby script that uses Rake to do this, but it is fragile: it only lets a few Visual Studio macros ($(TargetDir), $(TargetName), etc.) through and fails on the rest. Plus, Ruby's grammar rubs me like Perl does: the wrong way.
So my question is: given a directory, is there a tool that will recursively find all the .vcxproj and .csproj files and generate a solution file with dependencies? By "with dependencies" I mean that some projects need to be built before others. I found some other posts here on Stack Overflow that pointed to a tool that generates solution files, but it doesn't generate dependencies, and without dependencies any solution creation tool is completely useless. Does anyone know of something that will do this?
If not a solution file, does anyone know of something that will just emit a dependency list?
P.S.
And before anyone asks: creating a solution file manually is completely out of the question. We simply have way too many project files.
So my question is: given a directory, is there a tool that will recursively find all the .vcxproj and .csproj files and generate a solution file with dependencies?
No.
What you're asking for is very reasonable; your approach to the problem is quite rational. Unfortunately, the tools haven't kept up with you. (We had the same problem.)
You're going to have to script that yourself, or otherwise customize tools. That's what we did. Successful approaches I've seen include:
1. Generate the *.vcproj/*.sln from "reference project definitions", using tools like CMake, QMake, SCons, or Gyp. Our main system currently sits on SCons, with our custom Python code to navigate these dependencies and generate solutions based on projects (spidering dependencies). By default, we generate a "complete" solution for each project (including all required supporting projects), plus a "Master All Projects" solution. It works very well, but it was custom work that took effort, and we extended SCons somewhat to describe our projects (though we simply rely on the SCons generation of *.sln and *.vcproj).
Write a custom tool to "find" these dependencies by
parsing all the *.vcproj files in
your workspace. This is work, but can be done. Those files can be "tricky" to navigate, but you might be fine with a "good enough" solution that uses the GUIDs as hash keys to generate those dependencies.
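As a sketch of that second approach (assuming MSBuild-format project files, i.e. *.csproj/*.vcxproj, and purely illustrative class and variable names), the GUID-to-GUID map can be pulled out with LINQ to XML:

// Illustrative only: map each project's GUID to the GUIDs of the projects it references.
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Xml.Linq;

class DependencyScanner
{
    static void Main(string[] args)
    {
        XNamespace ns = "http://schemas.microsoft.com/developer/msbuild/2003";
        var dependencies = new Dictionary<string, List<string>>();

        var projectFiles = Directory.GetFiles(args[0], "*.csproj", SearchOption.AllDirectories)
            .Concat(Directory.GetFiles(args[0], "*.vcxproj", SearchOption.AllDirectories));

        foreach (string path in projectFiles)
        {
            XDocument doc = XDocument.Load(path);
            string guid = (string)doc.Descendants(ns + "ProjectGuid").FirstOrDefault();
            if (guid == null)
                continue;

            // Each ProjectReference item carries the GUID of the project it depends on.
            var referenced = doc.Descendants(ns + "ProjectReference")
                                .Select(r => (string)r.Element(ns + "Project"))
                                .Where(g => g != null)
                                .ToList();
            dependencies[guid] = referenced;
        }

        // A topological sort over this map yields a build order / the .sln dependency section.
        foreach (var entry in dependencies)
            Console.WriteLine("{0} -> {1}", entry.Key, string.Join(", ", entry.Value.ToArray()));
    }
}

Note that old-style *.vcproj files (VS2008 C++) are not MSBuild format, so they would need a separate parsing path, but the GUID idea is the same.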
I totally agree with you: this type of stuff (project dependencies) is prohibitively difficult to maintain manually once you move beyond "simple" (e.g., many dozens of projects; yes, we also have hundreds).
Sorry. MSVS is a pretty good IDE (intended for iterative development) but a terrible build configuration management system, and it was not designed to do what we're talking about.
Because I care about your sanity and Your Everlasting Soul, please Please PLEASE do not attempt to write your custom solution in MSBuild.
On a side note, having hundreds of VS projects is a bad idea; it will kill VS performance. See these two white papers:
Partitioning code base through .NET assemblies and Visual Studio projects (8 pages)
Defining .NET Components with Namespaces (7 pages)
I know the ideal way to build projects is without requiring IDE-based project files, since relying on them theoretically causes all sorts of trouble with automation and whatnot. But I've yet to work on a project that compiles on Windows that doesn't depend on the Visual Studio project files (OK, obviously some open source stuff gets done with Cygwin, but I'm being general here).
On the other hand, if we just use VS to run a makefile, we lose all the benefits of the compile options window, and it becomes a pain to maintain the external makefile.
So how do people that use VS actually handle external makefiles? I have yet to find a painless system to do this...
Or, in reality, do most people not bother, even though it's preached as good practice?
Take a look at MSBuild!
MSBuild can work with the sln/csproj files from VS, so for simple projects you can just call them directly.
If you need more control, wrap the projects in your own build process and add your own tasks etc. - it is very extensible!
(I wanted to add a sample but this editor totally messed up the XML... sorry)
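For what it's worth, here is a rough sketch (not the answerer's lost sample; the solution name and steps are placeholders) of the kind of wrapper project meant above:

<Project DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <Configuration Condition="'$(Configuration)' == ''">Release</Configuration>
  </PropertyGroup>

  <Target Name="Build">
    <!-- your own pre-build steps (version stamping, code generation, ...) go here -->
    <!-- MySolution.sln is a placeholder for your real solution -->
    <MSBuild Projects="MySolution.sln" Properties="Configuration=$(Configuration)" />
    <!-- your own post-build steps (tests, packaging, ...) go here -->
  </Target>
</Project>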
Ideally perhaps, in practice no.
Makefiles would be my preference as the build master; however, the developers spend all their time inside the Visual Studio IDE, and when they make a change, it's to the vcproj file, not the makefile. So if I'm doing the global builds with makefiles, they're too easily put out of sync with the project/solution files in play by 8 or 10 others.
The only way I can stay in step with the whole team is to run devenv.exe on the solution files directly in my build process scripts.
There are very few makefiles in my builds; where there are, they live in the pre-build or custom build sections or in a separate utility project.
One possibility is to use CMake - you describe with a script how your project is to be built, and CMake generates the Visual Studio solution/project files for you.
And if you need to build your project from the command line, or in a continuous integration tool, you use CMake to generate a Makefile for NMake.
And if your project is a cross-platform one, you can run CMake to generate the makefiles for the toolchain of your choice.
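For example (generator names for the VS2008-era toolchain; check cmake --help for the ones available in your install):

cmake -G "Visual Studio 9 2008" path\to\source    (generates the .sln/.vcproj files)
cmake -G "NMake Makefiles" path\to\source         (generates a makefile you then build with nmake)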
A simple CMake script looks like this:
project(hello)
add_executable(hello hello.cpp)
Compare these two lines with a makefile or the way you setup a simple project in your favorite IDE.
In a nutshell, CMake doesn't just make your project cross-platform, it also makes it cross-IDE. If you'd like to test your project with Eclipse, KDevelop, or Code::Blocks, just run CMake to generate the corresponding project files.
Well, in practice it is not always so easy, but the CMake idea just rocks.
For example, if you consider using CMake with Visual Studio, some tweaking is required to obtain the familiar VS project feeling; the main obstacle is organizing your header and source files, but it is possible - check the CMake wiki (and by writing a short script you might even simplify this task).
We use a NAnt script, which at the compile step calls MSBuild. Using NAnt allows us to perform both pre- and post-build tasks, such as setting version numbers to match source control revision numbers, collating code coverage information, assembling and zipping deployment sources. But still, at the heart of it, it's MSBuild that's actually doing the compiling.
You can integrate a NAnt build as a custom tool into the IDE, so that it can be used both on a build or continuous integration server and by the developers in the same way.
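A stripped-down sketch of that shape of NAnt build file (target names, file names, and paths are invented; the real scripts do considerably more):

<project name="MyProduct" default="build">
  <target name="set-version">
    <!-- stamp AssemblyInfo.cs with the source control revision, etc. -->
  </target>
  <target name="build" depends="set-version">
    <!-- MySolution.sln is a placeholder -->
    <exec program="msbuild.exe">
      <arg value="MySolution.sln" />
      <arg value="/p:Configuration=Release" />
    </exec>
  </target>
</project>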
Personally, I use Rake to call msbuild on my solution or project. For regular development I use the IDE and all the benefits that provides.
Rake is set up so that I can just compile; compile and run tests; or compile, run tests, and create deployable artifacts.
Once you have a build script it is really easy to start doing things like setting up continuous integration and using it to automate deployment.
You can also use most build tools from within the IDE if you follow these steps to set it up.
We use devenv.exe (the same exe that launches the IDE) to build our projects from the build scripts (or the command line). When you specify the /Build option, the IDE is not displayed and everything is written back to the console (or to the log file if you specify the /Out option).
See http://msdn.microsoft.com/en-us/library/xee0c8y7(VS.80).aspx for more information
Example:
devenv.exe [solution-file-name] /Build "Release|Win32" /Project [project-name] /Out solution.log
where "Release|Win32" is the configuration as defined in the solution and solution.log is the file that gets the compiler output (which is quite handy when you need to figure out what went wrong in the compile)
We have a program that parses the vcproj files and generates makefile fragments from that. (These include the list of files and the #defines, and there is some limited support for custom build steps.) These fragments are then included by a master makefile which does the usual GNU make stuff.
(This is all for one of the systems we target; its tools have no native support for Visual Studio.)
This didn't require a huge amount of work. A day to set it up, then maybe a day or two in total to beat out some problems that weren't obvious immediately. And it works fairly well: the compiler settings are controlled by the master makefile (no more fiddling with those tiny text boxes), and yet anybody can add new files and defines to the build in the usual way.
That said, the combinatorial problems inherent in Visual Studio's treatment of build configurations remain.
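Purely to illustrate the shape of such a fragment (this is not the actual tool's output), a generated piece and its use from the master makefile might look something like:

# foo.mk - generated from Foo.vcproj (illustrative names)
FOO_SOURCES := src/main.cpp src/parser.cpp
FOO_DEFINES := -DUSE_FOO -DFOO_VERSION=3

with the master makefile doing the usual GNU make work:

include foo.mk
OBJS     += $(FOO_SOURCES:.cpp=.o)
CPPFLAGS += $(FOO_DEFINES)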
Why would you want a project that "compiles on Windows that doesn't depend on the Visual Studio project"? You already have a solution file - you can just use it with a console build.
I'd advise you to use msbuild in conjunction with a makefile, NAnt, or even a simple batch file if your build system is not as convoluted as ours...
Is there something I'm missing?
How about this code?
public TRunner CleanOutput()
{
    ScriptExecutionEnvironment.LogTaskStarted("Cleaning solution outputs");

    // Delete the output and obj directories for every project in the solution.
    solution.ForEachProject(
        delegate (VSProjectInfo projectInfo)
        {
            string projectOutputPath = GetProjectOutputPath(projectInfo.ProjectName);
            if (projectOutputPath == null)
                return;

            projectOutputPath = Path.Combine(projectInfo.ProjectDirectoryPath, projectOutputPath);
            DeleteDirectory(projectOutputPath, false);

            string projectObjPath = String.Format(
                CultureInfo.InvariantCulture,
                @"{0}\obj\{1}",
                projectInfo.ProjectName,
                buildConfiguration);
            projectObjPath = Path.Combine(productRootDir, projectObjPath);
            DeleteDirectory(projectObjPath, false);
        });

    ScriptExecutionEnvironment.LogTaskFinished();
    return ReturnThisTRunner();
}

public TRunner CompileSolution()
{
    ScriptExecutionEnvironment.LogTaskStarted("Compiling the solution");

    // Invoke msbuild.exe on the solution file with the chosen configuration.
    ProgramRunner
        .AddArgument(MakePathFromRootDir(productId) + ".sln")
        .AddArgument("/p:Configuration={0}", buildConfiguration)
        .AddArgument("/p:Platform=Any CPU")
        .AddArgument("/consoleloggerparameters:NoSummary")
        .Run(@"C:\Windows\Microsoft.NET\Framework\v3.5\msbuild.exe");

    ScriptExecutionEnvironment.LogTaskFinished();
    return ReturnThisTRunner();
}
You can find the rest of it here: http://code.google.com/p/projectpilot/source/browse/trunk/Flubu/Builds/BuildRunner.cs
I haven't tried it myself yet, but Microsoft has a Make implementation called NMake, which seems to have Visual Studio integration:
NMake
Creating NMake Projects
Visual Studio, since VS2005, uses MSBuild to define and run builds. When you fiddle with project settings in the Visual Studio designer - let's say you turn XML doc generation on or off, add a new dependency, or add a new project or assembly reference - Visual Studio updates the .csproj (or .vbproj, etc.) file, which is an MSBuild file.
Like Java's Ant or NAnt before it, MSBuild uses an XML schema to describe the project and build. It is run from VS when you do an F6 build, and you can also run it from the command line, without ever opening VS or running devenv.exe.
So, use the VS tool for development and command-line MSBuild for automated builds - same build, same project structure.
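For example, a build server (or a developer at a command prompt) can run the very same project files that VS edits (the solution name here is a placeholder):

msbuild MySolution.sln /t:Rebuild /p:Configuration=Release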
So this is a question for anyone who has had to integrate the building/compilation of legacy projects/code in a Team Build/MSBuild environment - specifically, Visual Basic 6 applications/projects.
Outside of writing a custom build Task (which I am not against) does anyone have any suggestions on how best to integrate compilation and versioning of legacy VB6 projects into MSBuild builds?
I'm aware of the FreeToDev MSBuild tasks on CodePlex, but they've been withdrawn at the moment.
Ideally I'm looking to version and compile the code as well as capture the compilation output (especially errors) for the msbuild log.
I've seen advice on encapsulating this functionality in a custom task, but I really wondered whether anyone has tried another solution (aside from executing shell commands).
In essence, does anyone have a "cleaner" solution?
Ideally, executing commands via <Exec> would be a last resort.
The VB6 task will be back on Monday. With regard to versioning, there is no explicit VB6 versioning task in the pack; however, you could make use of the TfsVersion (TaskAction="GetVersion") and File (TaskAction="Replace") tasks. If you think there is value in creating a new task to encapsulate / provide other functions, then please let me know and I will add it to the pack for the benefit of the whole community.
Apologies for the withdrawal, but come Monday I'm sure all will understand.
I am using NAnt to build VB6 projects daily. This does resort to using the NAnt exec task to do the builds (we build 4 projects as part of one "solution").
It also lets you label versions in your source control repository, get the latest code, check in, check out - all the normal requirements - compile the update/setup programs, copy the files to the required locations, and send emails with the results.
The logged results are fairly minimal, though, as you only get the output provided by a VB6 command-line compile.
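For reference, the kind of thing the exec task ends up running looks roughly like this (the project name and log file are placeholders):

<!-- default VB6 install path; adjust to your machine -->
<exec program="C:\Program Files\Microsoft Visual Studio\VB98\VB6.EXE">
  <arg value="/make" />
  <arg value="MyProject.vbp" />
  <arg value="/out" />
  <arg value="MyProject-build.log" />
</exec>

The /out file captures the compile errors, which is about all the logging VB6 gives you.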
For versioning, I had to write a small app to extract the version number of my compiled executable and write it to a text file that NAnt could then read and use (for labels, file names, etc.). A bit of a pain, but VB-generated version numbers don't comply anyway.
For help with other non-core tasks see NAntContrib - from the NAnt link above.
I would like to use Visual Studio 2008 to the greatest extent possible while effectively compiling/linking/building/etc code as if all these build processes were being done by the tools provided with MASM 6.11. The exact version of MASM does not matter, so long as it's within the 6.x range, as that is what my college is using to teach 16-bit assembly.
I have done some research on the subject and have come to the conclusion that there are several options:
1. Reconfigure VS to call the MASM 6.11 executables, using the same flags, etc., that MASM 6.11 would use natively.
2. Create intermediary batch file(s) to be called by VS, which then invoke the proper commands for MASM's linker, etc.
3. Reconfigure VS's built-in build tools/rules (assembler, linker, etc.) to provide an environment identical to the one used by MASM 6.11.
Option (2) came up when I realized that the options available in VS's "External Tools" interface may be insufficient to correctly invoke MASM's build tools; a batch file to interpret VS's strict method of passing arguments might help, since much of my learning about how to get this working involved manually calling ML.exe, LINK.exe, etc. from the command prompt.
Below are several links that may prove useful in answering my question. Please keep in mind that I have read them all and none are the actual solution. I can only hope my specifying MASM 6.11 doesn't prevent anyone from contributing a perhaps more generalized answer.
A method similar to Option (2), but the users on the thread are not contactable:
http://www.codeguru.com/forum/archive/index.php/t-284051.html
(also, I have my doubts about the necessity of an intermediary batch file)
Out of date explanation to my question:
http://www.cs.fiu.edu/~downeyt/cop3402/masmaul.html
Probably the closest thing I've come to a definitive solution, but refers to a suite of tools from something besides MASM, also uses a batch file:
http://www.kipirvine.com/asm/gettingStarted/index.htm#16-bit
I apologize if my terminology for the tools used in each step of the code -> exe process is off, but since I'm trying to reproduce the entirety of steps in between completion of writing the code and generating an executable, I don't think it matters much.
There is a MASM rules file located at (on a 32-bit system, remove the " (x86)"):
C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\VCProjectDefaults\masm.rules
Copy that file to your project directory, and add it to the Custom Build Rules for your project. Then "Modify Rule File...", select the MASM build rule and "Modify Build Rule...".
Add a property:
User property type: String
Default value: *.inc
Description: Add additional MASM file dependencies.
Display name: Additional Dependencies
Is read only: False
Name: AdditionalDependencies
Property page name: General
Switch: [value]
Set the Additional Dependencies value to [AdditionalDependencies]. The build should now automatically detect changes to *.inc, and you can edit the properties for an individual asm file to specify others.
You can create a makefile project. In Visual Studio, under File / New / Project, choose Visual C++ / Makefile project.
This allows you to run an arbitrary command to build your project. It doesn't have to be C/C++. It doesn't even have to be a traditional NMake makefile. I've used it to compile a driver using a batch file, and using a NAnt script.
It should be fairly easy to get it to run the MASM 6.x toolchain.
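As a very rough idea of what the makefile project's build command might invoke for a 16-bit target (file names are placeholders; verify the switches against your MASM 6.x documentation):

ml /c /Zi hello.asm
link hello.obj;

Here ml /c assembles without linking, and the 16-bit segmented linker shipped with MASM 6.x accepts the trailing semicolon to take defaults for the remaining prompts.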
I would suggest defining custom build rules depending on the file extension.
(Visual Studio 2008, at least in the Professional Edition, can generate .rules files, which can be distributed.) There you can define custom build tools for asm files. By using this approach, you should be able to leave the linker step as is.
Way back, we used MASM32 as an IDE to help students learn assembly. You could check its batch files to see what they do to assemble and link.
Instead of batch files, why not use a custom build step defined on the file?
If you are going to use Visual Studio, couldn't you give them a skeleton project in C/C++ with the entry point of a console app calling a function that has an empty inline assembly block, and let them fill in their results?
Why don't you use Irvine's guide? Irvine's library is nice, and if you want you can ignore it and work with Windows procs directly. I've been searching for a guide like this, and Irvine's was the best solution I found.