I want to export all targets imported in an MSBuild file.
I have an MSBuild file named MsBuildTest.proj that imports several .targets files via Import statements. These files are all in different locations, and I have to go to each folder every time I want to open one. Is there any way to export all of the imported targets to a single file, so that I can save a lot of time?
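For reference, the imports look roughly like this (the paths below are made-up placeholders, not my real ones):
<Import Project="..\..\Common\Build.Common.targets" />
<Import Project="C:\BuildTools\Versioning.targets" />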
Thanks
First, open a command prompt and navigate to the location of MsBuildTest.proj.
Then "preprocess" the MSBuild script with the MSBuild preprocess switch, like so:
msbuild.exe MsBuildTest.proj /pp >MsBuildTest.txt
This flattens the MSBuild import hierarchy to standard output and redirects it into a text file via the > operator.
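Depending on your MSBuild version, the /pp switch can also write the preprocessed result straight to a file; since the output is XML, an .xml extension is a natural choice:
msbuild.exe MsBuildTest.proj /pp:MsBuildTest.xml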
I'm working on a custom comparison tool in Visual Studio 2012 with TFS 2012 installed, and I'm having trouble getting around TFS's use of temporary files for comparison. In Tools > Options > Source Control > Visual Studio TFS > Configure User Tools..., shown here:
I'm modifying the command for comparing .xml files, and TFS gives me the following options to add as command-line parameters to PowerShell:
The issue is, I'm calling a custom PowerShell script that does some processing and decides what action to take based on the files being compared. However, some of the information required for that processing is the actual source control file path of each file, which would appear to be stored in command-line args %1 and %2. Unfortunately, TFS uses .tmp files to actually perform comparisons, so each of those paths points to a .tmp file (stored in local AppData) instead of the original .xml file path that I need.
The contents of the files do not have the information I'm looking for, but the file path is guaranteed to. Is there any way to use any of the other command-line arguments provided by TFS to pass the original file paths through to PowerShell, or can I somehow tie the .tmp files back to the .xml paths? I'm kind of stuck at this point. Thanks!
There is no option to specify the original file in the arguments. I tested on my side with TFS 2015 + VS 2015: when I added the PowerShell tool to compare .xml files, it did compare the .xml files, not .tmp files.
You can also specify the Source Path and Target Path when you compare the .xml files:
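If it helps, one common way to wire the script up is to point the command at powershell.exe and pass the two file arguments straight through (the script path below is a placeholder):
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\Compare-Xml.ps1" "%1" "%2"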
I have a zip file containing my installation files. I'm extracting this zip and copying the files into the installation directory with the script shown below:
ZipDLL::extractall "$OUTDIR\demo.zip" "C:\myapp\demo\"
If I remove the zip file from $OUTDIR, then the installer cannot find it, as expected. What I want to do is embed this zip, or its extracted folders, into the .exe itself. I added
File -r "$OUTDIR/demo"
but this didn't work either.
When you use the ZipDLL plugin, you are referring to the file you want to process (demo.zip) by its location at run time: alongside the installer .exe.
When you use the File statement to embed some files into the produced installer, you need to refer to the files by using their place at compile time.
Replace the $OUTDIR in the File statement with a path relative to the .nsi script.
By the way, you should get into the habit of checking the compilation log; NSIS usually warns you about this kind of problem when paths are incorrect at compile time.
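A minimal sketch of the compile-time approach, assuming the extracted demo folder (or demo.zip) sits next to the .nsi script; adjust the paths to your layout:
Section "Install"
  ; Option 1: embed the already-extracted demo folder at compile time
  SetOutPath "C:\myapp\demo"
  File /r "demo\*.*"
  ; Option 2: embed the zip and extract it at run time instead
  ; InitPluginsDir
  ; SetOutPath "$PLUGINSDIR"
  ; File "demo.zip"
  ; ZipDLL::extractall "$PLUGINSDIR\demo.zip" "C:\myapp\demo\"
SectionEnd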
I have a VS 2012 project with a structure like this:
Project
  Folder1
    file.xml
    schema.xsd
    code.cs
  Folder2
    code1.cs
    code2.cs
I set the Copy to Output Directory property of file.xml and schema.xsd to Copy always and want them to be output to the same folder as the assemblies (bin\Debug), but they are always copied to bin\Debug\Folder1. Is there a way to achieve this without moving the files to the root of the project?
I recommend you do this with a post-build event script. It's not as hectic as it sounds, and it gives you loads of flexibility.
Right-click on your project and select Properties.
Go to the Build Events tab and click Edit Post-build...
(See screenshots below)
You can now specify shell commands to be executed after each successful build of your project. The following achieves the example in your question:
copy "$(ProjectDir)Folder1\file.xml" "$(TargetDir)"
copy "$(ProjectDir)Folder1\schema.xsd" "$(TargetDir)"
$(ProjectDir) and $(TargetDir) are macros which insert the respective values relevant to your project. You can select and insert macros from the macro menu in the edit window.
The quotes are included above because $(ProjectDir) and $(TargetDir) might resolve to full paths that contain spaces, and spaces will break the copy command if the paths aren't quoted.
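For example, with a project living under an illustrative path like C:\Work\MyProject, the commands expand to roughly:
copy "C:\Work\MyProject\Folder1\file.xml" "C:\Work\MyProject\bin\Debug\"
copy "C:\Work\MyProject\Folder1\schema.xsd" "C:\Work\MyProject\bin\Debug\"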
I'm working on a visual studio solution with over 30 projects and multiple filters.
What is the easiest way to determine all the projects a file belongs to?
First, open a command shell window and create a list of all project files in a text file. For example, for C# projects (files ending in .csproj), run this command in the root folder of your solution:
dir /s /b *.csproj >projectlist.txt
Then you can easily determine all projects containing a specific file with the command
findstr /f:projectlist.txt /m Name_Of_Your_File
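For example, looking for a hypothetical file named MyHelper.cs prints one matching project file per line, something like (paths illustrative):
findstr /f:projectlist.txt /m MyHelper.cs
C:\Solution\ProjectA\ProjectA.csproj
C:\Solution\ProjectA.Tests\ProjectA.Tests.csproj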
Just a suggestion: you can avoid a lot of trouble in the future if you make sure each project has its own folder and all files belonging to that project are in or below that folder.
Use AgentRansack or a similar tool that can search for text contained in files.
Use the following settings:
File Name: *.csproj
Containing Text: YourCodeFile.cs
Look in: YourSolutionFolder
Run the search and you will get a list of all project files that contain the .cs file.
The Scenario
My project has a post-build step set up to run a batch file, which reads a text file, version.txt. The batch file uses the information in version.txt to inject a version block into the DLL using this tool.
The version.txt is included in my project to make it easy to modify. It looks a bit like this:
#set #Description="TankFace Utility Library"
#set #FileVersion="0.1.2.0"
#set #Comments=""
Basically the batch file renames this file to version.bat, calls it, then renames it back to version.txt afterwards.
The Problem
When I modify version.txt (e.g. to increment the file version) and then press F7, the build is not seen as out of date, so the post-build step is not executed and the DLL's version doesn't get updated.
I really want to include the .txt file as an input to the build, but without anything actually trying to use it.
If I #include the .txt file from a CPP file in the project, the compiler fails because it obviously doesn't understand what "#set" means.
If I add /* ... */ comments around the #set commands, the batch file hits some syntax errors but eventually succeeds. But I think this is a poor solution.
So... how would you do it?
This works in VS2005. If you're not using that, some of the settings may be in different places or with different names.
Add the text file to your project, right-click on it in the Solution Explorer, and select Properties. Under Configuration Properties > General, make sure that the file is not excluded from the build. Under Custom Build Step > General, put your existing post-build command as the Command Line setting, and make sure you specify your .txt file as the output file. Now F7 should spot changes to the text file and run your batch file.
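As a rough illustration (the batch file name below is a placeholder for whatever your existing post-build command actually is, and the output follows the advice above of naming the .txt file):
Command Line:  call "$(ProjectDir)apply_version.bat"
Outputs:       version.txt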
This may be too hacky, but it might work:
Write a quick Perl (etc.) script to check whether version.txt has been updated.
If the file has been modified, have this script update a dummy source file that is compiled with your project.
Run the script as a pre-build event.
This way, if the script sees that the file has changed, it will touch the dummy file, which will force a rebuild.
Hacky, but try it if you're scraping the bottom of the barrel. A rough sketch of the idea follows below.
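A minimal sketch of that pre-build script, written here in Python rather than Perl purely for illustration (file names are placeholders):
import os

VERSION_FILE = "version.txt"          # the file whose changes should trigger a rebuild
DUMMY_SOURCE = "version_stamp.cpp"    # a dummy source file compiled as part of the project

def touch(path):
    # update the file's modification time, creating it if it doesn't exist
    open(path, "a").close()
    os.utime(path, None)

# if the dummy file is missing or older than version.txt, touch it so the
# next build is considered out of date and the post-build step runs again
if (not os.path.exists(DUMMY_SOURCE)
        or os.path.getmtime(VERSION_FILE) > os.path.getmtime(DUMMY_SOURCE)):
    touch(DUMMY_SOURCE)
    print("version.txt changed - touched " + DUMMY_SOURCE)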