TFS Build 2010 - Minify JS/CSS before deployment

I've got hold of the Ajax minifier in order to minify some JS and CSS files as part of a build/package/deploy process. It's a great tool and does exactly what we need. However, integrating this into our build/deploy process is proving very difficult.
Ideally, we want to run this tool only when we execute one of our TFS 2010 builds (i.e. NOT a local (Ctrl+Shift+B jobbie) build on a dev machine). Also, we want to replace our currently 'non-minified' files in this scenario with the minified ones (i.e. under the same file name) rather than having a load of additional files named '.min.js' etc.
After a lot of reading I think the key is a custom build task within the workflow - but I've no idea how to approach this - especially as I'm looking to minify files that will have been pulled down directly from our release branch in TFS (i.e. not in someone's local workspace) as part of a TFS 2010 build.
This is the closest discussion I've found to what I'm trying to achieve: Microsoft Ajax Minifier - TFS 2010 Workflow - AjaxMin in the TFS Build
I believe I will need a custom code activity within the build workflow but have no idea how to create one to solve this problem. Can anyone shed any light on a process which will allow minification prior to deployment?

OK - thanks to these answers and a lot of research... I came to the following solution :)
Starting with custom code activities, I tried to run the minifier from C# code and then called the activity as part of the workflow. This didn't work out: the .dll version of the minifier exposes a couple of methods for compressing both .js and .css files, but then makes you open up a StreamWriter of some kind and re-write the file with the compressed string returned from the method (if you want to overwrite your existing files). That's a lot of opening and closing of files, so I wasn't massively happy with that solution. Using the Process class to run the .exe with the -clobber option on (for overwriting files) isn't ideal either, and produced some odd results (not correctly minifying the files and writing some garbage at the head of each file).
So, the solution I settled on was to write a PowerShell script (the beginnings of which I got from here), which I then modified slightly to take a command-line parameter: the root folder of your project. The script recursively goes through each file (and each sub-directory's files) and minifies the .css and .js inside. Pretty neat. The bones of it look something like this:
# The root folder to minify is passed in as the first command-line parameter.
$ScriptDirectory = $args[0]
Write-Host "Validating directory parameter: $ScriptDirectory"
Write-Host ""
if ((Test-Path -Path $ScriptDirectory) -ne $True)
{
    throw "The parameter passed in ('$ScriptDirectory') isn't a valid directory."
}
$Minifier = "C:\Program Files\Microsoft\Microsoft Ajax Minifier 4\AjaxMin.exe"
# Recursively minify every .js/.css file in place, skipping already-minified files.
Get-ChildItem $ScriptDirectory -Recurse -Force -Include *.js, *.css -Exclude *.min.js, *.min.css | ForEach-Object { & $Minifier $_.FullName -out $_.FullName -clobber }
So we go through each child item of the root folder with an extension of .js or .css (ignoring extensions of .min.* as these have already been compressed).
In TFS, all we need to do is add an InvokeProcess step to execute the PowerShell script. You can pass your parameter in (the directory to start minifying) using the Arguments property of the InvokeProcess activity.
To get the directory that the TFS build is using to compile your code before it is released (the temporary workspace, if you like), you can use the SourcesDirectory variable available to you in the Run On Agent sequence of the build. This is the location where your files are compiled and packaged up by the TFS build process, so anything that gets minified here will end up in the final deployment package.
P.S. - the SourcesDirectory is quite high up - you might not want to drill all the way down from there to get to your .js and .css files, so you might need to specify something like:
SourcesDirectory + "/" + "MyProjectFolder/Scripts"
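For illustration, once that expression is resolved the InvokeProcess activity ends up running something equivalent to the following command line (the script location and agent workspace path here are purely hypothetical):
# Hypothetical resolved invocation of the minify script on the build agent
powershell.exe -File "C:\Builds\Scripts\Minify.ps1" "C:\Builds\1\MyProject\Sources\MyProjectFolder\Scripts"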
Make sure you add this InvokeProcess step before your code is deployed in the workflow and hey-presto - you'll have minified .js and .css files that keep their original file names, and only as part of a TFS build, not a local one.
Many thanks to all who answered and pointed me in the right direction. I hope this helps someone along the way!

Doing this as a custom code activity is certainly doable, but you'll have to put some effort in it. My suggestion would be:
First follow the TFS 2010 customisation blog from Ewald Hofman at http://www.ewaldhofman.nl/post/2010/04/29/Customize-Team-Build-2010-e28093-Part-4-Create-your-own-activity.aspx to learn about creating custom activities
Then take a look at http://www.ewaldhofman.nl/post/2010/06/01/Customize-Team-Build-2010-e28093-Part-10-Include-Version-Number-in-the-Build-Number.aspx from the same series to implement a mechanism to find all files that conform to a certain pattern. In that example, assemblyInfo.cs files are indexed and their content changed. Replace that by searching for your *.js files.
Release the power of Ajax Minifier on your file selection and replace the original file with the minified one.
Build the activity, include it as a step in the build process template you use in TFS 2010 (also described in the same blog posts) and fine tune it to the point you are satisfied with it.
Alternatively, you can ask the author of the post you have included in your question to share with us the Minifier TFS activity he created :-)
Please, let me know how that works out for you.

You need to implement an Invoke Process Activity so that Minifier gets executed during your TFS builds.
For this purpose you will also have to install the minifier on the server(s) doing your builds, the so-called build agent(s). By doing that you'll be ensuring that the Minifier gets invoked only during your TFS builds (as opposed to local VS builds).
In order to rename your generated output files (*.min.js) you need to implement another custom activity just for that.
Overwriting your checked-in files requires you to first make them writable; this means yet another custom activity (I've provided a snippet for that in another answer).
The whole choreography is
Invoke Minifier with InvokeProcess --> make checked in files writable --> overwrite checked in files with renamed minified files.
The right way to do this in TFS build is to wrap them in a Sequence.
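As a sketch of the "make checked in files writable" step - here done with a couple of lines of PowerShell rather than a full custom activity, and with a hypothetical $targetFolder variable - it could be as simple as:
# Clear the read-only flag that TFS sets on files fetched from source control
Get-ChildItem $targetFolder -Recurse -Force -Include *.js, *.css | ForEach-Object { $_.IsReadOnly = $false }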
A good introductory blog post about how to implement an InvokeProcess activity can be found here.
I have also found the series by E. Hofman to be of real value.

Related

Using PowerShell and TFS.exe to update and delete files from projects

I am using PowerShell to modify a series of configuration files within a solution. The solution is under TFS 2010 control.
The solution has many projects and the configuration files are all XML files. The easy part is when I just need to modify a file: I check it out using the checkout command, then save the file when I'm done. All good. I go into Visual Studio and see the modified files updated with pending changes, as I would expect.
The part I'm having difficulty with is when I have a configuration file that is no longer needed and can be deleted. Using the delete command does, in fact, mark the file for a pending delete, but it does not modify the project file where the deleted file is contained.
When I delete a file via Visual Studio, it automatically checks out and modifies the project file for me. I'm not getting the same result when using a command line delete.
It's not practical for me to do this by hand as I am eliminating over 1,000 files.
Any help would be greatly appreciated!
Thank you.
There are two components at work here. When you are running inside VS, the project system processes all file commands (adds, deletes, edits, etc.) and then calls into the TFS Object Model to actually pend the changes in TFS. The project system is also the one responsible here for removing the reference from the project file. The TFS OM has no knowledge of whether a file is part of a project or not when it is run outside of Visual Studio.
If you have a list of the XML files that you need to delete, your best bet is to write a script that reads these in and removes them from the project file (after pending an edit on the project file, of course).
-Taylor,
TFS Version Control Development Lead
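A minimal sketch of the script Taylor describes, assuming tf.exe is on the PATH and that $projectFile and $filesToDelete are supplied by you (both names, the paths, and the XPath used are illustrative):
# Pend an edit on the project file so it can be modified
& tf checkout $projectFile
# Load the project file and strip out the item entries for the deleted files
[xml]$proj = Get-Content $projectFile
$ns = New-Object System.Xml.XmlNamespaceManager($proj.NameTable)
$ns.AddNamespace("msb", "http://schemas.microsoft.com/developer/msbuild/2003")
foreach ($file in $filesToDelete)
{
    $node = $proj.SelectSingleNode("//msb:ItemGroup/*[@Include='$file']", $ns)
    if ($node -ne $null) { [void]$node.ParentNode.RemoveChild($node) }
    # Pend the delete on the file itself
    & tf delete $file
}
$proj.Save($projectFile)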
Thank you all for your responses. After much digging and trial and error, I figured it out. It was way simpler than I was making it.
In short, I used DTE and ran my script from within VS using the PowerShell console. It went something like this:
# Find the file's ProjectItem via the DTE automation model and remove it
$mySolution = $dte.Solution
$projectItem = $mySolution.FindProjectItem($fileToRemove)
if ($projectItem -ne $null)
{
    $projectItem.Remove()
}
Executing the Remove() command on the ProjectItem checks out the corresponding project and edits it accordingly.
Again, thank you again for the time you all took to look at my question and respond. Hope this helps someone else someday!

How to automatically download files in Visual Studio

I'd like to use Visual Studio to store XML files coming from a server in source control.
I have a request like http://server/query.aspx?FILE_ID=1234 that allows me to download an XML file. These files are part of our development activities, which is why I'm looking for a convenient way of integrating them into source control.
I'd like to have a project containing all the XML files I want to check in to source control, and to add a pre-build command that downloads the files, but I did not find any convenient way of doing it.
People have a tendency to forget to do it manually, and we have already seen all the possible scenarios: lost files, released versions without the ability to know the exact configuration used, ... I'd like to automate this step so that it does not happen again in the future.
I'm sure there is a simple and smart solution, but I could not find it. Any suggestion would be appreciated.
You should be able to use wget in a pre-build action to fetch the latest version of the files. I can't think of a reason why that wouldn't work.
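For example, assuming wget is installed on the build machine, the pre-build step might be as simple as (target path hypothetical):
# Hypothetical pre-build fetch; -O writes the response to the given file
wget.exe "http://server/query.aspx?FILE_ID=1234" -O "ExternalXml\1234.xml"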
Personally I would consider finding a way to automatically commit those files to source control whenever they change on the server. I've never used TFS, but I assume there is a command-line client which allows you to commit files in a scripted way. If you don't have any control over when the files change, you could do this every N minutes on a machine which is always on.
You can write a (PowerShell) script that does the fetching and check-in of the files before your build starts. That's how we fetch external assemblies to be included in our build.
To get you started, take a look at these powershell functions for TFS interaction:
http://www.brokenwire.net/bw/Programming/73/ (TFS 2008)
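A rough sketch of such a fetch-and-check-in script, assuming tf.exe is on the PATH (the URL, local path, and check-in comment are illustrative):
$url = "http://server/query.aspx?FILE_ID=1234"
$dest = "C:\src\MySolution\ExternalXml\1234.xml"
# Pend an edit so the local copy is writable
& tf checkout $dest
# Download the latest copy over the writable local file
(New-Object System.Net.WebClient).DownloadFile($url, $dest)
& tf checkin $dest /comment:"Refresh XML from server" /noprompt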

How to setup the target output path of a given resource file in Visual Studio

In the main project of my VS solution I have a Resources folder with some required external tools. When building and publishing the solution, I get a .\Resources folder with all the required files in it.
So far so good.
However I have to move some files to the parent directory.
My first attempt was to do so with the Post Build Events. It works and does move them to the correct folder.
Nevertheless in the publish output they still appear in the Resources folder and I need them in the parent one :/
Is there any way to setup the target output path for resources in Visual Studio?
After some research and experimenting, I solved my problem.
Still, here's what I learned in the process.
The first attempt was adding the file to the project root and marking it as a resource. After publishing, it worked. But having those files in the project root is lame.
Since I needed some *.exe files compiled in another VS solution, I added them as a project reference. Gave it a try and it passed the "Publish" test. But still... not the best way to do it.
After that, with some scripting and a post-build event, I copied the required files to the correct folder. It works... but after publishing, they don't appear in the package.
However, there is still a possibility with the Mage tool:
http://msdn.microsoft.com/en-us/library/acz3y3te.aspx
This led to some promising experiments; however, they ended up helping me realize how limited MS ClickOnce is, so I decided to try other tools.
Here's a good start to follow:
What alternatives are there to ClickOnce?
I had a similar situation once. I found it became more trouble than it was worth to customize output paths and such in Visual Studio, to the extent that I wanted.
I ended up letting Visual Studio do its own thing with regards to file/project structure, and wrote a post-build script to copy everything that was needed into a final, 'publish-ready' directory.
I then set the execution target in Visual Studio to the new location, so I could run/debug as normal, but with the new folder that was organized how I needed it. Careful, I think this is a user project setting; so other developers will need to do this on their machines too, if they so desire.
I do recall changing some output paths and such to make the post-build script simpler. But changing things like that can lead to annoyances when you add new projects to the solution; you might need to configure them to match. It's all a trade-off :)
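As a sketch, a post-build script along those lines (all paths and the folder layout are hypothetical) might look like:
# Assemble a 'publish-ready' folder next to the build output
$out = $args[0]                    # the project's output folder, e.g. bin\Release
$publish = Join-Path $out "..\Publish"
# Rebuild the publish-ready folder from scratch on every build
Remove-Item $publish -Recurse -Force -ErrorAction SilentlyContinue
New-Item $publish -ItemType Directory | Out-Null
Copy-Item (Join-Path $out "*.exe") $publish
Copy-Item (Join-Path $out "*.dll") $publish
# Pull the required tools up out of Resources\ and next to the executables
Copy-Item (Join-Path $out "Resources\*") $publish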
Two ideas:
Maybe you could move your resources into another project - a project just for resources - and then set their Build Action to Content and Copy To Output to true. Then reference this new project and build the solution. (This may not work as you want, just an idea).
Why not make your resources embedded resources instead? Keep them all within the Resources\ directory and access them programmatically.

How can I exclude files from being harvested with heat (WiX 3.5)?

I would like to harvest a folder with a lot of files by using heat.exe. But instead of harvesting all files, I would like to exclude specific file extensions like "*.txt" or something like that.
How can I do this?
I think the only option for now is to harvest the entire folder and apply a transform to the resulting .wxs file (see the -t:<xsl> switch) to exclude what is not required (.txt files in your case). However, I haven't tried the 3.5 version of heat (I'm judging based on 3.0), but I don't think there are changes in this area.
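A sketch of that approach - the transform below is illustrative and untested against your harvest output, and note that if you instead drop whole Component elements you would also need to drop the matching ComponentRef elements heat generates:
# Write a transform that drops every File whose Source ends in .txt,
# then apply it during harvesting via heat's -t switch
@'
<?xml version="1.0" encoding="utf-8"?>
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:wix="http://schemas.microsoft.com/wix/2006/wi">
  <!-- Identity template: copy everything through by default -->
  <xsl:template match="@*|node()">
    <xsl:copy><xsl:apply-templates select="@*|node()"/></xsl:copy>
  </xsl:template>
  <!-- Drop any File whose Source ends in .txt -->
  <xsl:template match="wix:File[substring(@Source, string-length(@Source) - 3) = '.txt']"/>
</xsl:stylesheet>
'@ | Set-Content Filter.xsl
& heat.exe dir .\MyFolder -gg -sfrag -t:Filter.xsl -out Harvested.wxs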
I'm not a huge proponent of this pattern. How do you ensure change control when using a non-deterministic process? How do you know a file that appeared in a directory really should ship in a product, and how do you know a file that vanished from the directory shouldn't break a build? How do you know whether you are breaking the component rules and creating serviceability issues?
I used to do dynamic file linking in the 1990's because it was "easy" but I can remember it biting me many times and I haven't done it ever since.
I know Bob Arnson used to agree with this view point:
http://www.mail-archive.com/wix-users@lists.sourceforge.net/msg03420.html
But now in WiX 3.5 I'm starting to see capabilities that support dynamic linking, and I just don't understand why they would go that way. I'd much rather update a WXS file and check it back into source control than risk putting my deployment process on autopilot.
Instead of trying to figure out how to harvest selected files from a folder, I use a before-build action to populate a folder with just the files that I want harvested. The following workflow has been working for me (a scripted sketch follows the list):
Delete the "files" folder if it exists
Create a "files" folder
Copy the files to the "files" folder. I use the robocopy build action, which gives me enough control to specify which files to include or exclude.
Harvest the entire folder.
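Sketched as a script (source path and exclusions hypothetical), the workflow above would be:
# Recreate the staging folder from scratch
Remove-Item .\files -Recurse -Force -ErrorAction SilentlyContinue
New-Item .\files -ItemType Directory | Out-Null
# /MIR mirrors the tree; /XF excludes the file patterns we don't want harvested
& robocopy ..\MyApp\bin\Release .\files /MIR /XF *.txt *.pdb
# Now harvest the staged folder wholesale
& heat.exe dir .\files -gg -sfrag -out Harvested.wxs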
I have it set to run the harvest action conditionally, just for debug builds. Release builds are generated from our TFS server and use the generated .wxs from source control. It should be OK to run harvest on the build server, but it's an extra step and not having it run eliminates the "non-deterministic process" problem described by Christopher Painter. Other than that one step, the same steps execute on the build server as they do on my dev machine.

Working with XSLT in Visual Studio

In my C# client application, I use XSLT to transform XML into HTML.
I would like to be able to edit these files in place, without having to recompile the entire solution. I'm having trouble working out how to set up Visual Studio 2008 to allow this.
The problem is that the XSLT files must get copied to the output directory somehow. Currently this happens during the build process. (My XSLT files are set to "copy if newer".) The build process can take a few minutes, which seems excessive for making small tweaks to the HTML.
I could make my XSLT edits in the output directory itself, but the output directory is not under source control. I have accidentally wiped out my quick edits several times by building my solution.
I'd like to reduce the cycle time for debugging XSLT, while keeping my XSLT files under source control and preventing accidental overwrites.
Summary of Responses: It appears that the most practical approach for solving this problem -- given that Visual Studio doesn't have a nice way of doing it out of the box -- is to create a separate project that contains the content files. These files get copied to the output location when the project gets built. That way I don't have to compile the whole solution, just the one project with all the static information like XSLT, CSS, images, etc.
Several folks suggested using sync or batch copy tools, but while this would work for me personally, setting it up for the other members of the team too would be a lot of extra work.
I am not entirely clear about your question, but you can instruct Visual Studio to copy the file from the solution to the output folder every time that you build.
Let me try to understand your scenario:
You have the XSLT files checked into source control along with your C# code. For example, if your project is in a folder called MyProj, then the XSLT files reside in MyProj/Templates
You want to be able to edit the xslt files in the Templates folder and submit those changes to source control just like you do with .cs or other files in your project.
You want a copy of your xslt files in the bin/Debug or bin/Release folder along with your executable.
If that is the case, add the XSLT files to your Visual Studio project. Then right click on them, open Properties, and set "Build Action" = "Content" and "Copy to Output Directory" = "Always". Whenever you build your project, the latest copy of the XSLT files will be placed in your bin/Debug or bin/Release directory.
One approach is to include a C# preprocessor directive to point my XSLT load function to the solution directory when in debug mode, but to the output directory when doing a release build.
Something like:
string viewFolder = AppDomain.CurrentDomain.BaseDirectory;
#if DEBUG
// Move up from /bin/Debug to the project folder
viewFolder = viewFolder + @"..\..\";
#endif
But that feels like a hack.
Apparently you're managing two concerns in one project. The first concern is your business logic (instantiating an XSLT transform, calling it to transform some XML content, outputting the HTML result....). The second concern is the Transformation itself.
So why not create a separate project for your XSLT sheets? "Building" this project would consist of copying the sheets to the output folder. Changing XSLT will not influence the other project, hence reducing the build time.
Separation of Concerns at project level, that is :)
You can edit the file directly in the output folder.
On another note, a lot of people don't know that rich tools are built into VS to allow debugging XSLTs.
http://msdn.microsoft.com/en-us/library/ms255605(VS.80).aspx
One solution that might work for you is to set up a junction to your Templates folder in your output folder. This would allow you to use the XSLTs directly without copying them to the output folder. A good idea is to ensure (create) the junction as a build action (see the sketch after the prerequisites).
Prerequisites:
NTFS
A tool to create junctions (e.g. junction)
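A sketch of ensuring the junction as a build step, using the Sysinternals junction tool (both paths are hypothetical):
$link = "C:\src\MyProj\bin\Debug\Templates"
$target = "C:\src\MyProj\Templates"
# Create the junction only if it doesn't already exist
if (-not (Test-Path $link))
{
    & junction.exe $link $target
}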
Create a batch file that copies your XSLTs from their source-controlled location to all your bin directories (bin/Debug, bin/Release, or whatever ones you have defined) - a sketch follows these steps
Add the batch file as an External Tool, optionally assigning a keystroke (or chord) to execute the batch file
Edit, run tool (I'd assign a keystroke to this to make this easy), then check your webpage.
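The batch file itself could be as simple as this PowerShell equivalent (paths hypothetical):
# Push the source-controlled XSLTs into every bin flavour
foreach ($config in "Debug", "Release")
{
    Copy-Item "C:\src\MyProj\Templates\*.xslt" "C:\src\MyProj\bin\$config\" -Force
}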
Could you use a file synchronization program (e.g. Microsoft SyncToy, which "is a free application that synchronizes files and folders between locations") to copy the files? This would allow you to avoid the "copy on build" step because the files are automatically copied after being saved. Also, if you edited them in the output directory, the changes could be copied back into your source-controlled directory. Not sure what the best real-time sync program for this scenario is, but that could be another question.
I have exactly the same issue. I have bought a program called ViceVersa (http://www.tgrmn.com/) in which I have setup sync profiles so that my css, layout and xslt folders are synced from my machine to my dev server as soon as any changes are made. If I make any code changes then I just publish as normal.
I understand this is an older post but I found a different solution to basically the same problem.
Visual Studio allows you to 'link' files.
Right-click on the folder in the solution where you want the file link to be located.
Click 'Add' > 'Existing Item...' and select the file.
Go to the 'Add' drop-down and select 'Add as Link'.