Visual Studio - Publish to multiple locations?

Is there a way to automatically publish a website to multiple locations at once?
Our website is load balanced across multiple servers, so when I want to publish I have to do it to each server individually.
Thx,
Trev

Perhaps with some build scripts, such as MSBuild? Or you could create a script (PowerShell, VBScript, whatever) that copies the contents of a directory, and invoke it in the post-build event (configurable in Visual Studio), so that once your solution (or, strictly, the last project) is built, the script runs and copies the output files wherever you need them.
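For example, a minimal post-build sketch of that idea in PowerShell might look like the following; the source path and UNC shares are hypothetical placeholders, not taken from the question:

# deploy-copy.ps1 -- copy build output to every load-balanced server.
# The source path and server shares are hypothetical placeholders.
param(
    [string]$SourceDir = "$PSScriptRoot\bin\Release\_PublishedWebsites\MySite"
)

# One UNC share per web server in the farm.
$targets = @(
    '\\web01\wwwroot\MySite',
    '\\web02\wwwroot\MySite'
)

foreach ($target in $targets) {
    Write-Host "Copying $SourceDir -> $target"
    # /MIR mirrors the source, removing files that no longer exist on the target.
    robocopy $SourceDir $target /MIR /NFL /NDL
}

You could then wire this up in the project's post-build event, e.g. powershell.exe -ExecutionPolicy Bypass -File "$(ProjectDir)deploy-copy.ps1".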

You could:
1. Publish the content to a UNC share and have all of the web servers work off of that UNC.
2. Push the content from your staging/QA server to production using a tool like MSDeploy.
3. Use FRS to replicate the files from a "master" web server to everyone else.

MSDeploy is set up to handle this using multiple configurations, one for each location. Scott Hanselman presented on this at Mix '10:
http://live.visitmix.com/MIX10/Sessions/FT14
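As a rough sketch of the multi-location idea (not taken from the session), you could loop over the servers and call msdeploy.exe once per destination; the server names and paths below are hypothetical placeholders:

# Push the same content to every server in the farm, one sync per destination.
# Server names and paths are hypothetical placeholders.
$servers = 'web01', 'web02', 'web03'
$source  = 'C:\Builds\MySite'   # folder produced by the build/publish step

foreach ($server in $servers) {
    & msdeploy.exe -verb:sync `
        "-source:contentPath=$source" `
        "-dest:contentPath=C:\inetpub\wwwroot\MySite,computerName=$server"
}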

Or just use multiple publish commands, one for each location.

Related

VS2010 - Zip and upload to FTP

Is it possible to have Visual Studio zip up certain files and upload them to an FTP server whenever I have a successful build? What would be the best way to go about doing it?
Of course you can. Set up a post-build script to copy the various files to your FTP server. How you do that depends on your network configuration (it may be a simple copy if it is a shared drive).
Alternatively, use Jenkins and have an automated build.
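A minimal sketch of the zip-and-upload idea as a post-build script, assuming a PowerShell version with Compress-Archive available (v5 or later); the host name, credentials, and paths are hypothetical placeholders:

# Zip the build output and push it to an FTP server after a successful build.
# Paths, host name, and credentials are hypothetical placeholders.
$outputDir = "$PSScriptRoot\bin\Release"
$zipPath   = "$PSScriptRoot\site.zip"

Compress-Archive -Path "$outputDir\*" -DestinationPath $zipPath -Force

$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential('ftpUser', 'ftpPassword')
$client.UploadFile('ftp://ftp.example.com/site.zip', $zipPath)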

Visual Studio: Pre-Build add contents of directory to project

I have the following setup:
Main Website - MVC 3 project, to be hosted on www.domain.com
Intranet Web App - MVC 3 project, windows authentication, hosted on admin.domain.com, which is only accessible from within the local subnet.
CDN Website - A simple web app that merely serves images to both of the above. It will be hosted (publicly) on cdn.domain.com when we go live. I have set up a local project to mock the CDN during development.
I've written a business layer that allows users in the admin panel to upload images, which are then physically saved to the CDN path that's configured (currently on the local machine i.e. C:\Code\SolutionName\CDNProject\images). The main website then uses the same business layer to find and distribute the images via http://cdn.domain.com/images/. http://cdn.domain.com is currently set to http://localhost:55555, while we develop.
Whenever an image is created via the admin panel, it is physically created on disk. Each developer works on his own machine, so we want to be able to check these files in to TFS, for the time being. As you might have guessed, adding files to the file system does not automatically reference them in the project:
I thought there may be some way to reference these images as resources, or set a directory to a "content" directory of sorts... but I can't find anything.
Some developers work remotely via VPN, and do not have access to the local network (only TFS), so a network path is not an acceptable solution.
I thought I might be able to set a pre-build event up, to add all files in a directory to the project?
There is no very easy way to do that. There are a few approaches to consider:
1) Write a VS add-in that adds new files to the project (via DTE, as a starting point), find out how to run it automatically as a pre-build step, and install the add-in on your developers' machines.
2) Extend your admin logic to automatically check the uploaded files in to TFS via the TFS API (see the sketch below).
3) Try a more sophisticated technique like this one: T4 Tutorial: Integrating Generated Files in Visual Studio Projects.
Hope that helps,
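For option 2, a rough sketch of the idea using the tf.exe command-line client rather than the TFS API directly; the parameter name and comment text are hypothetical:

# After the admin panel saves an uploaded image, pend it and check it in.
# Assumes tf.exe is on the PATH and the CDN images folder is mapped in a TFS workspace.
param([string]$UploadedFile)  # full path of the file just written to disk

& tf.exe add $UploadedFile /noprompt
& tf.exe checkin $UploadedFile "/comment:Auto check-in of uploaded CDN image" /noprompt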
Visual Studio project files have an XML syntax, so files can be added to or removed from a project by editing the project file in a simple text editor.
You could create a script that, before actually opening the solution, scans that directory and "injects" the files (with the appropriate XML tags) into the project file.
I don't think you can do this as a pre-build event, because the project files are already loaded at that point and you cannot modify them while they're in use.
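A rough sketch of that kind of injection script in PowerShell; the project path and images folder below are placeholders, and there is no duplicate checking:

# Add every image under the CDN images folder to the .csproj as Content items.
# The project path and folder name are hypothetical placeholders.
$projectPath = 'C:\Code\SolutionName\CDNProject\CDNProject.csproj'
$imagesDir   = 'C:\Code\SolutionName\CDNProject\images'

[xml]$proj = Get-Content $projectPath
$ns = $proj.Project.NamespaceURI   # MSBuild XML namespace

$itemGroup = $proj.CreateElement('ItemGroup', $ns)
Get-ChildItem $imagesDir -Recurse -File | ForEach-Object {
    # Include paths are relative to the .csproj location.
    $relative = $_.FullName.Substring((Split-Path $projectPath).Length + 1)
    $content  = $proj.CreateElement('Content', $ns)
    $content.SetAttribute('Include', $relative)
    $itemGroup.AppendChild($content) | Out-Null
}
$proj.Project.AppendChild($itemGroup) | Out-Null
$proj.Save($projectPath)

Run it before opening the solution, as suggested above; doing it during the build itself is problematic because the project is already loaded.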

Is there a better way to deploy web applications using Visual Studio 2010?

I am using Visual Studio 2010 and IIS 7.0. Currently, when I want to deploy a website to my web server, I follow these steps:
1. Right-click the website and choose Publish... to get the entire site copied to a local folder.
2. Next, using FileZilla, FTP the copied files to the web server.
The problem is that I have to deploy the entire website every time, since I can't keep track of the changes. Although I do find my way easy and problem-free, I don't want to do a whole lot of configuration and deployment packaging unless it is really worth it and also relatively easy to do. Is there a better way I should do the deployment? Any suggestions are welcome!
You could use the Web Deployment Tool (Web Deploy). It needs to be installed on the web server too, and it can even take care of publishing a SQL Server database.
http://www.iis.net/download/WebDeploy
Do NOT use the Web Platform installer to install this package.
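As a rough sketch (not from the original answer), a Web Deploy sync from the command line might look like the following; the server name, paths, and credentials are hypothetical placeholders. A sync only transfers the files that differ between source and destination, which addresses the "I can't keep track of the changes" problem:

# Sync the local site content to the remote IIS server over Web Deploy.
# Server name, paths, and credentials are hypothetical placeholders.
& msdeploy.exe -verb:sync `
    "-source:contentPath=C:\inetpub\wwwroot\MySite" `
    "-dest:contentPath=C:\inetpub\wwwroot\MySite,computerName=webserver01,userName=DOMAIN\deploy,password=secret"
# Add -whatif to preview what would be copied without changing anything.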
You can just right-click the website and choose Publish Web Site; the Publish Web Site wizard opens. Click the ... button next to the Target Location textbox, choose FTP on the left-hand side, then enter your FTP credentials.
You can tick 'Allow this precompiled site to be updateable' in case you need to make minor changes later (such as scripts, CSS, or HTML), but I don't know how reliable that is.
Good luck!
Scott Gu just published an article about the Deploy Features in VS today:
http://weblogs.asp.net/scottgu/archive/2010/07/29/vs-2010-web-deployment.aspx
Personally I use Dispatch for ASP.NET. Works well for me. It only uploads the files that have changed and can check for files that are missing locally or on the server.
http://dispatchasp.net/
If you are using the Publish wizard then you have no choice but to deploy the whole site. There is no way for the wizard to look at the files on the server and know definitively whether a file has changed or not (it could look at file size or something, but that's not a 100% guarantee, and FTP doesn't offer an easy way to run a checksum algorithm).
Other than that, do it the way you would in any other language/tool: just manually FTP the files you've changed. Of course, this means you have to know which files are affected by your changes. And if you're not confident about which files you've affected... the Publish wizard is your friend :)

How to backup existing database and website as part of a MSDeploy package?

I am researching one-click deployment with Visual Studio 2010. The current deployment process involves zipping up the contents of the IIS folder and taking a backup of the current database before completing the remaining manual deployment steps. This allows us to roll back a deployment, so I need to retain the essence of this process, if not the specifics.
Is there a way of automating this with MSDeploy?
You can have MSDeploy execute a batch file that backs up the IIS directory (see example).
You can also write some SQL, put it in a .sql file, and execute that SQL script from the batch file as well. See this example to at least get a start. It is for SQL Server, but if you are not using that, hopefully your database has something similar.
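A minimal sketch of what such a backup script could look like, written here in PowerShell rather than a .cmd file; the paths, database name, and server instance are hypothetical placeholders:

# Backup.ps1 -- back up the site folder and the database before a deployment.
# Paths, database name, and server instance are hypothetical placeholders.
$stamp     = Get-Date -Format 'yyyyMMdd-HHmmss'
$siteDir   = 'C:\inetpub\wwwroot\MySite'
$backupDir = "D:\Backups\MySite-$stamp"

# 1. Copy the current IIS content to a timestamped folder.
robocopy $siteDir $backupDir /E /NFL /NDL

# 2. Back up the database with sqlcmd (SQL Server).
& sqlcmd -S .\SQLEXPRESS -E -Q "BACKUP DATABASE [MySiteDb] TO DISK = N'D:\Backups\MySiteDb-$stamp.bak'"

The runCommand provider shown below could then invoke a script like this via powershell.exe instead of Backup.cmd.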
Finally I found the answer; thank you to kniemczak for posting the information about how to back up IIS and SQL Server from the command line.
Automating ASP.NET MVC deployments using Web Deploy
Web Deploy runCommand Provider
It seems the following:
msdeploy.exe
-verb:sync -source:runCommand='C:\Scripts\Backup.cmd'
-dest:auto,computername=192.168.0.1
Should cover my needs.

Changing an IIS6 website directory remotely

First, the prior situation: We have this project with a one-click build script. It's cobbled together with TFS Deployer + PowerShell + VB Script. TFS Deployer sits on the production machine, copies the new website files into a brand new directory, and then calls a VB Script that changes the IIS website to the new directory.
Now, I'm moving the team away from the horror that is TFS/MSBuild. I have a TeamCity build agent on a dedicated build server. A simple NANT script deploys the build artifacts from the build server to the production server through a shared folder. Simple, quick, and effective.
However, I haven't found either a) a way to run the VB Script remotely, or b) a way to update the IIS site remotely with a different mechanism (programmatically, within the one-click build). Windows Server 2003/IIS 6. Any ideas?
Update: I solved this by creating another vbs that remotely called the old vbs I had through WMI. Thanks everyone!
If I were to head in any direction, I would consider setting up a WMI script to do the work and then configuring it on the server in question. I would have to think about how to easily include this in your automated build. I personally have not worked with TeamCity yet, although I have attended sessions on how it works.
WMI might be able to run the script, as well, and act as a sort of service front end, so you may be able to reuse what you have already spent effort on.
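A rough sketch of that kind of remote invocation from the build agent, using PowerShell and WMI's Win32_Process; the server name and remote script path are hypothetical placeholders:

# Launch the existing site-switching script on the production box via WMI.
# The server name and remote script path are hypothetical placeholders.
$server  = 'prodweb01'
$command = 'cscript.exe //B C:\Deploy\SwitchSite.vbs'

# The process starts non-interactively on the remote machine.
Invoke-WmiMethod -ComputerName $server -Class Win32_Process -Name Create -ArgumentList $command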
Could you change the vbscript file into an ASP file in a different website on the same server? This would allow you to call it remotely.
We've used NAntContrib's mkiisdir task to create/modify a virtual directory on remote machines.
<mkiisdir iisserver="Staging" dirpath="c:\temp" vdirname="Temp" />
This should either create (if the vdir doesn't exist) or change the location (if the vdir already exists).
Generally, it seems the cleanest way to do this is to first delete the vdir with the deliisdir task, followed by a create.
<deliisdir vdirname="Temp" failonerror="false" />
<mkiisdir dirpath="c:\temp" vdirname="Temp" accessread="true" accesswrite="false" accessscript="true" enabledirbrowsing="false" authntlm="true" authbasic="false" authanonymous="false" appcreate="Pooled" />
Happy coding!
