How to back up an existing database and website as part of an MSDeploy package? - visual-studio-2010

I am researching one-click deployment with Visual Studio 2010. The current deployment process involves zipping up the contents of the IIS folder and taking a backup of the current database before completing the remaining manual deployment steps. This allows us to roll back a deployment, so I need to retain the essence of this process, if not the specifics.
Is there a way of automating this with MSDeploy?

You can have MSDeploy execute a batch file that backs up the IIS directory (see example).
You can also write some SQL, put it in a .sql file, and have the batch file execute that script as well. See this example to at least get a start. It is for SQL Server, but if you are not using that, hopefully your database has something similar. A rough sketch of such a batch file follows.
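As a loose illustration only (the paths, database name, and the use of 7-Zip are my assumptions, not details from the linked examples), such a Backup.cmd might look like:

rem Backup.cmd -- sketch only; every path and name below is a placeholder
rem Zip the IIS folder (assumes 7-Zip is installed at its usual location)
"C:\Program Files\7-Zip\7z.exe" a "D:\Backups\MySite.zip" "C:\inetpub\wwwroot\MySite\*"
rem Back up the database with sqlcmd, which ships with SQL Server
sqlcmd -S localhost -E -Q "BACKUP DATABASE MyDb TO DISK = N'D:\Backups\MyDb.bak' WITH INIT"

MSDeploy's runCommand provider (shown in the accepted answer below) can then run a script like this on the target machine.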

I finally found the answer; thanks to kniemczak for posting the information about how to back up IIS and SQL Server from the command line:
Automating ASP.NET MVC deployments using Web Deploy
Web Deploy runCommand Provider
It seems the following:
msdeploy.exe
-verb:sync -source:runCommand='C:\Scripts\Backup.cmd'
-dest:auto,computername=192.168.0.1
should cover my needs.

Related

How to re-deploy, re-create database on each test run

Currently I'm using Visual Studio 2012 RC and SQL Server 2012 RTM.
I'd like to know how to re-deploy/re-create a test database for each test run.
Keep in mind I have a SQL Server database project for the database, using Visual Studio 2012's template.
I have an idea I'm not very sure about: the .testsettings file has setup and cleanup scripts. Is this the way to go? For example, a PowerShell script that reads the script generated by the database project and executes it against the database?
I guess there are better ways of doing this, and there may even be an out-of-the-box solution, but I'm not aware of it and Google hasn't helped me find the right one.
As mentioned, you'll probably want to use the VS 2012 .Local.testsettings > Setup and Cleanup scripts to create and tear down your SQL Server database.
For the script, you may want to use PowerShell with a .dacpac (rather than just a T-SQL script), since you are using an SSDT project. Here's a link to some example code; in particular, you may want to take a look at the 'Deploy-Dac' command.
If you are unfamiliar with .dacpacs as the (build) output of SSDT-created database projects, take a look at this reference link.
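If the linked sample isn't to hand, SqlPackage.exe (which ships with SSDT) can also publish a .dacpac from a setup script. A minimal sketch, assuming a VS 2012-era install path, a LocalDB target, and placeholder file and database names:

rem DeployTestDb.cmd -- hypothetical test-setup script; paths and names are placeholders
"C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\SqlPackage.exe" ^
  /Action:Publish ^
  /SourceFile:"C:\Build\MyDatabase.dacpac" ^
  /TargetServerName:"(localdb)\v11.0" ^
  /TargetDatabaseName:MyTestDb

Publish works against both new and existing databases, which makes it a reasonable fit for a per-run setup script.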
Edit: Although this doesn't answer the question in a plain SQL Server way, an easy Entity Framework approach is the following: I found that I could correctly create and destroy my database on every run by using the DbContext.Database.CreateIfNotExists() and DbContext.Database.Delete() methods in the setup and cleanup phases of my tests.
The fastest solution, while a bit of a hack, is really straightforward. In the database project's properties, under the Debug tab, set it to always re-create the database. Then testing takes two clicks: do a build, then run all tests, and you get a freshly built database on LocalDB for your tests to run against. You can also change the target debugging database (again in the database project's properties) to whatever you want, so you can deploy to a .dacpac, to an existing SQL database, or wherever. It means testing in two steps, and if your build is long it may be annoying, but it works. Otherwise, I believe scripting is your only option.

Seriously, overriding the DefaultDataPath in the sqlcmdvars for a SQL Database project deployment

I have a SQL Server 2008 database project in Visual Studio 2010 that is synced on a regular basis from a schema comparison during the development phase. This same project is also under TFS source control. I have two environments, Debug and Production. Each environment is a single machine that runs both IIS and SQL Server. The production environment, however, has different data and log paths for the database (D:\Data\ and E:\Logs\) versus my development server's standard c:\program files\sql....\data.
What I'm trying to do is set up how I move my deployments from the Debug to the Production environment. I've got WebDeploy 2.1 set up, and I build my deployment packages in Visual Studio via the right-click context menu on the website project. I want to manually copy deployment packages to the production server via RDP, so there are no over-the-wire concerns here. The deployment package settings are set to include all databases configured in the Package/Publish SQL tab. In that tab I don't pull data/schema from an existing database, because I want to deploy from the SQL database project instead; I just point to the pre-generated .sql script file located in my database project's /sql/release folder. To top it off, I generate the .sql script in the SQL project's post-build events via VSDBCMD.exe /dd:- /a:Deploy /manifest:..., so that a simple solution rebuild-all followed by a website project deploy ensures I always have the latest .sql script in the deployment package.
This is great and all, but I have a major problem here I can't seem to overcome. It has to do with the database data and log file paths being different between the Debug and Production environments. I actually receive an exception during the WebDeploy in IIS on the production server saying it can't find the c:\programs files...\MyDatabase.mdf file. And what's scary is that after this exception, the entire database is deleted: the empty database I create right before doing the deployment is gone. This happened both times I tried messing around with it. I'm not sure how I feel about that, but I'm hoping I can find a reliable solution to this.
I have been feverishly looking for a way to change the paths during a deployment, and have found many places that mention changing the paths in the *.sqlfiles.sql files under Schema Objects\Database level objects\Storage\Files, because the path the deployment targets is the one written into those files by the schema comparisons and writes from the Debug SQL Server database. Changing the paths there works temporarily, until I do my next schema comparison and write, when the sqlfiles.sql files get overwritten with the info from the Debug database again. And I don't want to have to remember to never update these files during a schema comparison, because any mistake has the potential to delete the production database.
I think my salvation lies in my Release.sqlcmdvars file. It's a tease, actually: I can see a place where I "could" type the default database path, but it appears to be a read-only field, as it mentions "Location where database files are created by default (set when you deploy)." It would be grand if I could specify the paths here. Is there any way at all to specify a path in a variable here that would override the paths from the *.sqlfiles.sql files?
In the solution at my workplace, there are two custom variables in the sqlcmdvars called Path1 and Path2 that I thought were reserved names that do exactly that. However, this doesn't work in my solution, and the difference between the two solutions is that the other one gets deployed via a TFS build controller. Going the TFS build controller route isn't really an option, because I opted out of it to save money while using a third-party source control service.
Any help with this would be great. I have even gone so far as to create separate *.sqlfiles.sql files for debug and release and configured the dbproj file to use one or the other depending on the configuration, but this doesn't seem to be working either. Also, using the custom PATH1 variable in the sqlfile.sql file, like FILENAME = '$(PATH1)\Cameleon_log.ldf', doesn't work either. I seriously think it shouldn't be this difficult. Am I missing something simple here?
Thanks!
Okay, this was an exercise in futility. Apparently, without syncing with the target database during script generation, the script is exactly what is needed to build the database from scratch. So even if I could override the file paths, the deployment would complain about database objects already existing. I needed to specify the connection string of the target database in the deploy settings, so that a comparison is done during script generation and only the relevant differences are added to the script. I really wanted to avoid exposing my production SQL Server to the outside world, but it is what it is. There's no need to override the paths anymore, because it looks like the database file paths are conveniently ignored during this comparison!
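For reference, a rough sketch of what that VSDBCMD invocation might look like once a target connection string is supplied (the manifest path and connection details are placeholders, not the actual values from this project):

rem Generate an incremental script by comparing against the live target database
rem /dd:- means script only, don't deploy; all values below are hypothetical
VSDBCMD.exe /a:Deploy /dd:- ^
  /manifest:"C:\MyDb\sql\release\MyDb.deploymanifest" ^
  /cs:"Data Source=PRODSERVER;Initial Catalog=MyDb;Integrated Security=True"

With the comparison in place, the generated script contains only the schema differences, so the file paths baked into the *.sqlfiles.sql files no longer matter.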

MVC Code First: App_Data Folder Not Being Created

I'm using the MVC Code First approach to create a SQL Compact database (from Web.config: data source=|DataDirectory|MailBoxDB.sdf). The .sdf file should get created automatically, and it does if I manually create the App_Data folder on the web server (Windows Server 2008). However, I'm trying to automate the deployment process, and I want to eliminate this manual step. I'm using MSDeploy to create the deployment package. Is it a permissions issue that prevents IIS 7.5 from creating the App_Data folder on its own? If so, which settings should I be focusing on? Any ideas?
I had a similar scenario (generally more complex, although without a database in App_Data). I pieced together several SO questions and found a solution which I've posted on my question on the same topic. Take a look at my answer and hopefully it helps.
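One commonly suggested workaround, offered as an assumption on my part rather than something from the linked answer: Visual Studio's packaging only includes files that are part of the project, so empty folders like App_Data get dropped. Keeping a placeholder file in the folder (and including it in the project) forces App_Data into the package. For example, from the project root:

rem Hypothetical workaround: create a placeholder so App_Data is packaged (file name is arbitrary)
type nul > App_Data\placeholder.txt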

Changing an IIS6 website directory remotely

First, the prior situation: we have this project with a one-click build script. It's cobbled together with TFS Deployer + PowerShell + VBScript. TFS Deployer sits on the production machine, copies the new website files into a brand new directory, and then calls a VBScript that points the IIS website at the new directory.
Now, I'm moving the team away from the horror that is TFS/MSBuild. I have a TeamCity build agent on a dedicated build server. A simple NAnt script deploys the build artifacts from the build server to the production server through a shared folder. Simple, quick, and effective.
However, I haven't found either a) a way to run the VBScript remotely or b) a way to update the IIS site remotely through a different mechanism (programmatically, within the one-click build). Windows Server 2003/IIS6. Any ideas?
Update: I solved this by creating another .vbs that remotely called my old .vbs through WMI. Thanks, everyone!
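For anyone sketching the same approach: an equivalent way to make that remote WMI call, without writing a wrapper script, is the built-in wmic tool. The server name, credentials, and script path below are placeholders:

rem Launch the existing site-switching script on the remote IIS box via WMI
wmic /node:"PRODSERVER" /user:"DOMAIN\deploy" /password:"secret" process call create "cscript.exe C:\Scripts\ChangeIisSite.vbs"

A line like this drops straight into a TeamCity or NAnt build step.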
If I were to head in any direction, I would consider setting up a WMI script to do the work and then configuring it on the server in question. I would have to think about how to easily include this in your automated build; I personally have not worked with TeamCity yet, although I have attended sessions on how it works.
WMI may be able to run the script as well, and act as a sort of service front end, so you may be able to reuse what you have already spent effort on.
Could you change the VBScript file into an ASP file in a different website on the same server? That would allow you to call it remotely.
We've used NAntContrib's mkiisdir task to create/modify a virtual directory on remote machines.
<mkiisdir iisserver="Staging" dirpath="c:\temp" vdirname="Temp" />
This should either create (if the vdir doesn't exist) or change the location (if the vdir already exists).
Generally, it seems the cleanest way to do this is to first delete the vdir with the deliisdir task, then create it again:
<deliisdir vdirname="Temp" failonerror="false" />
<mkiisdir dirpath="c:\temp" vdirname="Temp" accessread="true" accesswrite="false" accessscript="true" enabledirbrowsing="false" authntlm="true" authbasic="false" authanonymous="false" appcreate="Pooled" />
Happy coding!

Visual Studio - Publish to multiple locations?

Is there a way to automatically publish a website to multiple locations at once?
Our website is load balanced across multiple servers, so when I want to publish, I have to do it to each server individually.
Thanks, Trev
Perhaps with some build scripts, such as MSBuild? Or you could create a script (PowerShell, VBScript, whatever) that copies all the contents of a directory, and invoke it in the post-build event (configurable in Visual Studio), so that once your solution (or the last project, actually) is built, the script runs and copies the output files wherever you need them.
You could
Publish the content to a UNC and have all of the web servers work off of that UNC
Push the content from your staging/QA server to production using a tool like MSDeploy
Use FRS to replicate the files from a "master" webserver to everyone else.
MSDeploy is set up to handle this using multiple configurations, one for each location. Scott Hanselman presented on this at Mix '10:
http://live.visitmix.com/MIX10/Sessions/FT14
Or just use multiple publish commands, one for each location.
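If you go the multiple-commands route, the loop is easy to script. A minimal sketch, assuming the site was packaged with MSDeploy and that the package path and server names below are placeholders:

rem Push the same package to each load-balanced server in turn
for %%S in (web01 web02 web03) do (
  msdeploy.exe -verb:sync -source:package="C:\Build\MySite.zip" -dest:auto,computerName=%%S
)

Run this from a .cmd file; at an interactive prompt, use %S instead of %%S.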
