Visual Studio 2010 Database Project Deployment Strategy

I've just read the VS 2010 Database Project Guidance document (link to guidance document) and I'm still none the wiser about what's a sensible strategy for Continuous Integration DB builds and Unit Testing the DB.
We currently have our DB project in the same solution as our application and build the whole solution on check-in. Is deploying the DB with every build practical?
Should we separate the DB projects into another solution so they're built less often, making deploy-on-build more sensible?
Should we forget about automatic deployment altogether and just make it a manual step?
How do you deploy databases for Database Unit Test projects?
There's a lot of information in the guidance document but no definitive answers.
Thanks
Ben

It depends what your goals are. What environment(s) are you considering deploying to on every build?
How often, and where, you deploy to is usually determined by your QA and/or release processes. I'm assuming you don't want to auto-deploy to the production database on every check-in, chances are you have a QA process that needs to happen first.
So I'd look at what environments you have, what they're used for, and how often you'd like them to be updated. It's pretty common to have a QA environment that is updated nightly, but updating it on every check-in would disrupt QA activities.
Some people have an environment that is deployed to on every build, with automated tests run against it; if that's your setup, deploying there on every build makes sense.
You can build DB projects without having to deploy them; build and deploy are separate concepts for DB projects in Visual Studio.
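As a minimal sketch of that separation (assuming a VS 2010 .dbproj; the project, server, and database names are placeholders), a CI build can run the Build target on every check-in while the Deploy target is invoked only against the environments you choose:

rem Build only: validates the project and produces the .dbschema, no database is touched
msbuild MyDatabase.dbproj /t:Build

rem Deploy separately, pointed at a specific environment
msbuild MyDatabase.dbproj /t:Deploy /p:DeployToDatabase=true /p:TargetDatabase=MyDatabase /p:TargetConnectionString="Data Source=qa-server;Integrated Security=True"

Without /p:DeployToDatabase=true the Deploy target only generates the deployment script, which is itself a reasonable option if you want the actual update to stay a manual step.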

Related

Clarification on correct usage of TFS to publish Web Application

I'm trying to set up CI via TFS 2015. I've got a solution with two main web applications that we currently deploy manually, editing the config files by hand (which sometimes leads to errors).
I've read about the build/release process, and in the past I've used Jenkins as a build server. But one question remains: when should the transformation of the XML config files be applied?
In my current TFS 2015 setup I've created a build process, and I build the project with the following line:
msbuild /p:Configuration=Test /p:PublishProfile=Test /p:DeployOnBuild=true xxx\xxx.csproj
This creates the package in the folder obj\Test\Package\PackageTmp.
Is this OK, or should this be done in the Release Management tab? Consider that in my farm I have:
Test (from DEV trunk)
Staging (from Dev trunk as well)
Production (from production trunk on 3 machines)
My goal is to have them automatically delivered to the machines, but I don't know the right moment to apply the transformation (during the build I can use the publish feature; during release management I can use a ps1 script).
Thanks in advance
Well, I think this thread will help: What TFS 2017 build task applies web.config transforms during TFS builds?
To apply the transformations you can use the extension Apply transformations in vNext build process.
Usually the output should be a package, consumed by a deploy task such as WinRM - IIS Web App Deployment or Azure App Service Deployment to achieve the deployment.
1) Can transforms be engaged in both builds and releases?
Yes, you could also do this in a build pipeline with the usage of a build deploy task. You need to add the task after the Publish Build Artifacts task.
2) Does TFS 2017 require a lot of special handling to engage a transform file?
Update: the BuildConfiguration variable is different in TFS 2017; it's set on the MSBuild task. Transforms are now applied according to the MSBuild task's Configuration setting.
Editing the .proj file is one way to do the transform. If you don't need to change the transform, it is applied automatically during the build. You could also use a third-party task/extension for extra transforms, such as XDT Transform.
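As a rough sketch of the .proj route (the tasks-assembly path below is the VS 2017 location and varies by Visual Studio version; the target name is a placeholder), a custom target can invoke the TransformXml task directly:

<UsingTask TaskName="TransformXml"
           AssemblyFile="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v15.0\Web\Microsoft.Web.Publishing.Tasks.dll" />
<Target Name="ApplyConfigTransform" AfterTargets="Build">
  <!-- layers Web.Test.config (etc.) on top of Web.config in the output folder -->
  <TransformXml Source="Web.config"
                Transform="Web.$(Configuration).config"
                Destination="$(OutDir)Web.config" />
</Target>

The $(Configuration) property is what ties this back to the MSBuild task's Configuration setting mentioned above.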
Usually we separate build and release for deployment, because it's easier to configure multiple environments and to debug issues. You could certainly do it all in the build, but the process becomes bloated. You could refer to this tutorial: Build and Deploy Azure Web Apps using Team Foundation Server/Services vNext Builds.
For a separate build and release solution, you could take a look at this blog: Using web.config transforms and Release Manager – TFS 2017/Team Services edition

Azure deployment has machine specific delays

I'm deploying from Visual Studio to an Azure instance as part of a small team, and we've found that deployment behavior depends on which machine was last used to deploy.
If we change which machine we deploy from, the deployment updates a large list of DLLs and CSS files, even though they're unchanged from the previous deployment. If the same machine is used more than once, all deployments after the first are smooth and take very few updates.
Is there a known reason for this behavior that I've been unable to find, and is there a way to avoid updating unchanged files?
While deploying manually from Visual Studio is nice for getting a prototype working, it isn't considered best practice to build and deploy from a local development machine, since the result strongly depends on how the local machine is configured, which DLLs are in the GAC, and so on. This may also be one of your problems, but it's really hard to debug.
So I strongly recommend either deploying through source control or using a build server like TeamCity, Jenkins, or similar to deploy.
Deployment from source control is really easy to set up. You can follow this guide to get started.
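On the unchanged-files symptom specifically: assuming the publish goes through Web Deploy (an assumption, since the answer above doesn't say), a commonly suggested workaround is to make it compare file checksums instead of timestamps, because a fresh build on a different machine changes every file's timestamp even when the content is identical:

<!-- added to the .pubxml publish profile (hypothetical file name) -->
<PropertyGroup>
  <MSDeployUseChecksum>true</MSDeployUseChecksum>
</PropertyGroup>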

Building on Team Foundation Services and Testing in Azure?

I'd like to run some integration tests in Azure. I can't run these in TF Services because they require a database. I'm wondering if it would be possible to push my project up to TFS, have TFS build the solution and push it to Azure, and then have Azure run some tests against a test database before committing it to production. Any failures along the pipeline populate back to Visual Studio. Is this even remotely possible?
if it would be possible to push my project up to TFS, have TFS build the solution and push it to Azure
Yes, this is possible.
and then have Azure run some tests against a test database before committing it to production
Well, this is not possible.
Any failures along the pipeline populate back to Visual Studio
And this is in a dream world. The only viable resolution is to have work items created from test failures. But populating back to Visual Studio is a dream.
Going back to having Azure run some tests against a test database before committing to production:
What exactly is the issue you face when you want to run integration tests from TF Services? Is it just a SQL Azure firewall issue, or something else? Have you even tried it? What was the result? If it is just a firewall issue, you can most probably mitigate it with some custom code in the test initialization phase - like disabling the firewall for the SQL Azure server, then enabling it again in the test tear-down phase.

Setup Continuous Integration with Visual Studio 2010 and Team Foundation Server 2010

Is there a good guide on how to do this with these two exact pieces of software? I have found a lot of generalized CI setup guides, but none involving these two exact components.
TFS makes setting up a continuous integration server very simple. You will first need to create a new TFS project. Then you need to configure Visual Studio to use your new project. Finally, you can set up a build for the project within TFS. There is a pretty good blog post about how to do it. The important step is setting the build trigger:
You can set various build triggers, including continuous integration and scheduled builds. Obviously, you can get much more complex with the build definition (deployment options, config options), but hopefully this will be enough to get you started.

Best way to do Visual Studio post build deployment in a team environment?

I'm working in a team environment where each developer works from their local desktop and deploys to a virtual machine that they own on the network. What I'm trying to do is set up the Visual Studio solution so that when the solution is built, each project's deployment is handled in the post-build event, targeting that developer's virtual machine.
What I'd really like to do is give ownership of those scripts to the individual developers as well, so that they own their post-build steps and the steps don't have to be the same for everyone.
A couple of questions:
Is a post-build event the right place to execute this type of deployment operation? If not, what is the best place to do it?
What software, tools, or tutorials/blog posts are available to assist in developing an automatic deployment system that supports these scenarios?
Edit: MSBuild seems to be the way to go in this situation. Anyone use alternative technologies with any success?
Edit: If you are reading this question and wondering how to execute a different set of MSBuild tasks for each developer, please see this question: Executing different set of MSBuild tasks for each user?
If you are using Visual Studio 2005 or later, project files are MSBuild files. Inside the MSBuild file there is an "AfterBuild" target. I would recommend using this for your deployment tasks instead of the post-build event.
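A minimal sketch of that approach, assuming a C# project; the Deploy.$(USERNAME).targets naming and the VM share path are made-up conventions, not anything standard. Importing a per-user targets file also addresses the per-developer ownership question, since each developer's AfterBuild steps live in their own file:

<!-- In the .csproj, after the <Import Project="...Microsoft.CSharp.targets" /> line: -->
<!-- pick up a per-developer targets file if one exists -->
<Import Project="Deploy.$(USERNAME).targets"
        Condition="Exists('Deploy.$(USERNAME).targets')" />

<!-- Deploy.jsmith.targets (hypothetical per-developer file, kept out of the shared build): -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <Target Name="AfterBuild">
    <ItemGroup>
      <OutputFiles Include="$(OutputPath)**\*.*" />
    </ItemGroup>
    <!-- \\jsmith-vm\deploy is a placeholder for this developer's VM share -->
    <Copy SourceFiles="@(OutputFiles)"
          DestinationFolder="\\jsmith-vm\deploy\%(RecursiveDir)" />
  </Target>
</Project>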
By using MSBuild tasks, you are better prepared to move into a Continuous Integration system like CruiseControl.NET or TeamCity.
I'm not sure why all of your developers have their own virtual machines; it would seem that at some point you'd want a central location where all developers' work is built, to ensure the code from all developers integrates and builds (this is a reason for using Continuous Integration systems). I'm sure you can find a way that CruiseControl.NET or TeamCity or one of several other choices can help you in this scenario. But as far as the initial setup goes, use MSBuild.
I'd look into MSBuild or Ant.
