Visual Studio Team Services: Agents Configuration for different environments in release definition - continuous-integration

I am using Visual Studio Team Services, Release management and Continuous integration on the cloud. My release definition contains two environments: Dev and PRO.
I want to know what configuration is possible if I don't want to buy additional licenses.
We need to associate each of the environments with a specific agent. I guess these are called private agents.
So are the agents installed on those release environments (which are on a local domain) private agents? If yes, can I configure those two environments to run under the same agent, and how?
What configuration would let me create two environments, each corresponding to a different physical machine, without paying for additional agents?

You only need to have a single Agent per network that you are deploying to (unless you want to deploy in parallel).
The Agent is an orchestrator and does not need to be installed on the target environments. You use remote PowerShell (or other scripting) to execute tasks on the target servers for deployment.
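For example, a release task running on the agent could use remote PowerShell to push a build onto a target server. A minimal sketch, where the server name ("devweb01"), site name and share paths are hypothetical placeholders for your own environment:

```powershell
# Run on the agent machine; "devweb01", "MySite" and the paths are placeholders.
$cred = Get-Credential
Invoke-Command -ComputerName devweb01 -Credential $cred -ScriptBlock {
    Stop-Website -Name "MySite"                                    # stop the IIS site
    Copy-Item "\\buildshare\drop\MySite\*" "C:\inetpub\MySite" -Recurse -Force
    Start-Website -Name "MySite"                                   # bring it back up
}
```

Because the agent only orchestrates, the same agent can run this against Dev and PRO servers by parameterizing the computer name per environment.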

Related

How to Deploy BizTalk Application into production Server?

I have the following doubts about BizTalk deployment:
How do I deploy the BizTalk application to the production server?
When I modify the existing BizTalk application (artifacts, custom pipelines/functions, custom classes, etc.), how do I deploy the application to the server again?
I know BTDF is one of the best tools for deploying BizTalk applications; can we deploy a BizTalk application to the server using it?
1. Deployment
For deployment you can use the built-in MSI generation wizard.
That means you deploy the application to a dev environment using Visual Studio, then, in the Admin Console, export the application as an MSI using the wizard.
Finally, you can use that MSI to deploy the app to the production server.
It's a two-step process (run the MSI, then import the MSI in the BizTalk Administration Console).
Note that only your BizTalk assemblies are installed by the MSI.
If you use plain .NET assemblies in your solution, they need to be GAC'ed manually.
You will also need to restart the host instances running your BizTalk application.
See details here:
https://msdn.microsoft.com/en-ca/library/aa559168.aspx
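Scripted, the manual steps above might look roughly like this (a sketch only; the MSI, application and assembly names are placeholders):

```bat
:: Step 1: run the MSI on the target server (installs files and resources)
msiexec /i MyBizTalkApp.msi /qn

:: Step 2: import the MSI into the BizTalk group
BTSTask ImportApp /Package:MyBizTalkApp.msi /ApplicationName:MyBizTalkApp

:: Plain .NET helper assemblies still need to be GAC'ed by hand
gacutil /i MyCompany.Helpers.dll
```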
That's a few manual steps.
Alternatively, you can automate some of these steps by using BTSTask, a command-line tool included with BizTalk.
You can script all the manual steps.
Obviously it takes time to write such script, so it's only worth it if you are going to deploy many times in non-dev environments.
BTSTask reference:
https://msdn.microsoft.com/en-ca/library/aa559686.aspx
2. Redeployment
Usually you completely remove the old version and then install the new one:
Delete the application from the BizTalk Administration Console and un-GAC the assemblies it uses.
The whole process would look like:
1. Make sure there are no running instances in your application. You can always disable your receive locations and let the running instances complete
2. Delete the BizTalk application
3. Un-GAC the associated assemblies
4. Deploy the new BizTalk application version and GAC the associated assemblies
5. Restart the host instances used by your BizTalk application
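The redeployment steps above could be scripted with BTSTask along these lines (a sketch only; the application, MSI and assembly names, and the host name in the service name, are placeholders):

```bat
:: 2. Delete the old application (stop it first, e.g. in the Admin Console)
BTSTask RemoveApp /ApplicationName:MyBizTalkApp

:: 3. Un-GAC the associated helper assemblies
gacutil /u MyCompany.Helpers

:: 4. Import the new version, then GAC its helpers
BTSTask ImportApp /Package:MyBizTalkApp_v2.msi /ApplicationName:MyBizTalkApp
gacutil /i MyCompany.Helpers.dll

:: 5. Restart the host instance service (name pattern: BTSSvc$<HostName>)
net stop "BTSSvc$BizTalkServerApplication"
net start "BTSSvc$BizTalkServerApplication"
```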
EDIT: To address OP's concern about deleting a running application:
It is indeed possible to deploy resources independently and never delete your application.
But it does not mean you will not interrupt the service.
An orchestration, for example, can never be redeployed while it has running instances.
So, assuming you have divided your functionality properly into applications, I find it cleaner and easier to delete the whole application than to go after each resource.
Otherwise, yes you can go and replace your resources separately.
But to me it seems like overhead caused by not having defined your applications correctly.
3. BTDF
The BizTalk Deployment Framework is a good tool for getting some automation into your deployment without having to write the scripts yourself.
Good compromise between customization to your needs and setup time.
I have used it on a freelance project. It was very helpful because I was able to deliver a package with only a couple of deployment instructions, and the non-techie client was able to deploy painlessly.

How to set up RM agents in Visual Studio Team Services to deploy backend changes for multiple environments

I am confused about creating the agents. Do we need to create multiple agents, one for each environment, or will a single agent work?
Usually the answer is no: you can use just one agent to deploy your build to several environments. But in some situations, for example when the environments sit on different networks, you may need to set up one agent per network.
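Registering a single private agent is a one-time step on the on-premises machine. A sketch, assuming the account URL, personal access token and pool name are placeholders (exact switches vary by agent version):

```bat
:: Run from the unpacked agent folder; all values are hypothetical placeholders.
config.cmd --unattended --url https://myaccount.visualstudio.com --auth PAT --token <your-pat> --pool Default --agent onprem-agent-01
run.cmd
```

Both environments in the release definition can then point at the same agent pool.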
Similar question: Visual Studio Team Services: Agents Configuration for different environments in release definition

Release Management using Team Foundation Services (one Branch)

I need to use Visual Studio Team Services (in the cloud) to automate the integration and release process.
I have 3 environments: dev, test and production. I am planning to use only one branch, Dev, that promotes changes to the next environment (TEST) and then to release.
Question 1:
Can I do it using one branch (Main Dev), or do I need to create a separate branch for each environment? And how?
As far as I know, when using TFS on premises, we should install Release Management on the same machine and deployment agents on the different environments.
Question 2:
How can I automate release management using Visual Studio Team Services, taking into consideration that the test and production environments don't use Azure services; they just use IIS to host our websites?
For the first question, the answer is yes, and usually there should be only one branch per release. The release/build that is deployed to the three environments in the release should be the same; using builds from different branches in the three environments does not make sense.
For the second question, you can use Web Deploy, or follow Adding FTP Publishing to a Web Site in IIS 7 and then deploy via FTP.
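A Web Deploy push from the release could look roughly like this (a sketch; the server, site, user and package names are placeholders, and the password is deliberately elided):

```bat
msdeploy.exe -verb:sync ^
  -source:package="MySite.zip" ^
  -dest:auto,computerName="https://testserver:8172/msdeploy.axd?site=MySite",userName="deploy",password="<password>",authType="Basic" ^
  -allowUntrusted
```

This requires the Web Deploy remote agent or handler to be installed on the IIS servers; no Azure involvement is needed.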
You start using Release Management by creating a release definition in the RELEASE hub of your team project. A release definition specifies What to deploy - the set of artifacts that constitutes a new release, and How to deploy - the series of automation tasks that should be run in each environment. Each environment is simply a named logical entity that represents a deployment target for your release.
It seems you want to switch projects between environments in one release definition, which is not supported. You can find a typical use case and more information on Release Management at https://msdn.microsoft.com/en-us/library/vs/alm/release/overview

Should a Windows based build server get automatic updates installed?

I'm wondering whether it is common practice to have automatic updates enabled on a build server running a Windows operating system. The build server uses Jenkins, Visual Studio and Java to drive the build. On the one hand, I want a system where the installed software is clearly defined. On the other, I have a server that should have up-to-date patches installed.
What is a common practice?
In my previous company, we were using Windows to host the Jenkins master and all the slaves. We were building our code with Visual Studio 2010. We tested automatic updates, and they broke our configuration twice (in 3 years). So if you want to control your server's configuration, I recommend applying the Microsoft patches manually (you can test the patches on a staging environment before applying them in production).

Setup CI Environment Using TFS and Amazon EC2

Can someone recommend a good approach for setting up a CI environment that deploys to multiple websites (QA/PROD) hosted on Amazon EC2 while using TFS?
Here are the requirements I'm looking to fulfill:
Have TFS deployed somewhere to track tasks, manage source control, run tests on code check-in and do automatic deployments to a QA environment.
If everything passes the CI build in TFS, code should be automatically deployed to a QA environment hosted in Amazon EC2.
After testing, take the same deployment package we used for the QA environment and push it to an identical environment in EC2 which is our production environment.
We are a start-up, so we don't need all of the bells and whistles just yet. We have limited resources currently, so I am trying to be as minimal as possible while meeting the above requirements.
My first pass at this was to use Amazon's free tier for first-time users, which gives access to EC2 for free for 12 months, and then to set up a low-cost (~$20) virtual machine with a web host to run our TFS environment, which would push to the Amazon cloud.
We also considered using Visual Studio Online, but it looks like it only deploys to Azure, where hosting a website with SQL costs a little more than twice as much as on Amazon, so we don't want to go that route.
Is this a good approach? I'd appreciate any feedback. Thanks!
Using the latest TFS vNext build system this should be possible. You would need to install the AWS command-line tools via npm, load your credentials, and then use the aws command line to deploy to EC2.
Add an npm task to install the AWS command-line SDK, then run commands to package and upload your project.
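The package-and-upload step might be sketched as follows (the bucket, file names and credential variables are placeholders; note that the AWS CLI is also commonly installed via pip rather than npm):

```shell
# Assumes the AWS CLI is installed on the build agent; all names are hypothetical.
aws configure set aws_access_key_id "$AWS_KEY"
aws configure set aws_secret_access_key "$AWS_SECRET"
aws configure set region us-east-1

# Zip the build output and push it to S3, where the EC2 instances can pull it
zip -r MySite.zip ./drop
aws s3 cp MySite.zip s3://my-deploy-bucket/MySite.zip
```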
You should install Release Management for Visual Studio 2013. It is easy to install and comes with your MSDN.
http://nakedalm.com/install-release-management-2013/
With it you can create a release pipeline with rollback to deploy your application. You will likely need to add any command line tools that you need to deploy to amazon and it will make sure they get to where they need to be.
http://nakedalm.com/building-release-pipeline-release-management-visual-studio-2013/
