I'm deploying from Visual Studio to an Azure instance as part of a small team, and we've found that deployment behavior depends on which machine was last used to deploy.
If we change which machine we deploy from, the deployment updates a large list of DLLs and CSS files, even though they're unchanged from the previous deployment. If the same machine is used more than once, all deployments after the first are smooth and take very few updates.
Is there a known reason for this behavior that I've been unable to find, and is there a way to avoid updating unchanged files?
While deploying manually from Visual Studio is nice for getting a prototype working, building and deploying from a local development machine isn't considered best practice, since the result depends heavily on how the local machine is configured, which DLLs are in the GAC, and so on. Rebuilding on a different machine also produces binaries with fresh timestamps, and if you publish with Web Deploy it compares files by timestamp by default, which would explain why unchanged files get re-uploaded. This may be part of your problem, but it is really hard to debug.
So I strongly recommend either deploying through source control or using a build server like TeamCity, Jenkins, or similar to deploy.
Deployment with source control is really easy to set up. You can follow this guide to get started.
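That said, if you do need to keep publishing from individual machines for now and you use Web Deploy, one mitigation I believe works is telling it to compare file checksums instead of timestamps in the publish profile (.pubxml). A minimal sketch, assuming a recent Web Publish pipeline:

    <!-- In Properties\PublishProfiles\YourProfile.pubxml -->
    <PropertyGroup>
      <!-- Compare file contents instead of timestamps, so rebuilt
           but otherwise unchanged DLLs are not re-uploaded -->
      <MSDeployUseChecksum>true</MSDeployUseChecksum>
    </PropertyGroup>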
We're just in the process of transitioning from VS2013&15/TFS2013 to VS2017/TFS2017 (on-site TFS, not VSTS), and the first test solution is a .NET Core 1.1-based one (a multi-project web service).
The solution builds fine on the original developer's box, and I've got it out of TFS and it builds fine on mine too. In keeping with our previous methodology, the contents of the packages folder are checked in with the projects, as this makes the packages locally available on the build box (no internet access).
Building the solution on the build server is a different story, however, as I get multiple errors of the form...
..\obj\project.assets.json' not found. Run a NuGet package restore to generate this file.
I get the errors both when I run the TFS build definition and when I remote to the box and build directly through the VS on the box itself.
This whole project.assets.json not found issue seems to be causing headaches all over. In my case the issue is that I'm trying to resolve it on our TFS 2017 Build Server, which does not and never will have internet access ('cos it's a server!).
All the solutions I've seen so far suggest running the NuGet restore command, but that can't work since the server cannot reach nuget.org.
This is nothing fancy yet, just a simple TFS 2017 Build definition with a Get sources and a Build solution step. I can't understand how something so simple has become so difficult.
Changing the NuGet Package Restore options makes no difference.
Since the project.assets.json files are generated on the fly in the obj folder, I can't even check them in to reuse. Can anyone please suggest a workaround? At the moment the test project is dead in the water.
Edit: trying the same process with a 4.6.1 web project created with VS2015 had similar results of unresolved references (e.g. System.Web) but didn't raise the same error, probably due to being an older, non-Core project.
You say you get the errors both when you run the TFS build definition and when you remote to the box and build directly through VS on the box itself. The issue therefore doesn't seem to be on the TFS build side, since it also fails with a local build through VS on the build agent machine.
Since this is a .NET Core project, you could try using "dotnet restore" rather than "nuget restore". Try using the .NET Core build template (which uses dotnet restore).
If you are using authenticated NuGet feeds, you can use nuget restore, but you also need to use the NuGet Installer task. See https://github.com/Microsoft/vsts-tasks/issues/3762 for a discussion of that.
The NuGet version should be higher than 4.0.
Without dotnet restore or NuGet restore, a build definition with only Get Sources and a Visual Studio Build step will not be able to build a .NET Core project. If your server does not have internet access, the workaround is to use a local feed, as sketched below.
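A minimal sketch of what that could look like: put a NuGet.config next to the solution that points restore at a file share holding the .nupkg files (the share path here is a placeholder). Both dotnet restore and nuget restore will pick it up:

    <?xml version="1.0" encoding="utf-8"?>
    <!-- NuGet.config placed next to the .sln file -->
    <configuration>
      <packageSources>
        <!-- Drop any machine-wide sources such as nuget.org -->
        <clear />
        <!-- Hypothetical internal share containing the .nupkg files,
             copied over from a machine that does have internet access -->
        <add key="LocalFeed" value="\\buildserver\NuGetPackages" />
      </packageSources>
    </configuration>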
We have a C# .NET project using Visual Studio 2013 and we're setting it up to release and deploy with Visual Studio Team Services (VSTS). The websites were pretty simple and easy to set up and they work fine. A few projects are libraries or Console applications and we're trying to determine the best method for creating an automated release for these.
The publish profile asks for a location to publish to: a website, a UNC path, or a CD-ROM. We've experimented with the build drop on our VSTS build server (where all of the other files are); we chose "UNC Path" and entered the same build drop location in UNC format.
It hasn't really worked yet, so I thought I would see if any best practices exist for creating VSTS releases and deployments for console applications or code libraries.
Thank you!
Have you considered installing the agent on the target environments and using a release definition that simply copies the right files to the right place?
Note: the copy path shouldn't be hardcoded; rely on variables instead.
To specify the agent queue, go to the Tasks tab when editing the release definition. Click on the "Run on agent" header; that will open the details, where you can select your queue.
Agent queues can contain multiple agents, so your job when you add agents is to organize them by queues that make sense in your context.
My company is currently implementing version control using Mercurial and BitBucket. We have repositories set up on BitBucket and are able to use them, but our work processes for doing so are a bit clunky. We use Visual Studio for web programming in .NET. Currently, we have a cloned repository set up locally and work from there, which we can do in Visual Studio with VisualHg.
In order to edit files we open them in Visual Studio from the local repository and make our edits. We then commit our changes to Hg, which updates the repository as it should. Then we need to FTP the files from our local system to the DEV server for testing and then FTP again to the Production server once QA is completed and approved.
It would help streamline things if we could have the BitBucket repository synced with our DEV server so that all that was required is to commit changes for testing in DEV, bypassing the otherwise necessary step of locating and FTP'ing all relevant files.
Does anyone know if this is possible? If so, can you point me to any documentation that would show me how to set this up? Our developers would be eternally grateful. Thanks for your time.
In my opinion, using Mercurial is not the correct solution for this problem.
The main reason for it not being the correct solution is that the files that are in Mercurial are not the files that you want on the production server and so aren't the files that you want to use on your development server (because you want the QA environment to be as close to the production environment as possible). There are no assembly files stored in Mercurial (or there shouldn't be) and those are the files that the server should be using to run the application.
There are deployment tools built into Visual Studio that you can use for this task. They can be configured to upload all the necessary files with one button click.
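As a rough illustration (the server name, path, and user are placeholders), a publish profile for FTP deployment to the DEV box might look something like this; Visual Studio stores these as .pubxml files under Properties\PublishProfiles:

    <?xml version="1.0" encoding="utf-8"?>
    <!-- Hypothetical DevServer.pubxml; publish with one click from VS -->
    <Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
      <PropertyGroup>
        <WebPublishMethod>FTP</WebPublishMethod>
        <publishUrl>ftp://dev-server/wwwroot</publishUrl>
        <DeleteExistingFiles>False</DeleteExistingFiles>
        <UserName>deploy-user</UserName>
        <!-- The password stays out of the profile; VS prompts for it
             or stores it per user in the .pubxml.user file -->
      </PropertyGroup>
    </Project>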
Scott Hanselman has a post on his blog about this.
Troy Hunt takes it one step further by introducing a build server with this excellent set of posts. It uses Subversion as the repository but it can be done using Mercurial too.
I prefer the build server method because, once you have it set up correctly, it is completely repeatable: it will do the same thing every time you ask it to do the deployment. If you use Visual Studio to do it, the developer doing the publish could choose different options and get it wrong.
I've just been upgrading an Azure project to Visual Studio 2010 and have been taking advantage of the new XML configuration transformation feature that is built into VS2010 web projects. It seems to work great with Azure web roles. I even managed to get the Azure project service configuration file to do a similar thing by following the instructions here.
However, I can't seem to get configuration transformation working for the lone worker role in my Azure project. I know that VS2010 only has built-in support for config transformation with web roles, but I found a good article describing how to get config transformations working with non-web projects. I've followed the instructions and it works - but only to a point. It successfully spits out the correct .config file (with appropriate transformations) into the worker role project's own bin directory, but it doesn't pick this new .config file up when it's put into the cloud package.
I suspect there's some MSBuild trickery needed to get this to work, but I don't know MSBuild very well, so am appealing to any gurus out there for help and/or samples :)
I have found the best way to do this is to use MSBuild. I usually do this with a separate MSBuild file outside my solution, so I keep the local dev settings separate from the production settings; you can find out more here. I can then run the build to change the settings and upload the project to Azure, or change the settings and deploy through VS if I need to debug a problem. I also have a target in the MSBuild file that can revert everything back to local. It would be nice to have these things in VS (which I have asked for from the product team). The sample project is on GitHub.
This is also explained in the Life Cycle chapter of the book we wrote.
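For reference, the usual trick is to invoke the TransformXml task yourself from the worker role's project file, writing the transformed config over the one that gets packaged. A sketch, assuming the VS2010 web publishing tasks are in their default location:

    <!-- Added to the worker role .csproj; paths assume a default VS2010 install -->
    <UsingTask TaskName="TransformXml"
               AssemblyFile="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.Tasks.dll" />

    <Target Name="TransformWorkerConfig" AfterTargets="AfterBuild">
      <!-- Apply app.Debug.config / app.Release.config on top of app.config
           and overwrite the output config so the cloud package picks it up -->
      <TransformXml Source="app.config"
                    Transform="app.$(Configuration).config"
                    Destination="$(OutDir)$(TargetFileName).config" />
    </Target>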
I'm working in a team environment where each developer works from their local desktop and deploys to a virtual machine that they own on the network. What I'm trying to do is set up the Visual Studio solution so that when they build it, each project's deployment is handled in the post-build event, targeting that developer's virtual machine.
What I'd really like to do is give ownership of those scripts to the individual developers as well, so that they own their post-build steps and the steps don't have to be the same for everyone.
A couple of questions:
Is a post-build event the place to execute this type of deployment operation? If not, what is the best place to do it?
What software, tools, or tutorials/blog posts are available to assist in developing an automatic deployment system that supports these scenarios?
Edit: MSBuild seems to be the way to go in this situation. Anyone use alternative technologies with any success?
Edit: If you are reading this question and wondering how to execute a different set of MSBuild tasks for each developer, please see this question: Executing different set of MSBuild tasks for each user?
If you are using Visual Studio 2005 or later, project files are MSBuild files. Inside the MSBuild file there is an "AfterBuild" target. I would recommend using this for your deployment tasks instead of the Post Build Event, along the lines of the sketch below.
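A minimal sketch, including the per-developer twist from the question (the deploy.*.targets file and the $(DeployShare) property are my invention; each developer would keep their own targets file out of the shared build):

    <!-- At the bottom of the .csproj: pull in a per-developer settings file if present -->
    <Import Project="deploy.$(USERNAME).targets"
            Condition="Exists('deploy.$(USERNAME).targets')" />

    <!-- Overrides AfterBuild; runs only when the developer's file defined DeployShare -->
    <Target Name="AfterBuild" Condition="'$(DeployShare)' != ''">
      <ItemGroup>
        <DeployFiles Include="$(OutDir)**\*.*" />
      </ItemGroup>
      <Copy SourceFiles="@(DeployFiles)"
            DestinationFiles="@(DeployFiles->'$(DeployShare)\%(RecursiveDir)%(Filename)%(Extension)')" />
    </Target>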
By using MSBuild tasks, you are better prepared to move into a Continuous Integration system like CruiseControl.NET or TeamCity.
I'm not sure why all of your developers have their own virtual machines; it would seem that at some point you'd want a central location where all developers' work is built, to ensure the code from all developers integrates and builds (this is a reason for using Continuous Integration systems). I'm sure CruiseControl.NET, TeamCity, or one of several other choices can help you in this scenario. But as far as the initial setup goes, use MSBuild.
I'd look into MSBuild or Ant.