Visual Studio 2010 Publish / Web Deploy issues

I'm using Publish/Web Deploy to deploy an ASP.NET application from Visual Studio 2010. It works perfectly, but there is a problem. If the new release is not working as expected, the old version has already been replaced by the new one and there is no easy way to roll back to the working version. How is this best solved? I wish it were possible to keep the old version on the server so I could just switch back if needed.

With Web Deploy there is no built-in rollback feature, so once you've deployed, that's it.
There are a number of hand-rolled strategies you could put in place, for example:
Limited Access (e.g. shared hosting):
Where you don't have full access to the machine:
Back up the live site beforehand by downloading it.
Keep copies of what you deployed so you can push the previous version should something break (a minimal sketch of this follows below).
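As a sketch of that last point (the paths here are just placeholders): before each deploy, archive exactly what you are about to publish, so the previous version can be re-pushed if the new one breaks.

    // Keep a timestamped copy of each publish so it can be re-deployed if needed.
    using System;
    using System.IO;
    using System.IO.Compression; // reference System.IO.Compression.FileSystem (.NET 4.5+)

    class DeployBackup
    {
        static void Main()
        {
            var publishFolder = @"C:\Deploy\CurrentPublish"; // what is about to be uploaded
            var backupFolder  = @"C:\Deploy\Backups";        // where previous versions are kept
            Directory.CreateDirectory(backupFolder);

            var archive = Path.Combine(backupFolder,
                "site-" + DateTime.Now.ToString("yyyyMMdd-HHmmss") + ".zip");

            // If the new release breaks, extract this archive and publish it again to roll back.
            ZipFile.CreateFromDirectory(publishFolder, archive);
            Console.WriteLine("Backup written to " + archive);
        }
    }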
Full Access:
Maintain two sets of folders for the application and map your site to one or the other of these folders. When you come to deploy, switch the IIS site's physical path to the other folder, then deploy. If the new version fails, just point the site back at the original folder. Each successful deploy alternates between the two folders (a scripted sketch of the switch follows below).
For things like user-uploaded content you'd need to map virtual directories to a location on the file system that always stays the same, because you don't want to be copying that content around on each deploy.
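One way to script the folder switch is through the IIS management API (Microsoft.Web.Administration). This is only a sketch under assumed names - the site name and folder paths are hypothetical - and it has to run with administrative rights on the web server:

    using System;
    using Microsoft.Web.Administration; // reference Microsoft.Web.Administration.dll (IIS)

    class SiteSwap
    {
        static void Main()
        {
            const string siteName = "MySite";            // hypothetical IIS site name
            const string folderA  = @"D:\Sites\MySite_A";
            const string folderB  = @"D:\Sites\MySite_B";

            using (var manager = new ServerManager())
            {
                // The root application's root virtual directory holds the physical path.
                var rootVDir = manager.Sites[siteName].Applications["/"].VirtualDirectories["/"];

                // Point the site at whichever folder is not currently live; the old folder
                // is left untouched so you can switch back if the new release misbehaves.
                rootVDir.PhysicalPath =
                    string.Equals(rootVDir.PhysicalPath, folderA, StringComparison.OrdinalIgnoreCase)
                        ? folderB
                        : folderA;

                manager.CommitChanges();
                Console.WriteLine("Site now points at " + rootVDir.PhysicalPath);
            }
        }
    }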
You're not the only one who has encountered these issues. Have a look at these articles by Rob Conery and his observations about the state of affairs regarding ASP.NET deployment:
ASP.NET Deployment Needs To Be Fixed
Getting Constructive On ASP.NET Deployment

Using some form of source control would be another alternative. We use Subversion, so if the publish goes bad, we can just update back to the last good revision and publish that. Even if you're the only developer, using source control can be very useful.

Related

TFS 2013: share a project across many solutions

I have a few (3) core projects I want to share across many solutions (12+).
So, say I have 12 websites and they use some shared back end core code (in this case I'm not talking about shared js, css or views - I'm talking about business objects, entity stuff, etc.).
I need to be able to identify which site has which version of the shared code in dev, test, prod, etc. so a developer can get the website code and get the right version of the shared code to develop or patch the website.
And then the MS build server needs to know which version of the shared code to get for the deployment.
To solve this, I'm seeing people branch that core code - which seems absurd to do 12+ times. (I do expect to branch the core code sometimes for things like hot fixes and long running projects.)
I'm also seeing people copy DLLs of the core code and check those in.
I would think I could list the dependencies for my solutions by TFS label name somewhere, so that developers can easily get a website running with the right code, and so that, given a TFS label, the build server can get the website code along with the proper version of the core code. I'm using TFS and VS 2013 at the moment, so there's that.
So, is there a way to do this that's straightforward, supportable/scalable and intuitive? Thanks - Peter
Labels in TFS are very limited. For example, once a label has been created you can't change or update it. If one of your core projects is updated, you need to create a new label for it. Suppose you do that and use the new label for one of your solutions, but then find bugs in the update and need a newer version of the core project to fix them: yet another label gets created, and you have to maintain the dependencies manually, which is not an easy job.
Moreover, how would you list the dependencies for your solutions based on TFS label names? TFS doesn't have a built-in option for this; it seems the only way is to store them in a text file or some other file checked into source control. Every time a developer opens a website application they would need to check that file first, get the labelled version from the server into their workspace, and then work on it.
Usually the purpose of sharing code between projects is to reduce maintenance. There are two main code-sharing paths: source and binary. For the difference between them, take a look at this blog post: Code Sharing in Team Foundation Server
Sharing source code between products is a primary cause of quality erosion and elevated bug counts. I would recommend building the core projects separately and sharing the binary output through NuGet, which is preferable.
Also take a look at these similar questions:
Sharing code between solutions in TFS
TFS 2010 Branch Across Team Projects - Best Practices

Does Visual Studio Publish to Azure Website Cause Whole Site to Recycle?

We've recently launched a new website in Azure (i.e. Azure Websites) and as is typical with new launches we've had to deploy a few tweaks to fix minor issues shortly after launch.
We want to use Slots in the long run, but this is not possible at the moment, so we are deploying to the live site. It's a fairly busy site with a good amount of traffic and we obviously want to keep downtime to a minimum.
We are using Visual Studio to publish file changes to Azure, but have noticed that even if we publish a relatively insignificant single file the whole site goes down and struggles to come back up. I was assuming that publishing a single file would literally just replace that file on the file system, but it's behaving more like it recycles the application pool (or the Azure equivalent) for the site. The files I've been publishing have been Razor views, which would not typically cause a recycle.
Does anyone know what actually happens under the hood of VS Publish and if there is a way to avoid this happening?
Thanks.
I just tried this using a basically clean new MVC app (https://github.com/KuduApps/Dev14_Net46_Mvc5), and I did not see this behavior. The Index.cshtml view has a hit count based on a static, which tells us whether the app or the page got restarted (or whether that specific page got recompiled).
The test is to publish the app, make a change to some other view (about.cshtml), and publish again. When doing this and hitting Index.cshtml, the count keeps going up, and there is minimal slowdown.
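For reference, that kind of counter can be as simple as a static field incremented in the controller; because statics live for the lifetime of the app domain, the value drops back to 1 only when the application actually restarts. A minimal sketch (controller and view names are placeholders, not the exact code from the sample app):

    using System.Threading;
    using System.Web.Mvc;

    public class HomeController : Controller
    {
        // Survives requests for the lifetime of the app domain; resets only on an app restart.
        private static int _hits;

        public ActionResult Index()
        {
            ViewBag.HitCount = Interlocked.Increment(ref _hits);
            return View(); // Index.cshtml just renders @ViewBag.HitCount
        }
    }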
If you see it getting restarted after a view change, I suggest using Kudu Console to look at the files in site\wwwroot before/after the publish, and check what has a newer timestamp (e.g. check web.config, bin folder, ...).

Preferred setup for development with Umbraco

We're starting development of a new website with Umbraco and are having some difficulty finding the optimal setup for multiple developers.
Right now we have a complete Umbraco install in a code directory, with IIS pointing to it as well, and a local DB for each developer. We're planning to use the Courier package to push/pull content changes, and Git for source code.
This setup lets us debug from Visual Studio (using F5) instead of attaching to the w3wp process, which is annoying. The separate DB is the part I don't really like; I'd prefer a shared one, but with Umbraco's caching model (an XML file) that isn't optimal either - changes to data types etc. are not reflected in the other developers' environments. It also means that sharing changes among developers is a two-stage process: Git plus Courier.
I'd guess people have already come up with some best practices for Umbraco setup in team development - it would be nice to hear about them.
Thanks!
We use a central source control system for the code and share one database among all developers. This works quite well; the only thing to keep in mind is that after a change or update from the source control repository, you need to refresh the cache (right-click the root content node in Umbraco and choose "Republish entire website").
With this setup we all share the code and database in the development stage. Courier can then be used to transfer Umbraco content back and forth to the test and production environments.

Recommend a Visual Studio FTP deployment plug-in

We've recently stumbled across the excellent Dispatch for ASP FTP deployment plug-in. It looks great apart from one thing: it doesn't work with Visual Studio 2010, at least not for us. (It's supposed to work fine.)
(Yes, we've tried everything: we've managed to get Dispatch working for another FTP site, but not the main one we regularly deploy to. We have managed to connect to our main site through FileZilla, so the site itself is configured correctly. All settings have been triple-checked, but the software still throws up weird errors (always to do with its internal libraries).)
So does anyone know of any other comparable FTP-based, deployment plug-in for Visual Studio?
Here's what Dispatch does (and so any suggested replacement must do):
Monitor any altered files in the project. When a file is changed, it's added to a list of files to be deployed.
To deploy these files to the live site, all we need to do is click "Upload" and the plug-in connects via FTP to our live site and uploads the selected files.
We can filter out any filenames we don't want to be monitored/uploaded (e.g. .cs, web.config, /Images/, etc.).
I think that's all the features that we need. Thanks for any suggestions!
Note: If you're wondering why we need such a plug-in, it's because we deploy many site changes over the course of a single day. Publishing the entire project to a folder, zipping it up, FTPing the zip file, unzipping it and then installing the whole project into the live wwwroot takes far too long. With Dispatch you can upload individual files in a single click.
After much back and forth between me and the creator of Dispatch, we managed to narrow down the problem to the library he was using (Rebex FTP). I posted a question about the issue on the Rebex forums, and it was revealed that their software might have a bug with IIS7.5. They suggested a quick hack/fix, which I tested and discovered worked.
Mr. Dispatch then quickly implemented this hack/fix into his software, and lo! I had a fully working copy of Dispatch... So no need for a replacement any more!
(And from what I've seen, there aren't even any other plug-ins offering this functionality, so it's just as well.)
Just an update: Dispatch does not work with VS 2013, so if you have VS 2012 with Dispatch installed, keep it. Also, the website is gone, so it looks like all development has ceased. I have been using Dispatch since VS 2005 and it has been great for sending single files up when I need to. Too bad it is gone.
I built a very simple one for myself: you right-click the file in Solution Explorer and it uploads that file based on a settings file you create.
It's super crude, but it works, and the source is there to make it better if you like:
https://github.com/garazy/vs-2017-ftp-upload
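For what it's worth, the core of such a tool boils down to a single-file FTP upload. Here's a rough sketch using the built-in FtpWebRequest class; the host, credentials and paths are placeholders, and this is not code taken from the repository above:

    using System.IO;
    using System.Net;

    static class SingleFileUploader
    {
        // Uploads one local file to the corresponding path on the FTP server.
        public static void Upload(string localPath, string remoteUrl, string user, string password)
        {
            // remoteUrl e.g. "ftp://www.example.com/wwwroot/Views/Home/Index.cshtml"
            var request = (FtpWebRequest)WebRequest.Create(remoteUrl);
            request.Method = WebRequestMethods.Ftp.UploadFile;
            request.Credentials = new NetworkCredential(user, password);

            using (var source = File.OpenRead(localPath))
            using (var target = request.GetRequestStream())
            {
                source.CopyTo(target);
            }

            using (var response = (FtpWebResponse)request.GetResponse())
            {
                System.Console.WriteLine(response.StatusDescription); // e.g. "226 Transfer complete."
            }
        }
    }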
Big enhancements were added in VS 2008, VS 2010 and VS 2012; the article below covers them. I found that Microsoft had already addressed all of the above while I was searching, and since this question came up high in my search results, I thought I should share it.
Deploy a Web Application Project Using One-Click Publish Without Web Deploy

Visual Source Safe - Removing files from web projects

I'll try to make this as straightforward as possible.
Currently our team has a VSS database where our projects are stored.
Developers grab the code and place on their localhost machine and develop locally.
Designated developer grabs latest version and pushes to development server.
The problem is, when a file is removed from the project (by deleting it in VS2008) then the next time another developer (not the one who deleted it) checks in, it prompts them to check in those deleted files because they still have a copy on their local machine.
Is there a way around this? To have VSS instruct the client machine to remove these files and not prompt them to check back in? What is the preferred approach for this?
Edit Note(s):
I agree SVN is better than VSS
I agree Web Application project is better than Web Site project
Problem: This same thing happens with files which are removed from class libraries.
Your number one way around this is to stop using Web Site Projects. Web Site Projects cause Visual Studio to automatically add anything it finds in the project path to the project.
Instead, move to Web Application Projects, which don't have this problem.
Web Site Projects are really only suited to single-person development.
UPDATE:
VB shops from days gone by had similar issues, in that whatever they had installed affected the build process. You might take a page from their playbook and keep a "clean" build machine. Prior to doing a deployment you would delete all of the project folders, then do a Get Latest. This way you can be sure that the only thing deployed is what you have in source control.
Incidentally, this is also how the TFS Build server works. It deletes the workspace, then creates a new one and downloads the necessary project files.
Further, you might consider using something like Cruise Control to handle builds.
Maybe the devs should take care to only check in or add things they have actually been working on. It's kind of sloppy if they are adding things they weren't even using.
Your best solution would be to switch to a better version control system, like SVN.
At my job we recently acquired a project from an outsourcing company who did use VSS as their version control. We were able to import all of the change history into SVN from VSS, and get up and running pretty quickly with SVN at that point.
And with SVN, you can set up ignores for files and folders, so those files in your web projects don't get put into SVN, and the ignore attributes are checked out onto each developer's machine.
I believe we used VSSMigrate to do the migration to SVN http://www.poweradmin.com/sourcecode/vssmigrate.aspx
VSS is an awful versioning system and you should switch to SVN, but that has nothing to do with the crux of the problem. The project file contains references to the files that are actually part of the project. If the Visual Studio project file isn't checked in along with the changes to it, there's no way for any other developer to be fully updated, hence the prompts about deleted files when they grab the latest from VSS. From there you've got multiple choices...
Make the .vbproj part of the repository. Any project-level changes will then be part of the commit and other developers will pick them up. The problem here is that the project file is also going to end up on the dev server. Ideally you would use nearly the same process to deploy to dev as you would for a release deployment. That leads into the other option...
SVN gives you hooks for almost all major events, where hooks are literally just a properly named batch file or exe. For your purposes, you could use a post-commit hook to push the appropriate files, say via FTP, to the server on every commit. File problem solved and, more importantly, you're a step closer to continuous integration.
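As a rough sketch of that idea (not a drop-in solution): a post-commit hook can be an executable named post-commit.exe in the repository's hooks folder, which SVN calls with the repository path and the revision number. This sketch uses svnlook (assumed to be on the PATH) to list the changed files; the actual upload to the dev server is left as a comment:

    using System;
    using System.Diagnostics;

    class PostCommit
    {
        // SVN invokes the hook as: post-commit.exe <repository-path> <revision>
        static void Main(string[] args)
        {
            var repo = args[0];
            var revision = args[1];

            // Ask svnlook which files changed in this revision.
            var psi = new ProcessStartInfo("svnlook", $"changed -r {revision} \"{repo}\"")
            {
                RedirectStandardOutput = true,
                UseShellExecute = false
            };

            using (var svnlook = Process.Start(psi))
            {
                string line;
                while ((line = svnlook.StandardOutput.ReadLine()) != null)
                {
                    // Each line looks like "U   trunk/Views/Home/Index.cshtml".
                    var firstSpace = line.IndexOf(' ');
                    if (firstSpace < 0) continue;
                    var path = line.Substring(firstSpace).Trim();

                    Console.WriteLine("Would push: " + path);
                    // ...upload 'path' to the dev server here, e.g. via FTP.
                }
                svnlook.WaitForExit();
            }
        }
    }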
Something you may want to consider doing:
Get Latest (Recursive)
Check In ...
It's a manual process, but it may give you the desired result; plus, if VS mentions deleted files, you know they should be deleted from the local machine in step 1.
