It seems that whenever I upload a file to my shared hosting site, the site goes offline while it compiles, no matter how slight the change is. It appears to be compiling the whole site; I have something like 5000+ pages. My questions are:
Is this normal on a shared hosting site?
Is there something I've missed or left out of the web.config that will force the compiler to compile a page only when it is accessed, rather than compiling the whole site as it seems to be doing?
Yes, this is normal: ASP.NET batch-compiles pages into per-directory assemblies, so even a small change can invalidate assemblies covering a large part of the site and trigger widespread recompilation. It's recommended to do deployments like this during off-peak hours because of the delay.
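If the immediate goal is to have pages compiled only as they are accessed, one web.config setting worth trying is disabling batch compilation (a sketch; test it on a staging copy first, since it trades the one-time delay for slower first hits on each individual page):

    <configuration>
      <system.web>
        <!-- Compile each page individually on first request instead of
             batch-compiling whole directories up front. -->
        <compilation batch="false" />
      </system.web>
    </configuration>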
One thing you can do is precompile the site before deploying it to the server. You can find this option when publishing the site in Visual Studio (right-click the project node in Solution Explorer and choose Publish).
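The same precompilation can also be done from the command line with the ASP.NET compilation tool, roughly like this (paths are placeholders, and the framework version folder may differ on your machine):

    rem Precompile the site in C:\MySite into C:\MySiteCompiled,
    rem then upload the output folder to the host.
    "%WINDIR%\Microsoft.NET\Framework\v4.0.30319\aspnet_compiler.exe" -v / -p C:\MySite C:\MySiteCompiled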
Related
When you publish through Visual Studio to Azure, you have the option to precompile your pages. There are quite a few options explaining what each one does, but nothing about the trade-offs.
Why wouldn't I check 'Allow [...] to be updatable'?
Why merge into one file? Is this faster? Does it load faster or compile faster?
There is some information on the Microsoft site, but it doesn't really go into depth on the points above.
https://msdn.microsoft.com/en-us/library/hh475319(v=vs.110).aspx
I found information about these options scattered on different pages of Microsoft's documentation. Here is what I found by putting things together:
Why wouldn't I check 'Allow precompiled site to be updatable'?
This might answer your question:
If a compiled site is updatable, UI content can be updated without recompiling the site. When a Web site is updatable, the content files remain in their original folders and only the associated code files are merged. If the site is not updatable, .ascx, .master, and .skin content files are removed from their original folder. ASP.NET .aspx files are replaced with a marker file that has no content. UI content and code are merged in this case.
Source: https://msdn.microsoft.com/en-us/library/bb397866.aspx
An updatable site stores the source files of the UI content and does not compile them; only the non-UI code is precompiled. This makes updating the site really easy, since you can change the code of one web page without having to precompile the whole website. The downside is that the web pages cannot be precompiled and are compiled every time a user requests the page (minus whatever caching occurs), which reduces page-load performance.
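For reference, when precompiling from the command line this is controlled by the -u switch of aspnet_compiler.exe (a sketch with placeholder paths):

    rem Updatable precompile: .aspx/.ascx/.master markup stays editable on
    rem the server; only the code-behind is precompiled.
    aspnet_compiler -v / -p C:\MySite -u C:\MySiteUpdatable

    rem Non-updatable precompile: markup is compiled too, .aspx files become
    rem contentless markers, and page requests skip on-demand compilation.
    aspnet_compiler -v / -p C:\MySite C:\MySitePrecompiled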
Why merge into one file? Is this faster? Does it load faster or compile faster?
Merging files makes deployment easier since there are fewer files to upload. It also permits more optimization, since the compiler can do batch optimization across multiple web pages. However, when merging everything, the site has to be completely redeployed on every change (versus only deploying the assemblies that have been updated).
Here is a rundown of the trade-offs of each option (a rough command-line equivalent of each is sketched after the reference links below):
Merge all outputs to a single assembly: merging everything into one file makes deployment easier since there is only one file to upload. Everything is compiled into the same package, which permits batch optimizations and makes page loads faster. However, if one part of the site changes, the whole website has to be uploaded again. If your website is not very big, this is a good option.
Merge each individual folder output to its own assembly: makes deployment easier while avoiding the need to upload the whole site on every change; only the folder containing the updated code needs to be recompiled and redeployed.
Merge all pages and control outputs to a single assembly: puts all the UI inside the same assembly without merging the non-UI code. This lets you update the non-UI code without having to redeploy the UI code.
Do not merge: the code files are compiled, but the UI content is not precompiled, so a web page's UI is compiled every time a user requests it (minus whatever caching occurs), which makes page loads longer. However, since the UI is not compiled, if you need to edit one web page you can upload the new version of that specific file to the production server without having to recompile any part of the website. This is good for big websites for which a full redeployment on every change isn't practical.
Do not merge. Create a separate assembly for each page and control: compiles every page into its own assembly. You get the speed of precompiled code, but at the cost of preventing the compiler from doing batch optimizations across pages (slightly longer page loads).
For more information about the merging and compilation of ASP.NET websites:
ASP.NET Merge Tool (Aspnet_merge.exe)
ASP.NET Compilation Tool (Aspnet_compiler.exe)
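For illustration, the publish options above map roughly onto aspnet_merge.exe switches like these, run against the output of aspnet_compiler.exe (assembly names and paths are placeholders):

    rem Merge all outputs to a single assembly:
    aspnet_merge C:\MySitePrecompiled -o MySite.All

    rem Merge all pages and control outputs to a single assembly,
    rem leaving the non-UI code assemblies alone:
    aspnet_merge C:\MySitePrecompiled -w MySite.Web

    rem Merge each folder's output into its own assembly (the default when
    rem neither -o nor -w is given); -prefix sets a common name prefix:
    aspnet_merge C:\MySitePrecompiled -prefix MySite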
I just wanted to post here because it might be relevant to other people. I had a large legacy ASP.NET Framework 4.7 project running in Azure.
I had a lot of problems with the "live" compilation of pages on Azure, and by a lot I mean really a lot. Sometimes I hit a page that wasn't precompiled and Azure seemed to exhaust all its resources on the compilation alone, bringing the whole application down. After a restart it took 8 (!!) minutes before it could handle the first hit; locally it was only about 30 seconds.
During the Corona lockdown I finally had time to move to .NET Core, and all these problems instantly went away.
Although Microsoft says it will continue supporting .NET Framework for a long time, it is clear to me that they no longer have any passion for the project. The issues I had in combination with Azure were ridiculous.
I strongly recommend migrating as soon as possible. Even for a large project, it was less painful than I had imagined.
A friend has asked me to do some work on his existing site, which was built in RapidWeaver. I'm on Windows, so is there another way I can access and edit his site?
The RapidWeaver project file is really meant to be edited only in RapidWeaver. As far as I know, the only way around this is to use an HTML editor to modify the pages that are already on the server. However, I would not recommend doing that unless you are never going back to RapidWeaver, because changing the files on the server does not update the local RapidWeaver project. You could end up editing something on the server, then going back to RapidWeaver and uploading a "new" version that is not actually up to date (the earlier server-side changes would be overridden by the older RapidWeaver project).
For that kind of work, a CMS (Content Management System) is a more flexible way to work. Nowadays one of the most common is WordPress. It requires some initial setup, but once it is working it can be updated from anywhere via a web browser, or even from an app on your iPhone. But it is not a RapidWeaver-based solution.
There are a couple of CMS-related plugins or stacks for RapidWeaver (Dropkick CMS, Armadillo, Easy CMS, Total CMS...) that could also be useful in this context. Once again, you would first need to buy a licence and set up the website using one of those plugins or stacks; only then would you be able to edit on the go.
We've recently launched a new website in Azure (i.e. Azure Websites) and, as is typical with new launches, we've had to deploy a few tweaks to fix minor issues shortly after launch.
We want to use deployment Slots in the long run, but this is not possible at the moment, so we are deploying to the live site. It's a fairly busy site with a good amount of traffic, and we obviously want to keep downtime to a minimum.
We are using Visual Studio to publish file changes to Azure, but have noticed that even if we publish a relatively insignificant single file, the whole site goes down and struggles to come back up. I had assumed that publishing a single file would literally just replace that file on the file system, but it behaves more like it recycles the application pool (or the Azure equivalent) for the site. The files I've been publishing have been Razor views, which would not typically cause a recycle.
Does anyone know what actually happens under the hood of VS Publish and if there is a way to avoid this happening?
Thanks.
I just tried this using a basically clean new MVC app (https://github.com/KuduApps/Dev14_Net46_Mvc5), and I did not see this behavior. The Index.cshtml view has a hit count based on a static field, which tells us whether the app or the page got restarted (or whether that specific page got recompiled).
The test is then to publish it, make a change to some other view (About.cshtml), and publish again. When doing this and hitting Index.cshtml, the count keeps going up, and there is minimal slowdown.
If you see it getting restarted after a view change, I suggest using Kudu Console to look at the files in site\wwwroot before/after the publish, and check what has a newer timestamp (e.g. check web.config, bin folder, ...).
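For reference, the hit counter mentioned above is along these lines (a minimal sketch, not the exact code from that repo):

    // Minimal sketch of the restart-detection counter: a static field
    // lives as long as the app domain, so if the count resets to 1 after
    // a publish, the app was restarted rather than just a file replaced.
    using System.Threading;
    using System.Web.Mvc;

    public class HomeController : Controller
    {
        private static int _hits; // survives requests, dies with the app domain

        public ActionResult Index()
        {
            ViewBag.Hits = Interlocked.Increment(ref _hits);
            return View(); // Index.cshtml renders @ViewBag.Hits
        }
    }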
I recently changed my project from a Website Project to a Web Application project so I could use build events.
I'm having all sorts of problems now trying to develop.
When I build the project and reload it in the web browser, it hardly ever loads the right version (though sometimes it does).
For example, if I make a simple update to some text, it loads the previous page, as if it's using a cached version of the page or something.
Also, when I try to debug, it never hits the breakpoints. I'm not sure what I need to change to fix this; any help would be greatly appreciated.
What browser are you using? The chances are the pages are being cached. With most browsers you can do Ctrl + F5 to force a reload of the cached files.
In Internet Explorer, you can change the option for Temporary Internet Files to 'Every time I visit the webpage', but be aware that this affects all sites you visit.
If you are talking about changes to binaries, you might need to do an iisreset. If you are still having issues, try deleting ASP.NET temp files.
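Concretely, something along these lines from an elevated command prompt (the framework version folder may differ on your machine, and IIS should be stopped before clearing the temp files):

    rem Restart IIS so new binaries are picked up.
    iisreset

    rem Delete the ASP.NET temporary files for .NET 4.x.
    rd /s /q "%WINDIR%\Microsoft.NET\Framework\v4.0.30319\Temporary ASP.NET Files"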
Out of nowhere the "Publish Web Site" option in Visual Studio isn't doing anything for a particular project. It still compiles the site (no errors), but it never copies the output to the destination directory. I've tried changing the destination to a number of locations, and it creates the new folder, but never copies the output. I tried other projects and they seem to work fine. What would stop one project from copying the output?
Update: This is a web site project, if that makes any difference.
It appears the problem was with the "Allow this precompiled site to be updateable" option. This had been turned off at some point and was causing the problem; turning it back on caused the files to be copied again. Why should this make a difference? Is there another setting somewhere that needs to work in conjunction with this one?