I have a new website project created in ASP.NET Core 3.1 with Razor Pages. The site contains a lot of content images and videos, and every time I publish via FTP it takes about 5-10 minutes. It seems unacceptable to have the website down that long just to publish a spelling fix in a view. Am I missing something? Do I really need to republish the entire site every time?
Is there a way to at least exclude all static files and push them up only when needed?
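For what it's worth, here is a minimal sketch of one way to keep heavy static assets out of the publish output so they only go up when you push them deliberately. This assumes the media lives under wwwroot/media; the path is illustrative:

    <!-- In the .csproj: keep large static assets out of the publish output. -->
    <!-- "wwwroot\media" is an illustrative path; adjust it to your layout.  -->
    <ItemGroup>
      <Content Update="wwwroot\media\**\*" CopyToPublishDirectory="Never" />
    </ItemGroup>

You would then upload that folder once over FTP and leave it alone on later publishes (assuming the publish profile isn't set to remove additional files at the destination).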
When you publish through Visual Studio to Azure, you have the option to precompile your pages. There are quite a few options that explain what each one does, but nothing about the trade-offs.
Why wouldn't I check 'Allow [...] to be updatable'?
Why merge into one file? Is this faster? Does it load faster or compile faster?
There is some information on the Microsoft site, but it doesn't really dive into the points above.
https://msdn.microsoft.com/en-us/library/hh475319(v=vs.110).aspx
I found information about these options scattered on different pages of Microsoft's documentation. Here is what I found by putting things together:
Why wouldn't I check 'Allow precompiled site to be updatable'?
This might answer your question:
If a compiled site is updatable, UI content can be updated without recompiling the site. When a Web site is updatable, the content files remain in their original folders and only the associated code files are merged. If the site is not updatable, .ascx, .master, and .skin content files are removed from their original folder. ASP.NET .aspx files are replaced with a marker file that has no content. UI content and code are merged in this case.
Source: https://msdn.microsoft.com/en-us/library/bb397866.aspx
An updatable site stores the source files of the UI content and does not compile them; only the non-UI code is precompiled. This makes updating the site really easy, since you can change one web page without having to precompile the whole website. The downside is that the web pages cannot be precompiled and are compiled every time a user requests the page (minus the caching that might occur), which reduces page-load performance.
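As an illustration, updatable precompilation corresponds to the -u switch of the compilation tool (a sketch; the paths are made up):

    rem Precompile in updatable mode: code-behind is compiled, but
    rem .aspx/.ascx markup stays on disk as editable source files.
    aspnet_compiler.exe -v /MySite -p C:\src\MySite -u C:\publish\MySite

Dropping the -u switch produces the fully precompiled, non-updatable output described in the quote above.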
Why merge into one file? Is this faster? Does it load faster or compile faster?
Merging files together makes deployment easier since there are fewer files to upload. It also permits more optimization, since the compiler can do batch optimization across multiple web pages. However, when merging everything, the site has to be completely redeployed on every change (vs. only deploying the assemblies that have been updated).
Here is a rundown of the trade-offs of each option:
Merge all outputs to a single assembly: merging everything into one file makes deployment easier since there is only one file to upload. Everything is compiled into the same package, which permits batch optimizations and makes page loads faster. However, if one part of the site changes, the whole website has to be uploaded again. If your website is not really big, this is a good option.
Merge each individual folder output into its own assembly: makes deployment easier while avoiding the need to upload the whole site on every change. Only the folder containing the updated code needs to be recompiled and redeployed.
Merge all pages and control outputs to a single assembly: puts all the UI inside the same assembly without merging the non-UI code. This lets you update the non-UI code without having to redeploy the UI code.
Do not merge: the code files are compiled, but the UI content is not precompiled, so a page's UI is compiled every time a user requests it (minus the caching that might occur), which makes page loads longer. However, since the UI is not compiled, if you need to edit one web page you can upload the new version of that specific file to the production server without having to recompile any part of the website. This is good for big websites that can't afford to be completely redeployed.
Do not merge. Create a separate assembly for each page and control: compiles every page into its own assembly. You get the speed of precompiled code, but at the cost of preventing the compiler from doing batch optimizations across multiple pages (slightly longer page loads).
For more information about the merging and compilation of ASP.NET websites:
ASP.NET Merge Tool (Aspnet_merge.exe)
ASP.NET Compilation Tool (Aspnet_compiler.exe)
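For a concrete feel, the merge modes above map roughly onto aspnet_merge.exe switches (a sketch; the assembly names and path are illustrative):

    rem Merge all outputs into a single assembly:
    aspnet_merge.exe C:\publish\MySite -o MySite.AllMerged

    rem Merge only the page and control (UI) outputs into one assembly,
    rem leaving the non-UI code assemblies alone:
    aspnet_merge.exe C:\publish\MySite -w MySite.UI

    rem The default mode merges each folder's output into its own assembly;
    rem -prefix just controls the naming of those per-folder assemblies:
    aspnet_merge.exe C:\publish\MySite -prefix MySite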
I just wanted to post here because it might be relevant to other people. I had a large legacy project in ASP.NET Framework 4.7 running in Azure.
I had a lot of problems with the "live" compilation of pages on Azure. And by a lot, I mean really a lot. Sometimes I hit a page that wasn't precompiled and Azure seemed to exhaust all its resources just on the compilation, bringing the whole application down. After a restart it took 8 (!!) minutes before it could handle the first hit. Locally it was only about 30 seconds.
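For anyone stuck on .NET Framework for now, one mitigation worth sketching is to force precompilation at publish time so the server never compiles views on the first hit. This assumes a Visual Studio publish profile (.pubxml); these are the properties behind the "Precompile during publishing" checkbox:

    <!-- In the .pubxml publish profile, inside the existing PropertyGroup: -->
    <!-- compile views at publish time instead of on the first request.    -->
    <PrecompileBeforePublish>true</PrecompileBeforePublish>
    <EnableUpdateable>false</EnableUpdateable>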
During the Corona period I finally had time to move to .NET Core, and all these problems instantly went away.
Although Microsoft says it will continue supporting .NET Framework for a long time, it is clear to me that Microsoft doesn't have any passion for that project anymore. The issues I had in combination with Azure were ridiculous.
I strongly recommend migrating as soon as possible. Even for a large project, it was less painful than I had imagined beforehand.
We've recently launched a new website in Azure (i.e. Azure Websites) and as is typical with new launches we've had to deploy a few tweaks to fix minor issues shortly after launch.
We want to use Slots in the long run, but this is not possible at the moment, hence we are deploying to the live site. It's a fairly busy site with a good amount of traffic, and we obviously want to keep downtime to a minimum.
We are using Visual Studio to publish file changes to Azure, but have noticed that even if we publish a relatively insignificant single file, the whole site goes down and struggles to come back up. I was assuming that publishing a single file would literally just replace that file on the file system, but it behaves more like it recycles the application pool (or the Azure equivalent) for the site. The files I've been publishing have been Razor views, which would not typically cause a recycle.
Does anyone know what actually happens under the hood of VS Publish and if there is a way to avoid this happening?
Thanks.
I just tried this using a basically clean new MVC app (https://github.com/KuduApps/Dev14_Net46_Mvc5), and I did not see this behavior. The Index.cshtml view has a hit count based on a static, which would tell us if the app or the page got restarted (or if that specific page got recompiled).
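For context, the hit count is just a static field, which lives for the lifetime of the app domain and resets only when the application restarts. A minimal sketch of the idea (my own illustration, not the exact code in that repo):

    using System.Threading;
    using System.Web.Mvc;

    public class HomeController : Controller
    {
        // A static survives across requests; it resets to zero only
        // when the app domain is recycled or the site is restarted.
        private static int _hitCount;

        public ActionResult Index()
        {
            ViewBag.HitCount = Interlocked.Increment(ref _hitCount);
            return View();
        }
    }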
Then the test is to publish it, make a change to some other view (about.cshtml), and publish again. When doing this and hitting Index.cshtml, the count keeps going up, and there is minimal slowdown.
If you see it getting restarted after a view change, I suggest using Kudu Console to look at the files in site\wwwroot before/after the publish, and check what has a newer timestamp (e.g. check web.config, bin folder, ...).
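For example, in the Kudu console (https://<yourapp>.scm.azurewebsites.net/DebugConsole, where <yourapp> is your site name) you can list the deployed files newest-first before and after the publish:

    rem Sort by date, newest first, to see what the publish actually touched:
    cd D:\home\site\wwwroot
    dir /O-D
    dir /O-D bin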
I built a website using WebMatrix 3 with an '.sdf' database. However, I couldn't use the Web Deploy technique since my host doesn't allow it.
The files have an extension of '.cshtml'
I've uploaded the files using FTP, and when I open the links to those pages, they only show the raw code, just as they do when the files are opened manually by double-clicking them without even a localhost.
Is there a way I can host my website with the .sdf database?
Sorry, the ASP.NET Web Pages framework (the '.cshtml' pages you're talking about) needs the host to have the Web Pages framework installed so the pages can run.
IIS needs the Web Pages add-on to work with .cshtml files and to render the HTML from the Razor syntax you describe. I remember once opening a .cshtml page using Opera Web Server, and it gave me the same result: since the server was unable to render the HTML, it treated the code itself as the HTML.
The problem with the .sdf file is that you cannot use it. If the host doesn't support Web Deploy, I suspect it doesn't support the SQL Server CE database either, which means you cannot use it there.
As already said, all you can really do in this scenario is ask your hosting service provider for help. They can guide you further, or point you to another plan where these services are supported.
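As an aside, if the host does run ASP.NET 4.x and it's only the Razor pages that aren't executing, it's worth checking that web.config explicitly enables Web Pages (a hedged sketch; the version value depends on the Web Pages assemblies you deploy):

    <!-- In web.config: explicitly enable ASP.NET Web Pages. -->
    <configuration>
      <appSettings>
        <add key="webpages:Enabled" value="true" />
        <add key="webpages:Version" value="3.0.0.0" />
      </appSettings>
    </configuration>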
We are new to Windows Azure and are developing a Sitefinity web application. At the beginning of the project we deployed the complete code to different environments using Sitefinity Thunder, which publishes everything. But now that we are in the middle of development, we only need to upload newly created or changed files, which can be quite few in number (1 or 2, maybe a handful). If we deploy with Thunder, it publishes all the files and then deploys the complete code, which takes a good amount of time. Is there a way we can deploy only changed or new code files via Sitefinity Thunder, or is there any other way to upload only the changed files?
Please help.
I use Beyond Compare 3 from Scooter Software to move files to our different Sitefinity environments. I haven't used Sitefinity Thunder to deploy my sites before. You might also want to post your question in the Sitefinity Devs group on Google+. Below is the link.
https://plus.google.com/communities/101682685148530961591
This is not easy to do, and Azure is not designed for it, although many people have requested the feature. One way to achieve it is to enable Remote Desktop for the cloud service; by logging onto the server, you can then make some kind of connection to where your files are stored and copy them into the cloud service. However, the instance can always be rebooted, or even re-provisioned from scratch, so I don't know that there are any guarantees this is a safe way to do it.
I'm using Publish/Web Deploy to deploy an ASP.NET application from Visual Studio 2010. It works perfectly, but there is a problem: if the new release is not working as expected, the old version has already been replaced by the new one, and there is no easy way to roll back to the working version. How is this best solved? I wish it were possible to keep the old version on the server so I could just switch back if needed.
With Web Deploy there is no built-in rollback feature, so once you've deployed, that's it.
There are a number of hand-rolled strategies you could put in place, for example:
Limited Access e.g. Shared Hosting:
Where you don't have full access to the machine -
Back up the live site beforehand by downloading it.
Keep copies of what you deployed, so you can push the previous version should something break.
Full Access:
Maintain two sets of folders for the application and map your site to one or the other of these folders. When you come to deploy, switch the IIS site's physical path to the other folder, then deploy. If the site fails, just knock the site back to the original folder. Each successful deploy alternates between these two folders (a sketch of the path switch follows below).
For stuff like user-uploaded content, you'd need to map virtual directories to a place on the file system that's always the same, because you don't want to be copying these around each time.
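As a sketch of the folder-swap idea (assuming IIS 7 or later; the site and folder names are illustrative), flipping the physical path is a one-liner with appcmd:

    rem Point the site at the freshly deployed folder; to roll back,
    rem run the same command again with the previous folder.
    %windir%\system32\inetsrv\appcmd.exe set vdir "MySite/" /physicalPath:"C:\sites\MySite_B"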
You're not the only one who has encountered these issues. Have a look at these articles by Rob Conery and his observations about the state of affairs regarding ASP.NET deployment.
ASP.NET Deployment Needs To Be Fixed
Getting Constructive On ASP.NET Deployment
Using some form of source control would be another alternative. We use Subversion, so if the publish goes bad, we can just update back to the last good revision and publish that. Even if you're the only developer, using source control can be very useful.
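A minimal sketch of that flow with Subversion (the revision number is illustrative):

    rem Find the last good revision, wind the working copy back to it,
    rem then rebuild and publish from there:
    svn log -l 5
    svn update -r 1234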