I have an ASP.NET website with a very slow query on a remote database. To address that problem I cache the result in Application_Start:
HttpRuntime.Cache["Folders"] = (from f in db.Folders select f).ToList();
In the controller:
var folderList = (List<Folder>)HttpRuntime.Cache["Folders"];
It takes quite a while to load the website the first time, but once it's up it's fast. I also use the new serverAutoStart="true" feature of IIS so the website is always running with the cache loaded. Even if the application pool restarts, IIS loads the website in a new w3wp process and switches processes once the new instance is ready, resulting in no downtime or slow start-up.
Now I would like to reload the cache when a certain controller action occurs. Is it possible to reload it asynchronously, without blocking the whole website or the session that triggered the action? I would also like the current Cache["Folders"] to keep working during the operation.
You can make use of Parallel Tasks:
var repopulateCache = new System.Action(() => RepopulateCache(someParameter));
Task.Factory.StartNew(repopulateCache);
Make sure you know what you are doing before using this; try reading some background information first.
That said, I've had this in production for a while and it works very well, and it means you don't have to worry (too much) about the usual perils of threading. It will create background work on your web server, but in your case that doesn't sound like a big risk.
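For completeness, here's a minimal sketch of what RepopulateCache could look like; the method name, the cacheKey parameter, and MyDbContext are assumptions, not code from the question. The old Cache["Folders"] entry keeps serving requests while the query runs, because nothing is removed up front:

private void RepopulateCache(string cacheKey)
{
    // MyDbContext is a hypothetical EF context standing in for the db object above.
    using (var db = new MyDbContext())
    {
        var freshList = (from f in db.Folders select f).ToList();
        // The assignment is atomic: readers get either the old list or the
        // new one, never a half-built one.
        HttpRuntime.Cache[cacheKey] = freshList;
    }
}

And from the controller action that triggers the refresh:

Task.Factory.StartNew(() => RepopulateCache("Folders"));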
I have built a new site for a customer, taken over managing their domain, and moved it to new hosting. The previous site and hosting have been completely taken down.
I am running into a major issue that I am not sure how to fix. The previous developer used a service worker to cache and load the previous site. The problem is that users who had previously visited the site keep seeing the old one, since it all loads from the cache. The old site no longer even exists, so I have no way of adding any JavaScript to remove the service worker from their browsers unless they hit the new site.
Has anyone ever had this issue and know of a way to resolve it? Note, asking the users to delete the service worker from their browser won't work.
You can use cache busting to achieve this. As per KeyCDN:
Cache busting solves the browser caching issue by using a unique file version identifier to tell the browser that a new version of the file is available. Therefore the browser doesn't retrieve the old file from cache but rather makes a request to the origin server for the new file.
In case you want to update the service worker itself, you should know that an update is triggered if any of the following happens:
A navigation to an in-scope page.
A functional event such as push or sync, unless there's been an update check within the previous 24 hours.
Calling .register() only if the service worker URL has changed. However, you should avoid changing the worker URL.
Updating the service worker
Maybe using the Clear-Site-Data header would be the most thorough solution.
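As a rough sketch, assuming the new site is also ASP.NET (the question doesn't say): sending the header from Global.asax tells supporting browsers to drop the origin's cached data, including old service worker registrations. The header only works over HTTPS, and you'd remove it once the old worker has died out:

// Global.asax.cs
protected void Application_BeginRequest(object sender, EventArgs e)
{
    // "cache" clears the HTTP cache; "storage" removes DOM storage and
    // service worker registrations for the origin.
    Response.AddHeader("Clear-Site-Data", "\"cache\", \"storage\"");
}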
I've created a website that is hosted in IIS on a server local to my company; it will only be used by employees of the business. I've been using Chrome dev tools to monitor the network activity and see loading times etc. I've noticed that my TTFB is always really slow, even when I'm loading very little data.
I read an article earlier that stated the TTFB should ideally be less than 500ms, yet mine is routinely over 3 or 4 seconds at least!
I've included a screenshot of the dev tool page showing what I mean:
The particular page I've loaded is the home page. It connects to an MSSQL DB using EF6 and does one query to check if a person exists in a table with about 200 entries. Here is the code:
using (var db = new DbContext())
{
    var ad = httpContext.User.Identity.GetActiveDirectory();
    var person = db.PERSON.Include(p => p.PERSON_PRIVILEGE_ALLOC).FirstOrDefault(p => p.ACTIVE_DIRECTORY == ad);
}
If I load this same page on my machine using IIS Express, the TTFB for the HTML doc is 35 ms, compared to a TTFB of 3450 ms when loaded on our main server.
This is my first proper website to be hosted, so I'm relatively new to this. I understand what TTFB is, but I'm unsure what is taking so long. The DB query is small and quick, the HTML isn't oversized, and the CSS and JS are about 1 MB when compressed using GZIP.
Does anyone have any info or suggestions to what is causing this and how it could be sped up? Could it potentially just be caused by the server?
It seems to be a problem with the server or server configuration.
Maybe the connection to Active Directory or the database takes a long time.
You could diagnose performance issues with the Stopwatch class.
Start the stopwatch before the call to AD and log the elapsed time when the call returns. Start a second stopwatch for the database query and log the result too.
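A minimal sketch, reusing the code from the question (GetActiveDirectory and the entity names are the asker's own):

var swAd = System.Diagnostics.Stopwatch.StartNew();
var ad = httpContext.User.Identity.GetActiveDirectory();
swAd.Stop();

using (var db = new DbContext())
{
    var swDb = System.Diagnostics.Stopwatch.StartNew();
    var person = db.PERSON.Include(p => p.PERSON_PRIVILEGE_ALLOC)
                          .FirstOrDefault(p => p.ACTIVE_DIRECTORY == ad);
    swDb.Stop();
    // Log however you normally would; Trace is just one option.
    System.Diagnostics.Trace.WriteLine(string.Format(
        "AD: {0} ms, DB: {1} ms", swAd.ElapsedMilliseconds, swDb.ElapsedMilliseconds));
}

If both numbers come back small, the time is going somewhere else (DNS, proxies, authentication handshakes) and you can rule out the application code.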
I'm working on a single page web app which needs to load a lot of data on startup. An initial load can take up to 10 seconds, which can be quite frustrating when you just want to fix/check a minor change.
There are two ajax calls on startup which require the most time. Ideally I'd have some proxy running which can cache these calls as long as I like. It should also be possible to disable these responses easily.
This can be achieved quite easily using the AutoResponder functionality of Fiddler.
Just save the response of the query and map it to a rule in the AutoResponder rules.
Practical Challenge:
I have a LoadRunner script that runs against a mocked app which does not have a logout button (yet).
The test runs fine with stable response times for about 10 minutes, but after that the response times spike, the server goes to 99% memory usage, and transactions start to fail.
I suspect this is because the script does not terminate the vusers after each run, so it builds up a lot of running sessions against the server that are never terminated. But I might be wrong.
Anyway, I want to programmatically close each session after it has completed the business process.
I have read somewhere that web_set_sockets_option("SHUTDOWN_MODE", "ABRUPT") could be used for this, but I want to be sure that this function actually does what I want. And what does 'ABRUPT' mean?
Are there better ways of closing sessions? Clicking the browser's close button during recording does not result in anything being captured in the script.
It's a server issue with session aging. Your server admin can adjust the timeout values for sessions on which no activity has taken place. By default most places have this set at 30 minutes. Trim it to what you need rather than taking the default value on the server.
Also, you may have hit a leak situation if resources are constantly accumulated on the server side but never released.
Based on your question I assume you're using the WEB/HTML protocol. I agree that the core issue is that your app's sessions should expire more elegantly and probably sooner. But to get past it while testing, you can try the following. It isn't a guarantee, but it has sometimes worked for me in similar situations. Change your Run-time Settings for the script:
Run-time Settings > Browser > Browser Emulation
Make sure you have the box checked for "Simulate a new user on each iteration". You can also try playing with the other settings here, like clearing the cache each iteration. This can cause a new connection to be set up with the web server for each iteration, depending on the server's session settings. Again, this isn't 100%, but it has worked for me from time to time.
Try this to close the keep-alive connections at the end of the iteration:
web_set_sockets_option("CLOSE_KEEPALIVE_CONNECTIONS", "1");
What is the best way to check memory usage in an ASP.NET MVC3 application?
I have been told by my hosting provider to recycle the IIS application pool every so often to improve the speed of the site. Is this 'recommended practice'? Surely I shouldn't need to restart my application every so often? I'd much rather find out if it is an issue with memory usage in my application and correct it. So any tips and best practices you use would be quite helpful too.
The application is based on ASP.NET MVC3, C# and EF Code First. Any guidance, links appreciated.
EDIT:
I found this page after I posted, which is quite useful. But I'd still like to hear any other views.
ASP.NET MVC and EF Code First Memory Usage
Thank you
I have a site that never recycles (until the machine is rebooted weekly).
Your application generally should keep performing fine. If it doesn't, there is some leak.
This can occur because:
Cache never expires
Session storage keeps growing and never times out
ObjectContexts are never disposed and are kept in the session (see the sketch after this list), etc.
Objects that should be disposed aren't
Objects that are created via a dependency injection container aren't set up to release after each request, and thus potentially have internal collections that keep growing.
There are more causes, but these are a few of the main ones.
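To illustrate the disposal points, a minimal sketch with a hypothetical MyDbContext: create the context per unit of work and dispose it when done, instead of parking it in the session where its tracked entities pile up:

public ActionResult Index()
{
    using (var db = new MyDbContext())
    {
        // AsNoTracking also stops the context from holding on to the
        // entities after the query completes.
        var folders = db.Folders.AsNoTracking().ToList();
        return View(folders);
    }
}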
So the answer really is: there is no best practice; it depends on your app.
If you are worried about current sessions during a restart, keep in mind that a restart can be quick, current requests are (sometimes) allowed to finish, and forms authentication tokens survive the restart. Sessions, however, will not, unless you configure an out-of-process state server.
If your memory usage keeps growing, set up a restart schedule; otherwise restart once a week or never, or set things up so the app resets once memory passes a chosen threshold (a sketch of the in-code variant follows the link). ASP.NET will also restart automatically once a certain threshold is reached, based on what the host has set for memoryLimit:
http://msdn.microsoft.com/en-us/library/7w2sway1.aspx
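If you'd rather trigger the reset from code than rely on the host's settings, a hedged sketch: check the managed heap (only a rough proxy for real process memory) at the end of each request and recycle past a threshold you choose. HttpRuntime.UnloadAppDomain() triggers a normal app-domain restart:

// Global.asax.cs; 500 MB is an arbitrary example threshold.
void Application_EndRequest(object o, EventArgs a)
{
    const long threshold = 500L * 1024 * 1024;
    // GC.GetTotalMemory measures the managed heap only, so treat it
    // as an approximation.
    if (GC.GetTotalMemory(false) > threshold)
        HttpRuntime.UnloadAppDomain();
}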
By default IIS recycles the application pool automatically at a fixed interval (29 hours, I think), and that is surely set by the host, no matter how little or how much memory the process is using. The recycling trigger can be a time interval or the process hitting a certain memory usage limit. I'm sure any shared host has both set.
About memory usage, you can use the GC.GetTotalMemory method, which will give you an approximate figure. Even Perfmon readings aren't very accurate, but they give you an idea.
//global.asax.cs
void Application_EndRequest(object o, EventArgs a)
{
    // Only append the figure to HTML responses.
    var ctype = Context.Response.ContentType;
    if (ctype == null || !ctype.Contains("text/html")) return;
    Context.Response.Write(string.Format("<p>Memory usage: {0}</p>", GC.GetTotalMemory(false)));
}
Be aware that you'll see the usage increasing until the GC kicks in, after which the usage will drop to a more 'realistic' value.
If you have the money, I recommend a specialized tool such as the Memory profiler.
Other things you can do to at least be ready if the application has memory or performance problems:
Proper layering of the application means you can refactor the more inefficient parts without affecting the others.
The Repository pattern will be very helpful: you can start with EF, find out that EF uses too much memory (as in the link you found), and then switch the repository implementation to PetaPoco or Dapper.NET (see the sketch at the end of this answer).
In general an ORM is a fairly heavy library; if the application doesn't need ORM features but just a quick way to work with a db, use a micro-ORM like those mentioned above from the beginning.
Always dispose objects implementing IDisposable.
When dealing with large sets of db records, use pagination. It's good for both server resource usage and user experience.
Apply the YAGNI (You Ain't Gonna Need It) principle as much as possible; this somehow implies a bit of TDD :)
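A minimal sketch of the repository idea from the list above, with all names hypothetical: controllers depend on the interface, so the EF-backed implementation can later be swapped for a Dapper or PetaPoco one without touching them.

public interface IFolderRepository
{
    IEnumerable<Folder> GetAll();
}

// EF-backed implementation; a Dapper/PetaPoco version would expose
// the same interface.
public class EfFolderRepository : IFolderRepository
{
    public IEnumerable<Folder> GetAll()
    {
        using (var db = new MyDbContext())
        {
            return db.Folders.AsNoTracking().ToList();
        }
    }
}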