Disable caching in the Enterprise Library Caching Application Block

We're using the Enterprise Library Caching Application Block to do caching (in memory) in our web service. Works great, no complaints.
We're starting to do some load testing, and I've been asked to disable the cache so we can get a relative idea of the performance gain caching gives us. I thought this would be simple - it turns out it's not.
I can't find any configuration setting to disable the cache. I suppose I could turn down the maximumElementsInCacheBeforeScavenging setting, but is there a better way?
I found one post that suggests creating your own Cache Manager that does nothing - again, is there a better way to do this?

Your best bet is to provide a custom implementation of ICacheManager (interface added in Entlib 4, can't help for earlier ones) that doesn't store anything and never gives a cache hit. Then you configure the block to use your "NullCacheManager" or whatever you want to call it.
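For illustration, a no-op manager can be as simple as the sketch below. It's written against the EntLib 4.x ICacheManager members as I recall them, so verify the exact interface against the EntLib version you're running:

using Microsoft.Practices.EnterpriseLibrary.Caching;

// A cache manager that never stores anything, so every lookup is a miss.
public class NullCacheManager : ICacheManager
{
    // Always report a miss.
    public object this[string key] { get { return null; } }
    public int Count { get { return 0; } }
    public bool Contains(string key) { return false; }
    public object GetData(string key) { return null; }

    // Silently discard everything that gets added.
    public void Add(string key, object value) { }
    public void Add(string key, object value, CacheItemPriority scavengingPriority,
        ICacheItemRefreshAction refreshAction, params ICacheItemExpiration[] expirations) { }

    public void Flush() { }
    public void Remove(string key) { }
}

You then point the block's configuration at NullCacheManager for the load-test runs and switch back afterwards; the exact configuration steps vary a little between EntLib versions.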

Related

Custom static file provider is overridden by feature

I have registered a custom file provider as the static file provider, but it seems to be overridden. I never get any calls to IFileInfo.CreateReadStream, yet the files are still being served. The thing is, I'm trying to add in-memory caching for a few commonly read files, so I was implementing this in my custom file-info class on first read. But now it seems that this is bypassed by IHttpSendFileFeature, which is not at all clear to me. What does IHttpSendFileFeature do that is better than a basic CreateReadStream + CopyTo? Is it implementing some kind of caching itself? Perhaps I shouldn't try to add custom caching here at all. I'm not sure if file caching is already done elsewhere in the pipeline, and perhaps much better. I'm using Kestrel on .NET 6.
I don't want to read these files from disk on every request, since I'm running this on a server with mechanical disks. If I were running this with SSDs, I guess I wouldn't have to worry that much about this, since reads are so fast on SSDs already. But I also assume the OS does a bunch of file paging/caching as well? So perhaps I should just skip all custom caching altogether?
Try to remove the following
app.UseStaticFiles();
which calls
app.UseMiddleware<StaticFileMiddleware>();
If you remove UseStaticFiles() and instead call
app.UseMiddleware<MyStaticFileMiddleware>();
The original StaticFileMiddleware should no longer be called.
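A rough sketch of what such a middleware could look like for the original question's scenario (this is an illustration, not the answerer's code: the class name MyStaticFileMiddleware, the 1 MB cut-off and the 10-minute expiration are assumptions, and it needs services.AddMemoryCache() registered):

using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.StaticFiles;
using Microsoft.Extensions.Caching.Memory;

// Serves small static files from an in-memory cache so repeated requests
// don't have to read from the (mechanical) disk.
public class MyStaticFileMiddleware
{
    private static readonly FileExtensionContentTypeProvider ContentTypes = new();
    private readonly RequestDelegate _next;
    private readonly IMemoryCache _cache;
    private readonly IWebHostEnvironment _env;

    public MyStaticFileMiddleware(RequestDelegate next, IMemoryCache cache, IWebHostEnvironment env)
    {
        _next = next;
        _cache = cache;
        _env = env;
    }

    public async Task InvokeAsync(HttpContext context)
    {
        var file = _env.WebRootFileProvider.GetFileInfo(context.Request.Path);

        // Fall through for misses, directories and anything too big to buffer.
        if (!file.Exists || file.IsDirectory || file.Length > 1024 * 1024)
        {
            await _next(context);
            return;
        }

        // First request reads the disk; later requests are served from memory.
        var bytes = await _cache.GetOrCreateAsync(context.Request.Path.Value!, async entry =>
        {
            entry.SlidingExpiration = TimeSpan.FromMinutes(10);
            await using var source = file.CreateReadStream();
            using var buffer = new MemoryStream();
            await source.CopyToAsync(buffer);
            return buffer.ToArray();
        });

        context.Response.ContentType =
            ContentTypes.TryGetContentType(file.Name, out var type) ? type : "application/octet-stream";
        await context.Response.Body.WriteAsync(bytes);
    }
}

Bear in mind that replacing StaticFileMiddleware this way also gives up its ETag/conditional-request handling, and the OS file cache may already make the disk reads cheap, so it's worth measuring before committing to it.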
Your post is a little unclear to me. How do you serve those files?
The services/features you describe are not registered in the default service collection.

Can WebLogic cache responses to GET requests?

I don't mean using coherence. I am looking for a way to avoid hitting my application to look something up that I've already looked up. When the client performs a GET on a resource I want it to hit the application the first time only and after that return a cached copy.
I think I can do this with Apache and mod_mem_cache, but I was hoping there was a built-in WebLogic solution that I'm just not able to find.
Thanks.
I don't believe there are built-in features to do that across the entire app server, but if you want to do it programmatically, perhaps CacheFilter might work.

Using interception to implement caching - how to define keys?

TL;DR Can someone point me to a thorough implementation of a caching system that is added to the solution through interception?
I'm refactoring one of my solutions so that cross-cutting concerns are implemented through Unity interception. I've read the guides from MSFT, and now I think I can very easily implement the interception behaviors.
However, I was wondering about caching; I want to use the cache regions and keys consistently throughout the solution. Furthermore, I have key-specific configurations for expiration in my caching system.
One example in Unity's Developer Guide checks the method name - this is a bad approach, since it would mean altering the implementation every time a new class/method must use the cache (obviously).
I have this (mad) idea of implementing a configurable interceptor that learns how to compose the region and key from the given parameters and is configurable per class (type)/method. However, this would push a lot of responsibility into configuration; I don't like the feeling that I'm programming in the *.config file.
As you can see, I'm a tad lost on how to go about this. I don't like singletons, and right now the caching system is a singleton accessed throughout the solution. Can someone link me to good documentation on how I should proceed? Is it possible to add caching through interception and still have proper keys/regions defined on the cache?
A quick search on a similar matter led me to the "Attribute Based Cache using Unity Interception" project on CodePlex. The entire project looks to be abandoned at some alpha stage; however, it should provide you with a baseline to start from.
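To illustrate the key-composition idea without hard-coding method names, here is a minimal sketch of a Unity interception behavior. The behavior type itself (CachingBehavior), the use of MemoryCache.Default as the store and the GetExpirationFor hook are all illustrative assumptions; only IInterceptionBehavior and IMethodInvocation come from Unity:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.Caching;
using Microsoft.Practices.Unity.InterceptionExtension;

// Builds the cache key from the declaring type, method name and arguments,
// so new classes/methods can be cached without touching this code.
public class CachingBehavior : IInterceptionBehavior
{
    public bool WillExecute { get { return true; } }

    public IEnumerable<Type> GetRequiredInterfaces() { return Type.EmptyTypes; }

    public IMethodReturn Invoke(IMethodInvocation input, GetNextInterceptionBehaviorDelegate getNext)
    {
        // e.g. "MyApp.Services.ProductService.GetById(42)"
        string key = string.Format("{0}.{1}({2})",
            input.MethodBase.DeclaringType.FullName,
            input.MethodBase.Name,
            string.Join(",", input.Inputs.Cast<object>()));

        object cached = MemoryCache.Default.Get(key);
        if (cached != null)
            return input.CreateMethodReturn(cached);   // cache hit: skip the real call

        IMethodReturn result = getNext()(input, getNext);
        if (result.Exception == null && result.ReturnValue != null)
            MemoryCache.Default.Set(key, result.ReturnValue,
                DateTimeOffset.Now.Add(GetExpirationFor(input)));

        return result;
    }

    // Hypothetical hook: look up per-type/method expiration (and region) from
    // your own configuration instead of hard-coding it here.
    private TimeSpan GetExpirationFor(IMethodInvocation input)
    {
        return TimeSpan.FromMinutes(5);
    }
}

The same key-building trick works if the store is the Enterprise Library cache or anything else behind an abstraction; per-method regions and expirations can then live in one configuration section rather than in the interceptor itself.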

Sitecore with DMS vs caching server - how do you handle it?

We're planning to introduce DMS to our customer's Sitecore installation. It's a rather popular site in our country, and we have to use a caching proxy server (Nginx in this case) to make it high-traffic-proof.
However, as far as we know, it's not possible to use all the DMS features with a caching proxy enabled - content personalization, for example: if a page gets cached, it won't be personalized.
Is there a way to make use of all the DMS features with the proxy cache turned on? If not, how do you handle this problem for high-traffic sites - by buying more Content Delivery servers to carry the load, or by extending the current server with better hardware (RAM, CPU, bandwidth)?
You might try moving away from your proxy caching for some pages, or even all.
- There's no reason not to use a CDN for static assets and media library assets, so stick with that.
- Leverage Sitecore's built-in HTML cache for sublayouts/renderings - there are quite a few caching options.
- Use Sitecore's debug feature to track down the slowest components on your site.
- Consider using indexes instead of "fast" or Sitecore queries.
- Don't do a descendants query ("//*"). I often see this when calculating the selected state for navigation - hint: go the other way and calculate the ancestors of the current page (quick sketch below).
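On that last point, a quick hypothetical helper shows the idea - ask whether the navigation item is an ancestor of the page being rendered instead of querying all descendants of the navigation item. It relies on Sitecore.Context.Item and Item.Axes.IsAncestorOf, which as far as I recall are available in the Sitecore versions of that era:

using Sitecore.Data.Items;

// Selected-state check for a navigation item without a "//*" query.
public static class NavHelper
{
    public static bool IsSelected(Item navItem)
    {
        Item current = Sitecore.Context.Item;   // the page being rendered
        return current != null
            && (current.ID == navItem.ID || navItem.Axes.IsAncestorOf(current));
    }
}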
@jammykam wrote an excellent answer on this over here.
John West wrote a great blog post on this too, though it's a bit older.
Good luck!
I've been wondering about this myself.
I have been thinking of implementing an ajax web service that:
- talks to the DMS and returns JSON
- allows you to render the personalised components client side
- allows you to trigger analytics events
I have been googling around and I haven't found anyone who has done it and published the information yet. The only place I have found something similar is actually in the mobile SDK, but I haven't had a chance to delve into it yet.
I have also not been able to use proxy server caching and DMS together successfully. For extremely high loads, I have recommended that clients follow the standard optimization and scaling guidelines, especially architecting for proper Sitecore sublayout and layout caching across as much of the site as possible. With that caching done, follow it up by distributing across multiple Content Delivery nodes with load balancing to support high volume and personalization at the same time.
I've heard that other CMSs with personalization use a JavaScript approach to load the personalized content client-side, but I would be worried about losing track of the analytics data that is gathered when personalized content is loaded and interacted with.

System.Web.Caching vs. Enterprise Library Caching Block

For a .NET component that will be used in both web applications and rich client applications, there seem to be two obvious options for caching: System.Web.Caching or the Ent. Lib. Caching Block.
What do you use?
Why?
System.Web.Caching
Is this safe to use outside of web apps? I've seen mixed information, but I think the answer is maybe-kind-of-not-really.
- a KB article warning against non-web-app use in 1.0 and 1.1
- The 2.0 page has a comment that indicates it's OK: http://msdn.microsoft.com/en-us/library/system.web.caching.cache(VS.80).aspx
- Scott Hanselman is creeped out by the notion
- The 3.5 page includes a warning against such use
- Rob Howard encouraged use outside of web apps
I don't expect to use one of its highlights, SqlCacheDependency, but the addition of CacheItemUpdateCallback in .NET 3.5 seems like a Really Good Thing.
Enterprise Library Caching Application Block
- other blocks are already in use, so the dependency already exists
- cache persistence isn't necessary; regenerating the cache on restart is OK
- Some cache items should always be available but should be refreshed periodically. For these items, getting a callback only after an item has been removed is not very convenient: it looks like a client will just have to sleep and poll until the cache item is repopulated. (The sketch below shows how the .NET 3.5 callback handles this case.)
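For reference, here is roughly how the .NET 3.5 CacheItemUpdateCallback mentioned above handles that refresh scenario - the callback runs before the item is removed, so you can hand back a fresh value instead of sleeping and polling. A sketch only; LoadReportData stands in for whatever expensive lookup populates the item:

using System;
using System.Web;
using System.Web.Caching;

public static class AlwaysAvailableCache
{
    public static void Insert(string key)
    {
        HttpRuntime.Cache.Insert(
            key,
            LoadReportData(),
            null,                               // no dependencies
            DateTime.UtcNow.AddMinutes(10),     // refresh every 10 minutes
            Cache.NoSlidingExpiration,
            OnCacheItemUpdate);
    }

    private static void OnCacheItemUpdate(string key, CacheItemUpdateReason reason,
        out object expensiveObject, out CacheDependency dependency,
        out DateTime absoluteExpiration, out TimeSpan slidingExpiration)
    {
        // Hand back a refreshed value; returning null would let the item drop out.
        expensiveObject = LoadReportData();
        dependency = null;
        absoluteExpiration = DateTime.UtcNow.AddMinutes(10);
        slidingExpiration = Cache.NoSlidingExpiration;
    }

    private static object LoadReportData()
    {
        return "expensive result";              // stand-in for the real lookup
    }
}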
Memcached for Win32 + .NET client
What are the pros and cons when you don't need a distributed cache?
These are the items that I consider for the topic of Caching:
- MemCached Win32
- Velocity
- .NET Cache
- Enterprise Library Caching Application Block
MemCached Win32: Up until recently I have used MemCached Win32. This is akin to a web farm (many servers serving the same content for high availability), but it is a cache farm. This means that you can install it locally on your web server initially if you don't have the resources to go bigger. Then, as you go down the road, you can scale horizontally (more servers) or vertically (more hardware). This is a product that was ported from the original MemCached to work on Windows, and it has been used extensively on very high traffic sites. http://lineofthought.com/tools/memcached
Velocity: This is Microsoft's answer to products such as MemCached. MemCached has been out for quite some time; Velocity is in CTP mode. I must say that from what I have read so far, this product will certainly turn my head once it is out. But I can't bring myself to run big production projects on a CTP product with zero track record. I have started playing with it, though, as once it gains momentum MemCached won't even compare for those locked into the Windows world! http://blogs.msdn.com/velocity/
.NET Cache: There is no reason to discount the standard .NET Cache. It is built in, ready to use for free, and requires no (major) setup. It offers flexibility by providing mechanisms for storing items in local memory, a SINGLE state server, or a centralized database. Where Velocity steps in is when you need more than a single state server (cache in memory) and don't want to use a slow database for holding your cache.
Enterprise Application Block: I stay away from all of the Enterprise Application Blocks. They are heavy frameworks that give more than I generally require! As long as you remember to wrap everything that touches code that is not your own and follow simple coding rules, stick with any of the other methods over this one! (Just my opinion, of course - MySpace leverages as much as they can out of the Enterprise Application Blocks!)
You don't have to choose up front! I generally create a cache wrapper that I communicate with in my code, with methods such as Get, Set, Exists, Remove, ListKeys, etc. This then points to an underlying cache abstraction that can target MemCached, Velocity, or the .NET cache. I use StructureMap (or choose another IoC container) to inject which form of cache I want to use for a given environment. On my local dev box I might use the .NET cache in the session; in production I generally use MemCached Win32. Regardless of how it is set up, you can easily swap things around to try each system and see what works best for you. You just need to make sure that your application knows as little as possible about how things are cached! Once this layer of abstraction is in place, you can then do things such as run a compression algorithm (gzip) on all the data going in and out of the cache, which would allow you to store ten times the amount of data - transparently.
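The wrapper described here might look something like the sketch below. The interface and the AspNetCacheProvider name are illustrative, not from a particular library; memcached- or Velocity-backed versions would implement the same interface and be chosen via the IoC container:

using System;
using System.Collections;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

// The abstraction the application codes against.
public interface ICacheProvider
{
    object Get(string key);
    void Set(string key, object value, TimeSpan slidingExpiration);
    bool Exists(string key);
    void Remove(string key);
    IEnumerable<string> ListKeys();
}

// One implementation backed by the built-in ASP.NET cache.
public class AspNetCacheProvider : ICacheProvider
{
    private static Cache Store { get { return HttpRuntime.Cache; } }

    public object Get(string key) { return Store[key]; }

    public void Set(string key, object value, TimeSpan slidingExpiration)
    {
        Store.Insert(key, value, null, Cache.NoAbsoluteExpiration, slidingExpiration);
    }

    public bool Exists(string key) { return Store[key] != null; }

    public void Remove(string key) { Store.Remove(key); }

    public IEnumerable<string> ListKeys()
    {
        foreach (DictionaryEntry entry in Store)
            yield return (string)entry.Key;
    }
}

Swapping in a different backend is then just a container registration change, and cross-cutting tricks like the gzip compression mentioned above can live inside one implementation without the rest of the application noticing.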
I cover .NET Cache, MemCached Win32, StructureMap, and the appropriate abstractions in my book if you are interested!
ASP.NET 3.5 Social Networking (http://www.amazon.com/ASP-NET-3-5-Social-Networking-Enterprise-ready/dp/1847194788/ref=sr_1_1?ie=UTF8&s=books&qid=1225408005&sr=8-1 )
Andrew Siemer www.andrewsiemer.com blog.andrewsiemer.com www.socialnetworkingin.net
Update
Changed the link that lists sites using memcached. Thank you David for noticing that it was broken!
Bear in mind that the EntLib documentation specifically steers you towards the ASP.NET cache for ASP.NET applications. That's probably the strongest recommendation towards using it here. Plus the EntLib cache doesn't have dependencies, which for me is a big reason not to use it.
I don't think there's a technical limitation as such on shipping System.Web as part of your app, though it's slightly odd that they've put that notice on the .NET 3.5 page. Hanselman actually says he started out being creeped out by this notion, but became convinced. Also, if you read the comments, he says that the block has too many moving parts and the ASP.NET Cache is much more lightweight.
I think this is exactly the kind of problem that Velocity is going to solve, but that's only a preview for now :-(
I'd say use Web.Caching and see how you get on. If you put some kind of abstraction layer over the top of it, you've always got the option to swap it out for the EntLib block later on if you find problems.
Take a look at memcached. It is a really cool, fast and lightweight distributed caching system. There are APIs for several of the most popular languages, including C#. It may not serve well on the client side (unless of course the client is obtaining the cached data from a server of some kind), but if you abstract your usage of memcached to a specific interface, you could then implement the interface with another caching system.
@Davide Vosti
"If they put it in the web namespace, I think it's for a good reason."
Does that same logic apply to the Concurrency and Coordination Runtime (CCR) in Robotics Studio? No? Didn't think so.
