Caching with Cedar/Memcache/Rails 3.1 - Heroku

So I have been trying to get caching to work on my private website, which basically serves static content with only a little dynamic content. Since I am going to deploy to Heroku Cedar, I have to go with Memcache instead of Varnish. However, I do not seem to be able to make it cache.
I always get cache: stale, invalid, store in the logs.
The way I cache is to set the appropriate config params and use the built-in caches_page method of ActionController. Can someone help me debug this?
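For reference, a typical Rails 3.1 setup along those lines might look roughly like the sketch below (assuming the dalli gem; the poster's exact settings are not shown in the question):

# config/environments/production.rb -- rough sketch of the caching config
config.cache_store = :dalli_store                  # back the Rails cache with memcache
config.action_controller.perform_caching = true    # enable controller-level caching

# app/controllers/pages_controller.rb
class PagesController < ApplicationController
  caches_page :index, :about   # the built-in page-caching helper is caches_page
end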

I see the same thing, but it looks like page caching works.

So I decided to use Cloudflare for caching; after a couple of days of running, I saw a significant boost in performance and caching.

Related

Varnish plus Cloudflare

I found this article while searching for any VCL tweaks that need to be made when using Varnish and Cloudflare together. I have not yet included the tweaks suggested by Cloudflare below. Can anyone say whether this is really needed?
Cloudflare Varnish VCL configuration
It's not a must-do, but it may be a good idea to strip unneeded cookies, so that you can cache as much as you can.
Perhaps try with and without the snippet and check which behaviour works best for your use case.

CloudFlare permanently caches page even though cookies are set

I had this problem where CloudFlare wouldn't cache any of my pages because all of my pages returned session cookies. I fixed this with a method I wrote myself, which removes unnecessary cookies from my response header. It's based on the same idea used and described here: https://github.com/HaiFangHui/sessionmonster.
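Below is a minimal sketch of that kind of cookie stripping, written as a Rack middleware in Ruby; the class name, the path pattern, and the choice of paths are assumptions for illustration, not the sessionmonster code itself.

# Drop the Set-Cookie header from responses that should be cacheable,
# so CloudFlare never sees a cookie on them and stays willing to cache.
class StripSessionCookie
  CACHEABLE = %r{\A/(assets|images)/}   # hypothetical paths that are safe to cache

  def initialize(app)
    @app = app
  end

  def call(env)
    status, headers, body = @app.call(env)
    headers.delete('Set-Cookie') if env['PATH_INFO'] =~ CACHEABLE
    [status, headers, body]
  end
end

# config.ru (usage sketch): use StripSessionCookie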
Currently I'm in a situation which is driving me bananas, and I was hoping someone could help me out a little bit with their expertise on this subject.
The problem is that after I log in to my site, if CloudFlare has had a chance to cache the page in a previous request, it will keep serving that cached copy until the Edge TTL expires.
Since the official CloudFlare documentation states it will not cache a page if it contains cookies, I was hoping that after a successful login attempt it would serve a live/personalized version of the page, but it seems that is not the case.
Does somebody know if this is normal? Of course I'm interested in knowing a way to circumvent this. I'm wondering how other sites solve this problem; I assume I wouldn't be the first one to have this issue.
Any advice regarding this subject would be greatly appreciated.
So it seems like supplying a "Bypass cache" setting is the solution.
It's only available on a paid plan.
More info: https://blog.cloudflare.com/caching-anonymous-page-views/

How can I FORCE my web browser to cache images for testing purposes?

I've got a bug report from the field that essentially boils down to image caching. Namely, an image from the same URL is getting cached and it's causing confusion because the image itself is supposed to change.
My fix is to do this bit here, which I'm certain will work.
However, I can't reproduce this. I would prefer not to use the methods I've seen here because they require code modification, and I'd rather test this on the code as it exists now before I test a fix.
Is there any way in a browser like IE to force it to cache like mad? Just temporarily, of course.
You can use Fiddler to force things to cache or not to cache; just use the Filters tab and add a caching header like
Cache-Control: public,max-age=3600
You can have the customer use www.fiddlercap.com to collect a traffic capture so you can see exactly what they see.
You should also understand that the proper way to control caching is by setting HTTP headers rather than forcing the browser to guess: http://blogs.msdn.com/b/ie/archive/2010/07/14/caching-improvements-in-internet-explorer-9.aspx
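As a rough illustration of that point, the sketch below sets an explicit Cache-Control header on an image response in a bare Rack app (the route, file path, and max-age are arbitrary assumptions):

# config.ru -- serve one image with an explicit Cache-Control header so the
# browser caches it deterministically instead of guessing.
run lambda { |env|
  if env['PATH_INFO'] == '/avatar.png'               # hypothetical image route
    [200,
     { 'Content-Type'  => 'image/png',
       'Cache-Control' => 'public, max-age=3600' },   # cache for one hour
     [File.binread('public/avatar.png')]]             # assumed file location
  else
    [404, { 'Content-Type' => 'text/plain' }, ['not found']]
  end
}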

Use MediaWiki's internal cache to save bad login attempts

I'm working on a MediaWiki plugin that adds a certain captcha if users have more than three bad login attempts. I'm basing this on the existing ConfirmEdit plugin, but for some reason the way they store bad login attempts doesn't seem to work for me. After checking the code they use, it seems they're using the global variable $wgMemc, which in my case appears to be an instance of FakeMemCachedClient. This is a fake memcache that just returns true on everything without ever saving anything.
I'm trying to find out how to implement another way to internally keep track of the number of bad logins, preferably without having to consult the database for this.
The only thing I could come up with that avoids the cache entirely is POSTing the number of bad logins, but this could easily be modified by a smart user/bot...
Anyone have any ideas?
Ideally, $wgMemc should be an instance of MemcachedPhpBagOStuff. It is actually an interface to memcached; see the MediaWiki page about it for more information and usage in MediaWiki.
For this use case it would be a great choice. However, since it seems you don't have memcached set up, the only viable alternative is probably using the database.
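Purely to illustrate the counting idea (written in Ruby with the Dalli memcached client; MediaWiki's $wgMemc/BagOStuff API is different), a bad-login counter with a time window might look like this sketch:

require 'dalli'

cache = Dalli::Client.new('localhost:11211')   # assumes a local memcached instance

def record_bad_login(cache, username, window = 15 * 60)
  key      = "badlogins:#{username}"           # hypothetical key scheme
  attempts = (cache.get(key) || 0) + 1
  cache.set(key, attempts, window)             # the counter expires after the window
  attempts
end

# Show the captcha once there have been more than three bad attempts:
show_captcha = record_bad_login(cache, 'alice') > 3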

Clear all website cache?

Is it possible to clear all site cache? I would like to do this when the user logs out or the session expires instead of instructing the browser not to cache on each request.
As far as I know, there is no way to instruct the browser to clear all the pages it has cached for your site. The only control that you, as a website author, have over caching of a page occurs when the browser tries to access that page. You can specify that cached versions of your pages should expire at a certain time using the Expires header, but even then the browser won't actually clear the page from its cache at that time.
I certainly hope not; that would give the web site destructive powers over the client machine!
If security is your main concern here, why not use HTTPS? Browsers don't cache content received via HTTPS (or cache it only in memory).
One tricky way to mimic this would be to include the session id as a parameter when referencing any static piece of content on the site. When the user establishes the session, the browser will recognize all the pieces of content as new due to the inclusion of this parameter. For the duration of the session the browser will use the static content in its cache. After the user logs out and logs back in again, the session-id parameter for the static content will be different, so the browser will recognize this as completely new content and will download everything again.
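A sketch of that idea in Ruby (the helper name and parameters are illustrative, not from the original post):

# Append the current session id to static asset URLs so each session's assets
# get their own cache entries in the browser.
def asset_url(path, session_id)
  "#{path}?sid=#{session_id}"
end

asset_url('/css/site.css', 'ab12cd34')   # => "/css/site.css?sid=ab12cd34"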
That being said, this is a hack and I wouldn't recommend pursuing it. For what reason do you want the user's cache to be cleared after their session expires? There's probably a better solution that fits your situation, as opposed to what you are currently asking for.
If you are talking about ASP.NET cache objects, you can use this:
' Snapshot the keys first so the cache is not modified while it is being enumerated
Dim keys As New List(Of String)
For Each elem As DictionaryEntry In Cache
    keys.Add(CStr(elem.Key))
Next
keys.ForEach(Sub(key) Cache.Remove(key))
to remove items from the cache, but that may not be the full extent of what you are trying to accomplish.
