I just learned about and implemented a Varnish reverse proxy to increase my website's speed.
Everything works fine but something minor bothers me.
For some reason, when I check the page's TTFB for the first time I get 0.999s; however, when I rerun the test the number drops to 0.237s.
I use the following website to check TTFB:
https://www.webpagetest.org
and my website is:
https://www.findfestival.com/
It makes me wonder whether the first request to the website hits the cache at all. When I use curl I can see X-Varnish, but it's still strange that clicking on links the first time is slower than clicking on them the second time (specifically on mobile).
Can you please help me understand why the Varnish cache doesn't hit the first time?
This is my default.vcl:
Thanks,
P.S. I have seen this post and already tried the solution, with no luck:
Varnish Cache first time hit
Seeing that you have X-Mod-Pagespeed in your headers and a minimalistic VCL, the conclusion is that you need to take a look at Downstream Caching and make sure that PageSpeed does not send Cache-Control: max-age=0, no-cache, which breaks Varnish caching for the most part.
In my own experience, PageSpeed does not play well with Varnish even with a downstream caching configuration applied.
It "loves" to send the aforementioned header no matter what. Even if you manage to turn this behaviour off, it results in PageSpeed's own assets not having proper Cache-Control headers, plus a few more interesting issues, like causing Varnish "hit-for-pass" when rebeaconing has to take place - which is really bad and breaks caching further.
Also have a look at the possible configurations. You might want to put PageSpeed at your SSL terminator level (option #1) - that way you don't need the Downstream Cache configuration and PageSpeed will be "in front" of Varnish.
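If you can't tame PageSpeed itself, a rough workaround on the Varnish side is to overrule the header it sends. Below is a minimal sketch in Varnish 4 VCL; the X-Mod-Pagespeed check and the 120-second TTL are assumptions you would need to adapt:

sub vcl_backend_response {
    # Assumption: responses rewritten by mod_pagespeed carry an
    # X-Mod-Pagespeed header. Drop its restrictive Cache-Control so
    # the built-in VCL doesn't refuse to cache, and give the object
    # an explicit TTL inside Varnish instead.
    if (beresp.http.X-Mod-Pagespeed) {
        unset beresp.http.Cache-Control;
        set beresp.ttl = 120s;
    }
}

Keep in mind this also removes Cache-Control from what browsers receive, so you may want to set a client-facing Cache-Control again in vcl_deliver.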
Related
We are hosting a major tennis tournament website and are trying to use Varnish on Rackspace to help with the traffic we anticipate. We have hired two systems consultants to help install Varnish on our cloud servers, but for whatever reason they are not able to get Varnish to work with our scripts. A typical script can be found here:
162.242.140.232/scoring/DemoGetOOP.php
There is nothing special about this script. It doesn't have any special caching commands in the headers and doesn't use session control. You can see by the date/time at the bottom, which we added for testing purposes, that the page is not being cached. We set up a timer page which is cached:
162.242.140.232/scoring/timer.php
and also an info.php page at:
162.242.140.232/scoring/info.php
What's odd is that if you first go to timer.php, you can see it's cached for 10 seconds. However, if you then run our DemoGetOOP.php script and go back to timer.php, it's no longer cached. We have to clear the cache again or open a private browser window to see the caching.
if (req.url ~ "^/scoring/DemoGetOOP.php") and
if (req.url ~ "/scoring/DemoGetOOP.php")
Any help would be greatly appreciated!
First of all, I would start by setting correct cache headers; I would prefer the Cache-Control header. The DemoGetOOP script also sends a cookie, which causes Varnish to pass instead of caching.
I would suggest checking varnishlog, which will give you clear insight into why Varnish decides to cache or not.
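If the cookie turns out not to be needed to build the DemoGetOOP.php response, a rough sketch in Varnish 4 VCL that keeps this one script cacheable despite it (the URL match and the 10-second TTL are assumptions to adapt):

sub vcl_recv {
    # Drop the client cookie for this one script so the built-in VCL
    # doesn't turn the request into a pass.
    if (req.url ~ "^/scoring/DemoGetOOP\.php") {
        unset req.http.Cookie;
    }
}

sub vcl_backend_response {
    # A Set-Cookie from the backend also makes the object uncacheable
    # by default, so drop it for this URL and set an explicit TTL.
    if (bereq.url ~ "^/scoring/DemoGetOOP\.php") {
        unset beresp.http.Set-Cookie;
        set beresp.ttl = 10s;  # example TTL
    }
}

In varnishlog, something like varnishlog -q 'ReqURL ~ "DemoGetOOP"' (Varnish 4 syntax) will show whether the request ends up in a lookup/hit or a pass.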
They seem to be working fine to me; the first link has a TTL of 120 seconds and the second link has a TTL of 10 seconds, and both are working just fine.
I'd say always double-check the cookies when things seem to not be working.
I've been trying to get my head around the whole issue of browser history vs. caching and RFC 2616 section 13.13.
Does this section of the RFC mean that if a user goes "Back" in the browser, for example, it should always display the page from its local storage, ignoring any cache directives, unless the user has configured it otherwise?
So browsers that reload the page when navigating the history, even if caching directives are instructing them to do so, are not complying with the specification? And the spec is saying this is bad because "this will tend to force service authors to avoid using HTTP expiration controls and cache controls when they would otherwise like to."
Also, even though a directive may instruct the browser not to cache, e.g. using Cache-Control: no-store, it can/should still store the page in its history cache?
From what I've read, it seems that most browsers violate the standard, apart from Opera. Is this because the security concerns around the re-display of pages with sensitive data from history are seen as more important than the issue the standard talks about?
I'd be grateful if anyone is able to shed some light on or clarify this area. Thanks.
History and cache are completely separate. We're trying to clarify this in httpbis; see https://svn.tools.ietf.org/svn/wg/httpbis/draft-ietf-httpbis/latest/p6-cache.html#history.lists
I've got a bug report from the field that essentially boils down to image caching. Namely, an image from the same URL is getting cached and it's causing confusion because the image itself is supposed to change.
My fix is to do this bit here, which I'm certain will work.
However, I can't freaking reproduce this. I would prefer not to use the methods I've seen here because they require code modification, and I'd rather test this on the code as it exists now before I test a fix.
Is there any way in a browser like IE to force it to cache like mad? Just temporarily, of course.
You can use Fiddler to force things to cache or not to cache; just use the Filters tab and add a caching header like
Cache-Control: public,max-age=3600
You can have the customer use www.fiddlercap.com to collect a traffic capture so you can see exactly what they see.
You should also understand that the proper way to control caching is by setting HTTP headers rather than forcing the browser to guess: http://blogs.msdn.com/b/ie/archive/2010/07/14/caching-improvements-in-internet-explorer-9.aspx
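For reference, "setting HTTP headers" here means having the server send explicit caching instructions on the image response, along these lines (the values are only placeholders):

Cache-Control: public, max-age=3600
Last-Modified: <the time the image content actually changed>
ETag: "image-v2"

With an explicit max-age the browser can reuse the image for an hour without asking, and the validators (Last-Modified/ETag) let it revalidate cheaply with a 304 afterwards; when an image really must change, serving it from a new URL is the most reliable way to bypass any cached copy.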
Well, I'm trying to optimize my application and am currently using Page Speed for this. One of the strongest recommendations was that I needed to leverage browser caching. The report sent me to this page:
http://code.google.com/intl/pt-BR/speed/page-speed/docs/caching.html#LeverageBrowserCaching
In this page there is this quote:
If the Last-Modified date is sufficiently far enough in the past, chances are the browser won't refetch it.
My point is: it doesn't matter what value I set for the Last-Modified header (I tried 10 years in the past); when I access and reload my application (always clearing the browser's recent history), I get status 200 for the first access and 304 for the remaining ones.
Is there any way I can get the behavior described in the Google documentation? I mean, so the browser doesn't try to fetch the static resources from my site at all?
You might have better success using the Expires header (also listed on that Google doc link).
Also keep in mind that all of these caching-related headers are hints or suggestions for browsers to follow. Different browsers can behave differently.
The method of testing is a good example. In your case you mentioned getting status 304 for the remaining requests, but are you getting those by doing a manual browser refresh? Browsers will usually revalidate with the server in that case.
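For a static resource, that boils down to sending an explicit freshness lifetime instead of relying only on a validator; a minimal example (the values are placeholders to adapt):

Cache-Control: public, max-age=31536000
Expires: Thu, 01 Jan 2026 00:00:00 GMT

With headers like these, a normal navigation should reuse the local copy without sending any request at all. A manual reload, on the other hand, will still send a conditional request, so the 304s you see while testing with refresh don't necessarily mean the caching is broken.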
It seems that IE6 ignores any form of cache invalidation sent via HTTP headers. I've tried setting Pragma to no-cache and setting the cache expiration to the current time, yet in IE6, hitting Back will always pull up a cached version of the page I am working on.
Is there a specific HTTP header that IE6 does listen to?
Cache-Control: private, max-age=0 should fix it. From classic ASP this is done with Response.Expires=-1.
Keep in mind when testing that just because your server is serving pages with caching turned off doesn't mean that the browser will obey that when it has an old cached page that it was told was okay to cache. Clear the cache or use F5 to force that page to be reloaded.
Also, for those cases where the server is serving cached content, you can use Ctrl+F5 to signal the server not to serve it from cache.
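If a single header doesn't do it, the usual belt-and-braces combination for old IE (treat the exact set as a suggestion rather than a guarantee) is:

Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: -1

Expires: -1 isn't a valid date, which is the point: an invalid Expires value must be treated as already expired, and it's also the effect of Response.Expires=-1 in classic ASP (an Expires date that is already in the past).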
You must be careful. If you are using AJAX via XMLHttpRequest (XHR), cache "recommendations" set in the header are not respected by IE6.
The fix is to append a random number to the URL query string used in AJAX requests. For example:
http://test.com?nonce=0123
A good generator for this is the UTC() function, which returns a timestamp unique to the user's browser... that is, unless they mess with their system clock.
Have you tried setting an ETag in the header? They're a pretty reliable way to indicate that content has changed (see the W3C spec and Wikipedia).
Beyond that, a slightly cruder way is to append a random query string parameter to the request, such as the current Unix timestamp. As I said, crude, but then IE6 is not the most subtle of beasts.
See this question: How to control web page caching, across all browsers? (formerly "Making sure a webpage is not cached, across all browsers"). I think this should help with your problem too.
Content with "Content-Encoding: gzip" Is Always Cached Although You Use "Cache-Control: no-cache"
http://support.microsoft.com/kb/321722
You could also disable gzip just for IE6
A little note: from experience I know that IE6 will load JavaScript from the cache even when forced to reload the page via Ctrl+F5. So if you are working on JavaScript, always empty the cache.
The IE web developer toolbar can help immensely with this. There's a button for clearing the cache.