How does Varnish deal with dynamic content?

I am studying up on caching and looking into Varnish. I am wondering, though: how does Varnish deal with dynamically generated content?
All over the place, people say you shouldn't really cache content that might change a lot, but on the other hand, when I look at the response headers for Stack Overflow, I see pages being served up via Varnish.
Content here changes by the second, so how does this even work? Excuse me if it's a bit of a simple question; I will research some more while this question is up.

You need to define "dynamic":
- If the content depends on the user (through cookies, for example), it should not be cached: you would end up with lots of different variants, and your hit/miss ratio would be low, since every user gets different content.
- If the content changes over time, you can always cache it a little, for example for a few seconds.
- If the content changes over time, a better option is to separate the "static content" from the dynamic content. You can cache the page template and make Ajax calls to refresh the content. You can also use ESI; it's an old technology, but it lets you specify different "zones" in your pages, each with its own cache duration.
- You can benefit from IMS (If-Modified-Since) requests: telling the backend to send the response body only if it has changed since the last request can save you a lot of processing time. I think Varnish does this from version 4; a sketch of such a backend follows below.
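As a rough illustration of the IMS point, here is a minimal Node/TypeScript sketch of a backend that honors If-Modified-Since (the route and the loadArticle lookup are hypothetical):

import http from "node:http";

// Hypothetical lookup: returns the content and its last modification time.
function loadArticle(): { body: string; lastModified: Date } {
  return { body: "<h1>Hello</h1>", lastModified: new Date("2024-01-01T00:00:00Z") };
}

http.createServer((req, res) => {
  const { body, lastModified } = loadArticle();
  const ims = req.headers["if-modified-since"];

  // If the cache's copy is still current, skip sending the body entirely.
  if (typeof ims === "string" && new Date(ims) >= lastModified) {
    res.writeHead(304);
    res.end();
    return;
  }

  res.writeHead(200, {
    "Content-Type": "text/html",
    "Last-Modified": lastModified.toUTCString(),
  });
  res.end(body);
}).listen(8080);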
As for the Stack Overflow architecture, you can learn a lot by reading Nick Craver's blog post about it: http://nickcraver.com/blog/2016/02/17/stack-overflow-the-architecture-2016-edition/

Related

Caching snippets (MODX)

I was simply cruising through the MODX options and I noticed the option to cache snippets. I was wondering what kind of effect (downsides) this would have on my site. I know that caching improves the loading time of the site by keeping pages 'cached' after the first request and then only reloading what has been updated, but this all seems too good to be true. My question is simple: are there any downsides to caching snippets? Cheers, Marco.
Great question!
The first rule of MODX is (almost) always cache. They've said so in their own blog.
As you said, the loading time will be lower. Let's get the basics down first. When you choose to cache a page, the page with all its output is stored as a file in your cache folder. If you have a small and simple site, you might not see much difference between caching and not caching, but if you have a complex one with lots of chunks-in-chunks, snippets parsing chunks, etc., the difference is enormous. Some of the websites I've made go down 15-30 levels to parse the content in some sections. Loading all of this fresh from the database can take up to a couple of seconds, while loading a flat file takes only a few microseconds. There is a HUGE difference (remember that).
Now, you can cache both snippets and chunks; that's important to remember. You can also cache one chunk while leaving the next level uncached. Using MODX's brilliant markup, you can choose what to cache and what to leave uncached, but in general you want as much as possible cached.
You ask about the downsides. There are none, but there are a few cases where you can't use cached snippets/chunks. As mentioned earlier, the cache is stored per page, so if you have a page (or URL, or whatever you want to call it) that displays different content based on, for example, GET parameters, you can't cache it. You can't cache a search result (because the content changes with the query) or a page with pagination (?page=1, ?page=2, etc. would produce different output for the same page). Another case is when a snippet's output is random/different every time. Say you put a random quote in your header: this needs to be uncached, or you will just see the first random result every time. In all other cases, use caching.
Also remember that every time you save a change in the manager, the cache is wiped. That means that if you, for example, display the latest news articles on your front page, this can still be cached, because the content will not change until you add/edit a resource, and at that point the cache is cleared.
To sum it all up: caching is GREAT and you should use it as much as possible. I usually make all my snippets/chunks cached, and if I run into problems, that is the first thing I check.
Using caching makes your webserver respond quicker (good for the user) and produces fewer queries to the database (good for you). All in all. Caching is a gift. Use it.
There are no downsides to caching, and honestly I wonder what made you think there were?
You should always cache everything you can. There's no point in having something executed on every page load when the result is exactly the same as before. By caching the output and the source, you bypass the processing time and improve performance.
Assuming MODX Revolution (2.x), all template tags you use can be called both cached and uncached.
Cached:
[[*pagetitle]]
[[snippet]]
[[$chunk]]
[[+placeholder]]
[[%lexicon]]
Uncached:
[[!*pagetitle]] - this is pointless
[[!snippet]]
[[!$chunk]]
[[!+placeholder]]
[[!%lexicon]]
In MODX Evolution (1.x) the tags are different and you don't have as much control.
Some time ago I wrote about caching in MODX Revolution on my blog and I strongly encourage you to check it out as it provides more insight into why and how to use caching effectively: https://www.markhamstra.com/modx/2011/10/caching-guidelines-for-modx-revolution/
(PS: If you have MODX specific questions, I'd suggest posting them on forums.modx.com - there's a larger MODX audience there that can help)

When is it better to generate a static page or dynamically generate?

The title pretty much sums up my question.
When is it more efficient to generate a static page that a user can access, as opposed to using dynamically generated pages that query a database? That is, in what situations would one be better than the other?
To serve up a static page, your web server just needs to read the page off the disk and send it. Virtually no processing will be required. If the page is frequently accessed, it will probably be cached in memory, so even the disk access will not be needed.
Generating pages dynamically obviously has more overhead. There is a cost for every DB access you make, no matter how simple the query is. (On a project I worked on recently, I measured a minimum overhead of 0.7ms for each query, even for SELECT 1;) So if you can just generate a static page and save it to disk, page accesses will be faster. How much faster? It just depends on how much work is being done to generate the page dynamically. We don't know what you are doing, so we can't comment on that.
Now, if you generate a static page and save it to disk, that means you need to re-generate it every time the data which went into generating that page changes. If the data changes more often than the page is actually accessed, you could be doing more work rather than less! But in most cases, that's a very unlikely situation.
More likely, the biggest problem you will experience from generating static pages and saving them to disk is coding (and maintaining) the logic for re-generating the pages whenever necessary. You will need to keep track of exactly what data goes into each page, and in every place in the code where data can be changed, you will need to invoke re-generation of all the relevant pages. If you forget just one, then your users may be looking at stale data some of the time.
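A minimal sketch of that regeneration logic in TypeScript (the file paths and the render function are hypothetical; the point is that every write path must trigger it):

import { writeFile } from "node:fs/promises";

// Hypothetical renderer: builds the full HTML for an article page,
// querying the database and filling in the template.
async function renderArticlePage(articleId: number): Promise<string> {
  return `<html><body>Article ${articleId}</body></html>`;
}

// Regenerate the static file; the web server then serves it with no DB work.
async function regenerateArticlePage(articleId: number): Promise<void> {
  const html = await renderArticlePage(articleId);
  await writeFile(`public/articles/${articleId}.html`, html, "utf8");
}

// Every place in the code that changes article data must remember to call
// this -- forgetting one is exactly how users end up seeing stale pages.
async function updateArticle(articleId: number, newText: string): Promise<void> {
  // ...UPDATE the row in the database with newText...
  await regenerateArticlePage(articleId);
  // ...and regenerate any other page that embeds this article (lists, feeds, ...).
}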
If you mix dynamic generation per-request and caching generated pages on disk, then your code will be harder to read and maintain, because of mixing the two styles.
And you can't really cache generated pages on disk in certain situations, like responding to POST requests that come from a form submission. Or imagine that when your users invoke certain actions, you have to send a request to a third-party API, and the data that comes back from that API will be used in the page. What comes back from the API may be different each time, so in this case you need to generate the page dynamically each time.
Static pages (or better: resources) are filled with content that does not change, or at least not often, and that does not allow further queries: an About page, a Contact page, and so on.
In these cases it doesn't make any sense to regenerate the page on every request. On the other side, we have data (e.g. in a database) and want to give the user the opportunity to query it. In that case you give the user a page where they can specify the query, and you return a rendered page with the dynamically generated data.
In my opinion it depends on the result you want to present to the user: either it is plain information, or it is the possibility to query a data source. The first result is known before you do anything; the second (the query result) is only known once you have the query parameters, which means you can't know the result beforehand (it could be empty or invalid).
It also depends on your architecture, but when you consider that GET requests should be idempotent, it is also easy to cache dynamic pages with a proxy and invalidate the cache when something happens to the data displayed on the cached path. This can save a lot of time, because the system behaves as if the cached pages were static, except that instead of coming from the filesystem they come from memory, which is really fast.
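A minimal sketch of that pattern, assuming a Varnish-like proxy in front that honors s-maxage and accepts PURGE requests (the URLs and helper names are hypothetical):

import express from "express";

const app = express();

// Hypothetical data access; in reality this would be a database read.
async function loadArticle(id: number): Promise<string> {
  return `Article ${id}`;
}

// GET is idempotent, so the shared proxy may cache it; s-maxage applies
// to shared caches only, so browsers still revalidate normally.
app.get("/articles/:id", async (req, res) => {
  const body = await loadArticle(Number(req.params.id));
  res.set("Cache-Control", "public, s-maxage=300");
  res.send(`<html><body>${body}</body></html>`);
});

// When the underlying data changes, evict the cached path from the proxy
// (Varnish, for example, can be configured in VCL to accept PURGE).
async function onArticleUpdated(id: number): Promise<void> {
  await fetch(`http://cache.internal/articles/${id}`, { method: "PURGE" });
}

app.listen(8080);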
Cheers
Laidback

Varnish & ESIs: Fetching in parallel and possible workarounds

I'm investigating using Varnish with ESIs to cache page content for a high-traffic, forum-like website.
Context: I want to cache content for visitors only (logged-in users get a really different display and need absolutely fresh content). Still, for visitors, some parts of a page need to be dynamic:
- not cacheable, for example a visitor-dependent module (think of a 'suggestions' widget fed by real-time analysis of the pages viewed, thanks to a beacon)
- cacheable with a small TTL of 15 minutes, for example a 'latest posts' widget or fast-changing ad campaigns.
Currently we use Apache/PHP/symfony/memcache to cache pages, with an in-house ESI-like mechanism: a page from the cache is parsed and some specific tags are interpreted (including calls to web services and/or databases). This is not fast enough, since server time is then around 700 ms.
To replace this solution, we can use Varnish + ESIs. The total number of ESIs included in a page can reach 15. The real number of ESIs to fetch will be less than that, given the ESIs' TTLs, but not much less. The critical problem is that Varnish fetches the ESIs sequentially instead of in parallel, and this is not acceptable. This feature sits somewhere late on Varnish's roadmap.
So,
What is your experience with Varnish and ESIs? How many ESIs per page, and what response time gains have you seen?
Do you know of workarounds, or of other serious and configurable (VCL was nice) reverse proxies with parallel ESI fetching?
If not, what kind of caching strategy do you use for equivalent use cases?
Thanks,
P.
I currently work for a high-traffic site, and performance is everything for us. On several pages we use a lot (20+) of ESIs, for example on our search result list. The result list is a JSON response, and every result block in it is a separate ESI. Granted, we do cache warming, but we haven't run into any performance issues with this. The number of ESIs will only become a problem if the backend requests are really slow.
Parallel ESI fetching is on Varnish's feature request list, but I don't think it made it into version 4.1.
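If sequential fetching does become the bottleneck, one workaround at the application layer (not a Varnish feature; the fragment URLs below are hypothetical) is to fetch the cached fragments in parallel yourself and stitch the page together, so total latency is roughly the slowest fragment rather than the sum:

// Fetch all page fragments concurrently and stitch them together.
async function assemblePage(fragmentUrls: string[]): Promise<string> {
  const fragments = await Promise.all(
    fragmentUrls.map(async (url) => {
      const res = await fetch(url);
      return res.ok ? res.text() : ""; // degrade to an empty block on failure
    }),
  );
  return fragments.join("\n");
}

// Hypothetical fragment endpoints, each cacheable upstream with its own TTL.
assemblePage([
  "http://cache.internal/fragments/latest-posts",
  "http://cache.internal/fragments/ads",
  "http://cache.internal/fragments/suggestions",
]).then((html) => console.log(html));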

JSONP and Cross-Domain queries - How to Update/Manipulate instead of just read

So I'm reading The Art & Science of JavaScript, which is a good book, and it has a good section on JSONP. I've been reading all I can about it today, even looking through every question here on Stack Overflow. JSONP is a great idea, but it only seems to solve the "same origin problem" for reading data; it doesn't address changing data.
Did I just miss all the blogs that talked about this, or is JSONP not the solution I was hoping for?
JSONP results in a SCRIPT tag being generated that points to another server, with any required parameters passed as a GET query string, e.g.
<script src="http://myserver.com/getjson?customer=232&callback=jsonp543354" type="text/javascript">
</script>
There is technically nothing to stop this sort of request from altering data on the server, e.g. by specifying newName=Tony. Your response could then be whether the update succeeded or not. You will be limited by whatever you can fit in a query string. If you go with this approach, add a random element as a parameter so that proxies won't cache it.
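For illustration, here is roughly what such a call looks like on the client in TypeScript, including the random cache-busting parameter (the endpoint and parameters echo the hypothetical example above):

// Issue a JSONP request by injecting a script tag; the server is expected
// to wrap its JSON answer in the named callback function.
function jsonpRequest(url: string, params: Record<string, string>,
                      onResult: (data: unknown) => void): void {
  const cbName = "jsonp_" + Math.random().toString(36).slice(2);
  (window as any)[cbName] = (data: unknown) => {
    onResult(data);
    delete (window as any)[cbName];
    script.remove();
  };
  const qs = new URLSearchParams({
    ...params,
    callback: cbName,
    _: Date.now().toString(), // random element so proxies don't cache it
  });
  const script = document.createElement("script");
  script.src = `${url}?${qs}`;
  document.head.appendChild(script);
}

// Hypothetical "update" via GET, mirroring the example above.
jsonpRequest("http://myserver.com/getjson",
             { customer: "232", newName: "Tony" },
             (result) => console.log("update succeeded?", result));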
Some people may consider that this goes against the way GETs are supposed to work, i.e. they shouldn't cause data to change.
Yes, and honestly I would like to stick to that paradigm. However, I might bend the rule and say that requests which do not alter or deal with CRUCIAL data can be accessible via GET calls... hm...
For instance, I am building a shopping cart system, and I think that adding/removing/etc. items to/from a cart could very easily be exposed via GETs, since even though you can change data, you cannot do anything critical with it. If someone maliciously added 1,000 flat-screen monitors to your shopping cart, there would still be at least one verification step that would NOT be vulnerable to such attacks (a standard ASP.NET page at that point, with verification and all that jazz).
Is this a good/workable solution in anyones' opinion?

Strategies for Caching on the Web? [closed]

What concerns, processes, and questions do you take into account when deciding when and how to cache? Is it always a no-win situation?
This presupposes you are stuck with a code base that has been optimized.
I have been working with DotNetNuke most recently for web applications and there are a number of things that I consider each time I implement caching solutions.
Do all users need to see cached content?
How often does each bit of content change?
Can I cache the entire page?
Do I need a manual way to purge the cache?
Can I use a single cache mechanism for the entire site, or do I need multiple solutions?
What impacts occur if information is somehow out of date?
I would look at each feature of your website/application and decide for each one:
Should it be cached?
How long should it be cached for?
When should the cache be expunged?
I would personally go against caching whole pages in favour of caching sections of the website/application.
First off, if your code is optimized, as you said, you will only see noticeable performance benefits when the site is being hammered with a lot of requests.
However, it is faster to pull resources from RAM than from disk, so your web server will be able to handle more requests if you have a caching strategy in place.
As for knowing when you're going to need caching, consider that even low end modern web servers can handle hundreds of requests per second, so unless you expect a decent amount of traffic, caching is probably something you can just skip.
Also, if you are pulling content from your database (for example, StackOverflow probably does this) caching can be very helpful because database operations are relatively expensive and can be a huge bottleneck in high-volume situations.
As for a scenario when it's not appropriate to cache or when caching becomes difficult... If you try to cache a dynamic page that, say, displays the current date and time, you will constantly see an old date/time unless you get a little more involved with your caching strategy. So that's something to think about.
What language are you using? With ASP.NET you get some very easy caching by just adding an attribute above a method, and the value is cached for the specified time.
If you want more control over the cache, you can use a popular system like Memcached and control invalidation by time or by event.
Yahoo, for example, "versions" their JavaScript, so your browser downloads code-1.2.3.js, and when a new version appears, they reference that version instead. By doing this they can make their JavaScript code cacheable for a very, very long time.
As for the general answer, I think it depends on your data and on how often it changes. For example, images don't change very often, but HTML pages do. The "About us" page doesn't change too often, but the news section does.
You can cache by time. This is useful for data that changes fast. You can set the time to 30 seconds or 1 minute. Of course, this requires some traffic: the more traffic you have, the more you can play with the time, because if you get one visit every hour, that visit will only populate the cache, not use it...
You can cache by event: if your data changes, you update the cache. This is very useful when the data needs to reach the user accurately and quickly.
You can cache static content that you know won't change often. If you have a top 10 of the day that refreshes daily, you can keep it all in the cache and update it once a day. A small sketch combining the first two approaches follows below.
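As a minimal illustration, here is a tiny in-memory cache in TypeScript that expires entries by time and can also be invalidated by an event (all names are illustrative):

// A tiny in-memory cache: entries expire after a TTL (cache by time)
// and can be evicted explicitly when data changes (cache by event).
class SimpleCache<T> {
  private entries = new Map<string, { value: T; expiresAt: number }>();

  constructor(private ttlMs: number) {}

  get(key: string): T | undefined {
    const entry = this.entries.get(key);
    if (!entry || entry.expiresAt < Date.now()) {
      this.entries.delete(key);
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: T): void {
    this.entries.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }

  // Event-based invalidation: call this whenever the data changes.
  invalidate(key: string): void {
    this.entries.delete(key);
  }
}

// 30-second TTL, as in the "data that changes fast" case.
const topTen = new SimpleCache<string[]>(30_000);
topTen.set("top10", ["item1", "item2"]);
topTen.invalidate("top10"); // e.g. when an editor updates the list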
Where available, look out for whole-object memory caching. In ASP.NET, this is a built-in feature where you can just plant your business logic objects in the IIS application state and access them from there.
This means you can store everything you need to generate a page in memory (persisting writes to database) and generate a page without ANY database IO.
You still need to use the page-building logic to generate the page, but you save a lot of time in getting the data.
Other techniques involve localised output caching, where you capture the output before sending it and save it to a file. This is great for static sections (like navigation on certain pages, or text bodies), which can then be included when requested. Most implementations purge cached objects like this when a write happens, or after a certain period of time.
Then there's the least "accurate" approach: whole-page caching. It's the highest performer, but it's pretty useless unless you have very simple pages.
What kind of caching? Server side caching? Client side caching?
Client-side caching is a no-brainer with certain things, like static HTML, SWFs and images. Figure out how often the assets are likely to change, and set up "Expires" headers as appropriate. (2 days? 2 weeks? 2 months?)
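A small sketch of this, assuming an Express server (the two-week lifetime is just one reasonable choice; versioned filenames like code-1.2.3.js make much longer lifetimes safe):

import express from "express";

const app = express();

// Let browsers cache static assets for two weeks before re-requesting them.
app.use("/static", express.static("public", { maxAge: "14d" }));

app.listen(8080);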
Dynamic pages, by definition, are a little harder to cache. There have been some explorations in caching certain chunks using JavaScript (degrading to iframes if JS is not available). This, however, might be a little more difficult to retrofit into an existing site.
DB and application-level caching may or may not work, depending on your situation. That really depends on where your bottlenecks are. Figuring out where your application spends the most time on page rendering is probably priority 1; then you can start looking at where and how to cache.
