Varnish and dynamic content from a Laravel API

I have a Vue app served by a Laravel API, and we expect at least 30,000 users. The application presents the same static data to everyone, but some data varies according to user preferences, so I had the idea of using a caching system like Varnish. My question is: if the content is dynamic and some of it changes per user, is a cache system still a good idea? And how can I cache only some data and leave the rest out of the cache?
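Varnish decides what to cache largely from the Cache-Control headers the backend sends, so one common pattern is to mark the shared, same-for-everyone responses as cacheable and the per-user responses as uncacheable. A minimal Laravel middleware sketch (the class name and TTL are illustrative assumptions, not something from the question):

    <?php
    // app/Http/Middleware/VarnishCacheHeaders.php (hypothetical name).
    // Varnish honours s-maxage, so shared responses get cached while
    // per-user responses are marked private and passed through.

    namespace App\Http\Middleware;

    use Closure;

    class VarnishCacheHeaders
    {
        public function handle($request, Closure $next)
        {
            $response = $next($request);

            if ($request->user()) {
                // User-specific data: keep it out of the shared cache.
                $response->headers->set('Cache-Control', 'private, no-store');
            } else {
                // Same data for everyone: let Varnish cache it briefly.
                $response->headers->set('Cache-Control', 'public, s-maxage=300');
            }

            return $response;
        }
    }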

Related

Is it a good idea to generate static pages for dynamic content on a large website, and how to manage the static pages properly?

I have a website with millions of pages. The content of each page is stored in a database, but the data does not change very frequently. To improve the performance of the website and reduce the cost of deploying the web application, I want to generate static pages for the dynamic content and refresh them when the content changes. But I am very concerned about how to manage this large number of pages. How should I store them? Could it cause I/O problems when the web server handles many requests? Is there a better solution for this? Should I use Varnish to handle it?
Varnish looks like a very good fit for that. Basically, you wouldn't be generating the full site statically but incrementally: every time content is requested that Varnish hasn't cached yet, it fetches it from the backend and stores it.
EDIT to cover the comments:
If all the Varnish nodes are down, you can't get your content, the same as if the database is down or your load-balancers are down. Just have two Varnish servers load-balanced for high availability, with keepalived for example.
If Varnish is restarted, the cache gets cleared, unless you are using Varnish Plus/Enterprise with MSE. That may not be an issue if you don't restart often (configuration changes don't need restarts), since the database still has the data to repopulate the cache.
Varnish has a ton of options to invalidate content: purges for just one object, revalidation, bans to target entire sub-domains or sub-trees, and xkeys for tag-based invalidation.
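As an illustration of the first option, a purge for a single object can be sent from the application whenever content changes; a plain PHP sketch (the Varnish host and URL are hypothetical, and your VCL must be configured to accept PURGE from this client):

    <?php
    // Send an HTTP PURGE for one cached object.
    $ch = curl_init('http://varnish-host:6081/articles/42');
    curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'PURGE');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);
    curl_close($ch);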
Based on the description, your architecture looks like Webpages --> Services --> Database. The pages are generated dynamically from the data in the database.
For example, when you search for employee details, the service hits the database, gets the details of the employee, and renders them on the UI.
Now, if you create and store a web page for every employee, this solution will not scale. Also, if employee information changes in the database, you will serve stale data unless you recreate the page.
My recommendation is to add a cache server, so the architecture becomes Webpages --> Services --> Cache server --> Database. The services should query the database, create a page, and store it in the cache. The cache key should be the page URL and the value should be the page content. Now, when a URL hits the services, they fetch the page from the cache rather than going to the database. If the key is not in the cache, the services query the database and fill the cache with the key and value.
"Key is Url of the page. Value is the content of the page which has hidden updated date."
You can have a back-end job or a separate service refresh the cache when data is updated in the database. The job can compare the updated date in the database with the date in the cached value and flush the cache entry if the dates do not match. A job running in the background to refresh the cache will not impact user or UI performance.
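A minimal sketch of that cache-aside flow in plain PHP (the cache client and the two helpers are hypothetical stand-ins for your services layer):

    <?php
    // Cache-aside lookup: key is the page URL, value is the rendered page.
    function get_page($url, $cache, $db)
    {
        $page = $cache->get($url);

        if ($page === null) {
            // Cache miss: query the database, render the page
            // (including its hidden updated date), and fill the cache.
            $data = $db->fetch_page_data($url); // hypothetical helper
            $page = render_page($data);         // hypothetical helper
            $cache->set($url, $page);
        }

        return $page;
    }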

Loading a lot of model data into the client on page load

I have a question about the proper or recommended way of loading a lot of data from a database. I have a Laravel 5.4 project with Vue.js 2.0. I need to display an employees table from the database on a page. On the client, a Vue component requests this data with a promise and builds a grid with vue-tables-2.
The problem is that I can't find the right way to structure the logic. There are already 50,000+ records and there will be more, so using Employees::all() is a really bad idea. The data is requested with axios from an API URL, and there is no possibility of using Redis or Memcached. It looks like I need some kind of lazy-loading POST request from the client, maybe with Laravel pagination: I would request the first part of the data, then make the next request to the paginator, and so on, ending up with a spam of requests.
If I use the default caching mechanism, I don't understand how to build the caching logic: how do I detect that a model has changed and the cache needs rebuilding?
Maybe there is a way to organize lazy loading of the data, dynamically adding it to the table, and if the user starts searching or filtering before loading has finished, make a request to the server for a direct database search. But again, in that case I could end up with a lot of database requests.
So the question is: is there a recommended way to organize something like this?
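For reference, the lazy-loading idea above is usually implemented as server-side pagination, so the client fetches one chunk at a time instead of all 50,000 rows. A minimal Laravel 5.4 sketch (the route and page size are assumptions; the Employees model comes from the question):

    <?php
    // routes/api.php -- paginated endpoint instead of Employees::all().
    Route::get('/employees', function () {
        // Returns 50 rows plus pagination metadata (current_page,
        // last_page, total, ...) that vue-tables-2 can consume in
        // server-side mode; the client asks for ?page=2 and so on.
        return App\Employees::paginate(50);
    });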

Caching a web page with Backbone.js

Can Backbone cache a web page (similar to AppCache) using local storage? I know that Backbone can cache collections, but I'd like to store the entire page. I'd also like to be able to update content once online. I've looked at various posts, but they refer only to caching collections, not the entire web page.
You can store any string using localStorage, though the maximum size varies by browser.
Something like this could work: localStorage.setItem('page_name', $('body').html()); and to restore the page later: $('body').html(localStorage.getItem('page_name'));

Drupal 6 caching and blocks

I've read all over the place and I'm trying to figure out if I am understanding the way caching happens in Drupal 6. We have a site that has a real time stock ticker in it. We have Drupal caching enabled so the stock price ends up getting cached and frozen at a specific spot. I figured a way I could handle it would be to put the ticker in a block I make in a custom module and set BLOCK_NO_CACHE, but if I'm understanding this correctly, if you have site caching enabled, then the ENTIRE page gets cached including any and all blocks on it regardless of their individual cache settings. Is this correct? So am I unable to take advantage of site caching then if I have certain spots that should not cache? Does anyone know of another solution I might be able to use to have the best of both worlds? To be able to have site caching, but also have a real time stock ticker? By the way, the stock ticker is making a JSON request to the Yahoo finance API to get the quote.
You are correct: the BLOCK_NO_CACHE directive applies only at block level. When page caching is enabled, Drupal caches the entire page (which includes the block as well), but only for anonymous users. Drupal's philosophy is that content for anonymous users is always the same, so they are served the cached page. This does not apply to authenticated users, since different users might have access to different parts of the page (e.g., the links block will look different for an admin than for a regular user).
You might want to have a look at this discussion: BLOCK_NO_CACHE not working for anonymous users
And there is a solution, which you'll stumble upon in that discussion: the Ajax Blocks module. An extract from the module description:
Permits to load some blocks by additional AJAX request after loading the whole cached page when the page is viewed by anonymous user. It is suitable for sites which are mostly static, and the page caching for anonymous users is a great benefit, but there are some pieces of information that have to be dynamic.
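For completeness, the custom-module approach from the question looks roughly like this in Drupal 6 (the module name is hypothetical); note that BLOCK_NO_CACHE alone does not help anonymous users once full page caching is on, which is exactly the gap Ajax Blocks fills:

    <?php
    // mymodule.module -- minimal Drupal 6 block sketch.
    function mymodule_block($op = 'list', $delta = 0, $edit = array()) {
      if ($op == 'list') {
        $blocks[0] = array(
          'info' => t('Stock ticker'),
          // Disable block-level caching; this does NOT bypass the
          // full-page cache served to anonymous users.
          'cache' => BLOCK_NO_CACHE,
        );
        return $blocks;
      }
      if ($op == 'view' && $delta == 0) {
        return array(
          'subject' => t('Stock ticker'),
          // Placeholder only; the live quote is fetched client-side
          // (the JSON request to the Yahoo finance API), so it stays
          // current even on cached pages.
          'content' => '<div id="stock-ticker">Loading...</div>',
        );
      }
    }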

Designing an application around HMVC and AJAX [Kohana 3.2]

I am currently designing an application that will have a few different pages, and each page will have components that update through AJAX. The layout is similar to the new Twitter design where 'Home', 'Discover', and 'Connect' are separate pages, but interacting within the page (such as clicking 'Followers' or 'Following') uses AJAX.
Since the design requires an initial page load with several components (in the context of Twitter: tweets, followers, following), each of which can be updated individually through AJAX, I thought it'd be best to have a default controller for serving pages, and other controllers with actions that, rather than serving full pages, strictly handle querying the database and returning JSON objects. This way, on initial page load several HMVC requests can be made to gather the data for each component, and AJAX calls can also be made to update each component individually.
My idea is to have a Controller_Default that handles serving pages. In the context of Twitter, Controller_Default would contain:
action_home()
action_connect()
action_discover()
I would then have other controllers that don't deal with serving full pages, but rather components of pages. For instance, in the context of Twitter, Controller_Tweet may have:
action_get()
which returns a JSON object containing tweets for a specific user. action_home() could then make several HMVC requests to gather the data for the several different components of the page (i.e. make requests to 'tweet/get', 'followers/get', 'following/get'). While on the page, however, AJAX calls could be made to the function-specific controllers (e.g. 'tweet/get') to update the content.
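For concreteness, the pattern described might look like this in Kohana 3.2 (controller names follow the question; the model layer is a hypothetical placeholder):

    <?php defined('SYSPATH') or die('No direct script access.');

    class Controller_Default extends Controller_Template {

        public function action_home()
        {
            // Internal HMVC sub-requests; the same URLs also serve
            // JSON to client-side AJAX calls later on.
            $tweets    = Request::factory('tweet/get')->execute()->body();
            $followers = Request::factory('followers/get')->execute()->body();

            $this->template->content = View::factory('home')
                ->set('tweets', json_decode($tweets, TRUE))
                ->set('followers', json_decode($followers, TRUE));
        }
    }

    class Controller_Tweet extends Controller {

        public function action_get()
        {
            // Hypothetical model call returning tweets for a user.
            $tweets = Model::factory('tweet')->get_for_user($this->request->query('user'));

            $this->response->headers('Content-Type', 'application/json');
            $this->response->body(json_encode($tweets));
        }
    }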
My question: is this a good design? Does it make sense to have the pages served through a default controller, with page components served (in JSON format) through other function-specific controllers?
If there is any confusion regarding the question please feel free to ask for clarification!
One of the strengths of the HMVC pattern is that employing this type of layered application doesn't lock you into a workflow that might be difficult to change later on.
From what you've indicated above, this would be perfectly acceptable as a way of serving content to a client; the default controller wraps sub-requests, which avoids multiple AJAX calls from the client to achieve the same goal.
Two suggestions I might make:
Ensure that your Twitter back-end requests are abstracted out and managed in a library, to make the application DRYer and easier to maintain.
Consider whether the default controller is making only the absolutely necessary calls on each request. Employ caching to avoid pulling infrequently changed data on every request (e.g., followers might only be updated every 30 seconds). This of course depends entirely on your application requirements, but if you get heavily loaded you could quickly find your Twitter API request limit being reached.
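A small sketch of that caching suggestion using Kohana's Cache module (assuming the module is enabled; the key and the 30-second lifetime are illustrative):

    <?php
    // Serve followers from cache; refresh at most every 30 seconds.
    $cache = Cache::instance();

    if (($followers = $cache->get('followers_'.$user_id)) === NULL)
    {
        $followers = Request::factory('followers/get')->execute()->body();
        $cache->set('followers_'.$user_id, $followers, 30);
    }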
One final observation: if you do find the server is experiencing high load and Twitter API requests are taking a long time to return, consider provisioning another server and installing a copy of your application. You can then "point" sub-requests from the default gateway application to your second application server, which should help improve response times if the two servers are connected by a high-speed link.
