Magento top link cart is cached by Varnish

I am using Varnish to improve performance on my Magento store.
My problem is that Varnish is caching the number of items shown in the cart top link.
I was thinking of using an Ajax call after the page loads, but I'm not sure how to implement it.
Any suggestions?
Thanks

If you want to implement this via ajax, here's one possible approach:
Backend work:
For each action that modifies the number of items in the cart, observe the event and fire a method that updates a cookie on the client with the data you need. You can keep it simple and store a JSON structure: {"cartItem": 2, "isLoggedIn": false}. Some events to observe:
controller_action_postdispatch_checkout
controller_action_postdispatch_customer
checkout_onepage_controller_success_action
Create a controller/action that returns the exact same data structure (and sets the cookie while it's at it).
Frontend work:
On DOM ready, your code should look for the cookie set by the backend. If it doesn't exist, make an Ajax request to the controller to fetch it.
Once it has the necessary data, update the values in the DOM as needed.
You'll want to make sure you listen to all the necessary events. Using the cookie helps speed things up on the client side and reduces the number of HTTP requests the browser needs to make.
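A minimal sketch of the frontend half, assuming the backend stores the JSON structure above in a cookie named cart_data (the cookie name and the fallback endpoint are illustrative, not part of Magento's API):

```javascript
// Read a named cookie out of a cookie string (pass document.cookie in the browser).
function readCookie(cookieString, name) {
  const match = cookieString
    .split(';')
    .map(part => part.trim())
    .find(part => part.startsWith(name + '='));
  return match ? decodeURIComponent(match.slice(name.length + 1)) : null;
}

// On DOM ready: use the cookie if present; otherwise fall back to one Ajax
// request against the controller/action that also (re-)sets the cookie.
function loadCartData(cookieString, ajaxFallback) {
  const raw = readCookie(cookieString, 'cart_data');
  return raw ? JSON.parse(raw) : ajaxFallback();
}

const cookie = 'frontend=abc123; cart_data=' +
  encodeURIComponent('{"cartItem":2,"isLoggedIn":false}');
console.log(loadCartData(cookie, () => null).cartItem); // 2
```

In the browser the fallback would be something like `() => fetch('/cartdata/get').then(r => r.json())`, followed by writing the result into the top-link DOM node.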

Related

Render a long list on page load with VueJS

I have a long list that I want to let the user manipulate in a browser-based web application. Using jQuery, the most straightforward way would be to render it on the server as part of the initial page load, and include a small script that registers event handlers for AJAX requests and DOM manipulation.
In the case of VueJs however, it seems to me that the most straightforward way is for the initial request to load the page layout only, then call an API to get the data for the long list. In other words, VueJs renders the initial list, not the server.
While this is workable, I am hesitant to introduce this second request unless I really have to. Is there a more straightforward way to go about this? Am I missing something about how VueJS works? I would really like to render the initial list on the server side if possible. For example, would it be workable to somehow include the initial list as 'transcluded' content?
I don't want to have to get into VueJS' full server-side rendering, since it looks like an advanced topic (and this is a simple task). I have experimented with passing the initial list data as JSON in the <head> of the page (inside tags that register it as a JavaScript variable), but that seems like a hack/workaround.
In the case of VueJs however, it seems to me that the most straightforward way is for the initial request to load the page layout only, then call an API to get the data for the long list. In other words, VueJs renders the initial list, not the server.
Yes, it is the most straightforward way, and it is also considered an anti-pattern, for exactly the reason in your next sentence: "While this is workable, I am hesitant to introduce this second request"...
I think you should first read the following post on medium.com. It is about Vue and the Laravel framework, but the principles in it are universal:
https://medium.com/js-dojo/avoid-this-common-anti-pattern-in-full-stack-vue-laravel-apps-bd9f584a724f
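The approach that post recommends is essentially the one you experimented with: the server renders the initial list data into the page so the component can use it without a second request. A sketch of that idea (the __INITIAL_LIST__ name and the sample data are illustrative):

```javascript
// In the browser this global is set by the server-rendered template, e.g. in
// Blade: <script>window.__INITIAL_LIST__ = @json($items);</script>
const g = typeof window !== 'undefined' ? window : globalThis;

// Simulate what the server would have rendered into the page:
g.__INITIAL_LIST__ = [{ id: 1, name: 'First' }, { id: 2, name: 'Second' }];

// The Vue component's data function reads the injected payload instead of
// firing an API call on mount:
function initialListData() {
  return { items: g.__INITIAL_LIST__ || [] };
}

console.log(initialListData().items.length); // 2
```

This is not a hack so much as a lightweight middle ground between full SSR and client-only fetching: the first paint has real data, and later updates can still go through the API.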

Tracking ajax request status in a Flux application

We're refactoring a large Backbone application to use Flux to help solve some tight coupling and event / data flow issues. However, we haven't yet figured out how to handle cases where we need to know the status of a specific ajax request.
When a controller component requests some data from a flux store, and that data has not yet been loaded, we trigger an ajax request to fetch the data. We dispatch one action when the request is initiated, and another on success or failure.
This is sufficient to load the correct data, and update the stores once the data has been loaded. But, we have some cases where we need to know whether a certain ajax request is pending or completed - sometimes just to display a spinner in one or more views, or sometimes to block other actions until the data is loaded.
Are there any patterns that people are using for this sort of behavior in Flux/React apps? Here are a few approaches I've considered:
Have a 'request status' store that knows whether there is a pending, completed, or failed request of any type. This works well for simple cases like 'is there a pending request for workout data?', but becomes complicated if we want to get more granular: 'is there a pending request for workout id 123?'
Have all of the stores track whether the relevant data requests are pending or not, and return that status data as part of the store api - i.e. WorkoutStore.getWorkout would return something like { status: 'pending', data: {} }. The problem with this approach is that it seems like this sort of state shouldn't be mixed in with the domain data as it's really a separate concern. Also, now every consumer of the workout store api needs to handle this 'response with status' instead of just the relevant domain data
Ignore request status - either the data is there and the controller/view act on it, or the data isn't there and the controller/view don't act on it. Simpler, but probably not sufficient for our purposes
The solutions to this problem vary quite a bit based on the needs of the application, and I can't say that I know of a one-size-fits-all solution.
Often, #3 is fine, and your React components simply decide whether to show a spinner based on whether a prop is null.
When you need better tracking of requests, you may need this tracking at the level of the request itself, or you might instead need this at the level of the data that is being updated. These are two different needs that require similar, but slightly different approaches. Both solutions use a client-side id to track the request, like you have described in #1.
If the component that calls the action creator needs to know the state of the request, you create a requestID and hang on to that in this.state. Later, the component will examine a collection of requests passed down through props to see if the requestID is present as a key. If so, it can read the request status there, and clear the state. A RequestStore sounds like a fine place to store and manage that state.
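A minimal sketch of that flow, assuming a simple map-backed RequestStore (the names and shapes are illustrative, not a specific Flux library's API):

```javascript
// Request-status store keyed by a client-generated requestID.
let nextId = 0;
const requests = {}; // requestID -> 'pending' | 'success' | 'error'

function beginRequest() {
  // The action creator returns this id; the component keeps it in this.state.
  const id = 'req-' + (++nextId);
  requests[id] = 'pending';
  return id;
}

function resolveRequest(id, ok) {
  // Dispatched on the success/failure action.
  requests[id] = ok ? 'success' : 'error';
}

function getStatus(id) {
  // The component reads this (via props) to decide e.g. whether to show a spinner.
  return requests[id];
}

const id = beginRequest();
console.log(getStatus(id)); // 'pending'
resolveRequest(id, true);
console.log(getStatus(id)); // 'success'
```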
However, if you need to know the status of the request at the level of a particular record, one way to manage this is to have your records in the store hold on to both a clientID and a more canonical (server-side) id. This way you can create the clientID as part of an optimistic update, and when the response comes back from the server, you can clear the clientID.
Another solution that we've been using on a few projects at Facebook is to create an action queue as an adjunct to the store. The action queue is a second storage area. All of your getters draw from both the store itself and the data in the action queue. So your optimistic updates don't actually update the store until the response comes back from the server.
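A tiny sketch of that action-queue idea (names and shapes are illustrative): getters merge the committed store with any pending optimistic updates, and the store itself only changes once the server confirms.

```javascript
// Committed data plus a queue of pending optimistic updates.
const store = { 1: { id: 1, title: 'Squats' } };
const actionQueue = []; // entries: { id, fields }

function getWorkout(id) {
  const base = store[id] || {};
  // Apply queued optimistic updates for this record on top of committed data.
  return actionQueue
    .filter(a => a.id === id)
    .reduce((acc, a) => ({ ...acc, ...a.fields }), { ...base });
}

function commit(update) {
  // Server confirmed: move the update from the queue into the store.
  store[update.id] = { ...store[update.id], ...update.fields };
  const i = actionQueue.indexOf(update);
  if (i !== -1) actionQueue.splice(i, 1);
}

const optimistic = { id: 1, fields: { title: 'Front Squats' } };
actionQueue.push(optimistic);
console.log(getWorkout(1).title); // 'Front Squats' (optimistic)
commit(optimistic);
console.log(getWorkout(1).title); // 'Front Squats' (now committed)
```

On failure you would instead splice the update out of the queue without committing, and the getter falls back to the last committed state automatically.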

Queries with queues on Laravel

On some pages there are unimportant queries (product view increments, grabbing Facebook likes) that should run after the full page load, just to improve performance.
Until now I have done that kind of job with Ajax on $(document).ready().
How can I use the Event or Queue features of Laravel to achieve that?
Is it possible to pass an object (like an Eloquent collection) as well?
Thank you.
A queue is a server-side processing mechanism meant to run work after the user does something. For example, after the user signs up, the system 'queues' an email, so it can return a response to the user quickly and send the email later.
You cannot use Laravel queues for page loading. Page loading is a client-side event that needs to happen immediately.
Your use of Ajax to load slow elements after the initial page load is good. There are other ways to optimize page loads as well (such as reducing database queries, HTML + CSS + JS compression, etc.).
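Since queues won't help here, the Ajax-on-ready approach can be generalized slightly: queue the non-critical jobs client-side and flush them after the load event, so they never block rendering. A sketch, with illustrative endpoint names and the sender injected so it can be swapped for fetch in the browser:

```javascript
// Client-side queue of non-critical jobs (view counters, like grabs, ...).
const deferredJobs = [];

function defer(url, payload) {
  deferredJobs.push({ url, payload });
}

function flush(send) {
  // `send` is e.g. (url, payload) => fetch(url, { method: 'POST',
  //   body: JSON.stringify(payload) }); injected here for testability.
  while (deferredJobs.length) {
    const job = deferredJobs.shift();
    send(job.url, job.payload);
  }
}

// In the browser you would wire this to the load event:
// window.addEventListener('load', () => flush(postToServer));
defer('/track/product-view', { productId: 42 });
const sent = [];
flush(url => sent.push(url));
console.log(sent); // ['/track/product-view']
```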

How to structure a Symfony2 app with ESI?

On a new project with a lot of traffic, we are thinking about how to structure our Symfony2 app to take advantage of caches and be ready to be more aggressive in the future. I'd love to hear your opinion.
Let's say a user requests a page with a list of places. This page has:
- list
  - common data (title, author, description)
  - user data (the user likes the list + other data)
- first 20 places
  - common data (title, photo of each place)
  - user data (the rates of the user for those places)
The HTML could be like:
<html>...
<body>
<header>
...
<!-- Embed the top user menu -->
<esi:include src="http://example.com/profile/menu" />
...
</header>
<content>
...
common data of the list
...
<!-- Embed the common data of the first 20 places, the same for everyone -->
<esi:include src="http://example.com/lists/17/places" />
...
<!-- Embed the user data of the list (used in JS) -->
<esi:include src="http://example.com/lists/17/user" />
...
<!-- Embed the user data of the list of places (used in JS) -->
<esi:include src="http://example.com/lists/17/places/user" />
...
</content>
</body>
</html>
The HTML will be cached on the gateway (Symfony or Varnish). The list of places will also be cached on the gateway most of the time. The user-data requests are the ones that will be made on every page view and not cached (at least not initially).
Questions:
How do you feel about this structure?
If the user is anonymous, can I avoid making the ESI includes for the user data? What if I also have a cookie for the anonymous user? How?
Does the ESI include for the user menu make sense?
Or should we forget about ESI and go always through the controller (caching the rendered view of the common data for example)?
Should we move the 2 ESI-requests that ask for user data to be AJAX-calls, instead of waiting on the server?
Is this a good approach to scale if we need to do it fast? What would be best?
thanks a lot!
We have used Varnish on one site for whole-page caching, and I've been using Symfony2 for a few years, but keep in mind that I haven't used Varnish + Symfony2 + ESI in any production environment.
I think the basic idea is OK. If the menu is the same on many pages, and the list of places is also the same on many pages, you get that common content cached by Varnish or the Symfony reverse proxy cache. As Varnish usually holds its cache in memory, you get your content faster and don't have to run rendering and DB-querying code on each request.
The hard part is getting those ESI requests cached when the user is logged in. As far as I know, in the default Varnish configuration, requests carrying a Cookie header are never cached. If you pass cookies along to the ESI requests, those ESI responses will not be shared between users.
You can try making some rules based on the URL, but if you use the default Symfony Twig helpers, the generated URLs are /_internal/..., so it might be hard to distinguish public ones from private ones.
You can also configure the cache to ignore any cookies when Cache-Control: public is set. This is done by default in Symfony:
if ($this->isPrivateRequest($request) && !$response->headers->hasCacheControlDirective('public')) {
    $response->setPrivate(true);
}
As you can see from the code, if you have the public directive, the response will never be marked private.
I haven't found out how Varnish processes this directive - as I understand it, by default it does not cache any request that has a cookie. So I think you will have to tweak the configuration to accomplish this.
If the main page is also to be cached, I don't see how you could skip the includes.
I assume JS is required for your registered users (not for search bots), so I would suggest using JavaScript to decide whether to load the user data.
The JavaScript code can check whether the user has a session-id cookie etc., and make the request to fetch the data only in that case. It might also be a good idea to set some other cookie, like _loggedin, so the JavaScript code doesn't have to read the session id itself.
Users who are not logged in can also have some data in cookies, like _likedPost:1,2,132. JavaScript can read this cookie and make some HTML corrections without even making the additional request.
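A small sketch of reading such a JS-only cookie, assuming it is stored as `_likedPost=1,2,132` (the name comes from the example above; the storage format is an assumption):

```javascript
// Parse the liked-post ids out of a cookie string
// (pass document.cookie in the browser).
function getLikedPosts(cookieString) {
  const match = cookieString
    .split(';')
    .map(part => part.trim())
    .find(part => part.startsWith('_likedPost='));
  if (!match) return [];
  return match.slice('_likedPost='.length).split(',').map(Number);
}

console.log(getLikedPosts('sessionid=abc; _likedPost=1,2,132')); // [ 1, 2, 132 ]
```

The result can then be used to toggle the "liked" state on the matching DOM nodes of the cached page.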
As for these cookies: we separated JS-only cookies from application cookies by a naming pattern, like _\w for JS cookies. Then we tweaked the Varnish configuration to split the Cookie header and remove the JS-only cookies. If no other cookie is left, the response is shared with everyone. The application (Symfony) never sees those cookies, as they are stripped.
I think it does, if the menu is the same on every page.
I think ESI is good because Varnish can hold its cache in memory, so it might not even touch the hard disk for the content. Even if your controller cache is also in-memory, I think Varnish would look up the cache faster than the Symfony framework with all its routing, PHP code, service initialization, etc.
It depends, but I think it could be the better approach. Keep in mind that the caches live different lives. For example, if your places list is cached for 2 hours, the places can change before that time is up - some new items appear on the list and some disappear. The list you serve the user is still the old (cached) one, but you would provide user data for the new list - some of that data is not needed, and some is missing.
It might be a better approach to have JavaScript discover the loaded places, for example by searching for an HTML attribute like data-list-item-id, and then make an Ajax request querying data for exactly those items. That way the user data stays synchronized with the currently cached list, and you can make one Ajax request for both lists instead of two.
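A sketch of that idea: scrape the ids out of the cached markup and build one request for exactly those items. The data-list-item-id attribute comes from the suggestion above; the URL and regex-based scan are illustrative (in the browser you would use `document.querySelectorAll('[data-list-item-id]')` instead):

```javascript
// Collect item ids from the (cached) HTML markup.
function collectItemIds(html) {
  return [...html.matchAll(/data-list-item-id="(\d+)"/g)].map(m => m[1]);
}

// Build a single user-data request covering every item actually on the page.
function buildUserDataUrl(ids) {
  return '/lists/user-data?ids=' + ids.join(',');
}

const html = '<li data-list-item-id="17"></li><li data-list-item-id="23"></li>';
console.log(buildUserDataUrl(collectItemIds(html))); // /lists/user-data?ids=17,23
```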
If cache invalidation (PURGE requests) is not used, the whole HTTP cache scheme is indeed good for scaling. You can scale the application to several servers and configure Varnish to call them randomly, by some rule, or to use one of them only as a failover. If the load is still too high, you can always adjust the cache timeouts and other configuration.

Best practice for combining requests with possible different return types

Background
I'm working on a web application utilizing AJAX to fetch content/data and what have you - nothing out of the ordinary.
On the server side, certain events can happen that the client-side JavaScript framework needs to be notified about, and vice versa. These events are not always related to the user's immediate actions. It is not an option to wait for the next page refresh to include them in the document, or to stick them in hidden fields, because the user might never submit a form.
Right now it is designed in such a way that events to and from the server ride along with the user's requests. For instance, if the user clicks a 'view details' link, this fires a request to the server to fetch some HTML or JSON with details about the clicked item. Along with this request - or rather the response - a server-side (invoked) event will return with the content.
Question/issue 1:
I'm unsure how to control the queue of events going to the server. They can ride along with user-invoked requests, but if those don't occur, the events will get lost. I imagine setting up a timer to send these events to the server in case the user does not perform any action. What do you think?
Question/issue 2:
With regard to the responses - some being requested as HTML, some as JSON - it is a bit tricky, as I would somehow have to wrap all this data to allow both formalized (and unrelated) events and perhaps HTML content, depending on the request, to return to the client. Any suggestions? Anything I should be aware of, for instance when returning HTML content wrapped in a JSON bundle?
Update:
Do you know of any framework that uses an approach like this, that I can look at for inspiration (that is a framework that wraps events/requests in a package along with data)?
I am tackling a similar problem to yours at the moment. On your first question, I was thinking of implementing some sort of timer on the client side that makes an asynchronous call for the content when it expires.
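A sketch of that timer-based fallback: queue events client-side, flush them with any user-triggered request when possible, and let a periodic timer flush whatever is left so queued events are never lost (names are illustrative):

```javascript
// Client-side queue of events waiting to be delivered to the server.
const pendingEvents = [];

function queueEvent(evt) {
  pendingEvents.push(evt);
}

function flushEvents(send) {
  // Nothing to do: avoid an empty request.
  if (pendingEvents.length === 0) return [];
  const batch = pendingEvents.splice(0);
  // `send` would be e.g. batch => fetch('/events', { method: 'POST',
  //   body: JSON.stringify(batch) }); injected here for testability.
  send(batch);
  return batch;
}

// In the browser: flush alongside user-invoked requests, plus a safety timer:
// setInterval(() => flushEvents(postToServer), 30000);
queueEvent({ type: 'item-viewed', id: 7 });
const sent = [];
flushEvents(batch => sent.push(...batch));
console.log(sent.length); // 1
```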
On your second question, I normally just return JSON representing the data I need, and then present it by manipulating the DOM. I prefer to keep things consistent.
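For the mixed HTML/JSON case, one option is a uniform JSON envelope in which server events travel beside whatever payload was requested, so the client always unwraps responses the same way. A sketch with illustrative field names:

```javascript
// Unpack a response envelope: out-of-band server events plus either an
// HTML fragment or structured data, depending on what was requested.
function unpackResponse(json) {
  const res = JSON.parse(json);
  return {
    events: res.events || [], // server-invoked events to dispatch locally
    html: res.html || null,   // pre-rendered fragment, if any
    data: res.data || null,   // structured payload, if any
  };
}

const r = unpackResponse(
  '{"events":[{"type":"session-expiring"}],"html":"<li>Details</li>"}'
);
console.log(r.events.length); // 1
```

The main thing to watch when wrapping HTML in JSON is correct escaping on the server side; any standard JSON encoder handles that for you.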
As for best practices, I can't say for sure that what I am doing complies with any best practice, but it works for our present requirements.
You might also want to consider the performance impact of having multiple clients making asynchronous calls to your web server at regular intervals.
