My site uses Varnish heavily, with the cache set to refresh every 5 minutes. I found out that this was skewing product view stats, making them lower than they actually should be.
I want to turn off Magento's default product view logging facility so that no product views are recorded.
I want to mimic the action by doing custom inserts into the relevant tables, i.e. tf_report_viewed_product_index.
Inserting into the tf_report_viewed_product_index table alone is not allowed, since it has foreign key constraints. There is more to it.
In case anyone comes across this: you can use XML (in your module's config.xml) to disable an event observer:
<frontend>
    <events>
        <catalog_controller_product_view>
            <observers>
                <reports>
                    <type>disabled</type>
                </reports>
            </observers>
        </catalog_controller_product_view>
    </events>
</frontend>
Then, using an AJAX call from the product view page, I simply insert a new row into the tf_report_viewed_product_index table.
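For reference, here is a minimal sketch of the controller action such an AJAX call could hit; the module and action names are hypothetical, but routing the insert through the reports/product_index_viewed model keeps the table's foreign key constraints satisfied rather than issuing a raw INSERT:

// Hypothetical controller, e.g. app/code/local/Tf/ViewLog/controllers/IndexController.php
class Tf_ViewLog_IndexController extends Mage_Core_Controller_Front_Action
{
    public function logAction()
    {
        $productId = (int) $this->getRequest()->getParam('product_id');
        if ($productId) {
            // reports/product_index_viewed writes to (tf_)report_viewed_product_index
            // using the current visitor/customer context, so the FK columns stay valid
            Mage::getModel('reports/product_index_viewed')
                ->setProductId($productId)
                ->save()
                ->calculate();
        }
        $this->getResponse()->setBody('OK');
    }
}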
This is not a Magento issue; this is a user-request-reaching-your-web-app (Magento) issue. The speed and load-handling benefits realized by using Varnish exist precisely because pre-generated static content is cached and served ahead of the dynamically generated content from Magento (which also carries the overhead of logging traffic to the report_* and log_* tables).
I don't have too much experience in this area, but I believe you should use varnishncsa to log cache hits and then process them via cron using the Magento Reports module's models; see Mage_Reports_Model_Event_Observer::catalogProductView() for a start, but note that this method normally handles logging of single views. You will likely want to do a mass insert of the processed Varnish log data and then calculate.
And here's an SO post on setting up logging with varnishncsa.
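To illustrate the shape of that processing, here is a rough sketch of a cron-run importer. The log path, URL pattern, and column list are assumptions; check them against your varnishncsa output format and the table's actual constraints:

// Hypothetical cron method; assumes product URLs like /catalog/product/view/id/123
public function importVarnishViews()
{
    $rows   = array();
    $handle = fopen('/var/log/varnish/varnishncsa.log', 'r');
    while (($line = fgets($handle)) !== false) {
        if (preg_match('#/catalog/product/view/id/(\d+)#', $line, $match)) {
            $rows[] = array(
                'product_id' => (int) $match[1],
                'store_id'   => Mage::app()->getStore()->getId(),
                'added_at'   => Varien_Date::now(),
            );
        }
    }
    fclose($handle);

    if ($rows) {
        // One mass insert instead of a model save per view;
        // getTableName() applies your table prefix (tf_)
        $resource = Mage::getSingleton('core/resource');
        $resource->getConnection('core_write')->insertMultiple(
            $resource->getTableName('reports/viewed_product_index'),
            $rows
        );
    }
}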
I am creating a JMeter script to add products to the cart on a Magento 2 website. The POST request executes without any error, but the product isn't displayed in the cart after that request.
A screenshot of the POST request is attached.
JMeter automatically treats HTTP responses with status codes below 400 as successful; it doesn't know anything about Magento or adding products to a cart. If you need an extra check that the product is actually there, you need to add a Response Assertion.
In the vast majority of cases, the main reason for "failing" requests is missing or improperly implemented correlation of dynamic parameters:
Check that your ${form_key} JMeter Variable has the anticipated value using the Debug Sampler
Your request URL looks suspicious; it may be that you need to extract the URL for adding this particular product from the previous response as well
Don't forget to add an HTTP Cookie Manager to your Test Plan
Also be aware that there is a Magento performance-toolkit containing a benchmark.jmx script which you can use as the reference/basis for your own tests.
You likely need to handle the dynamic values that associate the cart identifier with the session. I will save you some time, however: architecturally, the point at which Magento issues the cart is a known antipattern; every Magento installation I see uses the default cart model.
What does the "default cart" mean? It is a pattern of allocating a resource too early. Twenty percent or fewer of visitors will ever use the cart, yet 100% of the population are allocated carts. You now have cart objects and cart resources that are dead allocations in the system and have to be managed, including cleanup after a timeout period.
The performant design pattern is a just-in-time cart, which is only created when the user either adds something to the cart or selects the cart icon to view the contents of a stored cart. Going along with this, the number of items in a perpetual cart should be persisted in a local cookie value rather than requiring a "pull" of the cart every time the user visits the site; see the sketch below.
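A minimal, framework-agnostic PHP sketch of the idea (class, method, and cookie names are all illustrative):

// Stand-in for whatever cart model your framework provides
class Cart
{
    public $items = array();
}

class JustInTimeCartProvider
{
    private $cart = null;

    // The cart is only allocated on first real use,
    // not for every visitor at session start
    public function getCart()
    {
        if ($this->cart === null) {
            $this->cart = new Cart();
        }
        return $this->cart;
    }

    // The header badge reads a cookie, so rendering it
    // never forces a pull of the stored cart
    public function getDisplayedItemCount()
    {
        return isset($_COOKIE['cart_item_count'])
            ? (int) $_COOKIE['cart_item_count']
            : 0;
    }
}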
We are working on a site in AEM 6.1 that has news and events content; most pages include dynamic info on recent and related news/events based on tagging. We are using the dispatcher. Please suggest some caching techniques that could be implemented at the application level apart from the dispatcher. Thanks.
The aim of implementing caching on the dispatcher is to reduce hits on your app server and serve as much as possible from the web server; in short, improving response time from your app server. But in some cases we can't cache much on the web server, if the results on the app server change frequently.
On the app server, the following solutions can be implemented to get results quickly, on top of having the dispatcher in place.
Make sure the content hierarchy where you ingest news items has as few articles per level as possible. Divide your hierarchy based on the following structure: Year >> Month >> Day >> Hour (this level can be skipped if content volume is low) >> news items.
With this structure in place, write path-based queries so that you don't have to traverse the whole content hierarchy.
There is a concept of a transient node in CQ: for each news item created in CQ, update the transient node with a reference to the newly created item. This means that for recent news you don't have to traverse the content structure; just refer to the transient node, which holds references to the newest news items.
You could also write a cron job that executes in the background and takes care of collating views, such as top recent news.
To complement Rupesh's answer, I would say: definitely use the dispatcher cache as much as you can, and for local caching strategies in AEM try the Guava cache. It is a very good and easy-to-use tool, and there is also a lot of information on how to set it up and use it for your specific needs. Hope it helps.
I would suggest the following:
For recent news/events, write a scheduler
(https://sling.apache.org/documentation/bundles/scheduler-service-commons-scheduler.html) that will compute the list of recent news/events and write it to a specific node as properties, for example:
/tmp/recent
    news   [/path/to/news1, /path/to/news2]
    events [/path/to/event1, /path/to/event2]
Most recent always at the end of the array. Your code needs to cap the array at the maximum number of recent items you want to keep.
Let's say you want to keep the last 5 changed pages and a 6th page is changed; then you just drop the oldest entry and push(new_page_path).
This could run once a day, or at whatever frequency best fits your requirements.
If you need instant updates, you can additionally write a listener that fires when a page is changed/deleted and updates the recent list. In this case I would suggest putting the code that deals with updating the recent list into a service and using that service in both the scheduler and the listener.
The listener and scheduler need to run on both author and publish instances; on publish, trigger dispatcher cache invalidation for /tmp/recent afterwards.
In order to render the recent list without having to invalidate whole pages, I would suggest using SSI: have a component in your page that renders an SSI include of /tmp/recent.news.html or /tmp/recent.events.html, depending on whether you want to render recent news or events.
Give the node /tmp/recent a resourceType that handles the "news" and "events" selectors, and implement that resourceType to render the content.
For the related news/events:
Use the Tag Manager (https://docs.adobe.com/docs/en/cq/5-6-1/javadoc/com/day/cq/tagging/TagManager.html) "find" method to look up all news/events having the same tag as the current page. I assume your news and events pages have a dedicated template or resource type.
Also, I would suggest having a dedicated component that includes that content via an SSI include. Let's say your page has 2 tags, ns/tag1 and ns/tag2; then you could perform the SSI includes like this:
SSI include /etc/tags/ns/tag1.related_news.html
SSI include /etc/tags/ns/tag1.related_events.html
SSI include /etc/tags/ns/tag2.related_news.html
SSI include /etc/tags/ns/tag2.related_events.html
depending on what you want to include
Write a component under /apps/cq/tagging/components/tag (sling:resourceSuperType = /libs/cq/tagging/components/tag) that provides the rendering for the "related_news" and "related_events" selectors and lists all related pages.
The advantage of this approach is that the related list for each tag is shared across pages, and whenever a tag is changed/deleted the cache gets invalidated automatically.
In both cases (recent and related) configure the dispatcher to cache the output.
On some pages there are unimportant queries (product view increments, grabbing Facebook likes) that should run after the full page load, just to improve performance.
Until now I have handled that kind of job with AJAX on $(document).ready().
How can I use the Event or Queue features of Laravel to achieve that?
Is it possible to pass an object (like an Eloquent collection) as well?
Thank you.
A queue is for server-side processing that is meant to occur after the user does something. For example, after the user signs up, the system 'queues' an email, so it can return to the user quickly and send the email later.
You cannot use Laravel 'queues' for page loading. That is a client-side event that needs to happen immediately.
Your use of AJAX to load slow elements after the initial page load is good. There are other ways to optimize page loads (such as reducing database queries, HTML + CSS + JS compression, etc.).
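For the purely server-side work, such as the product view increments, a queued job does fit. Here is a minimal sketch, assuming a Product Eloquent model with a views column (all names are illustrative):

<?php

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class IncrementProductViews implements ShouldQueue
{
    use InteractsWithQueue, Queueable, SerializesModels;

    protected $productId;

    public function __construct($productId)
    {
        $this->productId = $productId;
    }

    public function handle()
    {
        // Runs on a queue worker, after the response has been sent
        Product::where('id', $this->productId)->increment('views');
    }
}

// From a controller:
// dispatch(new IncrementProductViews($product->id));

As for passing objects: thanks to the SerializesModels trait you can pass an Eloquent model (or collection) to the constructor; it is serialized by identifier and re-fetched from the database when the worker runs the job.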
My team and I are rapidly launching new stores and views on Magento Enterprise Edition, but we're running into an issue with caching. To be clear, the caching part itself works great. We have several complex products that take about 17 seconds to build, but after it's cached the page loads in 300ms, which is awesome! Unfortunately, if we clear the cache under any serious load (high traffic) we seem to experience a cache miss storm, where every page request tries to populate the cache, causing our webhead to stall out with load averages above 50.
Do you have any suggestions for avoiding this? Are there documented best practices for pre-warming a cache for new code deployments or even just content and configuration changes?
This could be related, so I'll include it: After clicking the button to refresh the cache, and before the refreshing process is complete, most pages on the front end die with 500 error codes and seemingly random error messages. Any idea what might cause that?
I implemented a solution to warm up a cache entry after a CMS block is saved. You may take inspiration from this solution to do the same for other cases (product save, CMS page save, category save, etc.).
This piece of code can be triggered after a CMS block save by observing the cms_block_save_after event:
/**
 * Clean the targeted cache block and warm it up again
 */
public function clearBlockHtmlCache(Varien_Event_Observer $observer)
{
    $block = $observer->getEvent()->getObject();
    $id    = $block->getCacheKey();

    // Remove only this block's cache entry, not the whole cache
    Mage::app()->getCacheInstance()->getFrontend()->remove(strtoupper($id));

    // The output is discarded; rendering the block is enough to
    // warm the cache up again, with filter processing applied
    $block->toHtml();
}
I am using Varnish to enhance performance on my Magento store.
My problem is that Varnish is caching the top links' number of items in the cart.
I was thinking of using an AJAX call after page load, but I'm not sure how to implement it.
Suggestions?
Thanks
If you want to implement this via ajax, here's one possible approach:
Backend work:
For each action that modifies the number of items in the cart, observe the event and fire a method that updates a cookie on the client with the data you need. You can do something simple and store a JSON structure: {"cartItem": 2, "isLoggedIn": false}. Some events to observe (a sketch of the observer follows this list):
controller_action_postdispatch_checkout
controller_action_postdispatch_customer
checkout_onepage_controller_success_action
Create a controller/action that will return the exact same data structure (as well as set the cookie while it's at it).
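A minimal sketch of such an observer method (the cookie name is illustrative, and the method must be wired to the events above via config.xml):

public function updateCartCookie(Varien_Event_Observer $observer)
{
    $quote = Mage::getSingleton('checkout/session')->getQuote();
    $data  = array(
        'cartItem'   => (int) $quote->getItemsSummaryQty(),
        'isLoggedIn' => Mage::getSingleton('customer/session')->isLoggedIn(),
    );

    // The cookie is deliberately not HttpOnly, so the
    // frontend JavaScript can read it on DOM ready
    Mage::getSingleton('core/cookie')->set(
        'cart_info', json_encode($data), null, '/', null, null, false
    );
}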
Frontend work:
On DOM ready your code should look for the cookie set in the backend. If it doesn't exist, make an ajax request to the controller to fetch it.
Once it has the necessary data, update the values in the DOM as needed.
You'll want to make sure you listen to all the necessary events. Using the cookie helps speed things up on the client side and reduces the number of HTTP requests the browser needs to make.