shopify website takes a long time to load - ruby

url: http://bookpanda.pk/
The website takes too long to load the landing page. Even though the page is only 1.97 MB, it takes around 8-9 seconds to load.
Can you please guide me on where the problem lies and what I can do about it?

For such a general, high-level question, I can only provide an equivalently high-level, situationally relevant answer:
Use browser dev tools to diagnose at a high level. Check out the load times of each network call as well as their timings. Modern browsers also have performance profiling tools built in that help you find out where it's spending time or waiting for something else that's slow.
For example, here's a JS Flame Chart of the first 10 seconds of page load:
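If you prefer to collect similar timings programmatically instead of reading them off the Network panel, the Resource Timing API exposes each request's duration and transferred size. Here's a small console sketch (nothing in it is specific to Shopify or this site):

// List every loaded resource, slowest first, with its duration and size.
// Run in the DevTools console after the page has finished loading.
const resources = performance.getEntriesByType('resource')
  .map(r => ({
    url: r.name,
    ms: Math.round(r.duration),
    kB: Math.round((r.transferSize || 0) / 1024),
  }))
  .sort((a, b) => b.ms - a.ms);

console.table(resources.slice(0, 20)); // the 20 slowest requests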

Related

LCP time between Lighthouse and Performance - Google Chrome

With Google Chrome DevTools, I am running a Lighthouse analysis for mobile.
Lighthouse shows a 7.0 seconds delay for Largest Contentful Paint (LCP):
I decide to dive into this and click on: "View original trace".
It redirects me to the Performance tab:
Here it says that the LCP is 749.7ms (= 0.7497 seconds).
Where does this discrepancy between Lighthouse and the Performance tab come from?
0.7497 seconds for Performance
7.0 seconds for LightHouse
Why is Lighthouse showing much longer load times?
The answer is a combination of simulated network throttling and CPU throttling.
Simulated Network Throttling
When you run an audit it applies 150 ms of latency to each request, and it also limits download speed to 1.6 megabits per second (about 200 kilobytes per second) and upload speed to 750 kilobits per second (about 94 kilobytes per second).
This is calculated via an algorithm rather than actually applied to the connection (it is simulated).
CPU throttling
Lighthouse applies a 4x slowdown to your CPU to simulate a mid-tier mobile phone performance.
If your JavaScript payload is heavy this could block the main thread and delay rendering. Or if you dynamically insert elements using JavaScript it can delay LCP for the same reason.
As with the network throttling, this is calculated via an algorithm rather than actually applied (it is simulated).
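For reference, if you drive Lighthouse from Node rather than from DevTools, these defaults correspond roughly to the following settings object. This is only a sketch; the exact option names can differ between Lighthouse versions, so check the version you have installed:

// Rough sketch of Lighthouse's default mobile throttling settings.
// Would be passed as the `settings` section of a custom Lighthouse config.
// Key names are illustrative and may vary by Lighthouse version.
const settings = {
  formFactor: 'mobile',
  throttlingMethod: 'simulate',   // 'simulate' = algorithmic, 'devtools' = applied
  throttling: {
    rttMs: 150,                   // 150 ms of latency per request
    throughputKbps: 1638.4,       // ~1.6 Mbit/s download (~200 kB/s)
    uploadThroughputKbps: 750,    // ~750 kbit/s upload (~94 kB/s)
    cpuSlowdownMultiplier: 4,     // 4x CPU slowdown
  },
};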
So why doesn't it match the performance trace?
Because the trace is "as it happened" and doesn't take into account the simulated network and CPU slowdown.
Can I make the performance trace match Lighthouse?
Yes - all you need to do is uncheck "Simulated throttling" under the settings section (you may need to press the cog in the top right of the Lighthouse tab to show this checkbox).
Be aware that you will probably get an even lower score as simulated throttling can be a bit more forgiving.
Also note that your report will take a lot longer to run (which is good for seeing how someone on a slow phone with a slow 4G connection might experience your site!)
Now when you run Lighthouse it will use applied throttling, adding the latency and CPU slowdown in real time. If you view your trace now you will see it matches.
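Independently of Lighthouse, you can also read the browser's own (unthrottled) LCP value with a PerformanceObserver; it should line up with what the Performance trace reports:

// Log Largest Contentful Paint candidates as the browser records them.
// `buffered: true` replays entries that happened before the observer started,
// so this can be pasted into the DevTools console after the page has loaded.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    console.log('LCP candidate at', Math.round(entry.startTime), 'ms:', entry.element);
  }
}).observe({ type: 'largest-contentful-paint', buffered: true });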
Where can I see what settings were used on a run?
At the bottom of your report you can see what settings were applied. You will see in the screenshot below that "(Devtools)" is listed in the Network Throttling and CPU throttling sections, showing that I used applied throttling.

Why is my webpage load time more than 10 secs even with 98% speed performance score?

Is there any way to identify why this website's initial load is so slow? I have checked each and every part of the HTML structure and code (requests and responses).
It loads very fast once the first load has completed.
I have performed page speed optimization and followed the page speed rules; my resulting page speed score in GTmetrix is shown in the picture below. But the initial load is still very slow.
What could the reason be? How can we resolve this type of speed issue?
It seems to me that there is an issue with your "cdn" subdomain.
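One way to check that from the browser is to group resource timings by host, so a slow subdomain stands out. A quick console sketch:

// Group resource load durations by host; a slow CDN subdomain will stand out.
const byHost = {};
for (const r of performance.getEntriesByType('resource')) {
  const host = new URL(r.name).host;
  (byHost[host] = byHost[host] || []).push(r.duration);
}
for (const [host, durations] of Object.entries(byHost)) {
  const avg = durations.reduce((a, b) => a + b, 0) / durations.length;
  console.log(`${host}: ${durations.length} requests, avg ${Math.round(avg)} ms`);
}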

Browser getting more responsive after a while on heavy web page

When we load in a very heavy web page with a huge html form and lots of event handler code on it, the page gets very laggy for some time, responding to any user interaction (like changing input values) with a 1-2 second delay.
Interestingly, after a while (depending on the size of the page and code to parse, but around 1-2 minutes) it gets as snappy as it normally is with average size web pages. We tried to use the profiler in the dev tools to see what could be running in the background but nothing surprising is happening.
No network traffic is taking place after the page load, neither is there any blocking code running and HTML parsing is long gone at the time according to the profiler.
My questions are:
do browsers do any kind of indexing on the DOM in the background to speed up queries of elements?
any other type of optimization like caching repeated function call results?
what causes it to "catch up" after a while?
Note: it is obvious that our frontend is quite outdated and inefficient but we'd like to squeeze out everything from it before a big rewrite.
Yes, modern browsers, namely modern JavaScript runtimes, perform many optimisations during load and, more importantly, during the page lifecycle. One of them is lazy / just-in-time compilation, which in general means that the runtime observes demanding or frequently executed patterns and translates them into a faster, "closer to the metal" format, often at the cost of higher memory consumption. An amusing fact is that such optimisations often make "seemingly ugly and bad but predictable" code faster than well-thought-out, complex, "hand-crafted and optimised" code.
But I'm not completely sure this is the main cause of the phenomenon you are describing. Initial slowness and unresponsiveness is more often caused by the battle between network requests, blocking code, HTML and CSS parsing, and CPU/GPU rendering, i.e. the wire/cache -> memory -> CPU/GPU loop, which is not that dependent on the JavaScript optimisations mentioned above.
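One way to check whether the main thread really is busy during that sluggish first minute (rather than the runtime merely warming up) is to watch for long tasks. Chromium-based browsers expose these via the Long Tasks API; a small sketch:

// Log any main-thread task longer than 50 ms, with its rough attribution.
// Start this observer as early as possible (e.g. an inline script in <head>)
// so it catches work that happens while the page is still settling down.
new PerformanceObserver((list) => {
  for (const task of list.getEntries()) {
    console.log(`Long task: ${Math.round(task.duration)} ms at ${Math.round(task.startTime)} ms`, task.attribution);
  }
}).observe({ type: 'longtask', buffered: true });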
Further reading:
http://creativejs.com/2013/06/the-race-for-speed-part-3-javascript-compiler-strategies/
https://developers.google.com/web/tools/chrome-devtools/profile/evaluate-performance/timeline-tool

Speed test of a web page : how to get the theoretical loading speed

There are many tools online to measure the speed of a web page.
They provide data such as the loading time of a page.
This loading time depends on the number of files downloaded at the same time and the connection speed (and many other things such as the network state, the content providers, so on).
However, because it is based on the speed of the connection, we don't have the theoretical loading time.
A browser downloads many resources at the same time, up to a certain limit (around 5 resources at once), so it is optimized to load the resources faster.
If we could set the connection speed to a fixed amount, the loading time of a page would "never" change.
So does anyone know a tool which computes this theoretical loading time of a web page?
I'd like to get this kind of results :
Theoretical loading time : 56 * t
With t equals the amount of time to download 1kb of data.
What do you mean by "theoretical loading time"? Such a formula would have a huge number of variables (Bandwidth, Round-Trip Time, Processor speed, Antenna Wakeup Time, Load on Server, Packet Loss, Whether TCP Connections Are Already Open, ...) which ones would you put into your theoretical calculation?
And what is the problem you are trying to solve? If you just want a more objective measure of site speed, you can use a tester like http://www.webpagetest.org/ which allows you to choose the network speed and then run many tests to find the distribution of load times.
Note also that there is not even agreement on when a page is finished loading! Most people measure the time until the onload handler is called, but that can easily be gamed by reaching onload prematurely and then doing the actual loading of resources with JS afterwards. Waiting until all resources are loaded can also be a bad measure, because many modern pages continually update themselves and load new resources.
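If a rough approximation is enough, you can sum the transfer sizes reported by the Resource Timing API and divide by a fixed bandwidth of your choosing. This ignores latency, connection setup, parallel downloads and processing time, so treat it as a lower bound; the bandwidth constant below is an arbitrary assumption:

// Lower-bound estimate: total bytes transferred divided by a fixed bandwidth.
// Ignores RTT, connection setup, parallelism and rendering time.
const BANDWIDTH_KB_PER_S = 1024; // assumed fixed connection speed (1 MB/s)

const entries = [
  performance.getEntriesByType('navigation')[0],
  ...performance.getEntriesByType('resource'),
];
const totalKB = entries.reduce((sum, e) => sum + (e.transferSize || 0), 0) / 1024;

console.log(`Total transferred: ${Math.round(totalKB)} kB`);
console.log(`Theoretical time at ${BANDWIDTH_KB_PER_S} kB/s: ${(totalKB / BANDWIDTH_KB_PER_S).toFixed(2)} s`);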
You can use any of these three tools: SpeedPage, DevTools, and WebPageTest. Read more in the "Test your website's loading time / mobile site loading speed" blog post.

What is an appropriate page processing time for a web application?

I'm working on a web application, and it's getting to the point where I've got most of the necessary features and I'm starting to worry about execution speed. So I did some hunting around for information and I found a lot about reducing page load times by minifying CSS/JS, setting cache control headers, using separate domains for static files, compressing the output, and so on (as well as basic server-side techniques like memcached). But let's say I've already optimized the heck out of all that and I'm concerned with how long it actually takes my web app to generate a page, i.e. the pure server-side processing time with no cache hits. Obviously the tricks for bringing that time down will depend on the language and underlying libraries I'm using, but what's a reasonable number to aim for? For comparison, I'd be interested in real-world examples of processing times for apps built with existing frameworks, doing typical things like accessing a database and rendering templates.
I stuck in a little bit of code to measure the processing time (or at least the part of it that happens within the code I wrote) and I'm generally seeing values in the range 50-150ms, which seems pretty high. I'm interested to know how much I should focus on bringing that down, or whether my whole approach to this app is too slow and I should just give it up and try something simpler. (Based on the Net tab of Firebug, the parts of processing that I'm not measuring typically add less than 5ms, given that I'm testing with both client and server on the same computer.)
FYI I'm working in Python, using Werkzeug and SQLAlchemy/Elixir. I know those aren't the most efficient technologies out there but I'm really only concerned with being fast enough, not as fast as possible.
EDIT: Just to clarify, the 50-150ms I quoted above is pure server-side processing time, just for the HTML page itself. The actual time it takes for the page to load, as seen by the user, is at least 200ms higher (so, 250-350ms total) because of the access times for CSS/JS/images (although I know that can be improved with proper use of caching and Expires headers, sprites, etc. which is something I will do in the near future). Network latency will add even more time on top of that, so we're probably talking about 500ms for the total client load time.
Better yet, here's a screenshot from the Net tab of Firebug for a typical example:
It's the 74ms at the top that I'm asking about.
IMHO, 50-150 ms of server-side processing time is fine in most circumstances. When I measure the speed of some very well-known websites, I rarely see something that fast. Most of the time, it is about 250 ms, often higher.
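If you want to sanity-check that kind of figure from the browser side, the Navigation Timing API exposes the server-side portion as roughly responseStart minus requestStart; a quick console sketch:

// Approximate server processing time (request sent -> first byte received),
// plus the full load time, from the browser's own navigation timing entry.
const nav = performance.getEntriesByType('navigation')[0];
console.log(`Server processing + first byte: ${Math.round(nav.responseStart - nav.requestStart)} ms`);
console.log(`Full page load: ${Math.round(nav.loadEventEnd)} ms`);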
Now, I want to underline three points.
Everything depends on the context. A home page or a page which will be accessed very frequently will suck a lot if it takes seconds to load. On the other hand, some rarely used parts of the website can take up to one second if the optimizations are too expensive.
The major concern of the users is to accomplish what they want quickly. It's not about the time taken to access a single page, but rather the time to access information or to accomplish a goal. That means that it's better to have one page taking 250 ms than requiring the user to visit three pages one after another to do the same thing, each one taking 150 ms to load.
Be aware of the perceived load time. For example, there is an interesting trick used on the Stack Overflow website. When doing something based on AJAX, like up/down-voting, first you see the effect, then the request is made to the server. For example, try to up-vote your own message. It will show you that the message is up-voted (the arrow will become orange); then, 200 ms later, the arrow will become gray and an error box will be displayed. So in the case of an up-vote, the perceived load time (the arrow becoming orange) is 1 ms, whereas the real load time spent doing the request is 100 ms.
EDIT: 200 ms is fine too. 500 ms will probably hurt a little if the page is accessed frequently or if the user expects the page to be fast (for example, AJAX requests are expected to be fast). By the way, I see on the screenshot that you are using several CSS files and ten PNG images. By combining CSS into one file and using CSS sprites, you can probably reduce the perceived load time, especially when dealing with network latency.
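As a tiny illustration of that perceived-load-time trick (the endpoint and class name below are made up), apply the UI change first and roll it back only if the request fails:

// Optimistic UI: show the result immediately, undo it if the server rejects it.
// '/vote' and the 'voted' class are hypothetical placeholders.
async function upvote(button) {
  button.classList.add('voted');          // perceived response: immediate
  try {
    const res = await fetch('/vote', { method: 'POST' });
    if (!res.ok) throw new Error('Vote rejected');
  } catch (err) {
    button.classList.remove('voted');     // roll back and surface the error
    alert(err.message);
  }
}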
Jakob Nielsen, a well-known speaker on usability, posted an article [1] on this a few days back. He suggests that under 1 second is ideal and under 100 ms is perfect; anything longer starts to interrupt the user's flow.
As other users have pointed out it depends on the context of that page. If someone is uploading a file they expect a delay. If they're logging in and it takes ten seconds they can start to get frustrated.
[1] http://www.useit.com/alertbox/response-times.html
I looked at some old JMeter results from when I wrote and ran a suite of performance tests against a web service. I'll attach some of them below; it's not apples-to-apples of course, but at least it's another data point.
Times are in milliseconds. Location Req and Map Req had inherent delays of 15000 and 3000 milliseconds, respectively. Invite included a quick call to a mobile carrier's LDAP server. The others were pretty standard, mainly database reads/writes.
sampler_label count average min max
Data Blurp 2750 185 30 2528
UserAuth 2750 255 41 2025
Get User Acc 820 148 29 2627
Update User Acc 4 243 41 2312
List Invitations 9630 345 47 3966
Invite 2750 591 102 4095
ListBuddies 5500 344 52 3901
Block Buddy 403 419 79 1835
Accept invite 2065 517 94 3043
Remove Buddy 296 411 83 1942
Location Req 2749 16963 15369 20517
Map Req 2747 3397 3116 5926
This software ran on a dedicated, decent virtual machine, tuned the same way production VMs were. The max results were slow, but my goal was to find the number of concurrent users we could support, so I was pushing it.
I think your numbers are absolutely ok. With regards to all the other stuff that makes websites seem slow, if you haven't, take a look at YSlow. It integrates nicely with Firebug and provides great information about how to make pages load faster.
50-150ms for page load time is fine - you do not need to optimize further at this point.
The fact is, so long as your pages are loading within a second, you are OK.
See this article, which discusses the effect of load time on conversion (a 100 ms increase ≈ 1% for Amazon).
