On mobile, even the simplest Odoo page loads far too slowly (Odoo.sh, v13).
For instance, take the simple contact form page:
https://www.samalife.fr/contact
PageSpeed test: score 35/100
https://pagespeed.web.dev/report?url=https%3A%2F%2Fwww.samalife.fr%2Fcontact
And it's even worse for a normal content page that includes images and videos:
https://www.samalife.fr/bas-du-dos
PageSpeed test: score 28/100
The speed diagnostic points to these bottlenecks:
JS SCRIPTS (2.4 s)
.../web.assets_common_lazy.js
.../web.assets_frontend_lazy.js
CSS STYLESHEETS (2.4 s)
.../web.assets_common.css
.../web.assets_frontend.css
UNUSED CSS (1.6 s)
.../web.assets_common.css
.../web.assets_frontend.css
SLOW SERVER RESPONSE (0.8 s)
Is there a way to improve the different aspects involved in these slow page loads on mobile: JS, CSS, server response?
How to implement lazy loading for CSS and JavaScript files?
How to enable compression and minification of CSS and JavaScript files in Odoo?
Does Odoo.sh offer alternative hosting plans for higher performance?
By comparison, for the same page (contactus), the RoseHosting demo:
https://odoo-demo.rosehosting.com/contactus
gets a score of 70/100.
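For the lazy-loading question, the generic browser-side pattern (a sketch, not Odoo-specific; the helper name and stylesheet path are mine) is to inject non-critical stylesheets only after the load event, so they don't block first paint:

```javascript
// Sketch: inject a non-critical stylesheet after first paint.
// The asset path below is a hypothetical example, not a real Odoo bundle.
function loadCssLazily(href) {
  var link = document.createElement('link');
  link.rel = 'stylesheet';
  link.href = href;
  document.head.appendChild(link);
  return link;
}

if (typeof window !== 'undefined') {
  // Defer until the page has finished its initial load.
  window.addEventListener('load', function () {
    loadCssLazily('/static/css/noncritical.css'); // hypothetical path
  });
}
```

Critical above-the-fold styles still need to ship inline or in a blocking stylesheet; only the rest should be deferred this way.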
I have been curious about this since I have been working with UI recently and wanted to know more about it.
I have been reading the Chrome network reference guide, but I am not sure I understand it correctly. I am mainly curious about the bottom row, the summary pane.
From what I can see, there were 30 requests whose total transferred size was 34.7 KB, but I am unsure about the figures that follow, so I would like to know what the remaining statistics mean.
1) 758 KB resources - Is this the total resources loaded and sent?
2) Finish: 448 ms - Is this the time it took for the entire website to finish loading?
3) DOMContentLoaded: 235 ms - Is this the time it took for the browser to parse the DOM being received?
4) Load: 421 ms - Is this the time it took for my browser to finish loading all necessary assets for the page like fonts, pictures, etc...?
"Transferred" means from the network; the rest (out of the 758 kB total) came from the cache.
"Finish" is the timestamp of the last resource, so it changes whenever a new request is made.
DOMContentLoaded is the timestamp of the DOMContentLoaded event:
fired when the initial HTML document has been completely loaded and parsed, without waiting for stylesheets, images, and subframes to finish loading
Load is the timestamp of the load event:
fired when a resource and its dependent resources have finished loading
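The last two numbers can be reproduced from a PerformanceNavigationTiming entry. A minimal sketch, using a stand-in timing object with the question's values (in a real page you would read performance.getEntriesByType('navigation')[0]):

```javascript
// Derive the DevTools summary durations from navigation-timing fields.
function summarize(t) {
  return {
    domContentLoaded: t.domContentLoadedEventEnd - t.startTime,
    load: t.loadEventEnd - t.startTime
  };
}

// Stand-in values shaped like the question's screenshot.
var timing = { startTime: 0, domContentLoadedEventEnd: 235, loadEventEnd: 421 };
console.log(summarize(timing)); // { domContentLoaded: 235, load: 421 }
```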
I was load-testing a website at 30 users/second and it was working fine, but now it cannot even serve 25 users/second. The website is a search-engine-style site. Between these two tests (30 users/second and 25 users/second), we started the crawler to get some sites crawled and then stopped it again before the load testing. 30 users/second worked fine before the crawler was turned on. Elasticsearch is used as the DB for the website, and it went down saying no nodes available.
I've used a standard thread group. Below is the configuration:
Total Samples: 250
Ramp up (sec) : 10
Loop count : 1
When I checked the results in the table, everything shows green, but when we hit the site it throws a NoNodesAvailable exception.
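As a sanity check, the arrival rate implied by that thread group (all values taken from the configuration above) matches the failing 25 users/second:

```javascript
// Thread-group arithmetic from the question's configuration.
var totalSamples = 250; // total threads (virtual users)
var rampUpSec = 10;     // all threads are started within this window
var loopCount = 1;      // each thread sends one request

var usersPerSecond = (totalSamples * loopCount) / rampUpSec;
console.log(usersPerSecond); // 25
```

So the test itself is driving exactly the rate that now fails; the regression is on the Elasticsearch side, not in the test plan.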
I have an AJAX query on my client that passes two parameters to a server:
var url = window.location.origin + "/instanceStats"
$.getJSON(url, { 'unit' : unit, "stat" : stat }, function(data) {
instanceData[key] = data;
var count = showInstanceStats(targetElement, unit, stat, limiter);
});
The server itself is a very simple Python Flask application. On that particular URL, it grabs the "unit" and "stat" parameters from the query to determine the name of a CSV file and line within that file, grabs the line, and sends the data back to the client formatted as JSON (roughly 1KB).
Here is the funny thing: When I measure the time it takes for the data to come back, I observe that some queries are fast (between 20 and 40 ms), and some queries are slow (between 320 and 350 ms). Varying the "stat" parameter (i.e. selecting a different line in the CSV) doesn't seem to have any impact. The fast and slow queries usually switch back and forth (i.e. all even queries are fast, all odd ones are slow). The Python server itself reports roughly the same time for each query.
AJAX itself doesn't seem to have any impact either, as I can take the url that is constructed in the JS and paste it into the browser myself and get the same behavior. Here are some measurements from two subsequent queries:
Fast: http://i.imgur.com/VQ7qopd.png
Slow: http://i.imgur.com/YuG0ROM.png
This seems to be Chrome-specific: I've tried it on Firefox, and the same experiment yields roughly the same query time every time (between 30 and 50 ms). This is unfortunate, as I want to deploy on both Chrome and Firefox.
What's causing this behavior, and how can I fix it?
I've run into this also. It only seems to happen when using localhost. If you use 127.0.0.1 (or even the computer name), it will not have the extra delay.
I'm having it too, and it's exactly the same: my Node.js application serves Ajax requests and no matter which /url I request it's either 30ms or 300ms and it switches back and forth: odd requests are long, even requests are short.
The thing I see in Chrome Web Inspector (aka Chrome DevTools) is that there is a long gap between "DNS lookup" and "Initial Connection".
They say it's OCSP related here:
http://www.webpagetest.org/forums/showthread.php?tid=12357
OCSP is some kind of certificate validation protocol:
https://en.wikipedia.org/wiki/Online_Certificate_Status_Protocol
Moving from localhost to 127.0.0.1 seems to fix it: response times are 30ms now.
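The same fix can be applied where the Ajax URL is built. A minimal sketch (the helper name is mine) that pins a localhost origin to the IPv4 loopback, assuming the dev server listens on IPv4:

```javascript
// Rewrite a localhost origin to 127.0.0.1 so the browser doesn't take
// the slow fallback path. Assumes the server is reachable over IPv4.
function ipv4Origin(origin) {
  return origin.replace('//localhost', '//127.0.0.1');
}

// In the page you would build the URL from window.location.origin:
// var url = ipv4Origin(window.location.origin) + "/instanceStats";
console.log(ipv4Origin('http://localhost:5000')); // http://127.0.0.1:5000
```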
I am using the simplehtmldom parser for my recent web scraping project: a price-comparison website built with CodeIgniter. The website has to fetch product names, descriptions, and prices from different shopping websites. Here is my code:
$this->dom->load_file('http://www.site1.com');
$price1 = $this->dom->find("span[itemprop=price]");
$this->dom->load_file('http://www.site2.com');
$price2 = $this->dom->find("div.price");
$this->dom->load_file('http://www.site3.com');
$price3 = $this->dom->find("div.priceBold");
$this->dom->load_file('http://www.site4.com');
$price4 = $this->dom->find("span.fntBlack");
$this->dom->load_file('http://www.site5.com');
$price5 = $this->dom->find("div.price");
The above code takes approximately 15-20 seconds to load the result onto the screen. When I try with only one site, it takes just 2 seconds. Is this how simplehtmldom works with multiple domains, or is there a way to optimize it?
PHP Simple HTML DOM Parser has a known memory-leak issue, so before trying to load a new page, clear the previous one using:
$this->dom->clear();
unset($this->dom);
If this doesn't change anything, then one of your websites is taking a long time to respond... you'll have to check them one by one to find the culprit xD
I request two locations. From an XML file I get the longitude and the latitude of a location: first the closest cafe, then the closest school.
$.get('https://maps.googleapis.com/maps/api/place/nearbysearch/xml?location=' +
    home_latitude + ',' + home_longtitude +
    '&rankby=distance&types=cafe&sensor=false&key=X', function(xml)
{
    verander($(xml).find("result:first").find("geometry:first").find("location:first").find("lat").text(),
             $(xml).find("result:first").find("geometry:first").find("location:first").find("lng").text());
});
$.get('https://maps.googleapis.com/maps/api/place/nearbysearch/xml?location=' +
    home_latitude + ',' + home_longtitude +
    '&rankby=distance&types=school&sensor=false&key=X', function(xml)
{
    verander($(xml).find("result:first").find("geometry:first").find("location:first").find("lat").text(),
             $(xml).find("result:first").find("geometry:first").find("location:first").find("lng").text());
});
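The two $.get calls above differ only in the types value, so they can share one helper. A sketch under the question's assumptions (same jQuery parsing, same placeholder key X; placesUrl and nearest are names I made up):

```javascript
// Build the Places nearby-search URL; parameters mirror the question's calls.
function placesUrl(lat, lng, type, key) {
  return 'https://maps.googleapis.com/maps/api/place/nearbysearch/xml' +
         '?location=' + lat + ',' + lng +
         '&rankby=distance&types=' + type + '&sensor=false&key=' + key;
}

// One fetch-and-parse path shared by both place types.
function nearest(type, cb) {
  $.get(placesUrl(home_latitude, home_longtitude, type, 'X'), function (xml) {
    var loc = $(xml).find('result:first').find('geometry:first').find('location:first');
    cb(loc.find('lat').text(), loc.find('lng').text());
  });
}

// Usage (in the page, where jQuery, the coordinates, and verander exist):
// nearest('cafe', verander);
// nearest('school', verander);
```

This doesn't reduce the number of API requests, but it removes the duplication.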
But as you can see, I call the function verander(latitude, longitude) twice.
function verander(google_lat, google_lng)
{
    var bryantPark = new google.maps.LatLng(google_lat, google_lng);
    var panoramaOptions =
    {
        position:bryantPark,
        pov:
        {
            heading: 185,
            pitch:0,
            zoom:1,
        },
        panControl : false,
        streetViewControl : false,
        mapTypeControl: false,
        overviewMapControl: false,
        linksControl: false,
        addressControl:false,
        zoomControl : false,
    }
    map = new google.maps.StreetViewPanorama(document.getElementById("map_canvas"), panoramaOptions);
    map.setVisible(true);
}
Would it be possible to fetch these two locations in only one request (perhaps via an array)? I know it sounds silly, but I really want to know if there isn't a backdoor to reduce these Google Maps requests.
FTR: This is what a request is for Google:
What constitutes a 'map load' in the context of the usage limits that apply to the Maps API? A single map load occurs when:
a. a map is displayed using the Maps JavaScript API (V2 or V3) when loaded by a web page or application;
b. a Street View panorama is displayed using the Maps JavaScript API (V2 or V3) by a web page or application that has not also displayed a map;
c. a SWF that loads the Maps API for Flash is loaded by a web page or application;
d. a single request is made for a map image from the Static Maps API.
e. a single request is made for a panorama image from the Street View Image API.
So I'm afraid it isn't possible, but hey, suggestions are always welcome!
You're calling the Places API twice and loading Street View twice. So that's four calls, but I think they only count those two Street Views as one if you're loading them on one page. Also, your Places calls are client-side, so they won't count towards your limits.
But to answer your question: there's no loophole to get around the double load, since you want to show the user two Street Views.
What I would do is not load anything until the client asks. Instead, have a couple of call-to-action buttons like <button onclick="loadStreetView('cafe')">Click here to see Nearby Cafe</button>, and when clicked they will call the nearby search and load the Street View. Since this happens only on client request, your page loads will never increment the usage counts, for example when your site gets crawled by search engines.
More on those usage limits
The Google Places API has different usage limits than Maps: https://developers.google.com/places/policies#usage_limits
Users with an API key are allowed 1 000 requests per 24 hour period
Users who have verified their identity through the APIs console are allowed 100 000 requests per 24 hour period. A credit card is required for verification, by enabling billing in the console. We ask for your credit card purely to validate your identity. Your card will not be charged for use of the Places API.
100,000 requests a day if you verify yourself. That's pretty decent.
As for Google Maps, https://developers.google.com/maps/faq#usagelimits
You get 25,000 map loads per day, and it says:
In order to accommodate sites that experience short term spikes in usage, the usage limits will only take effect for a given site once that site has exceeded the limits for more than 90 consecutive days.
So if you go over a bit now and then, it seems like they won't mind.
P.S. You have extra commas after zoom:1 and after zoomControl : false; they shouldn't be there and will cause errors in some browsers, like IE. You are also missing a semicolon after var panoramaOptions = { ... }, before map = new.