I'm trying to optimize my website's speed according to Google PageSpeed and GTmetrix YSlow. One of the recommendations is to "use cookie-free domains"; there are 12 components that are not cookie-free.
The 12 components are images and some CSS and JS files.
What is the problem and how to fix it?
The "cookie-free domains" recommendation relates to having domains that send a cookie with every request.
Say for example you serve all of your images from your main domain and your main domain has cookies on it.
Every one of those images will have a cookie sent with the request for the file, which the server ignores anyway.
This is completely unnecessary traffic as the cookie serves no purpose.
To fix this, either use a cookie-free CDN or set up a subdomain on your server that sets no cookies.
Then serve all of your static assets, such as fonts, images, JS and CSS, from this subdomain/CDN, as in the sketch below.
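As an illustration (static.example.com is a placeholder for whatever cookie-free host you set up), the markup change is just repointing the asset URLs:

<!-- Before: the browser attaches www.example.com's cookies to every image request -->
<img src="https://www.example.com/images/logo.png" alt="Logo">
<!-- After: static.example.com never sets cookies, so none are sent -->
<img src="https://static.example.com/images/logo.png" alt="Logo">

Note that this only helps if your cookies are scoped to www.example.com; a cookie set for .example.com is sent to every subdomain as well.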
On one of my websites I installed the NitroPack.io Performance & SEO Booster extension.
Before installing the extension, my website scored Mobile: 27, Desktop: 32.
After installing it, the scores were Mobile: 98, Desktop: 99.
I really appreciate this extension, and its reviews are mostly very good.
Using Joomla's System - Page Cache plugin, my webpage now loads in around 4-5 seconds.
But I have a few pages which are shown only to registered users, and I just checked that they take around 10-15 seconds. When I inspected them in Chrome I saw a few things: I have a live chat that takes around 2 seconds to load, among other items. But the live chat also shows on the homepage, and that page is fast.
I wanted to know whether the Joomla System Cache plugin simply does not work for pages visible only to registered users, or whether there is another plugin I can use to speed up this type of page.
Joomla has a JCH Optimize plugin which will decrease your website's load time.
It combines all CSS and JS files into one file; that file is stored in the cache, so the website speeds up.
This plugin should be helpful to you.
Thanks
Are you using Joomla?
For some components the problem is only the page cache; if so, you just need to clear the cache.
If you need to speed up the Joomla site, follow these basic steps:
Enable Gzip Compression
Using the Gzip compression feature, you can compress your website pages before sending them to the user. After that, they will be decompressed by the user's browser, and this process takes less time than transferring uncompressed pages. (A sample Apache sketch follows these steps.)
Enable Cache System
Optimization Settings (Images, CSS, JavaScript…)
Now Check Your Joomla Website Speed
These steps may be useful to speed up your site.
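For the Gzip step, here is a minimal .htaccess sketch, assuming an Apache server with mod_deflate enabled (the MIME-type list is illustrative; Joomla also has its own Gzip Page Compression switch under Global Configuration > Server):

# .htaccess sketch - assumes Apache with mod_deflate enabled
<IfModule mod_deflate.c>
  # Compress text-based responses before they leave the server
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>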
I wanted to know whether the Joomla System Cache plugin does not work for pages visible only to registered users.
Per the Joomla! Documentation, Page Caching:
Only caches pages for guest visitors (not for logged in visitors)
or whether there is another plugin I can use to speed up this type of page.
Aside from JCH Optimize (which was already mentioned), another component I recommend is JotCache, which is far better than the default Joomla! cache.
You may also use GTmetrix to analyze your site against both Google PageSpeed and Yahoo! YSlow.
Finally, you may try using a CDN to speed up resource delivery. Here are a few:
MaxCDN
Amazon CloudFront
Azure CDN
CDN77
CDNetworks
CDNlion
CacheFly
EdgeCast Networks
KeyCDN
SkyparkCDN
You can use CDN for Joomla! to incorporate the CDN technology.
Overall, your best bet is going to be a combination of the CDN and JCH settings to trim down the overall weight of the site using GTmetrix to compare the site after each change.
Further reading: Joomla Performance & Speed
I like that Flash CC 2015 Canvas uses CreateJS; however, it's not working in DoubleClick, as the CDN serving the .js files delivers them over HTTP and DoubleClick needs them served over HTTPS.
Is CreateJS aware of this, and do they have updated CDN links that we can use when uploading HTML5 creative to DoubleClick, Sizmek, or other ad networks?
Asset is not SSL-compliant. The following resources are
non-compliant: http://code.createjs.com/easeljs-0.8.1.min.js
http://code.createjs.com/tweenjs-0.6.1.min.js
Did you try removing the scheme (http:)? All that should be left is //code.createjs.com/easeljs-0.8.1.min.js. I got a similar complaint.
The security trick is to make all the http:// calls into https://. Just add the s.
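For example, using the easeljs file from the error above (the script tag placement is illustrative):

<!-- Before: flagged as non-SSL-compliant -->
<script src="http://code.createjs.com/easeljs-0.8.1.min.js"></script>
<!-- After: same file, served over HTTPS -->
<script src="https://code.createjs.com/easeljs-0.8.1.min.js"></script>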
Doubleclick now hosts CreateJS on their own CDN: https://support.google.com/richmedia/answer/6307288
Due to the rise of RTB and big ad inventories, non-secure protocol URLs are not allowed.
So, as said, you could do either: // or https://.
Also, many ad servers do not accept a folder structure. CreateJS creates an "images" folder for the assets, so it is better to have every asset at root level.
I came across this question on Stack Overflow. It made me curious to ask.
Q-Part 1: It says the advantage of loading images from another domain is that it speeds up page load, because the browser can send more HTTP requests at a time. My question is: I have two domains, but both are on the same shared hosting; more precisely, all files load from the same host. Will the browser still send more HTTP requests in parallel?
Q-Part 2: I am thinking of redirecting one domain to another, as follows:
www.websitedevelopers.com.pk to www.websitedevelopers.pk.
If I redirect only the HTML pages from .com.pk to .pk, and not the images etc., will it still get more parallel HTTP requests?
Note: I just jumped into website development, so count me as a newbie.
Answer Q-Part 1: There won't be any speed issues if you run multiple sites on shared hosting.
Answer Q-Part 2: Redirecting to another domain will not affect your HTTP requests.
There are several other ways to speed up your website, like:
Size images before upload
Remove unnecessary plugins
Enable browser caching
I checked my site with Google PageSpeed.
It said to specify an expiration for my resources.
Can anyone tell me how to leverage browser caching on Blogger?
You cannot set custom expiration for resources served by Blogger - that part is already done for you by Google servers. Having said that, if you serve content from any third party domains (which you are), then you should look into enabling compression and caching on those hosts.
Of course, sometimes this will be out of your control, e.g., third-party widgets which you do not control directly.
You will have to set the Expires header in your hosting settings. If you have hosted your website on Google App Engine (which is free), you can set the expiration in appengine-web.xml to 10 days.
<static-files>
  <include path="/assets/**.css" expiration="10d" />
  <include path="/assets/**.js" expiration="10d" />
</static-files>
I have gone through your site and have seen that you use Blogger for articles; you cannot set the Expires header for Blogger images and posts.
I have used Google's infrastructure to host my images, CSS and JS files. If you want to improve the PageSpeed score of your custom JS and CSS files, my blog post can help you host static content on Google. My blog's PageSpeed score is around 94.
Disclaimer: I have authored the post mentioned in the link.
If you are hosting your CSS and JS files in some other location, you should set the Expires header there; a sketch for the Apache web server follows.
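A minimal .htaccess sketch for that case, assuming Apache with mod_expires enabled (the paths and the 10-day lifetime mirror the App Engine example above):

# .htaccess sketch - assumes mod_expires is enabled (a2enmod expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType text/css "access plus 10 days"
  ExpiresByType application/javascript "access plus 10 days"
</IfModule>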
I haven't had a huge opportunity to research the subject but I figure I'll just ask the question and see if we can create a knowledge base on the subject here.
1) Using subdomains will force a client-side cache; is this the default behaviour, or is there an easy way for a client to disable it? I'm mostly curious about what percentage of users I should expect this to affect.
2) What all will be cached? Images? Stylesheets? Flash SWFs? JavaScript? Everything?
3) I remember reading that you must use a subdomain or www in your URL for this to work; is this correct? (And does this mean SO won't allow it?)
I plan on integrating this into all of my websites eventually, but first I am going to try it on a network of Flash game websites. The website itself will stay at www.example.com, but instead of using www.example.com/images, www.example.com/stylesheets, www.example.com/javascript, and www.example.com/swfs, I will just create subdomains that point to them (img.example.com, css.example.com, js.example.com and swf.example.com respectively). Is this the best course of action?
Using subdomains for content elements isn't so much to force caching, but to trick a browser into opening more connections than it might otherwise do. This can speed up page load time.
Caching of those elements is entirely down to the HTTP headers delivered with that content.
For static files like CSS, JS, etc., a server will typically tell the client when the file was last modified, which allows the browser to re-request the file with an "If-Modified-Since" header carrying that timestamp. The specifics of how to improve on this by adding extra caching headers depend on which webserver you use. For example, with Apache you can use the mod_expires module to set the Expires header, or the Header directive to output other types of cache-control headers.
As an example, if you had a subdirectory with your CSS files in it and wanted to ensure they were cached for at least an hour, you could place a .htaccess in that directory with these contents:
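# assumes the Apache mod_expires module is enabled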
ExpiresActive On
ExpiresDefault "access plus 1 hours"
Check out YSlow's documentation. YSlow is a plugin for Firebug, the amazing Firefox web development plugin. There is lots of good info on a number of ways to speed up your page loads, one of which is using one or more subdomains to encourage the browser to do more parallel object loads.
One thing I've done on two Django sites is to use a custom template tag to create pseudo-paths to images, css, etc. The path contains the time-last-modified as a pseudo directory. This path component is stripped out by an Apache .htaccess mod_rewrite rule. The object is then given a 10 year time-to-live (ExpiresDefault "now plus 10 years") so the browser will only load it once. If the object changes, the pseudo path changes and the browser will fetch the updated object.
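A hedged sketch of the Apache side of that setup; the URL pattern and the /static/ path are hypothetical stand-ins, not the poster's exact rules:

# .htaccess sketch - assumes mod_rewrite and mod_expires are enabled, and that
# the template tag emits URLs like /static/1699999999/css/site.css, where the
# numeric segment is the file's last-modified time (hypothetical pattern).
RewriteEngine On
# Strip the timestamp pseudo-directory so the real file on disk is served.
RewriteRule ^static/[0-9]+/(.*)$ /static/$1 [L]
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresDefault "now plus 10 years"
</IfModule>

Because the timestamp changes whenever the file does, the long lifetime is safe: a modified file gets a brand-new URL that no browser has cached yet.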