Using the Joomla System - Page Cache plugin, my web pages now load in around 4-5 seconds.
But I have a few pages which are shown only to registered users, and I just checked that they take around 10-15 seconds. When I inspected with Chrome, I could see a few things: I have a live chat widget, which takes around 2 seconds, among other items. But the live chat also appears on the homepage, and that page is fast.
I wanted to know whether the Joomla System - Page Cache plugin does not work for pages visible to registered users, or whether there is another plugin I can use to speed up this type of page.
Joomla has a JCH Optimize plugin that will reduce your website's load time.
It combines all CSS and JS files into one file. That file is stored in the cache, so the website speeds up.
This plugin should be helpful to you.
Thanks
Are you using Joomla? Some components have problems with the page cache, in which case you need to clear the cache.
To speed up the Joomla site, follow these basic steps:
Enable Gzip Compression
Using the Gzip compression feature, you can compress your website pages before sending them to the user; they are then uncompressed by the user's browser. This process takes less time than transferring uncompressed pages (see the sketch after these steps).
Enable Cache System
Optimization Settings (Images, CSS, JavaScript…)
Now Check Your Joomla Website Speed
These steps may be useful to speed up your site.
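As a rough sketch of the Gzip step (assuming Apache with mod_deflate available; in Joomla itself you can also just enable Gzip Page Compression under Global Configuration > Server), the .htaccess equivalent would be:
<IfModule mod_deflate.c>
    # Compress text-based responses before they are sent to the browser
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>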
I wanted to know whether the Joomla System - Page Cache plugin does not work for pages visible to registered users.
Per the Joomla! Documentation, Page Caching:
Only caches pages for guest visitors (not for logged in visitors)
or whether there is another plugin I can use to speed up this type of page.
Aside from JCH Optimize (which was already mentioned), another component I recommend is JotCache, which is far better than just the Joomla! default cache.
You may also use GTmetrix to analyze your site against both Google PageSpeed and Yahoo! YSlow.
Finally, you may try using a CDN to speed up resource delivery. Here are a few:
MaxCDN
Amazon CloudFront
Azure CDN
CDN77
CDNetworks
CDNlion
CacheFly
EdgeCast Networks
KeyCDN
SkyparkCDN
You can use CDN for Joomla! to incorporate the CDN technology.
Overall, your best bet is going to be a combination of the CDN and JCH settings to trim down the overall weight of the site, using GTmetrix to compare the site after each change.
Further reading: Joomla Performance & Speed
I have a strange issue with Joomla content caching: the articles themselves take up to 10 minutes to refresh their titles, images, and text.
I have disabled System - Page Cache.
Note that I am also using Cloudflare.
Thank you.
If I'm not mistaken, Cloudflare has some serious problems with caching. Check with them; they may send you some lines to add to your .htaccess to disable their caching.
If you have specifically enabled page rules in Cloudflare to cache static content, then it's possible this is Cloudflare related. It's easy enough to check by deleting any page rules which could cause this.
I have dozens of Joomla sites using Cloudflare and I've never seen a problem combining them, nor seen Cloudflare try to cache articles by default.
As you may know, by default Cloudflare caches static content like JavaScript and CSS files, but from what you describe, it seems unlikely that this is the cause.
I know you mention that articles are being cached, but if you have double-checked that:
- Extensions > Plugins > System - Page Cache is disabled
- You have cleared System > Page Cache
- You don't have any other caching enabled on your site
- You haven't specified custom ExpiresByType or mod_headers settings in .htaccess
then try setting System > Global Configuration > System > Cache to Off, and in the same window, if you have selected memcache(d) as the cache handler, change it to File.
These last settings under System > Cache should only affect module caching, not article caching, but it doesn't hurt to rule them out.
Good luck!
So I'm wondering how browsers treat requests for images. I'm hoping to use a cdn for serving product images on my website. I'd also like to use the cdn for serving button images and images used in my css.
The problem with this is that I don't have control over the Expires headers (Rackspace Files is what I'm looking into).
See, say I have a large image file as the background of my home page. The page is accessed often, but the image stays the same. Is the browser going to request this image every time?
Or should I just use a cdn for my product images?
Caching is quite a broad subject. I suggest you start by reading about the different kinds of caching here http://www.mnot.net/cache_docs/#BROWSER and about how caching works here http://www.web-caching.com/mnot_tutorial/how.html
Now, to answer your question: assuming the user has caching enabled and the CDN response headers are properly configured, a user visiting your page multiple times will only request that background image once, until the cache expires or the cached files are cleared.
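As a minimal sketch of "properly configured" headers, assuming you served the image from your own PHP origin (since on Rackspace you don't control the headers; the file name is illustrative):
<?php
// Let the browser cache this image for 30 days; until then, repeat
// visits to the page won't re-request it at all.
$maxAge = 30 * 24 * 60 * 60;
header('Cache-Control: public, max-age=' . $maxAge);
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $maxAge) . ' GMT');
header('Content-Type: image/png');
readfile('background.png'); // hypothetical file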
AFAIK you do necessarily need to add caching headers to your images to enable browser caching. This is a great tutorial about it.
Additionally, you can read this article from Yahoo to get a brief overview of the topic.
In particular, review these topics from the article:
Minimize HTTP Requests
Add an Expires or a Cache-Control Header
Use a Content Delivery Network
Hope it helps you
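For instance, a minimal .htaccess sketch (assuming Apache with mod_headers) that adds such a caching header to all images:
<IfModule mod_headers.c>
    <FilesMatch "\.(png|jpe?g|gif)$">
        # Let browsers keep images for 30 days before re-requesting them
        Header set Cache-Control "public, max-age=2592000"
    </FilesMatch>
</IfModule>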
If I understand correctly, a browser caches images, JS files, etc. based on the file name. So there's a danger that if one such file is updated (on the server), the browser will use the cached copy instead.
A workaround for this problem is to rename all files (as part of the build) such that the file name includes an MD5 hash of its contents, e.g.
foo.js -> foo_AS577688BC87654.js
me.png -> me_32126A88BC3456BB.png
However, in addition to renaming the files themselves, all references to these files must be changed. For example, a tag such as <img src="me.png"/> should be changed to <img src="me_32126A88BC3456BB.png"/>.
Obviously this can get pretty complicated, particularly when you consider that references to these files may be dynamically created within server-side code.
Of course, one solution is to completely disable caching in the browser (and any caches between the server and the browser) using HTTP headers. However, having no caching will create its own set of problems.
Is there a better solution?
Thanks,
Don
The best solution seems to be to version filenames by appending the last-modified time.
You can do it this way: add a rewrite rule to your Apache configuration, like so:
RewriteRule ^(.+)\.(.+)\.(js|css|jpg|png|gif)$ $1.$3
This will internally rewrite any "versioned" URL to the "normal" one. The idea is to keep your filenames the same, but to benefit from caching. The alternative of appending a parameter to the URL will not be optimal with some proxies that don't cache URLs with parameters.
Then, instead of writing:
<img src="image.png" />
Just call a PHP function:
<img src="<?php versionFile('image.png'); ?>" />
With versionFile() looking like this:
function versionFile($file){
    // e.g. /images/image.png -> /images/image.1234567890.png, using the
    // file's last-modified time as the version segment (only the last dot
    // is touched, so names like jquery.min.js survive intact)
    $path = pathinfo($file);
    $ver = filemtime($_SERVER['DOCUMENT_ROOT'].$file);
    echo $path['dirname'].'/'.$path['filename'].'.'.$ver.'.'.$path['extension'];
}
And that's it! The browser will ask for image.123456789.png, Apache will rewrite this to image.png, so you benefit from caching in all cases and won't have any out-of-date issues, without having to bother with manual filename versioning.
You can see a detailed explanation of this technique here: http://particletree.com/notebook/automatically-version-your-css-and-javascript-files/
Why not just add a querystring "version" number and update the version each time?
foo.js -> foo.js?version=5
There is still a bit of work during the build to update the version numbers, but filenames don't need to change.
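A minimal PHP sketch of the idea (the constant and helper are illustrative, not from the answer):
<?php
// Bumped by the build whenever any asset changes.
define('ASSET_VERSION', 5);

// foo.js -> foo.js?version=5
function versionedUrl($file) {
    return $file . '?version=' . ASSET_VERSION;
}
?>
<script src="<?php echo versionedUrl('foo.js'); ?>"></script>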
Renaming your resources is the way to go, although we use a build number and embed that into the file name instead of an MD5 hash
foo.js -> foo.123.js
as it means that all your resources can be renamed in a deterministic fashion and resolved at runtime.
We then use custom controls to generate links to resources on page load, based on the build number, which is stored in an app setting.
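The original uses custom (ASP.NET-style) controls; as a hedged, language-neutral sketch of the same renaming in PHP (names are illustrative):
<?php
// Assumed to be read from an app setting that the build updates.
$buildNumber = 123;

// foo.js -> foo.123.js, deterministically, at render time.
function buildVersionedName($file, $build) {
    $dot = strrpos($file, '.');
    return substr($file, 0, $dot) . '.' . $build . substr($file, $dot);
}

echo buildVersionedName('foo.js', $buildNumber); // prints foo.123.js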
We followed a similar pattern to PJP, using Rails and Nginx.
We wanted user avatar images to be browser cached, but on an avatar's change we needed the cache to be invalidated ASAP.
We added a method to the avatar model to append a timestamp to the file name:
return "/images/#{sourcedir}/#{user.login}-#{self.updated_at.to_s(:flat_string)}.png"
In all places in the code where avatars were used, we referenced this method rather than a URL. In the Nginx configuration, we added these rewrites:
rewrite "^/images/avatars/(.+)-[\d]{12}.png" /images/avatars/$1.png;
rewrite "^/images/small-avatars/(.+)-[\d]{12}.png" /images/small-avatars/$1.png;
This meant if a file changed, its URL in the HTML changed, so the user's browser made a new request for the file. When the request reached Nginx, it got rewritten to the simple name of the file.
I would suggest using ETag-based caching in this situation; see http://en.wikipedia.org/wiki/HTTP_ETag. You can then use the hash as the ETag. A request will still be submitted for each resource, but the browser will only download items that have changed since the last download.
Read up on your web server's / platform's docs on how to use ETags properly; most decent platforms have built-in support.
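A minimal PHP sketch of the idea (most platforms and servers handle this automatically for static files; the file name is illustrative):
<?php
$file = 'foo.js';                    // hypothetical resource
$etag = '"' . md5_file($file) . '"'; // content hash used as the ETag

header('ETag: ' . $etag);

// If the browser sent the same ETag back, its cached copy is still valid.
if (isset($_SERVER['HTTP_IF_NONE_MATCH']) &&
    trim($_SERVER['HTTP_IF_NONE_MATCH']) === $etag) {
    header('HTTP/1.1 304 Not Modified');
    exit;
}

header('Content-Type: application/javascript');
readfile($file);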
Most modern browsers send an If-Modified-Since header whenever they re-request a cacheable resource. However, not all browsers support the If-Modified-Since header.
There are three ways to "force" the browser to fetch a fresh copy of a cached resource.
Option 1 Create a query string with a version number, e.g. src="script.js?ver=21". The downside is that many proxy servers won't cache a resource with a query string. It also requires site-wide updates whenever a file changes.
Option 2 Create a naming system for your files, e.g. src="script083010.js". The downside, as with Option 1, is that this also requires site-wide updates whenever a file changes.
Option 3 Perhaps the most elegant solution: simply set up the caching headers last-modified and expires on your server (a minimal sketch follows). The main downside is that users may have to re-fetch resources because they expired yet never changed. Additionally, the last-modified header does not work well when content is being served from multiple servers.
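A hedged PHP sketch of Option 3 for a single file (file name and expiry time are assumptions):
<?php
$file  = 'script.js'; // hypothetical resource
$mtime = filemtime($file);

// Tell the browser when the file last changed and when to check again.
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 86400) . ' GMT');

// Honour a conditional request: if nothing changed, send 304 with no body.
if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) &&
    strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $mtime) {
    header('HTTP/1.1 304 Not Modified');
    exit;
}

header('Content-Type: application/javascript');
readfile($file);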
Here are a few resources to check out: Yahoo, Google, AskApache.com.
This is really only an issue if your web server sets a far-future "Expires" header (something like ExpiresDefault "access plus 10 years" in your Apache config). Otherwise, a browser will make a conditional GET, based on the modified time and/or the ETag. You can verify what is happening on your site by using a web proxy or an extension like Firebug (on the Net panel). Your question doesn't mention how your web server is configured and what headers it sends with static files.
If you're not setting a far-future Expires header, there's nothing special you need to do. Your web server will usually handle conditional GETs for static files based on last modified time just fine. If you are setting a far-future Expires header, then yes, you need to add some sort of version to the file name, as your question and the other answers have already mentioned.
I have also been thinking about this for a site I support where it would be a big job to change all references. I have two ideas:
1.
Set distant cache expiry headers and apply the changes you suggest for the most commonly downloaded files. For other files, set the headers so they expire after a very short time, e.g. 10 minutes. Then, if you have 10 minutes of downtime when updating the application, caches will be refreshed by the time users return to the site. General site navigation should also improve, as the files will only need downloading every 10 minutes, not on every click.
2.
Deploy each new version of the application to a different context that contains the version number, e.g. www.site.com/app_2_6_0/. I'm not really sure about this one, as users' bookmarks would be broken on each update.
I believe that a combination of solutions works best:
Setting cache expiry dates for each type of resource (image, page, etc.) appropriately for that resource, for example:
Your static "About", "Contact", etc. pages probably aren't going to change more than a few times a year, so you could easily put a cache time of a month on these pages.
Images used in these pages could have eternal cache times, as you are more likely to replace an image than to change one.
Avatar images might have an expiry time of a day.
Some resources need modified dates in their names. For example avatars, generated images, and the like.
Some things should never be cached: new pages, user content, etc. In these cases you should cache on the server, but never on the client side.
In the end you need to carefully consider each type of resource to determine what cache time to instruct the browser to use, and always be conservative if you are unsure. You can increase the time later, but it's much more painful to un-cache something. A rough sketch of these expiry settings follows.
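A hedged .htaccess sketch of the per-type expiries suggested above (assuming Apache with mod_expires enabled; the exact types and times are only examples):
ExpiresActive On
# Near-static pages ("About", "Contact"): a month is plenty
ExpiresByType text/html "access plus 1 month"
# Images tend to be replaced rather than edited, so cache them for years
ExpiresByType image/png "access plus 10 years"
ExpiresByType image/jpeg "access plus 10 years"
Shorter times for things like avatars can be set with a separate .htaccess in their own folder, and responses that should never be cached client-side can send a Cache-Control: no-store header instead.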
You might want to check out the approach taken by the Grails "uiperformance" plugin, which you can find here. It does a lot of the things you mention, but automates them (it sets the expiry time to a long time, then increments version numbers when files change).
So if you're using grails, you get this stuff for free. If you are not - maybe you can borrow the techniques employed.
Also, borrowed from the ui-performance page: read the following 14 rules.
ETags seemingly provide a solution for this...
As per http://httpd.apache.org/docs/2.0/mod/core.html#fileetag, we can have the server generate ETags based on file size (instead of time/inode/etc.). This generation should then be consistent across multiple server deployments.
Just enable it in your Apache configuration (/etc/apache2/apache2.conf):
FileETag Size
& you should be good!
That way, you can simply reference your images as <img src='/path/to/foo.png' /> and still use all the goodness of HTTP caching.
I haven't had a huge opportunity to research the subject but I figure I'll just ask the question and see if we can create a knowledge base on the subject here.
1) Using subdomains will force a client-side cache; is this by default, or is there an easy way for a client to disable it? I'm mostly curious about what percentage of users I should expect this to affect.
2) What all will be cached? Images? Stylesheets? Flash SWFs? Javascripts? Everything?
3) I remember reading that you must use a subdomain or www in your URL for this to work; is this correct? (And does this mean SO won't allow it?)
I plan on integrating this into all of my websites eventually, but first I am going to try it on a network of Flash game websites. The website itself will remain at www.example.com, but instead of using www.example.com/images, www.example.com/stylesheets, www.example.com/javascript, and www.example.com/swfs, I will just create subdomains that point to them (img.example.com, css.example.com, js.example.com, and swf.example.com respectively). Is this the best course of action?
Using subdomains for content elements isn't so much to force caching as to trick a browser into opening more connections than it might otherwise do. This can speed up page load time.
Caching of those elements is entirely down to the HTTP headers delivered with that content.
For static files like CSS, JS, etc., a server will typically tell the client when the file was modified, which allows a browser to ask for the file "If-Modified-Since" that timestamp. The specifics of how to improve on this by adding extra caching headers depend on which web server you use. For example, with Apache you can use the mod_expires module to set the Expires header, or the Header directive to output other types of cache-control headers.
As an example, if you had a subdirectory containing your CSS files and wanted to ensure they were cached for at least an hour, you could place a .htaccess file in that directory with these contents:
ExpiresActive On
ExpiresDefault "access plus 1 hours"
Check out YSlow's documentation. YSlow is a plugin for Firebug, the amazing Firefox web development plugin. There is lots of good info on a number of ways to speed up your page loads, one of which is using one or more subdomains to encourage the browser to do more parallel object loads.
One thing I've done on two Django sites is to use a custom template tag to create pseudo-paths to images, CSS, etc. The path contains the last-modified time as a pseudo directory. This path component is stripped out by an Apache .htaccess mod_rewrite rule. The object is then given a ten-year time-to-live (ExpiresDefault "now plus 10 years"), so the browser will only load it once. If the object changes, the pseudo-path changes and the browser will fetch the updated object. A hedged sketch of the rewrite rule follows.
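As a sketch (the /media prefix and the timestamp format are assumptions, not the original site's paths), the .htaccess rule that strips the pseudo directory might look like:
RewriteEngine On
# /media/1288291212/css/site.css -> /media/css/site.css
RewriteRule ^media/[0-9]+/(.*)$ /media/$1 [L]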