Website loads images with a 1-2 second delay. Could I improve the performance somehow? - page-load-time

Recently I made a website for my photography: http://www.simotamas.com
I am a newbie, so it's not the best site, but it works fine for me. I have only one problem: when the site is loaded on a device for the first time, the gallery takes up to 1-2 seconds to load.
Could you please check whether I messed something up in the code?
Or should I make the pictures even smaller?
Is there any way I could improve the loading performance?
I would be really thankful for any advice.

Some points you can consider:
Use low-resolution thumbnails for the preview and load the actual image only when it is clicked.
Load the images in the visible part of the page first, then load the images further down; this may affect the user experience (see the lazy-loading sketch after this list).
If you have spare CPU capacity, use caching or compression libraries such as https://nielse63.github.io/php-image-cache/ and benchmark them carefully.
Enable gzip compression on your server if you are not using it already.
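As a rough sketch of the "load only the visible part first" point above: each image carries its real URL in a data-src attribute and is only fetched once it approaches the viewport. The attribute name and the 200px margin are assumptions for illustration, not something taken from your site.

    // Lazy-load gallery images: each <img data-src="..."> gets its real
    // source only when it scrolls near the viewport.
    const observer = new IntersectionObserver((entries, obs) => {
      for (const entry of entries) {
        if (entry.isIntersecting) {
          const img = entry.target;
          img.src = img.dataset.src;   // start the real download now
          obs.unobserve(img);          // each image only needs this once
        }
      }
    }, { rootMargin: '200px' });       // begin slightly before it is visible

    document.querySelectorAll('img[data-src]').forEach(img => observer.observe(img));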

The fact that your website doesn't wait for the image to load is considered a plus (look into asynchronous web page content loading for a good read). That said, you should compress your images before uploading them; tinypng.com is a nice tool for that. But since it's a photography website, compressing too far would reduce picture quality, so play with Photoshop's save settings to find your ideal compromise between quality and file size. Pictures are heavy: high definition and resolution will obviously result in heavier files to download.
Update: another thing you could do is display a (smaller) thumbnail and only load the full picture on request, i.e. the user clicks and the image opens in a new tab.
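A minimal sketch of that thumbnail idea, with placeholder file names: the page only ships the small file, and the full-resolution photo is downloaded when the visitor clicks and it opens in a new tab.

    <!-- Only the small thumbnail is part of the page; the full-size
         photo is fetched on click and opens in a new tab. -->
    <a href="photos/portrait-full.jpg" target="_blank" rel="noopener">
      <img src="photos/portrait-thumb.jpg" width="300" height="200"
           alt="Portrait (click for full resolution)">
    </a>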

It would help if you created smaller thumbnail versions of your images so the browser can initially load those for the overview, rather than downloading oversized images and scaling them down while rendering the page. An image should always be downloaded at the dimensions it is going to be presented at.
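That advice can be expressed directly in the markup with srcset/sizes, assuming you pre-generate a few widths of each image; the file names below are placeholders.

    <!-- The browser picks the candidate that matches the rendered size
         instead of always downloading the largest file. -->
    <img src="gallery/photo-400.jpg"
         srcset="gallery/photo-400.jpg 400w,
                 gallery/photo-800.jpg 800w,
                 gallery/photo-1600.jpg 1600w"
         sizes="(max-width: 600px) 100vw, 400px"
         alt="Gallery photo">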

Related

Website speed diagnosis - ExpressionEngine

We can't seem to get any answers as to why our website is so slow and continues to get slower. Can anyone tell me why http://www.kaboodle.com.au is slow and what we can do to fix it?
See report: https://gtmetrix.com/reports/www.kaboodle.com.au/yc1I3YXZ
I will break it down for you:
Scale and optimize the images you have. Use Photoshop's "Save for Web", JPEGmini or, if you use WordPress, the EWWW Image Optimizer. Some images are much bigger than the size they are displayed at on the website.
AddThis can sometimes slow down websites. Try moving it so it loads at the bottom of the page (see the example after this list).
Your most problematic loading element is the DNS. Check with your DNS provider whether they can fix it. In any case, you should use a CDN like Cloudflare or MaxCDN.
Do all of these and I think your page will load much, much faster. You can get to 2.7-3.5 s easily.
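For the AddThis point above, a hedged example of how a third-party widget can be loaded without blocking rendering; the script URL is a placeholder, not the real AddThis loader.

    <!-- Placed just before </body>, or marked async, so the widget
         script doesn't hold up the rest of the page. -->
    <script async src="https://example.com/share-widget.js"></script>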

How do I make above-the-fold content load first on my website? (control the load order)

My website for architectural visualization: http://www.greenshell3d.com
Looking at the network tab in incognito mode, I noticed it takes 15 seconds or so to load the above-the-fold content (most notably the image slideshow).
Some of the images in the slideshow load at the very end instead of at the beginning of the page load. I understand the browser handles this order, but perhaps there is another way. As it stands, the bounce rate is too high and I expect it is because of the load time.
I've seen a jQuery snippet on GitHub that allows one to control the order of image loads - do you think this is a good option? I'd be glad to hear any opinions before investing the time to fix this.
Any ideas? Thanks!
You said you are interested in any opinions as well, so first some general thoughts: there is no page fold. The web that we produce content for exists in so many different screen sizes and resolutions that it's impossible to say "The fold is below this big image!". Yes, Google changed the PageSpeed Insights tool to make people load the stuff at the top of the page first, but I think their wording there is really bad.
Now to your image loading issue:
The first thing I would recommend is to reduce the size of all the images. They seem to be around 280-300 KB per image, and you have quite a few of them. Since there is a translucent overlay over them anyway, you can probably get away with reducing the image quality without people noticing (because they don't see the image directly). Play around with the quality values here.
I would then look into optimizing the slider code so that it loads the first image first, and then loads the rest of the page and the other images asynchronously after that. Another trick could be to increase the fade time from the first slide to the next so the slider doesn't change while the next image isn't ready yet. You said you found a jQuery script to implement that; that's where I'd start.
As a general guideline: the position of requests in the source code usually determines the load order of things on the page. If your images are requested by JavaScript at the end of the page, that leads to the images being loaded later than you want them to be.
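A rough sketch of the "first slide eagerly, the rest after load" idea; the .slide class and the data-src attribute are assumptions about the markup, not taken from the actual site.

    // The first slide is a normal <img> and loads immediately; the other
    // slides keep their URL in data-src and are only fetched once the
    // rest of the page has finished loading.
    window.addEventListener('load', function () {
      document.querySelectorAll('.slide img[data-src]').forEach(function (img) {
        img.src = img.dataset.src;
      });
    });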

Preload & site performance: x PDFs vs. x * number of pages PNGs

I have a question.
I am currently developing a website and I am nearly finished with the gallery section. When a user clicks on one of the gallery pictures, the gallery will display either a) a PDF file or b) several PNGs (these PNGs will be displayed one under another, so it will look just like the PDF).
As said, I have the option of choosing between PDF and PNG.
The problem is that the PDF files are up to 5 MB and I cannot do anything about that. I was thinking of preloading the PDF files or PNGs in advance.
I am looking for one thing: site performance.
Is it better to preload the PDFs (there will be up to 20-30 PDFs in this gallery, so I would be preloading 20-30 of them) or is it better to go with the PNG approach (20-30 * the number of pages)?
Or does anyone have a better idea?
The best approach is to try it, measure, and see for yourself.
In any case, you might try to reduce the load by finding a balance between the amount of preloaded data and the simplicity of the implementation.
In the case of PNGs, you might split the images into smaller parts and preload only the images required to display the first one or two pages. This lets a visitor open those pages quickly (they might not go any further, after all).
In the case of PDFs, you might linearize the documents (optimize them for Fast Web View). For linearized documents you could preload only the beginning of the file; the open question is how much you should preload. The Adobe plugin opens linearized files faster than non-linearized ones. I am not aware of other PDF plugins that make use of this optimization, though.
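One hedged way to express the "preload only the first page or two" idea is with prefetch hints. The file names are placeholders, and how much this actually helps depends on the browser and on your caching headers.

    <!-- Hint the browser to fetch the first pages (or the PDF) in the
         background so they are already cached when the gallery item is
         opened. -->
    <link rel="prefetch" href="gallery/brochure-page-1.png">
    <link rel="prefetch" href="gallery/brochure-page-2.png">
    <link rel="prefetch" href="gallery/brochure.pdf">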

Web page loading time

I'm having a lot of problems with the loading time of my web page:
www.alvaromillan.es
I've tried to minify the JS and the images, but the problem is, as you can see, that my website is just this one page, so every image and script is loaded by this single document...
The loading time is really high, even the smooth scrolling takes a long time, and the first time you use it it doesn't run smoothly...
Could any of you please help me?
I took a quick look at the page just using the Chrome developer tools, and while there are probably several things you can do that YSlow would suggest, I think the biggest gain would come from optimizing and spriting your images. 131 of the 156 requests on your page are for images. That's a lot of images, and many are fairly small. Also, a lot of the images seem quite large in bytes for their dimensions. Here is what I would do:
Combine the images into several sprite sheets of about 50-100 KB per sheet (see the sketch below for how a sheet is used).
Use the PNG format.
Quantize the sprite sheets to 8-bit PNGs. My guess is that you will not experience perceptible quality loss by doing this. You could use something like pngquant for it.
Use something like OptiPNG to apply lossless compression to the quantized images.
I think this will yield dramatic improvements.
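For reference, a minimal sketch of how a finished sprite sheet is consumed; the file name, icon size and offsets are invented for illustration.

    /* One HTTP request fetches the whole sheet; each icon just shifts
       the shared background to its own region. */
    .icon         { width: 32px; height: 32px;
                    background-image: url(sprites/ui-sheet.png); }
    .icon-mail    { background-position: 0 0; }
    .icon-phone   { background-position: -32px 0; }
    .icon-twitter { background-position: -64px 0; }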
As skaffman suggests, do run YSlow and/or Google PageSpeed for more thorough suggestions. I also like using webpagetest.org, which provides great metrics for optimizing pages.
Give the YSlow Firefox plugin a try. It will analyze the load times of your site and advise you on the best course of action to fix them.
OK, here's some quick initial thoughts...
Flush the page after the head so that the browser can start downloading those resources sooner.
Remove the iframe.
jQuery appears to be loaded twice: once directly and once via google.load.
Can you defer the loading of the JavaScript until later, e.g. put it at the bottom of the page or load it asynchronously? (See the sketch after this list.)
Rather than preloading the slideshow images, why not load them on demand when they are clicked, or lazy-load them after the page has finished loading?
Also, do you really want IE to emulate IE6?
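A small sketch of the "load jQuery once and defer the rest" points; the CDN URL and version are illustrative, and js/site.js is a placeholder for your own scripts.

    <!-- At the end of <body>: a single jQuery include (drop the extra
         google.load call) and the site scripts marked defer so they
         don't block rendering. -->
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
    <script defer src="js/site.js"></script>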

How would you optimise/simulate 'random' loading of large image files?

We use large background images (hi-res photos, up to 700 KB) for our page design.
It's part of the experience of the site that as you browse around, you see different images.
At the moment a different (random) image is loaded on each page request, from a pool of ~15 images, which could grow over time.
I'm looking for a sane way to optimize this:
To avoid the user having to download a big image file on every page view
To reduce load on the server (is this an issue, will the server keep the images in memory?)
The ideas I have so far include:
A timer which loads a different image at set intervals
Progressively loading other images in the background with ajax
Associating images with specific content (pages, tags)
The question is: how do I keep it feeling somewhat random while minimizing page load times and the hit on the server?
I usually avoid sites with huge images; I am very impatient. I would rethink your design.
As a first step you should make sure that the images can be properly cached:
use sane URLs (no session IDs, etc.)
set appropriate HTTP caching headers (ETag, etc.) - example below
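For illustration, the kind of response headers that let the browser reuse a background image instead of re-downloading it on every page view; the values here are examples, and how you set them depends on your server.

    HTTP/1.1 200 OK
    Content-Type: image/jpeg
    Cache-Control: public, max-age=604800
    ETag: "bg-photo-07-v2"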
Firstly, hearing that the background images alone are up to 700 KB astounds me. In addition to the content on screen... that is a pretty heavy site.
For starters, I would try image compression tools. Two tools come to mind: ImageMagick and pngcrush. pngcrush is excellent at stripping all the extraneous metadata attached to photos without compromising photo quality.
I recommend this because compressing the images lets the user download less content, which means quicker load times, which, at the end of the day, is what users want.
I would also cache the images, so that when a user revisits the site the images are already cached on their end. This minimises the HTTP requests made each time a user visits your site.
An example of where this technique is used on a commercial site is www.reactive.com. If you look at the /js/headerImages.js file, they make use of image caching. Funnily enough, you will find the same source code at: http://javascript.internet.com/miscellaneous/random-image.html
Since you mentioned that the images are loaded randomly, I assume you are using a JavaScript library such as jQuery to create the effect.
If you are, you can minimize page load times by using a CDN rather than referencing a local copy of the jQuery library stored on your server. I performance-tested a site I made for a client, and over an average of 20 hits, this technique saved 1.6 seconds!
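Putting the caching and the randomness together, a hedged sketch of a client-side approach; the image paths are placeholders. One image per visit is picked from a fixed pool of stable URLs (so the browser cache stays useful), and the rest are preloaded once the page has finished loading.

    // Run after the <body> exists, e.g. in a script at the end of the page.
    var pool = ['bg/photo-01.jpg', 'bg/photo-02.jpg', 'bg/photo-03.jpg'];
    var pick = pool[Math.floor(Math.random() * pool.length)];
    document.body.style.backgroundImage = 'url(' + pick + ')';

    // Warm the cache for the other backgrounds once the page has loaded.
    window.addEventListener('load', function () {
      pool.forEach(function (url) { new Image().src = url; });
    });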
Hope that helps for now :)
