I'm building a WebGL app with three.js that loads a number of glTF assets. The largest of these is 65 MB.
All the smaller assets are cached correctly according to the cache-control header:
cache-control: max-age=31111213
But the large one is loaded from the network every time. Is there a maximum file size limit for Chrome's cache?
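For reference, a minimal sketch of the loading setup described here, assuming three.js's GLTFLoader and a hypothetical models/large.glb asset served with the cache-control header above:

```javascript
import * as THREE from 'three';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

const scene = new THREE.Scene();
const loader = new GLTFLoader();

// The asset URL is hypothetical; the server responds with
// cache-control: max-age=31111213, yet Chrome re-fetches it each time.
loader.load(
  'models/large.glb',                              // the ~65 MB asset
  (gltf) => scene.add(gltf.scene),                 // add on success
  (evt) => console.log(`${evt.loaded} bytes loaded`), // progress
  (err) => console.error(err)                      // error
);
```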
When loading images via Bitmap() into the stage/container, does the size of the original images matter for performance once they are loaded and scaled?
Say I have some images of 800x800, but in the canvas they are only ever used at a maximum size of 400x400. Would it be better to make them 400x400 in the first place?
Yes, the images will become smaller in size, and that is a benefit for your bandwidth and performance. Of course, the canvas won't need to scale the image down anymore, so the image will also be sharper. Nothing but benefits.
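If re-exporting the source files isn't an option, a rough sketch of the same idea done at runtime: downscale once to an offscreen canvas and reuse that copy (the 800/400 sizes and the file name are just the example's numbers):

```javascript
// Downscale an 800x800 source image to 400x400 once, up front,
// so later draws don't pay the rescaling cost every time.
function makeScaledCopy(img, width, height) {
  const off = document.createElement('canvas');
  off.width = width;
  off.height = height;
  off.getContext('2d').drawImage(img, 0, 0, width, height);
  return off; // a canvas can be passed to drawImage just like an image
}

const img = new Image();
img.src = 'photo-800x800.png'; // hypothetical asset
img.onload = () => {
  const small = makeScaledCopy(img, 400, 400);
  const ctx = document.querySelector('canvas').getContext('2d');
  ctx.drawImage(small, 0, 0); // drawn at 400x400, no rescaling needed
};
```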
I'm trying to optimize my images for Google's PageSpeed test. I have an image with dimensions of 1200x393. When I optimize the image with Photoshop, its size is approximately 250 KB; with Corel, it becomes 100 KB. Google accepts neither. It says: Compressing and resizing ... .jpg could save 92.6KiB (90% reduction).
How can I pass the PageSpeed test?
From Image Optimization:
Image optimization boils down to two criteria: optimizing the number of bytes used to encode each image pixel, and optimizing the total number of pixels: the filesize of the image is simply the total number of pixels times the number of bytes used to encode each pixel. Nothing more, nothing less.

As a result, one of the simplest and most effective image optimization techniques is to ensure that we are not shipping any more pixels than needed to display the asset at its intended size in the browser. Sounds simple, right? Unfortunately, most pages fail this test for many of their image assets: typically, they ship larger assets and rely on the browser to rescale them - which also consumes extra CPU resources - and display them at a lower resolution. ...

you should ensure that the number of unnecessary pixels is minimal, and that your large assets in particular are delivered as close as possible to their display size
A common error is to have a big image in the source and scale it down with width and height attributes in the UI.
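One way to act on that advice, sketched below with the Node.js sharp library. The filenames, target width, and quality setting are assumptions; PageSpeed's 90% figure suggests the image is rendered far smaller than 1200x393, so check the actual displayed size first and resize to that:

```javascript
const sharp = require('sharp');

// Resize the source down to its real display width and re-encode it.
// 'input.jpg', the 600px width, and quality 75 are all hypothetical;
// substitute the rendered dimensions from your own page.
sharp('input.jpg')
  .resize({ width: 600 })
  .jpeg({ quality: 75, progressive: true })
  .toFile('output.jpg')
  .then((info) => console.log(`wrote ${info.size} bytes`))
  .catch((err) => console.error(err));
```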
I have a question regarding the page speed of a website. I have a picture with an original size of 100 KB that is used in 3 different locations on a page, at a different display size in each location; the largest of the 3 locations uses the full 100 KB picture. When I ran Chrome's PageSpeed, it recommended that I serve scaled images for the other 2 smaller locations.
If the picture has only 1 size of 100 KB and appears in 3 locations, does the browser fetch the image from the server 3 times or just once?
This leads to my question: should I serve scaled images and have more DOM elements but a smaller page size, or should I serve the 100 KB image and have the browser fetch it only once?
If I were to serve scaled images, the browser would have to request 3 different sizes of the same image from the server, which may increase the load time rather than reduce it.
As long as you serve the image as cacheable (which is the default), the browser will only download it once. So it is better to reference only one version of the image.
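To illustrate, a small sketch (the URL and display widths are made up): all three elements reference the same URL, so the browser fetches it once, caches it, and rescales it locally for each slot.

```javascript
// Three <img> elements, one URL: a single network fetch, three renders.
const sizes = [400, 200, 100]; // hypothetical display widths
for (const w of sizes) {
  const img = document.createElement('img');
  img.src = '/images/photo.jpg'; // same URL each time -> served from cache
  img.width = w;                 // the browser scales it down for display
  document.body.appendChild(img);
}
```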
I recently noticed that every image on this website (the logo, badge colors, up/down voting arrows, the list goes on) is actually part of a single sprite sheet, set as a background image and repositioned based on the required state. What is the advantage of using this method over using multiple images?
Simple. You're sending fewer HTTP requests: one for all the images, as opposed to one for each image.
Additionally, a single sprite containing all the images can often be compressed better, resulting in a smaller file size than the images add up to on their own. E.g. if you have 10 files that are 20 kB each, the sprite would normally be much less than 200 kB.
With a sprite, the browser only has to make one HTTP request for the whole image, instead of N requests for N images. There is significant overhead and delay to creating the TCP connection (three-way handshake), so limiting this to just one request saves a lot of time.
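A rough sketch of the repositioning technique in JavaScript (the sprite URL, icon size, and offsets are invented for illustration; the same thing is usually done in plain CSS with background-position):

```javascript
// Show one 16x16 icon out of a hypothetical sprite.png by shifting
// the background so the desired tile lands inside the element's box.
function makeIcon(offsetX, offsetY) {
  const icon = document.createElement('span');
  icon.style.display = 'inline-block';
  icon.style.width = '16px';
  icon.style.height = '16px';
  icon.style.backgroundImage = "url('sprite.png')";
  icon.style.backgroundPosition = `-${offsetX}px -${offsetY}px`;
  return icon;
}

// E.g. an up arrow at (0, 0) and a down arrow at (16, 0) in the sheet.
document.body.appendChild(makeIcon(0, 0));
document.body.appendChild(makeIcon(16, 0));
```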
It is a known fact that Safari on iPad automatically reduces the dimensions of large images to save resources. The problem is that I need to get the original size of an image through JavaScript after the image has loaded. Is that possible?
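One common approach, sketched here, is to read the naturalWidth/naturalHeight properties after load; they report the image's intrinsic dimensions rather than its rendered size (whether Safari's subsampling leaks into these values is worth verifying on an actual device):

```javascript
const img = new Image();
img.onload = () => {
  // Intrinsic (file) dimensions, independent of CSS or attribute sizing.
  console.log('original size:', img.naturalWidth, 'x', img.naturalHeight);
};
img.src = 'large-photo.jpg'; // hypothetical large image
```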