Laravel Blade: best way to show an image size according to the size of the screen

For each record, I have two images, one small and one large. What is the best way to show the appropriately sized image for the size of the screen?
I know I can use CSS media queries and show the small or large image by toggling a div between display: none and display: block, but as I understand it, the user then downloads both images even though only one is displayed. Is that correct? If so, the page load increases. What is the best way to do this?

For this problem, the most efficient approach is to use the User-Agent header to detect whether the client is a desktop, tablet, or mobile device.
I prefer to use Mobile_Detect or jenssegers/agent (which is based on Mobile_Detect).
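For illustration, here is a minimal Blade sketch of that approach using jenssegers/agent; the $record variable and its image_small/image_large fields are assumptions based on the question:

```blade
{{-- a minimal sketch, assuming jenssegers/agent is installed and each
     record carries image_small / image_large paths as in the question --}}
@php
    $agent = new \Jenssegers\Agent\Agent();
    $src = $agent->isMobile() && ! $agent->isTablet()
        ? $record->image_small
        : $record->image_large;
@endphp

<img src="{{ asset($src) }}" alt="">
```

Note that in Mobile_Detect's terminology isMobile() also returns true for tablets, hence the extra isTablet() check if you want the small image on phones only.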
Edit
Recently I installed the Google PageSpeed module on my server. It has a filter, resize_rendered_image_dimensions, that serves images at the dimensions of the current client view (exactly what you are looking for).
Another implementation would be lazy loading with JavaScript, adding dimensions to the requested URL depending on the viewport size.
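A declarative variant of the same idea, worth mentioning alongside the lazy-loading approach: the HTML srcset/sizes attributes let the browser itself pick the image that matches the viewport, so only one file is downloaded. A sketch in a Blade view, where the field names and breakpoints are invented for illustration:

```blade
{{-- hypothetical sketch: the browser downloads only the candidate that
     fits the current viewport, and loading="lazy" defers offscreen images --}}
<img src="{{ asset($record->image_small) }}"
     srcset="{{ asset($record->image_small) }} 480w,
             {{ asset($record->image_large) }} 1200w"
     sizes="(max-width: 600px) 480px, 1200px"
     loading="lazy"
     alt="">
```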

Related

Website loads images with a 1-2 second delay. Could I increase the performance somehow?

Recently I made a website for my photography: http://www.simotamas.com
I am a newbie, so it's not the best site, but it works fine for me. I have only one problem: when the site is loaded on a device for the first time, the gallery takes up to 1-2 seconds to load.
Could you guys please check whether I messed up something in the code?
Or should I make the pictures even smaller?
Is there any way I could increase the loading performance?
I would be really thankful for any advice.
Some points you can consider:
Use low-resolution thumbnails for the preview, and load the actual image only when it is clicked (see the GD sketch after this list).
Load the images in the visible part of the page first, then load the ones further down. (This may affect the user experience.)
If you have CPU power to spare, use a caching or image-compression library such as https://nielse63.github.io/php-image-cache/, and benchmark it carefully.
Use gzip compression on your server if you are not doing so already.
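As a concrete example of the first point, here is a minimal thumbnail generator using PHP's GD extension; the 300px target width, the JPEG input, and the file paths are all assumptions:

```php
<?php
// a minimal sketch: downscale a JPEG to a fixed-width thumbnail with GD
function makeThumb(string $src, string $dest, int $targetWidth = 300): void
{
    [$width, $height] = getimagesize($src);
    $targetHeight = (int) round($height * $targetWidth / $width);

    $source = imagecreatefromjpeg($src);
    $thumb  = imagecreatetruecolor($targetWidth, $targetHeight);
    imagecopyresampled($thumb, $source, 0, 0, 0, 0,
                       $targetWidth, $targetHeight, $width, $height);
    imagejpeg($thumb, $dest, 75); // moderate quality keeps the preview small
    imagedestroy($source);
    imagedestroy($thumb);
}

makeThumb('gallery/photo.jpg', 'gallery/thumbs/photo.jpg');
```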
The fact that your website doesn't wait for the image to load is actually a plus (look into asynchronous web page content loading for a good read). That said, you should compress your images before uploading them; tinypng.com is a nice tool for this. But since it's a photography website, heavy compression would reduce picture quality, so play with Photoshop's save settings to find your ideal compromise between quality and file size. Pictures are heavy: high definition and resolution will obviously result in larger files to download.
Update: another thing you could do is display smaller thumbnails and only load the full picture on request, i.e. the user clicks and the image opens in a new tab.
It would help if you created smaller thumbnail versions of your images so the browser can initially load those for the overview, instead of scaling way-too-large images down while rendering the page. An image should always be downloaded at the dimensions at which it will be presented.

Magento images are rotated when uploaded

This only happens on product pages with images taller than approximately 500px. Caching is disabled. Products display correctly at smaller sizes, but I need a solution that doesn't require resizing the images before uploading.
I believe it has something to do with using multiple image-resizing programs and some of the meta information in the image.
Thanks
It sounds like there is EXIF data in the JPEG which records which way 'up' is. Either this info is ignored when you upload but not ignored on your PC (explaining why the image looks the right way up when you view it on your desktop but the wrong way up in Magento), or vice versa.
Can you use an art program or a bulk converter like XnView to either apply or remove the EXIF data before uploading? You might then need to manually rotate some images.
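If you would rather fix this in code than in XnView, here is a hedged PHP sketch that reads the EXIF Orientation tag and re-saves the JPEG upright; it assumes the exif and GD extensions and JPEG input:

```php
<?php
// a sketch: rotate a JPEG according to its EXIF Orientation tag, so that
// software which ignores EXIF (as Magento may) still shows it upright
function autoOrient(string $path): void
{
    $exif = @exif_read_data($path);
    $orientation = $exif['Orientation'] ?? 1;

    // EXIF orientation value -> rotation angle for imagerotate (counterclockwise)
    $angle = [3 => 180, 6 => -90, 8 => 90][$orientation] ?? 0;
    if ($angle === 0) {
        return; // already upright
    }

    $image   = imagecreatefromjpeg($path);
    $rotated = imagerotate($image, $angle, 0);
    imagejpeg($rotated, $path, 90); // re-saving with GD drops the stale EXIF tag
    imagedestroy($image);
    imagedestroy($rotated);
}
```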

Sharing an image to reduce website content downloads

My site has 2 pages, and both pages contain the same picture. If a user comes to the first page, he downloads the picture; if he then comes to the second page, and I make the website so that the picture is shared between the 2 pages, does the user not need to download the picture again?
If I want to put the same picture on the webpage at a different size, is it better to make 2 pictures with an image-editing program, or to use CSS to change the width and height of the picture?
You have answered your own question.
If the image is from the same source, and if you have configured caching on the web server correctly (and the client has its cache enabled), then no re-request is sent to view the same resource.
You don't need to create multiple images for different sizes; use the HTML image attributes to show it at the grid dimensions you wish.
Exception: if the original image is quite large and you are not sure the user will want to view it, create a smaller image for faster loading. Thumbnails in a photo album are a good example. There is a program called Image Resizer which accepts a folder and creates a new set of images with the required dimensions.
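To illustrate the caching point: when images are served through a PHP script rather than as static files, you can set the cache headers yourself. A sketch, where the path is a placeholder (for static files, the web server's own expires configuration is the better place for this):

```php
<?php
// a sketch of long-lived caching for an image served through PHP;
// $pathToImage is a placeholder for your real image path
$pathToImage = 'images/shared-picture.jpg';

header('Cache-Control: public, max-age=31536000'); // cache for one year
header('Content-Type: image/jpeg');
readfile($pathToImage);
```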
Resources below
Image Resizer
HTTP Compression

Optimize display of a large number of images (1000+) for performance and ease of use in a web application

In our web application, users need to review a large number of images. This is my current layout: 20 images are displayed at a time, with a pagination bar above the thumbnails. Clicking a thumbnail shows the enlarged image to the left. The enlarged image follows the scrollbar so it's always visible. Quite simple, actually.
I was wondering what the best interface would be in this scenario:
One option is to implement an infinite-scroll script that lazy-loads thumbnails as the user scrolls and removes thumbnails that are no longer visible from the DOM. My concern with this approach is that the number of changes to the DOM could slow down the page.
Another option could be something like Google's Fastflip.
What do you think is the best approach for this application? Radical ideas welcomed.
I think the question you have to ask is: what action is the user supposed to take? What's the purpose of the site?
If "reviewing images" entails rating every image, I'd rather go with a Fastflip approach, where the focus is on a single image. A thumbnail gallery will distract from the desired action and might result in fewer pics being rated/reviewed.
If the focus should rather be on comparing a given image against others, I'd say try the gallery approach, although I wouldn't implement infinite scroll with thumbnails, because the user can quickly get lost in the abundance of choices. I think standard pagination (whether static or Ajaxified) would be better if you choose to go this route.
Just my 2c.
If you paginate thumbnails, you can pre-generate a single image containing all the thumbnails for each page, then use an image map to handle mouseover text and clicking. This reduces the number of HTTP requests and can mean fewer bytes sent; the separation between images should be minimized for it to be most efficient. This approach does have some disadvantages, though.
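A minimal sketch of that image-map idea; the file names, coordinates, and link targets are made up:

```html
<!-- one sprite holding a page of thumbnails, with clickable regions -->
<img src="thumbs-page1.jpg" usemap="#thumbs" alt="Thumbnails, page 1">
<map name="thumbs">
  <area shape="rect" coords="0,0,100,100"   href="image1.jpg" title="Image 1">
  <area shape="rect" coords="100,0,200,100" href="image2.jpg" title="Image 2">
</map>
```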
To reduce image download size at the expense of preprocessing, you can try to save each image in whichever format (PNG or JPEG) is most efficient for its contents, using an algorithm like the one in ImageGuide. Similarly, if the images are poorly compressed (like JPEGs from a cell phone camera), they can be recompressed at the cost of some quality.
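One hedged way to implement that format choice in PHP is to encode the image both ways in memory and keep whichever result is smaller; the quality and compression values, and PHP 8's GdImage type, are assumptions:

```php
<?php
// a sketch of the "most efficient format" idea: try both encodings in
// memory and report which one is cheaper for this particular image
function cheaperFormat(GdImage $im): string
{
    ob_start();
    imagejpeg($im, null, 80);          // JPEG at quality 80
    $jpegBytes = strlen(ob_get_clean());

    ob_start();
    imagepng($im, null, 9);            // PNG at maximum compression
    $pngBytes = strlen(ob_get_clean());

    return $jpegBytes <= $pngBytes ? 'jpg' : 'png';
}
```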
Once the site has some testers, you can analyze patterns in which images tend to be clicked (if a pattern exists) and preload those full-size images, or even preload all of them once the thumbnails are loaded.
You might play with JPEG 2000 images (you did say "radical ideas welcomed"), which thumbnail very easily, because the thumbnail and the main image needn't be sent as if they were separate files. This is an advantage of the compression format; it isn't the same as the hack of telling the browser to resize the full-size image to serve as its own thumbnail.
You can take a look at Google's WebP image format.
On the server side, consider a separate image server optimized for static content delivery, perhaps using NginX or the Tux web server.
I would show the thumbnails, since the user might want to skip some of the pictures. I would also stay away from pagination of the form
<<first <previous n of x next> last>>
and go for something easier to implement and more efficient: a
"load x more pictures" button.
No infinite scroll whatsoever, and why not, even no scroll at all: just "load x more" and "previous x".
Although this answer might be a bit unradical and boring, I'd go with exactly your suggestion of asynchronously loading the thumbnails (and of course the main picture) as they come into view. A similar technique is used by Google+ in the pane for adding people to circles. This way, you spend server resources and bandwidth only on the pictures the client actually needs. As Google+ shows, the operations on the DOM tree are fast enough and don't slow down a computer from recent years.
You might also pre-build a few rows of the thumbnail table ahead of time with a dummy image (e.g. an animated "loading circle" GIF) and swap in the real image when it arrives. That way the table in view is already built and does not need to be re-rendered, as the flow elements following the table otherwise would be when images appear during scrolling.
Instead of paginating the thumbnails (as your layout scheme suggests), you could also think about letting users filter the images by tag, theme, category, size, or any other criterion that helps them find their results faster.

Very large images in web browser

We would like to display very large (50 MB plus) images in Internet Explorer. We would like to avoid compression, as compression algorithms are not what CSI would have us believe they are, and the resulting files are too lossy.
As a result, we have come up with two options: Silverlight Deep Zoom or a Flash-based solution (such as Zoomify). The issue is that both of these require conversion to a tiled output and/or conversion to a specific file type (Zoomify supports a single proprietary file type, PFF).
What we are wondering is whether a solution exists that would allow us to view the image without converting it beforehand.
PS: I know you can write an application to tile the images (as needed or after the load process) and output them; however, we would like to do this without chopping up the file.
The tiled approach really is the right way to do it.
Your users don't want to download a 50 MB file before they can start viewing the image, and you don't want to spend the bandwidth to serve 50 megabytes to every user who might view only a fraction of your image.
If you serve the whole file, users will eventually be able to load and view it, but it won't run smoothly for most of them.
There is no simple non-tiled way to serve just a portion of an image unless you use a server-side library like ImageMagick or PIL to extract a specific subset of the image for each user. You probably don't want to do that, because it would place a significant load on your server.
Alternatively, you might use something like google's map tool to provide zooming and scaling. Some comments on doing that are available here:
http://webtide.wordpress.com/2008/08/27/custom-google-maps/
Take a look at OpenSeadragon. To make an image work with OpenSeadragon, you need to generate a zoomable image format, as mentioned here; then follow the getting started guide here.
The browser isn't going to load a 50 MB file smoothly; if you don't chop it up, there's no reasonable way to make it not lag.
If you don't want to tile, you could have the server open the file and render a screen-sized view of the image for display in the browser at the particular zoom level requested. That way you aren't sending 50 MB files across the line when someone only wants an overview of the image. That is, the browser requests a set of coordinates and an output size in pixels; the server opens the larger image, creates a smaller image that fits the desired view, and sends that back to the web browser.
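A sketch of such a server-side rendering endpoint using PHP's Imagick extension; the query parameter names and the source path are invented for illustration:

```php
<?php
// a sketch: crop the requested region out of the large source image,
// scale it to the requested output size, and stream it back as JPEG
$x    = (int) ($_GET['x'] ?? 0);        // region origin in the source image
$y    = (int) ($_GET['y'] ?? 0);
$w    = (int) ($_GET['w'] ?? 1024);     // region size in the source image
$h    = (int) ($_GET['h'] ?? 768);
$outW = (int) ($_GET['out_w'] ?? 1024); // output size in pixels
$outH = (int) ($_GET['out_h'] ?? 768);

$image = new Imagick('large-image.tif'); // placeholder path
$image->cropImage($w, $h, $x, $y);
$image->resizeImage($outW, $outH, Imagick::FILTER_LANCZOS, 1);
$image->setImageFormat('jpeg');

header('Content-Type: image/jpeg');
echo $image->getImageBlob();
```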
As far as compression goes: you say it's too lossy, but if that's what you are seeing, you are probably using the wrong compression algorithm or settings for the type of image you have. The JPEG format has quality settings to control lossiness, and PNG compression is lossless (the pixels you get after decompression are the exact values you had before compression). So consider changing the compression you use, and don't just rely on the default settings in an image editor.
