Chrome/Firefox Cutting off Base64 Image

I'm building a project in ASP.NET MVC, and I'm having an issue with some of my images. I pull them out of a database as a byte array, convert them to Base64, then render them. This process works fine for about 90% of the images I have to upload, but for certain images, for some reason, they render cut off.
Unfortunately, this only seems to happen in Chrome/Firefox, so I have to assume it's a rendering issue. Here's the image as Base64; you can view it by using it as a URL in your browser, if security allows. Otherwise you can put it in an img tag and try to view it that way. As far as I can tell, it will only render correctly in IE.
http://pastebin.com/RYasvfj0
If I view the truncated image, right-click, and save it, the saved image is correct.
I'm at a total loss here. It's A) inconsistent in which images fail to work correctly, but B) consistent in where each image gets cut off: it's always at the same point in the image that FF/Chrome stop loading.
Any ideas on what's triggering those browsers to fail, and how I can check for it?
Edit: The Base64 here is padded; it's generated by C# via Convert.ToBase64String.
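For reference, the conversion step boils down to the following (sketched here in Dart rather than the project's C#; toDataUri and isPlausiblyIntact are illustrative names, not code from the project):
import 'dart:convert';
import 'dart:typed_data';

// Build a data URI from the raw image bytes pulled out of the database.
String toDataUri(Uint8List bytes, {String mimeType = 'image/png'}) {
  return 'data:$mimeType;base64,${base64Encode(bytes)}';
}

// Sanity check on a Base64 string received at the far end: padded Base64
// (which Convert.ToBase64String produces) has a length that is a multiple
// of 4 and decodes cleanly; anything else indicates truncation or
// corruption along the way.
bool isPlausiblyIntact(String b64) {
  if (b64.length % 4 != 0) return false;
  try {
    base64Decode(b64);
    return true;
  } on FormatException {
    return false;
  }
}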

Related

Images in listview are not released from memory when out of view

I am displaying images from the internet in a vertical ListView. I fetch the images using http.get (not the cached_network_image package, because I do not want to cache the images), then I pass the image Uint8List to Image.memory(). What happens is that as the user scrolls the list and images load, RAM usage keeps increasing until the whole app crashes. Any ideas what to do?
Yeah, this is the normal behavior. I don't know exactly why. My theory is that the images are by default disposed when the Dart objects holding references to them are garbage collected, rather than when the widgets are removed from the widget tree, but don't take my word for it; that's just my personal reasoning, and it may be completely wrong.
To get around this, I use the Extended Image package. Its constructors take a bool clearMemoryCacheWhenDispose which disposes of images in RAM in scrolling lists. You can use that, or visit the package code to see how it's done and replicate it.
The other advice I can give is to make sure your images are an appropriate size. If you are loading your images locally, check this part of the documentation on selecting different dimensions for different screen sizes: https://flutter.dev/docs/development/ui/assets-and-images#loading-images
If you are fetching your images from the network, which is more likely, make sure their sizes are appropriate, and have different sizes served if you can. If you don't have control over that, set cacheWidth and cacheHeight on Image.memory; these reduce the size of the cached image in memory (RAM). By default, Flutter caches images at full resolution regardless of their apparent size in the device/app. For example, if you want your image displayed in a 200x200 box, set cacheWidth to 200 * window.devicePixelRatio.ceil() (you will find window in dart:ui). If you only set cacheWidth, the aspect ratio of your images is preserved; the same is true if you only set cacheHeight. Also, do use ListView.builder as suggested.
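A minimal sketch of that advice (buildThumb and the 200x200 target are illustrative, not from the original answer):
import 'dart:typed_data';
import 'dart:ui' as ui;
import 'package:flutter/material.dart';

// Display fetched bytes in a 200x200 box while caching the decode at the
// physical pixel size instead of the image's full resolution.
Widget buildThumb(Uint8List imageBytes) {
  return Image.memory(
    imageBytes,
    width: 200,
    height: 200,
    fit: BoxFit.cover,
    cacheWidth: (200 * ui.window.devicePixelRatio).ceil(),
  );
}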
I am disappointed at how little is said about this in the Flutter docs. Most of the time, people discover this problem when their apps start crashing. Do check your dev tools regularly for memory consumption; it's the best indicator out there.
Cheers and good luck
I was having the same issue and found a fix thanks to #moneer!
Context:
Users in my app can create shared image galleries which can easily contain several hundred images. Those are all displayed in a SliverGrid widget. When users scrolled down the list, too many images were loaded into RAM and the app would crash.
Things I had already implemented:
Image resizing on the server side, and fetching appropriately sized images on the client based on the device pixel ratio and the tile size in the gallery
Making sure that my image widgets were properly disposed when out of view (the memory footprint kept building up as the user scrolled through the images anyway)
Setting cacheHeight to limit the cached image to the exact size of the displayed image
All these things helped but the app would eventually still crash every time the user scrolled down far enough.
The fix:
After some googling I stumbled upon this thread and tried the extended_image package as #moneer suggested. Setting the clearMemoryCacheWhenDispose flag to true fixed the crashes, as images were now properly cleared from memory when out of view. Hooray! However, in my app users can tap an image, and the app navigates to an image detail page with a nice Hero animation. When navigating back, the image would rebuild, and this caused a rebuild 'flicker'. Not that nice to look at, and kind of distracting.
I then noticed that there's also an enableMemoryCache flag. I tried setting this to false and that seems to work nicely for me. The Network tab in Dart DevTools seems to show that all images are only fetched from the network once when scrolling up and down the gallery multiple times. The app does not crash anymore.
I'll have to do more testing to see if this leads to any performance issues (if you can think of any, please let me know).
The final code:
ExtendedImage.network(
  _imageUrl,
  cacheHeight: _tileDimension,
  fit: BoxFit.cover,
  cache: true, // store in cache
  enableMemoryCache: false, // do not store in memory
  enableLoadState: false, // hide spinner
)
I had a similar issue when I loaded images from files in a ListView.
The first huge improvement for me was not loading the whole list at once:
ListView(children: [...]) ---> ListView.builder(...)
The second improvement was to stop loading images at full size:
Image.file("/path") ---> Image.file("/path", cacheWidth: X, cacheHeight: Y)
These two things solved my memory problems completely.
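Putting both improvements together, a minimal sketch (buildList and the 400-pixel decode width are illustrative placeholders):
import 'dart:io';
import 'package:flutter/material.dart';

// Build list items lazily and decode each file at display size rather
// than at the file's full resolution.
Widget buildList(List<String> imagePaths) {
  return ListView.builder(
    itemCount: imagePaths.length,
    itemBuilder: (context, index) => Image.file(
      File(imagePaths[index]),
      cacheWidth: 400, // roughly the displayed width in physical pixels
      fit: BoxFit.cover,
    ),
  );
}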
Caching generally happens by default once certain conditions are fulfilled, so it's up to your app to handle and control how the caching happens.
Check out this answer:
https://stackoverflow.com/a/24476099/12264865

Part of Image Missing From Data URL

Backstory to the below issue:
I'm using the jQuery plugin Cropit to produce an image which I get in data URL form (the user uploads an image and Cropit lets them manipulate it; when the user is happy, Cropit exports the final image).
This data URL is attached to the product (this is a Shopify website) via Shopify properties (in a similar way you would attach text for an engraved product) and then when the order is created, I have an app listening for new orders and I pull the data URL from the order.
From testing, I can confirm that the data URL is wrong / corrupted / broken at the time the order is placed and not being broken in transit.
Original Question
I have a bit of a weird situation and I can't find any similar situations online.
I'm being sent an image in data URL format (from Shopify, if that's relevant; I have written a private app and their webhook sends me the image).
The image is in a data URL format that starts with, as an example,
data:image/png;base64,iVBORw0KGgoAAAANSU.....
The problem I am having is that sometimes (maybe less than 10% of the time), when I get the image and try to print it, it's missing the bottom chunk of the image. A PDF viewer considers the image corrupt, and a web browser just renders the missing bottom of the image as transparent, however much is missing.
This is what it looks like in Inspect Element on Google Chrome when you hover over the image URL (image has been purpled out for anonymity)
My question is, does anyone know why?
We can't find a correlation with browser or device type. And I'm not sure if part of the data URL is somehow missing (maybe a character limit, because it's a really long string!) or if it's the type of image. Might something possibly be going wrong in the upload process?
Is anyone able to shed any light? It's such a weird issue I'm not even sure what to google!
And just to confirm: the image absolutely has to be sent in this format for a whole series of reasons, mainly Shopify restrictions, so I can't send the image as a file.
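One hedged suggestion for narrowing this down: a complete PNG always ends with a fixed 12-byte IEND trailer, so a truncated payload can be detected at the moment the order is placed, before it is stored. A sketch in Dart (looksLikeCompletePng is an illustrative name; the original stack is jQuery/Shopify):
import 'dart:convert';

// A well-formed PNG ends with the 12-byte IEND chunk:
// 00 00 00 00 "IEND" AE 42 60 82. If the decoded bytes don't end with
// it, the data URL was cut off somewhere before this point.
bool looksLikeCompletePng(String dataUrl) {
  const prefix = 'data:image/png;base64,';
  if (!dataUrl.startsWith(prefix)) return false;
  try {
    final bytes = base64Decode(dataUrl.substring(prefix.length));
    const iend = [0, 0, 0, 0, 0x49, 0x45, 0x4E, 0x44, 0xAE, 0x42, 0x60, 0x82];
    if (bytes.length < iend.length) return false;
    for (var i = 0; i < iend.length; i++) {
      if (bytes[bytes.length - iend.length + i] != iend[i]) return false;
    }
    return true;
  } on FormatException {
    return false; // invalid or truncated Base64
  }
}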

Why would I need an image placeholder service or library?

Yesterday, I saw a tweet about the holder.js library. When I read the usage, it says it generates the image placeholder entirely on the client side. So I am wondering: why in the world would I need a placeholder library?
What is the scenario in which, rather than placing a div of some size, I would use an image placeholder?
Image placeholders are generally meant for a page that is either in the process of dynamically loading a real image, or only partially designed, where the placeholder image shows how the design will be laid out and how big the image should be even though the real image is not yet available. In this way, the HTML design can be nearly completed even though the final images are not yet available or done.
Wikipedia uses image placeholders when they know they want a particular image in a page, but are in search of an image they can use with the appropriate license.
Image placeholders are traditionally served up by a service on the web that automatically creates the placeholder images based on query parameters in a URL, but the holder.js library creates placeholder images entirely on the client (so no outside services are needed).
You can certainly achieve the same look as a placeholder with just a div with a background color, and perhaps even some text in the div. But when someone wanted to plug the final images into place, they would have to change the div tags to img tags. When using a placeholder image, all the HTML tags can be final and left as they are; only the .src values need to be plugged in to finish the design. So placeholder images let you have a closer-to-complete version of the HTML even though the images are not yet done. It's a minor difference, but one that is appreciated by some designers.

Image not showing up in Facebook shares

We are having problems with Facebook not showing images, which we first noticed on 17 May.
We have images tagged in og:image, and these show up as a blank white space in the object debugger. Clicking on this blank white image/space opens the actual expected image in a new browser window, and there are no issues with viewing the image within or outside the network (we tested using different devices and proxies to simulate access from outside Singapore, which is our main audience). The image is 300x225, but the debugger also shows the "image not big enough" warning.
Sample scenario (HTTP removed from some links):
URL in question: www.hungrygowhere.com/dining-guide/what-to-eat/5-family-friendly-restaurants-to-book-online-*aid-53763f00/
og:image meta on the page: <meta property="og:image" content="http://staticc04.insing.com/images/f3/e0/10/00/pc_300x225.jpg" />
Warning on the debugger page: "Provided og:image is not big enough. Please use an image that's at least 200x200 px. Image 'staticc04.insing.com/images/f3/e0/10/00/pc_300x225.jpg' will be used instead."
Notably, the src attribute on the debugger is similar to this: fbexternal-a.akamaihd.net/safe_image.php?d=AQAfsdJpTPWk1IqW&url=http%3A%2F%2Fstaticc05.insing.com%2Fimages%2Fe2%2Fdb%2F10%2F00%2Fpc_300x225.jpg, and going directly to this image via the browser shows a blank 1x1 GIF. We tested using a 600x600 image (after seeing a post about maintaining aspect ratio) but the result is the same: external.ak.fbcdn.net/safe_image.php?d=AQDZBb-0AZjyOp_B&url=http%3A%2F%2Fstaticc03.insing.com%2Fimages%2F57%2Fde%2F10%2F00%2Fstb_600x600.jpg
Our hypothesis is that safe_image.php is rejecting the image it retrieves, but we have no idea what is happening or why.
We have a ticket open on the Facebook bugs support forum for this (developers.facebook.com/bugs/647132798635052), but there has been no response after the initial round of clarifications, and we are hard pressed to find a solution.
There were no changes to our image storage solution over this timeframe. There are no rewrites or redirects; the image URLs are files in their own right.
Edited to add:
It is strictly safe_image.php that is choking on our image. See a screenshot from the object debugger: http://i.imgur.com/kl336l9.png
The image on the right is the same og:image, but since it's accessed directly by FB (not going through safe_image.php), it shows up.
We have been checking logs, and we were returning HTTP status 206 to Facebook (we had been doing this for a while now); this was changed to return 200 at all times. No change: we are still not getting any images.
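To see exactly what a fetcher like safe_image.php receives, it can help to request the image the way a crawler would and inspect the status and body length; a sketch using the Dart http package (checkImageResponse is an illustrative name):
import 'package:http/http.dart' as http;

// Fetch the og:image URL the way a crawler would and report whether the
// server answers 200 with a complete body.
Future<void> checkImageResponse(Uri url) async {
  final response = await http.get(url);
  print('status: ${response.statusCode}'); // expect 200, not 206
  print('content-length header: ${response.headers['content-length']}');
  print('actual body length: ${response.bodyBytes.length}');
}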

Very large images in web browser

We would like to display very large (50 MB plus) images in Internet Explorer. We would like to avoid compression, as compression algorithms are not what CSI would have us believe they are, and the resulting files are too lossy.
As a result, we have come up with two options: Silverlight Deep Zoom or a Flash based solution (such as Zoomify). The issue is that both of these require conversion to a tiled output and/or conversion to a specific file type (Zoomify supports a single proprietary file type, PFF).
What we are wondering is whether a solution exists that will allow us to view the image without a conversion beforehand.
PS: I know that you can write an application to tile the images (as needed or after the load process) and output them; however, we would like to do this without chopping up the file.
The tiled approach really is the right way to do it.
Your users don't want to download a 50mb file before they can start viewing the image. You don't want to spend the bandwidth to serve 50 megs to every user who might only view a fraction of your image.
If you serve the whole file, users will eventually be able to load and view it, but it won't run smoothly for most of them.
There is no simple non-tiled way to serve just a portion of an image unless you want to use a server-side library like ImageMagick or PIL to extract a specific subset of the image for each user. You probably don't want to do that, because it will place a significant load on your server.
Alternatively, you might use something like Google's map tool to provide zooming and scaling. Some comments on doing that are available here:
http://webtide.wordpress.com/2008/08/27/custom-google-maps/
Take a look at OpenSeadragon. To make an image work with OpenSeadragon, you should generate a zoomable image format, as mentioned here. Then follow the getting-started guide here.
The browser isn't going to smoothly load a 50 meg file; if you don't chop it up, there's no reasonable way to make it not lag.
If you don't want to tile, you could have the server open the file and render a screen-sized view of the image for display in the browser at the particular zoom resolution requested. This way you aren't sending 50 MB files across the line when someone only wants an overview of the image. That is, the browser requests a set of coordinates and an output size in pixels; the server opens the larger image, creates a smaller image that fits the desired view, and sends that back to the web browser.
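A minimal sketch of that server-side approach, here using the pure-Dart image package as an assumed stand-in (ImageMagick or PIL would work the same way; renderView and all its parameters are illustrative):
import 'dart:io';
import 'package:image/image.dart' as img;

// Crop the requested region out of the huge source image, scale it to the
// requested output width, and hand back JPEG bytes for the browser.
List<int> renderView(String path, int x, int y, int viewWidth, int viewHeight, int outputWidth) {
  final source = img.decodeImage(File(path).readAsBytesSync())!;
  final cropped = img.copyCrop(source, x: x, y: y, width: viewWidth, height: viewHeight);
  final scaled = img.copyResize(cropped, width: outputWidth);
  return img.encodeJpg(scaled, quality: 85); // the quality knob controls lossiness
}
JPEG output is used here because its quality setting gives direct control over the lossiness discussed below.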
As far as compression, you say it's too lossy, but if that's what you are seeing, you are probably using the wrong compression algorithm or settings for the type of image you have. The JPEG format has quality settings to control lossiness, and PNG compression is lossless (the pixels you get after decompressing are the exact values you had prior to compression). So consider changing the compression you are using, and don't just rely on the default settings in an image editor.
