Trying to cache UIImages in a dictionary, but it doesn't seem to affect loading time

I'm using a standard caching method to cache some UIImages loaded from the Documents directory. The images are being displayed in a UITableView. They're pretty large: the images themselves are up to 600x600, and they're displayed in image views that are 240x180 (on a retina display, so the resolution discrepancy is not large).
Loading the images in real time causes some lag when a new cell is about to come onscreen, so I've implemented a caching method in the object that handles the image:
- (UIImage *)imageWithStyle:(NSString *)styleName {
    NSLog(@"looking for image %@", self.imageFileName);

    /* if we have a cached image in the dictionary, return it */
    if (imageCache == nil) imageCache = [[NSMutableDictionary alloc] init];
    UIImage *returnImage = [imageCache objectForKey:styleName];
    if (returnImage != nil) {
        NSLog(@"returning cached image");
        return returnImage;
    }

    /* otherwise, look for the image at the path */
    NSString *path = [self cacheFilePathWithStyle:styleName];
    UIImage *originalImage = [[UIImage alloc] initWithContentsOfFile:path];

    /* if the image doesn't exist at the path, start a download and return nil */
    if (originalImage == nil) {
        NSLog(@"image not found. downloading.");
        [self downloadImageFromS3];
        return nil;
    }

    /* found image at path */
    /* scale image for screen */
    if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)] && [[UIScreen mainScreen] scale] == 2) {
        returnImage = [UIImage imageWithCGImage:[[originalImage autorelease] CGImage] scale:2.0 orientation:UIImageOrientationUp];
        NSLog(@"scaling image for retina");
    } else {
        returnImage = [originalImage autorelease];
        NSLog(@"image scaled for standard resolution");
    }

    /* cache image in dictionary */
    NSLog(@"caching image");
    [imageCache setObject:returnImage forKey:styleName];
    return returnImage;
}
Before the tableview appears on screen, I force all of the image-handling objects to run this caching method, to ensure that the images are present in the dictionary when they need to be displayed. From the NSLogs I can see that things are working as they should.
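The warm-up itself is just a loop over the model objects; a minimal sketch of what I mean (the imageHandlers array and the style key are stand-ins for my actual names):
/* Hypothetical warm-up pass, run before the table view appears.
   self.imageHandlers and the style key are placeholder names. */
for (id handler in self.imageHandlers) {
    [handler imageWithStyle:@"default"];
}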
I'm getting lag-free performance now, but only after the image has appeared once on screen. So, when I initially see the tableview, I scroll down and the NSLogs tell me that the images are being retrieved from the cache, but still I get the same loading lag. After a cell has appeared on screen once, it thereafter loads up with no lag.
Is there something I'm missing here? Is there something more that I have to do to actually cache the image? Loading it and putting it in the dictionary doesn't seem to do the trick.
Thanks!
UPDATE
I've given up on this for now. There are some attempts out there to force the images to load by drawing them into a new context, but I'm not familiar with Core Graphics programming at this point. I've tried some code that folks have shared, but with no luck.
Instead, I'm going to display low-res versions of the images while the tableview is scrolling and load high-res versions when the tableview stops scrolling, as announced through its delegate methods. At least I know this approach will work with any number of images.
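For reference, the "draw into a new context" trick people share looks roughly like this; a sketch of the general approach only, since I couldn't verify that it fixes the lag in my case:
/* Sketch: force UIImage to decompress its data now, instead of lazily at
   first display, by drawing into an offscreen bitmap context and keeping
   the rendered copy. */
- (UIImage *)decodedImageWithImage:(UIImage *)image {
    CGImageRef cgImage = image.CGImage;
    size_t width = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, 0, colorSpace,
                                                 kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);
    CGColorSpaceRelease(colorSpace);
    if (context == NULL) return image; /* fall back to the undecoded image */
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), cgImage);
    CGImageRef decoded = CGBitmapContextCreateImage(context);
    CGContextRelease(context);
    UIImage *result = [UIImage imageWithCGImage:decoded
                                          scale:image.scale
                                    orientation:image.imageOrientation];
    CGImageRelease(decoded);
    return result;
}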

From the documentation for -[UIImage initWithContentsOfFile:]:
This method loads the image data into memory and marks it as purgeable. If the data is purged and needs to be reloaded, the image object loads that data again from the specified path.
My guess is that, by loading all images into memory, your app consumes too much memory, causing the UIImage class to release the image data it can reload from files later.

If you use [UIImage imageNamed:], it'll do all this caching business for you. No need to roll your own cache and then wonder if it's working.
There's an upside and a downside to using that method. Upside: if you use the same image file twice, it won't actually be loaded twice, saving you that memory. Downside: caching has a big memory impact, so you want to think pretty hard about doing it, whether you roll your own or not.
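For bundle resources it is as simple as the line below; note that imageNamed: resolves names relative to the app bundle, so it would not find files saved under Documents as in the question (the file name here is made up):
/* Cached by UIKit; looks in the app bundle, not in Documents */
UIImage *image = [UIImage imageNamed:@"chart-style1.png"];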

Related

Stop NSView drawRect clearing before drawing? (lockFocus no longer working on macOS 10.14)

I have an animation using two views, where I would call lockFocus on the second NSView, get at its graphics context with [[NSGraphicsContext currentContext] graphicsPort], and draw. That worked fine until macOS 10.14 (Mojave).
Found a reference here:
https://developer.apple.com/videos/play/wwdc2018/209/
At 22:40 they talk about the "legacy" backing store that has changed. The lockFocus-and-context pattern above was shown big on screen, with the message that it won't work any more. And that is true: lockFocus() still works, and even gets you the correct NSView, but any drawing via the context no longer has an effect.
Of course, the proper way to draw in a view is via NSView's drawRect:. So I rearranged everything to do just that. That works, but the system has already automatically cleared the "dirty" area before your drawRect: is called. If you used setNeedsDisplayInRect:, it will even clear only those areas for you, but it also clears areas made up of more than one dirty rectangle. And it clears rectangular areas, while I draw roundish objects, so I end up with too much cleared away (a black area).
Is there a way to prevent drawRect: from clearing the background?
If not, I will have to switch to using the NSView's layer instead, and use updateLayer, or something.
Update: I am playing around with using the layers of NSView, returning YES for wantsLayer and wantsUpdateLayer, and implementing:
- (void)drawLayer:(CALayer *)layer inContext:(CGContextRef)ctx
That is only called when I do:
- (BOOL) wantsLayer { return YES; }
- (BOOL) wantsUpdateLayer { return NO; }
and use setNeedsDisplayInRect: or setNeedsDisplay:
That indeed means drawRect: is no longer called, but the same automatic background erase has already taken place by the time my drawLayer: is called. So I have not made any progress there. It really is the effect of setNeedsDisplayInRect:, not my drawing; just calling setNeedsDisplayInRect: causes these erases.
If you set:
- (BOOL) wantsLayer { return YES; }
- (BOOL) wantsUpdateLayer { return YES; }
the only thing that is called is:
- (void) updateLayer
which does not provide me with a context to draw into.
I had some hope for:
self.layerContentsRedrawPolicy = NSViewLayerContentsRedrawNever;
The NSView doc says:
Leave the layer's contents alone. Never mark the layer as needing display, or draw the view's contents to the layer. This is how developer created layers (layer-hosting views) are treated.
and it does exactly that. It doesn't notify you, or call delegates.
Is there a way to have complete control over the content of an NSView/layer?
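For what it's worth, the "developer created layers (layer-hosting views)" the doc mentions are set up as below; a sketch based on my reading, with the assumption that assigning your own layer before setting wantsLayer makes AppKit leave its contents alone:
// Layer-hosting setup: assign the layer *before* setting wantsLayer,
// so AppKit treats it as developer-owned and never clears or redraws it.
CALayer *hostedLayer = [CALayer layer];
hostedLayer.contentsScale = self.window.backingScaleFactor;
self.layer = hostedLayer;
self.wantsLayer = YES;
// From here on, set hostedLayer.contents (a CGImageRef) or add sublayers;
// drawRect: and updateLayer are no longer involved.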
UPDATE 2019-JAN-27:
NSView has a method makeBackingLayer, which is not there for nothing, I guess. I implemented that, and it seems to work, basically, but no output shows on screen. Hence the follow-up question: nsview-makebackinglayer-with-calayer-subclass-displays-no-output
Using NSView's lockFocus and unlockFocus, or trying to access the window's graphics contents directly, no longer works in macOS 10.14.
You can instead create an NSBitmapImageRep object and do your drawing in that context.
Sample code:
- (void)drawRect:(NSRect)rect {
    if (self.cachedDrawingRep) {
        [self.cachedDrawingRep drawInRect:self.bounds];
    }
}

- (void)drawOutsideDrawRect {
    NSBitmapImageRep *cmap = self.cachedDrawingRep;
    if (!cmap) {
        cmap = [self bitmapImageRepForCachingDisplayInRect:self.bounds];
        self.cachedDrawingRep = cmap;
    }
    NSGraphicsContext *ctx = [NSGraphicsContext graphicsContextWithBitmapImageRep:cmap];
    NSAssert(ctx, nil);
    [NSGraphicsContext setCurrentContext:ctx];
    // Draw code here
    self.needsDisplay = YES;
}
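One caveat I would add (my assumption, not part of the sample above): setCurrentContext: replaces the thread's current context globally, so it is safer to bracket the drawing with the context stack:
[NSGraphicsContext saveGraphicsState];    // push the current context
[NSGraphicsContext setCurrentContext:ctx];
// Draw code here
[NSGraphicsContext restoreGraphicsState]; // pop back to the previous context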
You can also try overriding isOpaque to return YES. This tells the OS that we draw all pixels ourselves, so it does not need to draw the views/window behind our view.
- (BOOL)isOpaque {
    return YES;
}

SDWebImage with PHAssets

I have an application where I load a large number of images from the database and show them in my gallery. What I was doing earlier was getting the photo URL (a local identifier) from PHAsset and then accessing the image using PHCachingImageManager, but that slows down my scrolling and also shows lag while updating images in new cells. I tried to use SDWebImage here by passing the URL to the sd_setImageWithURL: method, but I found out that the URL is actually a local identifier, which is not supported by that method. Is there another mechanism where I can use the identifier to load the images faster, using SDWebImage or any other framework?
This is my code:
NSURL *photoURL = [NSURL URLWithString:photoPath];
[imageView sd_setImageWithURL:photoURL placeholderImage:defaultImage];
photoPath is the path to my asset, the current image to be loaded.
I am getting a URL that looks something like:
73F05642-CAE6-49BE-879B-9B00BF15391F/L0/001
Please ask if more information is needed; I am new to iOS and all this stuff. Thanks in advance :)
It's late, but I am posting this so that it will be helpful for others.
SDWebImageManager *manager = [SDWebImageManager sharedManager];
[manager downloadImageWithURL:[NSURL URLWithString:localIdentifier]
                      options:SDWebImageHighPriority
         targetLocalAssetSize:CGSizeMake(150, 150)
                     progress:nil
                    completed:^(UIImage *image, NSError *error, SDImageCacheType cacheType, BOOL finished, NSURL *imageURL) {
    if (image) {
        imageview.image = image;
    }
}];
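If you would rather stay on the Photos framework directly, the identifier can also be resolved asynchronously at a small target size; a minimal sketch (the method name and the 150x150 size are my own choices, not from the answer above):
#import <Photos/Photos.h>

/* Hypothetical helper: resolve a local identifier to a thumbnail asynchronously. */
- (void)loadThumbnailForIdentifier:(NSString *)localIdentifier
                         imageView:(UIImageView *)imageView {
    PHFetchResult *result = [PHAsset fetchAssetsWithLocalIdentifiers:@[localIdentifier] options:nil];
    PHAsset *asset = result.firstObject;
    if (asset == nil) return;

    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    options.deliveryMode = PHImageRequestOptionsDeliveryModeOpportunistic; // fast thumbnail first
    options.networkAccessAllowed = YES; // fetch from iCloud if needed

    [[PHImageManager defaultManager] requestImageForAsset:asset
                                               targetSize:CGSizeMake(150, 150)
                                              contentMode:PHImageContentModeAspectFill
                                                  options:options
                                            resultHandler:^(UIImage *image, NSDictionary *info) {
        if (image) {
            imageView.image = image;
        }
    }];
}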

How can I combine imageView.animationImages with a UISwipeGestureRecognizer?

I have a view controller displaying images and text; the images are loaded and animated every 15 seconds. I'd like to allow the user to swipe on the image view, letting them change the image without waiting for the timer.
This is what I tried. The images are loaded and animated, but the swipe gesture doesn't work:
self.imageView.animationImages = [self loadImages];
self.imageView.animationDuration = 15;
[self.imageView startAnimating];

swipeRight = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(swipeAction:)];
[swipeRight setDirection:UISwipeGestureRecognizerDirectionRight];
[self.imageView addGestureRecognizer:swipeRight];
This code is placed in the viewDidLoad method. I saw a solution that adds the gesture to the view, but I need it only on the imageView.
I'm answering this in case someone else spends too much time on this one. It appears I forgot to enable user interaction (UIImageView has it disabled by default); add this line:
self.imageView.userInteractionEnabled = YES;
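For completeness, a sketch of what the swipeAction: handler might look like (the question never shows it; currentIndex is an assumed property):
// Hypothetical handler: stop the timed animation and show the next image.
- (void)swipeAction:(UISwipeGestureRecognizer *)gesture {
    [self.imageView stopAnimating];
    NSArray *images = self.imageView.animationImages;
    if (images.count == 0) return;
    self.currentIndex = (self.currentIndex + 1) % images.count; // assumed property
    self.imageView.image = images[self.currentIndex];
}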

TTImageView refresh image

I have the following code:
TTImageView *chart = [[[TTImageView alloc] initWithFrame:CGRectMake(2, 2, 30, 30)] autorelease];
chart.backgroundColor = [UIColor clearColor];
chart.defaultImage = nil;
chart.urlPath = @"http://test.com/test.png";
I need to refresh the image, so I call the reload method on the chart. However, the chart is not refreshed.
I found out that TTURLCache cached the image. So in the app delegate, right after the app starts, I do the following:
[TTURLCache sharedCache].disableImageCache = YES;
[[TTURLCache sharedCache] removeAll:YES];
However, the image is still not refreshed. Any help would be appreciated. I also noticed that whenever I call [chart reload] and then check [chart isLoading], it is always true, which means the request is somehow never sent.
Add a reload call after you set the URL, and make sure you are calling it from the main thread:
[chart reload];
I ran into a similar issue when I tried loading my users' avatars into a UIButton. I needed to use TTImageView's delegate protocol to update the button's image. For example:
self.avatarImageView = [[TTImageView alloc] initWithFrame:CGRectMake(10, 10, 50, 50)];
self.avatarImageView.defaultImage = TTIMAGE(@"bundle://defaultAvatar.png");
self.avatarImageView.delegate = self;
self.avatarImageView.urlPath = @"http://www.site.com/path/to/image.png";
Once the urlPath is set, the request is made in the background. When you conform to TTImageViewDelegate, you get access to the following delegate methods:
/**
 * Called when the image begins loading asynchronously.
 */
- (void)imageViewDidStartLoad:(TTImageView *)imageView {
    NSLog(@"loading image...");
}

/**
 * Called when the image finishes loading asynchronously.
 */
- (void)imageView:(TTImageView *)imageView didLoadImage:(UIImage *)image {
    NSLog(@"loaded image!");
    [self.avatarImageButton setImage:image forState:UIControlStateNormal];
}

/**
 * Called when the image failed to load asynchronously.
 * If error is nil then the request was cancelled.
 */
- (void)imageView:(TTImageView *)imageView didFailLoadWithError:(NSError *)error {
    NSLog(@"error loading image - %@", error);
}
When the image has loaded successfully, I can use it for my UIButton, as you can see in the imageView:didLoadImage: method.
Hope this helps shed some light on this issue. I know it's an old question, but maybe it'll help others later on.
After clearing the cache you should also reset the TTImageView. A simple way to do it would be to call [chart unsetImage] and then set the urlPath again.
Another way to do it, without clearing the cache, would be to add a dummy parameter to the URL:
chart.urlPath = @"http://test.com/test.png?dummy=dummy";
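To guarantee a cache miss on every reload, the dummy value can be made unique each time, e.g. with a timestamp (a sketch):
/* Each load gets a fresh URL, so TTURLCache never returns a stale image */
chart.urlPath = [NSString stringWithFormat:@"http://test.com/test.png?t=%.0f",
                 [[NSDate date] timeIntervalSince1970]];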

How to get the path and URL of a temporary image?

I capture an image of a webview that is playing a Flash movie. I want to show this image and use IKSaveOptions to save it, but I found that the path of the image is nil. How can I get the path of the image?
My code:
NSString *path = [[NSBundle mainBundle] pathForResource: ?? ofType:??];
NSURL *url = [NSURL fileURLWithPath: path];
Then I use the URL to show the image in the image view, but right now I can't get the path. Thanks a lot!
Thanks a lot. I see now that NSBundle is not the right option. But I still want to show the image captured from the webview showing Flash. I can show it in an image cell, but to show it in the image view I use this code:
[imageView setImage:image imageProperties:mImageProperties];
mImageProperties comes from the ImageKit documentation you pointed me to, in the section "Viewing an Image in an Image View", where mImageProperties holds the properties of the image:
mImageProperties = (NSDictionary *)CGImageSourceCopyPropertiesAtIndex(isr, 0, (CFDictionaryRef)mImageProperties);
That still uses the URL of the image. If I can't use NSBundle, how can I get the properties of the image? By the way, I already capture the image of the webview showing Flash like this:
NSBitmapImageRep *imageRep = [webView bitmapImageRepForCachingDisplayInRect:[webView frame]];
[webView cacheDisplayInRect:[webView frame] toBitmapImageRep:imageRep];
NSImage *image = [[NSImage alloc] initWithSize:[webView frame].size];
[image addRepresentation:imageRep];
Okay, you are confusing a lot of things here. First off, IKSaveOptions does not save an image. It is a mechanism for presenting an interface to the user about the options they want for saving, but it does not actually save the file anywhere. To save the file you use the underlying CGImage mechanisms, as described in the ImageKit documentation. I think if you read the example code there it will also be clear where the path to the saved file is.
Now, onto the second issue. You would never use pathForResource:ofType: to get it. That gets resources that are in your application bundle; in other words, things that are part of your application and that you include with it at build time. You should NEVER modify your bundle contents after build: aside from being complicated, it will invalidate code-signed applications. Instead you should probably use either CGImage or NSImage to read it in.
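To make that concrete, the CGImage-level save that the ImageKit docs describe looks roughly like this; a sketch where savePanel and saveOptions are the hypothetical NSSavePanel and IKSaveOptions from that flow, and image is the CGImageRef to write:
// Sketch: after the user confirms the save panel (with the IKSaveOptions
// accessory view attached), savePanel.URL supplies the destination path
// that was nil in the question.
NSURL *destURL = [savePanel URL];
CGImageDestinationRef dest = CGImageDestinationCreateWithURL((CFURLRef)destURL,
                                                             (CFStringRef)[saveOptions imageUTType],
                                                             1, NULL);
if (dest) {
    CGImageDestinationAddImage(dest, image, (CFDictionaryRef)[saveOptions imageProperties]);
    CGImageDestinationFinalize(dest);
    CFRelease(dest);
}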
OK, I have the answer:
NSBitmapImageRep *imageRep = [webView bitmapImageRepForCachingDisplayInRect:tmpRect];
[webView cacheDisplayInRect:tmpRect toBitmapImageRep:imageRep];
NSImage *theimage = [[NSImage alloc] initWithSize:tmpRect.size];
[theimage addRepresentation:imageRep];

CGImageSourceRef isr = CGImageSourceCreateWithData((CFDataRef)[theimage TIFFRepresentation], NULL);
CGImageRef image = NULL;
if (isr) {
    image = CGImageSourceCreateImageAtIndex(isr, 0, NULL);
    if (image) {
        mImageProperties = (NSDictionary *)CGImageSourceCopyPropertiesAtIndex(isr, 0, (CFDictionaryRef)mImageProperties);
    }
    CFRelease(isr);
}
if (image) {
    [imageView setImage:image imageProperties:mImageProperties];
    CGImageRelease(image);
}
