iOS 7 screenshot not taking blur effect into consideration - UIImage

I'm taking a screenshot with this code:
- (UIImage *)screenshot {
    UIGraphicsBeginImageContext(self.bounds.size);
    [self.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
but the resulting image doesn't show the alpha and blur effects properly.
Is there any way to fix this?

If you look into the documentation of renderInContext: you can see it has some downsides when it comes to animations and so on. If it isn't necessary to take a screenshot of the layer directly, try this instead:
- (UIImage *)screenshot {
    UIGraphicsBeginImageContextWithOptions(self.view.frame.size, YES, 0);
    [self.view drawViewHierarchyInRect:self.view.frame afterScreenUpdates:NO];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
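If the blur still doesn't come through, a variant worth trying (my own sketch, not part of the original answer) is to snapshot the whole window with afterScreenUpdates:YES, since UIKit blur effects are composited at the window level:

// Hedged sketch: snapshot the key window instead of a single view.
// Assumes iOS 7+; afterScreenUpdates:YES forces pending rendering
// (including effects) to complete before the capture.
- (UIImage *)windowScreenshot {
    UIWindow *window = [UIApplication sharedApplication].keyWindow;
    UIGraphicsBeginImageContextWithOptions(window.bounds.size, NO, 0);
    [window drawViewHierarchyInRect:window.bounds afterScreenUpdates:YES];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}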

Related

Take UIImage of UIView's layer property

I am adding multiple CAShapeLayers to the layer property of my UIView subclass by calling [self.layer addSublayer:shapeLayer];
To capture these layers in a UIImage, I have implemented the following method, which is meant to take an image of the layer property.
- (UIImage *)takeImage {
    UIGraphicsBeginImageContext(self.frame.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    if (context) {
        [self.layer renderInContext:context];
        UIImage *snapshotImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return snapshotImage;
    }
    return nil;
}
My problem is that the resulting image is completely blank. Why is this so?
The image was fine! I just hadn't set the IBOutlet properly.
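A blank or nil snapshot usually means the view was nil (or had zero size) at capture time, and messaging a nil outlet in Objective-C silently returns nil. A defensive check at the call site (my own sketch; drawingView is a hypothetical outlet name) makes that failure mode explicit:

// Hedged sketch: verify the outlet before snapshotting, since
// [nil takeImage] would just return nil without any warning.
NSAssert(self.drawingView != nil, @"IBOutlet drawingView is not connected");
NSAssert(!CGSizeEqualToSize(self.drawingView.frame.size, CGSizeZero),
         @"drawingView has zero size; the snapshot would be blank");
UIImage *snapshot = [self.drawingView takeImage];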

Combine a transparent image with a normal image

I want to add a transparent overlay to an image.
The code I am using is:
- (UIImage *)overlayImage
{
    VampirizerAppDelegate *app = (VampirizerAppDelegate *)[UIApplication sharedApplication].delegate;
    UIImage *pSourceImage;
    pSourceImage = pTempResult;
    UIImage *topOverlayImg = [UIImage imageNamed:@"overlay-disco-lights-top.png"];
    // size is taken from the background image
    UIGraphicsBeginImageContext(pSourceImage.size);
    [pSourceImage drawAtPoint:CGPointZero];
    CGRect topImageRect = CGRectMake(0, 0, pSourceImage.size.width, pSourceImage.size.height);
    [pSourceImage drawInRect:topImageRect blendMode:kCGBlendModeOverlay alpha:0.99];
    NSLog(@"topImageRect : %@", NSStringFromCGRect(topImageRect));
    UIImage *combinedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return combinedImage;
}
The problem is that the final image I am getting is not clear and is distorted.
Please help.
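One thing that stands out in the code above: topOverlayImg is created but never drawn, and the source image is drawn twice. A minimal compositing sketch (my own, under the assumption that the overlay was meant to be drawn on top of the background) would look like this:

// Hedged sketch: draw the overlay, not the source image, in the second pass.
// UIGraphicsBeginImageContextWithOptions preserves the background's scale,
// which avoids the blurriness a scale-1 context causes on Retina screens.
- (UIImage *)imageByOverlaying:(UIImage *)overlay onImage:(UIImage *)background
{
    UIGraphicsBeginImageContextWithOptions(background.size, NO, background.scale);
    [background drawAtPoint:CGPointZero];
    CGRect overlayRect = CGRectMake(0, 0, background.size.width, background.size.height);
    [overlay drawInRect:overlayRect blendMode:kCGBlendModeNormal alpha:1.0];
    UIImage *combinedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return combinedImage;
}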

CCLayer to UIImage - Anti-aliasing?

When I grab a snapshot of a CCLayer as a UIImage with the help of CCRenderTexture, it seems like I'm losing the anti-aliasing, resulting in the output image looking slightly different from what the screen actually shows.
Is there a way of getting an output image that corresponds more exactly to what is shown on the screen?
This is how I'm getting my UIImage:
- (UIImage *)layerRepresentation {
    CCLayer *layer1 = self;
    CCRenderTexture *renderer01 = [CCRenderTexture renderTextureWithWidth:layer1.contentSize.width height:layer1.contentSize.height];
    [renderer01 begin];
    [self visit];
    [renderer01 end];
    UIImage *image = [renderer01 getUIImage];
    return image;
}
When the CCRenderTexture is created, it sends a setAliasTexParameters message to its texture. Try
[renderer01.sprite.texture setAntiAliasTexParameters];
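Folded into the original method, that fix would look roughly like this (a sketch based on the answer above, untested against any specific cocos2d version):

// Hedged sketch: re-enable anti-aliasing on the render texture before drawing.
- (UIImage *)layerRepresentation {
    CCRenderTexture *renderer = [CCRenderTexture renderTextureWithWidth:self.contentSize.width
                                                                 height:self.contentSize.height];
    [renderer.sprite.texture setAntiAliasTexParameters]; // undo setAliasTexParameters
    [renderer begin];
    [self visit];
    [renderer end];
    return [renderer getUIImage];
}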

Image cropping an AVCaptureSession image

So I have been at it all day with no luck and, needless to say, it has been quite frustrating. I have looked up many examples and downloadable categories, all of which tout being able to crop images flawlessly. And they do. However, the minute I try to do the same with an image generated via AVCaptureSession, it does not work as well. I consulted both of these sources:
http://codefuel.wordpress.com/2011/04/22/image-cropping-from-a-uiscrollview/
http://vocaro.com/trevor/blog/2009/10/12/resize-a-uiimage-the-right-way/
and the project from the first link seems to work exactly as advertised, but as soon as I hack it to do the same magic on an AV capture image... nope.
Does anyone have insight into this? Also, here is my code for reference.
- (IBAction)TakePhotoPressed:(id)sender
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }
    //NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments)
        {
            // Do something with the attachments.
            //NSLog(@"attachments: %@", exifAttachments);
        }
        else
            NSLog(@"no attachments");
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        NSLog(@"%f", image.size.width);
        NSLog(@"%f", image.size.height);
        float scale = 1.0f / _scrollView.zoomScale;
        CGRect visibleRect;
        visibleRect.origin.x = _scrollView.contentOffset.x * scale;
        visibleRect.origin.y = _scrollView.contentOffset.x * scale; // note: using contentOffset.x here looks like a typo (probably should be .y)
        visibleRect.size.width = _scrollView.bounds.size.width * scale;
        visibleRect.size.height = _scrollView.bounds.size.height * scale;
        UIImage *cropped = [self cropImage:image withRect:visibleRect];
        [croppedImage setImage:cropped];
        [image release];
    }];
    [croppedImage setHidden:NO];
}
The cropImage function used above:
- (UIImage *)cropImage:(UIImage *)originalImage withRect:(CGRect)rect
{
    CGRect transformedRect = rect;
    if (originalImage.imageOrientation == UIImageOrientationRight)
    {
        transformedRect.origin.x = rect.origin.y;
        transformedRect.origin.y = originalImage.size.width - (rect.origin.x + rect.size.width);
        transformedRect.size.width = rect.size.height;
        transformedRect.size.height = rect.size.width;
    }
    CGImageRef cr = CGImageCreateWithImageInRect(originalImage.CGImage, transformedRect);
    UIImage *cropped = [UIImage imageWithCGImage:cr scale:originalImage.scale orientation:originalImage.imageOrientation];
    [croppedImage setFrame:CGRectMake(croppedImage.frame.origin.x,
                                      croppedImage.frame.origin.y,
                                      cropped.size.width,
                                      cropped.size.height)];
    CGImageRelease(cr);
    return cropped;
}
For the sake of completeness, and to arm whoever might help me with as much information as possible, I am also tempted to post the initialization of my scroll view and AV capture session. However, that may be a bit too much, so if you want to see it, just ask.
Now, as for the results of what the code actually does:
(Screenshot: what it looks like before I take the picture.)
(Screenshot: ...and after.)
EDIT:
Well, I have a few views now and no comments, so either no one has figured it out or it's so simple they assumed I would figure it out myself. In any case, I have not made any progress. So for anyone interested, here is a small sample app with the code all set up so you can see what I am doing:
https://docs.google.com/open?id=0Bxr4V3a9QFM_NnoxMkhzZTVNVEE
It seems that this little conundrum did not have only me stumped: after nearly a week, the scant few who viewed my question had no suggestions either. I must say I could not get it to work this way; I pondered, tinkered, and mused for a while to no avail. Until I did this:
[self HideElements];
UIGraphicsBeginImageContext(chosenPhotoView.frame.size);
[chosenPhotoView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[self ShowElements];
And that's it: less code, and it worked pretty much instantly. So instead of trying to crop the image via the scroll view, I take a screenshot of the screen at that moment and then crop the image using the scroll view's frame variables. The Hide/ShowElements functions hide any elements overlapping the picture I want.
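The crop step itself isn't shown in the answer; a minimal sketch of it (my own, assuming the scroll view's frame is expressed in chosenPhotoView's coordinate space and the names match the code above) might be:

// Hedged sketch: crop the full-view screenshot down to the scroll view's frame.
// Multiplying by viewImage.scale maps points to pixels, as required by
// CGImageCreateWithImageInRect (scale is 1.0 here since UIGraphicsBeginImageContext was used).
CGFloat pixelScale = viewImage.scale;
CGRect cropRect = CGRectMake(_scrollView.frame.origin.x * pixelScale,
                             _scrollView.frame.origin.y * pixelScale,
                             _scrollView.frame.size.width * pixelScale,
                             _scrollView.frame.size.height * pixelScale);
CGImageRef croppedRef = CGImageCreateWithImageInRect(viewImage.CGImage, cropRect);
UIImage *croppedResult = [UIImage imageWithCGImage:croppedRef
                                             scale:viewImage.scale
                                       orientation:viewImage.imageOrientation];
CGImageRelease(croppedRef);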

Setting image property of UIImageView causes major lag

Let me tell you about the problem I am having and how I tried to solve it. I have a UIScrollView which loads subviews as one scrolls from left to right. Each subview has 10-20 images around 400x200 each. When I scroll from view to view, I experience quite a bit of lag.
After investigating, I discovered that after unloading all the views and trying it again, the lag was gone. I figured that the synchronous caching of the images was the cause of the lag. So I created a subclass of UIImageView which loaded the images asynchronously. The loading code looks like the following (self.dispatchQueue returns a serial dispatch queue).
- (void)loadImageNamed:(NSString *)name {
    dispatch_async(self.dispatchQueue, ^{
        UIImage *image = [UIImage imageNamed:name];
        dispatch_sync(dispatch_get_main_queue(), ^{
            self.image = image;
        });
    });
}
However, after changing all of my UIImageViews to this subclass, I still experienced lag (I'm not sure if it was lessened or not). I boiled down the cause of the problem to self.image = image;. Why is this causing so much lag (but only on the first load)?
Please help me. =(
EDIT 3: iOS 15 now offers UIImage.prepareForDisplay(completionHandler:).
image.prepareForDisplay { decodedImage in
    imageView.image = decodedImage
}
or
imageView.image = await image.byPreparingForDisplay()
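For an Objective-C codebase like the one in this question, the equivalent (a sketch, assuming iOS 15+; the API documentation notes the completion handler may be called off the main queue) would be:

// Hedged sketch: Objective-C spelling of the same iOS 15 API.
[image prepareForDisplayWithCompletionHandler:^(UIImage * _Nullable decodedImage) {
    dispatch_async(dispatch_get_main_queue(), ^{
        imageView.image = decodedImage;
    });
}];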
EDIT 2: Here is a Swift version that contains a few improvements. (Untested.)
https://gist.github.com/fumoboy007/d869e66ad0466a9c246d
EDIT: Actually, I believe all that is necessary is the following. (Untested.)
- (void)loadImageNamed:(NSString *)name {
    dispatch_async(self.dispatchQueue, ^{
        // Determine path to image depending on scale of device's screen,
        // fall back to 1x if 2x is not available
        NSString *pathTo1xImage = [[NSBundle mainBundle] pathForResource:name ofType:@"png"];
        NSString *pathTo2xImage = [[NSBundle mainBundle] pathForResource:[name stringByAppendingString:@"@2x"] ofType:@"png"];
        NSString *pathToImage = ([UIScreen mainScreen].scale == 1 || !pathTo2xImage) ? pathTo1xImage : pathTo2xImage;
        UIImage *image = [[UIImage alloc] initWithContentsOfFile:pathToImage];
        // Decompress image
        if (image) {
            UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
            [image drawAtPoint:CGPointZero];
            image = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
        }
        // Configure the UI with pre-decompressed UIImage
        dispatch_async(dispatch_get_main_queue(), ^{
            self.image = image;
        });
    });
}
ORIGINAL ANSWER: It turns out that it wasn't self.image = image; directly. The UIImage image loading methods don't decompress and process the image data right away; they do it when the view refreshes its display. So the solution was to go a level lower to Core Graphics and decompress and process the image data myself. The new code looks like the following.
- (void)loadImageNamed:(NSString *)name {
    dispatch_async(self.dispatchQueue, ^{
        // Determine path to image depending on scale of device's screen,
        // fall back to 1x if 2x is not available
        NSString *pathTo1xImage = [[NSBundle mainBundle] pathForResource:name ofType:@"png"];
        NSString *pathTo2xImage = [[NSBundle mainBundle] pathForResource:[name stringByAppendingString:@"@2x"] ofType:@"png"];
        NSString *pathToImage = ([UIScreen mainScreen].scale == 1 || !pathTo2xImage) ? pathTo1xImage : pathTo2xImage;
        UIImage *uiImage = nil;
        if (pathToImage) {
            // Load the image
            CGDataProviderRef imageDataProvider = CGDataProviderCreateWithFilename([pathToImage fileSystemRepresentation]);
            CGImageRef image = CGImageCreateWithPNGDataProvider(imageDataProvider, NULL, NO, kCGRenderingIntentDefault);
            // Create a bitmap context from the image's specifications
            // (Note: We need to specify kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little
            // because PNGs are optimized by Xcode this way.)
            CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
            CGContextRef bitmapContext = CGBitmapContextCreate(NULL, CGImageGetWidth(image), CGImageGetHeight(image), CGImageGetBitsPerComponent(image), CGImageGetWidth(image) * 4, colorSpace, kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);
            // Draw the image into the bitmap context
            CGContextDrawImage(bitmapContext, CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);
            // Extract the decompressed image
            CGImageRef decompressedImage = CGBitmapContextCreateImage(bitmapContext);
            // Create a UIImage
            uiImage = [[UIImage alloc] initWithCGImage:decompressedImage];
            // Release everything
            CGImageRelease(decompressedImage);
            CGContextRelease(bitmapContext);
            CGColorSpaceRelease(colorSpace);
            CGImageRelease(image);
            CGDataProviderRelease(imageDataProvider);
        }
        // Configure the UI with pre-decompressed UIImage
        dispatch_async(dispatch_get_main_queue(), ^{
            self.image = uiImage;
        });
    });
}
I think the problem could be the images themselves. For example, in one of my projects I had 10 images of 640x600, layered with alpha transparency on top of each other. When I tried to push or pop a view controller from that view controller, it lagged a lot.
When I left only a few images, or used much smaller images, there was no lag.
P.S. Tested with SDK 4.2/iOS 5 on an iPhone 4.
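If large source images turn out to be the culprit, one option (my own sketch, not from this answer) is to downsample them off the main thread before display:

// Hedged sketch: redraw a large UIImage at a smaller target size to cut
// decode and compositing cost. UIGraphics image contexts are thread-safe
// since iOS 4, so this can run on a background queue.
static UIImage *DownsampledImage(UIImage *image, CGSize targetSize) {
    UIGraphicsBeginImageContextWithOptions(targetSize, NO, 0);
    [image drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}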
