I'm trying to merge two images onto a background image. This works fine on non-Retina Macs.
The resulting image has the size of my original background image, but on a Retina MacBook Pro the resulting image is doubled. I read that one should use imageWithSize:flipped:drawingHandler: to solve this, because since 10.8 / Retina support an image's size is no longer measured in pixels; as we know, it's points.
I have no clue how I should change my code to solve this.
What I have so far:
NSSize size = background.size ;
GLsizei width = (GLsizei)size.width;
GLsizei height = (GLsizei)size.height;
NSBitmapImageRep* imgRep = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:NULL pixelsWide:width pixelsHigh:height bitsPerSample:8 samplesPerPixel:3 hasAlpha:NO isPlanar:NO colorSpaceName:NSDeviceRGBColorSpace bytesPerRow:width*3 bitsPerPixel:0];
#ifdef __MAC_10_8
NSImage* backgroundWrapper = [NSImage imageWithSize:size flipped:YES drawingHandler:^BOOL(NSRect dstRect){ return [imgRep drawInRect:dstRect]; }];
NSPoint backgroundPoint;
backgroundPoint.x = 0;
backgroundPoint.y = 0;
[backgroundWrapper lockFocusFlipped:YES];
[background drawAtPoint:backgroundPoint fromRect:NSZeroRect operation:NSCompositeSourceOver fraction:1.0];
[firstImage drawAtPoint:firstImagePoint fromRect:NSZeroRect operation:NSCompositeSourceOver fraction:1.0];
[backgroundWrapper unlockFocus];
NSBitmapImageRep *bmpImageRep = [[NSBitmapImageRep alloc]initWithData:[backgroundWrapper TIFFRepresentation]];
[backgroundWrapper addRepresentation:bmpImageRep];
#else
[background lockFocusFlipped:YES];
[firstImage drawAtPoint:firstImagePoint fromRect:NSZeroRect operation:NSCompositeSourceOver fraction:1.0];
[background unlockFocus];
NSBitmapImageRep *bmpImageRep = [[NSBitmapImageRep alloc]initWithData:[background TIFFRepresentation]];
[background addRepresentation:bmpImageRep];
#endif
NSData *data = [bmpImageRep representationUsingType: NSPNGFileType properties: nil];
How can I get this to work on a Retina display?
edit
I also tried this:
- (NSImage *)drawImage:(NSImage *)background with:(NSImage *)firstImage {
return [NSImage imageWithSize:NSMakeSize(3200, 2000) flipped:NO drawingHandler:^BOOL(NSRect dstRect) {
NSPoint backgroundPoint; backgroundPoint.x = 0;
backgroundPoint.y = 0;
[background drawAtPoint:backgroundPoint fromRect:NSZeroRect operation:NSCompositeSourceOver fraction:1.0];
[firstImage drawAtPoint:backgroundPoint fromRect:NSZeroRect operation:NSCompositeSourceOver fraction:1.0];
return YES;
}];
}
But the resulting image is 6400x4000 px, again doubled.
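One way I know of to sidestep the backing scale factor entirely is to draw into an NSBitmapImageRep that has explicit pixel dimensions, instead of relying on lockFocus on an NSImage. This is only a rough sketch under the assumption that you want the output pixels to match the background's point size 1:1 (it reuses the background, firstImage and firstImagePoint variables from above):
NSSize size = background.size;
NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:NULL pixelsWide:(NSInteger)size.width pixelsHigh:(NSInteger)size.height bitsPerSample:8 samplesPerPixel:4 hasAlpha:YES isPlanar:NO colorSpaceName:NSCalibratedRGBColorSpace bytesPerRow:0 bitsPerPixel:0];
[rep setSize:size]; // 1:1 points to pixels, so the screen's backing factor never doubles the result
[NSGraphicsContext saveGraphicsState];
[NSGraphicsContext setCurrentContext:[NSGraphicsContext graphicsContextWithBitmapImageRep:rep]];
[background drawInRect:NSMakeRect(0, 0, size.width, size.height) fromRect:NSZeroRect operation:NSCompositeSourceOver fraction:1.0];
[firstImage drawAtPoint:firstImagePoint fromRect:NSZeroRect operation:NSCompositeSourceOver fraction:1.0];
[NSGraphicsContext restoreGraphicsState];
NSData *data = [rep representationUsingType:NSPNGFileType properties:nil];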
Related
How can I shift an NSImage horizontally so that the shifted pixels appear on the other side, making it look like a loop?
Currently I am using drawInRect:. Is there a CIFilter or smarter way to do this?
- (CIImage *)image:(NSImage *)image shiftedBy:(CGFloat)shiftAmount
{
NSUInteger width = image.size.width;
NSUInteger height = image.size.height;
NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:NULL
pixelsWide:width
pixelsHigh:height
bitsPerSample:8
samplesPerPixel:4
hasAlpha:YES
isPlanar:NO
colorSpaceName:NSDeviceRGBColorSpace
bytesPerRow:0
bitsPerPixel:0];
[rep setSize:NSMakeSize(width, height)];
[NSGraphicsContext saveGraphicsState];
NSGraphicsContext *context = [NSGraphicsContext graphicsContextWithBitmapImageRep:rep];
[NSGraphicsContext setCurrentContext:context];
// split the image at shiftAmount: the strip to the left of the split wraps around to the right edge
CGRect rect0 = CGRectMake(0, 0, width, height);
CGRect leftSourceRect, rightSourceRect;
CGRectDivide(rect0, &leftSourceRect, &rightSourceRect, shiftAmount, CGRectMinXEdge);
CGRect rightDestinationRect = CGRectOffset(leftSourceRect, width - rightSourceRect.origin.x, 0);
CGRect leftDestinationRect = rightSourceRect;
leftDestinationRect.origin.x = 0;
[image drawInRect:leftDestinationRect fromRect:rightSourceRect operation:NSCompositingOperationSourceOver fraction:1.0];
[image drawInRect:rightDestinationRect fromRect:leftSourceRect operation:NSCompositingOperationSourceOver fraction:1.0];
[NSGraphicsContext restoreGraphicsState];
return [[CIImage alloc] initWithBitmapImageRep:rep];
}
I tried it with a CIFilter; however, the performance is 3-4x slower. The code is more readable, though.
- (CIImage *)image:(NSImage *)image shiftXBy:(CGFloat)shiftX YBy:(CGFloat)shiftY
{
//avoid calling TIFFRepresentation here because the performance hit is even bigger
CIImage *ciImage = [[CIImage alloc] initWithData:[image TIFFRepresentation]];
CGAffineTransform xform = CGAffineTransformIdentity;
NSValue *xformObj = [NSValue valueWithBytes:&xform objCType:@encode(CGAffineTransform)];
ciImage = [ciImage imageByApplyingFilter:@"CIAffineTile"
withInputParameters:@{kCIInputTransformKey : xformObj} ];
ciImage = [ciImage imageByCroppingToRect:CGRectMake(shiftX, shiftY, image.size.width, image.size.height)];
return ciImage;
}
For optimum performance you need to use CALayer. Here's the basic concept:
Your NSView subclass should have wantsLayer and layerUsesCoreImageFilters set to YES.
Assign your image to the contents property of the view's layer (or add a new sublayer).
Create a CIAffineTile filter and add it to the layer's filters.
Now you can change the filter's values without reloading or redrawing the image, and all of it is hardware accelerated.
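A minimal sketch of that setup, assuming a layer-backed NSView subclass and an image variable holding the NSImage to shift; the filter name "tile" and the shiftX variable are illustrative, not from the original answer:
// In the view's setup code:
self.wantsLayer = YES;
self.layerUsesCoreImageFilters = YES;
self.layer.contents = image; // the NSImage to display
CIFilter *tile = [CIFilter filterWithName:@"CIAffineTile"];
[tile setDefaults];
[tile setName:@"tile"]; // illustrative name, needed to address the filter via key path
self.layer.filters = @[tile];
// Later, shift without redrawing the image:
NSAffineTransform *xform = [NSAffineTransform transform];
[xform translateXBy:shiftX yBy:0];
[self.layer setValue:xform forKeyPath:@"filters.tile.inputTransform"];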
Hi, I'm trying to resize an image for the pasteboard, but I keep getting an error. Is there a way to fix it?
This is the code I’m using:
- (NSImage *)imageResize:(NSImage*)anImage newSize:(NSSize)newSize {
NSImage *sourceImage = anImage;
[sourceImage setScalesWhenResized:YES]; // deprecated since 10.6 and a no-op on modern systems
// Report an error if the source isn't a valid image
if (![sourceImage isValid]){
NSLog(#"Invalid Image");
} else {
NSImage *smallImage = [[NSImage alloc] initWithSize: newSize];
[smallImage lockFocus];
[sourceImage setSize: newSize];
[[NSGraphicsContext currentContext] setImageInterpolation:NSImageInterpolationHigh];
[sourceImage drawAtPoint:NSZeroPoint fromRect:CGRectMake(0, 0, newSize.width, newSize.height) operation:NSCompositeCopy fraction:1.0];
[smallImage unlockFocus];
return smallImage;
}
return nil;
}
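For what it's worth, here is a variant that avoids the deprecated setScalesWhenResized:/setSize: calls by drawing the whole source image straight into the destination rect. It's only a sketch of an alternative, not necessarily a fix for the specific error you are seeing:
NSImage *smallImage = [[NSImage alloc] initWithSize:newSize];
[smallImage lockFocus];
[[NSGraphicsContext currentContext] setImageInterpolation:NSImageInterpolationHigh];
// NSZeroRect as the source rect means "the entire source image"
[sourceImage drawInRect:NSMakeRect(0, 0, newSize.width, newSize.height) fromRect:NSZeroRect operation:NSCompositeCopy fraction:1.0];
[smallImage unlockFocus];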
I want to print an IKImageBrowserView with its (content) images. I tried the following code:
if (code == NSOKButton) {
NSPrintInfo *printInfo;
NSPrintInfo *sharedInfo;
NSPrintOperation *printOp;
NSMutableDictionary *printInfoDict;
NSMutableDictionary *sharedDict;
sharedInfo = [NSPrintInfo sharedPrintInfo];
sharedDict = [sharedInfo dictionary];
printInfoDict = [NSMutableDictionary dictionaryWithDictionary: sharedDict];
[printInfoDict setObject:NSPrintSaveJob
forKey:NSPrintJobDisposition];
[printInfoDict setObject:[sheet filename] forKey:NSPrintSavePath];
printInfo = [[NSPrintInfo alloc] initWithDictionary:printInfoDict];
[printInfo setHorizontalPagination: NSAutoPagination];
[printInfo setVerticalPagination: NSAutoPagination];
[printInfo setVerticallyCentered:NO];
printOp = [NSPrintOperation printOperationWithView:imageBrowser
printInfo:printInfo];
[printOp setShowsProgressPanel:NO];
[printOp runOperation];
}
because IKImageBrowserView inherits from NSView, but the print preview is showing a null image. Please help me to overcome this problem. Thanks in advance.
/*
1) allocate a c buffer at the size of the visible rect of the image
browser
*/
NSRect vRect = [imageBrowser visibleRect];
NSSize size = vRect.size;
NSLog(#"Size W = %f and H = %f", size.width, size.height);
void *buffer = malloc(size.width * size.height * 4);
//2) read the pixels using openGL
[imageBrowser lockFocus];
glReadPixels(0,
0,
size.width,
size.height,
GL_RGBA,
GL_UNSIGNED_BYTE,
buffer);
[imageBrowser unlockFocus];
//3) create a bitmap with those pixels
unsigned char *planes[2];
planes[0] = (unsigned char *) (buffer);
NSBitmapImageRep *imageRep = [[NSBitmapImageRep alloc]
initWithBitmapDataPlanes:planes pixelsWide:size.width
pixelsHigh:size.height bitsPerSample:8 samplesPerPixel:4 hasAlpha:YES
isPlanar:NO colorSpaceName:NSDeviceRGBColorSpace bitmapFormat:0
bytesPerRow:size.width*4 bitsPerPixel:32];
/*
4) create a temporary image with this bitmap and set it flipped
(because openGL and the AppKit don't have the same pixels coordinate
system)
*/
NSImage *img = [[NSImage alloc] initWithSize:size];
[img addRepresentation:imageRep];
[img setFlipped:YES];
[imageRep release];
/*
5) draw this temporary image into another image so that we get an
image without any reference to our "buffer" buffer so that we can
release it after that
*/
NSImage *finalImage = [[NSImage alloc] initWithSize:size];
[finalImage lockFocus];
[img drawAtPoint:NSZeroPoint
fromRect:NSMakeRect(0,0,size.width,size.height)
operation:NSCompositeCopy fraction:1.0];
[finalImage unlockFocus];
//[NSString stringWithFormat:@"/tmp/%@.tiff", marker]
NSData *imageData = [finalImage TIFFRepresentation];
// note: this writes TIFF data even though the file name ends in .png
NSString *writeToFileName = [NSString stringWithFormat:@"/Users/Desktop/%@.png", [NSDate date]];
[imageData writeToFile:writeToFileName atomically:NO];
//6) release intermediate objects
[img release];
free(buffer);
After this I send imageData to print, which works great for me.
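In case it is useful, this is a rough sketch of one way to hand that data to a print job: wrap it in an NSImageView and give that view to NSPrintOperation (the printImage/printView names are mine, and it assumes an NSPrintInfo configured as in the question):
NSImage *printImage = [[NSImage alloc] initWithData:imageData];
NSImageView *printView = [[NSImageView alloc] initWithFrame:NSMakeRect(0, 0, printImage.size.width, printImage.size.height)];
[printView setImage:printImage];
NSPrintOperation *printOp = [NSPrintOperation printOperationWithView:printView printInfo:printInfo];
[printOp runOperation];
// manual retain/release, to match the rest of the snippet
[printView release];
[printImage release];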
I'm trying to set a custom drag icon for use in an NSTableView. Everything seems to work but I've run into a problem due to my inexperience with Quartz.
- (NSImage *)dragImageForRowsWithIndexes:(NSIndexSet *)dragRows tableColumns:(NSArray *)tableColumns event:(NSEvent *)dragEvent offset:(NSPointPointer)dragImageOffset
{
NSImage *dragImage = [NSImage imageNamed:@"icon.png"];
NSString *count = [NSString stringWithFormat:@"%d", [dragRows count]];
[dragImage lockFocus];
[dragImage compositeToPoint:NSZeroPoint fromRect:NSZeroRect operation:NSCompositeSourceOver fraction:0.5];
[count drawAtPoint:NSZeroPoint withAttributes:nil];
[dragImage unlockFocus];
return dragImage;
}
Essentially what I'm looking to do is render my icon.png file at 50% opacity along with an NSString that shows the number of rows currently being dragged. The issue I'm seeing is that my NSString renders at a low opacity, but my icon does not.
The issue is that you’re drawing your icon on top of itself. What you probably want is something like this:
- (NSImage *)dragImageForRowsWithIndexes:(NSIndexSet *)dragRows tableColumns:(NSArray *)tableColumns event:(NSEvent *)dragEvent offset:(NSPointPointer)dragImageOffset
{
NSImage *icon = [NSImage imageNamed:@"icon.png"];
NSString *count = [NSString stringWithFormat:@"%lu", [dragRows count]];
NSImage *dragImage = [[[NSImage alloc] initWithSize:[icon size]] autorelease];
[dragImage lockFocus];
[icon drawAtPoint:NSZeroPoint fromRect:NSZeroRect operation:NSCompositeSourceOver fraction:0.5];
[count drawAtPoint:NSZeroPoint withAttributes:nil];
[dragImage unlockFocus];
return dragImage;
}
I'm trying to do image scaling using Core Image, using the Lanczos Scale Transform filter.
It is fine when scaling up, but when scaling down and saving to JPEG I found a class of images that produces strange noise artifacts and behavior.
For inputScale values of 0.5, 0.25, 0.125, etc. it is always fine.
For other inputScale values, it's broken.
When I save to TIFF, PNG, or JPEG 2000, or draw on screen, it's fine.
When I save to JPEG or BMP, it's broken.
I uploaded sample images:
(original, 4272x2848, 3.4Mb)
[http://www.zoomfoot.com/get/package/517e2423-7795-4239-a166-03d507ec51d8]
(scaled with noise, 1920x1280, 2.2Mb)
[http://www.zoomfoot.com/get/package/6eb64d33-3a30-4e8d-9953-67ce1e7d7ef9]
It is also quite reproducible on 'sunset' images.
I've tried a couple more ways to scale images: using CIAffineTransform, and drawing with NSImage itself. Both produce results without any noise (a rough sketch of the CIAffineTransform variant appears after the code below).
Here is the code I was using for scaling & saving:
================= Lanczos ==============
- (void) scaleLanczosImage:(NSString *)path
Width:(int) desiredWidth
Height:(int) desiredHeight
{
CIImage *image=[CIImage imageWithContentsOfURL:
[NSURL fileURLWithPath:path]];
CIFilter *scaleFilter = [CIFilter filterWithName:@"CILanczosScaleTransform"];
int originalHeight=[image extent].size.height;
int originalWidth=[image extent].size.width;
float yScale=(float)desiredHeight / (float)originalHeight;
float xScale=(float)desiredWidth / (float)originalWidth;
float scale=fminf(yScale, xScale);
[scaleFilter setValue:[NSNumber numberWithFloat:scale]
forKey:@"inputScale"];
[scaleFilter setValue:[NSNumber numberWithFloat:1.0]
forKey:@"inputAspectRatio"];
[scaleFilter setValue: image
forKey:@"inputImage"];
image = [scaleFilter valueForKey:@"outputImage"];
NSBitmapImageRep* rep = [[NSBitmapImageRep alloc] initWithCIImage:image];
NSMutableDictionary *options=[NSMutableDictionary dictionaryWithObject:[NSDecimalNumber numberWithFloat:1.0]
forKey:NSImageCompressionFactor];
[options setValue:[NSNumber numberWithBool:YES] forKey:NSImageProgressive];
NSData* jpegData = [rep representationUsingType:NSJPEGFileType
properties:options];
[jpegData writeToFile:#"/Users/dmitry/tmp/img_4389-800x600.jpg" atomically:YES];
}
================= NSImage ==============
- (void) scaleNSImage:(NSString *)path
Width:(int) desiredWidth
Height:(int) desiredHeight
{
NSImage *sourceImage = [[NSImage alloc] initWithContentsOfFile:path];
NSImage *resizedImage = [[NSImage alloc] initWithSize:
NSMakeSize(desiredWidth, desiredHeight)];
NSSize originalSize = [sourceImage size];
[resizedImage lockFocus];
[sourceImage drawInRect: NSMakeRect(0, 0, desiredWidth, desiredHeight)
fromRect: NSMakeRect(0, 0, originalSize.width, originalSize.height)
operation: NSCompositeSourceOver fraction: 1.0];
[resizedImage unlockFocus];
NSMutableDictionary *options=[NSMutableDictionary dictionaryWithObject:[NSDecimalNumber numberWithFloat:1.0]
forKey:NSImageCompressionFactor];
[options setValue:[NSNumber numberWithBool:YES] forKey:NSImageProgressive];
NSBitmapImageRep *rep = [[NSBitmapImageRep alloc]
initWithData:[resizedImage TIFFRepresentation]];
NSData* jpegData = [rep representationUsingType:NSJPEGFileType
properties:options];
[jpegData writeToFile:#"~/nsimg_4389-800x600.jpg" atomically:YES];
}
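For completeness, this is a rough reconstruction of the CIAffineTransform variant mentioned above; it is not the original poster's code, and the method name scaleWithAffineTransform:scale: is made up for illustration:
- (CIImage *) scaleWithAffineTransform:(CIImage *)image scale:(float)scale
{
// scale uniformly with CIAffineTransform instead of CILanczosScaleTransform
CIFilter *transformFilter = [CIFilter filterWithName:@"CIAffineTransform"];
NSAffineTransform *xform = [NSAffineTransform transform];
[xform scaleBy:scale];
[transformFilter setValue:xform forKey:@"inputTransform"];
[transformFilter setValue:image forKey:@"inputImage"];
return [transformFilter valueForKey:@"outputImage"];
}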
If anyone can explain or suggest something here, I'd really appreciate it.
Thanks.
There are some quirks when scaling with CIImage. Will this post by Dan Wood help?
My understanding is that due to its nature, the hardware Core Image renderer has some odd quirks when rendering for anything but on-screen display. So, you are recommended to use the software renderer for other situations.
http://developer.apple.com/mac/library/qa/qa2005/qa1416.html
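In practice that means rendering the filter output through a CIContext created with the software renderer option, instead of going through -[NSBitmapImageRep initWithCIImage:]. A rough sketch, assuming a macOS version where +contextWithOptions: is available and reusing the image and options variables from the Lanczos method above:
CIContext *softwareContext = [CIContext contextWithOptions:@{ kCIContextUseSoftwareRenderer : @YES }];
CGImageRef cgImage = [softwareContext createCGImage:image fromRect:[image extent]];
NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithCGImage:cgImage];
CGImageRelease(cgImage);
NSData *jpegData = [rep representationUsingType:NSJPEGFileType properties:options];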
That's definitely a Core Image bug. File it at bugreport.apple.com.