Drawing on NSBitmapImageRep (macOS)

I'm trying to create an in-memory image, draw on it, and save it to disk.
My current code is:
NSBitmapImageRep *rep = [[NSBitmapImageRep alloc]
initWithBitmapDataPlanes:NULL
pixelsWide:256
pixelsHigh:256
bitsPerSample:8
samplesPerPixel:4
hasAlpha:YES
isPlanar:YES
colorSpaceName:NSDeviceRGBColorSpace
bitmapFormat:NSAlphaFirstBitmapFormat
bytesPerRow:0
bitsPerPixel:8
];
[NSGraphicsContext saveGraphicsState];
[NSGraphicsContext setCurrentContext:[NSGraphicsContext graphicsContextWithBitmapImageRep:rep]];
// Draw your content...
NSRect aRect=NSMakeRect(10.0,10.0,30.0,30.0);
NSBezierPath *thePath=[NSBezierPath bezierPathWithRect:aRect];
[[NSColor redColor] set];
[thePath fill];
[NSGraphicsContext restoreGraphicsState];
NSData *data = [rep representationUsingType: NSPNGFileType properties: nil];
[data writeToFile: @"test.png" atomically: NO];
When trying to draw into the current context, I get the error:
CGContextSetFillColorWithColor: invalid context 0x0
What is wrong here? Why is the context returned for the NSBitmapImageRep NULL?
What is the best way to create a drawn image and save it?
UPDATE:
I finally came to the following solution:
NSImage *image = [[NSImage alloc] initWithSize:NSMakeSize(256, 256)];
[image lockFocus];
NSRect aRect=NSMakeRect(10.0,10.0,30.0,30.0);
NSBezierPath *thePath=[NSBezierPath bezierPathWithRect:aRect];
[[NSColor redColor] set];
[thePath fill];
[image unlockFocus];
NSData *data = [image TIFFRepresentation];
[data writeToFile: @"test.png" atomically: NO];
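Note that -TIFFRepresentation returns TIFF-encoded data, so the test.png written above really contains a TIFF. A minimal sketch of getting genuine PNG bytes from the same image instead, reusing the NSPNGFileType constant from the first snippet:
// Sketch only: re-encode the drawn image as PNG before writing it out.
NSBitmapImageRep *pngRep = [[NSBitmapImageRep alloc] initWithData:[image TIFFRepresentation]];
NSData *pngData = [pngRep representationUsingType:NSPNGFileType properties:nil];
[pngData writeToFile:@"test.png" atomically:NO];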

Your workaround is valid for the task at hand; however, it's a more expensive operation than actually getting the NSBitmapImageRep to work! See http://cocoadev.com/wiki/NSBitmapImageRep for a bit of discussion.
Notice that the documentation for [NSGraphicsContext graphicsContextWithBitmapImageRep:] says:
"This method accepts only single plane NSBitmapImageRep instances."
You are setting up your NSBitmapImageRep with isPlanar:YES, which therefore uses multiple planes. Set it to NO and you should be good to go!
In other words:
NSBitmapImageRep *rep = [[NSBitmapImageRep alloc]
initWithBitmapDataPlanes:NULL
pixelsWide:256
pixelsHigh:256
bitsPerSample:8
samplesPerPixel:4
hasAlpha:YES
isPlanar:NO
colorSpaceName:NSDeviceRGBColorSpace
bitmapFormat:NSAlphaFirstBitmapFormat
bytesPerRow:0
bitsPerPixel:0
];
// etc...
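For reference, the rest of the flow might then look just like the code from the question, e.g. (sketch only, using the same NSPNGFileType constant as the question):
// Sketch only: draw into the non-planar rep's context and write it out as PNG.
[NSGraphicsContext saveGraphicsState];
[NSGraphicsContext setCurrentContext:[NSGraphicsContext graphicsContextWithBitmapImageRep:rep]];
[[NSColor redColor] set];
[[NSBezierPath bezierPathWithRect:NSMakeRect(10.0, 10.0, 30.0, 30.0)] fill];
[NSGraphicsContext restoreGraphicsState];
NSData *data = [rep representationUsingType:NSPNGFileType properties:nil];
[data writeToFile:@"test.png" atomically:NO];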

Related

Looking for a simple example of writing text into an NSImage using Cocoa

NSImage *myNewIconImage=[[NSImage imageNamed:@"FanImage"] copy];
[myNewIconImage lockFocus];
[@"15" drawAtPoint:NSZeroPoint withAttributes:nil];
[myNewIconImage unlockFocus];
[myNewIconImage setTemplate:YES];
[[NSApplication sharedApplication] setApplicationIconImage:myNewIconImage];
I am looking for a way to simply write a string onto this image, and I'm coming up very short. This does not work for me.
The following code will place a mutable attributed string on an NSImage:
NSImageView *imageView = [[NSImageView alloc] initWithFrame:NSMakeRect( 0, 0, _wndW, _wndH )];
[[window contentView] addSubview:imageView];
NSImage *image = [NSImage imageNamed:@"myImage.jpg"];
[image lockFocus];
NSMutableDictionary *attr = [NSMutableDictionary dictionary];
[attr setObject:[NSFont fontWithName:@"Lucida Grande" size:36] forKey:NSFontAttributeName];
[attr setObject: [NSNumber numberWithFloat: 10.0] forKey: NSStrokeWidthAttributeName];
[attr setObject:[NSColor whiteColor] forKey:NSForegroundColorAttributeName];
NSString *myStr = @"Welcome to Cocoa";
NSMutableAttributedString *s = [[NSMutableAttributedString alloc] initWithString: myStr attributes:attr];
[s drawAtPoint:NSMakePoint(130,130)];
[image unlockFocus];
[imageView setImage:image];

How to shift image horizontally to create loop effect?

How do I shift an NSImage horizontally so that the shifted pixels appear on the other side and the image looks like a loop?
Currently I am using drawInRect:. Is there a CIFilter or a smarter way to do this?
- (CIImage *)image:(NSImage *)image shiftedBy:(CGFloat)shiftAmount
{
NSUInteger width = image.size.width;
NSUInteger height = image.size.height;
NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:NULL
pixelsWide:width
pixelsHigh:height
bitsPerSample:8
samplesPerPixel:4
hasAlpha:YES
isPlanar:NO
colorSpaceName:NSDeviceRGBColorSpace
bytesPerRow:0
bitsPerPixel:0];
[rep setSize:NSMakeSize(width, height)];
[NSGraphicsContext saveGraphicsState];
NSGraphicsContext *context = [NSGraphicsContext graphicsContextWithBitmapImageRep:rep];
[NSGraphicsContext setCurrentContext:context];
CGRect rect0 = CGRectMake(0, 0, width, height);
CGRect leftSourceRect, rightSourceRect;
CGRectDivide(rect0, &leftSourceRect, &rightSourceRect, shiftAmount, CGRectMinXEdge);
CGRect rightDestinationRect = CGRectOffset(leftSourceRect, width - rightSourceRect.origin.x, 0);
CGRect leftDestinationRect = rightSourceRect;
leftDestinationRect.origin.x = 0;
[image drawInRect:leftDestinationRect fromRect:rightSourceRect operation:NSCompositingOperationSourceOver fraction:1.0];
[image drawInRect:rightDestinationRect fromRect:leftSourceRect operation:NSCompositingOperationSourceOver fraction:1.0];
[NSGraphicsContext restoreGraphicsState];
return [[CIImage alloc] initWithBitmapImageRep:rep];
}
I tried it with a CIFilter, but it is 3-4x slower. The code is more readable, though.
- (CIImage *)image:(NSImage *)image shiftXBy:(CGFloat)shiftX YBy:(CGFloat)shiftY
{
// avoid calling TIFFRepresentation here because the performance hit is even bigger
CIImage *ciImage = [[CIImage alloc] initWithData:[image TIFFRepresentation]];
CGAffineTransform xform = CGAffineTransformIdentity;
NSValue *xformObj = [NSValue valueWithBytes:&xform objCType:@encode(CGAffineTransform)];
ciImage = [ciImage imageByApplyingFilter:@"CIAffineTile"
withInputParameters:@{kCIInputTransformKey : xformObj} ];
ciImage = [ciImage imageByCroppingToRect:CGRectMake(shiftX, shiftY, image.size.width, image.size.height)];
return ciImage;
}
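In either version, the returned CIImage still has to be wrapped before AppKit can display it. A minimal sketch of a call site, where sourceImage, currentShift, and imageView are hypothetical names for the source NSImage, the shift amount, and an NSImageView outlet:
// Sketch only: wrap the CIImage in an NSCIImageRep so an NSImageView can show it.
CIImage *shifted = [self image:sourceImage shiftedBy:currentShift];
NSCIImageRep *ciRep = [NSCIImageRep imageRepWithCIImage:shifted];
NSImage *result = [[NSImage alloc] initWithSize:ciRep.size];
[result addRepresentation:ciRep];
imageView.image = result;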
For optimum performance you need to use CALayer. Here's the basic concept:
Your NSView subclass should have wantsLayer and layerUsesCoreImageFilters set to YES.
Assign your image to the contents property of the view's layer (or add a new sublayer).
Create a CIAffineTile filter and add it to the layer's filters array.
Now you can change the filter's values without reloading or redrawing the image, and all of this is hardware-accelerated.
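A rough sketch of that setup, assuming the view, the image, and the shift amount are available as myView, image, and shiftX (all hypothetical names):
// Sketch only: make the view layer-backed and let CIAffineTile do the tiling.
myView.wantsLayer = YES;
myView.layerUsesCoreImageFilters = YES;   // required for Core Image filters on a layer
myView.layer.contents = image;            // on macOS a CALayer accepts an NSImage as contents

CIFilter *tile = [CIFilter filterWithName:@"CIAffineTile"];
[tile setDefaults];
[tile setName:@"tile"];                   // named so it can be addressed via a key path
myView.layer.filters = @[tile];

// Later, to "scroll" by shiftX points, update only the filter's transform:
NSAffineTransform *xform = [NSAffineTransform transform];
[xform translateXBy:shiftX yBy:0.0];
[myView.layer setValue:xform forKeyPath:@"filters.tile.inputTransform"];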

iTunes-like status bar

EDITED:
I am trying to show an iTunes-style information bar. This subject was covered in detail earlier, for example in "iTunes or Xcode style information box at top of window".
I only slightly modified the code from the above-referenced link to make it compile under a recent Xcode.
My code is below:
- (void)drawRect:(NSRect)dirtyRect
{
// Drawing code here.
static NSShadow *kDropShadow = nil;
static NSShadow *kInnerShadow = nil;
static NSGradient *kBackgroundGradient = nil;
static NSColor *kBorderColor = nil;
if (kDropShadow == nil) {
kDropShadow = [[NSShadow alloc] init];
[kDropShadow setShadowColor:[NSColor colorWithCalibratedWhite:.863 alpha:.75]];
[kDropShadow setShadowOffset:NSMakeSize(0.0, -1.0)];
[kDropShadow setShadowBlurRadius:1.0];
kInnerShadow = [[NSShadow alloc] init];
[kInnerShadow setShadowColor:[NSColor colorWithCalibratedWhite:0.0 alpha:0.52]];
[kInnerShadow setShadowOffset:NSMakeSize(0.0, -1.0)];
[kInnerShadow setShadowBlurRadius:4.0];
kBorderColor = [[NSColor colorWithCalibratedWhite:0.569 alpha:1.0] retain];
// iTunes style
// kBackgroundGradient = [[NSGradient alloc] initWithColorsAndLocations:[NSColor colorWithCalibratedRed:0.929 green:0.945 blue:0.882 alpha:1.0],0.0,[NSColor colorWithCalibratedRed:0.902 green:0.922 blue:0.835 alpha:1.0],0.5,[NSColor colorWithCalibratedRed:0.871 green:0.894 blue:0.78 alpha:1.0],0.5,[NSColor colorWithCalibratedRed:0.949 green:0.961 blue:0.878 alpha:1.0],1.0, nil];
// Xcode style
kBackgroundGradient = [[NSGradient alloc] initWithColorsAndLocations:[NSColor colorWithCalibratedRed:0.957 green:0.976 blue:1.0 alpha:1.0],0.0,[NSColor colorWithCalibratedRed:0.871 green:0.894 blue:0.918 alpha:1.0],0.5,[NSColor colorWithCalibratedRed:0.831 green:0.851 blue:0.867 alpha:1.0],0.5,[NSColor colorWithCalibratedRed:0.82 green:0.847 blue:0.89 alpha:1.0],1.0, nil];
}
NSRect bounds = [self bounds];
NSBezierPath *path = [NSBezierPath bezierPathWithRoundedRect:bounds xRadius:3.5 yRadius:3.5];
[NSGraphicsContext saveGraphicsState];
[kDropShadow set];
[path fill];
[NSGraphicsContext restoreGraphicsState];
[kBackgroundGradient drawInBezierPath:path angle:-90.0];
[kBorderColor setStroke];
[path stroke];
}
It is not working, however; I don't think the drawRect: method ever gets called. What am I missing? Please advise.
Thank you
I had an extra line at the end:
[path fill];
Removing it does the trick.

NSImage and retina display confusion

I want to add crosses to an NSImage; here's my code:
-(NSSize)convertPixelSizeToPointSize:(NSSize)px
{
CGFloat displayScale = [[NSScreen mainScreen] backingScaleFactor];
NSSize res;
res.width = px.width / displayScale;
res.height = px.height / displayScale;
return res;
}
-(void)awakeFromNib
{
CGFloat scale = [[NSScreen mainScreen] backingScaleFactor];
NSLog(@"backingScaleFactor : %f",scale);
NSImage *img = [[[NSImage alloc]initWithContentsOfFile:@"/Users/support/Pictures/cat.JPG"] autorelease];
NSBitmapImageRep *imgRep = [NSBitmapImageRep imageRepWithData:[img TIFFRepresentation]];
NSSize imgPixelSize = NSMakeSize([imgRep pixelsWide],[imgRep pixelsHigh]);
NSSize imgPointSize = [self convertPixelSizeToPointSize:imgPixelSize];
[img setSize:imgPointSize];
NSLog(@"imgPixelSize.width: %f , imgPixelSize.height:%f",imgPixelSize.width,imgPixelSize.height);
NSLog(@"imgPointSize.width: %f , imgPointSize.height:%f",imgPointSize.width,imgPointSize.height);
[img lockFocus];
NSAffineTransform *trans = [[[NSAffineTransform alloc] init] autorelease];
[trans scaleBy:1.0 / scale];
[trans set];
NSBezierPath *path = [NSBezierPath bezierPath];
[[NSColor redColor] setStroke];
[path moveToPoint:NSMakePoint(0.0, 0.0)];
[path lineToPoint:NSMakePoint(imgPixelSize.width, imgPixelSize.height)];
[path moveToPoint:NSMakePoint(0.0, imgPixelSize.height)];
[path lineToPoint:NSMakePoint(imgPixelSize.width, 0.0)];
[path setLineWidth:1];
[path stroke];
[img unlockFocus];
[imageView setImage:img];
imgRep = [NSBitmapImageRep imageRepWithData:[img TIFFRepresentation]];
NSData *imageData = [imgRep representationUsingType:NSJPEGFileType properties:nil];
[imageData writeToFile:@"/Users/support/Pictures/11-5.JPG" atomically:NO];
}
On a non-Retina display the result is as expected, and the console displayed:
2012-07-06 00:53:09.889 RetinaTest[8074:403] backingScaleFactor : 1.000000
2012-07-06 00:53:09.901 RetinaTest[8074:403] imgPixelSize.width: 515.000000 , imgPixelSize.height:600.000000
2012-07-06 00:53:09.902 RetinaTest[8074:403] imgPointSize.width: 515.000000 , imgPointSize.height:600.000000
But on a Retina display (I didn't use a real Retina display, but HiDPI mode) the result is wrong. Console:
2012-07-06 00:56:05.071 RetinaTest[8113:403] backingScaleFactor : 2.000000
2012-07-06 00:56:05.083 RetinaTest[8113:403] imgPixelSize.width: 515.000000 , imgPixelSize.height:600.000000
2012-07-06 00:56:05.084 RetinaTest[8113:403] imgPointSize.width: 257.500000 , imgPointSize.height:300.000000
If I omit these lines:
NSAffineTransform *trans = [[[NSAffineTransform alloc] init] autorelease];
[trans scaleBy:1.0 / scale];
[trans set];
However, if I change the scaleBy: factor to 1.0, the result is right:
NSAffineTransform *trans = [[[NSAffineTransform alloc] init] autorelease];
[trans scaleBy:1.0];
[trans set];
Console:
2012-07-06 01:01:03.420 RetinaTest[8126:403] backingScaleFactor : 2.000000
2012-07-06 01:01:03.431 RetinaTest[8126:403] imgPixelSize.width: 515.000000 , imgPixelSize.height:600.000000
2012-07-06 01:01:03.432 RetinaTest[8126:403] imgPointSize.width: 257.500000 , imgPointSize.height:300.000000
Could anyone give an explanation, please? Is HiDPI mode different from a Retina display?
I think I've found the answer. If an NSAffineTransform is set on the NSImage's context, it transforms the coordinate system into pixel dimensions, which are 2x the point dimensions. This happens even if the transform is the identity, like this:
NSAffineTransform *trans = [[[NSAffineTransform alloc] init] autorelease];
[trans set];
I don't know whether it's a bug or just the way it works, though.
Resetting the transformation is not a bug (referring to your own answer). In HiDPI mode the default transform is set up so that most high-level code works well as-is; resetting to the identity transform undoes this and aligns the coordinate system 1:1 with pixels.
One should rarely need to do this, though. Simplifying your code to take out the image reps and the transform change gives you this:
NSImage *img = [[[NSImage alloc]initWithContentsOfFile:@"/Users/support/Pictures/cat.JPG"] autorelease];
NSSize size = img.size;
[img lockFocus];   // balance the unlockFocus below
NSBezierPath *path = [NSBezierPath bezierPath];
[[NSColor redColor] setStroke];
[path moveToPoint:NSMakePoint(0.0, 0.0)];
[path lineToPoint:NSMakePoint(size.width, size.height)];
[path moveToPoint:NSMakePoint(0.0, size.height)];
[path lineToPoint:NSMakePoint(size.width, 0.0)];
[path setLineWidth:1];
[path stroke];
[img unlockFocus];
[imageView setImage:img];
...
This should work, except that the line would be 2 physical pixels wide (though of course the same width relative to the image as on a regular screen). Try this simpler special case to fix that:
[path setLineWidth:([[NSScreen mainScreen] backingScaleFactor] > 1) ? 0.5 : 1.0]; // half a point on HiDPI keeps the stroke one device pixel wide

NSImage transparency

I'm trying to set a custom drag icon for use in an NSTableView. Everything seems to work but I've run into a problem due to my inexperience with Quartz.
- (NSImage *)dragImageForRowsWithIndexes:(NSIndexSet *)dragRows tableColumns:(NSArray *)tableColumns event:(NSEvent *)dragEvent offset:(NSPointPointer)dragImageOffset
{
NSImage *dragImage = [NSImage imageNamed:@"icon.png"];
NSString *count = [NSString stringWithFormat:@"%d", [dragRows count]];
[dragImage lockFocus];
[dragImage compositeToPoint:NSZeroPoint fromRect:NSZeroRect operation:NSCompositeSourceOver fraction:0.5];
[count drawAtPoint:NSZeroPoint withAttributes:nil];
[dragImage unlockFocus];
return dragImage;
}
Essentially what I'm looking to do is render my icon.png file with 50% opacity along with an NSString which shows the number of rows currently being dragged. The issue I'm seeing is that my NSString renders with a low opacity, but not my icon.
The issue is that you’re drawing your icon on top of itself. What you probably want is something like this:
- (NSImage *)dragImageForRowsWithIndexes:(NSIndexSet *)dragRows tableColumns:(NSArray *)tableColumns event:(NSEvent *)dragEvent offset:(NSPointPointer)dragImageOffset
{
NSImage *icon = [NSImage imageNamed:@"icon.png"];
NSString *count = [NSString stringWithFormat:@"%lu", [dragRows count]];
NSImage *dragImage = [[[NSImage alloc] initWithSize:[icon size]] autorelease];
[dragImage lockFocus];
[icon drawAtPoint:NSZeroPoint fromRect:NSZeroRect operation:NSCompositeSourceOver fraction:0.5];
[count drawAtPoint:NSZeroPoint withAttributes:nil];
[dragImage unlockFocus];
return dragImage;
}
