Capturing an offscreen NSView to an NSImage - Cocoa

I'm trying to make a custom animation for replacing an NSView with another.
For that reason I need to get an image of the NSView before it appears on the screen.
The view may contain layers and NSOpenGLView subviews, so standard options like initWithFocusedViewRect: and bitmapImageRepForCachingDisplayInRect: do not work well in this case (they did not capture layer or OpenGL content correctly in my experiments).
I am looking for something like CGWindowListCreateImage that is able to "capture" an offscreen NSWindow, including layers and OpenGL content.
Any suggestions?

I created a category for this:
@implementation NSView (PecuniaAdditions)

/**
 * Returns an offscreen view containing all visual elements of this view for printing,
 * including CALayer content. Useful only for views that are layer-backed.
 */
- (NSView *)printViewForLayerBackedView
{
    NSRect bounds = self.bounds;
    int bitmapBytesPerRow = 4 * bounds.size.width;

    CGColorSpaceRef colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceSRGB);
    CGContextRef context = CGBitmapContextCreate(NULL,
                                                 bounds.size.width,
                                                 bounds.size.height,
                                                 8,
                                                 bitmapBytesPerRow,
                                                 colorSpace,
                                                 kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace);

    if (context == NULL) {
        NSLog(@"printViewForLayerBackedView: Failed to create context.");
        return nil;
    }

    [[self layer] renderInContext: context];
    CGImageRef img = CGBitmapContextCreateImage(context);
    NSImage *image = [[NSImage alloc] initWithCGImage: img size: bounds.size];

    NSImageView *canvas = [[NSImageView alloc] initWithFrame: bounds];
    [canvas setImage: image];

    CFRelease(img);
    CFRelease(context);
    return canvas;
}

@end
This code is primarily for printing NSViews which contain layered child views, but it might help you too.
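For reference, a minimal usage sketch (hypothetical; "sourceView" stands for any layer-backed view you want to snapshot, and the cast relies on the category returning an NSImageView, as it does above):

// Hypothetical usage of the category above; "sourceView" is a placeholder.
NSView *printView = [sourceView printViewForLayerBackedView];
NSImage *snapshot = ((NSImageView *)printView).image; // the rendered layer content as an NSImage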

Related

How do I add an NSButton to a CALayer?

I am trying to add an NSButton on a layer inside an IKImageBrowserCell object. I found this post helpful, but it doesn't get to the crux of the problem.
I've already tried this:
- (CALayer *)layerForType:(NSString *)type
{
    CGColorRef color;

    // Retrieve some useful rects.
    NSRect frame = [self frame];
    NSRect imageFrame = [self imageFrame];
    NSRect relativeImageFrame = NSMakeRect(imageFrame.origin.x - frame.origin.x,
                                           imageFrame.origin.y - frame.origin.y,
                                           imageFrame.size.width,
                                           imageFrame.size.height);

    /* foreground layer */
    if (type == IKImageBrowserCellForegroundLayer) {
        // No foreground layer on placeholders.
        if ([self cellState] != IKImageStateReady)
            return nil;

        // Create a foreground layer that will contain several child layers.
        CALayer *layer = [CALayer layer];
        layer.frame = CGRectMake(0, 0, frame.size.width, frame.size.height);

        // Add a checkbox to tell whether to upload this one or not.
        NSRect checkFrame = NSMakeRect((frame.size.width / 2) - 5, frame.size.height - 19, 18, 18);
        NSButton *uploadCheckBox = [[NSButton alloc] initWithFrame:checkFrame];
        [uploadCheckBox setButtonType:NSSwitchButton];
        [layer addSublayer:[uploadCheckBox layer]];
        return layer;
    }
    // (...)
    return nil;
}
But unfortunately the button doesn't appear on the layer. I think the placement of the button is fine, since it's based on example code from one of Apple's sample apps. I have a feeling that this line is wrong: [layer addSublayer:[uploadCheckBox layer]];, since I should be adding the entire NSButton, not just its backing layer. Any help greatly appreciated!
You cannot add an NSView inside a CALayer. You should create a new layer that serves your button's purpose and add it to your containing layer.
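One way to do that is to render the button's appearance into a bitmap and use it as the contents of a plain CALayer. A sketch (untested against IKImageBrowserView; "layer" and "checkFrame" are the variables from the question's code, and note the result is only a picture of a checkbox, so clicks must be hit-tested and handled by the browser view itself):

// Snapshot the button's appearance instead of asking for its (nil) backing layer.
NSButton *uploadCheckBox = [[NSButton alloc] initWithFrame:checkFrame];
[uploadCheckBox setButtonType:NSSwitchButton];
NSBitmapImageRep *rep = [uploadCheckBox bitmapImageRepForCachingDisplayInRect:uploadCheckBox.bounds];
[uploadCheckBox cacheDisplayInRect:uploadCheckBox.bounds toBitmapImageRep:rep];

// Host the snapshot in a new sublayer of the foreground layer.
CALayer *checkBoxLayer = [CALayer layer];
checkBoxLayer.frame = NSRectToCGRect(checkFrame);
checkBoxLayer.contents = (__bridge id)rep.CGImage;
[layer addSublayer:checkBoxLayer];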

How can I show an image in an NSView using a CGImageRef

I want to show an image in an NSView or in an NSImageView. In my header file I have:
@interface FVView : NSView
{
    NSImageView *imageView;
}
@end
Here is what I've been trying to do in my implementation file:
- (void)drawRect:(NSRect)dirtyRect
{
    [super drawRect:dirtyRect];

    // (Here I get an image called fitsImage........ then I do:)

    // Here I make the image
    CGImageRef cgImage = CGImageRetain([fitsImage CGImageScaledToSize:maxSize]);
    NSImage *imageR = [self imageFromCGImageRef:cgImage];
    [imageR lockFocus];

    // Here I get the view context
    CGContextRef ctx = (CGContextRef)[[NSGraphicsContext currentContext] graphicsPort];

    // Here I set the view dimensions
    CGRect renderRect = CGRectMake(0., 0., maxSize.width, maxSize.height);
    [self.layer renderInContext:ctx];
    [imageR unlockFocus];
    CGContextDrawImage(ctx, renderRect, cgImage);
    CGImageRelease(cgImage);
}
I don't get anything in the NSView window when I run the app. There are no errors at all; I just can't see what I'm doing wrong. My Xcode version is 5.1.1.
I'm trying to learn how to manipulate a CGImageRef and view it in a window or NSView.
Thank you.
I'm not quite sure what exactly your setup is. Drawing an image in a custom view is a separate thing from using an NSImageView. Also, a custom view that may (or may not) be layer-backed is different from a layer-hosting view.
You have a lot of the right elements, but they're all mixed up together. In no case do you have to lock focus on an NSImage. That's for drawing into an NSImage. Also, a custom view that subclasses from NSView doesn't have to call super in its -drawRect:. NSView doesn't draw anything.
To draw an image in a custom view, try:
- (void)drawRect:(NSRect)dirtyRect
{
    CGImageRef cgImage = /* ... */;
    NSSize maxSize = /* ... */;

    CGContextRef ctx = (CGContextRef)[[NSGraphicsContext currentContext] graphicsPort];
    CGRect renderRect = CGRectMake(0., 0., maxSize.width, maxSize.height);
    CGContextDrawImage(ctx, renderRect, cgImage);
    CGImageRelease(cgImage);
}
If you have an NSImageView, then you don't need a custom view or any drawing method or code. Just do the following at the point where you obtain the image or the information necessary to generate it:
NSImageView* imageView = /* ... */; // Often an outlet to a view in a NIB rather than a local variable.
CGImageRef cgImage = /* ... */;
NSImage* image = [[NSImage alloc] initWithCGImage:cgImage size:/* ... */];
imageView.image = image;
CGImageRelease(cgImage);
If you're working with a layer-hosting view, you just need to set the CGImage as the layer's content. Again, you do this whenever you obtain the image or the information necessary to generate it. It's not in -drawRect:.
CALayer* layer = /* ... */; // Perhaps someView.layer
CGImageRef cgImage = /* ... */;
layer.contents = (__bridge id)cgImage;
CGImageRelease(cgImage);

NSView image corruption when dragging from scaled view

I have a custom subclass of NSView that implements drag/drop for copying the image in the view to another application. The relevant code in my class looks like this:
#pragma mark -
#pragma mark Dragging Support

- (NSImage *)imageWithSubviews
{
    NSSize imgSize = self.bounds.size;
    NSBitmapImageRep *bir = [[NSBitmapImageRep alloc] initWithFocusedViewRect:[self frame]];
    [self cacheDisplayInRect:[self frame] toBitmapImageRep:bir];
    NSImage *image = [[NSImage alloc] initWithSize:imgSize];
    [image addRepresentation:bir];
    return image;
}

- (void)mouseDown:(NSEvent *)theEvent
{
    NSSize dragOffset = NSMakeSize(0.0, 0.0); // Not used in the method below, but required.
    NSPasteboard *pboard;
    NSImage *image = [self imageWithSubviews];

    pboard = [NSPasteboard pasteboardWithName:NSDragPboard];
    [pboard declareTypes:[NSArray arrayWithObject:NSTIFFPboardType]
                   owner:self];
    [pboard setData:[image TIFFRepresentation]
            forType:NSTIFFPboardType];

    [self dragImage:image
                 at:self.bounds.origin
             offset:dragOffset
              event:theEvent
         pasteboard:pboard
             source:self
          slideBack:YES];
    return;
}

#pragma mark -
#pragma mark NSDraggingSource Protocol

- (NSDragOperation)draggingSession:(NSDraggingSession *)session sourceOperationMaskForDraggingContext:(NSDraggingContext)context
{
    return NSDragOperationCopy;
}

- (BOOL)ignoreModifierKeysForDraggingSession:(NSDraggingSession *)session
{
    return YES;
}
This works as expected until I resize the main window. The main window only increases its height and width in matching increments, to maintain the proper aspect ratio for this view. The view properly displays its content on the screen when the window is resized.
The problem comes when I resize the window by more than about +25%. While the view still displays as expected, the image that is dragged off of it (into Pages, for example) is corrupt: it appears to have a portion of the image repeated on top of itself.
Here is what it looks like normally:
And here is what it looks like when dragged to Pages after resizing the main window to make it large (downsized to show here -- imagine it at 2-3x the size of the first image):
Note that I highlighted the corrupt area with a dotted rectangle.
A few more notes:
I have my bounds set like NSMakeRect(-200, -200, 400, 400) because it makes the symmetrical drawing a bit easier. When the window resizes, I recalculate the bounds to keep (0,0) in the center of the NSView. The NSView is always square.
Finally, the Apple docs state the following about the bitmapImageRep parameter of cacheDisplayInRect:toBitmapImageRep::
An NSBitmapImageRep object. For pixel-format compatibility, bitmapImageRep should have been obtained from bitmapImageRepForCachingDisplayInRect:.
I've tried using bitmapImageRepForCachingDisplayInRect:, but then all I see is the lower-left quadrant of the pyramid in the upper-right quadrant of the image. That makes me think that I need to add an offset for the capture of the bitmapImageRep, but I've been unable to determine how to do that.
Here's what the code for imageWithSubviews looks like when I try that:
- (NSImage *)imageWithSubviews
{
    NSSize imgSize = self.bounds.size;
    NSBitmapImageRep *bir = [self bitmapImageRepForCachingDisplayInRect:[self bounds]];
    [self cacheDisplayInRect:[self bounds] toBitmapImageRep:bir];
    NSImage *image = [[NSImage alloc] initWithSize:imgSize];
    [image addRepresentation:bir];
    return image;
}
And this is how the resulting image appears:
That is a view of the lower left quadrant being drawn in the upper-right corner.
What is causing the corruption when I drag from the NSView after enlarging the window? How do I fix that, and/or how do I change my implementation of the methods listed above to avoid the problem?
More info:
When I change the imageWithSubviews method to:
- (NSImage *)imageWithSubviews
{
    NSSize imgSize = self.bounds.size;
    NSBitmapImageRep *bir = [[NSBitmapImageRep alloc] initWithFocusedViewRect:[self frame]];
    [self cacheDisplayInRect:[self bounds] toBitmapImageRep:bir];
    NSImage *image = [[NSImage alloc] initWithSize:imgSize];
    [image addRepresentation:bir];
    return image;
}
I get a corrupted image without scaling, where the bottom-left quadrant of the image is drawn again on top of the top-right quadrant, like this:
What in the world am I doing wrong?
Solution:
While it does not address the core problem of drawing with NSBitmapImageRep, the following -imageWithSubviews prevents the corruption and outputs the correct image:
- (NSImage *)imageWithSubviews
{
    NSData *pdfData = [self dataWithPDFInsideRect:[self bounds]];
    NSImage *image = [[NSImage alloc] initWithData:pdfData];
    return image;
}
Based on the debugging above, we determined the problem was in -imageWithSubviews.
Instead of generating the image data for the view with -cacheDisplayInRect:toBitmapImageRep:, switching to -dataWithPDFInsideRect: fixed the issue.

How to get a window with a semi-transparent blurred background

I'd like to get a window that has a semi-transparent blurred background, just like what the Terminal can do. See this video, about 30 sec in, to see what I mean: http://www.youtube.com/watch?v=zo8KPRY6-Mk
See an image here: http://osxdaily.com/wp-content/uploads/2011/04/mac-os-x-lion-terminal.jpg
I've been googling for an hour and can't get anything to work. I believe I need to somehow create a Core Animation layer and add a background filter, but I've been unsuccessful so far... I just see the gray background of my window. Here's the code I've got so far:
Code:
// Get the content view -- everything but the titlebar.
NSView *theView = [[self window] contentView];
[theView setAlphaValue:0.5];

// Create a Core Animation layer with a background filter.
CALayer *backgroundLayer = [CALayer layer];
[theView setWantsLayer:YES];
[theView setLayer:backgroundLayer];

CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[blurFilter setDefaults];
[[theView layer] setBackgroundFilters:[NSArray arrayWithObject:blurFilter]];
Any tips or examples to do what I'm trying to do?
Thanks!
No need for layers and filters; NSWindow can do it itself:
[mywindow setOpaque:NO];
[mywindow setBackgroundColor: [NSColor colorWithCalibratedHue:0.0 saturation:0.0 brightness:0.2 alpha:0.5]];
Please do not use the following, as it will apply the alpha to your title bar as well (posted here just in case others need it):
[mywindow setOpaque:NO];
[mywindow setBackgroundColor: [NSColor blackColor]];
[mywindow setAlphaValue:0.5];
For the transparency, use Jiulong Zhao's suggestion.
For a blurred background, use this.
The call, on an NSWindow:
[self enableBlurForWindow:self];
The function:
- (void)enableBlurForWindow:(NSWindow *)window
{
    //!!!! Uses private API - copied from http://blog.steventroughtonsmith.com/2008/03/using-core-image-filters-onunder.html
    CGSConnection thisConnection;
    uint32_t compositingFilter;
    int compositingType = 1; // Under the window

    /* Make a new connection to CoreGraphics */
    CGSNewConnection(NULL, &thisConnection);

    /* Create a CoreImage filter and set it up */
    CGSNewCIFilterByName(thisConnection, (CFStringRef)@"CIGaussianBlur", &compositingFilter);
    NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithFloat:2.0] forKey:@"inputRadius"];
    CGSSetCIFilterValuesFromDictionary(thisConnection, compositingFilter, (__bridge CFDictionaryRef)options);

    /* Now apply the filter to the window */
    CGSAddWindowFilter(thisConnection, [window windowNumber], compositingFilter, compositingType);
}
NB: It uses a private API
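If you can require OS X 10.10 or later, a public alternative worth mentioning is NSVisualEffectView with behind-window blending. A minimal sketch (assuming "window" is your NSWindow):

// Blur whatever is behind the window using public API (10.10+).
NSVisualEffectView *effectView = [[NSVisualEffectView alloc] initWithFrame:window.contentView.bounds];
effectView.autoresizingMask = NSViewWidthSizable | NSViewHeightSizable;
effectView.blendingMode = NSVisualEffectBlendingModeBehindWindow;
effectView.state = NSVisualEffectStateActive;
[window.contentView addSubview:effectView positioned:NSWindowBelow relativeTo:nil];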
For those reading this in 2017 and using Swift 4: if you want to change your background alpha, you can add the following to your custom NSWindow class:
self.backgroundColor = NSColor(calibratedHue: 0, saturation: 0, brightness: 0, alpha: 0.2)
P.S. I don't need the blur effect yet; when I do, I'll update the answer.

Using a layer-backed NSView as NSDockTile contentView

Is there a way to use a layer-backed NSView as the contentView of an NSDockTile? I've tried all sorts of tricks, but all I get is a transparent area. I also tried going a different route and getting an image out of the CALayer to use with [NSApp setApplicationIconImage:], but no luck either. I think the issue here is creating an image representation of an offscreen image.
As usual, I got my answer soon after posting the question :) I'll post it here for future reference: I solved it by creating an NSImage out of the layer, as described in this Cocoa Is My Girlfriend blog post: http://www.cimgf.com/2009/02/03/record-your-core-animation-animation/
The only missing piece is that in order to have anything rendered, the view must be added to a window. Using the example code from the post, my solution is:
NSView *myView = ...
NSWindow *window = [[NSWindow alloc] initWithContentRect:NSMakeRect(-1000.0, -1000.0, 256.0, 256.0)
                                               styleMask:0
                                                 backing:NSBackingStoreNonretained
                                                   defer:NO];
[window setContentView:myView];

NSUInteger pixelsHigh = myView.bounds.size.height;
NSUInteger pixelsWide = myView.bounds.size.width;
NSUInteger bitmapBytesPerRow = pixelsWide * 4;

CGColorSpaceRef colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);
CGContextRef context = CGBitmapContextCreate(NULL, pixelsWide, pixelsHigh, 8, bitmapBytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast);
CGColorSpaceRelease(colorSpace);

[myView.layer.presentationLayer renderInContext:context];
CGImageRef image = CGBitmapContextCreateImage(context);

NSBitmapImageRep *bitmap = [[NSBitmapImageRep alloc] initWithCGImage:image];
CFRelease(image);
CGContextRelease(context); // Release the bitmap context to avoid leaking it.

NSImage *img = [[NSImage alloc] initWithData:[bitmap TIFFRepresentation]];
[NSApp setApplicationIconImage:img];
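Since the question also asked about NSDockTile directly: once the NSImage exists, it should also be possible to show it through the dock tile's contentView instead of replacing the application icon. A sketch (an untested assumption, reusing the "img" variable from above):

// Display the captured image in the dock tile; -display triggers a redraw.
NSDockTile *dockTile = [NSApp dockTile];
NSImageView *tileView = [[NSImageView alloc] initWithFrame:NSMakeRect(0, 0, dockTile.size.width, dockTile.size.height)];
tileView.image = img;
dockTile.contentView = tileView;
[dockTile display];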
