How do I add an NSButton to a CALayer? - cocoa

I am trying to add an NSButton on a layer inside an IKImageBrowserCell object. I found this post helpful, but it doesn't get to the crux of the problem.
I've already tried this:
- (CALayer *)layerForType:(NSString *)type
{
    CGColorRef color;
    // retrieve some useful rects
    NSRect frame = [self frame];
    NSRect imageFrame = [self imageFrame];
    NSRect relativeImageFrame = NSMakeRect(imageFrame.origin.x - frame.origin.x,
                                           imageFrame.origin.y - frame.origin.y,
                                           imageFrame.size.width,
                                           imageFrame.size.height);

    /* foreground layer */
    if (type == IKImageBrowserCellForegroundLayer) {
        // no foreground layer on placeholders
        if ([self cellState] != IKImageStateReady)
            return nil;

        // create a foreground layer that will contain several child layers
        CALayer *layer = [CALayer layer];
        layer.frame = CGRectMake(0, 0, frame.size.width, frame.size.height);

        // add a checkbox to tell whether to upload this one or not
        NSRect checkFrame = NSMakeRect((frame.size.width / 2) - 5, frame.size.height - 19, 18, 18);
        NSButton *uploadCheckBox = [[NSButton alloc] initWithFrame:checkFrame];
        [uploadCheckBox setButtonType:NSSwitchButton];
        [layer addSublayer:[uploadCheckBox layer]];
        return layer;
    }
    //(...)
    return nil;
}
But unfortunately the button doesn't appear on the layer. I think the placement of the button is fine, since it's based on example code from one of Apple's sample apps. I've got a feeling that this line is wrong:
[layer addSublayer:[uploadCheckBox layer]];, since I should be adding the entire NSButton, not just its bitmap representation (a layer). Any help greatly appreciated!

You cannot add an NSView inside a CALayer. You should create a new layer for your button and add it to your containing layer.
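A minimal sketch of that approach, assuming a hypothetical uploadEnabled flag on the cell and hypothetical checkbox artwork; the click itself has to be handled separately, for example in the IKImageBrowserView's mouseDown: (indexOfItemAtPoint: tells you which cell was hit):
// checkbox drawn as a plain layer; an NSButton view cannot live inside a CALayer
CALayer *checkLayer = [CALayer layer];
checkLayer.frame = CGRectMake((frame.size.width / 2) - 5, frame.size.height - 19, 18, 18);
// on 10.6 and later an NSImage can be assigned directly to a layer's contents
checkLayer.contents = [NSImage imageNamed:(self.uploadEnabled ? @"checkbox-on" : @"checkbox-off")];
[layer addSublayer:checkLayer];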

Related

subclass a programmatically created NSImage

I programmatically create an NSImageView with an NSImage like this:
NSImageView *IconBox = [[NSImageView alloc] initWithFrame:CGRectMake(0, 0, 350, 300)];
NSImage *capper = [[NSImage alloc] initWithData:[self.superview dataWithPDFInsideRect:[self.superview bounds]]];
IconBox.image = capper;
[self addSubview:IconBox];
This image takes up the main part of my window, and I want to make the window draggable from anywhere. I know I need to set mouseDownCanMoveWindow, but that alone doesn't work: it applies to the window, and since my image covers the main visible part of the window, you can never reach the window's background. So I need the image view itself to drag the window.
I read that I need to subclass the image view and override this method, but how can I subclass a programmatically created NSImageView?
I have already created the class files in my project, but how do I tell the newly created image view to use this custom class?
Figured it out:
Class snap = NSClassFromString(@"snapper");
object_setClass(IconBox, [snap class]);
This does the trick. Also, the default mouseDownCanMoveWindow approach didn't work either, but this snippet did:
In the NSImageView subclass:
- (void)mouseDragged:(NSEvent *)theEvent
{
    CGFloat deltaX = [theEvent deltaX];
    CGFloat deltaY = [theEvent deltaY];
    NSRect frame = [[self window] frame];
    frame.origin.x += deltaX;
    frame.origin.y -= deltaY;
    [[self window] setFrameOrigin:frame.origin];
}
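If you control the creation site anyway, a simpler sketch is to instantiate the subclass directly instead of retagging an existing object at runtime (assuming the subclass from above is called snapper):
// create the image view as an instance of the subclass from the start
snapper *IconBox = [[snapper alloc] initWithFrame:CGRectMake(0, 0, 350, 300)];
IconBox.image = capper;
[self addSubview:IconBox];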

Why is my CALayer being clipped by my NSView?

The Core Animation Programming Guide led me to believe that a sublayer could, by default, extend outside the bounds of a host view without being clipped. But that's not what's happening for me.
Here's how I initialize my layer-hosting custom view:
- (id)initWithFrame:(NSRect)frame
{
    self = [super initWithFrame:frame];
    if (self != nil) {
        mBaseImage = [[NSImage imageNamed:@"Button.png"] retain];
        mSliderImage = [[NSImage imageNamed:@"Track.png"] retain];

        mRootLayer = [CALayer layer];
        mRootLayer.masksToBounds = NO;
        [self setLayer:mRootLayer];
        [self setWantsLayer:YES];
        mRootLayer.contents = mBaseImage;

        mSliderLayer = [CALayer layer];
        [mRootLayer addSublayer:mSliderLayer];
        NSSize imSize = [mSliderImage size];
        NSRect sliderBounds = NSMakeRect(0.0f, 0.0f, imSize.width, imSize.height);
        mSliderLayer.bounds = sliderBounds;
        mSliderLayer.contents = mSliderImage;
        mSliderLayer.position = NSMakePoint(31.0f, 31.0f);
    }
    return self;
}
The host view, and the image used as its contents, are 62x62, while the sublayer image is 90x10. But the whole thing gets clipped to the 62x62 bounds. What am I missing?
The trick seems to be that the view must be contained in another view (maybe the content view of the window) that has a layer. (Is that documented?)
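For example, a sketch along those lines (window and SliderView are stand-in names not taken from the question) that makes the window's content view layer-backed before the custom view is added, so the sublayer has a layer-backed ancestor:
// without a layer-backed ancestor, the layer tree is rendered through
// normal view drawing and clipped to the hosting view's 62x62 bounds
[[window contentView] setWantsLayer:YES];
SliderView *slider = [[SliderView alloc] initWithFrame:NSMakeRect(20.0, 20.0, 62.0, 62.0)];
[[window contentView] addSubview:slider];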

Capturing an offline NSView to an NSImage

I'm trying to make a custom animation for replacing an NSView with another.
For that reason I need to get an image of the NSView before it appears on the screen.
The view may contain layers and NSOpenGLView subviews, and therefore standard options like initWithFocusedViewRect and bitmapImageRepForCachingDisplayInRect do not work well in this case (they did not capture layers or OpenGL content well in my experiments).
I am looking for something like CGWindowListCreateImage, that is able to "capture" an offline NSWindow including layers and OpenGL content.
Any suggestions?
I created a category for this:
@implementation NSView (PecuniaAdditions)

/**
 * Returns an offscreen view containing all visual elements of this view for printing,
 * including CALayer content. Useful only for views that are layer-backed.
 */
- (NSView *)printViewForLayerBackedView
{
    NSRect bounds = self.bounds;
    int bitmapBytesPerRow = 4 * bounds.size.width;

    CGColorSpaceRef colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceSRGB);
    CGContextRef context = CGBitmapContextCreate(NULL,
                                                 bounds.size.width,
                                                 bounds.size.height,
                                                 8,
                                                 bitmapBytesPerRow,
                                                 colorSpace,
                                                 kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace);

    if (context == NULL) {
        NSLog(@"getPrintViewForLayerBackedView: Failed to create context.");
        return nil;
    }

    [[self layer] renderInContext: context];
    CGImageRef img = CGBitmapContextCreateImage(context);
    NSImage *image = [[NSImage alloc] initWithCGImage: img size: bounds.size];

    NSImageView *canvas = [[NSImageView alloc] initWithFrame: bounds];
    [canvas setImage: image];

    CFRelease(img);
    CFRelease(context);
    return canvas;
}

@end
This code is primarily for printing NSViews which contain layered child views. Might help you too.
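A usage sketch (offscreenView is a hypothetical name): the category returns an NSImageView, so the snapshot image for the replacement animation can be pulled straight out of it:
// snapshot a layer-backed view that is not on screen yet
NSView *snapshotView = [offscreenView printViewForLayerBackedView];
NSImage *snapshot = [(NSImageView *)snapshotView image];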

Cocos2D: openGL-es becoming transparent over UIView

For reasons outside the capabilities of Cocos2D, I needed to make it transparent and show UIViews behind it.
This works fine except in one circumstance. When a sprite becomes more transparent, everything else drawn under it in Cocos2D also becomes transparent for some reason, which makes my UIViews show through.
The first image shows water tiles drawn in Cocos2D, with a UIView of the mountain and the sky showing behind them. The second image is the same scene, except that in Cocos2D there is a black sprite covering the whole screen at half opacity, which for some reason makes the UIView underneath show through. This is most visible in my example where the mountain shows through the water tiles.
My delegate's applicationDidFinishLaunching code is as follows:
- (void)applicationDidFinishLaunching:(UIApplication *)application
{
    // Init the window
    window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];

    // Try to use CADisplayLink director
    // if it fails (SDK < 3.1) use the default director
    if ( ! [CCDirector setDirectorType:kCCDirectorTypeDisplayLink] )
        [CCDirector setDirectorType:kCCDirectorTypeDefault];

    CCDirector *director = [CCDirector sharedDirector];

    // Init the View Controller
    viewController = [[RootViewController alloc] initWithNibName:nil bundle:nil];
    viewController.wantsFullScreenLayout = YES;

    EAGLView *glView = [EAGLView viewWithFrame:[window bounds]
                                   pixelFormat:kEAGLColorFormatRGBA8 // kEAGLColorFormatRGBA8, kEAGLColorFormatRGB565
                                   depthFormat:0                     // GL_DEPTH_COMPONENT16_OES
                        ];

    // attach the openglView to the director
    [director setOpenGLView:glView];
    glView.opaque = NO;

#if GAME_AUTOROTATION == kGameAutorotationUIViewController
    [director setDeviceOrientation:kCCDeviceOrientationPortrait];
#else
    [director setDeviceOrientation:kCCDeviceOrientationLandscapeLeft];
#endif

    [director setAnimationInterval:1.0/60];
    [director setDisplayFPS:YES];

    // color/gradient BG
    bgLayer = [ColorBGView createGradientWithName:@"darkCave"];
    CGRect screenRect = [[UIScreen mainScreen] bounds];
    CGFloat screenWidth = screenRect.size.width;
    CGFloat screenHeight = screenRect.size.height;
    bgLayer.frame = CGRectMake(0, 0, screenHeight, screenWidth);
    [viewController.view.layer insertSublayer:bgLayer atIndex:0];

    // add custom parallax view
    parallaxView = [[ParallaxView alloc] init];
    parallaxView.frame = CGRectMake(0, 0, 1024, 768);
    [viewController.view insertSubview:parallaxView atIndex:1];
    [parallaxView release];

    // make the OpenGLView a child of the view controller
    [viewController.view insertSubview:glView atIndex:2];
    viewController.view.opaque = YES;
    viewController.view.backgroundColor = [UIColor blackColor];

    // make the view controller a child of the main window
    window.rootViewController = viewController;

    foregroundLabelView = [[ForegroundLabelView alloc] init];
    foregroundLabelView.frame = CGRectMake(0, 0, 1024, 768);
    [viewController.view insertSubview:foregroundLabelView atIndex:3];
    foregroundLabelView.hidden = YES;
    [foregroundLabelView release];

    [window makeKeyAndVisible];

    // Default texture format for PNG/BMP/TIFF/JPEG/GIF images
    // It can be RGBA8888, RGBA4444, RGB5_A1, RGB565
    // You can change it at any time.
    [CCTexture2D setDefaultAlphaPixelFormat:kTexture2DPixelFormat_RGBA8888];

    [glView setMultipleTouchEnabled:YES];

    // Removes the startup flicker
    [self removeStartupFlicker];

    // Run the intro Scene
    [[CCDirector sharedDirector] runWithScene:[BlankScene scene]];
}
Please let me know if you need anything else. Thank you.
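One possible explanation, offered as an assumption rather than a confirmed diagnosis: with the EAGLView non-opaque and an RGBA8 color buffer, anything blended with GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA also lowers the destination alpha, so the GL layer itself turns translucent wherever the half-opacity sprite is drawn and the UIViews behind it show through. If the dimmer is a CCLayerColor (or a sprite with a non-premultiplied texture), a sketch of the usual workaround is to give it a premultiplied blend function so pixels that were already opaque keep a destination alpha of 1:
// hypothetical dimming layer: black at half opacity over the whole screen
CCLayerColor *dimLayer = [CCLayerColor layerWithColor:ccc4(0, 0, 0, 128)];
// premultiplied-alpha blending: dstAlpha = srcAlpha + dstAlpha * (1 - srcAlpha),
// so areas that were fully opaque stay fully opaque to the compositor
dimLayer.blendFunc = (ccBlendFunc){ GL_ONE, GL_ONE_MINUS_SRC_ALPHA };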

CGGradient isn't visible (not using interface builder) and UIButtons can't be triggered

I have created a view that contains a CGGradient:
// Bar ContextRef
CGRect bar = CGRectMake(0, screenHeight - staffAlignment, screenWidth, barWidth);
CGContextRef barContext = UIGraphicsGetCurrentContext();
CGContextSaveGState(barContext);
CGContextClipToRect(barContext, bar);
// Bar GradientRef
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGFloat components[16] = { 1.0,1.0,1.0,0.0,  0.0,0.0,0.0,1.0,  0.0,0.0,0.0,1.0,  1.0,1.0,1.0,0.0 };
CGFloat locations[4] = { 0.95, 0.85, 0.15, 0.05 };
size_t count = 4;
CGGradientRef gradientRef = CGGradientCreateWithColorComponents(colorSpace, components, locations, count);
// Draw Bar
CGPoint startPoint = { 0.0, 0.0 };
CGPoint endPoint = { screenWidth, 0.0 };
CGContextDrawLinearGradient(barContext, gradientRef, startPoint, endPoint, 0);
CGContextRestoreGState(barContext);
// release the gradient and color space; drawRect: runs repeatedly, so these would otherwise leak
CGGradientRelease(gradientRef);
CGColorSpaceRelease(colorSpace);
This code is called in the drawRect method of the UIView. I then use a UIViewController to access the created view.
- (void)loadView {
    MainPageView *mpView = [[MainPageView alloc] initWithFrame:[window bounds]];
    [self setView:mpView];
    [mpView release];
}
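As an aside (an assumption, not something stated in the question): window is not normally in scope inside a UIViewController, and an oversized or wrongly sized frame here fits the behaviour described in the answer below. A sketch that sizes the root view from the screen instead:
- (void)loadView {
    // size the root view from the screen rather than from a window reference
    MainPageView *mpView = [[MainPageView alloc] initWithFrame:[[UIScreen mainScreen] applicationFrame]];
    [self setView:mpView];
    [mpView release];
}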
and displayed on the screen through the appDelegate:
mpViewController = [[MainPageViewController alloc] init];
[window addSubview:[mpViewController view]];
[window makeKeyAndVisible];
The UIView contains other objects, such as UIButtons, that are visible, I assume because they are added as subviews. But I can't work out how to add the CGGradient as a subview. Does it need to be? Is there another reason the CGGradient is not visible?
I also don't get any functionality on the UIButtons. I guess that is because of where I have added them to the view. Do the buttons need to be added in the UIViewController or the appDelegate to have functionality? Sorry to ask what would seem like simple questions, but I am trying to do this without Interface Builder, and material on that is scarce. If anyone could point me in the right direction on both these problems, I would really appreciate it.
Thanks!
The functionality on the buttons was lost because the view's frame was too large; the buttons were still visible only because the background was clearColor.
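Since the asker mentions that material on Interface Builder-free setup is scarce, here is a minimal sketch of wiring a UIButton entirely in code, for example in the view controller's viewDidLoad (the playTapped: handler name is hypothetical); keeping the button's frame inside its superview's bounds is exactly what the fix above amounts to:
// purely programmatic button, no Interface Builder
UIButton *playButton = [UIButton buttonWithType:UIButtonTypeRoundedRect];
playButton.frame = CGRectMake(20.0f, 20.0f, 100.0f, 44.0f);
[playButton setTitle:@"Play" forState:UIControlStateNormal];
[playButton addTarget:self
               action:@selector(playTapped:)
     forControlEvents:UIControlEventTouchUpInside];
[self.view addSubview:playButton];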
