Why is my CALayer being clipped by my NSView? - macos

The Core Animation Programming Guide led me to believe that a sublayer could, by default, extend outside the bounds of a host view without being clipped. But that's not what's happening for me.
Here's how I initialize my layer-hosting custom view:
- (id)initWithFrame:(NSRect)frame
{
    self = [super initWithFrame:frame];
    if (self != nil) {
        mBaseImage = [[NSImage imageNamed:@"Button.png"] retain];
        mSliderImage = [[NSImage imageNamed:@"Track.png"] retain];

        mRootLayer = [CALayer layer];
        mRootLayer.masksToBounds = NO;
        [self setLayer:mRootLayer];
        [self setWantsLayer:YES];
        mRootLayer.contents = mBaseImage;

        mSliderLayer = [CALayer layer];
        [mRootLayer addSublayer:mSliderLayer];
        NSSize imSize = [mSliderImage size];
        NSRect sliderBounds = NSMakeRect(0.0f, 0.0f, imSize.width, imSize.height);
        mSliderLayer.bounds = sliderBounds;
        mSliderLayer.contents = mSliderImage;
        mSliderLayer.position = NSMakePoint(31.0f, 31.0f);
    }
    return self;
}
The host view, and the image used as its contents, are 62x62, while the sublayer image is 90x10. But the whole thing gets clipped to the 62x62 bounds. What am I missing?

The trick seems to be that the view must be contained in another view (for example, the window's content view) that is itself layer-backed. (Is that documented anywhere?)
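For what it's worth, a minimal sketch of that setup (the class and variable names here are hypothetical, not from the original post) is to make the superview layer-backed before adding the layer-hosting view:

// Make the superview layer-backed so sublayers of the hosted layer
// can render outside the hosting view's frame without being clipped.
[[window contentView] setWantsLayer:YES];
MyLayerHostingView *hostView = [[MyLayerHostingView alloc] initWithFrame:NSMakeRect(100.0, 100.0, 62.0, 62.0)];
[[window contentView] addSubview:hostView];
// With a layer-backed superview, the 90x10 slider sublayer can extend
// beyond the 62x62 host view.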

Related

How do I add an NSButton to a CALayer?

I am trying to add an NSButton to a layer inside an IKImageBrowserCell object. I found this post helpful, but it doesn't get to the crux of the problem.
I've already tried this:
- (CALayer *)layerForType:(NSString *)type
{
    CGColorRef color;
    // retrieve some useful rects
    NSRect frame = [self frame];
    NSRect imageFrame = [self imageFrame];
    NSRect relativeImageFrame = NSMakeRect(imageFrame.origin.x - frame.origin.x,
                                           imageFrame.origin.y - frame.origin.y,
                                           imageFrame.size.width,
                                           imageFrame.size.height);

    /* foreground layer */
    if (type == IKImageBrowserCellForegroundLayer) {
        // no foreground layer on placeholders
        if ([self cellState] != IKImageStateReady)
            return nil;

        // create a foreground layer that will contain several child layers
        CALayer *layer = [CALayer layer];
        layer.frame = CGRectMake(0, 0, frame.size.width, frame.size.height);

        // add a checkbox to tell whether to upload this one or not
        NSRect checkFrame = NSMakeRect((frame.size.width / 2) - 5, frame.size.height - 19, 18, 18);
        NSButton *uploadCheckBox = [[NSButton alloc] initWithFrame:checkFrame];
        [uploadCheckBox setButtonType:NSSwitchButton];
        [layer addSublayer:[uploadCheckBox layer]];
        return layer;
    }
    //(...)
    return nil;
}
But unfortunately the button doesn't appear on the layer. I think the placement of the button is fine, since it's based on example code from one of Apple's sample apps. I've got a feeling this line is wrong:
[layer addSublayer:[uploadCheckBox layer]];
since I should be adding the entire NSButton, not just its bitmap representation (a layer). Any help greatly appreciated!
You cannot add an NSView inside a CALayer. You should create a new layer for the button and add it to your containing layer.
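If it helps, a rough sketch of that idea (not taken from the original answer; the checkbox image and the click handling are assumptions you would have to supply yourself) is to render the checkbox as an image-backed CALayer and handle clicks in the view:

// Inside the IKImageBrowserCellForegroundLayer branch above:
CALayer *checkLayer = [CALayer layer];
checkLayer.frame = CGRectMake((frame.size.width / 2) - 9, frame.size.height - 19, 18, 18);
checkLayer.contents = checkedImage;   // hypothetical NSImage for the checkbox state
[layer addSublayer:checkLayer];
// Toggling the checked state then has to be done in the image browser view
// (e.g. in mouseDown:), since there is no real NSButton receiving events.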

UIImageView filling entire UICollectionView cell

I have a UICollectionView displaying a horizontal layout of images. Under each image, I'm displaying a label. In the storyboard, I've extended the height of the cell so that the label will be displayed underneath the image. I've also set the height of the UIImageView in the storyboard to be 20 points less than the cell. However, no matter what I do, the images take up the entire cell and the label is displayed on top of the image. Should I be setting the size of the image view elsewhere? I found this thread, which I thought would help since it's basically identical, but the solution did not help me.
Here is my code...
- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
    static NSString *CellIdentifier = @"Cell"; // string value identifier for cell reuse
    ImageViewCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:CellIdentifier forIndexPath:indexPath];
    cell.layer.borderWidth = 1.0;
    cell.layer.borderColor = [UIColor grayColor].CGColor;

    NSString *myPatternString = [self.imageNames objectAtIndex:indexPath.row];
    cell.imageView.image = [self.imagesArray objectAtIndex:indexPath.row];
    cell.imageView.contentMode = UIViewContentModeScaleAspectFill;
    cell.imageView.clipsToBounds = YES;

    CGSize labelSize = CGSizeMake(CellWidth, 20);
    UILabel *testLabel = [[UILabel alloc] initWithFrame:CGRectMake(cell.bounds.size.width/2, cell.bounds.size.height - labelSize.height, cell.bounds.size.width, labelSize.height)];
    testLabel.text = myPatternString;
    [cell.contentView addSubview:testLabel];
    return cell;
}
Don't know if this will work for you, but I do this with a subclass of UIView which contains two subviews -- a UIImageView for the image and a UILabel for the label. Here is the essence of that subclass below (don't be bothered by the rounding of bottoms; in some areas I need the bottoms to be rounded, in others I don't). I just add this subview to the contentView of the cell. Don't know if it helps you, but here it is. BTW, imageView is an instance variable here, but you might define it as a property.
- (id)initWithFrame:(CGRect)frame withImage:(UIImage *)img withLabel:(NSString *)lbl roundBottoms:(BOOL)roundBottoms
{
    self = [super initWithFrame:frame];
    if (self)
    {
        imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0.0f, 0.0f, frame.size.width, frame.size.height - 25.0f)];
        if (roundBottoms)
        {
            imageView.layer.cornerRadius = 12;
            imageView.layer.masksToBounds = YES;
        }
        else
        {
            CAShapeLayer *maskLayer = [CAShapeLayer layer];
            maskLayer.path = [UIBezierPath bezierPathWithRoundedRect:imageView.bounds
                                                   byRoundingCorners:UIRectCornerTopLeft | UIRectCornerTopRight
                                                         cornerRadii:(CGSize){12.0, 12.0}].CGPath;
            imageView.layer.mask = maskLayer;
        }
        [imageView setImage:img];
        imageView.contentMode = UIViewContentModeScaleAspectFill; // default
        [self addSubview:imageView];

        UILabel *label = [[UILabel alloc] initWithFrame:CGRectMake(0.0f, frame.size.height - 25.0f, frame.size.width, 25.0f)];
        label.text = lbl;
        labelText = lbl;
        label.textAlignment = NSTextAlignmentCenter;
        label.textColor = [UIColor blackColor];
        label.adjustsFontSizeToFitWidth = YES;
        label.minimumScaleFactor = 0.50f;
        label.backgroundColor = [UIColor clearColor];
        self.tag = imageItem; // 101
        [self addSubview:label];
    }
    return self;
}
I have another version of it I use where I calculate the height of the label based on the overall frame height that is passed in. In this one, the cell sizes are static, so the label heights can be as well.
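Purely as an illustration of that proportional variant (the 20% ratio here is an assumption, not the poster's actual figure), the frames could be derived like this:

// Derive the label height from the overall frame instead of hard-coding 25 points.
CGFloat labelHeight = roundf(frame.size.height * 0.2f);   // assumed proportion
imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0.0f, 0.0f, frame.size.width, frame.size.height - labelHeight)];
UILabel *label = [[UILabel alloc] initWithFrame:CGRectMake(0.0f, frame.size.height - labelHeight, frame.size.width, labelHeight)];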

NSView with gradient background AND shadow

I want my NSView to have both gradient background and shadow, so I subclassed NSView and overrode the - (void)awakeFromNib method like this:
[self setWantsLayer:YES]; // view's backing store is using a Core Animation layer
CAGradientLayer *backgroundGradient = [[CAGradientLayer alloc] init];
backgroundGradient.colors = @[
    (__bridge id)CGColorCreateGenericGray(0.85, 1.0),
    (__bridge id)CGColorCreateGenericGray(0.94, 1.0)
];
backgroundGradient.masksToBounds = NO;

CALayer *shadowLayer = [CALayer layer];
shadowLayer.shadowColor = CGColorCreateGenericGray(0.5, 1.0);
shadowLayer.shadowOffset = NSMakeSize(0.0, 0.0);
shadowLayer.shadowRadius = 10.0;
shadowLayer.shadowOpacity = 1.0;
shadowLayer.masksToBounds = NO;

[backgroundGradient addSublayer:shadowLayer];
[self setLayer:backgroundGradient];
However, only the gradient is shown. If I use [self setLayer:shadowLayer] instead, the shadow is shown (but, of course, no gradient).
How can I show both the gradient and the shadow? Thanks a lot!
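One possible direction (a sketch of a common approach, not an answer pulled from this thread): put the shadow properties on the gradient layer itself and give it an explicit shadowPath, since the empty shadowLayer above has no content to cast a shadow from. Something like:

[self setWantsLayer:YES];
CAGradientLayer *backgroundGradient = [CAGradientLayer layer];
backgroundGradient.frame = NSRectToCGRect(self.bounds);
backgroundGradient.colors = @[
    (__bridge id)CGColorCreateGenericGray(0.85, 1.0),
    (__bridge id)CGColorCreateGenericGray(0.94, 1.0)
];
// Shadow on the gradient layer itself; the shadowPath gives it a shape to cast.
backgroundGradient.shadowColor = CGColorCreateGenericGray(0.5, 1.0);
backgroundGradient.shadowOffset = CGSizeMake(0.0, -3.0);
backgroundGradient.shadowRadius = 10.0;
backgroundGradient.shadowOpacity = 1.0;
backgroundGradient.shadowPath = CGPathCreateWithRect(NSRectToCGRect(self.bounds), NULL);
backgroundGradient.masksToBounds = NO;
[self setLayer:backgroundGradient];
// The superview also needs to be layer-backed and must not clip,
// or the shadow will still be cut off at this view's edges.

Note that awakeFromNib runs before layout, so the frame and shadowPath would have to be updated when the view resizes.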

Cocos2D: OpenGL ES becoming transparent over UIView

For reasons outside the capabilities of Cocos2D, I needed to make it transparent and show UIViews behind it.
This works fine except under a certain circumstance. When a sprite's opacity is lowered, everything in Cocos2D drawn beneath it also becomes transparent for some reason, which makes my UIViews show through.
The first image shows water tiles drawn in Cocos2D, with a UIView of the mountain and the sky showing beneath them. The second image is the same scene, except that in Cocos2D a black sprite covers the whole screen at half opacity, which for some reason makes the UIView under it show through. This is most visible in the mountain showing through the water tiles.
My delegate's applicationDidFinishLaunching code is as follows:
- (void)applicationDidFinishLaunching:(UIApplication *)application
{
    // Init the window
    window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];

    // Try to use the CADisplayLink director;
    // if it fails (SDK < 3.1), use the default director
    if (! [CCDirector setDirectorType:kCCDirectorTypeDisplayLink])
        [CCDirector setDirectorType:kCCDirectorTypeDefault];

    CCDirector *director = [CCDirector sharedDirector];

    // Init the view controller
    viewController = [[RootViewController alloc] initWithNibName:nil bundle:nil];
    viewController.wantsFullScreenLayout = YES;

    EAGLView *glView = [EAGLView viewWithFrame:[window bounds]
                                   pixelFormat:kEAGLColorFormatRGBA8 // kEAGLColorFormatRGBA8, kEAGLColorFormatRGB565
                                   depthFormat:0                     // GL_DEPTH_COMPONENT16_OES
                       ];

    // attach the OpenGL view to the director
    [director setOpenGLView:glView];
    glView.opaque = NO;

#if GAME_AUTOROTATION == kGameAutorotationUIViewController
    [director setDeviceOrientation:kCCDeviceOrientationPortrait];
#else
    [director setDeviceOrientation:kCCDeviceOrientationLandscapeLeft];
#endif

    [director setAnimationInterval:1.0/60];
    [director setDisplayFPS:YES];

    // color/gradient background
    bgLayer = [ColorBGView createGradientWithName:@"darkCave"];
    CGRect screenRect = [[UIScreen mainScreen] bounds];
    CGFloat screenWidth = screenRect.size.width;
    CGFloat screenHeight = screenRect.size.height;
    bgLayer.frame = CGRectMake(0, 0, screenHeight, screenWidth);
    [viewController.view.layer insertSublayer:bgLayer atIndex:0];

    // add custom parallax view
    parallaxView = [[ParallaxView alloc] init];
    parallaxView.frame = CGRectMake(0, 0, 1024, 768);
    [viewController.view insertSubview:parallaxView atIndex:1];
    [parallaxView release];

    // make the OpenGL view a child of the view controller
    [viewController.view insertSubview:glView atIndex:2];
    viewController.view.opaque = YES;
    viewController.view.backgroundColor = [UIColor blackColor];

    // make the view controller a child of the main window
    window.rootViewController = viewController;

    foregroundLabelView = [[ForegroundLabelView alloc] init];
    foregroundLabelView.frame = CGRectMake(0, 0, 1024, 768);
    [viewController.view insertSubview:foregroundLabelView atIndex:3];
    foregroundLabelView.hidden = YES;
    [foregroundLabelView release];

    [window makeKeyAndVisible];

    // Default texture format for PNG/BMP/TIFF/JPEG/GIF images.
    // It can be RGBA8888, RGBA4444, RGB5_A1, RGB565.
    // You can change it at any time.
    [CCTexture2D setDefaultAlphaPixelFormat:kTexture2DPixelFormat_RGBA8888];

    [glView setMultipleTouchEnabled:YES];

    // Removes the startup flicker
    [self removeStartupFlicker];

    // Run the intro scene
    [[CCDirector sharedDirector] runWithScene:[BlankScene scene]];
}
Please let me know if you need anything else. Thank you.

Posting images from camera roll to Facebook in Xcode 4.5

I've got some functionality set up, but I'm lost on where to go from here. I can probably figure out what to do with Facebook once I know how to actually use the images. I tried saving the image into an NSDictionary and then redirecting to a different view controller, but it won't let me redirect from within the imagePickerController method. So does anybody have any idea how to use the image selected from the camera roll?
My next idea was to save it in the NSDictionary and then just have a statement checking to see if the NSDictionary value changed, but that's not an efficient way of doing it at all.
EDIT: updated below to include the answer provided, but nothing happens; no image displays or anything. What am I missing?
- (void)useCameraRoll
{
    if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeSavedPhotosAlbum])
    {
        UIImagePickerController *imagePicker = [[UIImagePickerController alloc] init];
        imagePicker.delegate = self;
        imagePicker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
        imagePicker.mediaTypes = [NSArray arrayWithObjects:(NSString *)kUTTypeImage, nil];
        imagePicker.allowsEditing = NO;
        [self presentModalViewController:imagePicker animated:YES];
        [imagePicker release];
        newMedia = NO;
    }
}

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    [self dismissModalViewControllerAnimated:YES];

    if ([mediaType isEqualToString:(NSString *)kUTTypeImage]) {
        UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
        NSLog(@"image: %@", image);
        displayPhoto.image = image;
        [displayPhoto setFrame:CGRectMake(0, 0, 320, 480)];
        [[self view] addSubview:displayPhoto];
    }
    else if ([mediaType isEqualToString:(NSString *)kUTTypeMovie])
    {
        // Code here to support video if enabled
    }
}
Assuming you have called this from a view controller with a button you created, why not just add a UIImageView to your view controller? Call it myPhotoImageView or something like that, and then in the didFinishPickingMediaWithInfo method, add the following line right after
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
myPhotoImageView.image = image;
To size the image view so that the aspect ratio looks nice, do this:
CGSize size = image.size;
CGRect photoFrame;
if (size.width > size.height)
{
    // Landscape
    CGFloat scaleFactor = 320 / size.width;
    photoFrame = CGRectMake(0, 0, 320, size.height * scaleFactor);
}
else
{
    // Portrait
    CGFloat scaleFactor = 320 / size.height;
    photoFrame = CGRectMake(0, 0, size.width * scaleFactor, 320);
}
myPhotoImageView.frame = photoFrame;
