CGImage gets stretched in CALayer

I have an NSMutableArray of CGImage objects. All the images have different dimensions (80x20 px, 200x200 px, 10x10 px, etc.).
I'm trying to animate these in a CALayer that is 256x256 pixels. The animation works, but my CGImages get stretched to the dimensions of the CALayer. How can I prevent my CGImages from being scaled?
Thank you.

To answer my own question: I had to set my layer's contentsGravity property to kCAGravityCenter, and that did the trick. The default value of contentsGravity is kCAGravityResize, which stretches the contents to fill the layer's bounds.
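A minimal sketch of that fix in Swift; the layer setup and someCGImage are illustrative, not from the original question:
let layer = CALayer()
layer.frame = CGRect(x: 0, y: 0, width: 256, height: 256)
// .center (kCAGravityCenter in the answer above) keeps each image at its
// native size; the default, .resize, stretches contents to fill the bounds.
layer.contentsGravity = .center
layer.contents = someCGImage   // one of the differently sized CGImages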

Related

Obtaining RGB value of a pixel or average RGB value of pixels in UIImageView behind a UILabel in Swift

I have a UILabel that is placed on top of a UIImageView. The UIImageView can change to display a variety of images. Depending on the image displayed, the UILabel's text color must change dynamically: when a light-colored image is displayed, a dark text color should be used, and when a dark image is displayed, a light text color should be used.
In Swift, what is the best, simplest, most efficient method to extract the RGB value of a single pixel, or the average RGB value of a group of pixels, directly behind the UILabel that sits above the UIImageView?
Or even better, in Swift, is there a UILabel method that changes the text color dynamically based on the background it is positioned above?
Thank you.
Honestly, I would not even grab RGB; you should know what image you are putting into the UIImageView, so plan the label color based on that.
If you must sample RGB, then do something like this:
UIGraphicsBeginImageContext(CGSize(width: 1, height: 1))
let context = UIGraphicsGetCurrentContext()!
// Translate in the opposite direction so that when the layer draws into
// our canvas, the pixel at (x, y) lands at the first spot in readable memory.
context.translateBy(x: -x, y: -y)
// Ask the CALayer to draw into our one-pixel "canvas".
self.layer.render(in: context)
// The context's backing store now holds that pixel's 32-bit color value.
let colorPoint = context.data?.assumingMemoryBound(to: UInt32.self)
UIGraphicsEndImageContext()
Where x and y are the coordinates of the point you want to sample, and self is your UIImageView.
Edit: @Mr.T posted a link showing how this is done as well. If you need the average, grab as many pixels as needed by changing UIGraphicsBeginImageContext(CGSize(width: 1, height: 1)) to UIGraphicsBeginImageContext(CGSize(width: width, height: height)) and compute the average from that data.
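A sketch of that averaging variant, assuming the context's backing store is 32 bits per pixel with red, green, and blue in the first three bytes of each pixel; the helper name averageColor and its parameters are illustrative:
import UIKit

// Hypothetical helper: averages the RGB of every pixel in `region` of `view`.
func averageColor(of view: UIView, in region: CGRect) -> UIColor? {
    let width = Int(region.width), height = Int(region.height)
    guard width > 0, height > 0 else { return nil }

    UIGraphicsBeginImageContext(region.size)
    defer { UIGraphicsEndImageContext() }
    guard let context = UIGraphicsGetCurrentContext() else { return nil }

    // Shift so the region's origin lands at the context's origin, then render.
    context.translateBy(x: -region.minX, y: -region.minY)
    view.layer.render(in: context)

    guard let data = context.data?.assumingMemoryBound(to: UInt8.self) else { return nil }
    var r = 0, g = 0, b = 0
    for row in 0..<height {
        for col in 0..<width {
            // Step by bytesPerRow, which may include row padding.
            let offset = row * context.bytesPerRow + col * 4
            r += Int(data[offset])
            g += Int(data[offset + 1])
            b += Int(data[offset + 2])
        }
    }
    let divisor = CGFloat(width * height) * 255
    return UIColor(red: CGFloat(r) / divisor, green: CGFloat(g) / divisor,
                   blue: CGFloat(b) / divisor, alpha: 1)
}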

How to keep sprite border the same size when scaling sprite in Unity?

I've created an image in Photoshop to be used as a sprite in Unity and everything works fine while the sprite is scaled at X: 1; Y: 1.
The problem starts when I scale the image up as the border of the image stretches out with the rest of the image. Is there any way to scale an image from its centre or to ignore the image's border when it's scaled?
Here's the example now that I am able to show it:
The rectangle on top is the original image without being scaled up or down and the rectangle on the bottom is scaled at X:5, Y:0.5 but the borders are stretched.
I think the borders are stretched because they're part of the image, and when it's scaled, the whole image (including the borders) is stretched along with it.
Is there any way to stretch the sprite image but by ignoring the borders?
Are you trying to scale the image and keep the original ratio? If so, here are the steps: [the step-by-step screenshots from the original answer are not reproduced here]
Hope this helps. Please let me know if you were trying to do something else.
You can use a sliced sprite. The center of the image is scaled to fit the control rectangle but the borders maintain their sizes regardless of the scaling. Check out the Unity doc here: Unity - Manual: Image

CALayer blurry for odd numbered sizes

I am setting the -contents property of a CALayer to various CGImageRefs, and I noticed that when the image I set the contents to has an odd-numbered size, the image becomes blurry.
My layer's frame itself is pixel-aligned, and I have it set not to resize its contents. Is there any way to force the layer to draw its contents on pixel boundaries?
I had to set the anchorPoint to (0, 0), change my positioning code to account for half of the layer's size, and make the new frames land on integral coordinates.
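A minimal sketch of that fix; imageLayer and desiredCenter are illustrative names, not from the original post:
// With anchorPoint at (0, 0), position refers to the layer's top-left corner.
imageLayer.anchorPoint = CGPoint(x: 0, y: 0)
// Offset by half the layer's size, rounding so the frame lands on whole pixels.
let size = imageLayer.bounds.size
imageLayer.position = CGPoint(x: (desiredCenter.x - size.width / 2).rounded(),
                              y: (desiredCenter.y - size.height / 2).rounded())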

Cocoa: How can I draw a scaled-up version of NSBitmapImageRep?

I want to use NSBitmapImageRep to construct a 64x64-pixel sprite in code, and then draw it to the screen, blown up very large. The result would be very large "pixels" on the screen. Think old-school Mario Bros. or Minecraft. How can I do this?
Edit: I want to draw to this off-screen bitmap and then render it later on a CALayer.
Open a new image context with CGBitmapContextCreate and use
void CGContextSetInterpolationQuality (
CGContextRef c,
CGInterpolationQuality quality
);
to set the interpolation quality to kCGInterpolationNone.
Then draw the image into the context.
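A minimal sketch in Swift, assuming smallImage is the 64x64 CGImage built from the NSBitmapImageRep; the name and the 8x factor are illustrative:
let scale = 8  // blow the 64x64 sprite up to 512x512
let width = smallImage.width * scale
let height = smallImage.height * scale

guard let context = CGContext(data: nil, width: width, height: height,
                              bitsPerComponent: 8, bytesPerRow: 0,
                              space: CGColorSpaceCreateDeviceRGB(),
                              bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
else { fatalError("could not create bitmap context") }

// Nearest-neighbor sampling keeps the hard "retro pixel" edges.
context.interpolationQuality = .none
context.draw(smallImage, in: CGRect(x: 0, y: 0, width: width, height: height))

// The result can then be set as a CALayer's contents for on-screen rendering.
let bigImage = context.makeImage()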

Zooming in on CIImage or NSImage?

I am taking photos from a webcam in my Cocoa application and I would like to zoom in on the centre of the image I receive. I start by receiving a CIImage and eventually save an NSImage.
How would I go about zooming in on either of these objects?
“Zoom” means a couple of things. You'll need at least to crop the image, and you may want to scale up. Or you may want to reserve scaling for display only.
CIImage
To crop it, use a CICrop filter.
To scale it, use either a CILanczosScaleTransform filter or a CIAffineTransform filter.
To crop and scale it, use both filters. Simply pass the output of the crop as the input of the scale.
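A minimal sketch of that filter chain, assuming input is the CIImage from the webcam; the crop rectangle and the 2x scale are illustrative:
let crop = CIFilter(name: "CICrop")!
crop.setValue(input, forKey: kCIInputImageKey)
crop.setValue(CIVector(cgRect: CGRect(x: 160, y: 120, width: 320, height: 240)),
              forKey: "inputRectangle")

let lanczos = CIFilter(name: "CILanczosScaleTransform")!
lanczos.setValue(crop.outputImage, forKey: kCIInputImageKey)  // crop feeds scale
lanczos.setValue(2.0, forKey: kCIInputScaleKey)               // 2x zoom
lanczos.setValue(1.0, forKey: kCIInputAspectRatioKey)
let zoomed = lanczos.outputImage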
NSImage
Crop and scale are the same operation here. You'll need to create a new, empty NSImage of the desired size (whether it's the size of the source crop if you won't zoom or an increased size if you will zoom), lock focus on it, draw the crop rectangle from the source image into the bounding rectangle of the destination image, and unlock focus.
If the destination rectangle is not the same size as the source (crop) rectangle, it will scale; if they are the same size, it will simply copy or composite pixel-to-pixel.
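A minimal sketch of that lock-focus approach; source, cropRect, and the 2x destination size are illustrative:
// Destination at twice the crop's size; drawing a smaller source rect
// into a larger destination rect performs the scale.
let destSize = NSSize(width: cropRect.width * 2, height: cropRect.height * 2)
let zoomed = NSImage(size: destSize)

zoomed.lockFocus()
source.draw(in: NSRect(origin: .zero, size: destSize),
            from: cropRect,
            operation: .copy,
            fraction: 1.0)
zoomed.unlockFocus()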
