I am setting the contents property of a CALayer to various CGImageRefs, and I noticed that when the image I set as the contents has an odd-numbered size (width or height), the image becomes blurry.
My layer's frame itself is pixel-aligned, and I have the layer set not to resize its contents. Is there any way to force the layer to draw its contents on whole pixels?
I had to set the anchorPoint to (0, 0) and change my positioning code to offset the layer by half of its size, so that the new frames have integral origins.
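The approach above can be sketched as a small helper (the name `pixelAligned` is mine, not from any API): compute the origin from the desired center point, then floor it so the frame always lands on whole pixels, even when the image size is odd.

```swift
import Foundation

// Hedged sketch: snap a rect, positioned by its center, to integral pixel
// boundaries. With an odd-numbered size, center - size/2 falls on a half
// pixel; flooring the origin keeps the frame pixel-aligned.
func pixelAligned(center: CGPoint, size: CGSize) -> CGRect {
    let origin = CGPoint(x: (center.x - size.width / 2).rounded(.down),
                         y: (center.y - size.height / 2).rounded(.down))
    return CGRect(origin: origin, size: size)
}

// Applied to a layer (illustrative, not runnable here):
// layer.anchorPoint = .zero
// layer.frame = pixelAligned(center: desiredCenter, size: imageSize)
```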
This is my setup. I have two layers with transparency (I don't know if transparency matters here). The layers are the same size, 5x7 inches, and each has its own image (say I draw a square on one and a circle on the other).
I want to resize ONLY the square.
The problem is that when I scale the square, either both the circle AND the square scale equally while keeping their layer size, or BOTH layers are resized and are no longer 5x7 inches. I've tried Tools > Transform > Scale and Image > Resize canvas/image, but I can't find the tool to resize just ONE of the images.
Any ideas what I'm doing wrong?
Thanks
What you want is the Scale tool; it will resize only the active layer if it is in Scale: Layer mode (you seem to have it in Scale: Image mode)(*).
Otherwise, to clear things up:
Image > Canvas size changes the size of the canvas, but nothing is stretched/compressed, the layers retain their size or are extended with transparency or white.
Image > Scale image scales everything in the image (layers, channels, paths...)
(*) Also, if you apply a transform such as Scale to an item that has the chain-link enabled, the same transform will be applied to all other chain-linked items (other layers, but also paths).
I have a viewer with a perspective camera. I know the size of the viewer and the pixel ratio. I have several sprites in my scene that use the .sizeAttenuation property to never change size.
With all of this, I want to be able to set the scale of the sprite instances to, for example, be 20px x 20px. Is that possible? Is there a known conversion from pixels to sprite scale?
What I am experiencing now is that the sprites will change size depending on the viewer size. I wish to know how to resize them when the viewer changes so they are consistently the same size.
thanks!
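A sketch of one possible conversion, assuming `material.sizeAttenuation === false` and a PerspectiveCamera: in that mode three.js cancels the perspective divide in the sprite shader, so the rendered height in pixels works out to `scale.y * viewerHeight / (2 * tan(fov / 2))`. Verify against your three.js version; the function name is mine.

```javascript
// Hedged sketch: convert a desired on-screen size in pixels to a Sprite scale,
// for sizeAttenuation === false with a perspective camera.
function pixelsToSpriteScale(pixels, viewerHeightPx, fovDegrees) {
  const halfFovRad = (fovDegrees * Math.PI / 180) / 2;
  return (2 * pixels * Math.tan(halfFovRad)) / viewerHeightPx;
}

// Usage (three.js objects assumed, not constructed here):
// const s = pixelsToSpriteScale(20, renderer.domElement.clientHeight, camera.fov);
// sprite.scale.set(s, s, 1);
```

Because the scale depends on the viewer height, recompute it in your resize handler; that is what keeps the sprites a consistent 20px when the viewer changes.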
I know that frame is a view's frame relative to its parent, and bounds is the view's internal bounds (with origin always (0, 0), except in the case of scroll views?).
However, I'm unclear under what conditions the frame and bounds sizes may differ, if at all. Is there official Apple documentation stating whether frame.size always equals bounds.size, or whether the two can differ?
You can do whatever you want with the bounds. Imagine that your view is a painting, which you can only view through a camera. Moving the origin will change the portion of the painting which you can currently see by moving the camera around. Shrinking the size zooms in, so that less of the painting is visible, but it appears larger. Expanding the size zooms out, so that more of the painting is visible, but it appears smaller.
From the documentation for bounds:
By default, the origin of the returned rectangle is (0, 0) and its size matches the size of the receiver’s frame rectangle (measured in points)....
If you explicitly change the origin or size of the bounds rectangle, this method does not return the default rectangle and instead returns the rectangle you set. If you add a rotation factor to the view, however, that factor is also reflected in the returned bounds rectangle.
And from the documentation for setBounds::
The bounds rectangle determines the origin and scale of the receiver’s coordinate system within its frame rectangle....
After calling this method, NSView creates an internal transform (or appends these changes to an existing internal transform) to convert from frame coordinates to bounds coordinates in your view. As long as the width-to-height ratio of the two coordinate systems remains the same, your content appears normal. If the ratios differ, your content may appear skewed.
I have an NSMutableArray of CGImage objects. All the images have different dimensions (80x20 px, 200x200 px, 10x10 px, etc.).
I'm trying to animate these in a CALayer that is 256x256 pixels. The animation works, but my CGImages get stretched to the layer's dimensions. How can I prevent my CGImages from being scaled?
thank you.
To answer my own question: I had to set my layer's contentsGravity property to kCAGravityCenter, and that did the trick. The default value of contentsGravity is kCAGravityResize, which stretches the contents to fill the layer's bounds.
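In code, the fix amounts to one property assignment before (or after) setting the contents. A minimal sketch in Swift, where `.center` is the modern spelling of kCAGravityCenter (no test harness here, since CALayer needs the platform frameworks):

```swift
import QuartzCore

// Show an image at its natural size, centered in the 256x256 layer, instead
// of letting the default resize gravity stretch it to the layer's bounds.
func showCentered(_ image: CGImage, in layer: CALayer) {
    layer.contentsGravity = .center  // no scaling; image keeps its own dimensions
    layer.contents = image
}
```

On a Retina display you may also want to set `layer.contentsScale` so "natural size" is interpreted at the right pixel density.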
I am taking photos from a webcam in my Cocoa application and I would like to zoom in on the centre of the image I receive. I start by receiving a CIImage and eventually save an NSImage.
How would I go about zooming in on either of these objects?
“Zoom” means a couple of things. You'll need at least to crop the image, and you may want to scale up. Or you may want to reserve scaling for display only.
CIImage
To crop it, use a CICrop filter.
To scale it, use either a CILanczosScaleTransform filter or a CIAffineTransform filter.
To crop and scale it, use both filters. Simply pass the output of the crop as the input of the scale.
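A sketch of that crop-then-scale chain, using CIImage's convenience API (`cropped(to:)` wraps CICrop, and `transformed(by:)` applies an affine scale; substitute CILanczosScaleTransform if you want higher-quality scaling). The function name and the 2.0 default factor are mine:

```swift
import CoreImage

// Zoom in on the centre of a CIImage: crop a centred rect that is 1/factor
// of the extent, then scale the crop back up by the same factor.
func zoomToCenter(_ image: CIImage, factor: CGFloat = 2.0) -> CIImage {
    let extent = image.extent
    let cropSize = CGSize(width: extent.width / factor,
                          height: extent.height / factor)
    let cropRect = CGRect(x: extent.midX - cropSize.width / 2,
                          y: extent.midY - cropSize.height / 2,
                          width: cropSize.width,
                          height: cropSize.height)
    // Output of the crop feeds the scale, as described above.
    let cropped = image.cropped(to: cropRect)
    return cropped.transformed(by: CGAffineTransform(scaleX: factor, y: factor))
}
```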
NSImage
Crop and scale are the same operation here. You'll need to create a new, empty NSImage of the desired size (the size of the source crop if you won't zoom, or a larger size if you will), lock focus on it, draw the crop rectangle from the source image into the bounding rectangle of the destination image, and then unlock focus.
If the destination rectangle is not the same size as the source (crop) rectangle, it will scale; if they are the same size, it will simply copy or composite pixel-to-pixel.
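The lock-focus/draw/unlock sequence described above can be sketched as follows (names are mine; needs AppKit, so it is not testable off-platform):

```swift
import AppKit

// Crop (and optionally zoom) by drawing: draw `cropRect` from the source
// image into the full bounds of a new image of `destSize`.
func cropped(_ source: NSImage, to cropRect: NSRect, destSize: NSSize) -> NSImage {
    let dest = NSImage(size: destSize)
    dest.lockFocus()
    source.draw(in: NSRect(origin: .zero, size: destSize),
                from: cropRect,
                operation: .copy,
                fraction: 1.0)
    dest.unlockFocus()
    return dest
}
// If destSize equals cropRect.size this is a straight copy;
// a larger destSize scales (zooms) the cropped region.
```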