Auto Layout - Resizing subviews proportionally to their superview - UIImageView

I've got a view with two vertical constraints, a 1:1 aspect ratio constraint and a center alignment constraint, so it gets automatically resized depending on the height of the screen.
Now, this view is filled with a bunch of smaller subviews in the form of UIImageViews. I'd like these image views to resize automatically in proportion to their superview, but I can't figure out how to do that. I've tried countless combinations of constraints inside the superview, but all of them ended in a mess.
Any ideas?

Here is an example with 1 UIImageView:
First, set the position of the imageView in its superview (for simplicity I chose the upper left corner):
NSDictionary *views = NSDictionaryOfVariableBindings(imageView);
[superview addConstraints:[NSLayoutConstraint constraintsWithVisualFormat:@"H:|[imageView]"
                                                                  options:0
                                                                  metrics:nil
                                                                    views:views]];
[superview addConstraints:[NSLayoutConstraint constraintsWithVisualFormat:@"V:|[imageView]"
                                                                  options:0
                                                                  metrics:nil
                                                                    views:views]];
After that, set the width and height of the imageView relative to its superview:
[superview addConstraint:[NSLayoutConstraint constraintWithItem:imageView
                                                       attribute:NSLayoutAttributeHeight
                                                       relatedBy:NSLayoutRelationEqual
                                                          toItem:superview
                                                       attribute:NSLayoutAttributeHeight
                                                      multiplier:0.4 // between 0.0 and 1.0
                                                        constant:0]];
[superview addConstraint:[NSLayoutConstraint constraintWithItem:imageView
                                                       attribute:NSLayoutAttributeWidth
                                                       relatedBy:NSLayoutRelationEqual
                                                          toItem:superview
                                                       attribute:NSLayoutAttributeWidth
                                                      multiplier:0.4 // between 0.0 and 1.0
                                                        constant:0]];
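If you have several image views, the same pattern can be applied in a loop. Here is a minimal sketch, assuming the image views are collected in an array called imageViews (a placeholder name) and that each one still needs its own position constraints; only the proportional sizing is shown, and the 0.4 multiplier is just an example value:
for (UIImageView *imageView in imageViews) {
    imageView.translatesAutoresizingMaskIntoConstraints = NO;
    [superview addSubview:imageView];
    // Width and height track the superview at 40% of its size.
    [superview addConstraint:[NSLayoutConstraint constraintWithItem:imageView
                                                           attribute:NSLayoutAttributeWidth
                                                           relatedBy:NSLayoutRelationEqual
                                                              toItem:superview
                                                           attribute:NSLayoutAttributeWidth
                                                          multiplier:0.4
                                                            constant:0]];
    [superview addConstraint:[NSLayoutConstraint constraintWithItem:imageView
                                                           attribute:NSLayoutAttributeHeight
                                                           relatedBy:NSLayoutRelationEqual
                                                              toItem:superview
                                                           attribute:NSLayoutAttributeHeight
                                                          multiplier:0.4
                                                            constant:0]];
}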

Related

How can CALayer image edges be prevented from stretching during resize?

I am setting the .contents of a CALayer to a CGImage, derived from drawing into an NSBitmapImageRep.
As far as I understand from the docs and WWDC videos, setting the layer's .contentsCenter to an NSRect like {{0.5, 0.5}, {0, 0}}, in combination with a .contentsGravity of kCAGravityResize should lead to Core Animation resizing the layer by stretching the middle pixel, the top and bottom horizontally, and the sides vertically.
This very nearly works, but not quite. The layer resizes more-or-less correctly, but if I draw lines at the edge of the bitmap, as I resize the window the lines can be seen to fluctuate in thickness very slightly. It's subtle enough to be barely a problem until the resizing gets down to around 1/4 of the original layer's size, below which point the lines can thin and disappear altogether. If I draw the bitmaps multiple times at different sizes, small differences in line thickness are very apparent.
I originally suspected a pixel-alignment issue, but it can't be that, because the thickness of the stationary left-hand edge (for example) will fluctuate as I resize the right-hand edge. It happens on both 1x and 2x screens.
Here's some test code. It's the updateLayer method from a layer-backed NSView subclass (I'm using the alternative non-DrawRect draw path):
- (void)updateLayer {
    id image = [self imageForCurrentScaleFactor]; // CGImage
    self.layer.contents = image;
    // self.backingScaleFactor is set from the window's backingScaleFactor
    self.layer.contentsScale = self.backingScaleFactor;
    self.layer.contentsCenter = NSMakeRect(0.5, 0.5, 0, 0);
    self.layer.contentsGravity = kCAGravityResize;
}
And here's some test drawing code (creating the image supplied by imageForCurrentScaleFactor above):
CGFloat width = rect.size.width;
CGFloat height = rect.size.height;
NSBitmapImageRep *imageRep = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:NULL
                                                                      pixelsWide:width * scaleFactor
                                                                      pixelsHigh:height * scaleFactor
                                                                   bitsPerSample:8
                                                                 samplesPerPixel:4
                                                                        hasAlpha:YES
                                                                        isPlanar:NO
                                                                  colorSpaceName:NSCalibratedRGBColorSpace
                                                                     bytesPerRow:0
                                                                    bitsPerPixel:0];
[imageRep setSize:rect.size];
[NSGraphicsContext saveGraphicsState];
NSGraphicsContext *ctx = [NSGraphicsContext graphicsContextWithBitmapImageRep:imageRep];
[NSGraphicsContext setCurrentContext:ctx];
[[NSColor whiteColor] setFill];
[NSBezierPath fillRect:rect];
[[NSColor blackColor] setStroke];
[NSBezierPath setDefaultLineWidth:1.0f];
[NSBezierPath strokeRect:insetRect];
[NSGraphicsContext restoreGraphicsState];
// image for CALayer.contents is now [imageRep CGImage]
The solution (if you're talking about the problem I think you're talking about) is to have a margin of transparent pixels forming the outside edges of the image. One pixel thick, all the way around, will do it. The reason is that the problem (if it's the problem I think it is) arises only with visible pixels that touch the outside edge of the image. Therefore the idea is to have no visible pixels touch the outside edge of the image.
I have found a practical answer, but would be interested in comments filling in detail from anyone who knows how this works.
The problem did prove to be to do with how the CALayer was being stretched. I was drawing into a bitmap of arbitrary size, on the basis that (as the CALayer docs suggest) use of a .contentsCenter with zero width and height would in effect do a nine-part-image stretch, selecting the single centre pixel as the central stretching portion. With this bitmap as a layer's .contents, I could then resize the CALayer to any desired size (down or up).
Turns out that the 'arbitrary size' was the problem. Something odd happens in the way CALayer stretches the edge portions (at least when resizing down). By instead making the initial frame for drawing tiny (i.e. just big enough to fit my outline drawing plus a couple of pixels for the central stretching portion), nothing spurious makes its way into the edges during stretching.
The bitmap stretches properly if created with a rect just big enough to fit the contents and the stretchable centre pixel, i.e.:
NSRect rect = NSMakeRect(0, 0, lineWidth * 2 + 2, lineWidth * 2 + 2);
This tiny image stretches to any larger size perfectly.
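For completeness, here is a rough sketch of how the tiny image might be built and attached to the layer. It assumes a 1-point line width and a 1x backing scale for brevity; under ARC the contents assignment needs a __bridge cast, and under manual reference counting you would release the rep when done:
CGFloat lineWidth = 1.0;
NSRect rect = NSMakeRect(0, 0, lineWidth * 2 + 2, lineWidth * 2 + 2);
NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:NULL
                                                                 pixelsWide:(NSInteger)rect.size.width
                                                                 pixelsHigh:(NSInteger)rect.size.height
                                                              bitsPerSample:8
                                                            samplesPerPixel:4
                                                                   hasAlpha:YES
                                                                   isPlanar:NO
                                                             colorSpaceName:NSCalibratedRGBColorSpace
                                                                bytesPerRow:0
                                                               bitsPerPixel:0];
[NSGraphicsContext saveGraphicsState];
[NSGraphicsContext setCurrentContext:[NSGraphicsContext graphicsContextWithBitmapImageRep:rep]];
[[NSColor whiteColor] setFill];
[NSBezierPath fillRect:rect];                        // give the bitmap a defined background
[[NSColor blackColor] setStroke];
[NSBezierPath strokeRect:NSInsetRect(rect, lineWidth / 2, lineWidth / 2)]; // outline fully inside the edges
[NSGraphicsContext restoreGraphicsState];

self.layer.contents = (id)[rep CGImage];             // (__bridge id) under ARC
self.layer.contentsCenter = CGRectMake(0.5, 0.5, 0, 0); // stretch the single centre pixel
self.layer.contentsGravity = kCAGravityResize;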

NSBezierPath with transparent fill

I've got an NSBezierPath that needs to have a semi-transparent fill. When I fill it with a solid color, I get the expected result. However, when it is filled with a semi-transparent color I get a path with a rounded stroke but an odd, rectangular fill. It looks like:
Instead of filling the entire area, I get a filled rectangle inside the stroke with a small, unfilled border. I set up my path as follows:
NSBezierPath *menuItem = [NSBezierPath bezierPathWithRoundedRect:menuItemRect xRadius:3 yRadius:3];
[menuItem setLineWidth:4.0];
[menuItem setLineJoinStyle:NSRoundLineJoinStyle];
[[NSColor whiteColor] set];
[menuItem stroke];
[[NSColor colorWithCalibratedRed:0.000 green:0.000 blue:0.000 alpha:0.500] set];
[menuItem fill];
If anyone's got any ideas, that would be great.
Thanks
The semi-transparent fill is overlapping the stroke: NSBezierPath strokes centred on the path, so half of the 4-point line width falls inside the path, and that overlap is what causes that small border. To fix this, I think you'll need to create another bezier path so the fill and the stroke don't overlap each other.
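One way to do that, as a sketch only (the 4-point width and 3-point radius are taken from the snippet above, and the inner corner radius is left unadjusted for brevity), is to fill a second path inset by half the stroke width so the semi-transparent fill stops at the inner edge of the white stroke:
CGFloat lineWidth = 4.0;
NSBezierPath *strokePath = [NSBezierPath bezierPathWithRoundedRect:menuItemRect xRadius:3 yRadius:3];
[strokePath setLineWidth:lineWidth];
[strokePath setLineJoinStyle:NSRoundLineJoinStyle];

// Fill path inset by half the stroke width, so fill and stroke never overlap.
NSRect fillRect = NSInsetRect(menuItemRect, lineWidth / 2, lineWidth / 2);
NSBezierPath *fillPath = [NSBezierPath bezierPathWithRoundedRect:fillRect xRadius:3 yRadius:3];

[[NSColor colorWithCalibratedRed:0.000 green:0.000 blue:0.000 alpha:0.500] set];
[fillPath fill];
[[NSColor whiteColor] set];
[strokePath stroke];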

How to get NSBezierPaths to stroke with a consistent line width

In my NSView subclass, in drawRect I stroke a number of NSBezierPaths. I would like the lines drawn as a result of these strokes to have exactly the same width, preferably just a couple of pixels wide, no matter the scaling of the view. Here's my drawRect:
- (void)drawRect:(NSRect)dirtyRect
{
    NSSize x = [self convertSize:NSMakeSize(1, 1) fromView:nil];
    printf("size = %f %f\n", x.width, x.height);
    for (NSBezierPath *path in self.paths) {
        [path setLineWidth:x.width];
        [path stroke];
    }
}
Here's a screenshot of what I am seeing:
Can anyone suggest how I can get the crisp, consistent path outlines that I am looking for?
Thanks.
Try to match the exact pixels of the device (more difficult since the iPhone 5).
Do not use coordinates on half points like 0.5 (they work on Retina screens, but on non-Retina screens they are unsharp).
The line width extends half to the left (or up) and half to the right (or down).
So if you have a lineWidth of 2 and coordinates at integer values, it should be sharp.
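On OS X you can also let AppKit do the pixel snapping. A rough sketch of the idea for a single rectangle, assuming the view itself is unscaled; someRect is a hypothetical source rect, and backingAlignedRect:options: requires 10.7 or later:
CGFloat scale = self.window.backingScaleFactor;
// Snap the rect's edges to device pixels, then use a line width that is a
// whole (even) number of device pixels so the stroke covers full pixels.
NSRect alignedRect = [self backingAlignedRect:someRect options:NSAlignAllEdgesNearest];
NSBezierPath *path = [NSBezierPath bezierPathWithRect:alignedRect];
[path setLineWidth:2.0 / scale];   // exactly two device pixels wide
[path stroke];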

PNG to NSImage and back causes jaggies near transparency

I'm resizing some PNG files from within a Cocoa app. The files are eventually loaded as OpenGL textures by another app, and a poorly-written shader is applied, which at one point, does the following:
texColor = mix(constant,vec4(texColor.rgb/texColor.a,texColor.a),texColor.a);
Dividing by alpha is a bad idea, and the solution is to ensure that the RGB components of texColor in that step never go above 1. However! For curiosity's sake:
The original PNGs (created in GIMP), surprisingly work fine, and resized versions created with GIMP work fine as well. However, resizing the files using the code below causes the textures to have jaggies near any transparent pixels, even if percent is 1.0. Any idea what it is that I'm unwittingly changing about these images that suddenly causes the shader's bug to present itself?
NSImage *originalImage = [[NSImage alloc] initWithData:[currentFile regularFileContents]];
NSSize newSize = NSMakeSize([originalImage size].width * percent, [originalImage size].height * percent);
NSImage *resizedImage = [[NSImage alloc] initWithSize:newSize];
[resizedImage lockFocus];
[originalImage drawInRect:NSMakeRect(0, 0, newSize.width, newSize.height)
                 fromRect:NSMakeRect(0, 0, [originalImage size].width, [originalImage size].height)
                operation:NSCompositeCopy
                 fraction:1.0];
[resizedImage unlockFocus];
NSBitmapImageRep *bits = [[[NSBitmapImageRep alloc] initWithCGImage:[resizedImage CGImageForProposedRect:nil context:nil hints:nil]] autorelease];
NSData *data = [bits representationUsingType:NSPNGFileType properties:nil];
NSFileWrapper *newFile = [[[NSFileWrapper alloc] initRegularFileWithContents:data] autorelease];
[newFile setPreferredFilename:currentFilename];
[folder removeFileWrapper:currentFile];
[folder addFileWrapper:newFile];
[originalImage release];
[resizedImage release];
I typically set image interpolation to high when doing these kinds of resizing operations. This may be your issue.
[resizedImage lockFocus];
[NSGraphicsContext saveGraphicsState];
[[NSGraphicsContext currentContext] setImageInterpolation:NSImageInterpolationHigh];
[originalImage drawInRect:...]
[NSGraphicsContext restoreGraphicsState];
[resizedImage unlockFocus];
Another thing to make sure you're doing, though it may not help (see below):
[[NSGraphicsContext currentContext] setShouldAntialias:YES];
This may not fix it because you can't anti-alias without knowing the target background. But it still might help. If this is the problem (that you can't anti-alias this soon), you may have to composite this resizing at the point that you're ready to draw the final image.
What is the DPI of your source PNG? You are creating the second image by assuming that the original image's size is in pixels, but size is in points.
Suppose you have an image that is 450 pixels by 100 pixels, with DPI of 300. That image is, in real world units, 1 1/2 inches x 1/3 inches.
Now, points in Cocoa are nominally 1/72 of an inch. The size of the image in points is 108 x 24.
If you then create a new image based on that size, there's no DPI specified, so the assumption is one pixel per point. You're creating a much smaller image, which means that fine features are going to have to be approximated more coarsely.
You will have better luck if you pick one of the image reps of the original image and use its pixelsWide and pixelsHigh values. When you do this, however, the new image will have a different real world size than the original. In my example, the original was 1 1/2 x 1/3 inches. The new image will have the same pixel dimensions (450 x 100) but at 72 dpi, so it will be 6.25 x 1.39 inches. To fix this, you'll need to set the size of the new bitmap rep in points to the size of the original in points.
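Here is a sketch of that suggestion, reworking the resizing code above so the new size comes from the original bitmap rep's pixel dimensions. It assumes the first representation of the PNG is the NSBitmapImageRep and keeps the original variable names:
NSBitmapImageRep *sourceRep = (NSBitmapImageRep *)[[originalImage representations] objectAtIndex:0];
NSSize pixelSize = NSMakeSize(sourceRep.pixelsWide * percent, sourceRep.pixelsHigh * percent);

NSImage *resizedImage = [[NSImage alloc] initWithSize:pixelSize];
[resizedImage lockFocus];
[[NSGraphicsContext currentContext] setImageInterpolation:NSImageInterpolationHigh];
[originalImage drawInRect:NSMakeRect(0, 0, pixelSize.width, pixelSize.height)
                 fromRect:NSZeroRect   // draw the entire source image
                operation:NSCompositeCopy
                 fraction:1.0];
[resizedImage unlockFocus];

NSBitmapImageRep *newRep = [[[NSBitmapImageRep alloc] initWithCGImage:
    [resizedImage CGImageForProposedRect:nil context:nil hints:nil]] autorelease];
// Keep the real-world (point) size consistent with the original, scaled by percent.
[newRep setSize:NSMakeSize([originalImage size].width * percent, [originalImage size].height * percent)];
NSData *data = [newRep representationUsingType:NSPNGFileType properties:nil];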

NSBezierPath rounded rectangle has bad corners

I have an NSBezierPath that makes a rounded rectangle, but the corners of it look choppy and appear brighter than the rest of the stroke when viewed at full scale. My code is:
NSBezierPath *path = [NSBezierPath bezierPath];
[path appendBezierPathWithRoundedRect:NSMakeRect(0, 0, [self bounds].size.width, [self bounds].size.height) xRadius:5 yRadius:5];
NSGradient *fill = [[NSGradient alloc] initWithColorsAndLocations:
    [NSColor colorWithCalibratedRed:0.247 green:0.251 blue:0.267 alpha:0.6], 0.0,
    [NSColor colorWithCalibratedRed:0.227 green:0.227 blue:0.239 alpha:0.6], 0.5,
    [NSColor colorWithCalibratedRed:0.180 green:0.188 blue:0.196 alpha:0.6], 0.5,
    [NSColor colorWithCalibratedRed:0.137 green:0.137 blue:0.157 alpha:0.6], 1.0,
    nil];
[fill drawInBezierPath:path angle:-90.0];
[[NSColor lightGrayColor] set];
[path stroke];
Here's a picture of 2 of the corners (it's not as obvious in a small picture):
Anyone know what's causing this? Am I just missing something?
Thanks for any help
The straight lines of the roundrect are exactly on the borders of the view, so half the width of each line is getting cut off. (As if they were on a subpixel.)
Try changing
NSMakeRect(0, 0, [self bounds].size.width, [self bounds].size.height)
to
NSMakeRect(0.5, 0.5, [self bounds].size.width - 1, [self bounds].size.height - 1)
If an NSBezierPath ever looks a bit weird or blurry, try shifting it over half a pixel.
Take a look at the setFlatness: method in the NSBezierPath docs. It controls how smooth rendered curves are. I believe setting it to a smaller number (the default being 0.6) will yield smoother curves, at the cost of more computation (though for simple paths, I doubt it matters a whole lot).
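For example, a small sketch that lowers the flatness and also applies the half-pixel inset suggested above:
NSBezierPath *path = [NSBezierPath bezierPath];
[path setFlatness:0.1]; // default is 0.6; smaller values give smoother curves
[path appendBezierPathWithRoundedRect:NSInsetRect([self bounds], 0.5, 0.5) xRadius:5 yRadius:5];
[[NSColor lightGrayColor] set];
[path stroke];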
