In my NSView subclass, in drawRect I stroke a number of NSBezierPaths. I would like the lines drawn as a result of these strokes to have exactly the same width, preferably just a couple of pixels wide, no matter the scaling of the view. Here's my drawRect:
- (void)drawRect:(NSRect)dirtyRect
{
    NSSize x = [self convertSize:NSMakeSize(1, 1) fromView:nil];
    printf("size = %f %f\n", x.width, x.height);
    for (NSBezierPath *path in self.paths) {
        [path setLineWidth:x.width];
        [path stroke];
    }
}
Here's a screenshot of what I am seeing:
(source: crb at www.sonic.net)
Can anyone suggest how I can get the crisp, consistent path outlines that I am looking for?
Thanks.
Try to match the exact pixels of the device (more difficult since the iPhone 5).
Do not use coordinates at half points like 0.5 (they work on Retina, but on non-Retina displays they look blurry).
The line width extends half to the left (or up) and half to the right (or down).
So if you have a lineWidth of 2 and coordinates at integer values, the stroke should be sharp.
In macOS programming, we know that:
Quartz uses a coordinate space where the origin (0, 0) is at the top-left of the primary display. Increasing y goes down.
Cocoa uses a coordinate space where the origin (0, 0) is the bottom-left of the primary display and increasing y goes up.
Now I am using a Quartz API, CGImageCreateWithImageInRect, to crop an image, which takes a rectangle as a parameter. The rect's Y origin comes from Cocoa's mouseDown events.
Thus I get crops at inverted locations...
I tried this code to flip the Y coordinate of my cropRect:
// Get the point in the mouseDragged event
NSPoint currentPoint = [self.view convertPoint:[theEvent locationInWindow] fromView:nil];
CGRect nsRect = CGRectMake(currentPoint.x, currentPoint.y,
                           circleSizeW, circleSizeH);
// Now flip the Y please!
CGFloat flippedY = self.imageView.frame.size.height - NSMaxY(nsRect);
CGRect cropRect = CGRectMake(currentPoint.x, flippedY, circleSizeW, circleSizeH);
But for areas near the top, I get wrong flippedY coordinates.
If I click near the top edge of the view, I get flippedY = 510 to 515.
At the top edge it should be between 0 and 10 :-|
Can someone point me to the correct and reliable way to flip the Y coordinate in such circumstances? Thank you!
Here is a sample project on GitHub highlighting the issue:
https://github.com/kamleshgk/SampleMacOSApp
As Charles mentioned, the Core Graphics API you are using requires coordinates relative to the image (not the screen). The important thing is to convert the event location from window coordinates to the view which most closely corresponds to the image's location and then flip it relative to that same view's bounds (not frame). So:
NSView *relevantView = /* only you know which view */;
NSPoint currentPoint = [relevantView convertPoint:[theEvent locationInWindow] fromView:nil];
// currentPoint is in Cocoa's y-up coordinate system, relative to relevantView, which hopefully corresponds to your image's location
currentPoint.y = NSMaxY(relevantView.bounds) - currentPoint.y;
// currentPoint is now flipped to be in Quartz's y-down coordinate system, still relative to relevantView/your image
The rect you pass to CGImageCreateWithImageInRect should be in coordinates relative to the input image's size, not screen coordinates. Assuming the size of the input image matches the size of the view to which you've converted your point, you should be able to achieve this by subtracting the rect's corner from the image's height, rather than the screen height.
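The flip itself is plain arithmetic: subtract the rect's Cocoa y plus its height from the container's height, so the rect's top-left corner becomes its Quartz origin. A minimal C sketch of that conversion (flipRectY is a hypothetical helper name):

```c
/* Convert a rect's origin.y from Cocoa's y-up coordinate space to
 * Quartz's y-down space, given the height of the containing view or
 * image. In Cocoa the rect spans [cocoaY, cocoaY + rectHeight]; in
 * Quartz its top edge becomes its origin. */
static double flipRectY(double cocoaY, double rectHeight, double containerHeight) {
    return containerHeight - (cocoaY + rectHeight);
}
```

Using the question's numbers: in a 520-point-tall view, a 10-point rect clicked near the top (Cocoa y around 510) flips to a Quartz y near 0, and a rect at the Cocoa bottom (y = 0) flips to 510.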
I am setting the .contents of a CALayer to a CGImage, derived from drawing into an NSBitmapImageRep.
As far as I understand from the docs and WWDC videos, setting the layer's .contentsCenter to an NSRect like {{0.5, 0.5}, {0, 0}}, in combination with a .contentsGravity of kCAGravityResize should lead to Core Animation resizing the layer by stretching the middle pixel, the top and bottom horizontally, and the sides vertically.
This very nearly works, but not quite. The layer resizes more-or-less correctly, but if I draw lines at the edge of the bitmap, as I resize the window the lines can be seen to fluctuate in thickness very slightly. It's subtle enough to be barely a problem until the resizing gets down to around 1/4 of the original layer's size, below which point the lines can thin and disappear altogether. If I draw the bitmaps multiple times at different sizes, small differences in line thickness are very apparent.
I originally suspected a pixel-alignment issue, but it can't be that, because the thickness of the stationary left-hand edge (for example) fluctuates as I resize the right-hand edge. It happens on 1x and 2x screens.
Here's some test code. It's the updateLayer method from a layer-backed NSView subclass (I'm using the alternative non-DrawRect draw path):
- (void)updateLayer {
    id image = [self imageForCurrentScaleFactor]; // CGImage
    self.layer.contents = image;
    // self.backingScaleFactor is set from the window's backingScaleFactor
    self.layer.contentsScale = self.backingScaleFactor;
    self.layer.contentsCenter = NSMakeRect(0.5, 0.5, 0, 0);
    self.layer.contentsGravity = kCAGravityResize;
}
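As a side note on why {{0.5, 0.5}, {0, 0}} picks out the middle pixel: contentsCenter is specified in the unit coordinate space of the image, so Core Animation scales it by the image's pixel dimensions to find the stretchable patch. A C sketch of that mapping (the Rect type and centerPatch helper are hypothetical, for illustration only):

```c
typedef struct { double x, y, w, h; } Rect;

/* Map a unit-space contentsCenter rect to pixel coordinates for an
 * image of the given size. A zero-size rect at (0.5, 0.5) lands on the
 * single point at the image's middle, which Core Animation expands to
 * the center pixel when stretching. */
static Rect centerPatch(Rect unitCenter, double imgW, double imgH) {
    Rect r = { unitCenter.x * imgW, unitCenter.y * imgH,
               unitCenter.w * imgW, unitCenter.h * imgH };
    return r;
}
```

For a 40 x 40 image, the default-style value {{0.5, 0.5}, {0, 0}} maps to a zero-size patch at pixel (20, 20).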
And here's some test drawing code (creating the image supplied by imageForCurrentScaleFactor above):
CGFloat width = rect.size.width;
CGFloat height = rect.size.height;
NSBitmapImageRep *imageRep = [[NSBitmapImageRep alloc]
    initWithBitmapDataPlanes:NULL
                  pixelsWide:width * scaleFactor
                  pixelsHigh:height * scaleFactor
               bitsPerSample:8
             samplesPerPixel:4
                    hasAlpha:YES
                    isPlanar:NO
              colorSpaceName:NSCalibratedRGBColorSpace
                 bytesPerRow:0
                bitsPerPixel:0];
[imageRep setSize:rect.size];
[NSGraphicsContext saveGraphicsState];
NSGraphicsContext *ctx = [NSGraphicsContext graphicsContextWithBitmapImageRep:imageRep];
[NSGraphicsContext setCurrentContext:ctx];
[[NSColor whiteColor] setFill];
[NSBezierPath fillRect:rect];
[[NSColor blackColor] setStroke];
[NSBezierPath setDefaultLineWidth:1.0f];
[NSBezierPath strokeRect:insetRect];
[NSGraphicsContext restoreGraphicsState];
// image for CALayer.contents is now [imageRep CGImage]
The solution (if you're talking about the problem I think you're talking about) is to have a margin of transparent pixels forming the outside edges of the image. One pixel thick, all the way around, will do it. The reason is that the problem (if it's the problem I think it is) arises only with visible pixels that touch the outside edge of the image. Therefore the idea is to have no visible pixels touch the outside edge of the image.
I have found a practical answer, but would be interested in comments filling in detail from anyone who knows how this works.
The problem did prove to be to do with how the CALayer was being stretched. I was drawing into a bitmap of arbitrary size, on the basis that (as the CALayer docs suggest) use of a .contentsCenter with zero width and height would in effect do a nine-part-image stretch, selecting the single centre pixel as the central stretching portion. With this bitmap as a layer's .contents, I could then resize the CALayer to any desired size (down or up).
Turns out that the 'arbitrary size' was the problem. Something odd happens in the way CALayer stretches the edge portions (at least when resizing down). By instead making the initial frame for drawing tiny (i.e. just big enough to fit my outline drawing plus a couple of pixels for the central stretching portion), nothing spurious makes its way into the edges during stretching.
The bitmap stretches properly if created with rect just big enough to fit the contents and stretchable center pixel, ie.:
NSRect rect = NSMakeRect(0, 0, lineWidth * 2 + 2, lineWidth * 2 + 2);
This tiny image stretches to any larger size perfectly.
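The sizing rule above is simple arithmetic: room for the stroked border on each side, plus a couple of points of stretchable center. A C sketch (minimalStretchableExtent is a hypothetical helper, matching the rect computed above):

```c
/* Smallest square extent (in points) that holds a stroked border of
 * the given line width on each side plus a 2-point stretchable
 * center patch. */
static double minimalStretchableExtent(double lineWidth) {
    return lineWidth * 2 + 2;  /* two borders + center */
}
```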
I'm resizing some PNG files from within a Cocoa app. The files are eventually loaded as OpenGL textures by another app, and a poorly-written shader is applied, which at one point, does the following:
texColor = mix(constant,vec4(texColor.rgb/texColor.a,texColor.a),texColor.a);
Dividing by alpha is a bad idea, and the solution is to ensure that the RGB components of texColor in that step never go above 1. However! For curiosity's sake:
The original PNGs (created in GIMP), surprisingly work fine, and resized versions created with GIMP work fine as well. However, resizing the files using the code below causes the textures to have jaggies near any transparent pixels, even if percent is 1.0. Any idea what it is that I'm unwittingly changing about these images that suddenly causes the shader's bug to present itself?
NSImage* originalImage = [[NSImage alloc] initWithData:[currentFile regularFileContents]];
NSSize newSize = NSMakeSize([originalImage size].width * percent, [originalImage size].height * percent);
NSImage* resizedImage = [[NSImage alloc] initWithSize:newSize];
[resizedImage lockFocus];
[originalImage drawInRect:NSMakeRect(0,0,newSize.width,newSize.height)
fromRect:NSMakeRect(0,0,[originalImage size].width, [originalImage size].height)
operation:NSCompositeCopy fraction:1.0];
[resizedImage unlockFocus];
NSBitmapImageRep* bits = [[[NSBitmapImageRep alloc] initWithCGImage:[resizedImage CGImageForProposedRect:nil context:nil hints:nil]] autorelease];
NSData* data = [bits representationUsingType:NSPNGFileType properties:nil];
NSFileWrapper* newFile = [[[NSFileWrapper alloc] initRegularFileWithContents:data] autorelease];
[newFile setPreferredFilename:currentFilename];
[folder removeFileWrapper:currentFile];
[folder addFileWrapper:newFile];
[originalImage release];
[resizedImage release];
I typically set image interpolation to high when doing these kinds of resizing operations. This may be your issue.
[resizedImage lockFocus];
[NSGraphicsContext saveGraphicsState];
[[NSGraphicsContext currentContext] setImageInterpolation:NSImageInterpolationHigh];
[originalImage drawInRect:...]
[NSGraphicsContext restoreGraphicsState];
[resizedImage unlockFocus];
Another thing to make sure you're doing, though it may not help (see below):
[[NSGraphicsContext currentContext] setShouldAntialias:YES];
This may not fix it because you can't anti-alias without knowing the target background. But it still might help. If this is the problem (that you can't anti-alias this soon), you may have to composite this resizing at the point that you're ready to draw the final image.
What is the DPI of your source PNG? You are creating the second image by assuming that the original image's size is in pixels, but size is in points.
Suppose you have an image that is 450 pixels by 100 pixels, with DPI of 300. That image is, in real world units, 1 1/2 inches x 1/3 inches.
Now, points in Cocoa are nominally 1/72 of an inch. The size of the image in points is 108 x 24.
If you then create a new image based on that size, there's no DPI specified, so the assumption is one pixel per point. You're creating a much smaller image, which means that fine features are going to have to be approximated more coarsely.
You will have better luck if you pick one of the image reps of the original image and use its pixelsWide and pixelsHigh values. When you do this, however, the new image will have a different real world size than the original. In my example, the original was 1 1/2 x 1/3 inches. The new image will have the same pixel dimensions (450 x 100) but at 72 dpi, so it will be 6.25 x 1.39 inches. To fix this, you'll need to set the size of the new bitmap rep in points to the size of the original in points.
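The point/pixel relationship in this example boils down to: pixels = points × DPI / 72. A C sketch of the conversion (the helper names are my own):

```c
/* Convert between points and pixels at a given DPI.
 * Cocoa points are nominally 1/72 of an inch. */
static double pointsToPixels(double points, double dpi) {
    return points * dpi / 72.0;
}

static double pixelsToPoints(double pixels, double dpi) {
    return pixels * 72.0 / dpi;
}
```

Checking the numbers from the example: a 450 x 100 pixel image at 300 DPI is 108 x 24 points, and 108 points at 300 DPI is 450 pixels again.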
I have an NSBezierPath that makes a rounded rectangle, but the corners of it look choppy and appear brighter than the rest of the stroke when viewed at full scale. My code is:
NSBezierPath *path = [NSBezierPath bezierPath];
[path appendBezierPathWithRoundedRect:NSMakeRect(0, 0, [self bounds].size.width, [self bounds].size.height) xRadius:5 yRadius:5];
NSGradient *fill = [[NSGradient alloc] initWithColorsAndLocations:
    [NSColor colorWithCalibratedRed:0.247 green:0.251 blue:0.267 alpha:0.6], 0.0,
    [NSColor colorWithCalibratedRed:0.227 green:0.227 blue:0.239 alpha:0.6], 0.5,
    [NSColor colorWithCalibratedRed:0.180 green:0.188 blue:0.196 alpha:0.6], 0.5,
    [NSColor colorWithCalibratedRed:0.137 green:0.137 blue:0.157 alpha:0.6], 1.0,
    nil];
[fill drawInBezierPath:path angle:-90.0];
[[NSColor lightGrayColor] set];
[path stroke];
Here's a picture of 2 of the corners (it's not as obvious in a small picture):
Anyone know what's causing this? Am I just missing something?
Thanks for any help
The straight lines of the roundrect are exactly on the borders of the view, so half the width of each line is getting cut off. (As if they were on a subpixel.)
Try changing
NSMakeRect(0, 0, [self bounds].size.width, [self bounds].size.height)
to
NSMakeRect(0.5, 0.5, [self bounds].size.width - 1, [self bounds].size.height - 1)
If an NSBezierPath ever looks a bit weird or blurry, try shifting it over half a pixel.
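The half-pixel inset generalizes to any stroke width: move each edge inward by half the line width so the stroke straddles whole pixels and stays inside the view. A C sketch of the arithmetic (the Rect type and insetForStroke helper are hypothetical):

```c
typedef struct { double x, y, w, h; } Rect;

/* Inset a rect by half the stroke width on every side, so a stroke
 * centered on the resulting path stays inside the original bounds. */
static Rect insetForStroke(Rect bounds, double lineWidth) {
    double half = lineWidth / 2.0;
    Rect r = { bounds.x + half, bounds.y + half,
               bounds.w - lineWidth, bounds.h - lineWidth };
    return r;
}
```

For a 1-point stroke in a 100 x 50 view, this yields exactly the suggested rect: origin (0.5, 0.5), size (99, 49).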
Take a look at the setFlatness: method in the NSBezierPath docs. It controls how smooth rendered curves are. I believe setting it to a smaller number (the default being .6) will yield smoother curves, at the cost of more computation (though for simple paths, I doubt it matters a whole lot).
The use case: I am subclassing UIView to create a custom view that "mattes" a UIImage with a rounded rectangle (clips the image to a rounded rect). The code is working; I've used a method similar to this question.
However, I want to stroke the clipping path to create a "frame". This works, but the arc strokes look markedly different than the line strokes. I've tried adjusting the stroke widths to greater values (I thought it was pixelation at first), but the anti-aliasing seems to handle arcs and lines differently.
Here's what I see on the simulator:
This is the code that draws it:
CGContextSetRGBStrokeColor(context, 0, 0, 0, STROKE_OPACITY);
CGContextSetLineWidth(context, 2.0f);
CGContextAddPath(context, roundRectPath);
CGContextStrokePath(context);
Anyone know how to make these line up smoothly?
… but the anti-aliasing seems to handle arcs and lines differently.
No, it doesn't.
Your stroke width is consistent—it's 2 pt all the way around.
What's wrong is that you have clipped to a rectangle, and your shape's sides are right on top of the edges of this rectangle, so only the halves of the sides that are inside the rectangle are getting drawn. That's why the edges appear only 1 px wide.
The solution is either not to clip, to grow your clipping rectangle by 2 pt on each axis before clipping to it, or to move your shape's edges inward by 1 pt on each side. (ETA: Or, yeah, do an inner stroke.)
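The "grow the clipping rectangle" option is easy to get wrong by a factor of two: half the stroke width hangs outside the path on each side, so the clip must grow by half the width per edge, which is one full width per axis. A C sketch (the Rect type and growClipForStroke helper are hypothetical):

```c
typedef struct { double x, y, w, h; } Rect;

/* Expand a clipping rect so a stroke of the given width, centered on a
 * shape edge lying on the clip boundary, is no longer cut off. Each
 * edge moves out by half the stroke width. */
static Rect growClipForStroke(Rect clip, double lineWidth) {
    double half = lineWidth / 2.0;
    Rect r = { clip.x - half, clip.y - half,
               clip.w + lineWidth, clip.h + lineWidth };
    return r;
}
```

For the 2-point stroke in the question, a 100 x 100 clip grows by 1 point on each edge (2 points per axis), matching the suggestion above.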
Just in case anyone is trying to do the same thing I am (round rect an image):
The UIImageView class has a property layer, of type CALayer. CALayer already has this functionality built in (it WAS a little surprising to me that I couldn't find it anywhere):
UIImageView *thumbnailView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"foo.png"]];
thumbnailView.layer.masksToBounds = YES;
thumbnailView.layer.cornerRadius = 15.0f;
thumbnailView.layer.borderWidth = 2.0f;
[self.view addSubview:thumbnailView];
Also does the trick.