Updating Apple's LayerBackedOpenGLView example for high resolution - macos

When I run the LayerBackedOpenGLView example on a high-resolution display, the OpenGL content is not rendered at high resolution (the layer's contentsScale stays at 1.0).
I've followed the steps in Apple's documentation, but the contentsScale is still 1.0.
Specifically, I would have thought that adding the following to MyOpenGLView's init method would give me a high-resolution layer:
[self setWantsBestResolutionOpenGLSurface:YES];
But the contentsScale is still 1.0.
What are the steps required to update the example to render OpenGL in high resolution?

I was able to get the appropriate content scale with two changes.
First, I set self.wantsLayer = YES in -[MyOpenGLView init]. Even though MainController sets wantsLayer in -awakeFromNib, that doesn't seem to be early enough.
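A minimal sketch of what that init looks like (assuming the view is created via -initWithFrame:; adapt to whatever initializer the sample actually uses):

- (instancetype)initWithFrame:(NSRect)frameRect
{
    self = [super initWithFrame:frameRect];
    if (self) {
        // Both of these need to happen this early; setting wantsLayer in awakeFromNib was too late.
        self.wantsLayer = YES;
        [self setWantsBestResolutionOpenGLSurface:YES];
    }
    return self;
}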
Second, I added the following method to MyOpenGLView:
- (void)viewDidChangeBackingProperties
{
    [super viewDidChangeBackingProperties];
    self.layer.contentsScale = self.window.backingScaleFactor;
}
This was necessary despite the following claim in the documentation:
When it comes to high resolution, layer-backed views are scaled automatically by the system. You don’t have any work to do to get content that looks great on high-resolution displays.
UPDATE
According to an Apple dev, this is a known bug in OS X 10.10.
Hi. This is a known bug in 10.10.0. For the moment, you can work around it by adding this to your NSOpenGLView subclass (note this code will be harmless on older versions of Mac OS X). Sorry for the inconvenience!
static CGFloat scaleFactorForOpenGLView(NSView *view) {
    if ([view wantsBestResolutionOpenGLSurface]) {
        NSWindow *window = [view window];
        if (window) {
            return [window backingScaleFactor];
        }
    }
    return 1;
}

- (void)viewDidChangeBackingProperties {
    [super viewDidChangeBackingProperties];
    [[self layer] setContentsScale:scaleFactorForOpenGLView(self)];
}

- (CALayer *)makeBackingLayer {
    CALayer *layer = [super makeBackingLayer];
    [layer setContentsScale:scaleFactorForOpenGLView(self)];
    return layer;
}

Related

Quartz Composer Screensaver in Xcode

My aim is to bundle a Quartz Composer file into an Xcode project and build a .saver file. I am currently using the pre-made Xcode screen saver template but am having problems getting the screensaver to work. I import the .qtz file into the project and use a QCView to render it on screen; however, when I test the built .saver file, all I see is a black screen.
- (instancetype)initWithFrame:(NSRect)frame isPreview:(BOOL)isPreview
{
    self = [super initWithFrame:frame isPreview:isPreview];
    if (self) {
        [self setAnimationTimeInterval:1/30.0];
        NSRect viewBounds = [self bounds];
        // create the Quartz Composer view
        qcView = [[QCView alloc] initWithFrame:NSMakeRect(0, 0, viewBounds.size.width, viewBounds.size.height)];
        // make sure it resizes with the screensaver view
        [qcView setAutoresizingMask:(NSViewWidthSizable | NSViewHeightSizable)];
        // match its frame rate to the screensaver's
        [qcView setMaxRenderingFrameRate:30.0f];
        // get the location of the Quartz composition from the bundle
        NSString *compositionPath = [[NSBundle mainBundle] pathForResource:@"QuartzComposerFileName" ofType:@"qtz"];
        // load the composition
        [qcView loadCompositionFromFile:compositionPath];
        // add the Quartz composition view
        [self addSubview:qcView];
    }
    return self;
}
I have tried doing the same thing using Xcode 6.4, and it appears that QCView does not work there. I put an image in the view and set the size of my QCView to be smaller than the frame. I see a black box for the subview, but it never renders. This is true even for the most basic QC composition.
You might be able to get it to work by rendering the composition yourself using a QCRenderer. I gave up before trying this, but here is the documentation: https://developer.apple.com/library/mac/documentation/GraphicsImaging/Reference/QuartzFramework/Classes/QCRenderer_Class/
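For reference, here is a rough, untested sketch of what the QCRenderer route could look like inside the ScreenSaverView subclass. It assumes a QCRenderer *renderer instance variable, renders the composition offscreen, and draws a snapshot each frame; the composition file name is the same placeholder used above:

- (instancetype)initWithFrame:(NSRect)frame isPreview:(BOOL)isPreview
{
    self = [super initWithFrame:frame isPreview:isPreview];
    if (self) {
        [self setAnimationTimeInterval:1/30.0];
        // Screen savers run inside a host app, so load resources from the .saver
        // bundle itself rather than from +mainBundle.
        NSString *compositionPath = [[NSBundle bundleForClass:[self class]]
                                     pathForResource:@"QuartzComposerFileName" ofType:@"qtz"];
        QCComposition *composition = [QCComposition compositionWithFile:compositionPath];
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        renderer = [[QCRenderer alloc] initOffScreenWithSize:[self bounds].size
                                                  colorSpace:colorSpace
                                                 composition:composition];
        CGColorSpaceRelease(colorSpace);
    }
    return self;
}

- (void)animateOneFrame
{
    // Focus is already locked on the view when this is called, so we can draw directly.
    [renderer renderAtTime:[NSDate timeIntervalSinceReferenceDate] arguments:nil];
    [[renderer snapshotImage] drawInRect:[self bounds]
                                fromRect:NSZeroRect
                               operation:NSCompositeCopy
                                fraction:1.0];
}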

NSSearchField's content misaligned

Since upgrading to 10.9 Mavericks, I have noticed that the content of all NSSearchField instances is misaligned: the magnifying glass icon, the text field itself, and the clear button are all shifted down a little bit.
Any idea what could be the reason?
I could temporarily fix it by subclassing NSSearchField and setting a custom class as the cell class:
+ (void)load {
    [super load];
    [self setCellClass:[RMSearchFieldCell class]];
}
RMSearchFieldCell shifts the origin of the cell components by one point by overriding the searchTextRectForBounds:, searchButtonRectForBounds:, and cancelButtonRectForBounds: methods:
- (NSRect)cancelButtonRectForBounds:(NSRect)rect {
    NSRect superRect = [super cancelButtonRectForBounds:rect];
    superRect.origin.y -= 1;
    return superRect;
}
However, this is not an elegant way of doing it, and I'm still looking for the reason for the misalignment.

OpenGL ES high resolution on iPhone 4

I created an empty iOS project and then added a custom GLView class, which is added to the window in the AppDelegate. I have the following questions:
1) How do I enable high-resolution Retina mode on the iPhone 4? Currently I am using the following code to check for the device:
CGRect screenBounds = [[UIScreen mainScreen] bounds];
self.window = [[[UIWindow alloc] initWithFrame:screenBounds] autorelease];
// Override point for customization after application launch.
_view = [[GLView alloc] initWithFrame:screenBounds];
if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad) {
    NSLog(@"iPad detected");
}
else {
    if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)] && [[UIScreen mainScreen] scale] == 2) {
        NSLog(@"iPhone4 detected");
        _view.contentScaleFactor = [[UIScreen mainScreen] scale];
    }
    else {
        NSLog(@"iPhone detected");
    }
}
self.window.backgroundColor = [UIColor whiteColor];
//self.window.rootViewController = [[[UIViewController alloc] initWithNibName:nil bundle:nil] autorelease];
[self.window addSubview:_view];
But even after setting the content scale factor, it draws rather poor-quality polygons with jagged edges, as shown in the image below:
http://farm8.staticflickr.com/7358/8725549609_e2ed1e0e2a_b.jpg
Is there any way to set the resolution to 960x640 instead of the default 480x320?
Please note that I cannot use "someImage@2x.png" assets, because I am generating the images at runtime in the render buffer.
2) The second problem I am having is this warning message:
"Application windows are expected to have a root view controller at the end of application launch"
Thank you for your time.
As for the first question: I don't know the initialization pipeline of your GLView, but the content scale factor must be set before the renderbuffer storage is created (i.e. before the renderbufferStorage:fromDrawable: call). To check whether the buffer dimensions are correct (they should be 960x640), use:
GLint width;
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &width);
Even if the buffer is Retina-sized and the dimensions are correct, those polygons may still look jagged if you do not use any kind of anti-aliasing. The easiest way to get an anti-aliased GL view on iOS is probably multisampling; try searching for glResolveMultisampleFramebufferAPPLE() (you will need a few more lines besides that one, though).
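To make the ordering concrete, here is a rough sketch (not the poster's GLView code; the _context and _colorRenderbuffer ivars and the use of OpenGL ES 2.0 are assumptions) of a CAEAGLLayer-backed view that sets the scale factor before allocating the renderbuffer storage:

+ (Class)layerClass
{
    return [CAEAGLLayer class];   // back the view with an EAGL layer
}

- (id)initWithFrame:(CGRect)frame
{
    if ((self = [super initWithFrame:frame])) {
        // 1. Set the scale factor first (2.0 on Retina devices).
        self.contentScaleFactor = [[UIScreen mainScreen] scale];

        // 2. Only then create the context and allocate the renderbuffer storage,
        //    so the drawable comes out at 960x640 on an iPhone 4.
        _context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
        [EAGLContext setCurrentContext:_context];
        glGenRenderbuffers(1, &_colorRenderbuffer);
        glBindRenderbuffer(GL_RENDERBUFFER, _colorRenderbuffer);
        [_context renderbufferStorage:GL_RENDERBUFFER fromDrawable:(CAEAGLLayer *)self.layer];
    }
    return self;
}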

Two Finger Drag with IKImageView and NSScrollView in Mountain Lion

I have a Mac app that has been in the App Store for a year or so now. It was first published with target SDK 10.7 (Lion). Since the update to Mountain Lion it no longer works.
The application displays large images in an IKImageView which is embedded in an NSScrollView. The purpose of putting it into a scroll view was to get two-finger dragging working, rather than requiring the user to click to drag. Using Nicholas Riley's ScrollViewWorkaround, I was able to use two-finger scrolling to show the clipped content after the user had zoomed in, just like in the Preview app.
Nicholas Riley's Solution:
IKImageView and scroll bars
Now in Mountain Lion this doesn't work. After zooming in (via pinch or the zoom button), the image is stuck in the lower-left portion of the view, and it won't scroll.
So the question is: what's the appropriate way to display a large image in an IKImageView and get two-finger dragging of the zoomed image?
Thank you,
Stateful
Well, Nicholas Riley's Solution is an ugly hack in that it addresses the wrong class; the issue isn't with NSClipView (which he subclassed, but which works just fine as is), but with IKImageView.
The issue with IKImageView is actually quite simple (God knows why Apple hasn't fixed this in what? … 7 years ...): Its size does not adjust to the size of the image it displays. Now, when you embed an IKImageView in an NSScrollView, the scroll view obviously can only adjust its scroll bars relative to the size of the embedded IKImageView, not to the image it contains. And since the size of the IKImageView always stays the same, the scroll bars won't work as expected.
The following code subclasses IKImageView and fixes this behavior. Alas, it won't fix the fact that IKImageView is crash-prone in Mountain Lion as soon as you zoom …
///////////////////// HEADER FILE - FixedIKImageView.h

#import <Quartz/Quartz.h>

@interface FixedIKImageView : IKImageView
@end

///////////////////// IMPLEMENTATION FILE - FixedIKImageView.m

#import "FixedIKImageView.h"

@implementation FixedIKImageView

- (void)awakeFromNib
{
    // Compatibility with Auto Layout; without this, there could be Auto Layout error
    // messages when we are resized. (Delete this line if your app does not use Auto Layout.)
    [self setTranslatesAutoresizingMaskIntoConstraints:NO];
}

// FixedIKImageView must *only* be used embedded within an NSScrollView. This means that
// setFrame: should never be called explicitly from outside the scroll view. Instead, this
// method is overridden here to provide the correct behavior within a scroll view. The new
// implementation ignores the frameRect parameter.
- (void)setFrame:(NSRect)frameRect
{
    NSSize imageSize = [self imageSize];
    CGFloat zoomFactor = [self zoomFactor];
    NSSize clipViewSize = [[self superview] frame].size;

    // The content of our scroll view (which is ourselves) should stay at least as large as
    // the scroll clip view, so we make ourselves as large as the clip view in case our
    // (zoomed) image is smaller. However, if our image is larger than the clip view, we make
    // ourselves as large as the image, to make the scroll bars appear and scale appropriately.
    CGFloat newWidth = (imageSize.width * zoomFactor < clipViewSize.width) ? clipViewSize.width : imageSize.width * zoomFactor;
    CGFloat newHeight = (imageSize.height * zoomFactor < clipViewSize.height) ? clipViewSize.height : imageSize.height * zoomFactor;

    // The clip view is 1 pixel larger than the content view on each side, so we must take
    // that into account.
    [super setFrame:NSMakeRect(0, 0, newWidth - 2, newHeight - 2)];
}

// We forward size-affecting messages to our superclass, but add [self setFrame:NSZeroRect]
// to update the scroll bars. We also add [self setAutoresizes:NO]. Since IKImageView,
// instead of using [self setAutoresizes:NO], seems to set the autoresizes instance variable
// to NO directly, the scrollers would not be activated again without invoking
// [self setAutoresizes:NO] ourselves when these methods are invoked.
- (void)setZoomFactor:(CGFloat)zoomFactor
{
    [super setZoomFactor:zoomFactor];
    [self setFrame:NSZeroRect];
    [self setAutoresizes:NO];
}

- (void)zoomImageToRect:(NSRect)rect
{
    [super zoomImageToRect:rect];
    [self setFrame:NSZeroRect];
    [self setAutoresizes:NO];
}

- (void)zoomIn:(id)sender
{
    [super zoomIn:self];
    [self setFrame:NSZeroRect];
    [self setAutoresizes:NO];
}

- (void)zoomOut:(id)sender
{
    [super zoomOut:self];
    [self setFrame:NSZeroRect];
    [self setAutoresizes:NO];
}

- (void)zoomImageToActualSize:(id)sender
{
    [super zoomImageToActualSize:sender];
    [self setFrame:NSZeroRect];
    [self setAutoresizes:NO];
}

- (void)zoomImageToFit:(id)sender
{
    // Instead of invoking super's zoomImageToFit: method, which has problems of its own, we
    // invoke setAutoresizes:YES, which does the same thing but also makes sure the image
    // stays zoomed to fit even if the scroll view is resized, which is the most intuitive
    // behavior anyway. Since there are no scroll bars in autoresize mode, we need not add
    // [self setFrame:NSZeroRect].
    [self setAutoresizes:YES];
}

// As long as we autoresize, make sure that no scrollers flicker up occasionally during
// live update.
- (void)setAutoresizes:(BOOL)autoresizes
{
    [self setHasHorizontalScroller:!autoresizes];
    [self setHasVerticalScroller:!autoresizes];
    [super setAutoresizes:autoresizes];
}

@end

Why is my CAOpenGLLayer updating slower than my previous NSOpenGLView?

I have an application which renders OpenGL content on Mac OS X. Originally it rendered to an NSOpenGLView; then I changed it to render to a CAOpenGLLayer subclass.
When I did so I saw a huge performance loss: a halved frame rate, lower mouse responsiveness, stuttering (the app stops from time to time, for up to a second, during which the profiler reports waiting on a mutex for data to be loaded into GPU RAM), and doubled CPU usage.
I'm investigating this issue and had a few questions:
Has a similar performance hit been seen by someone else?
Am I doing something wrong with my CAOpenGLLayer setup?
How are CAOpenGLLayer and the Core Animation framework implemented, i.e. what path does my OpenGL content take from my glDrawElements calls to the screen, and what should I do on my side to optimize performance with such a setup?
Here's my code for CAOpenGLLayer setup:
// my application's entry point (can't be easily changed):
void AppUpdateLogic();          // update application logic; will load textures
void AppRender();               // executes drawing
void AppEventSink(NSEvent* ev); // handle mouse and keyboard events; will do pick renderings

@interface MyCAOpenGLLayer : CAOpenGLLayer
{
    CGLPixelFormatObj pixelFormat;
    CGLContextObj glContext;
}
@end

@implementation MyCAOpenGLLayer

- (id)init {
    self = [super init];
    CGLPixelFormatAttribute attributes[] =
    {
        kCGLPFAAccelerated,
        kCGLPFAColorSize, (CGLPixelFormatAttribute)24,
        kCGLPFAAlphaSize, (CGLPixelFormatAttribute)8,
        kCGLPFADepthSize, (CGLPixelFormatAttribute)16,
        (CGLPixelFormatAttribute)0
    };
    GLint numPixelFormats = 0;
    CGLChoosePixelFormat(attributes, &pixelFormat, &numPixelFormats);
    glContext = [super copyCGLContextForPixelFormat:pixelFormat];
    return self;
}

- (void)drawInCGLContext:(CGLContextObj)inGlContext
             pixelFormat:(CGLPixelFormatObj)inPixelFormat
            forLayerTime:(CFTimeInterval)timeInterval
             displayTime:(const CVTimeStamp *)timeStamp
{
    AppRender();
    [super drawInCGLContext:inGlContext
                pixelFormat:inPixelFormat
               forLayerTime:timeInterval
                displayTime:timeStamp];
}

- (void)releaseCGLPixelFormat:(CGLPixelFormatObj)pixelFormat {
    [self release];
}

- (CGLPixelFormatObj)copyCGLPixelFormatForDisplayMask:(uint32_t)mask {
    [self retain];
    return pixelFormat;
}

- (CGLContextObj)copyCGLContextForPixelFormat:(CGLPixelFormatObj)pixelFormat {
    [self retain];
    return glContext;
}

- (void)releaseCGLContext:(CGLContextObj)glContext {
    [self release];
}

@end
@interface MyMainViewController : NSViewController {
    CGLContextObj glContext;
    MyCAOpenGLLayer* myOpenGLLayer;
}
- (void)timerTriggered:(NSTimer*)timer;
@end

@implementation MyMainViewController

- (void)viewDidLoad:(NSView*)view {
    myOpenGLLayer = [[MyCAOpenGLLayer alloc] init];
    [view setLayer:myOpenGLLayer];
    [view setWantsLayer:YES];
    glContext = [myOpenGLLayer copyCGLContextForPixelFormat:nil];
    [NSTimer scheduledTimerWithTimeInterval:1/30.0
                                     target:self
                                   selector:@selector(timerTriggered:)
                                   userInfo:nil
                                    repeats:YES];
}

- (void)timerTriggered:(NSTimer*)timer {
    CGLContextObj oldContext = CGLGetCurrentContext();
    CGLSetCurrentContext(glContext);
    CGLLockContext(glContext);
    AppUpdateLogic();
    [myOpenGLLayer setNeedsDisplay];
    CGLUnlockContext(glContext);
    CGLSetCurrentContext(oldContext);
}

- (void)mouseDown:(NSEvent*)event {
    CGLContextObj oldContext = CGLGetCurrentContext();
    CGLSetCurrentContext(glContext);
    CGLLockContext(glContext);
    AppEventSink(event);
    CGLUnlockContext(glContext);
    CGLSetCurrentContext(oldContext);
}

@end
It may be useful to know that my video card isn't very powerful (an Intel GMA with 64 MB of shared memory).
In one of my applications, I switched from an NSOpenGLView to a CAOpenGLLayer, then ended up going back because of a few issues with the latter's update mechanism. However, those are different from the performance issues you're reporting here.
In your case, I believe that the way you're updating your layer contents may be to blame. First, using an NSTimer to trigger redraws does not guarantee that the updates will align well with your display's refresh rate. Instead, I'd suggest setting the CAOpenGLLayer's asynchronous property to YES and using -canDrawInCGLContext:pixelFormat:forLayerTime:displayTime: to manage the update frequency. This will cause the OpenGL layer to update in sync with the display, and it avoids the manual context locking you're doing.
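A minimal sketch of that setup, reusing the MyCAOpenGLLayer class from the question (this is not the poster's code; it replaces the NSTimer-driven redraw):

// In MyMainViewController, after creating the layer:
myOpenGLLayer.asynchronous = YES;   // let Core Animation drive the redraw loop

// In MyCAOpenGLLayer: called once per display refresh; return NO to skip a frame.
- (BOOL)canDrawInCGLContext:(CGLContextObj)inGlContext
                pixelFormat:(CGLPixelFormatObj)inPixelFormat
               forLayerTime:(CFTimeInterval)timeInterval
                displayTime:(const CVTimeStamp *)timeStamp
{
    AppUpdateLogic();   // update application state here instead of in the timer callback
    return YES;
}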
The downside to this (which is also a problem with your NSTimer approach) is that the CAOpenGLLayer delegate callbacks are triggered on the main thread. If you have something that blocks the main thread, your display will freeze. Likewise, if your OpenGL frame updates take a while, they may cause your UI to be less responsive.
This is what led me to use a CVDisplayLink to trigger updates of my OpenGL content on a background thread. Unfortunately, I saw some rendering artifacts when updating my CAOpenGLLayer that way, so I ended up switching back to an NSOpenGLView. Since then, I've come across a way to potentially avoid those artifacts, but the NSOpenGLView has been fine for our needs, so I haven't switched back again.
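For completeness, the CVDisplayLink approach mentioned above looks roughly like this (a sketch, not production code; renderFrame is a hypothetical method that makes the CGL context current and draws, and glContext/pixelFormat stand for whatever CGL objects you already have). Note that the callback fires on a background thread, so everything it touches must be thread-safe:

static CVReturn DisplayLinkCallback(CVDisplayLinkRef displayLink,
                                    const CVTimeStamp *now,
                                    const CVTimeStamp *outputTime,
                                    CVOptionFlags flagsIn,
                                    CVOptionFlags *flagsOut,
                                    void *userInfo)
{
    // Runs once per display refresh, off the main thread.
    MyMainViewController *controller = (MyMainViewController *)userInfo;
    [controller renderFrame];   // hypothetical: make the context current, update, draw
    return kCVReturnSuccess;
}

// Setup, e.g. where the timer used to be created:
CVDisplayLinkRef displayLink;
CVDisplayLinkCreateWithActiveCGDisplays(&displayLink);
CVDisplayLinkSetOutputCallback(displayLink, &DisplayLinkCallback, self);
CVDisplayLinkSetCurrentCGDisplayFromOpenGLContext(displayLink, glContext, pixelFormat);
CVDisplayLinkStart(displayLink);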
