Poor performance using layer-backed NSOpenGLView on macOS

I'm updating my OS X app to use layer-backed views, so I can have Cocoa views on top of my OpenGL rendering (I haven't added any Cocoa views yet). Without layers, I get 60fps. With layers, less than 15fps. My iOS version, where everything is layer-backed, gets 60fps.
I enabled layer backing thusly:
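Presumably something like this (a minimal sketch, assuming the standard wantsLayer mechanism; glView is a placeholder name for the NSOpenGLView instance):

    // Ask AppKit to back the OpenGL view with a Core Animation layer.
    [glView setWantsLayer:YES];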
I also had to set the layer's background color to black to avoid bad compositing (I use OpenGL blending, so the final alpha values in the buffer aren't correct for CA compositing).
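That step, roughly (another sketch; CGColorGetConstantColor is the CoreGraphics call for the predefined constant colors):

    // Opaque black background, so Core Animation doesn't composite using the
    // incorrect alpha values that OpenGL blending leaves in the buffer.
    glView.layer.backgroundColor = CGColorGetConstantColor(kCGColorBlack);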
According to Instruments, most of the rendering time is spent in CA::Transaction::commit.
I tried clearing the alpha channel as recommended here, but this seems to be just an alternative to setting the layer's background color to black.
FWIW, here are the settings for my OpenGL view:
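The actual attributes aren't reproduced here; a plausible setup for such a view would be (the specific values below are assumptions):

    NSOpenGLPixelFormatAttribute attrs[] = {
        NSOpenGLPFADoubleBuffer,
        NSOpenGLPFAColorSize, 24,
        NSOpenGLPFAAlphaSize, 8,   // an alpha channel, as the question implies
        NSOpenGLPFADepthSize, 24,
        0
    };
    NSOpenGLPixelFormat *pixelFormat =
        [[NSOpenGLPixelFormat alloc] initWithAttributes:attrs];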
I'm on OS X 10.10 Yosemite.

According to posts on the Apple developer forums, this appears to be a bug in OS X 10.10.

Related

Properly handling HighDPI on MacOS with SDL and OpenGL

My attempts at High-DPI rendering on MacOS for my game, Bitfighter, always end up looking bad, like a scaled-up version of a low-res game.
The game uses SDL2 + OpenGL. I have correctly enabled the SDL_WINDOW_ALLOW_HIGHDPI window flag and made the app HighDPI-aware in the Info.plist; this all works, and I get the higher-res title bar just fine. SDL_GL_GetDrawableSize correctly returns a pixel size 2x larger than the window size, but the following techniques for rescaling don't yield good results:
Using glViewport with window coords
Using glViewport with drawable coords, then glOrtho to scale, as suggested in the SDL doc README-ios.md (sketched below)
Both show pixelated vector graphics. What can I do to get OpenGL to draw more sharply with MacOS High-DPI?
Thanks.
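For reference, a minimal sketch of the second technique listed above (window is a placeholder for the SDL_Window; error handling omitted):

    int winW, winH, pxW, pxH;
    SDL_GetWindowSize(window, &winW, &winH);      // logical size, in points
    SDL_GL_GetDrawableSize(window, &pxW, &pxH);   // backing size, in pixels (2x on Retina)
    glViewport(0, 0, pxW, pxH);                   // rasterize into the full pixel buffer
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, winW, winH, 0, -1, 1);             // keep drawing in logical coordinates

If output still looks pixelated with this setup, the usual suspects are 1x-sized texture assets and fixed glLineWidth values that also need scaling by the drawable-to-window ratio.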

Force a retina iPad to use non-retina images

I'm creating an app using OpenGL...
I have a retina iPad, and I'm using a texture that stores @2x images. When my app starts up, the OpenGL framebuffer is created, and querying its size comes back as 1536x2048 - so far so good.
I also have a texture for the non-retina iPad display (768x1024), but I do not have a non-retina iPad. I'd like to force my retina iPad to use the non-retina graphics (i.e., for it to use scale == 1.0), so I'd like it to create a 768x1024 OpenGL framebuffer. The problem is that it always creates a 1536x2048 framebuffer, and scale is always 2.0.
Is there a way of forcing it to use a scale of 1.0 and creating a smaller framebuffer? The base iOS version for the app is 8.0, but since iOS 8.0 still runs on the iPad 2, I'd like to test that resolution as well.
I've tried using UILaunchImages, but that doesn't seem to work. In the past, when an app was written for a non-retina screen, a retina device would scale the lower resolution up to fit the higher-resolution screen; that's what I want - at least so I can test....
Is UILaunchImages the right way to go to try and get iOS to think that only low-res graphics are available?
IIRC, simply having an @2x resource (splash, icon, etc.) used to signal to iOS that the app supported Retina. I think Apple has since added an NSHighResolutionCapable key that you can try setting to false in the plist. So try removing all @2x resources from your build and setting that key to false.
Another approach is to change your glViewport and/or the projection matrix to scale your logical resolution to 1024x768, as sketched below. I have the opposite problem, in that I need to scale non-retina images up to the retina backing scale; I use a scaling matrix in my stack to fix this.
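A rough sketch of that approach under OpenGL ES 1.x (the variable names and exact values are assumptions, not the answerer's code):

    // Draw 768x1024 logical content on the full 1536x2048 retina framebuffer.
    glViewport(0, 0, framebufferWidth, framebufferHeight);  // full pixel size
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrthof(0.0f, 768.0f, 1024.0f, 0.0f, -1.0f, 1.0f);     // logical portrait coords, y-down
    // Drawing code keeps using non-retina coordinates; GL rasterizes at 2x.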
Ask me how I mix and match retina and non-retina images!

OS X: update draw during fullscreen animation

I am currently programmatically enabling fullscreen in an OS X 10.7+ app that uses OpenGL to render its views, via the techniques described in this Apple guide. Is it possible to enable per-frame screen updates during the fullscreen animation? Currently, it seems like a screenshot is taken before and after fullscreen is entered, and there is an automatic alpha fade between the two.
I would like to instead redraw the content at every frame so that there is a smooth fade between the two sizes.
You probably want to look at the Custom Full-Screen Presentation Animations section in the NSWindowDelegate documentation.
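Those delegate methods look roughly like this (a minimal sketch; a real implementation would also handle exiting full screen and any OpenGL viewport resizing):

    // Returning the window itself tells AppKit we'll animate it live,
    // instead of letting it cross-fade between two snapshots.
    - (NSArray *)customWindowsToEnterFullScreenForWindow:(NSWindow *)window {
        return @[window];
    }

    - (void)window:(NSWindow *)window
        startCustomAnimationToEnterFullScreenWithDuration:(NSTimeInterval)duration {
        [NSAnimationContext runAnimationGroup:^(NSAnimationContext *context) {
            context.duration = duration;
            // Animating the real frame drives live redraws at each
            // intermediate size, rather than an alpha fade between screenshots.
            [[window animator] setFrame:window.screen.frame display:YES];
        } completionHandler:nil];
    }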

What happens when you connect non-retina display to MBP with retina?

AFAIK Mac OS X has a unified screen space, so it's unclear to me what happens when you have multiple displays with different backing scale factors. Will the retina display go into low-res mode (so that -[NSScreen backingScaleFactor] returns the lowest value among all attached displays)? If not, how will an app be rendered if it's positioned so that part of it appears on a retina display and the other part appears on a non-retina display?
I'm working through some HiDPI issues at the moment, and I have a non-retina MBP plus a Thunderbolt Display with HiDPI mode turned on.
The connected display has a 2.0 backing scale factor, and when I drag the window to the MBP display (which has normal DPI), the following happens:
When the major part of the window is on the HiDPI display, the whole window renders with a backing scale factor of 2.0.
When the major part moves to the non-HiDPI display, everything is rerendered with a backing scale factor of 1.0, and the newly rendered window is shown on both displays.
Hope this helps.
EDIT: screenshots added.
Screenshots near the displays border (Usual on the Left, Retina on the Right):
backingScaleFactor 1.0:
https://dl.dropbox.com/u/51547223/Backing1.0.png
backingScaleFactor 2.0:
https://dl.dropbox.com/u/51547223/Backing2.0.png
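If an app needs to react to these transitions, a minimal sketch in an NSView subclass (viewDidChangeBackingProperties is the standard 10.7+ hook):

    - (void)viewDidChangeBackingProperties {
        [super viewDidChangeBackingProperties];
        // Returns 1.0 or 2.0, following the majority-display behavior described above.
        CGFloat scale = self.window.backingScaleFactor;
        NSLog(@"Now rendering at backing scale factor %.1f", scale);
        // Recreate scale-dependent resources (framebuffers, cached bitmaps) here.
    }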

Why does this text look "less bold" in OS X 10.5 compared to 10.7?

I built a very simple application with nothing but a single NSTextView in it, in Xcode / Interface Builder. I've done nothing to the text view other than change the font face to "Arial" and increase the font size. However, it looks a lot less bold on OS X 10.5 than it does on 10.7. What's going on here?
(10.5 in the top window, 10.7 in the bottom window)
I've tested with Helvetica and got the same results, so it's not something to do with this specific font.
If you enlarge that image enough, you'll see color fringing on the 10.7 sample. This indicates that the text was rendered with subpixel antialiasing, more specifically using an RGB subpixel ordering. The 10.5 sample uses grayscale antialiasing. While it is generally a good idea to use subpixel rendering, it can look bad on low-resolution screens or CRTs, and it can't be used in a multi-monitor setup where there is more than one subpixel arrangement to contend with. This means that some users will have subpixel rendering enabled for their system, and some won't.

Don't try to circumvent that setting in any way, either by overriding the system preference for font smoothing or by pre-rendering the text into an image. Users are far more likely to notice the one app whose text looks wrong on their system than they are to notice the difference in rendering between two different machines.
