Creating NSOpenGLView inside an NSDocument (OS X) - osx-mountain-lion

I am creating a music-education app that reads in musical scores - not audio files - and will need to present an animated graphical screen. I created a document-based app to make file access easy; it now reads and parses the files, and all the song data is stored in my Obj-C classes. I also have a text view in my xib that I can write song attributes and other text tidbits to. Now I want a second view, which needs to be graphical and animatable, for the music. I am an Xcode novice, but have some OpenGL experience. My setup is the latest OS and Xcode versions.
When I try to drag the OpenGL View into my window in IB, I get a weird error/warning that says "Unsupported Configuration - NSOpenGLView in One Shot memory enabled window" (so that is weird), and the OpenGL view does not appear when I run the app.
I can't find much reference to OpenGL views in NSDocument-based apps on this site, or anywhere else, which makes me think I might be trying to do something that is not meant to be done. Does anyone have any advice for me? Should I not use a document-based app? Should I use something other than OpenGL? Or maybe I need to build the OpenGL view and view controller 100% programmatically in this case? Any advice or pointers to some applicable samples/tutorials would be a huge help.

Try disabling the "One Shot" option in the Window's memory attributes in Interface Builder.
From NSWindow documentation:
setOneShot: Sets whether the window device that the window manages should be freed when it's removed from the screen list.
- (void)setOneShot:(BOOL)oneShot
Parameters
oneShot: YES to free the window's window device when it's removed from the screen list (hidden) and to create another one when it's returned to the screen; NO to reuse the window device.
Discussion
Freeing the window device when it's removed from the screen list can result in memory savings and performance improvement for NSWindow objects that don't take long to display. It's particularly appropriate for NSWindow objects the user might use once or twice but not display continually.
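If you prefer to fix it in code rather than in Interface Builder, here is a minimal sketch for a document-based setup; windowControllerDidLoadNib: is just a convenient hook, and any point after the nib loads would work.

- (void)windowControllerDidLoadNib:(NSWindowController *)aController
{
    [super windowControllerDidLoadNib:aController];
    // NSOpenGLView needs the window's backing device to stay around, so make
    // sure it is not freed when the window leaves the screen list.
    [aController.window setOneShot:NO];
}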

OpenGL Context with Multiple Devices (Monitors)

In OpenGL I implicitly create a graphics context with something like GLUT when I create a window. Suppose I drag my window into a monitor driven by a different video card (e.g. Intel embedded graphics on one and NVidia on another). Who renders the window? I.e. which device runs the graphics pipeline for each of the cases below.
glGetString(GL_RENDERER) seems to always return the primary display's renderer (where the GLUT window was created) even if I drag the window fully onto one monitor or the other. (I am guessing it all gets done by the primary...) Can someone help me understand this?
Note, using Windows 10, GLUT, OpenGL, but I ask the questions in general if it matters.
GL knows nothing about windows, only about contexts. GL renders to the framebuffer in the current context.
You could write code that asks the OS which monitor the window is currently on, create two contexts (one per device), and make the appropriate one current depending on the answer.
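As a quick sanity check, you can log which device actually backs the context you are rendering with. This is plain C/OpenGL and can go right after your GLUT setup; the header path is an assumption for a Windows build.

#include <stdio.h>
#include <GL/gl.h>

/* Call while a context is current on this thread; it reports the device
   that will execute that context's commands, regardless of which monitor
   the window is sitting on. */
static void logCurrentRenderer(void)
{
    printf("vendor:   %s\n", (const char *)glGetString(GL_VENDOR));
    printf("renderer: %s\n", (const char *)glGetString(GL_RENDERER));
}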

High-Level App Design/Architecture

I've done a fair amount of iOS development in the past couple of years, so I'm pretty familiar with iOS architecture and app design (everything's a ViewController that you either push, pop, or stick into tab bars). I've recently started exploring proper Mac app development and feel a little lost. I'd like to really just have a sanity check and maybe some advice as to what the proper way to build an app like this is:
I'd like to build a library-style, single window app, that will spawn additional windows during its operation, but not as full-blown documents. The main window will be laid out much like OS X Lion's Mail.app, with a three-wide split view containing:
A source list, or high-level topic selection
A list view of items pertaining to the topic selected in the first pane
A detail view, which shows the details of the object selected in the middle pane
Like I said, really similar to Mail.app as far as looks go.
My question is really how to glue all this together from inside Xcode. Here's where my confusion lies so far:
The default project generated a NIB with a main menu and window. I like to encapsulate functionality, so should I make a window controller for this window and somehow hook it up in Interface Builder, or does window-specific functionality belong somewhere else?
If possible, I'd like each of my three panes to be a separate view controller. I created three NSViewController subclasses (Xcode automatically generated NIBs for them), and added view controller objects with each class specified to the main menu/window NIB, hooking up each one's view property to one of the three generic Custom View NSView objects I dropped into the NSSplitView. When I tried to set each view controller's NIB, only the main menu/window NIB appeared in the drop-down, and typing the desired one by hand seemed to have no effect (the view's contents didn't actually appear when running the app). This makes me think I'm doing something wrong.
I'm a little fuzzy on what types of views I should use for each of the first two panes. I'll obviously build a custom one for the final pane, but it seems like the first two should be present in the Cocoa framework already.
Anyway, if I'm doing completely the wrong thing, don't bother addressing my questions; just tell me what I should be doing instead. I think I just need a proper Mac developer to point me in the right direction.
With regard to your first question, you don't need to use the main window that Apple supplies in MainMenu.xib. If you want, you are free to delete that window from the nib and then instantiate an NSWindowController in your applicationDidFinishLaunching: delegate method which then loads and controls the main window.
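A minimal sketch of that approach, assuming a window controller subclass and nib both named MainWindowController (both names are hypothetical):

#import <Cocoa/Cocoa.h>
#import "MainWindowController.h"   // hypothetical NSWindowController subclass

@interface AppDelegate : NSObject <NSApplicationDelegate>
@property (strong) MainWindowController *mainWindowController;
@end

@implementation AppDelegate
- (void)applicationDidFinishLaunching:(NSNotification *)notification
{
    // Keep a strong reference so the controller (and its window) stays alive.
    self.mainWindowController =
        [[MainWindowController alloc] initWithWindowNibName:@"MainWindowController"];
    [self.mainWindowController showWindow:self];
}
@end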
You are definitely confused about NSViewController, which is not really all that surprising, since you might assume that it works like UIViewController.
In fact, NSViewController is completely different to UIViewController and does not have the same level of Interface Builder support. You can't place a view controller in a window in IB, for example, whereas this is standard practice on iOS. NSViewController is a relatively new class on the Mac and generally you use it to load views programmatically and manage the view content.
The class that most closely maps to UIViewController on the Mac is NSWindowController. This has been around a lot longer than NSViewController and in fact many Mac apps don't use NSViewController at all.
Generally, each window in your app should have a window controller managing it. You can use subclasses of NSWindowController to handle a lot of the functionality for each window.
If you want to use NSViewController, then you should use your window controller to manage those view controller objects. This is generally done programmatically due to the aforesaid lack of Interface Builder support. Each NSViewController instance loads its view from a specific nib file. You generally don't add view controllers in Interface Builder.
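For example, here is a sketch of what that wiring might look like in the window controller's windowDidLoad; the controller class names, nib names, properties, and the splitView outlet are all assumptions.

- (void)windowDidLoad
{
    [super windowDidLoad];

    self.sourceListController = [[SourceListViewController alloc]
        initWithNibName:@"SourceListViewController" bundle:nil];
    self.itemListController = [[ItemListViewController alloc]
        initWithNibName:@"ItemListViewController" bundle:nil];
    self.detailController = [[DetailViewController alloc]
        initWithNibName:@"DetailViewController" bundle:nil];

    NSArray *controllers = [NSArray arrayWithObjects:self.sourceListController,
        self.itemListController, self.detailController, nil];

    // Drop each controller's view into the matching pane of the split view.
    [controllers enumerateObjectsUsingBlock:^(NSViewController *vc, NSUInteger idx, BOOL *stop) {
        NSView *pane = [self.splitView.subviews objectAtIndex:idx];
        vc.view.frame = pane.bounds;
        vc.view.autoresizingMask = NSViewWidthSizable | NSViewHeightSizable;
        [pane addSubview:vc.view];
    }];
}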
For your source list you would generally use an NSOutlineView if you have multiple sections, or an NSTableView if you don't. These two objects are used whenever you need a list of items: NSOutlineView is hierarchical, whereas NSTableView is flat.
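For the flat case, the data source is only a couple of methods; here is a bare-bones sketch where topics is an assumed NSArray of strings owned by the controller.

- (NSInteger)numberOfRowsInTableView:(NSTableView *)tableView
{
    return (NSInteger)[self.topics count];
}

- (id)tableView:(NSTableView *)tableView
objectValueForTableColumn:(NSTableColumn *)tableColumn
            row:(NSInteger)row
{
    return [self.topics objectAtIndex:(NSUInteger)row];
}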
I hope this helps.

Accessing OSX Scrollbar preferences programmatically

I need to programmatically access the state of some of the settings in System Preferences. In particular the scrollbar settings (for 10.7, whether they're floating or not; for 10.6/10.5, the scroll button placement). I know there are .plist files for this, but I'd much rather access something fast from memory if possible. I'm also curious whether there's a way to be notified when they change, so that I don't have to read them so often.
Read the NSScroller reference. A change in the setting is automatically communicated to all NSScroller instances by calling the appropriate methods (setArrowsPosition: etc.); you just need to override them in your NSScroller subclass.
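If you are on 10.7, here is a sketch of reading the current style and registering for change notifications; the handler selector is hypothetical.

if ([NSScroller respondsToSelector:@selector(preferredScrollerStyle)]) {
    // NSScrollerStyleOverlay is the "floating" style the user can pick on 10.7.
    BOOL overlay = ([NSScroller preferredScrollerStyle] == NSScrollerStyleOverlay);
    NSLog(@"Overlay scrollers: %d", overlay);

    [[NSNotificationCenter defaultCenter]
        addObserver:self
           selector:@selector(scrollerStyleDidChange:)   // hypothetical handler
               name:NSPreferredScrollerStyleDidChangeNotification
             object:nil];
}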

Kiosk Applications - OS X programming - Multiple monitors

I've learnt Cocoa + Objective-C primarily for iPhone development, and I need to utilize this skill set to build a very basic kiosk application for OS X in a couple of days. The application is basically as follows:
The setup has two touch-screen monitors, and the app must run in full-screen mode. The monitor on the right acts as a detail view to a list of options on the left. There are 3 options on the monitor on the left: picking the first will play a movie on the right, picking the second will take you to a quiz, and picking the third will pull up a WebView.
The user may not use any other operations on the PC. (I've started reading about OS X application development and realized Cocoa provides a kiosk mode for these types of apps)
My questions briefly are
Firstly, any help on how to get my app running in a kiosk mode is much appreciated! I'm under a bit of a time crunch (2 days to get all this done, talk about life in startups!), so completely static content is fine. I'm slightly worried about how OS X will handle full-screen mode if the app has been written for a smaller window size (scaling, etc.).
Next, assuming there are two windows, one on each screen, how do I deal with focus? If the user suddenly gets bored with content on the right and touches the window on the left, the first touch will probably act to focus the window and the second will act as a click on the button. I'd like to avoid this scenario!
What are the navigation paradigms in OS X ? I'm guessing it's not as simple as [navigationController pushViewController]? In short, how do I display a new view over an existing view?
Thanks,
Teja
Firstly, any help on how to get my app running in a kiosk mode is much appreciated!
http://developer.apple.com/library/mac/#technotes/KioskMode/
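That technote covers the details; one common way to lock things down on 10.6 and later is NSApplication's presentation options. A rough sketch (set them early, e.g. in applicationDidFinishLaunching:):

NSApplicationPresentationOptions options =
      NSApplicationPresentationHideDock
    | NSApplicationPresentationHideMenuBar
    | NSApplicationPresentationDisableProcessSwitching
    | NSApplicationPresentationDisableForceQuit
    | NSApplicationPresentationDisableSessionTermination
    | NSApplicationPresentationDisableHideApplication;
[NSApp setPresentationOptions:options];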
Next, assuming there are two windows, one on each screen, how do I deal with focus? If the user suddenly gets bored with content on the right and touches the window on the left, the first touch will probably act to focus the window and the second will act as a click on the button. I'd like to avoid this scenario!
Click-through is the default. If you have any custom views, respond to acceptsFirstMouse: with YES to support click-through in them.
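For a custom view, that is a one-line override:

// Let the click that activates the window also reach this view, so the
// user's first touch acts on the content instead of merely focusing the window.
- (BOOL)acceptsFirstMouse:(NSEvent *)theEvent
{
    return YES;
}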
What are the navigation paradigms in OS X ?
Typically either window-based or source-list-based. Your application is atypical.
I'm guessing it's not as simple as [navigationController pushViewController]?
It's simpler and more complex at the same time. There is no stack to manage; you can have multiple windows up at the same time. It gets more complex when you want everything in one window (as in your kiosk-mode app), in which case you end up using tab views (with or without tabs) to enable the user to switch from one view to another.
In short, how do I display a new view over an existing view?
You don't. Layering one view over another in the same superview is barely supported at all in AppKit, and almost always wrong.
In a normal application, you should make multiple windows. In an app like yours, you'll need to use tab views. View controllers may help you here, although NSViewControllers are very different from UIViewControllers (as I mentioned, no view stack); they're more similar to NSWindowControllers.
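Here is a sketch of the tab-view approach for the kiosk's right-hand window: put the movie view, the quiz view, and the web view on three items of a tabless NSTabView and switch in code. The outlet, the item ordering, and the use of button tags are all assumptions.

- (IBAction)optionPicked:(NSButton *)sender
{
    // The three buttons on the left-hand screen have their tags set to 0, 1, 2 in IB.
    [self.contentTabView selectTabViewItemAtIndex:sender.tag];
}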

Change cursor in GL fullscreen on OSX using cocoa?

Can anyone provide a sample / link to a sample Cocoa app that changes the 'hardware' cursor in a fullscreen OpenGL Cocoa app? I have been able to create a full screen GL app and an app that changes the cursor by overriding NSView::resetCursorRects but I have not been able to get both to work simultaneously. I've also refitted some of the Apple GL samples (CocoaGL, Custom Cocoa OpenGL, etc) by overriding NSView::resetCursorRects and I haven't been able to get the cursor to change in fullscreen in them either. I have the book "OpenGL Programming on Mac OS X" which also avoids the problem.
@Christopher: I hadn't tried [NSCursor set]. Good call, but I made a run at it and no luck; it still reverts to the system cursor. I'd say that perhaps something is overriding it in my calls that switch to fullscreen, but I've actually tried resetting the mouse cursor in my NSView's draw routine (which gets called repeatedly) and the cursor never switches from the system default.
Try using NSCursor directly; the NSView cursor-rect methods depend on things such as a properly sized and visible NSWindow to work properly, which isn't necessarily the case in full screen mode.
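A minimal sketch of the direct approach, bypassing cursor rects entirely; the image name and hot spot are hypothetical, and you may need to re-apply the cursor if something in your fullscreen switch resets it.

NSImage *image = [NSImage imageNamed:@"Crosshair"];   // hypothetical asset
NSCursor *cursor = [[NSCursor alloc] initWithImage:image
                                           hotSpot:NSMakePoint(15.0, 15.0)];
[cursor set];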
