Enable interaction with NSWindow when sheet is opened - cocoa

I'm creating an app that presents a sheet; however, interaction with the window must stay enabled while the sheet is open. Here's a mock-up:
The user must be able to use the play and record buttons. Does anyone know a way to keep them enabled?

This seems like slightly questionable UI. But if you really want to do it, I think the only solution will be to either:
Subclass NSWindow to force handling of the events (see the sketch after this list), or
Run the event loop for that window while the sheet is visible, and dispatch the desired events yourself.
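For the first approach, here's a minimal, untested sketch. It assumes the parent window still receives the raw events while the sheet is attached, and uses a hypothetical passthroughViews whitelist that is not part of AppKit:

    #import <Cocoa/Cocoa.h>

    @interface PassthroughWindow : NSWindow
    // Hypothetical whitelist of views (e.g. the play/record buttons)
    // that should keep working while a sheet is attached.
    @property (nonatomic, strong) NSArray<NSView *> *passthroughViews;
    @end

    @implementation PassthroughWindow

    - (void)sendEvent:(NSEvent *)event
    {
        if (self.attachedSheet && event.type == NSEventTypeLeftMouseDown) {
            // Convert from window coordinates into the content view's
            // superview space, which is what -hitTest: expects.
            NSPoint point = [self.contentView.superview convertPoint:event.locationInWindow
                                                            fromView:nil];
            NSView *hit = [self.contentView hitTest:point];
            for (NSView *view in self.passthroughViews) {
                if (hit == view || [hit isDescendantOf:view]) {
                    [hit mouseDown:event]; // deliver the click directly
                    return;
                }
            }
        }
        [super sendEvent:event];
    }

    @end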

Sheets are intentionally designed to block interaction with the window they're attached to. If you don't want that behavior, you shouldn't be using a sheet.

Related

Controlling the "Open in ..." button in the Quick Look (preview) panel

The QLPreviewPanel window has a button that will allow the user to open the quick look document they are currently previewing by launching its original application.
Is it possible to (a) disable this button for some documents and (b) learn whether the user has clicked that button?
My problem is that some of the QLPreviewItem objects I'm passing to QLPreviewPanel are actually placeholders that aren't intended to be opened, while others are temporary documents that get created spontaneously.
In the latter case, I normally delete these when the preview is done, but obviously I don't want to do this if the user has opened them in an application.
I've looked at the API for QLPreviewItem, QLPreviewPanel, and QLPreviewPanelDelegate and don't see any notifications or messages that occur when the user opens an item.
If there's no API, I might just try to hack the UI by searching the QLPreviewPanel for an NSButton and hooking its action, but I don't like hacks and I'm sure this would be a fragile one.
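For what it's worth, that hack would look something like the following untested sketch, which recursively searches the panel's view hierarchy for an NSButton and re-points its action at your own object. The view hierarchy is private, so this could break with any OS update, and the first button found may not even be the "Open in..." button:

    #import <Quartz/Quartz.h>

    // Recursively search a view hierarchy for the first NSButton.
    static NSButton *FindButton(NSView *view)
    {
        if ([view isKindOfClass:[NSButton class]]) {
            return (NSButton *)view;
        }
        for (NSView *subview in view.subviews) {
            NSButton *button = FindButton(subview);
            if (button) {
                return button;
            }
        }
        return nil;
    }

    // Hook the button's action; `interceptor` and `openClicked:` are
    // hypothetical names for your own object and selector.
    static void HookOpenButton(QLPreviewPanel *panel, id interceptor)
    {
        NSButton *button = FindButton(panel.contentView);
        button.target = interceptor;
        button.action = @selector(openClicked:);
    }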

Status Item blocking the main thread (NSMenu blocking NSSpeechRecognizer from detecting sound)

I have an NSMenu coming down under an NSStatusItem. I also have an NSSpeechRecognizer. When the NSMenu is open, the speech recognizer does not function properly: it constantly shows that it's receiving sound until I close the menu. I need it to detect sound properly even while the menu is open.
How can I make the speech recognizer detect sound even while the menu is open? Does it need to become a "first responder" and take precedence over the menu?
I tried setting [speechRecognizer setListensInForegroundOnly: NO] and it still won't work.
If you don't understand, I am more than happy to provide clarification.
Here are some similar situations, but I don't yet fully understand them.
The problem is most likely that the menu is running a modal run loop as long as it is open (for the purposes of tracking the mouse, etc...) and this is blocking the NSSpeechRecognizer's ability to function normally.
You can confirm this by bringing up the menu and then pausing in the debugger. You'll likely see two run loops: the outer, normal one, and another deeper down the stack that is running the modal loop.
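You can see the same run-loop effect with something as simple as a timer. This untested sketch (using the block-based NSTimer API, here purely for illustration) stops logging while a menu is open, because it's scheduled only in the default mode:

    // A timer scheduled only in NSDefaultRunLoopMode stops firing while
    // a menu is tracking, because menu tracking runs the run loop in a
    // different (event-tracking) mode.
    NSTimer *timer = [NSTimer timerWithTimeInterval:0.5
                                            repeats:YES
                                              block:^(NSTimer *t) {
        NSLog(@"tick");
    }];
    [[NSRunLoop currentRunLoop] addTimer:timer forMode:NSDefaultRunLoopMode];

    // Adding it for NSRunLoopCommonModes instead keeps it firing during
    // menu tracking -- but NSSpeechRecognizer exposes no equivalent
    // scheduling hook, hence the suggestion below to avoid the menu.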
In general, this is kind of an odd thing to do from a user interaction perspective. The whole point of a pop-up menu is to offer the user some commands that will be done after the corresponding menu item is selected.
If you really need "click this thing and recognize voice", I'd recommend a button that pops up a bit of UI and then interacts with the speech recognizer, without using a menu.

How do I create a custom modal NSWindow?

I want to create a custom NSWindow that acts as a modal dialog. By custom I mean it has normal user controls in the window, with "OK" and "Cancel" buttons. The dialog will contain read-only information and have a few checkboxes, secure edit fields, etc.
The MainMenu.xib file will have the normal Window visible at launch, plus include the custom NSWindow (which is NOT visible at launch).
I am trying to find example code to launch the window in modal mode (after the app initializes and launches the main window), run a process on "OK", and on success of that process hide the dialog. Or, on failure, keep the dialog up but show an error sheet on the dialog.
Any help is appreciated, thanks.
You want to look at NSApplication’s -runModalForWindow: and/or -runModalSession: methods. Note that using modal windows is generally a bad idea and if it’s at all possible to avoid doing so, you should; that said, sometimes needs must.
As far as launching a process, waiting for it to finish and so on, you can probably do what you need with NSTask, although you don’t provide sufficient detail to be certain. You’d probably want to observe NSTaskDidTerminateNotification to tell you when the task had finished.
See
http://developer.apple.com/library/mac/#documentation/Cocoa/Conceptual/OperatingSystem/OperatingSystem.html
for more on NSTask and
http://developer.apple.com/library/mac/#documentation/Cocoa/Conceptual/WinPanel/Concepts/UsingModalWindows.html%23//apple_ref/doc/uid/20000223-CJBEADBA
for more about modal NSWindow usage.
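Putting those pieces together, here's a rough, untested sketch; the dialogWindow outlet, the action names, and the launch path are placeholders for your own:

    // Assumes an IBOutlet `dialogWindow` wired to the custom window in
    // MainMenu.xib (with "Visible At Launch" unchecked).
    - (IBAction)showDialog:(id)sender
    {
        [NSApp runModalForWindow:self.dialogWindow];
        [self.dialogWindow orderOut:nil];
    }

    - (IBAction)okPressed:(id)sender
    {
        NSTask *task = [[NSTask alloc] init];
        task.launchPath = @"/usr/bin/true"; // placeholder for the real process

        [[NSNotificationCenter defaultCenter]
            addObserverForName:NSTaskDidTerminateNotification
                        object:task
                         queue:[NSOperationQueue mainQueue]
                    usingBlock:^(NSNotification *note) {
            if (task.terminationStatus == 0) {
                // Success: end the modal session. Note -stopModal doesn't
                // take effect from a run-loop callout like this block;
                // -abortModal does.
                [NSApp abortModal];
            } else {
                // Failure: keep the dialog up, show an error sheet on it.
                NSAlert *alert = [[NSAlert alloc] init];
                alert.messageText = @"The process failed.";
                [alert beginSheetModalForWindow:self.dialogWindow
                              completionHandler:nil];
            }
        }];
        [task launch];
    }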
Have a look at NSApplication's -runModalForWindow: method, and "Using Application-Modal Dialogs."

Cocoa accessibility API, can I click a window in the background without activating it?

I've been searching forever for a solution to this, so I thought I'd seek out the brainpower of greater minds than mine. I'm developing a Cocoa app that uses the Accessibility API to manipulate another program (it's a hotkey app). The app I'm controlling typically has multiple windows open, with some hidden behind others. What I would like to do, if it's possible, is to send mouse events to windows using the Accessibility API in a way that presses a button in the window without bringing it to the foreground (interact with the window but don't activate it). The reason I'm trying to do this is that sending the mouse event to this other window will force it to the foreground and disrupt the user's interaction with the foremost window.
This is apparently possible on Windows - apps similar to mine do it there - but I get the feeling that this isn't possible with Cocoa, given the way the window manager works. Am I mistaken?
Accessibility is higher-level than that. You send, for example, AXPress actions to AXButton objects, but “press” is not necessarily a click—pressing the space bar while a view is focused, for example, is also a “press”. AXPress is a high-level action that means “do your thing”, which obviously has meaning for some views (such as buttons) and not others (such as fields).
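To make that concrete, here's a hedged sketch of sending AXPress to a button in another process with the C-level Accessibility API; the pid and the assumption that a button sits directly under the main window are placeholders:

    #import <ApplicationServices/ApplicationServices.h>

    // Press the first AXButton child of another app's main window.
    // Requires "Enable access for assistive devices" to be checked.
    static void PressFirstButton(pid_t pid)
    {
        AXUIElementRef app = AXUIElementCreateApplication(pid);

        CFTypeRef window = NULL;
        AXUIElementCopyAttributeValue(app, kAXMainWindowAttribute, &window);

        CFArrayRef children = NULL;
        if (window) {
            AXUIElementCopyAttributeValue((AXUIElementRef)window,
                                          kAXChildrenAttribute,
                                          (CFTypeRef *)&children);
        }

        for (CFIndex i = 0; children && i < CFArrayGetCount(children); i++) {
            AXUIElementRef child =
                (AXUIElementRef)CFArrayGetValueAtIndex(children, i);
            CFTypeRef role = NULL;
            AXUIElementCopyAttributeValue(child, kAXRoleAttribute, &role);
            Boolean isButton = role && CFEqual(role, kAXButtonRole);
            if (role) CFRelease(role);
            if (isButton) {
                // "Do your thing" -- no mouse event is synthesized.
                AXUIElementPerformAction(child, kAXPressAction);
                break;
            }
        }
        if (children) CFRelease(children);
        if (window) CFRelease(window);
        CFRelease(app);
    }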
Accessibility activating the application does make sense when you look at it from its intended purpose: Assistive devices for disabled users. If the user “presses” something by whatever means, they probably intend to activate the application and work in it.
Quartz Event Services will get you almost there: You can create an event tap for the process you want to control, and you can forge events and send them to a tap. The catch is that you can only send events to a tap when the tap fires—i.e., when the application already has an event to deal with. When it doesn't, you're stuck.

How do I get keyboard events in an NSStatusWindowLevel window while my application is not frontmost?

After creating a translucent window (based on example code by Matt Gemmell) I want to get keyboard events in this window. It seems that there are only keyboard events when my application is the active application while I want keyboard events even when my application isn't active but the window is visible.
Basically I want behavior like that provided by the Quicksilver application (by blacktree).
Does anybody have any hints on how to do this?
There are two options:
Use GetEventMonitorTarget() with a tacked-on Carbon run loop to grab keyboard events. Sample code is available on this page at CocoaDev.
Register an event trap with CGEventTapCreate. Sample code can be found in this thread from the Apple developer mailing list.
Edit: Note that these methods only work if you check off “Enable access for assistive devices” in the Universal Access preference pane.
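For the second option, a minimal listen-only key tap might look like this untested sketch (as noted above, it needs assistive-device access, or root, on recent systems):

    #import <Foundation/Foundation.h>
    #import <ApplicationServices/ApplicationServices.h>

    // Listen-only keyboard tap. Creation returns NULL without
    // sufficient privileges.
    static CGEventRef TapCallback(CGEventTapProxy proxy, CGEventType type,
                                  CGEventRef event, void *refcon)
    {
        if (type == kCGEventKeyDown) {
            int64_t keycode = CGEventGetIntegerValueField(event,
                                                          kCGKeyboardEventKeycode);
            NSLog(@"key down: %lld", (long long)keycode);
        }
        return event; // pass the event through unchanged
    }

    static void InstallKeyTap(void)
    {
        CFMachPortRef tap = CGEventTapCreate(kCGSessionEventTap,
                                             kCGHeadInsertEventTap,
                                             kCGEventTapOptionListenOnly,
                                             CGEventMaskBit(kCGEventKeyDown),
                                             TapCallback, NULL);
        if (!tap) return;

        CFRunLoopSourceRef source =
            CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
        CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);
        CGEventTapEnable(tap, true);
        CFRelease(source);
    }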
A simpler route that may work better for you is to make your app background-only. The discussion on CocoaDev of the LSUIElement plist key explains how to set it up. Basically, your application will not appear in the Dock or the app switcher, and will not replace the current application's menu bar when activated. From a user perspective it's never the 'active' application, but any windows you open can get activated and respond to events normally. The only caveat is that you'll never get to show your menu bar, so you'll probably have to set up an NSStatusItem (one of those icon menus that show up on the right side of the menu bar) to control (e.g. quit, bring up prefs) your application.
Edit: I completely forgot about the Non-Activating Panel checkbox in Interface Builder. You need to use an NSPanel instead of an NSWindow to get this choice. This setting lets your panel accept clicks and keyboard input without activating your application. I'm betting that some mix of this setting and the Carbon Hot Keys API is what QuickSilver is using for their UI.
Update:
Apple actually seems to have changed everything again starting with 10.5, by the way (I recently upgraded, and my sample code did not work as before).
Now you can only capture keydown events with an event tap if you are either root or assistive devices are enabled, regardless of the level at which you plan to capture, and regardless of whether you create the tap to actively filter (which allows you to modify and even discard events) or to listen only. You can still be notified when modifier flags change (and actually even change them) and receive other events, but keydown events under no other circumstances.
However, using a Carbon event handler and the function RegisterEventHotKey() allows you to register a hotkey, and you'll get notified when it is pressed; you neither need to be root for that, nor do you need anything like assistive devices enabled. I think Quicksilver is probably doing it that way.
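A minimal, untested sketch of that route (the key choice and the 'htk1' signature are arbitrary placeholders):

    #import <Foundation/Foundation.h>
    #import <Carbon/Carbon.h>

    // Called whenever a registered hotkey is pressed.
    static OSStatus HotKeyHandler(EventHandlerCallRef nextHandler,
                                  EventRef event, void *userData)
    {
        EventHotKeyID pressed;
        GetEventParameter(event, kEventParamDirectObject, typeEventHotKeyID,
                          NULL, sizeof(pressed), NULL, &pressed);
        NSLog(@"hotkey %u pressed", (unsigned)pressed.id);
        return noErr;
    }

    // Register Cmd-Shift-Space (virtual key code 49) as a global hotkey.
    static void InstallHotKey(void)
    {
        EventTypeSpec spec = { kEventClassKeyboard, kEventHotKeyPressed };
        InstallApplicationEventHandler(&HotKeyHandler, 1, &spec, NULL, NULL);

        EventHotKeyID hotKeyID = { 'htk1', 1 };
        EventHotKeyRef hotKeyRef;
        RegisterEventHotKey(49, cmdKey | shiftKey, hotKeyID,
                            GetApplicationEventTarget(), 0, &hotKeyRef);
    }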
