How to disable all user interaction for NSApplication/NSWindow without blocking the main thread? - cocoa

I have a Cocoa app that presents a folder structure to the user and allows basic file system operations such as move/copy/rename.
I perform all file system access on a background queue and use file coordination via NSFileCoordinator.
Let's imagine the user drags the file "Notes.txt" into the folder "Project B". Here is what I do:
I schedule the file operation of moving the file on the background queue.
Once that's done, it calls a block on the main queue where I update the outline view.
Let's assume the background operation moving the file takes some time. I need to prevent the user from moving more files around or performing other actions until the first operation completes.
Hence I would like to:
Disable all user interaction with the window
Disable most menu items and keyboard shortcuts
Keep certain menus such as [Quit Cmd-Q] working
Don't block the main thread
One solution that comes to mind: use a modal sheet.
I don't want to do this, because most of the time, the operation will finish quickly and the sheet would only be shown for a fraction of a second, which is distracting.
I could keep track of how long the operation takes and switch from "interaction blocked mode" to a modal sheet if it takes more than 500 ms.
The question is: how can I implement such a "user interaction blocked mode" without actually presenting a modal sheet?
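Concretely, the escalation I have in mind would look something like this (just a sketch; beginBlockingUI(), endBlockingUI(), showProgressSheet() and dismissProgressSheet() are placeholder helpers, not real API):

    func perform(fileOperation: @escaping () -> Void, completion: @escaping () -> Void) {
        beginBlockingUI()                               // enter "interaction blocked mode" right away
        var sheetShown = false

        // Escalate to a modal sheet only if the operation is still running after 500 ms.
        let escalate = DispatchWorkItem {
            sheetShown = true
            showProgressSheet()
        }
        DispatchQueue.main.asyncAfter(deadline: .now() + .milliseconds(500), execute: escalate)

        DispatchQueue.global(qos: .userInitiated).async {
            fileOperation()                             // the coordinated move/copy/rename
            DispatchQueue.main.async {
                escalate.cancel()                       // finished before the deadline: no sheet at all
                if sheetShown { dismissProgressSheet() }
                endBlockingUI()
                completion()                            // e.g. update the outline view
            }
        }
    }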
Alternative idea: disabling all controls
I considered setting isEnabled = false for all controls, but that's not an option, because the UI can be arbitrarily complex and it won't prevent actions via the menu or keyboard shortcuts.

You may find the Responder Chain article in Apple's documentation helpful. The proper solution depends on what exactly you need. Implementing - (BOOL)validateMenuItem:(NSMenuItem *)menuItem; in your menu items' targets gives you control over which menu items (and their keyboard shortcuts) are enabled. You can also put a transparent NSView over the area you want to shield from user interaction.
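For example, a minimal Swift sketch of both techniques (operationInProgress is an assumed flag you maintain yourself):

    import Cocoa

    class DocumentWindowController: NSWindowController, NSMenuItemValidation {
        var operationInProgress = false

        // Menu items whose actions are routed to this controller are disabled while a
        // file operation runs; Quit targets NSApp, so it never passes through here.
        func validateMenuItem(_ menuItem: NSMenuItem) -> Bool {
            return !operationInProgress
        }
    }

    // A transparent view that swallows clicks; lay it over the content view while blocked.
    class EventBlockingView: NSView {
        override func hitTest(_ point: NSPoint) -> NSView? { return self } // claim every hit
        override func mouseDown(with event: NSEvent) {}                     // and ignore it
    }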

Related

Implementing a Custom Cocoa Event Tracking Loop

I'm working on a custom cross platform UI library that needs a synchronous "ShowPopup" method that shows a popup, runs an event loop until it's finished and automatically cancels when clicking outside the popup or pressing escape. Keyboard, mouse and scroll wheel events need to be dispatched to the popup but other events (paint, draw, timers etc...) need to be dispatched to their regular targets while the loop runs.
Edit: for clarification, by popup, I mean this kind of menu style popup window, not an alert/dialog etc...
On Windows I've implemented this fairly simply by calling GetMessage/DispatchMessage and filtering and dispatching messages as appropriate. Works fine.
I have much less experience with Cocoa/OS X, however, and I'm finding the whole event loop/dispatch paradigm a bit confusing. I've seen the following article, which explains how to implement a mouse tracking loop that is very similar to what I need:
http://stpeterandpaul.ca/tiger/documentation/Cocoa/Conceptual/EventOverview/HandlingMouseEvents/chapter_5_section_4.html
but... there's some things about this that concern me.
The linked article states: "the application’s main thread is unable to process any other requests during an event-tracking loop and timers might not fire". Might not? Why not, when not, how to make sure they do?
The docs for nextEventMatchingMask:untilDate:inMode:dequeue: state "events that do not match one of the specified event types are left in the queue.". That seems a little odd. Does this mean that if an event loop only asks for mouse events then any pressed keys will be processed once the loop finishes? That'd be weird.
Is it possible to peek at a message in the event queue without removing it? E.g., the Windows version of my library uses this to close the popup when it's clicked outside, but leaves the click event in the queue so that clicking outside the popup on another button doesn't require a second click.
I've read and re-read about run loop modes but still don't really get it. A good explanation of what these are for would be great.
Are there any other good examples of implementing an event loop for a popup. Even better would be pseudo-code for what the built in NSApplication run loop does.
Another way of putting all this... what's the Cocoa equivalent of Windows' PeekMessage(..., PM_REMOVE), PeekMessage(..., PM_NOREMOVE) and DispatchMessage().
Any help greatly appreciated.
What exactly is a "popup" as you're using the term? That term means different things in different GUI APIs. Is it just a modal dialog window?
Update for edits to question:
It seems you just want to implement a custom menu. Apple provides a sample project, CustomMenus, which illustrates that technique. It's a companion to one of the WWDC 2010 session videos, Session 145, "Key Event Handling in Cocoa Applications".
Depending on exactly what you need to achieve, you might want to use an NSAlert. Alternatively, you can use a custom window and just run it modally using the -runModalForWindow: method of NSApplication.
To meet your requirement of ending the modal session when the user clicks outside of the window, you could use a local event monitor. There's even an example of just such functionality in the (modern, current) Cocoa Event Handling Guide: Monitoring Events.
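A rough sketch of that combination (popupWindow is assumed to be your custom borderless window):

    import Cocoa

    var monitor: Any?
    monitor = NSEvent.addLocalMonitorForEvents(matching: [.leftMouseDown, .rightMouseDown]) { event in
        if event.window != popupWindow {
            NSApp.stopModal(withCode: .abort)   // click landed outside the popup: end the session
            return nil                          // swallow the click (return event to let it through instead)
        }
        return event
    }

    _ = NSApp.runModal(for: popupWindow)        // blocks here until stopModal is called
    if let monitor = monitor { NSEvent.removeMonitor(monitor) }
    popupWindow.orderOut(nil)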
All of that said, here are (hopefully no longer relevant) answers to your specific questions:
The linked article states: "the application’s main thread is unable to process any other requests during an event-tracking loop and timers might not fire". Might not? Why not, when not, how to make sure they do?
Because timers are scheduled in a particular run loop mode or set of modes. See the answer to question 4, below. You would typically use the event-tracking mode when running an event-tracking loop, so timers which are not scheduled in that mode will not run.
You could use the default mode for your event-tracking loop, but it really isn't a good idea. It might cause unexpected re-entrancy.
Assuming your pop-up is similar to a modal window, you should probably use NSModalPanelRunLoopMode.
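If you need a timer to keep firing while such a loop runs, one option is to schedule it explicitly in those modes, e.g.:

    import Cocoa

    // A timer only fires in the run-loop modes it is added to; adding it to the
    // event-tracking and modal-panel modes keeps it running during those loops.
    let timer = Timer(timeInterval: 0.1, repeats: true) { _ in
        // periodic work that must survive tracking/modal loops
    }
    RunLoop.current.add(timer, forMode: .default)
    RunLoop.current.add(timer, forMode: .eventTracking)
    RunLoop.current.add(timer, forMode: .modalPanel)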
The docs for nextEventMatchingMask:untilDate:inMode:dequeue: state "events that do not match one of the specified event types are left in the queue.". That seems a little odd. Does this mean that if an event loop only asks for mouse events then any pressed keys will be processed once the loop finishes? That'd be weird.
Yes, that's what it means. It's up to you to prevent that weird outcome. If you were to read a version of the Cocoa Event Handling Guide from this decade, you'd find there's a section on how to deal with this. ;-P
Is it possible to peek at a message in the event queue without removing it? E.g., the Windows version of my library uses this to close the popup when it's clicked outside, but leaves the click event in the queue so that clicking outside the popup on another button doesn't require a second click.
Yes. Did you notice the "dequeue:" parameter of nextEventMatchingMask:untilDate:inMode:dequeue:? If you pass NO for that, then the event is left in the queue.
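For example (a sketch; popupWindow and closePopup() are placeholders):

    // Peek at a pending mouse-down without removing it from the queue.
    if let peeked = NSApp.nextEvent(matching: [.leftMouseDown, .rightMouseDown],
                                    until: Date.distantPast,      // don't wait, just look
                                    inMode: .eventTracking,
                                    dequeue: false) {             // leave it in the queue
        if peeked.window != popupWindow {
            closePopup()   // the click stays queued, so the control under the cursor still gets it
        }
    }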
I've read and re-read about run loop modes but still don't really get it. A good explanation of what these are for would be great.
It's hard to know what to tell you without knowing what you're confused about and how the Apple guide failed you.
Are you familiar with handling multiple asynchronous communication channels using a loop around select(), poll(), epoll(), or kevent()? It's kind of like that, but a bit more automated. Not only do you build a data structure which lists the input sources you want to monitor and what specific events on those input sources you're interested in, but each input source also has a callback associated with it. Running the run loop is like calling one of the above functions to wait for input but also, when input arrives, calling the callback associated with the source to handle that input. You can run a single turn of that loop, run it until a specific time, or even run it indefinitely.
With run loops, the input sources can be organized into sets. The sets are called "modes" and identified by name (i.e. a string). When you run a run loop, you specify which set of input sources it should monitor by specifying which mode it should run in. The other input sources are still known to the run loop, but just ignored temporarily.
The -nextEventMatchingMask:untilDate:inMode:dequeue: method is, more or less, running the thread's run loop internally. In addition to whatever input sources were already present in the run loop, it temporarily adds an input source to monitor events from the windowing system, including mouse and key events.
Are there any other good examples of implementing an event loop for a popup. Even better would be pseudo-code for what the built in NSApplication run loop does.
There's old Apple sample code, which is actually their implementation of GLUT. It provides a subclass of NSApplication and overrides the -run method. When you strip away some stuff that's only relevant for application start-up or GLUT, it's pretty simple. It's just a loop around -nextEventMatchingMask:... and -sendEvent:.
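In Swift, the core of such a tracking loop might look roughly like this (a sketch only, not NSApplication's actual implementation; popupIsOpen and popupWindow are assumed state):

    while popupIsOpen {
        guard let event = NSApp.nextEvent(matching: .any,
                                          until: Date.distantFuture,
                                          inMode: .eventTracking,
                                          dequeue: true) else { continue }
        switch event.type {
        case .leftMouseDown, .rightMouseDown:
            if event.window != popupWindow {
                popupIsOpen = false        // click outside: close the popup
                // (peek with dequeue: false first if the click should stay in the queue)
            } else {
                NSApp.sendEvent(event)     // clicks inside go to the popup as usual
            }
        case .keyDown where event.keyCode == 53:   // Escape
            popupIsOpen = false
        default:
            NSApp.sendEvent(event)         // everything else: normal dispatch
        }
    }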

Cocoa: listening for key events and responding to them without a view

First of all hi guys!
I was trying to write a mouse controller app for Mac OS X that reads input from the keyboard and moves the mouse accordingly. By "garbage input" I mean input that was intended as a mouse command but instead creates text on screen.
Before anyone points out that there is a built-in one: it was laggy even on the shortest lag setting and cannot register more than two buttons at the same time (you have to press the diagonal keys to move diagonally). If you accidentally press another button, your motion stops when you release the accidental one. My first and last reaction was "rubbish!". Adding customization and extra features is my goal.
I want to create a key combination that, while held, blocks the garbage input from being passed to other programs. But global monitoring seems to always pass the event along, and unfortunately I end up seeing text like qqqqqqqwwwwwww in unwanted places.
I want pressing q, w and up to make the mouse go up, but I create a qqqqqqqwwwwww mess along the way. My first idea was to create a view in a popover and handle events there, but seeing a popover whenever I want to drive the mouse from the keyboard is annoying, and I couldn't find a way to show the popover without leaving any garbage keyboard input.
What should I do in this situation?
You will want to use Quartz Event Taps. Note that for an application to tap keyboard events, it has to be trusted for accessibility (as in System Preferences > Security & Privacy > Privacy > Accessibility). Your app can ask to be made trusted using AXIsProcessTrustedWithOptions().
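For instance, the trust check might look like this (a sketch):

    import ApplicationServices

    // Ask to be trusted for accessibility, prompting the user if we aren't yet.
    let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
    if !AXIsProcessTrustedWithOptions(options) {
        // Not trusted yet: the event tap won't see keyboard events until the user
        // grants access under Security & Privacy > Privacy > Accessibility.
    }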

block ALL keyboard access, mouse access and keyboard shortcut events

In order to block ALL keyboard access, mouse access and keyboard shortcut events in one of my projects, I did the following (a rough sketch appears after the list):
Created a full-screen, transparent, borderless window, in front of all other windows but invisible.
Handled all keyboard and mouse events in the window itself with a simple return;.
Made the window modal with [NSApp runModalForWindow:myWindow] in order to block keyboard shortcuts.
Released the window only from the touchpad's gesture events.
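Roughly, in Swift, the setup looks like this:

    import Cocoa

    // Transparent, borderless, screen-sized window that swallows keyboard and mouse events.
    class BlockingWindow: NSWindow {
        override var canBecomeKey: Bool { return true }    // borderless windows refuse key status by default
        override func keyDown(with event: NSEvent) {}       // swallow keyboard input
        override func mouseDown(with event: NSEvent) {}     // swallow clicks
    }

    let blocker = BlockingWindow(contentRect: NSScreen.main?.frame ?? .zero,
                                 styleMask: .borderless,
                                 backing: .buffered,
                                 defer: false)
    blocker.isOpaque = false
    blocker.backgroundColor = .clear
    blocker.level = .screenSaver                             // in front of other windows
    blocker.makeKeyAndOrderFront(nil)

    _ = NSApp.runModal(for: blocker)                         // modal session also blocks menu shortcuts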
But this guy made it look simple in a tiny app, MACIFIER:
How did he do it?
Not really sure if this would be usable, but you could use the program HotkeyNet (generally used for gaming, but I have had success using it in other ways) and map every single key/mouse action to do nothing. I did something similar by blocking access to a specific program with it in about 20-30 minutes.
Not sure if it will help, but it might be the solution you need?
I believe you can use Quartz Event Services. In particular, have a look at CGEventTapCreate, and note the 4th parameter, which allows you to specify what kinds of events you'd like to intercept. The available kinds of events are listed in the CGEventType enum.
If you set your tap to be an active filter, returning NULL from the callback will delete the event.
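A sketch of such a filter tap (blockingEnabled is an assumed global that your key-combination handling would toggle):

    import Cocoa

    var blockingEnabled = false   // toggled by your key-combination handling

    let mask = CGEventMask(1 << CGEventType.keyDown.rawValue) | CGEventMask(1 << CGEventType.keyUp.rawValue)
    let tap = CGEvent.tapCreate(tap: .cgSessionEventTap,
                                place: .headInsertEventTap,
                                options: .defaultTap,               // active filter: may modify or delete events
                                eventsOfInterest: mask,
                                callback: { _, _, event, _ in
                                    // Returning nil deletes the event; returning it passes it on unchanged.
                                    return blockingEnabled ? nil : Unmanaged.passUnretained(event)
                                },
                                userInfo: nil)

    if let tap = tap {
        let source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0)
        CFRunLoopAddSource(CFRunLoopGetCurrent(), source, .commonModes)
        CGEvent.tapEnable(tap: tap, enable: true)
    }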

Is it possible to design Dialogs within a Dialog, both created with a resource editor?

Is it possible to create a dialog resource with a resource editor and then put this dialog (possibly multiple times) into another dialog?
Here's some background. I need to create a C++ program (Windows). The user needs to input a set of similar data on a dialog. Say, for simplicity's sake, an element of this data set consists of an edit control and a scrollbar. Since this combination (edit + scrollbar) needs to be put onto the dialog for each element of the data set, I thought I could create a simple dialog with just one edit control and one scrollbar, and then put this dialog multiple times onto its "parent" dialog.
So, is this possible at all? Any pointers will be greatly appreciated.
Yes, you can do this.
In the dialog editor, set the "Control Parent" flag on the parent dialog. (This will ensure the tab key works to cycle through items in the child dialogs as if they were part of the parent dialog.)
Make sure the child dialog(s) have the "Child" flag set in the dialog editor. Visually, they'll look like dialogs without any border at all in the editor.
At runtime, create the child dialogs as children of the parent dialog using CreateDialog (or CreateDialogParam, etc.). When calling CreateDialog you specify the dialogproc for each window.
I often make the child dialog procs do little more than forward messages to the main window's dialog proc (calling it directly; not via SendMessage), but you have to be careful, obviously. You have to be especially careful if you are creating multiple copies of the same dialog in a single parent, since obviously the control IDs within that dialog will all be the same and you need to differentiate them (perhaps by the parent's hWnd).
You don't have to forward messages to the parent, though. I just usually do that so that most of the dialog's logic is in one place instead of spread out.
EDIT: Corrected statements about creating the child dialogs, window classes etc. I was mixing up dialogs and normal windows, making things more complex than they are in this case. Sorry about that!

Why is a modal/modeless dialog called modal/modeless?

I always have trouble remembering whether the modal or modeless dialog is the one blocking operations in other parts of the application.
Does anyone know why they are called that way?
With a modal dialog, you set your application in a particular mode (a different "state" if you will), whereby only actions pertaining to that "mode" are accepted, hence preventing UI actions outside of the dialog.
At Andreas' prompting I thought I might have to dig out dusty Windows API books, since, as often happens, the etymology/origin of a word or expression that has become broadly accepted is only found in early documentation. But in fact we still see this referenced in an online glossary from MS. The Modal entry reads (emphasis is mine):
modal
Restrictive or limited interaction due to operating in a mode. Modal often describes a secondary window that restricts a user's interaction with the owner window. See also: modeless.
A modal system is one with multiple "modes of operation". Such a system switches between modes by using key strokes, for example "Esc" "CTRL+S". A good example is the Vim text editor which switches between "edit text mode" and "navigate text mode".
A modal dialog is thus one which blocks the main program by switching it to a different mode for the duration of the operation.
I believe this is a nod to linguistic modality. "Modal" dialogs are used (typically) to present information that falls into the typical modals of:
Declarative
Interrogative
Exclamatory
Part of why I feel this is the case, although I'm searching for a more definitive answer, is the way modal dialogs are discussed. For example, take MSDN - their criterion is "Dialog boxes that display important messages should always be modal.", which could easily be rewritten as "Dialog boxes whose content is of a declarative modal [linguistic definition of modal here] should be created as Modal windows."
There is other precedent for this. For example, the word "dialog" in dialog box - it's called a "dialog box" because it's supposed to present a dialog, or conversation, between the system and the user - another throwback to linguistic terminology for a computational process.
Looks like the only reason is that it is related to modes and mode errors.
