Cocoa accessibility API, can I click a window in the background without activating it?

I've been searching forever for a solution to this, so I thought I'd seek out the brainpower of greater minds than mine. I'm developing a Cocoa app that uses the Accessibility API to manipulate another program (it's a hotkey app). The app I'm controlling typically has multiple windows open, with some hidden behind others. What I would like to do, if it's possible, is to send mouse events to windows using the Accessibility API in a way that presses a button in the window without bringing it to the foreground (interact with the window but don't activate it). The reason I'm trying to do this is that sending the mouse event to this other window will force it to the foreground and disrupt the user's interaction with the foremost window.
This is apparently possible on Windows - apps similar to mine do it there - but I'm getting the feeling that it isn't possible with Cocoa, given the way the window manager works. Am I mistaken?

Accessibility is higher-level than that. You send, for example, AXPress actions to AXButton objects, but “press” is not necessarily a click—pressing the space bar while a view is focused, for example, is also a “press”. AXPress is a high-level action that means “do your thing”, which obviously has meaning for some views (such as buttons) and not others (such as fields).
Accessibility activating the application does make sense when you look at it from the perspective of its intended purpose: assistive devices for disabled users. If the user “presses” something by whatever means, they probably intend to activate the application and work in it.
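
To make the distinction concrete, here is a minimal sketch (mine, not the original answerer's) of pressing a button through the Accessibility API rather than with a synthetic click. It assumes access for assistive devices is enabled and that you already know the target process's PID; targetPid and the choice of "first button in the first window" are placeholder simplifications.

    // Sketch: send an AXPress action to a button in another app's window.
    // Assumes this process is trusted for accessibility; targetPid is a placeholder.
    #include <ApplicationServices/ApplicationServices.h>

    static void PressFirstButtonInFirstWindow(pid_t targetPid)
    {
        AXUIElementRef app = AXUIElementCreateApplication(targetPid);

        CFTypeRef windowsValue = NULL;
        if (AXUIElementCopyAttributeValue(app, kAXWindowsAttribute,
                                          &windowsValue) != kAXErrorSuccess) {
            CFRelease(app);
            return;
        }
        CFArrayRef windows = (CFArrayRef)windowsValue;

        if (CFArrayGetCount(windows) > 0) {
            AXUIElementRef window =
                (AXUIElementRef)CFArrayGetValueAtIndex(windows, 0);

            CFTypeRef childrenValue = NULL;
            if (AXUIElementCopyAttributeValue(window, kAXChildrenAttribute,
                                              &childrenValue) == kAXErrorSuccess) {
                CFArrayRef children = (CFArrayRef)childrenValue;
                for (CFIndex i = 0; i < CFArrayGetCount(children); i++) {
                    AXUIElementRef child =
                        (AXUIElementRef)CFArrayGetValueAtIndex(children, i);
                    CFTypeRef role = NULL;
                    if (AXUIElementCopyAttributeValue(child, kAXRoleAttribute,
                                                      &role) == kAXErrorSuccess) {
                        bool isButton = CFEqual(role, kAXButtonRole);
                        CFRelease(role);
                        if (isButton) {
                            // "Do your thing": no synthetic mouse click involved.
                            AXUIElementPerformAction(child, kAXPressAction);
                            break;
                        }
                    }
                }
                CFRelease(children);
            }
        }
        CFRelease(windows);
        CFRelease(app);
    }

Whether the target application ends up coming forward as a result is still out of your hands, for the reason described above.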
Quartz Event Services will get you almost there: You can create an event tap for the process you want to control, and you can forge events and send them to a tap. The catch is that you can only send events to a tap when the tap fires—i.e., when the application already has an event to deal with. When it doesn't, you're stuck.
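
A rough sketch of that mechanism follows, under the assumptions that the target's PID is known and that the newer per-PID tap constructor is available (CGEventTapCreateForPid appeared in 10.11; earlier systems had a process-serial-number variant). The trigger mask, click location, and PID are hypothetical; the key point is that CGEventTapPostEvent can only be called from inside the callback.

    // Sketch: a per-process event tap; a forged click can only be injected
    // from inside the callback, i.e. while the target app is already
    // processing some event of its own.
    #include <ApplicationServices/ApplicationServices.h>

    static CGEventRef TapCallback(CGEventTapProxy proxy, CGEventType type,
                                  CGEventRef event, void *userInfo)
    {
        CGPoint clickPoint = *(CGPoint *)userInfo;  // placeholder target location

        // Forge a left mouse-down/up pair and post it at the tap.
        CGEventRef down = CGEventCreateMouseEvent(NULL, kCGEventLeftMouseDown,
                                                  clickPoint, kCGMouseButtonLeft);
        CGEventRef up = CGEventCreateMouseEvent(NULL, kCGEventLeftMouseUp,
                                                clickPoint, kCGMouseButtonLeft);
        CGEventTapPostEvent(proxy, down);
        CGEventTapPostEvent(proxy, up);
        CFRelease(down);
        CFRelease(up);

        return event;  // pass the original event through unchanged
    }

    static void InstallTapForProcess(pid_t targetPid, CGPoint *clickPoint)
    {
        CFMachPortRef tap = CGEventTapCreateForPid(
            targetPid, kCGTailAppendEventTap, kCGEventTapOptionDefault,
            CGEventMaskBit(kCGEventKeyDown), TapCallback, clickPoint);
        if (tap == NULL)
            return;  // no permission, or the tap could not be created

        CFRunLoopSourceRef source =
            CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
        CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);
        CGEventTapEnable(tap, true);
        CFRelease(source);
    }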

Related

Is there any method to programmatically switch focus out of metro mode?

If I have a program running in the background and it needs to show the user something (like a dialog box) when it pops up, can I take the user out of Metro mode (in Windows 8) so they can see this notification?
I highly doubt it; such a capability would spawn a bunch of apps that would essentially try to take over, which would be very jarring for the user. Your desktop app, though, could generate a toast notification to alert the user that there is some action to take; see this MSDN topic for details.
I agree with Jim: switching context automatically from the desktop to Metro (or whatever they're calling it now) would be visually jarring and user-hostile. I realize the OS itself does this, like when you launch a desktop app from the Start screen. That doesn't make it good design.
Besides, when the OS does it, the user (presumably) wanted to interact with the newly launched application. That's not necessarily the case when you're just showing a notification. There may not even be any action required.
Instead, I recommend that you use toast notifications, the mechanism designed explicitly for this purpose. There's a sample application available for download: Sending toast notifications from desktop apps.
Note, however, that in order for toast notifications to work from desktop applications, you must install a shortcut to your desktop application on the Start screen with a System.AppUserModel.ID property set on it. This should be handled by your installer. More information is here.
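
As a sketch of what that installer step involves (the AppID string and shortcut path are made-up placeholders, and COM is assumed to be initialized already), the System.AppUserModel.ID property is written into the .lnk file's property store:

    // Sketch: stamp an AppUserModelID onto an existing Start-screen shortcut
    // so toast notifications from a desktop app have something to attach to.
    #include <windows.h>
    #include <shobjidl.h>
    #include <propkey.h>
    #include <propvarutil.h>

    HRESULT SetShortcutAppUserModelId(PCWSTR shortcutPath, PCWSTR appId)
    {
        IShellLinkW *link = NULL;
        HRESULT hr = CoCreateInstance(CLSID_ShellLink, NULL, CLSCTX_INPROC_SERVER,
                                      IID_PPV_ARGS(&link));
        if (FAILED(hr)) return hr;

        IPersistFile *file = NULL;
        hr = link->QueryInterface(IID_PPV_ARGS(&file));
        if (SUCCEEDED(hr))
        {
            // Load the .lnk the installer placed on the Start screen.
            hr = file->Load(shortcutPath, STGM_READWRITE);
            if (SUCCEEDED(hr))
            {
                IPropertyStore *props = NULL;
                hr = link->QueryInterface(IID_PPV_ARGS(&props));
                if (SUCCEEDED(hr))
                {
                    PROPVARIANT value;
                    hr = InitPropVariantFromString(appId, &value);
                    if (SUCCEEDED(hr))
                    {
                        // PKEY_AppUserModel_ID is the System.AppUserModel.ID property.
                        props->SetValue(PKEY_AppUserModel_ID, value);
                        props->Commit();
                        PropVariantClear(&value);
                    }
                    props->Release();
                }
                if (SUCCEEDED(hr))
                    hr = file->Save(shortcutPath, TRUE);
            }
            file->Release();
        }
        link->Release();
        return hr;
    }

A call would look something like SetShortcutAppUserModelId(L"...\\My App.lnk", L"MyCompany.MyApp"), with both strings being hypothetical.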
Of course, the user can disable this by either turning off notifications or removing your app's shortcut from their Start screen. That's perfectly okay—if they take either of these actions, you can assume that they no longer want to receive notifications from your app.

Possible to tell if NSApplicationActivationPolicyProhibited application is active?

Using JUCE with TUIO, I'm developing a multi-touch utility to send "hot key" commands to other applications (I am using a USB touch frame that sends TUIO messages). For instance, I provide an interface through which users can touch-and-hold to program a key combo and then tap that button to send the programmed key combo to another app. The way I accomplish this on OS X is by running my utility as a "background only" application (NSApplicationActivationPolicyProhibited). I use [NSWindow setCanHide: NO] so the GUI of my utility is visible even though it runs as a background app.
It works well except in the case that a window from another application is on top of mine. What happens is that touches get passed through that other app into mine, causing unintentional button pushes in my app. Normally, I could have my app only listen to the TUIO touch callback whenever it is the active application, [NSApp isActive]. But since my app is background only, it is never active, and I have no way to tell if another window is covering it to prevent touches.
So, is there any way for a "background only" app to be able to tell if it is on top of all other windows? Or, is there a way from within my app to get a list of all Cocoa windows from other applications and be able to tell if they are appearing on top of my "background only" app?
Also, does anyone know how I would go about all of the above in Windows? In other words, what is the Windows equivalent of NSApplicationActivationPolicyProhibited and would I be able to tell if it is covered by other applications' windows?

VC++ mouse events

I want to write a console program for mouse events (only mouse scroll). How do I do it in VC++? The application will listen only to scroll events.
Description: if the user scrolls down, the desktop window fades out, and it fades back in when the user scrolls up.
Here I just need to know how to listen to mouse events in a console app.
Note: I am developing against the Win32 API, and my development environment is VS2010.
I've never actually done this myself. It seems that a console application responding to mouse events almost belies its nature and intended purpose. Generally, you would only need to respond to keyboard input from a console app and leave the mouse stuff to a GUI app.
That being said, this tutorial indicates that it is in fact possible to capture these mouse events from a Win32 console application. Generally, the suggestion is to use the ReadConsoleInput function and extract the information of interest from the INPUT_RECORD structure that it fills. The only tricky thing is that ReadConsoleInput is a blocking call, which means it will not return until an input event fires. You'll need to structure your application's code accordingly. Mouse events are covered in detail about 3/4 of the way down the page.
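
A minimal sketch of that loop (my own illustration of the tutorial's approach; the actual desktop fading is left out):

    // Sketch: read mouse-wheel events in a Win32 console app via ReadConsoleInput.
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        HANDLE hIn = GetStdHandle(STD_INPUT_HANDLE);

        // Enable mouse input; clearing Quick Edit mode (with extended flags set)
        // keeps the console from swallowing mouse events for text selection.
        DWORD mode = 0;
        GetConsoleMode(hIn, &mode);
        SetConsoleMode(hIn, (mode | ENABLE_MOUSE_INPUT | ENABLE_EXTENDED_FLAGS)
                                & ~ENABLE_QUICK_EDIT_MODE);

        for (;;)
        {
            INPUT_RECORD record;
            DWORD readCount = 0;

            // Blocking call: returns only when an input event arrives.
            if (!ReadConsoleInput(hIn, &record, 1, &readCount) || readCount == 0)
                break;

            if (record.EventType == MOUSE_EVENT &&
                (record.Event.MouseEvent.dwEventFlags & MOUSE_WHEELED))
            {
                // The high word of dwButtonState is the signed wheel delta.
                SHORT delta = (SHORT)HIWORD(record.Event.MouseEvent.dwButtonState);
                if (delta > 0)
                    printf("scroll up\n");    // e.g. fade the desktop window back in
                else
                    printf("scroll down\n");  // e.g. fade the desktop window out
            }
        }
        return 0;
    }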

How to assume/steal another process's windows as my own?

I'd like to show another app's windows under my app's taskbar button. Mine is a background app that reports another process's windows as my app's own. Is there any universal way to do this, so that each "new" window, alert glow, progress meter, and other taskbar features show under my own app's button?
For example, Winfox runs under its own process and steals Firefox's windows. It also adds features, but that's irrelevant -- I just want to support another app's existing taskbar features under my own app's button -- multiple windows, progress meter, alert flashing, error flashing, mini-icons, etc. Is there a near-universal way to steal another app's windows, or is it largely app-specific? Thanks!
You should be able to use SetParent() to take ownership of a window, but I'm not sure how much this will help you in your attempt to add taskbar features to the legacy app.
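
For what it's worth, the call itself is trivial; the sketch below uses made-up window titles and leaves out all the taskbar-related work, which is the part I can't vouch for:

    // Sketch: re-parent a window that belongs to another process with SetParent().
    // Both window titles are hypothetical placeholders.
    #include <windows.h>

    int main(void)
    {
        HWND theirs = FindWindowW(NULL, L"Some Other App's Window");
        HWND mine   = FindWindowW(NULL, L"My Host Window");

        if (theirs != NULL && mine != NULL)
        {
            // The other app's top-level window becomes a child of ours.
            SetParent(theirs, mine);
        }
        return 0;
    }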

How do I get keyboard events in an NSStatusWindowLevel window while my application is not frontmost?

After creating a translucent window (based on example code by Matt Gemmell), I want to get keyboard events in this window. It seems that keyboard events arrive only when my application is the active application, whereas I want keyboard events even when my application isn't active but the window is visible.
Basically I want behavior like that provided by the Quicksilver application (by blacktree).
Does anybody have any hints on how to do this?
There are two options:
Use GetEventMonitorTarget() with a tacked-on Carbon run loop to grab keyboard events. Sample code is available on this page at CocoaDev.
Register an event tap with CGEventTapCreate (a rough sketch follows after the note below). Sample code can be found in this thread from the Apple developer mailing list.
Edit: Note that these methods only work if you check off “Enable access for assistive devices” in the Universal Access preference pane.
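
Here is that sketch of the second option, a listen-only tap for key-down events at the session level (the callback body is just a placeholder; as noted, assistive access or root is required):

    // Sketch: a listen-only event tap for key-down events via CGEventTapCreate.
    #include <ApplicationServices/ApplicationServices.h>

    static CGEventRef KeyDownCallback(CGEventTapProxy proxy, CGEventType type,
                                      CGEventRef event, void *userInfo)
    {
        if (type == kCGEventKeyDown)
        {
            int64_t keycode =
                CGEventGetIntegerValueField(event, kCGKeyboardEventKeycode);
            (void)keycode;  // placeholder: react to the key press here
        }
        return event;  // listen-only taps cannot modify events anyway
    }

    int main(void)
    {
        CFMachPortRef tap = CGEventTapCreate(
            kCGSessionEventTap, kCGHeadInsertEventTap, kCGEventTapOptionListenOnly,
            CGEventMaskBit(kCGEventKeyDown), KeyDownCallback, NULL);
        if (tap == NULL)
            return 1;  // most likely: assistive access not enabled

        CFRunLoopSourceRef source =
            CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
        CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);
        CGEventTapEnable(tap, true);
        CFRunLoopRun();
        return 0;
    }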
A simpler route that may work better for you is to make your app background-only. The discussion on CocoaDev of the LSUIElement plist key explains how to set it up. Basically, your application will not appear in the dock or the app switcher, and will not replace the current application's menu bar when activated. From a user perspective it's never the 'active' application, but any windows you open can get activated and respond to events normally. The only caveat is that you'll never get to show your menu bar, so you'll probably have to set up an NSStatusItem (one of those icon menus that show up on the right side of the menu bar) to control (i.e. quit, bring up prefs, etc.) your application.
Edit: I completely forgot about the Non-Activating Panel checkbox in Interface Builder. You need to use an NSPanel instead of an NSWindow to get this choice. This setting lets your panel accept clicks and keyboard input without activating your application. I'm betting that some mix of this setting and the Carbon Hot Keys API is what QuickSilver is using for their UI.
Update:
Apple actually seems to have changed everything again starting with 10.5, by the way (I recently upgraded and my sample code no longer worked as before).
Now you can only capture key-down events with an event tap if you are either root or assistive devices are enabled, regardless of the level at which you tap and regardless of whether you created the tap as an active filter (which allows you to modify and even discard events) or as listen-only. You can still be notified when modifier flags change (and actually even change them) and receive other events, but key-down events under no other circumstances.
However, using a Carbon event handler and the function RegisterEventHotKey() allows you to register a hotkey and be notified when it is pressed; you neither need to be root for that, nor do you need anything like assistive devices enabled. I think Quicksilver is probably doing it that way.
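
A small sketch of that route (the key combination and the four-character signature are arbitrary placeholders; the application's normal event loop must be running for the handler to fire):

    // Sketch: register a global hotkey with the Carbon RegisterEventHotKey() API.
    // Works without root privileges or assistive-device access.
    #include <Carbon/Carbon.h>

    static OSStatus HotKeyHandler(EventHandlerCallRef nextHandler,
                                  EventRef theEvent, void *userData)
    {
        EventHotKeyID hotKeyID;
        GetEventParameter(theEvent, kEventParamDirectObject, typeEventHotKeyID,
                          NULL, sizeof(hotKeyID), NULL, &hotKeyID);
        // Placeholder: hotKeyID.id identifies which registered hotkey fired.
        return noErr;
    }

    static void InstallHotKey(void)
    {
        EventTypeSpec eventType = { kEventClassKeyboard, kEventHotKeyPressed };
        InstallApplicationEventHandler(NewEventHandlerUPP(HotKeyHandler),
                                       1, &eventType, NULL, NULL);

        EventHotKeyID hotKeyID = { 'htk1', 1 };   // arbitrary signature and id
        EventHotKeyRef hotKeyRef = NULL;

        // Example binding: Command-Option-Space (an arbitrary choice).
        RegisterEventHotKey(kVK_Space, cmdKey | optionKey, hotKeyID,
                            GetApplicationEventTarget(), 0, &hotKeyRef);
    }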
