Intercept keyboard input in OS X - Cocoa

I'm trying to write an application that prevents certain key events from propagating beyond the OS in OS X. To clarify, I want it to seem to the user as though the key they are pressing is broken: the associated letter won't show up in a text field, the key won't trigger a function in another application, and so on. Any ideas? Thanks in advance.

You probably want to look into Quartz Event Taps. Note that your process will need to run as root (or be trusted for accessibility access) to intercept keyboard events at the system level.
See also OSX Quartz Event Taps: event types and how to edit events
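As a rough sketch (not a drop-in solution), an active event tap can discard a key system-wide by returning NULL from its callback. The keycode below is an assumption (0 is 'A' on ANSI keyboards), and the tap will only be created if the process has the privileges noted above:

```cpp
// Minimal sketch: an active event tap that swallows one key system-wide.
// Assumes the process runs as root or is trusted for accessibility access.
#include <ApplicationServices/ApplicationServices.h>

static const int64_t kBlockedKeycode = 0; // assumption: virtual keycode 0 is 'A' on ANSI layouts

static CGEventRef tapCallback(CGEventTapProxy proxy, CGEventType type,
                              CGEventRef event, void *refcon) {
    if (type == kCGEventKeyDown || type == kCGEventKeyUp) {
        int64_t keycode = CGEventGetIntegerValueField(event, kCGKeyboardEventKeycode);
        if (keycode == kBlockedKeycode)
            return NULL; // returning NULL discards the event entirely
    }
    return event; // pass every other event through unchanged
}

int main() {
    CGEventMask mask = CGEventMaskBit(kCGEventKeyDown) | CGEventMaskBit(kCGEventKeyUp);
    CFMachPortRef tap = CGEventTapCreate(kCGSessionEventTap, kCGHeadInsertEventTap,
                                         kCGEventTapOptionDefault, mask,
                                         tapCallback, NULL);
    if (!tap) return 1; // creation fails without the required privileges

    CFRunLoopSourceRef source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
    CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);
    CGEventTapEnable(tap, true);
    CFRunLoopRun();
    return 0;
}
```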

Related

Global Keyboard Hook for OSX 10.10.3

I am trying to get a global keyboard hook for OSX 10.10.3. It would ideally be neatly packaged in a Java library, but at this point I just want something that works.
I've tried two routes, and both produce the same results: I am able to read touchpad activity, external mouse activity, and keypresses of the "control", "option", "command", and "shift" keys. Keypresses on all other keys do not trigger any activity.
Both JNativeHook and a native application using Quartz event taps produce the same result, so I assume they hit the same API at some level. Is there somewhere else I should be looking?
Another way is the Cocoa method +[NSEvent addGlobalMonitorForEventsMatchingMask:handler:]. For either this or event taps to see keyboard events, your app must be "trusted for accessibility access". For instance look up AXIsProcessTrustedWithOptions.
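If it helps, here is a minimal sketch of that accessibility check using the C-level AXIsProcessTrustedWithOptions call mentioned above; passing the prompt option asks the system to show the grant-access dialog (the function name ensureAccessibilityTrust is just a placeholder):

```cpp
// Sketch: ask whether this process is trusted for accessibility,
// prompting the user to grant access if it is not.
#include <ApplicationServices/ApplicationServices.h>

bool ensureAccessibilityTrust() {
    const void *keys[]   = { kAXTrustedCheckOptionPrompt };
    const void *values[] = { kCFBooleanTrue };
    CFDictionaryRef options = CFDictionaryCreate(kCFAllocatorDefault, keys, values, 1,
                                                 &kCFTypeDictionaryKeyCallBacks,
                                                 &kCFTypeDictionaryValueCallBacks);
    bool trusted = AXIsProcessTrustedWithOptions(options);
    CFRelease(options);
    return trusted; // global monitors and event taps only see key events when this is true
}
```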

Cocoa accessibility API, can I click a window in the background without activating it?

I've been searching forever for a solution to this, so I thought I'd seek out the brainpower of greater minds than mine. I'm developing a Cocoa app that uses the Accessibility API to manipulate another program (it's a hotkey app). The app I'm controlling typically has multiple windows open, with some hidden behind others. What I would like to do, if it's possible, is to send mouse events to windows using the Accessibility API in a way that presses a button in the window without bringing it to the foreground (interact with the window but don't activate it). The reason I'm trying to do this is that sending the mouse event to this other window will force it to the foreground and disrupt the user's interaction with the foremost window.
This is apparently possible on Windows, because apps similar to mine do it there, but I get the feeling it isn't possible with Cocoa, given the way the window manager works. Am I mistaken?
Accessibility is higher-level than that. You send, for example, AXPress actions to AXButton objects, but “press” is not necessarily a click—pressing the space bar while a view is focused, for example, is also a “press”. AXPress is a high-level action that means “do your thing”, which obviously has meaning for some views (such as buttons) and not others (such as fields).
Accessibility activating the application makes sense when you look at it from the perspective of its intended purpose: assistive technology for disabled users. If the user "presses" something by whatever means, they probably intend to activate the application and work in it.
Quartz Event Services will get you almost there: You can create an event tap for the process you want to control, and you can forge events and send them to a tap. The catch is that you can only send events to a tap when the tap fires—i.e., when the application already has an event to deal with. When it doesn't, you're stuck.
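For reference, a minimal sketch of sending an AXPress action through the Accessibility API. Assumptions: you already know the target application's pid, the process is trusted for accessibility access, and the first button of the first window is the one you want. As explained above, the target window may still come forward:

```cpp
// Sketch: send an AXPress action to a button in another app via the Accessibility API.
#include <ApplicationServices/ApplicationServices.h>

void pressFirstButton(pid_t pid) {
    AXUIElementRef app = AXUIElementCreateApplication(pid);

    CFArrayRef windows = NULL;
    AXUIElementCopyAttributeValue(app, kAXWindowsAttribute, (CFTypeRef *)&windows);
    if (windows && CFArrayGetCount(windows) > 0) {
        AXUIElementRef window = (AXUIElementRef)CFArrayGetValueAtIndex(windows, 0);

        CFArrayRef children = NULL;
        AXUIElementCopyAttributeValue(window, kAXChildrenAttribute, (CFTypeRef *)&children);
        if (children) {
            for (CFIndex i = 0; i < CFArrayGetCount(children); i++) {
                AXUIElementRef child = (AXUIElementRef)CFArrayGetValueAtIndex(children, i);
                CFStringRef role = NULL;
                AXUIElementCopyAttributeValue(child, kAXRoleAttribute, (CFTypeRef *)&role);
                if (role && CFEqual(role, kAXButtonRole)) {
                    // "Do your thing" - for a button this behaves like pressing it.
                    AXUIElementPerformAction(child, kAXPressAction);
                    CFRelease(role);
                    break;
                }
                if (role) CFRelease(role);
            }
            CFRelease(children);
        }
    }
    if (windows) CFRelease(windows);
    CFRelease(app);
}
```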

Qt::X11BypassWindowManagerHint functionality on Windows

I'm currently developing a cross-platform virtual keyboard. On Linux I was able to do whatever I wanted, but on Windows I'm having problems preventing the widget from obtaining keyboard focus.
On Linux, using the window flag Qt::X11BypassWindowManagerHint, the widget never gets keyboard input, but of course that flag does not work on Windows.
Is there something equivalent to that flag, or some other method I can use instead?
Any ideas would be appreciated. Thanks in advance.
I posted an answer to a similar question over in Make a floating QDockWidget unfocusable. On Win32 you don't really have the choice of bypassing the window manager completely, but you should be able to get most of the behavior you want by intercepting nativeEvent to handle WM_MOUSEACTIVATE.
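For illustration, a sketch of that approach. The class name KeyboardWidget is just a placeholder, and the Qt 6 nativeEvent signature is shown (Qt 5 declares the last parameter as long *result):

```cpp
// Sketch: a Qt widget that refuses activation on Windows by answering
// WM_MOUSEACTIVATE with MA_NOACTIVATE, so clicks reach it without taking keyboard focus.
#include <QWidget>
#ifdef Q_OS_WIN
#include <windows.h>
#endif

class KeyboardWidget : public QWidget {  // hypothetical class for a virtual keyboard
public:
    using QWidget::QWidget;

protected:
#ifdef Q_OS_WIN
    bool nativeEvent(const QByteArray &eventType, void *message, qintptr *result) override {
        MSG *msg = static_cast<MSG *>(message);
        if (msg->message == WM_MOUSEACTIVATE) {
            *result = MA_NOACTIVATE;   // accept the click but do not take activation/focus
            return true;               // we handled this native message ourselves
        }
        return QWidget::nativeEvent(eventType, message, result);
    }
#endif
};
```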
I would try to ignore the event. I believe you need to ignore FocusIn on the main application window - I'm not sure about the exact event, so you might need to prototype it. You can ignore events either by installing an event filter or by re-implementing one of the event methods (possibly event() itself). I don't know which is the preferred way, but I'd try the event filter first for this task: http://doc.trolltech.com/4.6/qobject.html#eventFilter
I've never tried to capture the keyboard focus event, but I have been able to successfully ignore escape keys in a QDialog to prevent users from accidentally closing the window. I believe it should be possible.

Send keypresses across the network

I'd like to be able to send key presses from one computer to the other. I have a voice application on one system which I use for my headset, and the other system is my main system. The voice application uses a Push-to-talk (PTT) system, which I'd rather keep.
So what I'd like to do is press a key on my main system and have it sent across the network to my secondary system. At this stage all I know is how to get the key across the network; the specifics of actually detecting the key press on my main system and emulating the press on the secondary system are my problem.
The key I'd like to capture (when held down) and send to my secondary system is the right control key. I think the best way is to add a keyboard hook.
How can I do this in such a way that I can hit right control in any application on my main system and have this application pick that up and send it? When my secondary system receives the key, how do I send it to the entire system (rather than trying to find a specific application)? I'm fine with using low-level Win32 calls in unmanaged C++, I'd just like to know how to get this to work.
Thanks in advance.
It seems like you're already halfway to your own custom solution, but as an alternative you might want to check out Synergy, an open-source keyboard and mouse extender.
I found the answer: I wrote a small keyboard hook to pick up the PTT press, and then send it via the network to the secondary system. The secondary system takes this keypress and uses the SendInput function to inject the key into the system input queue. I just tested it with Teamspeak and it works brilliantly.
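In case it helps anyone else, here is a rough sketch of the two Win32 pieces involved, assuming right Control is the PTT key; the network transport is left as a hypothetical sendOverNetwork() placeholder:

```cpp
// Sketch of both sides in unmanaged C++ (network transport omitted).
#include <windows.h>

void sendOverNetwork(bool keyDown);  // hypothetical: ship the key state to the other PC

// Sending side: a WH_KEYBOARD_LL hook sees VK_RCONTROL in every application.
LRESULT CALLBACK LowLevelKeyboardProc(int nCode, WPARAM wParam, LPARAM lParam) {
    if (nCode == HC_ACTION) {
        const KBDLLHOOKSTRUCT *kb = reinterpret_cast<KBDLLHOOKSTRUCT *>(lParam);
        if (kb->vkCode == VK_RCONTROL) {
            bool down = (wParam == WM_KEYDOWN || wParam == WM_SYSKEYDOWN);
            sendOverNetwork(down);
        }
    }
    return CallNextHookEx(nullptr, nCode, wParam, lParam);
}

// Receiving side: inject the key into the system input queue with SendInput.
void injectRightControl(bool keyDown) {
    INPUT input = {};
    input.type = INPUT_KEYBOARD;
    input.ki.wVk = VK_RCONTROL;
    input.ki.dwFlags = keyDown ? 0 : KEYEVENTF_KEYUP;
    SendInput(1, &input, sizeof(INPUT));
}

// Installing the hook (the installing thread must pump messages for the hook to fire):
// HHOOK hook = SetWindowsHookEx(WH_KEYBOARD_LL, LowLevelKeyboardProc,
//                               GetModuleHandle(nullptr), 0);
```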

How do I get keyboard events in an NSStatusWindowLevel window while my application is not frontmost?

After creating a translucent window (based on example code by Matt Gemmell) I want to get keyboard events in this window. It seems that I only get keyboard events when my application is the active application, whereas I want keyboard events even when my application isn't active but the window is visible.
Basically I want behavior like that provided by the Quicksilver application (by blacktree).
Does anybody have any hints on how to do this?
There are two options:
Use GetEventMonitorTarget() with a tacked-on Carbon run loop to grab keyboard events. Sample code is available on this page at CocoaDev.
Register an event tap with CGEventTapCreate. Sample code can be found in this thread from the Apple developer mailing list.
Edit: Note that these methods only work if you check off “Enable access for assistive devices” in the Universal Access preference pane.
A simpler route that may work better for you is to make your app background-only. The discussion on CocoaDev of the LSUIElement plist key explains how to set it up. Basically, your application will not appear in the dock or the app switcher, and will not replace the current application's menu bar when activated. From a user perspective it's never the 'active' application, but any windows you open can get activated and respond to events normally. The only caveat is that you'll never get to show your menu bar, so you'll probably have to set up an NSStatusItem (one of those icon menus that show up on the right side of the menu bar) to control (i.e. quit, bring up prefs, etc.) your application.
Edit: I completely forgot about the Non-Activating Panel checkbox in Interface Builder. You need to use an NSPanel instead of an NSWindow to get this choice. This setting lets your panel accept clicks and keyboard input without activating your application. I'm betting that some mix of this setting and the Carbon Hot Keys API is what QuickSilver is using for their UI.
Update:
Apple actually seems to have changed everything again starting with 10.5 (I recently upgraded and my sample code did not work as before).
Now you can only capture keydown events with an event tap if you are either root or access for assistive devices is enabled, regardless of which level you tap at and regardless of whether you create the tap as an active filter (which lets you modify and even discard events) or as listen-only. You can still be notified when modifier flags change (and even alter them) and see other event types, but keydown events under no other circumstances.
However, using a Carbon event handler together with RegisterEventHotKey() lets you register a hot key and be notified when it is pressed; for that you neither need to be root nor need anything like assistive devices enabled. I think Quicksilver is probably doing it that way.
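For completeness, a small sketch of the RegisterEventHotKey() approach; the Command+Option+Q combination and the 'htk1' signature are placeholder choices:

```cpp
// Sketch: register a global hot key with the Carbon RegisterEventHotKey API,
// which works without root privileges or accessibility access.
#include <Carbon/Carbon.h>

static OSStatus hotKeyHandler(EventHandlerCallRef nextHandler, EventRef event, void *userData) {
    // Called whenever the registered hot key is pressed, no matter which app is frontmost.
    return noErr;
}

void registerHotKey() {
    EventTypeSpec spec = { kEventClassKeyboard, kEventHotKeyPressed };
    InstallApplicationEventHandler(NewEventHandlerUPP(hotKeyHandler), 1, &spec, NULL, NULL);

    EventHotKeyID hotKeyID = { 'htk1', 1 };          // placeholder signature and id
    EventHotKeyRef hotKeyRef;
    RegisterEventHotKey(kVK_ANSI_Q, cmdKey | optionKey, hotKeyID,
                        GetApplicationEventTarget(), 0, &hotKeyRef);
}
```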
