As an experiment, I am trying to achieve the following:
Let the spacebar work as a modifier key, like the Shift key: holding the spacebar down while typing other keys prints different letters. Releasing the spacebar sets the state back to normal, and a plain press behaves like a normal space key.
I was thinking of handling the key-down and key-up events, but handleEvent:client: in the IMKServerInput protocol seems to only catch key-down and mouse events.
Without much experience with Cocoa, I've tried several approaches with no success:
went through Technical Note 2128 via the Internet Archive, which gave me useful explanations of the plist items, but still nothing about key-up.
tried adding NSKeyUpMask to recognizedEvents: in the IMKStateSetting protocol, but that didn't seem to catch the event either.
tested a bit with addLocalMonitorForEventsMatchingMask:handler:, but nothing happened.
failed to find a way to make the NSFlagsChanged event fire for the spacebar.
read about Quartz Event Services and CGEventTap, which handle user input at a lower level. Didn't go further down this route yet.
IOHIDManager?
I reached the conclusion that IMKit is only capable of passively receiving events.
Since it is not an application, there is no keyUp: method to override; AFAIK, IMKit classes do not inherit from NSResponder.
Unfortunately, Cocoa is very broad, and its documentation is either too sparse or flooded with unhelpful material for a novice like me to dive into.
Can anyone point me in the right direction?
I tried all possible alternatives one by one, and eventually achieved it by creating a global event tap with CGEventTap.
The code basically looks like this:
// Create an event tap that listens for key-down and key-up events.
CGEventMask eventMask = CGEventMaskBit(kCGEventKeyDown) | CGEventMaskBit(kCGEventKeyUp);
CFMachPortRef eventTap = CGEventTapCreate(kCGSessionEventTap,
                                          kCGHeadInsertEventTap,
                                          kCGEventTapOptionDefault,
                                          eventMask,
                                          myCGEventCallback,
                                          NULL);
if (!eventTap) {
    NSLog(@"failed to create event tap");
    return NO;
} else {
    // Create a run loop source and add it to the current run loop.
    CFRunLoopSourceRef runLoopSource = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, eventTap, 0);
    CFRunLoopAddSource(CFRunLoopGetCurrent(), runLoopSource, kCFRunLoopCommonModes);
    // Enable the event tap.
    CGEventTapEnable(eventTap, true);
    return YES;
}
where myCGEventCallback handles the global state.
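For illustration, a minimal sketch of what such a callback could look like (not my actual implementation), assuming keycode 49 for the spacebar and leaving the remapping itself as a stub:
// Sketch only: track whether the spacebar is held, inside a CGEventTap callback.
// Keycode 49 is the ANSI spacebar; the remapping of other keys is left as a stub.
static bool gSpaceIsDown = false;

CGEventRef myCGEventCallback(CGEventTapProxy proxy, CGEventType type,
                             CGEventRef event, void *refcon) {
    int64_t keycode = CGEventGetIntegerValueField(event, kCGKeyboardEventKeycode);

    if (keycode == 49) {   // spacebar toggles the "modifier" state
        gSpaceIsDown = (type == kCGEventKeyDown);
        // Return the event to pass the space through, or return NULL here to
        // swallow it and decide on key-up whether to emit a plain space.
        return event;
    }

    if (gSpaceIsDown && type == kCGEventKeyDown) {
        // Space is held: remap this key, e.g. by rewriting its keycode before
        // passing it on (the replacement keycode is up to the layout):
        // CGEventSetIntegerValueField(event, kCGKeyboardEventKeycode, remappedKeycode);
    }
    return event;
}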
Meanwhile, here is some of what I've found out:
According to the Key-Input Message Sequence document, the application only passes the key-down event to the Input Method Kit, after trying a bunch of other handlers in the chain. You cannot make IMKServerInput 'catch' the NSKeyUp event; just adding NSKeyUpMask to recognizedEvents: does not work.
addLocalMonitorForEventsMatchingMask:handler: and CGEventTapCreateForPSN would not catch the event. I suppose this is because, although an Input Method may run as a separate process, the event itself is fired in the application (TextEdit, for example) and then handed over to the Input Method.
IOHIDManager is for supporting new hardware devices and writing drivers.
Creating a global event tap requires either running the process with root privileges -- copying the Input Method to /Library/Input Methods does not make it run as root -- or registering the application under Accessibility control: System Preferences → Security & Privacy → Privacy tab → Accessibility, in Mavericks. (A small sketch of checking that permission follows at the end of this answer.)
IMKStateSetting protocol's recognizedEvents: method should enable NSKeyUp events:
- (NSUInteger)recognizedEvents:(id)sender {
    return NSKeyDownMask | NSKeyUpMask;
}
A client calls this method to check whether an input method supports an event. The default implementation returns NSKeyDownMask.
However, in testing my Input Method still does not catch NSKeyUp events. This seems like a bug.
I've filed the following Radar, and hope others will duplicate it:
rdar://21376535
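Regarding the Accessibility requirement above: here is a minimal sketch (not from the original code) of how a process can check, and prompt for, that permission on Mavericks and later, using the public AXIsProcessTrustedWithOptions call:
#import <Cocoa/Cocoa.h>
#import <ApplicationServices/ApplicationServices.h>

// Returns YES if this process may create key-event taps. Passing the prompt
// option shows the system dialog that points the user at System Preferences →
// Security & Privacy → Privacy → Accessibility.
static BOOL hasAccessibilityAccess(void) {
    NSDictionary *options = @{ (__bridge id)kAXTrustedCheckOptionPrompt : @YES };
    return AXIsProcessTrustedWithOptions((__bridge CFDictionaryRef)options);
}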
I'm implementing a scripting language where the user might be causing an endless loop by accident. I want to give the user the opportunity to cancel such a runaway loop by holding down the command key while typing the period (".") key.
Currently, once for every line, I check for cancellation with this code:
NSEvent *evt = [[NSApplication sharedApplication] nextEventMatchingMask:NSKeyDownMask
                                                              untilDate:[NSDate date]
                                                                 inMode:WILDScriptExecutionEventLoopMode
                                                                dequeue:YES];
if (evt)
{
    NSString *theKeys = [evt charactersIgnoringModifiers];
    if ((evt.modifierFlags & NSCommandKeyMask) && theKeys.length > 0 && [theKeys characterAtIndex:0] == '.')
    {
        // +++ cancel script execution here.
    }
}
The problem with this is that it eats any keyboard events that the user might be typing while the script is running, even though scripts should be able to check for keypresses. Also, it doesn't dequeue the corresponding NSKeyUp events. But if I tell it to dequeue key up events as well, it might dequeue the keyUp for a keypress that was being held before my script started and my application might never find out the key was released.
Also, I would like to not dequeue any events until I know it is actually a cancel event, but there is no separate dequeue call, and it feels unreliable to just assume the frontmost event on a second call will be the same one. And even if that is guaranteed, a user typing an 'a' and then Cmd-. would mean I only ever see the 'a' and never the Cmd-. behind it if I don't dequeue events.
Is there a better option than going to the old Carbon stand-by GetKeys()? Fortunately, that seems to be available in 64 bit.
Also, I'm thinking about adding an NSStatusItem that adds a button to cancel the script to the menu bar or so. But how would I process events in a way that doesn't let the user e.g. select a menu while a script expects to be ruler of the main thread?
Any suggestions? Recommendations?
Using -addLocalMonitorForEventsMatchingMask: as Dave suggests is probably the easiest way to go about this, yes.
I just wanted to add that despite your unreliable feeling, the event queue is really a queue, and events don't change order. It is perfectly safe (and standard practice in event loops) to call -nextEventMatchingMask:inMode:dequeue:NO, examine the event, determine that it is one you want to deal with, and then call -nextEventMatchingMask:inMode:dequeue:YES in order to consume it. Just make sure that your mask and mode are identical between the two calls.
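For illustration, a sketch of that peek-then-consume pattern using the asker's own mask and mode (if the peeked event isn't Cmd-., it simply stays in the queue):
// Peek without dequeuing; nothing is removed from the queue yet.
NSEvent *evt = [NSApp nextEventMatchingMask:NSKeyDownMask
                                  untilDate:[NSDate date]
                                     inMode:WILDScriptExecutionEventLoopMode
                                    dequeue:NO];
if (evt && (evt.modifierFlags & NSCommandKeyMask)
        && [evt charactersIgnoringModifiers].length > 0
        && [[evt charactersIgnoringModifiers] characterAtIndex:0] == '.')
{
    // Same mask, same mode: this consumes the event we just examined.
    [NSApp nextEventMatchingMask:NSKeyDownMask
                       untilDate:[NSDate date]
                          inMode:WILDScriptExecutionEventLoopMode
                         dequeue:YES];
    // +++ cancel script execution here.
}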
I would suggest using an event monitor. Since you're asking NSApp for events, it would seem that you're running the script in the current process, so you only have to monitor events in your own process (and not globally).
There are several ways to do this (subclassing NSApplication and overriding -sendEvent:, putting in an event tap, etc), but the easiest way to do this would be with a local event monitor:
id eventHandler = [NSEvent addLocalMonitorForEventsMatchingMask:NSKeyDownMask
                                                        handler:^(NSEvent *event) {
    // check the runloop mode
    // check for cmd-.
    // abort the script if necessary
    return event;
}];
When you're all done monitoring for events, don't forget to unregister your monitor:
[NSEvent removeMonitor:eventHandler];
So there's +[NSEvent modifierFlags] which is intended as a replacement for GetKeys(). It doesn't cover your use case of the period key though, sadly.
The core problem here with the event queue is you want to be able to search it, which isn't something the API exposes. The only workaround to that I can think of is to dequeue all events into an array, checking for a Command-. event, and then re-queue them all using postEvent:atStart:. Not pretty.
Perhaps as an optimisation you could use +[NSEvent modifierFlags] to only check the event queue when the command key is held down, but that sounds open to race conditions to me.
So final suggestion, override -postEvent:atStart: (on either NSApplication or NSWindow) and see if you can fish out the desired info there. I think at worst it could be interesting for debugging.
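And a rough sketch of the drain-and-repost idea mentioned above, assuming the queue is fully drained so that re-posting with atStart:NO preserves the original order:
// Drain pending events, looking for Cmd-.; everything else gets re-posted.
NSMutableArray *kept = [NSMutableArray array];
BOOL cancelled = NO;
NSEvent *e;
while ((e = [NSApp nextEventMatchingMask:NSAnyEventMask
                               untilDate:[NSDate date]
                                  inMode:WILDScriptExecutionEventLoopMode
                                 dequeue:YES]))
{
    if (e.type == NSKeyDown && (e.modifierFlags & NSCommandKeyMask)) {
        NSString *keys = [e charactersIgnoringModifiers];
        if (keys.length > 0 && [keys characterAtIndex:0] == '.') {
            cancelled = YES;   // found the cancel keystroke; don't re-post it
            continue;
        }
    }
    [kept addObject:e];
}
// Put the untouched events back in their original order.
for (NSEvent *queued in kept) {
    [NSApp postEvent:queued atStart:NO];
}
if (cancelled) {
    // +++ cancel script execution here.
}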
I have a basic keystroke converter app in development. The conversion works with the following:
CFRunLoopSourceRef runLoopSource = NULL;
CFMachPortRef eventTap = CGEventTapCreate(kCGHIDEventTap,
                                          kCGHeadInsertEventTap,
                                          kCGEventTapOptionDefault,
                                          kCGEventMaskForAllEvents,
                                          myCGEventCallback,
                                          NULL);
runLoopSource = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, eventTap, 0);
CFRunLoopAddSource(CFRunLoopGetCurrent(), runLoopSource, kCFRunLoopCommonModes);
CGEventTapEnable(eventTap, true);
As you might expect, kCGEventMaskForAllEvents fires constantly for any mouse movement or click in addition to the keyboard, and I suspect it is tying up system resources. I tried substituting CGEventMaskBit(kCGEventKeyDown), which, as best I can tell from the Quartz Event Services documentation on event types, is what I want, and would weed out mouse movements and clicks. Unfortunately, using this seems to just eat the keystrokes, rather than convert them.
What am I doing wrong?
The following works, but I still don't understand why CGEventMaskBit(kCGEventKeyUp) by itself isn't the correct implementation.
CGEventMaskBit(kCGEventKeyUp) | CGEventMaskBit(kCGEventKeyDown) | CGEventMaskBit(NX_SYSDEFINED)
Because a keystroke consists of both a key-down and a key-up event.
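To illustrate, a sketch of a pass-through converter callback that rewrites a key on both halves of the keystroke; the keycodes (0 for 'a', 11 for 'b') are just example values:
// Sketch: remap 'a' (keycode 0) to 'b' (keycode 11) on both key-down and key-up.
// Rewriting only one half would leave the target app with a mismatched pair.
CGEventRef myCGEventCallback(CGEventTapProxy proxy, CGEventType type,
                             CGEventRef event, void *refcon) {
    if (type == kCGEventKeyDown || type == kCGEventKeyUp) {
        int64_t keycode = CGEventGetIntegerValueField(event, kCGKeyboardEventKeycode);
        if (keycode == 0) {                       // ANSI 'a'
            CGEventSetIntegerValueField(event, kCGKeyboardEventKeycode, 11); // ANSI 'b'
        }
    }
    return event;   // returning NULL here would eat the keystroke entirely
}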
The discussion section of the CGEventTapCreate doc page says:
Event taps receive key up and key down events if one of the following conditions is true:
The current process is running as the root user.
Access for assistive devices is enabled. In OS X v10.4, you can enable this feature using System Preferences, Universal Access panel, Keyboard view.
Running as the root user definitely worked for me (macOS Sierra). I didn't try the assistive devices approach.
To run as root inside Xcode (I have 8.3.3 at this time), choose Product → Scheme → Edit Scheme... → Run → Info → Debug Process As: root.
In the CGEventTapCreate call, replace the kCGEventMaskForAllEvents argument with CGEventMaskBit(kCGEventKeyDown) | CGEventMaskBit(kCGEventKeyUp). Your callback will now get invoked for most key presses, except for the modifier keys: shift, ctrl, cmd, and some of the function keys.
To get the callback invoked for the modifier keys, add CGEventMaskBit(kCGEventKeyDown) | CGEventMaskBit(kCGEventKeyUp) | CGEventMaskBit(NX_SYSDEFINED). For some reason, with this change I also get the callback invoked for mouse button presses. This might be a side effect of how the Logitech mouse driver works -- I didn't investigate. But the volume of calls is much lower than before and doesn't include mouse moves.
Dave Keck's response to this CocoaBuilder thread gets credit for figuring this out.
I need to programmatically disable/suppress system-wide touch gestures on Mac OS. I'm referring to gestures such as the 4-finger swipe between spaces, etc.
I've looked at event taps, but that doesn't appear to be an option (despite previous reports here; perhaps it changed under 10.8).
I've also tried numerous ways of changing the system preferences programmatically. For example, I've tried using IOConnectSetCFProperties on the service, having located it using IORegistryEntryCreateCFProperties.
I've also delved into the trackpad preference pane to see how they do it, and I tried to reproduce it (ignore any create/release inconsistencies, this is just test code):
NSInteger zero = 0;
CFNumberRef numberWith0 = CFNumberCreate(kCFAllocatorDefault, kCFNumberNSIntegerType, &zero);
CFMutableDictionaryRef propertyDict = CFDictionaryCreateMutable(kCFAllocatorDefault, 0,
                                                                &kCFTypeDictionaryKeyCallBacks,
                                                                &kCFTypeDictionaryValueCallBacks);
CFDictionarySetValue(propertyDict, CFSTR("TrackpadFourFingerHorizSwipeGesture"), numberWith0);
io_connect_t connect = getEVSHandle(); // Found in the MachineSettings framework
if (!connect)
{
    NSLog(@"Unable to get EVS handle");
}
kern_return_t status = IOConnectSetCFProperties(connect, propertyDict);
if (status != KERN_SUCCESS)
{
    NSLog(@"Unable to set IO properties");
}
CFRelease(propertyDict);
CFPreferencesSetValue(CFSTR("com.apple.trackpad.fourFingerHorizSwipeGesture"), numberWith0, kCFPreferencesAnyApplication, kCFPreferencesCurrentUser, kCFPreferencesCurrentHost);
CFPreferencesSetValue(CFSTR("TrackpadFourFingerHorizSwipeGesture"), numberWith0, CFSTR("com.apple.driver.AppleBluetoothMultitouch.trackpad"), kCFPreferencesCurrentUser, kCFPreferencesCurrentHost);
CFPreferencesSynchronize(kCFPreferencesAnyApplication, kCFPreferencesCurrentUser, kCFPreferencesCurrentHost);
status = BSKernelPreferenceChanged(CFSTR("com.apple.driver.AppleBluetoothMultitouch.trackpad"));
In this case it appears to work: there are no errors, and the option becomes disabled in the system preference pane. However, the four-finger gesture continues to work. I suspect that logging out and back in would have an effect, but I haven't tried, because that's not good enough in any case.
It's worth noting that the pref pane itself also calls BSKernelPreferenceChanged, but I don't know which framework it lives in, in order to link to it. Perhaps that's the key to the problem...
UPDATE: Actually, I've now found it and linked to it. Adding that call made no difference, although it returns 1, which may indicate an error. I've added the call to the code above.
Finally I tried this from the terminal:
defaults write -globalDomain com.apple.trackpad.fourFingerHorizSwipeGesture 0
defaults write com.apple.driver.AppleBluetoothMultitouch.trackpad TrackpadFourFingerHorizSwipeGesture 0
That doesn't have an immediate effect either.
I don't believe that this isn't possible; there must be a way...
MAS compatibility is not required.
I'm also trying to do this.
Event taps do not work, and neither does having a view that is first responder.
From Apple docs:
However, there are certain system-wide gestures, such as a four-finger swipe, for which the system implementation takes precedence over any gesture handling an application performs.
The only way I've been able to stop the system-wide gestures is using CGDisplayCapture. This gives my application exclusive access to all events... but also a fullscreen drawing context.
Perhaps it's possible to see what calls are made to Quartz Event Services when entering this mode:
https://developer.apple.com/library/mac/#documentation/graphicsimaging/Conceptual/QuartzDisplayServicesConceptual/Articles/DisplayCapture.html
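For reference, capturing and releasing the display is just a pair of calls (with the fullscreen-context trade-off described above):
// Take exclusive control of the main display; system-wide gestures stop
// reaching other applications while the capture is held.
if (CGDisplayCapture(CGMainDisplayID()) == kCGErrorSuccess) {
    // ... the app now owns the display (and a fullscreen drawing context) ...
    CGDisplayRelease(CGMainDisplayID());
}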
I think you are looking in the wrong spot for disabling the touch events. The way OS X (and many other systems) works is that the first responder in the view chain to handle the event stops the event from propagating. You will need to write event handlers in your views for each of the touch events you want to handle; if they exist, the OS will stop sending the events all the way to Finder or whatever other application is next in line to handle the touch events.
See: http://developer.apple.com/library/mac/#documentation/cocoa/conceptual/EventOverview/HandlingTouchEvents/HandlingTouchEvents.html
Specifically: Handling Multi-Touch Events (call setAcceptsTouchEvents, then implement touches...WithEvent...)
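A minimal sketch of that setup, using a made-up NSView subclass; note that whether this actually blocks the system-wide gestures is exactly what is disputed above:
// Hypothetical view that opts into multi-touch and handles touches itself,
// so they stop propagating further up the responder chain.
@interface GestureEatingView : NSView
@end

@implementation GestureEatingView

- (instancetype)initWithFrame:(NSRect)frameRect {
    if ((self = [super initWithFrame:frameRect])) {
        [self setAcceptsTouchEvents:YES];
    }
    return self;
}

- (void)touchesBeganWithEvent:(NSEvent *)event {
    // Handle (and thereby consume) the touches here instead of calling super.
}

- (void)touchesMovedWithEvent:(NSEvent *)event {}
- (void)touchesEndedWithEvent:(NSEvent *)event {}
- (void)touchesCancelledWithEvent:(NSEvent *)event {}

@end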
Hope that helps!
I'm trying to use CGAssociateMouseAndMouseCursorPosition(NO) in a program. This disconnects the mouse from the on screen cursor when your application is "in the foreground". Unfortunately it also disconnects it when Mission Control or the application switcher or who knows what else comes up.
So far I know:
The application is still active.
The window is still key.
Nothing is sent to the default notification center when these things come up.
The application stops receiving mouse-moved events, but an NSEvent addGlobalMonitorForEventsMatchingMask:handler: also does not receive them, which is strange to say the least. It should receive any events not delivered to my application. (I was planning to detect the missing events to know when to associate the mouse again.)
So, is there a way to detect when my application is no longer in control, specifically because Mission Control or the app switcher has taken over? They really expect the mouse to work, and I need to restore that association for them.
I share your surprise that a global event monitor isn't seeing the events. In a similar situation, I used a Quartz event tap for this purpose. The Cocoa global event monitor is quite similar to event taps, so I figured it would work.
I put the tap on kCGAnnotatedSessionEventTap and compared the result from CGEventGetIntegerValueField(event, kCGEventTargetUnixProcessID) to getpid() to determine when the events were going to another app (e.g. Mission Control or Exposé). (I disable the tap when my app resigns active status, so it should only receive events destined for another app when this sort of overlay UI is presented.)
By the way, you mentioned monitoring the default notification center, but, if there's a notification about Mission Control or the like, it's more likely to come to the distributed notification center (NSDistributedNotificationCenter). So, it's worth checking that.
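In Objective-C, the relevant check inside the tap callback boils down to something like this (the callback name is arbitrary):
#include <unistd.h>

// Callback for a kCGAnnotatedSessionEventTap: if the event is addressed to a
// different process while we believe we're active, an overlay such as
// Mission Control or Exposé is probably in front.
CGEventRef overlayDetectingCallback(CGEventTapProxy proxy, CGEventType type,
                                    CGEventRef event, void *refcon) {
    pid_t targetPid = (pid_t)CGEventGetIntegerValueField(event, kCGEventTargetUnixProcessID);
    if (targetPid != getpid()) {
        // Another app is receiving events: a good moment to call
        // CGAssociateMouseAndMouseCursorPosition(true) again.
    }
    return event;
}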
I needed to check whether Mission Control is active and ended up with an approach along the lines of Ken's answer.
Sharing is caring so here is the smallest sensible complete code that worked for me: (Swift 5)
import Foundation
import AppKit

let dockPid = NSRunningApplication.runningApplications(withBundleIdentifier: "com.apple.dock").first?.processIdentifier
var eventTargetPid: Int32?

let eventTap = CGEvent.tapCreate(
    tap: .cgAnnotatedSessionEventTap,
    place: .headInsertEventTap,
    options: .listenOnly,
    eventsOfInterest: CGEventMask(
        (1 << CGEventType.mouseMoved.rawValue)
        | (1 << CGEventType.keyDown.rawValue)
    ),
    callback: { (tapProxy, type, event, _: UnsafeMutableRawPointer?) -> Unmanaged<CGEvent>? in
        // Now, each time the mouse moves or a key goes down, this var will receive the event's target pid
        eventTargetPid = Int32(event.getIntegerValueField(.eventTargetUnixProcessID))
        return nil
    },
    userInfo: nil
)!

// Add the event tap to our run loop
CFRunLoopAddSource(
    CFRunLoopGetCurrent(),
    CFMachPortCreateRunLoopSource(kCFAllocatorDefault, eventTap, 0),
    .commonModes
)

let periodSeconds = 1.0

// Add a timer for periodic checking
CFRunLoopAddTimer(CFRunLoopGetCurrent(), CFRunLoopTimerCreateWithHandler(
    kCFAllocatorDefault,
    CFAbsoluteTimeGetCurrent() + periodSeconds, periodSeconds, 0, 0,
    { timer in
        guard eventTargetPid != dockPid else {
            print("Dock")
            return
        }
        print("Not dock")
        // Do things. This code will not run if the Dock is getting events, which
        // seems to always be the case if Mission Control or the command switcher are active.
    }), .commonModes)

CFRunLoopRun()
This simply checks whether the dock was the one to receive the last event of interest (here that includes mouse movement and key-downs).
It covers most cases, but will report the wrong value between the command switcher or mission-control hiding and the first event being sent to a non-dock app. This is fine in my use-case but could be an issue for other ones.
Also, of course, when the dock at the bottom is active, this will detect that too.
Have you tried asking NSRunningApplication?
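To spell that suggestion out a little, a sketch of what asking NSRunningApplication (and NSWorkspace) could look like; whether these values actually change while Mission Control is up is the thing to test:
// Poll what the system reports about activation and the frontmost app.
NSRunningApplication *current = [NSRunningApplication currentApplication];
NSRunningApplication *front = [[NSWorkspace sharedWorkspace] frontmostApplication];
NSLog(@"still active: %d, frontmost app: %@", current.active, front.bundleIdentifier);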
I'm working on a typing-tutor application for Mac OS X that needs to have keystrokes forwarded to it, even when the application is not in focus.
Is there a way to have the system forward keystrokes to the app, possibly through NSDistributedNotificationCenter? I've googled myself silly, and haven't been able to find an answer...
EDIT: Sample code below.
Thanks @NSGod for pointing me in the right direction -- I ended up adding a global event monitor using the method addGlobalMonitorForEventsMatchingMask:handler:, which works beautifully. For completeness, my implementation looks like this:
// register for keys throughout the device...
[NSEvent addGlobalMonitorForEventsMatchingMask:NSKeyDownMask
                                       handler:^(NSEvent *event) {
    NSString *chars = [[event characters] lowercaseString];
    unichar character = [chars characterAtIndex:0];
    NSLog(@"keydown globally! Which key? This key: %c", character);
}];
For me, the tricky part was using blocks, so I'll give a little description in case it helps anyone:
The thing to notice about the above code is that it's all one single method call on NSEvent. The block is supplied as an argument, directly to the function. You could think of it kind of like an inline delegate method. Just because this took a while to sink in for me, I'm going to work through it step by step here:
[NSEvent addGlobalMonitorForEventsMatchingMask:NSKeyDownMask
This first bit is no problem. You're calling a class method on NSEvent, and telling it which event you're looking to monitor, in this case NSKeyDownMask. A list of masks for supported event types can be found here.
Now, we come to the tricky part: handler, which expects a block:
handler:^(NSEvent *event){
It took me a few compile errors to get this right, but (thank you, Apple) they were very constructive error messages. The first thing to notice is the caret ^. That signals the start of the block. After that, within the parentheses,
NSEvent *event
which declares the variable that you'll be using within the block to capture the event. You could just as well call it
NSEvent *someCustomNameForAnEvent
The name doesn't matter; you'll just be using it within the block. Then that's just about all there is to it. Make sure to close your curly brace, and the bracket to finish the method call:
}];
And you're done! This really is kind of a 'one-liner'. It doesn't matter where you execute this call within your app -- I do it in the AppDelegate's applicationDidFinishLaunching method. Then, within the block, you can call other methods from within your app.
If you are okay with a minimum requirement of OS X 10.6+, and can suffice with "read-only" access to the stream of events, you can install a global event monitor in Cocoa:
Cocoa Event-Handling Guide: Monitoring Events.
If you need to support OS X 10.5 and earlier, and read-only access is okay, and don't mind working with the Carbon Event Manager, you can basically do the Carbon-equivalent using GetEventMonitorTarget(). (You will be hard-pressed to find any (official) documentation on that method though). That API was first available in OS X 10.3, I believe.
If you need read-write access to the event stream, then you will need to look at a slightly lower-level API that is part of ApplicationServices > CoreGraphics: CGEventTapCreate() and friends. This was first available in 10.4.
Note that all 3 methods will require that the user have "Enable access for assistive devices" enabled in the System Preferences > Universal Access preference pane (at least for key events).
I'm posting the code that worked for my case.
I'm adding the global event handler after the app launches. My shortcut makes ctrl+alt+cmd+T open my app.
- (void)applicationWillFinishLaunching:(NSNotification *)aNotification
{
    // Register global key handler, passing a block as a callback function
    [NSEvent addGlobalMonitorForEventsMatchingMask:NSKeyDownMask
                                           handler:^(NSEvent *event) {
        // Activate app when pressing cmd+ctrl+alt+T
        // (1835305 is the raw modifierFlags value observed for that combination)
        if ([event modifierFlags] == 1835305 && [[event charactersIgnoringModifiers] compare:@"t"] == 0) {
            [NSApp activateIgnoringOtherApps:YES];
        }
    }];
}
The issue I find with this is that any key registered globally by another app will not be caught... or at least in my case; perhaps I am doing something wrong.
If your program needs to display all keys, like "Command-Shift-3" for example, then it will not see that go by to display it... since it is taken up by the OS.
Or did someone figure that out? I'd love to know...
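As an aside, a sketch of a less brittle comparison than the raw 1835305 value, masking off the device-dependent bits before comparing:
// Compare only the device-independent modifier bits so device-specific flags
// (e.g. left vs. right modifier distinctions) don't break the match.
NSUInteger flags = [event modifierFlags] & NSDeviceIndependentModifierFlagsMask;
BOOL isShortcut = (flags == (NSCommandKeyMask | NSControlKeyMask | NSAlternateKeyMask))
               && [[event charactersIgnoringModifiers] isEqualToString:@"t"];
if (isShortcut) {
    [NSApp activateIgnoringOtherApps:YES];
}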
As NSGod already pointed out, you can also use CoreGraphics.
In your class (e.g. in -init):
CFRunLoopRef runloop = CFRunLoopGetCurrent();
// Note: the mask must be a CGEventMask such as CGEventMaskBit(kCGEventKeyDown),
// not the NSKeyDown event-type constant.
CGEventMask interestedEvents = CGEventMaskBit(kCGEventKeyDown);
CFMachPortRef eventTap = CGEventTapCreate(kCGSessionEventTap, kCGHeadInsertEventTap,
                                          kCGEventTapOptionDefault, interestedEvents,
                                          myCGEventCallback, (__bridge void *)self);
// by passing self as the last argument, you can later send events to this class instance
CFRunLoopSourceRef source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault,
                                                          eventTap, 0);
CFRunLoopAddSource(runloop, source, kCFRunLoopCommonModes);
CFRunLoopRun();
Outside of the class, but in the same .m file:
CGEventRef myCGEventCallback(CGEventTapProxy proxy,
                             CGEventType type,
                             CGEventRef event,
                             void *refcon)
{
    if (type == kCGEventKeyDown)
    {
        // we convert our event into plain unicode
        UniChar myUnichar[2];
        UniCharCount actualLength;
        UniCharCount outputLength = 1;
        CGEventKeyboardGetUnicodeString(event, outputLength, &actualLength, myUnichar);
        // do something with the key
        NSLog(@"Character: %c", *myUnichar);
        NSLog(@"Int Value: %i", *myUnichar);
        // you can now also call your class instance with refcon
        [(__bridge id)refcon sendUniChar:*myUnichar];
    }
    // send event to next application
    return event;
}