How can one detect Mission Control or the Command-Tab switcher superseding one's program in OS X?

I'm trying to use CGAssociateMouseAndMouseCursorPosition(NO) in a program. This disconnects the mouse from the on screen cursor when your application is "in the foreground". Unfortunately it also disconnects it when Mission Control or the application switcher or who knows what else comes up.
So far I know:
The application is still active.
The window is still key.
Nothing is sent to the default notification center when these things come up.
The application stops receiving mouse-moved events, but an NSEvent addGlobalMonitorForEventsMatchingMask:handler: monitor also does not receive them, which is strange to say the least. It should receive any events not delivered to my application. (I was planning to detect the missing events to know when to re-associate the mouse.)
So, is there a way to detect when my application is no longer in control, specifically because Mission Control or the switcher has taken over? Those overlays really expect the mouse to work, and I need to restore that association for them.

I share your surprise that a global event monitor isn't seeing the events. In a similar situation, I used a Quartz event tap. The Cocoa global event monitor is quite similar to event taps, so I figured it would work here too.
I put the tap on kCGAnnotatedSessionEventTap and compared the result of CGEventGetIntegerValueField(event, kCGEventTargetUnixProcessID) to getpid() to determine when the events were going to another app (e.g. Mission Control or Exposé). (I disable the tap when my app resigns active status, so it should only receive events destined for another app when this sort of overlay UI is presented.)
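In C, the shape of that tap is roughly the following (a sketch, not the exact code I used; the re-association is left as a comment):

#include <ApplicationServices/ApplicationServices.h>
#include <unistd.h>

// Listen-only tap on the annotated session tap; compare each event's
// target pid against our own to spot overlay UI receiving the events.
static CGEventRef TapCallback(CGEventTapProxy proxy, CGEventType type,
                              CGEventRef event, void *userInfo)
{
    pid_t target = (pid_t)CGEventGetIntegerValueField(event, kCGEventTargetUnixProcessID);
    if (target != getpid())
    {
        // Events are going to another app (e.g. Mission Control);
        // re-associate the mouse and cursor here.
    }
    return event;
}

static void InstallOverlayTap(void)
{
    CFMachPortRef tap = CGEventTapCreate(kCGAnnotatedSessionEventTap,
                                         kCGHeadInsertEventTap,
                                         kCGEventTapOptionListenOnly,
                                         CGEventMaskBit(kCGEventMouseMoved),
                                         TapCallback, NULL);
    if (tap == NULL)
        return;
    CFRunLoopSourceRef source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
    CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);
    CFRelease(source);
}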
By the way, you mentioned monitoring the default notification center, but if there's a notification about Mission Control or the like, it's more likely to arrive on the distributed notification center (NSDistributedNotificationCenter). So, it's worth checking that.
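If you want to see whether anything shows up there, a throwaway observer that logs every distributed notification is the quickest check (a debugging sketch; observing with a nil name is expensive, so don't ship it):

#import <Foundation/Foundation.h>

@interface NotificationSpy : NSObject
@end

@implementation NotificationSpy
- (instancetype)init
{
    if ((self = [super init]))
    {
        // nil name + nil object = observe every distributed notification.
        [[NSDistributedNotificationCenter defaultCenter] addObserver:self
                                                            selector:@selector(logNotification:)
                                                                name:nil
                                                              object:nil];
    }
    return self;
}

- (void)logNotification:(NSNotification *)note
{
    NSLog(@"distributed notification: %@", note.name);
}
@end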

I needed to check for Mission Control being active and ended up with an approach along the lines of Ken's answer.
Sharing is caring so here is the smallest sensible complete code that worked for me: (Swift 5)
import Foundation
import AppKit

let dockPid = NSRunningApplication.runningApplications(withBundleIdentifier: "com.apple.dock").first?.processIdentifier
var eventTargetPid: Int32?

let eventTap = CGEvent.tapCreate(
    tap: .cgAnnotatedSessionEventTap,
    place: .headInsertEventTap,
    options: .listenOnly,
    eventsOfInterest: CGEventMask(
        (1 << CGEventType.mouseMoved.rawValue)
        | (1 << CGEventType.keyDown.rawValue)
    ),
    callback: { (tapProxy, type, event, _: UnsafeMutableRawPointer?) -> Unmanaged<CGEvent>? in
        // Now, each time the mouse moves this var will receive the event's target pid
        eventTargetPid = Int32(event.getIntegerValueField(.eventTargetUnixProcessID))
        return nil
    },
    userInfo: nil
)!

// Add the event tap to our runloop
CFRunLoopAddSource(
    CFRunLoopGetCurrent(),
    CFMachPortCreateRunLoopSource(kCFAllocatorDefault, eventTap, 0),
    .commonModes
)

let periodSeconds = 1.0

// Add a timer for periodic checking
CFRunLoopAddTimer(CFRunLoopGetCurrent(), CFRunLoopTimerCreateWithHandler(
    kCFAllocatorDefault,
    CFAbsoluteTimeGetCurrent() + periodSeconds, periodSeconds, 0, 0,
    { timer in
        guard eventTargetPid != dockPid else {
            print("Dock")
            return
        }
        print("Not dock")
        // Do things. This code will not run if the dock is getting events,
        // which seems to always be the case if mission control or the
        // command switcher are active.
    }), .commonModes)

CFRunLoopRun()
This simply checks whether the Dock was the last to receive an event of interest (here, that includes mouse movement and key-downs).
It covers most cases, but it will report the wrong value in the window between the command switcher or Mission Control hiding and the first event being delivered to a non-Dock app. That's fine in my use case but could be an issue for others.
Also, of course, when the Dock at the bottom of the screen is active, this will detect that too.

Have you tried asking NSRunningApplication?
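For anyone who wants to try that route, a minimal sketch; note the questioner reports the app still counts as active while the overlay is up, so whether the frontmost application actually changes is something you'd have to verify:

#import <AppKit/AppKit.h>

// Check whether the Dock (which implements Mission Control and the
// app switcher) is currently the frontmost application.
static BOOL DockIsFrontmost(void)
{
    NSRunningApplication *front = [[NSWorkspace sharedWorkspace] frontmostApplication];
    return [front.bundleIdentifier isEqualToString:@"com.apple.dock"];
}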

Related

What has changed? Wake Windows and turn on monitor from Windows API

I have an old C program for displaying caller ID called YAC. Fortunately, the author Jensen Harris provided the source.
15 years ago, I modified the source to turn on the monitor if the computer was awake but the monitor was off. The code below worked well, turning on the monitor and making the caller ID message visible on the screen.
// TG - add a call to turn on the monitor if it is sleeping.....
SendMessage(hwnd, WM_SYSCOMMAND, SC_MONITORPOWER, -1); // lParam -1 = monitor on
Recently the behavior has changed, presumably because a Windows update changed something.
Now when a Caller ID message should be displayed, the monitor turns on (as evidenced by the LED), but the screen remains black. The monitor remains in the black-screen condition for a few seconds, then turns off again.
What additional or different call is now required to cause Windows to activate the display and show the desktop? Possibly this could be forced by sending a mouse move, but is there a better way?
EDIT:
I have implemented the following additional code to press and release ESC. I was unable to find a good example of a relative mouse move of one pixel, so I used a keyboard example instead (a mouse-move version is sketched after the code). I will test and see if it is effective.
INPUT inputs[2];
UINT uSent;
// reference: https://learn.microsoft.com/en-us/windows/win32/api/winuser/nf-winuser-sendinput
ZeroMemory(inputs, sizeof(inputs));
inputs[0].type = INPUT_KEYBOARD;
inputs[0].ki.wVk = VK_ESCAPE;
inputs[1].type = INPUT_KEYBOARD;
inputs[1].ki.wVk = VK_ESCAPE;
inputs[1].ki.dwFlags = KEYEVENTF_KEYUP;
uSent = SendInput(ARRAYSIZE(inputs), inputs, sizeof(INPUT));
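For reference, the relative one-pixel mouse move would presumably look something like this (an untested sketch using the same SendInput API):

INPUT moves[2];
// Nudge the cursor one pixel right, then back, as synthetic user activity.
// Relative moves use MOUSEEVENTF_MOVE without the MOUSEEVENTF_ABSOLUTE flag.
ZeroMemory(moves, sizeof(moves));
moves[0].type = INPUT_MOUSE;
moves[0].mi.dx = 1;
moves[0].mi.dy = 0;
moves[0].mi.dwFlags = MOUSEEVENTF_MOVE;
moves[1] = moves[0];
moves[1].mi.dx = -1; // move back (pointer acceleration may make this inexact)
SendInput(ARRAYSIZE(moves), moves, sizeof(INPUT));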
EDIT2 - I can confirm this approach does work to make the monitor display video, but of course it has the potential for side effects, as any synthetic keyboard or mouse action would. I would still be interested in learning of a pure API function that fully wakes the system the way SC_MONITORPOWER used to.
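One candidate for such a call is SetThreadExecutionState, which resets the display idle timer; whether it also brings a panel back from the newer soft-off state is exactly the behavior that seems to have changed, so treat this as something to test rather than a confirmed fix:

#include <windows.h>

// One-shot reset of the display idle timer (no ES_CONTINUOUS), asking
// Windows to turn on / keep on the display.
SetThreadExecutionState(ES_DISPLAY_REQUIRED);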

How to check for Command-Period in a tight loop?

I'm implementing a scripting language where the user might be causing an endless loop by accident. I want to give the user the opportunity to cancel such a runaway loop by holding down the command key while typing the period (".") key.
Currently, once for every line, I check for cancellation with this code:
NSEvent *evt = [[NSApplication sharedApplication] nextEventMatchingMask: NSKeyDownMask
                                                              untilDate: [NSDate date]
                                                                 inMode: WILDScriptExecutionEventLoopMode
                                                                dequeue: YES];
if (evt)
{
    NSString *theKeys = [evt charactersIgnoringModifiers];
    if ((evt.modifierFlags & NSCommandKeyMask) && theKeys.length > 0 && [theKeys characterAtIndex: 0] == '.')
    {
        // +++ cancel script execution here.
    }
}
The problem with this is that it eats any keyboard events the user might be typing while the script is running, even though scripts should be able to check for keypresses. It also doesn't dequeue the corresponding NSKeyUp events. But if I tell it to dequeue key-up events as well, it might dequeue the key-up for a key that was pressed before my script started, and my application might never find out the key was released.
Also, I would like to not dequeue any events until I know one is actually a cancel event, but there is no separate dequeue call, and it feels unreliable to just assume the frontmost event on a second call will be the same one. And even if it is guaranteed to be the first, the user typing an 'a' and then Cmd-. would mean I only ever see the 'a' and never the Cmd-. behind it if I don't dequeue events.
Is there a better option than going to the old Carbon stand-by GetKeys()? Fortunately, that seems to be available in 64 bit.
Also, I'm thinking about adding an NSStatusItem that adds a button to cancel the script to the menu bar or so. But how would I process events in a way that doesn't let the user e.g. select a menu while a script expects to be ruler of the main thread?
Any suggestions? Recommendations?
Using -addLocalMonitorForEventsMatchingMask: as Dave suggests is probably the easiest way to go about this, yes.
I just wanted to add that, despite your feeling that it's unreliable, the event queue really is a queue, and events don't change order. It is perfectly safe (and standard practice in event loops) to call -nextEventMatchingMask:untilDate:inMode:dequeue: with dequeue:NO, examine the event, determine that it is one you want to deal with, and then call it again with dequeue:YES in order to consume it. Just make sure that your mask and mode are identical between the two calls.
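Applied to the code from the question, that peek-then-consume pattern looks roughly like this (a sketch reusing the question's own mask and mode):

NSEvent *evt = [NSApp nextEventMatchingMask: NSKeyDownMask
                                  untilDate: [NSDate date]
                                     inMode: WILDScriptExecutionEventLoopMode
                                    dequeue: NO]; // peek without consuming
if (evt)
{
    NSString *theKeys = [evt charactersIgnoringModifiers];
    if ((evt.modifierFlags & NSCommandKeyMask) && theKeys.length > 0 && [theKeys characterAtIndex: 0] == '.')
    {
        // It's our cancel event: consume it now, then stop the script.
        [NSApp nextEventMatchingMask: NSKeyDownMask
                           untilDate: [NSDate date]
                              inMode: WILDScriptExecutionEventLoopMode
                             dequeue: YES];
        // +++ cancel script execution here.
    }
}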
I would suggest using an event monitor. Since you're asking NSApp for events, it would seem that you're running the script in the current process, so you only have to monitor events in your own process (and not globally).
There are several ways to do this (subclassing NSApplication and overriding -sendEvent:, putting in an event tap, etc), but the easiest way to do this would be with a local event monitor:
id eventHandler = [NSEvent addLocalMonitorForEventsMatchingMask: NSKeyDownMask
                                                        handler: ^NSEvent * (NSEvent *event) {
    // check the runloop mode
    // check for cmd-.
    // abort the script if necessary
    return event;
}];
When you're all done monitoring for events, don't forget to unregister your monitor:
[NSEvent removeMonitor:eventHandler];
So there's +[NSEvent modifierFlags] which is intended as a replacement for GetKeys(). It doesn't cover your use case of the period key though, sadly.
The core problem here is that you want to search the event queue, which isn't something the API exposes. The only workaround I can think of is to dequeue all events into an array, checking for a Command-. event, and then re-queue them all using postEvent:atStart:. Not pretty.
Perhaps as an optimisation you could use +[NSEvent modifierFlags] to only check the event queue when the command key is held down, but that sounds open to race conditions to me.
So, final suggestion: override -postEvent:atStart: (on either NSApplication or NSWindow) and see if you can fish out the desired info there (sketched below). I think at worst it could be interesting for debugging.
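A sketch of that last suggestion, assuming an NSApplication subclass set as the principal class (hypothetical class name; whether window-server events actually pass through here is the thing to test):

@interface ScriptApplication : NSApplication
@end

@implementation ScriptApplication
// Inspect events as they are enqueued; set a flag the interpreter polls.
- (void)postEvent:(NSEvent *)event atStart:(BOOL)atStart
{
    if (event.type == NSKeyDown
        && (event.modifierFlags & NSCommandKeyMask)
        && [[event charactersIgnoringModifiers] isEqualToString:@"."])
    {
        // flag the script loop to cancel
    }
    [super postEvent:event atStart:atStart];
}
@end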

Simulated MouseEvent not working properly OSX

Back in 2010, Pierre asked this question (his accepted answer doesn't work for me).
I'm having the same problem: I am able to successfully move the mouse around (and off!?!) the screen programmatically from my Cocoa application, but bringing the mouse to the location of my Dock doesn't show it (and some other applications aren't registering the mouse-moved events, e.g. games that hide the mouse).
The method I am using is thus:
void PostMouseEvent(CGMouseButton button, CGEventType type, const CGPoint point)
{
    CGEventRef theEvent = CGEventCreateMouseEvent(NULL, type, point, button);
    CGEventSetType(theEvent, type);
    CGEventPost(kCGSessionEventTap, theEvent);
    CFRelease(theEvent);
}
And then when I want to move the mouse I run:
PostMouseEvent(0, kCGEventMouseMoved, mouseLocation);
Note that this code DOES generate mouseover events for things such as links.
Now that it's 2013, is it possible to fix this issue?
Thanks for your time!
I would both warp the cursor and generate the mouse-move event; a sketch of doing both follows. I know from experience, for example, that warping the cursor, while it doesn't generate an event itself, modifies the subsequent mouse-move event to include the moved distance in its mouse delta. I don't know if your synthesized move event will include the proper delta values on its own.
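Concretely, that combination might look like this (a sketch; the delta fields are filled in by hand, since a posted event may not compute them for you):

#include <ApplicationServices/ApplicationServices.h>

// Warp the hardware cursor, then post a mouse-moved event whose delta
// fields reflect the distance travelled.
void MoveMouseTo(CGPoint from, CGPoint to)
{
    CGWarpMouseCursorPosition(to);

    CGEventRef move = CGEventCreateMouseEvent(NULL, kCGEventMouseMoved, to, kCGMouseButtonLeft);
    CGEventSetIntegerValueField(move, kCGMouseEventDeltaX, (int64_t)(to.x - from.x));
    CGEventSetIntegerValueField(move, kCGMouseEventDeltaY, (int64_t)(to.y - from.y));
    CGEventPost(kCGSessionEventTap, move);
    CFRelease(move);
}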
OK, so evidently OS X needs the mouse to be at exactly the edge of the screen for the Dock to show!
Because I keep my Dock on the left side of the screen (due to many programs keeping vital buttons at the bottom of their windows), all I had to do was say
if (mouseLocation.x < 0)
{
    mouseLocation.x = 0;
}
And it worked!
I am also using Ken Thomases' idea to warp the cursor as well.
(This answer is marked correct as it allows me to show the Dock; however, there are still some applications that are not responding to mouse input.)

OSX - disabling system-wide touch gestures

I need to programmatically disable/suppress system-wide touch gestures on Mac OS. I'm referring to gestures such as the 4-finger swipe between spaces, etc.
I've looked at event taps, but that doesn't appear to be an option (despite previous reports here; perhaps it's changed under 10.8).
I've also tried numerous ways of changing the system preferences programmatically. For example, I've tried using IOConnectSetCFProperties on the service, having located it using IORegistryEntryCreateCFProperties.
I've also delved into the trackpad preference pane to see how they do it, and I tried to reproduce it (ignore any create/release inconsistencies, this is just test code):
NSInteger zero = 0;
CFNumberRef numberWith0 = CFNumberCreate(kCFAllocatorDefault, kCFNumberNSIntegerType, &zero);
CFMutableDictionaryRef propertyDict = CFDictionaryCreateMutable(kCFAllocatorDefault, 0,
                                                                &kCFTypeDictionaryKeyCallBacks,
                                                                &kCFTypeDictionaryValueCallBacks);
CFDictionarySetValue(propertyDict, @"TrackpadFourFingerHorizSwipeGesture", numberWith0);
io_connect_t connect = getEVSHandle(); // Found in the MachineSettings framework
if (!connect)
{
    NSLog(@"Unable to get EVS handle");
}
kern_return_t status = IOConnectSetCFProperties(connect, propertyDict);
if (status != KERN_SUCCESS)
{
    NSLog(@"Unable to set IO properties");
}
CFRelease(propertyDict);

CFPreferencesSetValue(CFSTR("com.apple.trackpad.fourFingerHorizSwipeGesture"), numberWith0, kCFPreferencesAnyApplication, kCFPreferencesCurrentUser, kCFPreferencesCurrentHost);
CFPreferencesSetValue(CFSTR("TrackpadFourFingerHorizSwipeGesture"), numberWith0, CFSTR("com.apple.driver.AppleBluetoothMultitouch.trackpad"), kCFPreferencesCurrentUser, kCFPreferencesCurrentHost);
CFPreferencesSynchronize(kCFPreferencesAnyApplication, kCFPreferencesCurrentUser, kCFPreferencesCurrentHost);

status = BSKernelPreferenceChanged(CFSTR("com.apple.driver.AppleBluetoothMultitouch.trackpad"));
In this case it appears to work: there are no errors, and the option becomes disabled in the System Preferences pane; however, the four-finger gesture continues to work. I suspect that logging out and back in would have an effect, but I haven't tried, because that wouldn't be good enough in any case.
It's worth noting that the pref pane itself also calls BSKernelPreferenceChanged, but I didn't initially know which framework it was in so that I could link against it. Perhaps that's the key to the problem...
UPDATE: Actually, I've now found the framework and linked against it. Adding that call made no difference, although it returns 1, which may indicate an error. I've added the call to the code above.
Finally I tried this from the terminal:
defaults write -globalDomain com.apple.trackpad.fourFingerHorizSwipeGesture 0
defaults write com.apple.driver.AppleBluetoothMultitouch.trackpad TrackpadFourFingerHorizSwipeGesture 0
That doesn't have an immediate effect either.
I refuse to believe this isn't possible; there must be a way...
MAS compatibility is not required.
I'm also trying to do this.
Event taps do not work, and neither does having a view that is first responder.
From Apple docs:
However, there are certain system-wide gestures, such as a four-finger swipe, for which the system implementation takes precedence over any gesture handling an application performs.
The only way I've been able to stop the system-wide gestures is by using CGDisplayCapture. This gives my application exclusive access to all events... but also a fullscreen drawing context.
Perhaps it's possible to see what calls are made to Quartz Event Services when entering this mode:
https://developer.apple.com/library/mac/#documentation/graphicsimaging/Conceptual/QuartzDisplayServicesConceptual/Articles/DisplayCapture.html
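For reference, the capture itself is just a pair of calls; everything in between has exclusive access to events, at the cost of owning the whole screen (a minimal sketch):

#include <ApplicationServices/ApplicationServices.h>

// Take exclusive control of the main display. System gestures stop
// being handled while the display is captured.
if (CGDisplayCapture(kCGDirectMainDisplay) == kCGErrorSuccess)
{
    // ... draw and handle events in the fullscreen context ...
    CGDisplayRelease(kCGDirectMainDisplay);
}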
I think you are looking in the wrong spot for disabling the touch events. The way OS X (and many other systems) works is that the first responder in the view chain to handle an event stops that event from propagating. You will need to write event handlers in your views for each of the touch events you want to handle; if they exist, the OS will stop sending the events all the way to the Finder or whatever other application is next in line to handle the touch events.
See: http://developer.apple.com/library/mac/#documentation/cocoa/conceptual/EventOverview/HandlingTouchEvents/HandlingTouchEvents.html
Specifically: Handling Multi-Touch Events (call setAcceptsTouchEvents, then implement the touches...WithEvent... methods, as in the sketch below).
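In code, that pattern is an NSView subclass along these lines (a sketch of the documented approach; note the caveat quoted above that the system still wins for gestures like the four-finger swipe):

@interface TouchHandlingView : NSView
@end

@implementation TouchHandlingView
- (instancetype)initWithFrame:(NSRect)frame
{
    if ((self = [super initWithFrame:frame]))
    {
        [self setAcceptsTouchEvents:YES]; // opt in to multi-touch events
    }
    return self;
}

// Handling the touches here keeps them from propagating further up
// the responder chain.
- (void)touchesBeganWithEvent:(NSEvent *)event { }
- (void)touchesMovedWithEvent:(NSEvent *)event { }
- (void)touchesEndedWithEvent:(NSEvent *)event { }
@end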
Hope that helps!

CGEventCreateKeyboardEvent on Desktop vs MacBook

Hello folks,
Once again I want to drink from the pool of knowledge shared by people using SO.
I have written a small app for OS X that sends key events to an application. I am targeting OS X 10.5.x and newer; however, the problem also exists when I build for 10.6.x. Everything works fine except when I send only the modifier keys: Alt, Command, Control and Shift.
The problem is that on the two MacBooks, the events for the modifier keys appear to be cleared as soon as the testers move the cursor using the mouse or touch the touchpad.
On a desktop with Xcode installed, everything works fine, just like it should. On two different MacBooks, the problem occurs. The desktop has a standard 101-key keyboard and a multi-button mouse attached.
When a mouse is connected to the MacBooks, it is a two-button mouse with a scroll wheel. However, the problem also exists when no peripherals are attached and the touchpad is used.
What I expect to happen is that the modifier key-down event is sent to the target application; the user moves the cursor using the mouse/touchpad, presses buttons on the mouse/touchpad, and/or presses keys on the keyboard with the modifier key-down event 'active'; then, when they finish, the key-up event for the modifier key is sent.
Here is how I am sending key down events, the Shift key for this example:
case ModKeyShiftDown:
    xEventSource = CGEventSourceCreate(kCGEventSourceStatePrivate);
    xTheCommand = CGEventCreateKeyboardEvent(xEventSource, kVK_Shift, true);
    CGEventSetFlags(xTheCommand, kCGEventFlagMaskShift); // Shift flag for a Shift key event
    CGEventPost(kCGHIDEventTap, xTheCommand);
    //CGEventPost(kCGSessionEventTap, xTheCommand);
    //CGEventPost(kCGAnnotatedSessionEventTap, xTheCommand);
    CFRelease(xTheCommand);
    CFRelease(xEventSource);
    break;
I have used all three source states for creating the event source (kCGEventSourceStatePrivate, kCGEventSourceStateCombinedSessionState and kCGEventSourceStateHIDSystemState).
I have tried creating the keyboard event with the event source as well as with NULL as the first parameter.
I have tried with and without the appropriate flags on the event.
I have tried various combinations of posting the event; kCGHIDEventTap, kCGSessionEventTap and kCGAnnotatedSessionEventTap.
For completeness, here is how I send an up event for the Shift key:
case ModKeyShiftUp:
    xEventSource = CGEventSourceCreate(kCGEventSourceStatePrivate);
    xTheCommand = CGEventCreateKeyboardEvent(xEventSource, kVK_Shift, false);
    CGEventSetFlags(xTheCommand, 0);
    CGEventPost(kCGHIDEventTap, xTheCommand);
    //CGEventPost(kCGSessionEventTap, xTheCommand);
    //CGEventPost(kCGAnnotatedSessionEventTap, xTheCommand);
    CFRelease(xTheCommand);
    CFRelease(xEventSource);
    break;
When the testers trigger a modifier key down event, they can see the cursor change as expected. This lets me know the event is being processed by the target application. However, as soon as they touch the mouse or touchpad, the cursor changes back to a standard cursor and the mouse events are processed as if no modifier key events are active.
I would like to know if there is a problem with the way I am sending the events. I would also like to know if there is an alternate way to send Modifier Key events that is Going To Work.
Sorry if I overtalked this. My excuse is that I only slept a couple of hours. :P
Thanx
-isdi-
Hello,
Interestingly enough, using AXUIElementRef and AXUIElementPostKeyboardEvent seems to work.
For whatever reason, sending key events using the Accessibility object and method above solves the problem for the testers.
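For anyone else who hits this, the shape of that workaround is roughly the following (a sketch: AXUIElementPostKeyboardEvent requires the accessibility permission, and Apple later deprecated it):

#import <Carbon/Carbon.h> // for kVK_Shift
#include <ApplicationServices/ApplicationServices.h>

// Post a Shift key-down directly to one application through the
// accessibility API instead of a system-wide event tap.
void PostShiftDown(pid_t targetPid)
{
    AXUIElementRef app = AXUIElementCreateApplication(targetPid);
    if (app != NULL)
    {
        AXUIElementPostKeyboardEvent(app, 0, (CGKeyCode)kVK_Shift, true);
        CFRelease(app);
    }
}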
