I'm trying to use CGEventTapCreate to monitor global mouse clicks, but when I do this it seems to block interaction with my own app. Mouse clicks in other running apps work fine, but my own app (the DemoAppDelegate app) stops responding properly. I can drag the app's main window, but the red/yellow/green window buttons are greyed out, and the DemoApp's menu is unclickable as well.
This seems really strange to me, and I've been unable to figure it out. Examples of using event taps are few and far between, so any advice is greatly appreciated.
#import "DemoAppDelegate.h"

CGEventRef myCGEventCallback(CGEventTapProxy proxy, CGEventType type, CGEventRef event, void *refcon) {
    CGPoint location = CGEventGetLocation(event);
    NSLog(@"location: (%f, %f) - %@\n", location.x, location.y, (NSString *)refcon);
    return event;
}

@implementation DemoAppDelegate

@synthesize window;

- (void)applicationDidFinishLaunching:(NSNotification *)aNotification {
    CFMachPortRef eventTap;
    CGEventMask eventMask;
    CFRunLoopSourceRef runLoopSource;

    eventMask = 1 << kCGEventLeftMouseDown;
    eventTap = CGEventTapCreate(kCGSessionEventTap, kCGHeadInsertEventTap,
                                1, eventMask, myCGEventCallback, @"mydata");
    runLoopSource = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, eventTap, 0);
    CFRunLoopAddSource(CFRunLoopGetCurrent(), runLoopSource, kCFRunLoopCommonModes);
    CGEventTapEnable(eventTap, true);
    CFRunLoopRun();
}

@end
When you create a Cocoa application, -[NSApplication run] is responsible for running the event loop — it runs the run loop, and dispatches events. This means that you should remove that
CFRunLoopRun();
call at the bottom of your -applicationDidFinishLaunching: method implementation, since it prevents -applicationDidFinishLaunching: from returning and also prevents NSApplication from dispatching events.
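With that change, the rest of the tap setup can stay in -applicationDidFinishLaunching: as is. A minimal sketch of the corrected method (keeping the asker's listen-only option value and refcon, just dropping the run-loop call):

```objc
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification {
    CGEventMask eventMask = CGEventMaskBit(kCGEventLeftMouseDown);
    CFMachPortRef eventTap = CGEventTapCreate(kCGSessionEventTap, kCGHeadInsertEventTap,
                                              kCGEventTapOptionListenOnly, eventMask,
                                              myCGEventCallback, @"mydata");
    CFRunLoopSourceRef runLoopSource =
        CFMachPortCreateRunLoopSource(kCFAllocatorDefault, eventTap, 0);
    CFRunLoopAddSource(CFRunLoopGetCurrent(), runLoopSource, kCFRunLoopCommonModes);
    CGEventTapEnable(eventTap, true);
    // No CFRunLoopRun() here: -[NSApplication run] already drives the main run loop,
    // so the tap's run loop source will be serviced normally.
}
```

(The constant kCGEventTapOptionListenOnly has the value 1 that the original code passed literally.)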
Related
I am trying to bring an NSRunningApplication* instance to the foreground and inject a keyboard event.
NSRunningApplication* app = ...;
[app activateWithOptions: 0];
inject_keystrokes();
... fails to inject keyboard events, but:
NSRunningApplication* app = ...;
[app activateWithOptions: 0];
dispatch_time_t _100ms = dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.1 * NSEC_PER_SEC));
dispatch_after(_100ms,
               dispatch_get_main_queue(),
               ^{ inject_keystrokes(); });
... succeeds.
I imagine it takes a certain amount of time for the window to render in the foreground, and maybe this happens on a separate thread, and this explains the injection failure.
However this is a very ugly solution. It relies on an arbitrary time interval.
It would be much cleaner to somehow wait for the window to complete foregrounding.
Is there any way of doing this?
PS inject_keystrokes() uses CGEventPost(kCGHIDEventTap, someCGEvent)
PPS Refs:
- Virtual keypress goes to wrong application
- Send NSEvent to background app
- http://advinprog.blogspot.com/2008/06/so-you-want-to-post-keyboard-event-in.html
Adding an observer for the KVO property isActive on NSRunningApplication works for me.
for (NSRunningApplication* ra in [[NSWorkspace sharedWorkspace] runningApplications])
{
    if ([ra.bundleIdentifier isEqualToString:@"com.apple.TextEdit"])
    {
        [ra addObserver:self forKeyPath:@"isActive" options:0 context:ra];
        [ra retain];
        [ra activateWithOptions:0];
    }
}

// ...

- (void)observeValueForKeyPath:(NSString*)keyPath ofObject:(id)object change:(NSDictionary*)change context:(void*)context
{
    if ([keyPath isEqualToString:@"isActive"])
    {
        NSRunningApplication* ra = (NSRunningApplication*) context;
        [ra removeObserver:self forKeyPath:@"isActive"];
        [ra release];
        inject_keystrokes();
    }
}
Note that I manually retain and then release the NSRunningApplication to keep its reference alive, since I'm not keeping it in a property or ivar. You have to be careful that the reference doesn't get dropped with the observer still attached.
I am trying to detect and use a gamepad attached to my Mac in my game.
I am using the IOKit for this but without success.
The gamepad is recognized as I downloaded a Mac app that views all gamepads attached.
It is a Logitech F310 and I have set it to DirectInput.
I am calling setupInput inside awakeFromNib but none of the callbacks are ever called.
There is no error or exception thrown.
I have no idea why the callbacks are not called.
What am I doing wrong?
I am thinking maybe it's an issue with the run loop. I tried both CFRunLoopGetMain and CFRunLoopGetCurrent.
Not sure what else to do.
Edit: As suggested by someone, I tried putting the same gamepad code in a new project, and there it worked.
There are some differences in the structure of the classes, but I couldn't make it work in my app even when I tried to replicate them.
My app uses an AppDelegate and an NSOpenGLView; strangely enough it doesn't have an NSViewController.
The minimal app uses NSOpenGLView but also NSResponder, and when I put the gamepad code inside the (custom) NSResponder class it works (detects the gamepad and responds to input).
I tried to add that custom class to my app, but it didn't work.
What am I missing?
void gamepadWasAdded(void* inContext, IOReturn inResult, void* inSender, IOHIDDeviceRef device) {
    NSLog(@"Gamepad was plugged in");
}

void gamepadWasRemoved(void* inContext, IOReturn inResult, void* inSender, IOHIDDeviceRef device) {
    NSLog(@"Gamepad was unplugged");
}

void gamepadAction(void* inContext, IOReturn inResult, void* inSender, IOHIDValueRef value) {
    NSLog(@"Gamepad talked!");
    IOHIDElementRef element = IOHIDValueGetElement(value);
    NSLog(@"Element: %@", element);
    long elementValue = IOHIDValueGetIntegerValue(value); // returns CFIndex, a long
    NSLog(@"Element value: %ld", elementValue);
}
-(void) setupInput {
    // Get a HID manager reference
    hidManager = IOHIDManagerCreate(kCFAllocatorDefault, kIOHIDOptionsTypeNone);

    // Define the device to search for, via usage page and usage key
    NSMutableDictionary* criterion = [[NSMutableDictionary alloc] init];
    [criterion setObject: [NSNumber numberWithInt: kHIDPage_GenericDesktop]
                  forKey: (NSString*)CFSTR(kIOHIDDeviceUsagePageKey)];
    [criterion setObject: [NSNumber numberWithInt: kHIDUsage_GD_Joystick]
                  forKey: (NSString*)CFSTR(kIOHIDDeviceUsageKey)];

    // Search for the device
    IOHIDManagerSetDeviceMatching(hidManager, (__bridge CFDictionaryRef)criterion);

    // Register our callback functions
    IOHIDManagerRegisterDeviceMatchingCallback(hidManager, gamepadWasAdded, (__bridge void*)self);
    IOHIDManagerRegisterDeviceRemovalCallback(hidManager, gamepadWasRemoved, (__bridge void*)self);
    IOHIDManagerRegisterInputValueCallback(hidManager, gamepadAction, (__bridge void*)self);

    // Schedule our HID manager with the main run loop, so that we
    // are able to receive events from the hardware.
    IOHIDManagerScheduleWithRunLoop(hidManager, CFRunLoopGetMain(), kCFRunLoopDefaultMode);

    // Open the HID manager, so that it can start routing events to our callbacks.
    IOHIDManagerOpen(hidManager, kIOHIDOptionsTypeNone);
}
// Put our timer in -awakeFromNib, so it can start up right from the beginning
-(void)awakeFromNib
{
    renderTimer = [NSTimer timerWithTimeInterval:0.001 // a 1 ms time interval
                                          target:self
                                        selector:@selector(timerFired:)
                                        userInfo:nil
                                         repeats:YES];
    [[NSRunLoop currentRunLoop] addTimer:renderTimer
                                 forMode:NSDefaultRunLoopMode];
    [[NSRunLoop currentRunLoop] addTimer:renderTimer
                                 forMode:NSEventTrackingRunLoopMode]; // ensure the timer fires during resize
    [self setupInput];
}
OK, I have found the issue.
The problem was that my Mac app is sandboxed and I hadn't ticked the Hardware → USB checkbox in the sandbox capabilities.
With that option checked, the original code works.
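For reference, ticking that checkbox adds the USB entitlement to the app's .entitlements file; the resulting plist entry should look something like this:

```xml
<key>com.apple.security.device.usb</key>
<true/>
```

Without this entitlement, a sandboxed app's IOHIDManager matching callbacks are silently never invoked, which matches the symptom described above (no error, no exception, no callbacks).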
I'm creating a piano app (for OSX) that has an onscreen keyboard that displays what the user is playing on their synth keyboard. Everything is connected using the CoreMIDI framework. I have created some customs buttons by subclassing NSButton in a class called PianoKey. The class can be seen below:
#import "PianoKey.h"

@implementation PianoKey

// Opts in to layer-based drawing: AppKit will call -updateLayer
// instead of -drawRect: whenever the view needs display.
- (BOOL)wantsUpdateLayer {
    return YES;
}

- (void)updateLayer {
    // -isHighlighted tells you whether or not the button is pressed
    if ([self.cell isHighlighted]) {
        NSLog(@"BUTTON PRESSED");
        self.layer.contents = [NSImage imageNamed:@"buttonpressed.png"];
    }
    else {
        NSLog(@"BUTTON NOT PRESSED");
        self.layer.contents = [NSImage imageNamed:@"button.png"];
    }
}

@end
The "PianoKeys" are created programmatically. When unpressed, the piano keys are blue and when they are pressed they are pink (just temp colours). The colour switch works fine if I click the buttons on screen, but they are not changing when I try to play through my MIDI keyboard.
Some things to note:
1) The MIDI keyboard works
2) I am successfully getting MIDI data from the keyboard.
a) This MIDI data is passed to a callback function called midiInputCallback as seen below:
[from class AppController.m]
void midiInputCallback(const MIDIPacketList *list, void *procRef, void *srcRef) {
    PianoKey *button = (__bridge PianoKey*)procRef;
    UInt16 nBytes;
    const MIDIPacket *packet = &list->packet[0]; // gets the first packet in the list
    for (unsigned int i = 0; i < list->numPackets; i++) {
        nBytes = packet->length; // number of bytes in a packet
        handleMIDIStatus(packet, button);
        packet = MIDIPacketNext(packet);
    }
}
The object button is a reference to the button that I've created programmatically, as declared in AppController.h:
@property PianoKey *button; //yes, this is synthesized in AppController.m
b) The callback function calls a bunch of functions that handle the MIDI data. If a note has been detected as being played, this is where I set my custom button to be highlighted...this can be seen in the function below:
void updateKeyboardButtonAfterKeyPressed(PianoKey *button, int key, bool keyOn) {
    if (keyOn)
        [button highlight:YES];
    else
        [button highlight:NO];
    [button updateLayer];
}
[button updateLayer] calls my updateLayer() method from PianoKey, but it doesn't change the image. Is there something I'm missing? It seems as if the view or window is not being updated, even though my button says that it is "highlighted". Please help! Thanks in advance :)
Just a guess (but a good one... ;-) Is the callback function being called on the main thread? If not, that could be the problem - at least that is the way it works on iOS: the UI (screen drawing) is only updated from the main thread.
Put a log statement or breakpoint in the callback to see which thread it is called on. If it is a background thread, then the problem is due to this threading issue.
Update:
So tell the button to update itself - but on the main thread (I think I have the syntax right - at least for iOS; OS X may vary):
[button performSelectorOnMainThread:@selector(updateLayer)
                         withObject:nil
                      waitUntilDone:NO];
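An equivalent with GCD might look like the sketch below (assuming `button` is the PianoKey instance held by the MIDI callback). It asks AppKit to redraw rather than calling -updateLayer directly, which lets the framework drive the layer update at the right time:

```objc
// Hop to the main queue before touching any AppKit object.
dispatch_async(dispatch_get_main_queue(), ^{
    // Marks the view dirty; AppKit then calls -updateLayer itself
    // because -wantsUpdateLayer returns YES.
    [button setNeedsDisplay:YES];
});
```

Calling -updateLayer by hand from the MIDI thread bypasses AppKit's display cycle entirely, which is another likely reason the image never changes even though the highlight state is set.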
Just wondering if there is a way to receive a callback in Cocoa if Magic Mouse or Trackpad is being touched by the user?
I looked into Quartz Events, but it seems I can only get callbacks if the mouse is moving or clicked etc.
Note that I want to receive a callback even if my app is not active. It's a background utility app. Also, it can't use private frameworks as it's going to be a Mac App Store app.
You could use this code to trap the events: (create a new Cocoa application and put this in the application delegate)
NSEventMask eventMask = NSEventMaskGesture | NSEventMaskMagnify | NSEventMaskSwipe | NSEventMaskRotate | NSEventMaskBeginGesture | NSEventMaskEndGesture;

CGEventRef eventTapCallback(CGEventTapProxy proxy, CGEventType type, CGEventRef eventRef, void *userInfo) {
    NSEvent *event = [NSEvent eventWithCGEvent:eventRef];
    // Only act on events which match the mask
    if (eventMask & NSEventMaskFromType([event type])) {
        NSLog(@"eventTapCallback: [event type] = %ld", (long)[event type]);
    }
    return [event CGEvent];
}

- (void)applicationDidFinishLaunching:(NSNotification *)aNotification {
    CFMachPortRef eventTap = CGEventTapCreate(kCGSessionEventTap, kCGHeadInsertEventTap, kCGEventTapOptionListenOnly, kCGEventMaskForAllEvents, eventTapCallback, nil);
    CFRunLoopAddSource(CFRunLoopGetCurrent(), CFMachPortCreateRunLoopSource(kCFAllocatorDefault, eventTap, 0), kCFRunLoopCommonModes);
    CGEventTapEnable(eventTap, true);
}
But sandboxing will probably prevent you from using CGEventTapCreate, because by its nature it lets an application listen to the whole event system, which is not very secure. If not using sandboxing is acceptable to you, then eventTapCallback is called whenever a new touch is made on the trackpad.
I am developing a desktop application in which I need to receive mouse events on a transparent window. A transparent NSWindow does not normally receive mouse events, so I have set setIgnoresMouseEvents: to NO, which allows the transparent window to receive them.
I have a problem in the following scenario:
There is a dynamically created rectangular shape on this window. The transparent window should not take mouse events in this region; they should be delegated to the window (of some other app) behind this shape.
For this purpose, if a mouseDown event is inside the shape I set setIgnoresMouseEvents: to YES. Now, if the user performs mouse events in the area outside the shape, the transparent window should take the event - but since setIgnoresMouseEvents: is set to YES, the window no longer receives any mouse events, and there is no way to detect that a mouseDown has occurred so that I can set setIgnoresMouseEvents: back to NO.
Could someone suggest a good way to handle mouse events on a transparent window?
Deepa
I've just come across Quartz Event Taps, which basically let you capture the mouse event and execute your own callback.
Haven't tried it out myself, but it seems like you should be able to check where the mouse click fell and execute conditionally on the values
Here's an example:
//---------------------------------------------------------------------------
CGEventRef MouseTapCallback( CGEventTapProxy aProxy, CGEventType aType, CGEventRef aEvent, void* aRefcon )
//---------------------------------------------------------------------------
{
    if( aType == kCGEventRightMouseDown ) NSLog( @"down" );
    else if( aType == kCGEventRightMouseUp ) NSLog( @"up" );
    else NSLog( @"other" );

    CGPoint theLocation = CGEventGetLocation(aEvent);
    NSLog( @"location x: %f y: %f", theLocation.x, theLocation.y );

    return aEvent;
}

//---------------------------------------------------------------------------
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification
//---------------------------------------------------------------------------
{
    CGEventMask theEventMask = CGEventMaskBit(kCGEventRightMouseDown) |
                               CGEventMaskBit(kCGEventRightMouseUp);

    CFMachPortRef theEventTap = CGEventTapCreate( kCGSessionEventTap,
                                                  kCGHeadInsertEventTap,
                                                  0,
                                                  theEventMask,
                                                  MouseTapCallback,
                                                  NULL );
    if( !theEventTap )
    {
        NSLog( @"Failed to create event tap!" );
        return;
    }

    CFRunLoopSourceRef theRunLoopSource =
        CFMachPortCreateRunLoopSource( kCFAllocatorDefault, theEventTap, 0 );
    CFRunLoopAddSource( CFRunLoopGetCurrent(),
                        theRunLoopSource,
                        kCFRunLoopCommonModes );
    CGEventTapEnable( theEventTap, true );
}
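To connect this to the original question: inside the callback you could test whether the click falls inside the rectangular shape and toggle the window's setIgnoresMouseEvents: accordingly. A rough sketch - the refcon wiring and `shapeRect` are placeholders you would supply yourself (note that CGEventGetLocation uses a top-left screen origin, unlike AppKit's bottom-left coordinates, so the rect must be converted to match):

```objc
CGEventRef MouseTapCallback( CGEventTapProxy aProxy, CGEventType aType,
                             CGEventRef aEvent, void* aRefcon )
{
    // Assumes the transparent NSWindow was passed as the tap's refcon.
    NSWindow* window = (__bridge NSWindow*)aRefcon;
    CGPoint theLocation = CGEventGetLocation(aEvent);

    // shapeRect: the dynamically created shape, in the tap's
    // top-left-origin screen coordinates (hypothetical global here).
    BOOL insideShape = CGRectContainsPoint(shapeRect, theLocation);

    // AppKit calls must happen on the main thread.
    dispatch_async(dispatch_get_main_queue(), ^{
        [window setIgnoresMouseEvents:insideShape];
    });
    return aEvent;
}
```

Because the tap observes events system-wide, it keeps firing even while the window is ignoring mouse events, which is exactly the piece that was missing in the original approach.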