Global mouse event handling similar to macOS's built-in screenshot functionality - macos

I know there are several questions on Stack Overflow regarding global mouse events in macOS, and I've spent the last few hours searching for one that helped me out.
What I want to create is functionality that works almost like the built-in screenshot feature of OS X (Shift+Cmd+4).
I want to press a shortcut to activate a mouse listener. The mouseDown event should then give me one coordinate and the mouseUp event another. I want to make a CGRect from them, after which the event listener should be deactivated.
All the code samples I found either give me mouse events in an NSView or NSWindow but not globally, or they give me the mouse location but I have to poll it myself with an NSTimer. Neither of those is what I need.
As I said, I've spent several hours searching. I've read through many question threads and Apple's guide on Cocoa event handling, but none of the answers really helped me out.
Thanks for your time!

You will be able to get the global mouse events using:
1) CGEvents. Create an event tap with CGEventTapCreate().
2) From OS X v10.6 on, there is a new NSEvent class method:
+ (id)addGlobalMonitorForEventsMatchingMask:(NSEventMask)mask handler:(void (^)(NSEvent*))block
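For the screenshot-style selection described in the question, a minimal sketch of option 2 might look like this (constant names as of 10.6; note that a global monitor is observe-only and never sees events dispatched to your own application):

    __block NSPoint downPoint = NSZeroPoint;
    __block id monitor = nil;
    monitor = [NSEvent addGlobalMonitorForEventsMatchingMask:
                           (NSLeftMouseDownMask | NSLeftMouseUpMask)
                                                      handler:^(NSEvent *event) {
        if (event.type == NSLeftMouseDown) {
            downPoint = [NSEvent mouseLocation];       // first corner, screen coordinates
        } else if (event.type == NSLeftMouseUp) {
            NSPoint upPoint = [NSEvent mouseLocation]; // second corner
            CGRect selection = CGRectMake(MIN(downPoint.x, upPoint.x),
                                          MIN(downPoint.y, upPoint.y),
                                          fabs(upPoint.x - downPoint.x),
                                          fabs(upPoint.y - downPoint.y));
            NSLog(@"selected rect: %@", NSStringFromRect(NSRectFromCGRect(selection)));
            [NSEvent removeMonitor:monitor];           // deactivate the listener
            monitor = nil;
        }
    }];

Install the monitor from your shortcut handler; removeMonitor: tears it down once the rect has been captured.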

Related

Showing an NSSharingServicePicker on MouseUp

I'm adding sharing to my app (targeting Mavericks, 10.9), which I want to work like this:
User clicks Share button
Cursor changes to crosshair
User drags selection of what he'd like to share
NSSharingServicePicker displays, allowing the user to pick the service to share with
I'm accomplishing this using the -mouseDown:, -mouseDragged:, and -mouseUp: events. mouseDown begins the selection, mouseDragged provides feedback as to the area being selected, and then mouseUp completes the drag, showing the picker. Each time, though, I get this written to the console:
2014-06-25 00:13:45.111 App[31401:303] Warning: -[NSSharingServicePicker showRelativeToRect: ofView: preferredEdge:] should not be called on mouseUp
Please configure the sender with -[NSControl sendActionOn:NSLeftMouseDownMask];
I don't understand why that would be a problem, unless you were showing it from a button click on mouse up. Should I ignore the message? I've tried showing it using dispatch_async and dispatch_after to try and get it to run outside the event's invocation, but they didn't work. I suppose I could ignore it, but does that leave the door to deprecation open?
I know this is a year late, but I had the same problem. After some research, I came back with this answer. Before I implemented this code, my button would spin for a while and then return the same error you had. When I click my share button now, it no longer lags and does not return any error. Insert this into your app's awakeFromNib method: [yourShareButtonName sendActionOn:NSLeftMouseDownMask];
This is what your code should look like:
- (void)awakeFromNib {
    // Send the button's action on mouse-down, as the console warning requests,
    // so the picker is no longer shown from inside a mouseUp handler.
    [yourShareButtonName sendActionOn:NSLeftMouseDownMask];
}
I hope this helps!

How do I create an action after a specific number of touches on a screen?

I am making an app that has to save the screen, like a screenshot, when the screen is tapped or touched a specific number of times. I have tried all of the solutions other users have suggested on related questions, but nothing has helped...
I will appreciate all suggestions. :)
Thanks
The following blog post does a good job of explaining the built-in option for recognizing multiple taps in a row (and explains the shortcomings): Detecting tap and double-tap with Gesture Recognizers.
If you need more customized logic than is provided by the built-in gesture recognizers, you will either be implementing your own custom subclass of UIGestureRecognizer or you will be adding your logic into the UIResponder (superclass of UIViewController, UIView, etc.) callbacks for tap input: touchesBegan:withEvent:, touchesMoved:withEvent:, and touchesEnded:withEvent:.
I have more experience with the latter method (not UIGestureRecognizer). The UITouch events passed to the various UIResponder callbacks each contain information about touch location and timing. You could use this information in combination with an NSTimer to determine whether the user taps twice (or more) within a certain amount of time. If the timer fires before the second (or nth) touch, you can consider it a single-touch event.
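A rough sketch of that timer approach, in a UIResponder subclass such as a view controller (the tapCount and tapTimer properties are hypothetical additions of your own):

    - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
    {
        self.tapCount += 1;
        [self.tapTimer invalidate];   // restart the window on every new tap
        self.tapTimer = [NSTimer scheduledTimerWithTimeInterval:0.3
                                                         target:self
                                                       selector:@selector(tapWindowExpired:)
                                                       userInfo:nil
                                                        repeats:NO];
    }

    - (void)tapWindowExpired:(NSTimer *)timer
    {
        if (self.tapCount >= 3) {
            // the specific number of taps was reached - save the screenshot here
        } else {
            // too few taps before the window closed; treat as a single tap
        }
        self.tapCount = 0;
    }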
I don't know if that is the best way to do it, but it certainly gives you more granular control than the built-in UIGestureRecognizers provide.

TapGestureRecognizer Not Working in iOS 6 Maps

So anyway, as normal, my app was working pretty well in iOS 5.
Then came iOS 6 with the new Maps app, and it no longer functions as it's supposed to. (insert Roll Eyes icon here)
I have a MapView with the following user interactions:
User can zoom in/out by pinching.
User can double tap to drop a pin.
This is no longer the case with iOS 6; No. 2 does not work anymore.
It seems the new map view no longer detects double taps directly.
If I disable zooming (in IB), then it works.
So, how can I make this work?
Or perhaps it is better to change the gesture from a double tap to a long press?
Any suggestion on how to make it work while keeping zoom enabled is appreciated.
Thanks, y'all.
What the heck.
Nobody has answered this, so I'm going to answer it myself.
What I did is a simple hack:
Implement touchesBegan:withEvent: in the same view controller.
In there, detect a single touch; if one is detected, disable zooming on the MKMapView. The tap gesture recognizer attached to the map view will then get a chance to respond, so a double tap responds as I needed it to.
When two touches are detected (two fingers on the screen), re-enable the map view's zooming.
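A sketch of that hack (mapView is a hypothetical MKMapView outlet; this also assumes the touch callbacks actually reach your view controller):

    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
    {
        if ([event allTouches].count >= 2) {
            // Two fingers down: this is a pinch, so let the map zoom.
            self.mapView.zoomEnabled = YES;
        } else {
            // Single finger: disable zoom so our double-tap recognizer fires
            // instead of the map's built-in double-tap-to-zoom.
            self.mapView.zoomEnabled = NO;
        }
        [super touchesBegan:touches withEvent:event];
    }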
This works pretty well.
Hope this helps other people who are facing the same problem with the iOS 6 Maps app.
Surprisingly, this still works fine with the old Google-backed maps (pre-iOS 6).
Thanks.

System-wide recognition of scroll events on Mac OS X and setting focus to a different window

I'm registering for global mouse wheel events in my Cocoa application. My goal is to have some kind of background application that can focus a window of another application when the user scrolls in its window. If this is possible with Objective-C and Cocoa, what route would I need to take?
My code for the event registering looks like this:
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification
{
    [NSEvent addGlobalMonitorForEventsMatchingMask:NSScrollWheelMask
                                           handler:^(NSEvent *ev) {
        NSLog(@"%@", ev.description);
    }];
}
This works, but I don't seem to be able to manipulate the data captured in the event (like the window or the window ID) - and the window ID doesn't even seem to be the correct one, as I can get a list of windows and find a different ID there; just the screen position seems to be accurate. So, three questions to solve this riddle:
How can I get a window or window ID at a certain location on the screen?
If I can only get a window ID, how can I find the appropriate application or window object to manipulate?
I guess I would need the Accessibility API for manipulating the window and giving it focus. How does that work?
Maybe these are simple tasks, but I've never written a Mac Cocoa application before. Before suggesting documentation to read, you should know that I have already scanned all of it, and that I learn better by example than by reading books :-)
EDIT: I just found out that I might use the Process Manager to bring the application to the front. If you think this is a possible solution, how can I get the process ID for the window at a certain point on the screen?
EDIT2: I don't want to use Carbon APIs.
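Not an answer from the thread, but for reference: the first question (and the process ID asked about in the EDIT) can be approached with the public Quartz Window Services API, and the activation can then be done Carbon-free through NSRunningApplication. A sketch, assuming ARC:

    #import <Cocoa/Cocoa.h>

    // Returns the PID that owns the frontmost on-screen window under `point`
    // (global display coordinates), or -1 if no window is found there.
    // You may want to additionally filter on kCGWindowLayer == 0 to skip
    // the menu bar, Dock, and other window-server windows.
    static pid_t PIDForWindowAtPoint(CGPoint point)
    {
        NSArray *windows = CFBridgingRelease(
            CGWindowListCopyWindowInfo(kCGWindowListOptionOnScreenOnly,
                                       kCGNullWindowID));
        for (NSDictionary *info in windows) {   // returned front-to-back
            CGRect bounds;
            CGRectMakeWithDictionaryRepresentation(
                (__bridge CFDictionaryRef)info[(id)kCGWindowBounds], &bounds);
            if (CGRectContainsPoint(bounds, point)) {
                return [info[(id)kCGWindowOwnerPID] intValue];
            }
        }
        return -1;
    }

    // Focusing that application, without Carbon:
    // [[NSRunningApplication runningApplicationWithProcessIdentifier:pid]
    //     activateWithOptions:NSApplicationActivateIgnoringOtherApps];

Raising one specific window of that application (rather than the whole app) does, as far as I know, require the Accessibility API (AXUIElement), which only works once your process is trusted for assistive access.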

Global Mouse Moved Events in Cocoa

Is there a way to register for global mouse-moved events in Cocoa? I was able to register for the events using Carbon's InstallEventHandler(), but would prefer a Cocoa equivalent. I have looked for NSNotificationCenter events, but there don't seem to be any public event names (are there private ones?)
Alternatively, is there a way to use NSTrackingArea for views with a clearColor background?
The app is Snow Leopard only.
In Snow Leopard there is a new class method on NSEvent which does exactly what you want: + (id)addGlobalMonitorForEventsMatchingMask:(NSEventMask)mask handler:(void (^)(NSEvent*))block. You'll want mask = NSMouseMovedMask.
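A minimal sketch (keep the returned monitor object around so you can pass it to removeMonitor: later):

    id monitor = [NSEvent addGlobalMonitorForEventsMatchingMask:NSMouseMovedMask
                                                        handler:^(NSEvent *event) {
        // Current pointer position in screen coordinates.
        NSLog(@"mouse at %@", NSStringFromPoint([NSEvent mouseLocation]));
    }];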
A similar question was already asked on Stack Overflow:
How to make a transparent NSView subclass handle mouse events?
To summarize, the transparent view method didn't work. Quartz event taps seem to be the best answer.
Here are some hints on working with taps (a sketch putting the steps together follows the list):
1) Create the tap with CGEventTapCreate.
a) For the location (first) parameter you'll probably want to use kCGSessionEventTap.
b) For the placement (second) parameter you'll probably want kCGHeadInsertEventTap.
c) For the event mask parameter, try (1 << kCGEventMouseMoved).
2) Create a run loop source with CFMachPortCreateRunLoopSource, passing the event tap as the second parameter.
3) Add the run loop source to your run loop. Assuming you want it added to the main run loop, do:
CFRunLoopAddSource(CFRunLoopGetMain(), sourceFromStep2, kCFRunLoopDefaultMode);
4) Enable the event tap with CGEventTapEnable.
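Putting those steps together, a minimal listen-only sketch (depending on OS version and tap options, an event tap may require the user to grant your process accessibility/input-monitoring access):

    #import <ApplicationServices/ApplicationServices.h>

    // Step 1's callback: invoked for every mouse-moved event in the session.
    static CGEventRef MouseMovedCallback(CGEventTapProxy proxy, CGEventType type,
                                         CGEventRef event, void *refcon)
    {
        CGPoint location = CGEventGetLocation(event);
        printf("mouse moved to (%.0f, %.0f)\n", location.x, location.y);
        return event;   // pass the event along unmodified
    }

    static void InstallMouseMovedTap(void)
    {
        // The tap and source intentionally live for the life of the app.
        CFMachPortRef tap = CGEventTapCreate(kCGSessionEventTap,          // 1a
                                             kCGHeadInsertEventTap,       // 1b
                                             kCGEventTapOptionListenOnly, // observe only
                                             (1 << kCGEventMouseMoved),   // 1c
                                             MouseMovedCallback, NULL);
        CFRunLoopSourceRef source =
            CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);   // step 2
        CFRunLoopAddSource(CFRunLoopGetMain(), source,
                           kCFRunLoopDefaultMode);                        // step 3
        CGEventTapEnable(tap, true);                                      // step 4
    }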
If you want to track the mouse no matter where it is, you want a CGEventTap. There is no Cocoa equivalent. If you just want to track it within your own application, then you should explain a little more thoroughly why you're finding yourself unable to do so.
