How to use Mac OS X Cocoa events for multitouch gestures

I'm writing a program that has an NSView embedded in an NSScrollView, which the user can zoom. I'd love to set it up so the user can zoom the view using the multitouch pinch gesture supported on the MacBook Air and the new unibody MacBooks/MacBook Pros, and in applications like Safari and iPhoto. I've hunted through Apple's documentation and can't figure out how to do this.
Is this supported using publicly available APIs on Mac OS X 10.5 Leopard?
If not, how "bad" are the private APIs (e.g. is it just an undeclared constant or a whole new set of methods)?

Edit: Snow Leopard adds supported APIs for gestures and multi-touch. See the AppKit release notes for Snow Leopard; ⌘F for “gesture” and “MultiTouch” (sic). They'll look pretty familiar if you've used the ones below, but there are probably some fine differences, so read the new documentation anyway.
Is this supported using publicly available APIs on Mac OS X 10.5 Leopard?
No. 10.5.0 doesn't support it at all, and 10.5.1 through 10.5.6 make you implement undocumented methods.
If not, how "bad" are the private APIs (e.g. is it just an undeclared constant or a whole new set of methods)?
Not bad at all. You have to implement some undocumented event methods in your view. Since you're the one implementing the methods, you shouldn't crash if Apple changes the methods; all that will happen is the feature will stop working.
However, if you'll be retrieving the absolute (not delta) magnification or rotation from the event, those are as-yet-undocumented methods of the event, so you should guard those calls with respondsToSelector: checks and perform careful range-checking on the methods' return values.
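For concreteness, here's a minimal sketch of that approach. ZoomableView is a hypothetical class name, and the category exists only to declare the undocumented accessor so the code compiles against the 10.5 SDK:

    #import <Cocoa/Cocoa.h>

    // Declare the undocumented 10.5 accessor so this compiles; the
    // respondsToSelector: guard below keeps us safe if Apple changes it.
    // (On 10.6 and later, -magnification is public NSEvent API.)
    @interface NSEvent (UndocumentedGestureAccessors)
    - (CGFloat)magnification;
    @end

    @interface ZoomableView : NSView {
        CGFloat zoomFactor;
    }
    @end

    @implementation ZoomableView

    - (id)initWithFrame:(NSRect)frame {
        if ((self = [super initWithFrame:frame]))
            zoomFactor = 1.0;
        return self;
    }

    // Sent to the view during a pinch on 10.5.1 through 10.5.6, even though
    // it's undocumented there. If Apple renames it, this simply stops firing.
    - (void)magnifyWithEvent:(NSEvent *)event {
        if (![event respondsToSelector:@selector(magnification)])
            return;

        CGFloat magnification = [event magnification];
        // Careful range-checking on the undocumented return value.
        if (magnification < -1.0 || magnification > 1.0)
            return;

        zoomFactor *= 1.0 + magnification;
        [self setNeedsDisplay:YES];
    }

    @end

The rotation case is analogous: implement rotateWithEvent: and guard a -rotation accessor the same way.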

Related

How to send raw multitouch trackpad data under Mac OS X?

The end goal is to take touch input from an iOS device, send it over a WebSocket, accept it on the Mac OS X side, and feed it through the system so that everything using the private multitouch framework to accept input sees the data as if it were normal multitouch trackpad data.
The synthesize sub-project under https://github.com/calftrail/Touch seems like a good place to start. However, it seems the developer created it with the intent of taking valid multitouch input (from a Magic Mouse, back when Mac OS X provided essentially no software support for it) and piping it through as multitouch trackpad input. I need to create valid/acceptable multitouch trackpad data out of thin air (with just sequences of touch locations, not real HID data).
In deep here. Help, someone. :)
Glad you found my TouchSynthesis subproject — I think it will let you do what you need, since internally it is split up as you want it. [Please note however that this code is GPL licensed, i.e. virally open source, unlike many Mac libraries.]
You can treat TouchSynthesis.m as example code for using the TouchEvents "library" which provides support for your specific question via one "simple" function: tl_CGEventCreateFromGesture
The basic gist is that tl_CGEventCreateFromGesture takes in a dictionary of gesture+touch data and will return a CGEvent that you can inject via Quartz Event Services into the system. A gesture event is required to send what becomes NSTouch data, but IIRC it could be a fairly generic "gesture" type rather than zoom/pan/etc.
This is sort of a halfway-private solution: Apple supports injecting CGEvents into the system [at least outside the Sandbox? …I've since lost interest in their platforms, so haven't researched that one…], so that part is "fine", but the actual CGEvent I create is of an undocumented type, the format of which I had to figure out via hex dumps and some HID headers Apple shared in the Darwin source code. It's that work that TouchEvents.m implements (that's how Sesamouse could "create valid/acceptable multitouch trackpad input out of thin air"), and it should already be separate from the private MultitouchSupport framework stuff that read in the Magic Mouse input.
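For reference, the injection side is small. This is a rough sketch, assuming tl_CGEventCreateFromGesture takes a gesture dictionary plus an array of touch dictionaries as TouchSynthesis.m builds them (check TouchEvents.h for the exact signature); CGEventPost itself is public Quartz Event Services API:

    #include <ApplicationServices/ApplicationServices.h>
    #include "TouchEvents.h" /* from the Touch project's TouchSynthesis subproject */

    static void postGesture(CFDictionaryRef gesture, CFArrayRef touches) {
        CGEventRef event = tl_CGEventCreateFromGesture(gesture, touches);
        if (event) {
            /* Inject at the HID level so it looks like real trackpad input. */
            CGEventPost(kCGHIDEventTap, event);
            CFRelease(event);
        }
    }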

Is it possible for an Adobe AIR app to receive touch events in OS X?

I'm working on software in OS X that receives touches from a multi-touch device and posts them to the system for applications to process. I was hoping it would be possible to support Adobe AIR applications, but from the research I've done, it looks like AIR doesn't support multi-touch events on OS X. This page in particular seems to indicate that, and consistent with what it says, the AIR applications I've tested do indeed respond to gesture events but not to touch events. However, I'm (perhaps naively) hoping there's a way to do it regardless.
So is it possible for an AIR application running on Mac OS X to receive touch events at all? Perhaps using a different method than posting system-wide touch events? For example, is there any way my software could send touch events directly to an AIR app?
Thanks in advance!

CGrafPtr to WindowRef

NPAPI on Mac OS gives me a CGrafPtr in the NPWindow structure, but I need a WindowRef.
Is there a way to get a WindowRef from a CGrafPtr?
Thanks!
NPAPI only gives you a CGrafPtr if you are using the very, very deprecated QuickDraw drawing model (with Carbon event model). Writing a new plugin using the QuickDraw model would be a terrible idea: Firefox 64-bit doesn't support it, Safari 64-bit doesn't support it, and Chrome doesn't really support it (and soon won't at all). Your plugin wouldn't work for most users.
Instead, you should be using either Core Animation (drawing) + Cocoa (event), or CoreGraphics + Cocoa. In the Cocoa event model there is, deliberately, no way to get a reference to the browser window. Modern browsers almost all run plugins in a separate process, and you can't reference windows across processes.
In short, if you are trying to make a new NPAPI plugin that requires access to the browser window, your design is wrong.
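For what it's worth, negotiating the recommended models takes only a few lines in NPP_New. This is a minimal sketch, assuming `browser` is the NPNetscapeFuncs table saved in your NP_Initialize:

    #include "npapi.h"
    #include "npfunctions.h"

    extern NPNetscapeFuncs *browser; /* saved in NP_Initialize */

    NPError NPP_New(NPMIMEType type, NPP instance, uint16_t mode,
                    int16_t argc, char *argn[], char *argv[],
                    NPSavedData *saved)
    {
        NPBool supportsCoreGraphics = FALSE;
        NPBool supportsCocoaEvents = FALSE;

        browser->getvalue(instance, NPNVsupportsCoreGraphicsBool,
                          &supportsCoreGraphics);
        browser->getvalue(instance, NPNVsupportsCocoaBool,
                          &supportsCocoaEvents);
        if (!supportsCoreGraphics || !supportsCocoaEvents)
            return NPERR_INCOMPATIBLE_VERSION_ERROR;

        /* CoreGraphics drawing + Cocoa events: the combination that 64-bit
           and out-of-process browsers actually support. */
        browser->setvalue(instance, NPPVpluginDrawingModel,
                          (void *)NPDrawingModelCoreGraphics);
        browser->setvalue(instance, NPPVpluginEventModel,
                          (void *)NPEventModelCocoa);
        return NPERR_NO_ERROR;
    }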

A MapKit for Mac OS X?

On the iPhone we have Apple's amazing MapKit. Is there something similar for Mac OS X?
If possible, something more advanced than a simple WebView, because I need it to manage automatically at least:
annotations
the user interaction
the zoom in/out
an overlay view
(It's fine even if the maps are not from Google.)
Thank you very much!
Update 2
MapKit is available in OS X 10.9 Mavericks: Map Kit Framework Reference.
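With that framework, the list above comes mostly for free. A minimal sketch (10.9+; the function name and coordinates are placeholders):

    #import <Cocoa/Cocoa.h>
    #import <MapKit/MapKit.h>

    // Drop a map with one annotation into an existing content view.
    static MKMapView *AddMap(NSView *contentView) {
        MKMapView *mapView = [[MKMapView alloc] initWithFrame:[contentView bounds]];
        [mapView setAutoresizingMask:NSViewWidthSizable | NSViewHeightSizable];
        [contentView addSubview:mapView];

        MKPointAnnotation *pin = [[MKPointAnnotation alloc] init];
        pin.coordinate = CLLocationCoordinate2DMake(37.3318, -122.0312);
        pin.title = @"Cupertino";
        [mapView addAnnotation:pin];

        // Pan/zoom and user interaction are built in; overlay views go
        // through -addOverlay: and the MKMapViewDelegate callbacks.
        [mapView setRegion:MKCoordinateRegionMakeWithDistance(pin.coordinate,
                                                              5000, 5000)
                  animated:NO];
        return mapView;
    }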
Update - pulled from my comment below
The situation has changed and there now exists a third party MapKit for Mac OS X. Find it at http://github.com/Oomph/MacMapKit and a small writeup at http://rickfillion.tumblr.com/post/1134987954/pretroducing-mapkit-for-mac
Original Answer
There is no such API from Apple on Mac OS X. You should file a feature request at bugreporter.apple.com.
The best alternative is to use the Google Maps JavaScript API embedded in a WebKit view. Visit the Google Maps JavaScript API V3 Documentation to understand the API.
I realize that you asked for more than a simple WebView, but perhaps you're unaware of some of the more advanced functionality a WebKit view allows.
WebKit provides a means of bridging between the JavaScript environment in your WebKit view and the rest of your Cocoa application.
To call a JavaScript function from Objective-C, use your WebKit view's WebScriptObject. "Using JavaScript From Objective-C" in the "WebKit Objective-C Programming Guide" is a great place to start learning.
If you need to call back into your Cocoa application from JavaScript, "Calling Objective-C Methods From JavaScript" in the "WebKit DOM Programming Topics" provides examples and explanation.
These technologies, used carefully together, should provide the functionality you require.
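For example, here is a minimal sketch of both directions of the bridge; setMapZoom and cocoaApp are placeholder names that your page and bridge class would define:

    #import <WebKit/WebKit.h>

    // Objective-C -> JavaScript: drive the Google Maps page from Cocoa.
    // Assumes the page defines, e.g., function setMapZoom(z) { map.setZoom(z); }
    static void ZoomMap(WebView *webView, int level) {
        WebScriptObject *script = [webView windowScriptObject];
        [script callWebScriptMethod:@"setMapZoom"
                      withArguments:[NSArray arrayWithObject:
                                        [NSNumber numberWithInt:level]]];
    }

    // JavaScript -> Objective-C: expose a Cocoa object to the page as
    // window.cocoaApp. The bridge class controls what scripts may call via
    // +isSelectorExcludedFromWebScript: and friends.
    static void ExposeBridge(WebView *webView, id bridge) {
        [[webView windowScriptObject] setValue:bridge forKey:@"cocoaApp"];
    }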

os x gui api clarification

If I wanted to write my own window manager for OS X (please don't respond with "what's the point"), what APIs should I be looking at?
There is no such thing as a "window manager" in OS X, and no public interface to implement one. The functions that an X11 window manager would perform are split between the GUI toolkit (Carbon/Cocoa), the Dock application and the window server.
Your only real choice if you want to change OS X's windowing behavior is to patch individual applications, the Dock (which has a privileged connection to the window server) and/or the window server. It'd involve a great deal of reverse engineering and almost certainly break in 10.6, but it's certainly possible.
At the hardware level, you'd have to write your own APIs.
Otherwise, there are various graphics architectures in which to plug in your window manager:
OpenGL
Quartz
QuickTime
X11
