How to send raw multitouch trackpad data under Mac OS X?

The end goal is to take touch input from an iOS device, send it over a websocket, accept it on the Mac OS X side, and send it through the system so everything using the private multitouch framework to accept input sees the data as if it were normal multitouch trackpad data.
The synthesize sub-project under https://github.com/calftrail/Touch seems like a good place to start. However, it seems like the developer created it with the intent of taking valid multitouch input (from a Magic Mouse, back when it had very little software support from Mac OS X) and piping it onward as multitouch trackpad input. I need to create valid/acceptable multitouch trackpad out of thin air (with just sequences of touch locations, not real HID data).
I'm in deep here. Help, someone. :)

Glad you found my TouchSynthesis subproject — I think it will let you do what you need, since internally it is split up as you want it. [Please note however that this code is GPL licensed, i.e. virally open source, unlike many Mac libraries.]
You can treat TouchSynthesis.m as example code for using the TouchEvents "library" which provides support for your specific question via one "simple" function: tl_CGEventCreateFromGesture
The basic gist is that tl_CGEventCreateFromGesture takes in a dictionary of gesture+touch data and will return a CGEvent that you can inject via Quartz Event Services into the system. A gesture event is required to send what becomes NSTouch data, but IIRC could be a fairly generic "gesture" type rather than zoom/pan/etc.
This is sort of a halfway-private solution: Apple supports injecting CGEvents into the system [at least outside The Sandbox? …I've since lost interest in their platforms so haven't researched that one…] so that part is "fine", but the actual CGEvent I create is of an undocumented type, the format for which I had to figure out via hex dumps and some Darwin source code HID headers they shared. It's that work that "TouchEvents.m" implements — that's how Sesamouse could "create valid/acceptable multitouch trackpad out of thin air" — and it should already be separate from the private framework MultitouchSupport stuff that read in the Magic Mouse input.
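For concreteness, here is roughly what the injection end can look like. The CGEventPost half is documented Quartz Event Services API; the exact parameter list of tl_CGEventCreateFromGesture and the dictionary keys it expects are assumptions here, so check TouchEvents.h in the repository before copying this:

    #import <ApplicationServices/ApplicationServices.h>
    #import "TouchEvents.h" // from the calftrail/Touch TouchSynthesis subproject

    static void postSyntheticGesture(CFDictionaryRef gestureWithTouches) {
        // Build the undocumented gesture CGEvent from a dictionary of
        // gesture+touch data (arity assumed here; see TouchEvents.h).
        CGEventRef event = tl_CGEventCreateFromGesture(gestureWithTouches);
        if (event != NULL) {
            // Injecting it is the documented half: Quartz Event Services.
            CGEventPost(kCGHIDEventTap, event);
            CFRelease(event);
        }
    }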

Related

Recording X windows events

I am a novice when it comes to X windows but have some knowledge of Unix as such.
My project requires me to track user input and output on X window system. For instance, if the GUI is used to configure a route, I would like to know what application is used and what route has been configured. So far, I have explored the following options with partial success.
1) Tried to hook functions like XDrawString and XDrawText using LD_PRELOAD.
2) Used xwininfo to obtain window IDs, and tools like xev.
3) Looked through similar discussions in this forum, especially on xev and xinput.
The results so far, in the same order:
1) May not work if X11 is statically linked? Not sure.
2) xev does not record key press events for a file edited with gedit, or for attempting to rename a file from the GUI.
3) I am still working through X window system internals.
I am pretty discouraged so far. Any input/pointer will be appreciated.
I think you want the cnee program from the Xnee project, which uses the X window system Record extension. The examples that I see for cnee are almost always about recording input events, but, according to the Xnee manual at https://xnee.files.wordpress.com/2012/10/xnee1.pdf, section 3.2.1 ("Record"), "Xnee can record the whole X11 protocol, not just mouse and keyboard events."
Regarding font operations: I believe X's font facilities (mostly through the X font server) have also evolved over time, so it might be the case that the applications you care about are doing X font operations which you can trace.
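If you would rather talk to the RECORD extension directly (it is the same mechanism cnee is built on), the skeleton of a recording client looks like the sketch below. RECORD wants two display connections, one for control and one for the data stream; compile with -lX11 -lXtst. Error handling is omitted for brevity:

    /* Minimal X RECORD-extension client: prints the protocol code of
     * every device event (KeyPress = 2, ..., MotionNotify = 6). */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/record.h>

    static void handleEvent(XPointer closure, XRecordInterceptData *rec) {
        if (rec->category == XRecordFromServer) {
            /* rec->data[0] holds the X protocol event code */
            printf("event code: %d\n", rec->data[0]);
        }
        XRecordFreeData(rec);
    }

    int main(void) {
        /* RECORD needs separate control and data connections. */
        Display *ctrl = XOpenDisplay(NULL);
        Display *dataConn = XOpenDisplay(NULL);

        XRecordClientSpec clients = XRecordAllClients;
        XRecordRange *range = XRecordAllocRange();
        range->device_events.first = KeyPress;     /* keyboard events... */
        range->device_events.last  = MotionNotify; /* ...through pointer motion */

        XRecordContext ctx = XRecordCreateContext(ctrl, 0, &clients, 1,
                                                  &range, 1);
        XSync(ctrl, True);

        /* Blocks, calling handleEvent for each recorded protocol packet. */
        XRecordEnableContext(dataConn, ctx, handleEvent, NULL);
        return 0;
    }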

UVC (USB Video Device Class) control of pan/tilt on OS X

I am trying to modify an existing application that talks to a standard USB video device class webcam (a Logitech BCC950 camera) over USB on OS X.
The device (a conferencing webcam) is compliant with USB's "Video Device Class" (https://en.wikipedia.org/wiki/USB_video_device_class). I have provided a link to some source code that allows controlling saturation and white balance of the picture using the webcam's hardware and the VDC specification.
I now want to control the pan/tilt function of this webcam. This is called "CT_PANTILT_ABSOLUTE_CONTROL" in the specification. How do I do this?
This site has some example code for controlling the gain, exposure and a handful of other settings with OS X's IOKit.
The aim would be to make an application similar to this: https://www.youtube.com/watch?v=U10OqVzoHbw that is controllable using a web interface.
I want to send new parameters for the CT_PANTILT_ABSOLUTE_CONTROL command, to control the pan of the camera.
Additionally, in the documentation, VC_PROCESSING_UNIT is listed as 0x05, but in the source, it's listed as 0x02. Also, other sources such as the Linux UVC headers define it as 0x05.
In the UVC specification this is listed under 4.2.2.1.14 PanTilt (Absolute) Control; however, I am unclear about the unit & selector codes that are required to get this information.
I would love some help with the commands & code that need to be written so that this application will work on OS X with IOKit.
With the help of a friend, we have found this: https://github.com/kazu/UVCCameraControl
It is a modified version of the code that I linked to in my question; however, it seems to have support for Pan & Tilt.
I have not yet tried it, but from a quick look at the code, it seems to support everything that I need.
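If you end up rolling the request yourself instead of using that library, a pan/tilt update is an ordinary class-specific control request sent to the camera's control interface. Below is a hedged sketch against an already-opened IOUSBInterfaceInterface: the SET_CUR code (0x01) and the CT_PANTILT_ABSOLUTE_CONTROL selector (0x0D) come from the UVC 1.1 spec, but the camera terminal ID and interface number are placeholders, since those are device-specific and must be read from your camera's descriptors:

    #include <CoreFoundation/CoreFoundation.h>
    #include <IOKit/usb/IOUSBLib.h>

    enum {
        kUVCSetCur = 0x01,                   /* SET_CUR request code */
        kUVCCTPanTiltAbsoluteControl = 0x0D, /* camera terminal selector */
    };

    IOReturn setPanTilt(IOUSBInterfaceInterface **intf,
                        SInt32 panArcSecs, SInt32 tiltArcSecs) {
        /* UVC wants pan then tilt as little-endian signed 32-bit values,
         * in units of arc-seconds (1/3600 of a degree). */
        SInt32 data[2] = { (SInt32)CFSwapInt32HostToLittle(panArcSecs),
                           (SInt32)CFSwapInt32HostToLittle(tiltArcSecs) };

        IOUSBDevRequest req;
        req.bmRequestType = USBmakebmRequestType(kUSBOut, kUSBClass, kUSBInterface);
        req.bRequest      = kUVCSetCur;
        req.wValue        = kUVCCTPanTiltAbsoluteControl << 8;
        req.wIndex        = (1 /* camera terminal ID */ << 8) | 0 /* interface # */;
        req.wLength       = sizeof(data);
        req.pData         = data;
        return (*intf)->ControlRequest(intf, 0, &req);
    }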

How do I tell OS X to ignore the input from one of two connected USB mice?

I have two USB mice connected to my Mac, one of which I'm using as a scanner. I need access to the Generic X and Y data, but I don't want that data to move the cursor. How, under either the Carbon or Cocoa environments, do I tell the system to ignore the mouse as a pointing device?
Edit: after some digging I've found that I can turn off mouse position updating with the CGAssociateMouseAndMouseCursorPosition() function, but this does not allow me to specify a single mouse. Can anyone explain the OS X relationship between HID mouse devices and the cursor? There has to be a binding between the hardware and software on a device-by-device basis, but I can't find it.
I would look into writing a basic user-space driver for the mouse.
This will allow you direct access to the mouse as a USB device. You can also take control of the device from the system for your exclusive use.
There is some documentation here:
Working With USB Device Interfaces
To get you started, the setup steps to connect to a USB device go like this (I think; my IOKit is rusty):
1) Include <IOKit/IOKitLib.h> and <IOKit/usb/IOUSBLib.h>.
2) Find the device you are interested in using IOServiceMatching(). This lets you find the correct USB device based on its properties, including things like vendor ID, &c. (see the IORegistryExplorer tool).
3) Get a USB plug-in instance (let's call it plugin) with IOCreatePlugInInterfaceForService().
4) Use plugin from step 3 to get a device interface (let's call it device) using (*plugin)->QueryInterface().
5) device represents a connection handle to your USB device. Open it first using either (*device)->USBDeviceOpen() or (*device)->USBDeviceOpenSeize(); from there you should be able to send/receive data.
Sounds like a lot, I know, and there might be an easier way, but this is what comes to my mind; there's a sketch of these steps below. There may be some benefits to having this level of control of the device, not sure. Good luck.
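Here is a sketch of those five steps in code. Error checking is omitted, and the vendor/product IDs are placeholders for whatever your second mouse reports (find them in IORegistryExplorer):

    #include <IOKit/IOKitLib.h>
    #include <IOKit/IOCFPlugIn.h>
    #include <IOKit/usb/IOUSBLib.h>

    IOUSBDeviceInterface182 **openMouse(void) {
        /* Placeholder IDs -- substitute your mouse's actual values. */
        SInt32 vendor = 0x1234, product = 0x5678;

        /* Step 2: build a matching dictionary and find the device. */
        CFMutableDictionaryRef matching = IOServiceMatching(kIOUSBDeviceClassName);
        CFNumberRef v = CFNumberCreate(NULL, kCFNumberSInt32Type, &vendor);
        CFNumberRef p = CFNumberCreate(NULL, kCFNumberSInt32Type, &product);
        CFDictionarySetValue(matching, CFSTR(kUSBVendorID), v);
        CFDictionarySetValue(matching, CFSTR(kUSBProductID), p);
        CFRelease(v); CFRelease(p);
        io_service_t service =
            IOServiceGetMatchingService(kIOMasterPortDefault, matching);

        /* Step 3: get the plug-in instance for the service. */
        IOCFPlugInInterface **plugin = NULL;
        SInt32 score;
        IOCreatePlugInInterfaceForService(service, kIOUSBDeviceUserClientTypeID,
                                          kIOCFPlugInInterfaceID, &plugin, &score);

        /* Step 4: ask the plug-in for the device interface. */
        IOUSBDeviceInterface182 **device = NULL;
        (*plugin)->QueryInterface(plugin,
            CFUUIDGetUUIDBytes(kIOUSBDeviceInterfaceID182), (LPVOID *)&device);
        IODestroyPlugInInterface(plugin);
        IOObjectRelease(service);

        /* Step 5: open with "seize" semantics. Whether this actually detaches
         * the HID driver's claim on the mouse is worth verifying. */
        (*device)->USBDeviceOpenSeize(device);
        return device;
    }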

How to use Mac OS X Cocoa events for multitouch gestures

I'm writing a program that has an NSView embedded in an NSScrollView which the user can zoom. I'd love to set it up so the user can zoom the view using the multitouch pinch gesture supported on the MacBook Air and the new unibody MacBooks/MacBooks Pro, and in applications like Safari and iPhoto. I've hunted through Apple's documentation and can't figure out how to do this.
Is this supported using publicly available APIs on Mac OS X 10.5 Leopard?
If not, how "bad" are the private APIs (e.g. is it just an undeclared constant or a whole new set of methods)?
Edit: Snow Leopard adds supported APIs for gestures and multi-touch. See the AppKit release notes for Snow Leopard; ⌘F for “gesture” and “MultiTouch” (sic). They'll look pretty familiar if you've used ones below, but there probably are some fine differences, so read the new documentation anyway.
Is this supported using publicly available APIs on Mac OS X 10.5 Leopard?
No. 10.5.0 doesn't support it at all, and 10.5.1 through 10.5.6 make you implement undocumented methods.
If not, how "bad" are the private APIs (e.g. is it just an undeclared constant or a whole new set of methods)?
Not bad at all. You have to implement some undocumented event methods in your view. Since you're the one implementing the methods, you shouldn't crash if Apple changes the methods; all that will happen is the feature will stop working.
However, if you'll be retrieving the absolute (not delta) magnification or rotation from the event, then those are as-yet-undocumented methods of the event, so you should guard those calls with respondsToSelector: checks and perform careful range-checking on the methods' return values.
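Put together, a guarded handler in an NSView subclass might look like the following sketch; magnifyWithEvent: was the undocumented override point on 10.5 (it became public NSResponder API in 10.6), and zoomFactor is a hypothetical property of your own view, not an AppKit name:

    // In your NSView subclass:
    - (void)magnifyWithEvent:(NSEvent *)event {
        // Guard the undocumented accessor, per the advice above. If Apple
        // renames it, the feature silently stops working instead of crashing.
        if (![event respondsToSelector:@selector(magnification)])
            return;

        CGFloat delta = [event magnification]; // change since the last event
        // Range-check an undocumented return value before trusting it.
        if (delta < -1.0 || delta > 1.0)
            return;

        self.zoomFactor *= 1.0 + delta;
        [self setNeedsDisplay:YES];
    }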

Device Information from NSEvent/CGEvent

My application uses an event tap to capture keyboard events, and I'd like to know which device (i.e. which keyboard) each event comes from. Is there any sort of device-identifying information along with the CGEvent that a tap gets? I've looked at NSEvent's methods, and the various CGEventField keys, but none of them seem to be device-unique. Any help?
You might want to take a look at DDHidLib, Dave Dribin's excellent framework to work with USB HID devices independently.
http://www.dribin.org/dave/blog/archives/2007/03/19/ddhidlib_10
(not just about joysticks, so read more than the first paragraph of that blog post)
Some of the functionality of DDHidLib no longer works under Leopard, due to some security concerns at Apple regarding capturing an HID device, but if you're lucky it might provide you with what you need.
DDHidLib is neat, and in fact I rewrote parts of it for Delicious Library 2 for Leopard's newer HID APIs, and submitted the changes back to the original author -- if you write him you can get the Leopard-only sample code.
Unfortunately, the new Leopard HID APIs have the ability to peek at keyboard events as they pass by, but NOT to intercept them, so you can't build your own application-level device handler unless it's OK that the key events are also going to the AppKit. (This is why there's a BONKING noise when you use a USB barcode scanner in Delicious Library 2 - I peek at the scanner and read the barcode, but then the typing is still sent to the topmost window, which doesn't want it, and beeps a lot. Sigh.)
-Wil
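For reference, the Leopard HID API Wil describes is IOHIDManager. A sketch of watching all keyboards and tracing each input value back to the device that produced it (observation only; as he says, the events still reach AppKit):

    #include <IOKit/hid/IOHIDLib.h>
    #include <IOKit/hid/IOHIDUsageTables.h>

    // Called for every input value; walks from the value back to the
    // device, so you can tell which physical keyboard produced it.
    static void inputCallback(void *context, IOReturn result,
                              void *sender, IOHIDValueRef value) {
        IOHIDDeviceRef device =
            IOHIDElementGetDevice(IOHIDValueGetElement(value));
        CFStringRef product = (CFStringRef)
            IOHIDDeviceGetProperty(device, CFSTR(kIOHIDProductKey));
        if (product != NULL)
            CFShow(product); // e.g. the keyboard's product name
    }

    void watchKeyboards(void) {
        IOHIDManagerRef mgr =
            IOHIDManagerCreate(kCFAllocatorDefault, kIOHIDOptionsTypeNone);

        // Match only generic-desktop/keyboard devices.
        int page = kHIDPage_GenericDesktop, usage = kHIDUsage_GD_Keyboard;
        CFNumberRef pageRef = CFNumberCreate(NULL, kCFNumberIntType, &page);
        CFNumberRef usageRef = CFNumberCreate(NULL, kCFNumberIntType, &usage);
        const void *keys[] = { CFSTR(kIOHIDDeviceUsagePageKey),
                               CFSTR(kIOHIDDeviceUsageKey) };
        const void *vals[] = { pageRef, usageRef };
        CFDictionaryRef match = CFDictionaryCreate(NULL, keys, vals, 2,
            &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);

        IOHIDManagerSetDeviceMatching(mgr, match);
        CFRelease(match); CFRelease(pageRef); CFRelease(usageRef);

        IOHIDManagerRegisterInputValueCallback(mgr, inputCallback, NULL);
        IOHIDManagerScheduleWithRunLoop(mgr, CFRunLoopGetCurrent(),
                                        kCFRunLoopDefaultMode);
        IOHIDManagerOpen(mgr, kIOHIDOptionsTypeNone);
        // Run the run loop (e.g. CFRunLoopRun()) to receive callbacks.
    }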
