How can I synthesize Cocoa multi-touch gesture events?

Dear Stack Overflow folks! To this day I never saw the need to ask a question, because all of you have done a great job asking and answering nearly all of the code-related problems I have encountered. So, thank you for that!
At the moment I am working on an iOS application that processes raw touch events. These are sent to an iMac over a WiFi network (the protocol I use is OSC). On the OS X side, a server application listens for these OSC messages and converts them to mouse pointer movements, mouse button presses, and multi-touch gestures. So basically I want to build a (much more basic, of course) software bundle like Mobile Mouse (http://mobilemouse.com/) that I can adapt to the needs of our customers (by customizing colors, adding buttons and gestures, and so on) for small remote control projects.
Right now everything works except the multi-touch gestures (pinch, rotate, two-finger scroll). So my question is: how can I programmatically create and post a multi-touch gesture event?
I searched a lot and found some threads about it here on Stack Overflow, but none of them could help me:
Is there a way to trigger gesture events on Mac OS X?
Is there a way to change rotation of gesture events?
Generate and post Multitouch-Events in OS X to control the mac using an external camera
...
Update 1:
The last thing I tried was:
CGEventSourceRef eventSource = CGEventSourceCreate(kCGEventSourceStateCombinedSessionState);
CGEventRef event = CGEventCreate(eventSource);
// NSEventTypeMagnify is an AppKit NSEventType, not a CGEventType, which is
// presumably why no magnify gesture ever arrives on the receiving side.
CGEventSetType(event, NSEventTypeMagnify);
CGEventPost(kCGHIDEventTap, event);
CFRelease(event);
CFRelease(eventSource);
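
Update 2: For the two-finger-scroll part at least, there seems to be a public API that can synthesize the event: CGEventCreateScrollWheelEvent. A minimal sketch of what I mean (the delta values are arbitrary placeholders):

#include <ApplicationServices/ApplicationServices.h>

// Synthesize a small scroll. This covers scrolling only -- pinch and rotate
// appear to have no public CGEvent equivalent that I can find.
CGEventSourceRef source = CGEventSourceCreate(kCGEventSourceStateCombinedSessionState);
CGEventRef scroll = CGEventCreateScrollWheelEvent(source,
                                                  kCGScrollEventUnitPixel,
                                                  2,    // wheel count: vertical + horizontal
                                                  -10,  // vertical delta (pixels)
                                                  0);   // horizontal delta
CGEventPost(kCGHIDEventTap, scroll);
CFRelease(scroll);
CFRelease(source);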

Related

Unity: GUI button with event trigger as virtual control pad on Android phone

Let's say I have two GUI buttons with EventTriggers as virtual keys. The expectation is that whenever a button is pressed, the camera rotates until the button is released.
At the beginning I used the Pointer Down and Pointer Up functions. They work, but they are extremely sensitive: the camera rotation didn't stop the moment I released my finger. I solved this problem by using the drag functions (whenever dragging is detected, the camera stops rotating, something like that).
However, there is still a bug that I couldn't solve: if I swipe across the button instead of touching and releasing it, the button never releases, and the camera just keeps rotating until I touch the button again. I've tried all the event trigger functions, such as Pointer Exit, End Drag, etc.
I just want the touch input to work as reliably as keyboard input.
The problem doesn't show up when I debug with Unity Remote, only when I build it on my phone. So is it a hardware issue? (I'm using a Mi 3.)
Thanks for taking the time to read my broken English.
I'm sorry that I've asked a stupid question, since I'm still a beginner. The problem was that somehow the build didn't update when I installed it on my phone. It actually works fine.

How can I implement gesture recognizers in OS X?

I have done quite a bit with gesture recognizers for iOS, but I am now doing work in OS X, and I am lost.
I want to duplicate the functionality that exists in Finder, where you can two-finger swipe (on your Magic Mouse) to go back/forward through a directory tree.
I have an NSWindow-based app that looks very similar to Finder. I have used apps before that allow you to build your own gesture recognizers, so I know it is possible, but I don't see any documentation on it.
What do I need to do to implement these gestures?
Mac now has:
NSClickGestureRecognizer
NSMagnificationGestureRecognizer
NSPanGestureRecognizer
NSPressGestureRecognizer
NSRotationGestureRecognizer
Available in storyboards too.
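For example, a minimal sketch of hooking one up in code (this assumes an NSViewController subclass; the handler name is my own):

#import <Cocoa/Cocoa.h>

// In an NSViewController subclass:
- (void)viewDidLoad {
    [super viewDidLoad];
    NSMagnificationGestureRecognizer *pinch =
        [[NSMagnificationGestureRecognizer alloc] initWithTarget:self
                                                          action:@selector(handlePinch:)];
    [self.view addGestureRecognizer:pinch];
}

- (void)handlePinch:(NSMagnificationGestureRecognizer *)sender {
    // magnification accumulates over the life of the pinch gesture.
    NSLog(@"magnification: %f", sender.magnification);
}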
You can read about Handling Trackpad Events in the Cocoa Event Handling guide. The system can detect some pre-defined gestures (swipe, rotate, etc.) or you can listen to the raw touch events, which travel up the NSResponder chain, just like regular mouse events.
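For the raw-touch route, a rough sketch of the kind of NSView subclass that opts in to touch events (the class name is my own):

#import <Cocoa/Cocoa.h>

@interface TouchView : NSView
@end

@implementation TouchView
- (instancetype)initWithFrame:(NSRect)frame {
    if ((self = [super initWithFrame:frame])) {
        self.acceptsTouchEvents = YES;  // opt in to trackpad touch events
    }
    return self;
}

- (void)touchesBeganWithEvent:(NSEvent *)event {
    NSSet *touches = [event touchesMatchingPhase:NSTouchPhaseBegan inView:self];
    NSLog(@"%lu touch(es) began", (unsigned long)touches.count);
}
@end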
Looks like there is also an Event Recognizer class in CZKit. https://github.com/CarterA/CZKit
I haven't used this (yet), so YMMV.

Different kinds of clicks in Mac OS?

I have just bought a Wacom Bamboo touch tablet. It works fine with all applications except the Twitter client, which gets a bit confused when I click on a link.
Is there a quick bit of code I can knock up / API I can call to see what kind of mouse events are being generated by the driver (just to satisfy my curiosity)?
To clarify: I'm not writing an app here... just trying to use a product and work out why it's not working properly.
Tablet events are somewhat different from mouse events. Specifically:
A [tablet] pointer event is an NSEvent object of type NSTabletPoint or an object representing a mouse-down, mouse-dragged, or mouse-up event with a subtype of NSTabletPointEventSubtype.
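To actually watch what the driver is generating, here is a rough sketch (not from the original answer) using NSEvent's global monitor; on recent systems this kind of monitoring may require granting the tool accessibility permission:

#import <Cocoa/Cocoa.h>

int main(void) {
    @autoreleasepool {
        [NSApplication sharedApplication];  // ensure a window server connection
        NSEventMask mask = NSEventMaskLeftMouseDown | NSEventMaskLeftMouseUp |
                           NSEventMaskLeftMouseDragged | NSEventMaskTabletPoint;
        [NSEvent addGlobalMonitorForEventsMatchingMask:mask
                                               handler:^(NSEvent *event) {
            if (event.type == NSEventTypeTabletPoint) {
                NSLog(@"pure tablet event");
            } else {
                // Tablet-generated mouse events carry the tablet-point subtype.
                NSLog(@"mouse event type=%lu subtype=%d",
                      (unsigned long)event.type, (int)event.subtype);
            }
        }];
        [[NSRunLoop currentRunLoop] run];
    }
    return 0;
}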

Windows Phone Map control has unresponsive pinch zoom

I'm seeing very spotty/unresponsive behavior with the Silverlight map control on Windows Phone 7 (Microsoft.Phone.Controls.Map). The map control doesn't seem to pick up a lot of my pinch gestures. Is anyone else seeing this? Is there a workaround?
Just as an experiment, I hooked up custom gesture listeners for PinchStarted, PinchDelta, and PinchCompleted. My event handlers fire every time I pinch, but the Silverlight map control doesn't pick them up "most" of the time.
PS. I'm using the most up to date SDK/Toolkits.
After talking to our Microsoft contact, we have no answers for the pinch performance problems. The official response side-stepped our question and recommended putting +/- buttons on the map for zooming in and out. To show the +/- buttons, use the ZoomBarVisible property.

Detecting multitouch iPhone-like "tap" on MacBookPro

After a period of iPhone work, I'm once again working on normal Cocoa apps on my MBP,
and I miss the "tap" gesture. I know that I can turn on the incredibly annoying "Tap to Click"
feature in the Trackpad control panel, but I don't want a click, I want a tap.
I know it's probably not Mac canon, but is it possible to receive this multi-touch style event?
You might want to have a look at the native code underpinning my Java API: http://kenai.com/projects/macmultitouch
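Failing that, one way I'd sketch a tap detector with raw touches (the 0.2 s threshold and class name are my own assumptions, and a real version should also check that the finger didn't move):

#import <Cocoa/Cocoa.h>

@interface TapView : NSView
@property (nonatomic) NSTimeInterval touchDownTime;
@end

@implementation TapView
- (instancetype)initWithFrame:(NSRect)frame {
    if ((self = [super initWithFrame:frame])) {
        self.acceptsTouchEvents = YES;  // receive raw NSTouch events
    }
    return self;
}

- (void)touchesBeganWithEvent:(NSEvent *)event {
    self.touchDownTime = event.timestamp;
}

- (void)touchesEndedWithEvent:(NSEvent *)event {
    // Treat a quick down-up as a "tap"; no click is ever generated.
    if (event.timestamp - self.touchDownTime < 0.2) {
        NSLog(@"tap");
    }
}
@end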
