How can I implement gesture recognizers in OS X?

I have done quite a bit with gesture recognizers for iOS, but I am now doing work in OS X, and I am lost.
I want to duplicate the functionality found in Finder, where you can two-finger swipe (on a Magic Mouse) to go back/forward through a directory tree.
I have an NSWindow-based app that looks very similar to Finder. I have used apps before that allow you to build your own gesture recognizers, so I know it is possible, but I don't see any documentation on it.
What do I need to do to implement these gestures?

Mac now has:
NSClickGestureRecognizer
NSMagnificationGestureRecognizer
NSPanGestureRecognizer
NSPressGestureRecognizer
NSRotationGestureRecognizer
Available in storyboards too.
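A minimal sketch of attaching one in code, assuming an NSViewController subclass; handlePan: and its logging are placeholders of mine:

// Attach a pan recognizer to this controller's view.
- (void)viewDidLoad {
    [super viewDidLoad];
    NSPanGestureRecognizer *pan =
        [[NSPanGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handlePan:)];
    [self.view addGestureRecognizer:pan];
}

// Called as the pan gesture begins, changes, and ends.
- (void)handlePan:(NSPanGestureRecognizer *)sender {
    NSPoint p = [sender locationInView:self.view];
    NSLog(@"pan at %@", NSStringFromPoint(p));
}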

You can read about Handling Trackpad Events in the Cocoa Event Handling guide. The system can detect some pre-defined gestures (swipe, rotate, etc.) or you can listen to the raw touch events, which travel up the NSResponder chain, just like regular mouse events.
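For the Finder-style two-finger swipe from the original question, the pre-defined swipe gesture arrives as swipeWithEvent: on the responder chain. A minimal sketch, assuming an NSView subclass; goBack/goForward are hypothetical navigation methods, and the deltaX sign mapping is worth verifying on your hardware:

// The system calls this when it detects a swipe on the trackpad
// or Magic Mouse; deltaX reports the horizontal direction.
- (void)swipeWithEvent:(NSEvent *)event {
    if (event.deltaX > 0) {
        [self goBack];       // hypothetical: navigate back
    } else if (event.deltaX < 0) {
        [self goForward];    // hypothetical: navigate forward
    }
}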

Looks like there is also an Event Recognizer class in CZKit. https://github.com/CarterA/CZKit
I haven't used this (yet), so YMMV.

Related

How to recognize mouse wheel in MAUI view for desktop application

How can I get notified of mouse wheel interaction for Mac Catalyst and Windows on the MAUI platform?
Answer 1: Scrolling.
What do you want to do based on mouse wheel interaction? If you simply want to scroll, or to know when scrolling has occurred, then you can rely on ScrollView and other views that handle scrolling themselves, e.g. the ScrollView.Scrolled event.
Answer 2: General use of mouse scroll wheel.
Input functionality for mouse or keyboard has not yet been implemented in MAUI. Nor has a specification been finalized.
Here is one mouse proposal.
You could add a comment to that proposal requesting that mouse wheel support be included.
However, this might not be in the first release of MAUI, as the current emphasis is on stabilizing the functionality that is needed on all platforms (including mobile), some of which don't have mice.
In case anyone is wondering "shouldn't this be specified in .NET 6?" (and then MAUI would simply use it):
There are interactions between what is happening on the display (views or windows) and how mouse/keyboard input should be handled, so it makes sense to put that input handling in the same code base that is drawing to the screen. Therefore MAUI is a good place for it.
Especially given that touch is part of MAUI.
Until then, the solution is to write a DependencyService on each platform that calls the platform APIs you need.
Surprisingly, I'm not finding one that anyone has written for mouse support on Windows and Mac.
Other than "implicitly", since a mouse can be used similarly to a touch device, and text can be typed on a keyboard. The point is that there is no API specific to functionality that only makes sense with a physical mouse (scroll wheel) or a physical keyboard (global keyboard hooks).
TBD I'll look into this further.
The basic approach would be to look at what WinUI 3 uses as its input APIs.
In a Windows desktop app, forward to those input APIs; on the other platforms (Mac, Linux), write an adapter.
I'll see if Uno Platform or Avalonia have taken this approach.

How to make sure the keyboard is not placed over an Entry when it is focused in Xamarin?

I have an Entry which is placed in a ContentView, and this ContentView is placed in a Grid. When this Entry is focused, the keyboard is placed over the ContentView, preventing the user from seeing the Entry.
I would like to know if there is a way to determine whether a View is visible and, if not, make sure it is (i.e. prevent the keyboard from being placed over it).
Any thoughts on how I could do this?
I would need this to work on iOS specifically, Android and Windows seem not to have this issue in my use-case.
On the Android platform, the official documentation covers the soft keyboard input mode; refer to https://learn.microsoft.com/en-us/xamarin/xamarin-forms/platform/android/soft-keyboard-input-mode for details.
On the iOS platform, you can install the KeyboardOverlap package from NuGet and then call KeyboardOverlapRenderer.Init(); in your AppDelegate to achieve the same effect on iOS.

How can I synthesize Cocoa multi-touch gesture events?

Dear Stack Overflow folks! To this day I never saw the need to ask a question, because all of you have done a great job asking and answering nearly all of the code-related problems I have encountered. So, thank you for that!
At the moment I am working on an iOS application that processes raw touch events. These are sent to an iMac over a WiFi network (the protocol I use is OSC). On the OS X side there is a server application listening for these OSC messages and converting them to mouse pointer movement / mouse button presses / multi-touch gestures. So basically I want to build a (much more basic) software bundle like Mobile Mouse (http://mobilemouse.com/) that I can adapt to the needs of our customers (by customizing colors, additional buttons, gestures, and so on) for small remote-control projects.
Right now, everything works but the multitouch gestures (pinch, rotate, two-finger scroll). So my question is: how can I programmatically create and post a multitouch gesture event?
I searched a lot and found some threads about it here on stackoverflow, but none of them could help me:
Is there a way to trigger gesture events on Mac OS X?
Is there a way to change rotation of gesture events?
Generate and post Multitouch-Events in OS X to control the mac using an external camera
...
Update 1:
The last thing I tried was:
CGEventSourceRef eventSource = CGEventSourceCreate(kCGEventSourceStateCombinedSessionState);
CGEventRef event = CGEventCreate(eventSource);
// NSEventTypeMagnify is an AppKit NSEventType, not a CGEventType,
// so this does not actually produce a magnify gesture event.
CGEventSetType(event, NSEventTypeMagnify);
CGEventPost(kCGHIDEventTap, event);
CFRelease(event);
CFRelease(eventSource);
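As far as I know, there is no public CGEvent constructor for the multi-touch gesture event types, which is why the attempt above fails. A hedged workaround sketch: synthesize plain scroll-wheel events, which most apps treat the same as two-finger scrolling (the delta values below are illustrative):

// Post a small two-axis scroll, which most apps handle like
// two-finger scrolling on a trackpad.
CGEventRef scroll = CGEventCreateScrollWheelEvent(NULL,
                                                  kCGScrollEventUnitPixel,
                                                  2,    // number of axes
                                                  -10,  // vertical delta
                                                  0);   // horizontal delta
CGEventPost(kCGHIDEventTap, scroll);
CFRelease(scroll);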

Enable click-through without overriding NSView acceptsFirstMouse

The proper way of enabling click-through is to override acceptsFirstMouse on an NSView to return YES. (Click-through means that you can click on and use a control even when its window is not focused. For example, the Finder toolbar buttons and the traffic-light window controls use this.)
My problem is that my application is not based on Cocoa, but on GTK. Under the hood, GTK uses some Carbon and Cocoa, and I can get a pointer to the NSView if I want - but I can't make the widget use a different NSView subclass without editing the GTK source. What are other ways to achieve click-through?
(And, if possible, "hover-through" - I'd like to have mouse-over events on the click-through controls, too, so I can highlight them, telling the user that they are clickable.)
I could call Carbon's InstallWindowEventHandler with kEventWindowGetClickActivation, but I'm not sure how to use it, or whether it will work (I've read that Carbon is deprecated and might not work on modern Macs anymore). Alternatively, there must be a low-level mechanism to enable this (the Cocoa mechanism has to be implemented somehow). Any ideas?
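One possible angle, sketched under the assumption that a pointer to the NSView is obtainable: subclass it dynamically with the Objective-C runtime, so acceptsFirstMouse: can be overridden without editing GTK. EnableClickThrough and the class-name suffix below are made-up names for illustration:

#import <Cocoa/Cocoa.h>
#import <objc/runtime.h>

// Replacement implementation: accept the activating click.
static BOOL ClickThroughAcceptsFirstMouse(id self, SEL _cmd, NSEvent *event) {
    return YES;
}

// Swap the view's class for a runtime-made subclass that overrides
// acceptsFirstMouse: (the same trick KVO uses internally).
void EnableClickThrough(NSView *view) {
    Class base = object_getClass(view);
    NSString *name = [NSStringFromClass(base) stringByAppendingString:@"_ClickThrough"];
    Class sub = NSClassFromString(name);
    if (!sub) {
        sub = objc_allocateClassPair(base, name.UTF8String, 0);
        class_addMethod(sub, @selector(acceptsFirstMouse:),
                        (IMP)ClickThroughAcceptsFirstMouse, "c@:@");
        objc_registerClassPair(sub);
    }
    object_setClass(view, sub);
}

For the hover-through part, the same view could be given an NSTrackingArea with the NSTrackingActiveAlways option, so mouseEntered:/mouseExited: fire even while the window is inactive.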

Detecting multitouch iPhone-like "tap" on MacBookPro

After a period of iPhone work, I'm once again working on normal Cocoa apps on my MBP, and I miss the "tap" gesture. I know that I can turn on the incredibly annoying "Tap to Click" feature in the Trackpad control panel, but I don't want a click, I want a tap.
I know it's probably not Mac canon, but is it possible to receive this multi-touch style event?
You might want to have a look at the native code underpinning my Java API: http://kenai.com/projects/macmultitouch
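AppKit can also deliver raw trackpad touches directly, so a tap can be approximated by timing them. A rough sketch; the 0.2 s threshold is arbitrary, and a real version should also reject touches that move:

#import <Cocoa/Cocoa.h>

// A view that opts in to raw trackpad touches and logs quick taps.
@interface TapView : NSView
@property (nonatomic) NSTimeInterval touchStart;
@end

@implementation TapView
- (instancetype)initWithFrame:(NSRect)frame {
    if ((self = [super initWithFrame:frame])) {
        // Indirect touches = trackpad (as opposed to a touch screen).
        self.allowedTouchTypes = NSTouchTypeMaskIndirect;
    }
    return self;
}

- (void)touchesBeganWithEvent:(NSEvent *)event {
    self.touchStart = event.timestamp;
}

- (void)touchesEndedWithEvent:(NSEvent *)event {
    // Treat a very short touch as a tap rather than a drag.
    if (event.timestamp - self.touchStart < 0.2) {
        NSLog(@"tap detected");
    }
}
@end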
