Detect pinch-n-zoom on MacBook trackpad in Electron - macOS

The only way I could find to detect pinch-n-zoom in JavaScript is by means of Gesture and Touch events, but those events are only implemented in Safari (not even Chrome). The same code that detects gesture events in Safari didn't detect anything in my Electron app. Is there any way to detect pinch-n-zoom or other two-finger gestures on MacBook trackpads in Electron?
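One behavior worth knowing here: Chromium (and therefore Electron's renderer process) reports a macOS trackpad pinch as a wheel event with ctrlKey set to true, even though no key is pressed; ordinary two-finger scrolling arrives with ctrlKey false. A minimal sketch built on that assumption - the classifier is plain JavaScript, and the commented-out listener is the Electron/Chromium-specific part:

```javascript
// Chromium delivers a macOS trackpad pinch as a `wheel` event with
// `ctrlKey === true` (no actual key press involved); regular two-finger
// scrolling arrives with `ctrlKey === false`.
function classifyWheelEvent(evt) {
  if (evt.ctrlKey) {
    // Negative deltaY means the fingers are moving apart (zoom in).
    return evt.deltaY < 0 ? 'pinch-zoom-in' : 'pinch-zoom-out';
  }
  return 'scroll';
}

// Renderer-process usage (Electron/Chromium specific, not executed here):
// window.addEventListener('wheel', (evt) => {
//   const kind = classifyWheelEvent(evt);
//   if (kind !== 'scroll') evt.preventDefault(); // suppress page zoom
// }, { passive: false });
```

Note that the listener must be registered with { passive: false } for preventDefault() to have any effect on wheel events.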

Related

Xamarin Forms buttons stop receiving mouse clicks after clicking on SkiaSharp CanvasView on iOS

I use a SkiaSharp canvas to draw the main game screen, and there are various Xamarin.Forms Buttons around the UI. This all works fine when used directly on an iPhone or iPad with a finger. However, when I connect a mouse (e.g., through a MacBook or otherwise), after mouse-clicking on the SkiaSharp canvas the buttons only receive mouse clicks about 10% of the time (the other 90% of clicks are lost). The SkiaSharp canvas itself works just fine.
If I bring up the iOS app launch menu from the bottom (which probably somehow temporarily exits the mouse navigation in the app), the buttons start working with the mouse again. But if I click the SkiaSharp canvas again with the mouse, the buttons have a high chance of becoming disabled again. If I change to using a finger, everything works fine (even if mouse clicks were not being registered immediately before). However, mouse clicks are still not registered even after touching with a finger, so finger-touching does not reset the issue with the mouse (but bringing up the menu from the bottom does).
We found this bug by testing the iOS game on a MacBook Pro (iOS apps recently became available on the App Store), but the same issue also occurs directly with an iPad/mouse combination. It seems to be some sort of interaction issue between using a mouse (on iPad or on MacBook Pro), the SkiaSharp canvas, and Xamarin.Forms buttons.
Does anyone know what the root cause of the problem is and what a workaround might be?
Not an answer as such, but some more information about reproducing the issue: A simpler repro case may be this small project: https://github.com/jrc14/TraceMatching/ .
Don't worry too much about what it's doing, but note that you're meant to click in the grey Skia canvas in the middle to create 'targets' - and that after you've done that, mouse clicks are getting lost.
If you run it on a Mac, you'll see that, though the clicks get lost after you've clicked on the Skia canvas, they will start being received again if you click on something else (another app, or the Mac background).
(Further edit) After some noodling around I did find a workaround: once you've finished processing the touch action on the SKCanvasView, reset its EnableTouchEvents property (i.e. set it to false, then back to true again). After that, it seems the clicks don't get lost any more.

Touchscreen gestures are not working in Firefox on Windows 10/11 despite being enabled in the settings

I am running Firefox on a 2-in-1 Lenovo Windows laptop and none of the touchscreen or touchpad gestures work. It is as if they are not enabled, even though they are.
Swiping left or right on the screen should trigger navigation back/forward, but nothing happens. It is similar with the touchpad: the only gesture that works is pinch zoom.
Touch appears to be configured in the settings (default values)
Update: There is a new experimental trackpad swipe gesture. Go to about:config and set widget.disable-swipe-tracker to false.
This made swipe navigation start working on my touchpad.
Any idea about how to get touch enabled on the touch screen as well?
This issue persists across reinstalls of Windows and Firefox, and I have never seen touchscreen gestures work in Firefox on this device.
Ok. After a bit of investigation I have found that there is no support for touchscreen gestures in Firefox. I found a five-year-old bug report about this. The positive news is that there is a recent update on the case which says they will start looking at implementing touchscreen support:
https://bugzilla.mozilla.org/show_bug.cgi?id=1443710
To anyone who is still looking for a fix, change this preference to false in about:config:
widget.disable-swipe-tracker
found in https://bugzilla.mozilla.org/show_bug.cgi?id=1539730#c26

Receiving high precision WM_MOUSEWHEEL events with Logitech mouse on Windows 10

To improve scrolling in my application, I recently added support for high-resolution scroll wheel events. According to the documentation this is pretty straightforward: the handler for WM_MOUSEWHEEL should support arbitrary delta values as opposed to just ±120.
Examples of applications doing this properly are Firefox and Chrome on Windows.
I am using a Logitech mouse with a high-resolution wheel (MX Master 3), but I noticed that all the events I receive are just ±120. However, I found two pretty weird workarounds:
Rename my program to Firefox.exe
Focus on Firefox (with my app in the background), move the mouse over and scroll there
The second trick works with other things as well. For example, it makes the Windows 10 Settings app scroll smoothly. Here's a demo of the difference in action (first 4 seconds focused, then unfocused with Firefox having focus): https://www.youtube.com/watch?v=gb1FUtyLxUg&feature=youtu.be
I assume the driver does this for compatibility with older apps that can't handle anything other than 120. But is there a way for my app to opt in to the better events? Or does the Logitech driver simply hardcode a list of browsers, leaving everyone else out of luck?
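Whatever the opt-in turns out to be, a handler that already copes with arbitrary deltas is the prerequisite. The documented contract is that a wheel notch is WHEEL_DELTA (120), that finer-grained wheels may send smaller deltas, and that the handler should accumulate deltas and carry the remainder forward rather than assume multiples of 120. A language-agnostic sketch of that accumulation logic, written here in JavaScript (in a real Win32 handler the raw delta would come from GET_WHEEL_DELTA_WPARAM(wParam), and linesPerNotch from the user's wheel-scroll-lines setting):

```javascript
// Sketch of the recommended WM_MOUSEWHEEL handling: never assume the
// delta is a multiple of 120. Accumulate raw deltas, emit only whole
// scroll lines, and keep the fractional remainder for the next event so
// high-resolution wheels scroll smoothly instead of in notch-sized jumps.
const WHEEL_DELTA = 120; // mirrors the Win32 constant: one full notch

function makeWheelAccumulator(linesPerNotch = 3) {
  let accumulated = 0; // leftover delta carried between events
  return function onWheelDelta(rawDelta) {
    accumulated += rawDelta;
    // Whole lines to scroll now (truncated toward zero)...
    const lines = Math.trunc((accumulated * linesPerNotch) / WHEEL_DELTA);
    // ...and the remainder stays in the accumulator.
    accumulated -= (lines * WHEEL_DELTA) / linesPerNotch;
    return lines;
  };
}
```

With linesPerNotch = 3, a classic ±120 event still scrolls exactly three lines, while a high-resolution stream of ±20 deltas scrolls one line for every two events instead of being rounded away.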

How can I synthesize Cocoa multi-touch gesture events?

Dear Stack Overflow folks! To this day I never saw the need to ask a question, because all of you have done a great job asking and answering nearly all of the code-related problems I've encountered. So, thank you for that!
At the moment I am working on an iOS application that processes raw touch events. These are then sent to an iMac over a WiFi network (the protocol I use is OSC). On the OS X side there is a server application listening for these OSC messages and converting them to mouse pointer movement / mouse button presses / multi-touch gestures. So basically I want to build a (of course much more basic) software bundle like Mobile Mouse (http://mobilemouse.com/) that I can adapt to the needs of our customers (by customizing colors, additional buttons, gestures, and so on) for small remote-control projects.
Right now, everything works but the multitouch gestures (pinch, rotate, two-finger-scroll). So my question is: How can I programmatically create and post a multitouch gesture event?
I searched a lot and found some threads about it here on Stack Overflow, but none of them could help me:
Is there a way to trigger gesture events on Mac OS X?
Is there a way to change rotation of gesture events?
Generate and post Multitouch-Events in OS X to control the mac using an external camera
...
Update 1:
The last thing I tried was:
CGEventSourceRef eventSource = CGEventSourceCreate(kCGEventSourceStateCombinedSessionState);
CGEventRef event = CGEventCreate(eventSource);
// Note: NSEventTypeMagnify is an AppKit NSEventType, not a CGEventType,
// so this does not produce a genuine magnify gesture event.
CGEventSetType(event, NSEventTypeMagnify);
CGEventPost(kCGHIDEventTap, event);
CFRelease(event);
CFRelease(eventSource);

Detecting multitouch iPhone-like "tap" on MacBookPro

After a period of iPhone work, I'm once again working on normal Cocoa apps on my MBP, and I miss the "tap" gesture. I know that I can turn on the incredibly annoying "Tap to Click" feature in the Trackpad control panel, but I don't want a click, I want a tap.
I know it's probably not Mac canon, but is it possible to receive this multi-touch-style event?
You might want to have a look at the native code underpinning my Java API: http://kenai.com/projects/macmultitouch
