Cocoa: detect when hand is over right side of Magic Mouse

How can I detect if a user's hand is on the right side of a Magic Mouse? Not right-clicking, just checking which side of the mouse the finger is on.

Unless you write something in IOKit to handle this, it isn't that easy: what the app gets is what the driver (kext) sends it.
You could get something like BetterTouchTool or MagicPrefs, which open up a variety of options, e.g. the positions of fingers on the mouse, where fingers are and are not registered, and so on.
Writing IOKit kexts isn't a simple process, but you could begin here:
https://developer.apple.com/library/mac/#documentation/devicedrivers/conceptual/IOKitFundamentals/Introduction/Introduction.html
Other than that, you're stuck with what the kext sends to your app as a notification.

Apple's official driver is really limited. This includes the lack of support for advanced gestures like pinch and rotate.
The following proof of concept grabs (very crudely) the pinch event, using the Euclidean distance between the two fingers, and sends a combined keystroke to the frontmost application in response (kCGHIDEventTap). Launch the binary and bring a Preview.app window to the front, and you'll be able to pinch in/out using your Magic Mouse...amazing! :-)
Take a look at http://www.iphonesmartapps.org/aladino/?a=multitouch
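How the finger positions are read is up to the linked demo, but the keystroke-posting half (kCGHIDEventTap) is plain Core Graphics. A minimal sketch of just that part, assuming the frontmost app zooms in on Cmd+'=' and that 0x18 is the '=' key code (true on a US layout; both are assumptions):

    // post_zoom_keystroke.cpp: send Cmd+'=' to whatever app is frontmost.
    // Build on macOS: clang++ post_zoom_keystroke.cpp -framework ApplicationServices
    // Note: recent macOS versions may ask for the Accessibility permission before
    // synthetic events are actually delivered.
    #include <ApplicationServices/ApplicationServices.h>

    int main() {
        const CGKeyCode kEqualsKey = 0x18;   // '=' on a US ANSI layout (layout-dependent, an assumption)

        CGEventRef keyDown = CGEventCreateKeyboardEvent(NULL, kEqualsKey, true);
        CGEventRef keyUp   = CGEventCreateKeyboardEvent(NULL, kEqualsKey, false);
        CGEventSetFlags(keyDown, kCGEventFlagMaskCommand);   // hold Command -> Cmd+'=' (Zoom In in many apps)
        CGEventSetFlags(keyUp,   kCGEventFlagMaskCommand);

        CGEventPost(kCGHIDEventTap, keyDown);   // inject at the HID level, so the frontmost app receives it
        CGEventPost(kCGHIDEventTap, keyUp);

        CFRelease(keyDown);
        CFRelease(keyUp);
        return 0;
    }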
Extending Functionality of Magic Mouse: Do I Need a kext?
Apple Magic Mouse Api

Related

Raw mouse and keyboard input for macOS games post-Catalina?

I have a feeling I know the answer to this question, but it's been surprisingly difficult to confirm it anywhere. Is it possible to get raw keyboard and mouse input post-Catalina without needing "Input Monitoring" access in the Security & Privacy preferences?
I've never programmed a game engine before on macOS, but so far as I can tell, the "usual" way for games to get raw mouse deltas and raw keyboard states has been to use IOKit's HID API. Up until recently, this apparently was also the only way to get input from Xbox and PlayStation gamepads.
When I attempt to use the IOKit HID API to get mouse and keyboard input, I receive a prompt to allow Input Monitoring in Security & Privacy preferences. Elsewhere on the internet, I've learned that this is due to a change in macOS Catalina: to use IOKit HID input handling, you now need the Input Monitoring permission. As far as I'm concerned, it's a deal-breaker for my game to require users turn this on.
Am I understanding this situation correctly? And if I am, is it just not possible to get raw (unprocessed) keyboard and mouse input without the Input Monitoring permission? I mean, I can hack together a solution for FPS-style mouse input using mouseMoved: events and the like, but such a solution would disallow sub-pixel precision and would subject my game to macOS's cursor acceleration. Is there another solution that games/game engines use?
If you’re willing to take on the macOS 11 requirement, it seems like Apple added raw keyboard and mouse support to the Game Controller framework. Seems to even support caps lock.
https://developer.apple.com/documentation/gamecontroller/gckeyboardinput?language=objc

Make a key click a certain place on the screen

I am running a game that has buttons on both sides of the screen, which gives you easy control on a tablet. But on ARC it is difficult to use, because you need to move your mouse across the screen a bunch of times. Does ARC Welder have an option to make a key on the keyboard "tap" a certain place on the screen?
If you are comfortable with the concept of scripting, you could use AutoHotKey to map keyboard events to clicks on specific areas of the screen. This would go through your OS, not ARC, but I think the script can be tied to a specific application so it will only run with that app.
See specifically the Click command.

Using CGDisplayStream to detect window movement

I want to detect when a window is being moved in real time and figured that CGDisplayStreamCreate etc. should provide just that. But I'm having difficulty deciding which window is being moved when my CGDisplayStreamFrameAvailableHandler is called. Is there a direct way to match the updated rects with an app and its windows?
CGDisplayStream cannot tell you which applications/windows are responsible for a given screen update. You might be able to use another API like Accessibility to determine window locations and then guess which of the kCGDisplayStreamUpdateMovedRects corresponds to each window, but that will not be very reliable. If you're going to go the route of Accessibility, you may as well use Accessibility notifications for window move events: How can my app detect a change to another app's window?.
If you also need the pixel contents of the windows when they are moving, then you'll need to do some unfortunate time alignment between CGDisplayStream and Accessibility callbacks.
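If you go the Accessibility route, a rough sketch of observing another app's window moves looks like the following (the target process id is assumed to come from the command line, and the observing process must be granted the Accessibility permission):

    // window_move_observer.cpp: print a line whenever a window of the target app moves.
    // Build on macOS: clang++ window_move_observer.cpp -framework ApplicationServices
    #include <ApplicationServices/ApplicationServices.h>
    #include <cstdio>
    #include <cstdlib>

    // Fired on the run loop for every kAXMovedNotification registered below.
    static void windowMoved(AXObserverRef, AXUIElementRef window, CFStringRef, void *) {
        AXValueRef positionValue = NULL;
        if (AXUIElementCopyAttributeValue(window, kAXPositionAttribute,
                                          (CFTypeRef *)&positionValue) == kAXErrorSuccess) {
            CGPoint origin;
            AXValueGetValue(positionValue, kAXValueTypeCGPoint, &origin);
            std::printf("window moved to (%.0f, %.0f)\n", origin.x, origin.y);
            CFRelease(positionValue);
        }
    }

    int main(int argc, char **argv) {
        if (argc < 2) { std::fprintf(stderr, "usage: %s <pid>\n", argv[0]); return 1; }
        pid_t pid = (pid_t)std::atoi(argv[1]);   // pid of the app to watch (assumed to be passed in)

        AXUIElementRef app = AXUIElementCreateApplication(pid);
        AXObserverRef observer = NULL;
        if (AXObserverCreate(pid, windowMoved, &observer) != kAXErrorSuccess) return 1;

        // Register for move notifications on every window the app currently has.
        CFArrayRef windows = NULL;
        if (AXUIElementCopyAttributeValue(app, kAXWindowsAttribute,
                                          (CFTypeRef *)&windows) == kAXErrorSuccess) {
            for (CFIndex i = 0; i < CFArrayGetCount(windows); ++i) {
                AXUIElementRef window = (AXUIElementRef)CFArrayGetValueAtIndex(windows, i);
                AXObserverAddNotification(observer, window, kAXMovedNotification, NULL);
            }
            CFRelease(windows);
        }

        CFRunLoopAddSource(CFRunLoopGetCurrent(),
                           AXObserverGetRunLoopSource(observer), kCFRunLoopDefaultMode);
        CFRunLoopRun();   // block and deliver callbacks
        return 0;
    }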

VC++ mouse events

I want to write a console program for mouse events (only mouse scroll). How do I do it in VC++? The application will listen only to scroll events.
Description: if the user scrolls down, the desktop window fades down, and it fades back in when the user scrolls up.
Here I just need to know how to listen to mouse events in a console app.
Note: I am developing with the Win32 API, and I am using VS2010 as my development environment.
I've never actually done this myself. It seems that a console application responding to mouse events almost belies its nature and intended purpose. Generally, you would only need to respond to keyboard input from a console app and leave the mouse stuff to a GUI app.
That being said, this tutorial indicates that it is in fact possible to capture these mouse events from a Win32 console application. Generally, the suggestion is to use the ReadConsoleInput function and extract the information of interest from the INPUT_RECORD structure that it fills. The only tricky thing is that ReadConsoleInput is a blocking call, which means it will not return until an input event fires. You'll need to structure your application's code accordingly. Mouse events are covered in detail about three-quarters of the way down the page.
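A minimal sketch of that loop, filtered down to wheel events (the desktop fade effect itself is left out; the console-mode flags are needed so QuickEdit mode doesn't swallow mouse input):

    // scroll_listener.cpp: minimal Win32 console loop that reports mouse wheel events.
    // Build with VS2010 or later: cl scroll_listener.cpp
    #include <windows.h>
    #include <cstdio>

    int main() {
        HANDLE input = GetStdHandle(STD_INPUT_HANDLE);

        // Enable mouse reporting; ENABLE_EXTENDED_FLAGS without ENABLE_QUICK_EDIT_MODE
        // turns QuickEdit off so mouse events reach the application.
        DWORD previousMode = 0;
        GetConsoleMode(input, &previousMode);
        SetConsoleMode(input, ENABLE_MOUSE_INPUT | ENABLE_EXTENDED_FLAGS);

        INPUT_RECORD record;
        DWORD eventsRead = 0;
        for (;;) {
            // ReadConsoleInput blocks until the next input event arrives.
            if (!ReadConsoleInput(input, &record, 1, &eventsRead) || eventsRead == 0)
                break;
            if (record.EventType == MOUSE_EVENT &&
                record.Event.MouseEvent.dwEventFlags == MOUSE_WHEELED) {
                // The wheel delta is in the high word of dwButtonState: positive means scroll up.
                short delta = (short)HIWORD(record.Event.MouseEvent.dwButtonState);
                std::printf(delta > 0 ? "scroll up (fade in)\n" : "scroll down (fade out)\n");
            }
        }

        SetConsoleMode(input, previousMode);   // restore the original console mode
        return 0;
    }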

No keyboard input from GLUT while mouse is moving (In OS X)

So I just built an OpenGL application on a Mac for the first time. I'm using GLUT to get keyboard input. The trouble is, I've discovered that if I'm moving the mouse at the same time I push a button on the keyboard, my keyboard function doesn't get called! If I push a button when the mouse isn't moving, it gets called just fine. The same goes for my keyUp function. Why could this be?
I'm also having trouble with the mouse motionFunc: it seems not to be called every frame, which leads to choppy mouse input...
Can you provide a code sample? It sounds like a bug in your event handling code.
That said, GLUT is no longer developed and you should not be using it. There are numerous better alternatives, the most popular being SDL. Others include GLFW and SFML, and you can even use Qt.
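For comparison with your own registration code, a bare-bones GLUT input setup usually records key state in the keyboard callbacks and polls it while drawing, rather than acting inside the callbacks. A sketch (the drawing body is a placeholder):

    // glut_input.cpp: baseline GLUT keyboard/mouse handling.
    // Build on OS X: clang++ glut_input.cpp -framework GLUT -framework OpenGL
    #include <GLUT/glut.h>   // use <GL/glut.h> on other platforms
    #include <cstdio>

    static bool gKeys[256] = { false };

    static void onKeyDown(unsigned char key, int, int) { gKeys[key] = true; }
    static void onKeyUp(unsigned char key, int, int)   { gKeys[key] = false; }

    static void onMouseMove(int x, int y) {
        // Called whenever the mouse moves with no button pressed.
        std::printf("mouse at %d,%d\n", x, y);
    }

    static void onDisplay() {
        glClear(GL_COLOR_BUFFER_BIT);
        if (gKeys['w']) { /* move forward, etc. */ }
        glutSwapBuffers();
        glutPostRedisplay();   // keep redrawing so key state is polled every frame
    }

    int main(int argc, char **argv) {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
        glutCreateWindow("input test");
        glutIgnoreKeyRepeat(1);              // one down/up pair per physical press
        glutKeyboardFunc(onKeyDown);
        glutKeyboardUpFunc(onKeyUp);
        glutPassiveMotionFunc(onMouseMove);
        glutDisplayFunc(onDisplay);
        glutMainLoop();
        return 0;
    }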
