Detect keyboard events on macOS

I'm developing a cross-platform Java application that is essentially based on AWT and Processing. I'm stuck on a key-responsiveness issue on macOS: it seems that the AWT canvas can't gain focus, but only for keyboard events.
Is there any software available on macOS that detects the OS's input events and shows which window or process they are being delivered to?
Thank you all,
Stefano
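(A minimal diagnostic sketch, assuming a plain Frame-plus-Canvas setup like the one described; the class name is made up for illustration. It asks the canvas for focus once the window is showing, and installs a KeyEventDispatcher that logs every key event the JVM receives along with the current focus owner, so you can tell whether key events reach the process but bypass the canvas.)

import java.awt.*;
import java.awt.event.*;

public class CanvasFocusProbe {
    public static void main(String[] args) {
        Frame frame = new Frame("Focus probe");
        Canvas canvas = new Canvas();
        canvas.setPreferredSize(new Dimension(400, 300));
        canvas.setFocusable(true); // a Canvas must be focusable to receive key events
        canvas.addKeyListener(new KeyAdapter() {
            @Override public void keyPressed(KeyEvent e) {
                System.out.println("canvas received: " + KeyEvent.getKeyText(e.getKeyCode()));
            }
        });
        frame.add(canvas);
        frame.pack();
        frame.setVisible(true);

        // Request focus only after the window is showing; earlier requests
        // can fail silently on some platforms.
        EventQueue.invokeLater(() -> canvas.requestFocusInWindow());

        // Log every key event delivered to this JVM plus the current focus
        // owner, to see whether events arrive but go to another component.
        KeyboardFocusManager.getCurrentKeyboardFocusManager().addKeyEventDispatcher(e -> {
            System.out.println(e + " -> focus owner: "
                    + KeyboardFocusManager.getCurrentKeyboardFocusManager().getFocusOwner());
            return false; // never consume; normal dispatch continues
        });

        frame.addWindowListener(new WindowAdapter() {
            @Override public void windowClosing(WindowEvent e) { System.exit(0); }
        });
    }
}

If the dispatcher logs key events but the focus owner is never the canvas, the problem is focus transfer inside the window rather than event delivery to the process.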

Related

Weird Android Emulator and Mac tap-to-click sensitivity issue

I'm experiencing a really weird and frustrating issue with the Android Emulator on macOS Monterey.
I have "tap to click" enabled on my Macbook Pro (Mid 2015 15"), and it works fine in all other apps. But somehow, when the emulator window is active it seems to miss almost every other tap. If I click hard instead of tapping, it catches every click. The tap sensitivity in the Trackpad settings is set to "light".
So, it seems that the emulator window is somehow less sensitive to tapping than all other apps. I don't even know how this is possible, is there even such a thing as app-specific tap-sensitivity??
What's more, it's not only the emulator window itself that has this issue, but the emulator settings window as well. If I tap the "Enable clipboard sharing" toggle, it misses about 50% of the taps. If I click hard, it catches them 100%. If I try the same in some other app (tested with the "System Preferences" window), it catches 100% of the taps.
I have tested this again and again to make sure I'm not biasing the results, but there really is a difference, and it's driving me nuts. I think it appeared after updating to Monterey, but I'm not 100% sure of the exact timing.
Any ideas??
My problem was really similar. I'm using a Mac with an Apple mouse, and I was able to fix it by disabling the mouse wheel in the Android Emulator's Extended Controls.
Hope that helps.
I've noticed the same issue some time ago. Unfortunately, I didn't find any solutions.
However, there are a couple of good enough workarounds:
Launch the emulator in a tool window. This is the default approach in modern versions of Android Studio anyway; to enable or disable it, check Preferences -> Tools -> Emulator -> Launch in a tool window.
Use alternative emulators. For instance, Genymotion doesn't have this issue.
I ran into a similar issue, and the solution I found was enabling "Tap to click" in System Preferences -> Trackpad.
I am new to the Android Emulator, but am experiencing the same issue in Ubuntu, even though I have tap-to-click disabled in the OS. I hate tap-to-click, so having an ultra-sensitive-to-touch Android screen emulated on my laptop is beyond frustrating.
Looking at the documentation, I came across the SOURCE_CLASS_POINTER input source class, whose description states:
The input source is a pointing device associated with a display. Examples: SOURCE_TOUCHSCREEN, SOURCE_MOUSE. A MotionEvent should be interpreted as absolute coordinates in display units according to the View hierarchy. Pointer down/up indicated when the finger touches the display or when the selection button is pressed/released. Use getMotionRange(int) to query the range of the pointing device. Some devices permit touches outside the display area so the effective range may be somewhat smaller or larger than the actual display size.
Reading that, I've come to believe this may actually be the default behavior, with touchpad events being interpreted as SOURCE_TOUCHSCREEN input rather than SOURCE_TOUCHPAD or SOURCE_MOUSE.
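(To check that interpretation empirically, here is a minimal probe view, a sketch of my own with a made-up class name and log tag, that logs the source class each motion event reports inside the AVD. If taps from the host trackpad come through as SOURCE_TOUCHSCREEN, that supports the reading above.)

import android.content.Context;
import android.util.Log;
import android.view.InputDevice;
import android.view.MotionEvent;
import android.view.View;

// Logs which input-source class each motion event claims to come from.
public class InputProbeView extends View {
    private static final String TAG = "InputProbe"; // arbitrary tag for logcat

    public InputProbeView(Context context) {
        super(context);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        int source = event.getSource();
        String action = MotionEvent.actionToString(event.getAction());
        if ((source & InputDevice.SOURCE_TOUCHSCREEN) == InputDevice.SOURCE_TOUCHSCREEN) {
            Log.d(TAG, "touchscreen " + action);
        } else if ((source & InputDevice.SOURCE_MOUSE) == InputDevice.SOURCE_MOUSE) {
            Log.d(TAG, "mouse " + action);
        } else if ((source & InputDevice.SOURCE_TOUCHPAD) == InputDevice.SOURCE_TOUCHPAD) {
            Log.d(TAG, "touchpad " + action);
        } else {
            Log.d(TAG, "other source 0x" + Integer.toHexString(source) + " " + action);
        }
        return true; // consume so the full gesture keeps arriving
    }
}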
Unfortunately, I don't have a solution as much as a workaround:
I plugged in a mouse and tested pointer up/down movements over the screen, which that part of the documentation suggests should register as a press. However, with the mouse it only responds to clicks. That suggests the mouse pointer is indeed properly interpreted as SOURCE_MOUSE and not SOURCE_TOUCHSCREEN.
So unless we can find out how to make the AVD properly interpret a touchpad as a touchpad, and not a touchscreen, using a mouse seems like the best solution.
For reference, I'm including this link to the AVD manual: https://developer.android.com/studio/run/emulator
UPDATE: Somehow over the period of about 18 hours and several restarts, my AVD no longer does tap-to-click on its virtual screen. It would be very hard to pinpoint exactly what changed because I've been updating packages frequently since I'm running a pre-alpha release of Ubuntu, but I think it's from using X11 instead of Wayland.
Which got me thinking: you could try changing your display server from Cocoa to X11. Thankfully, MacPorts, the macOS take on the FreeBSD Ports Tree, makes it fairly easy to build such software. It contains build recipes for multi-platform Unix-like software, much like Homebrew but often allowing for more customization.
That tap issue was annoying enough that it's probably worth giving this a shot.
(from macports website) The X11 windowing environment, for ports that depend on the functionality it provides to run. You have multiple choices for an X11 server: https://www.macports.org/install.php
I would build them in this order:
MacPorts: X11 - If you build it, you'll have a bunch of libraries already
MacPorts: QEMU - use the configure menu to select GTK3+; if there's no option for X11, try passing the X11 linker flags through make after you install X11 (pointing them at your X11 lib directory):
make LDFLAGS="-L/opt/X11/lib -lX11"
Lastly, MacPorts: Android Platform tools
Related StackOverflow Q/As:
Compiling a C program that uses OpenGl in Mac OS X
Running x11 on Mac OS

Recognize the Host App in an iOS8 Custom Keyboard

How do I recognize in which host app my keyboard is running?
Basically, I want to change some things in my custom keyboard in specific apps
Maybe I can customize the keyboard traits for my use case?
Thanks
This is not possible unless the host is one of your own apps, or you have some kind of collaboration with it.
A custom keyboard and a host app can communicate via Darwin notifications: the host app needs to broadcast the notification, and the keyboard needs to observe it. So if the developer isn't already broadcasting, you can't identify the host app.

Is it possible for an Adobe AIR app to receive touch events in OS X?

I'm working on software in OS X that receives touches from a multi-touch device and posts them to the system for applications to process. I was hoping it would be possible to support Adobe AIR applications, but from the research I've done, it looks like they don't support multi-touch events on OS X. This page in particular seems to indicate that, and consistent with what it says, the AIR applications I've tested with respond only to gesture events, not touch events. However, I'm (perhaps naively) hoping there's a way to do it regardless.
So is it possible for an AIR application running in Mac OS X to receive touch events at all? Perhaps using a different method than posting system-wide touch events? For example, is there any way my software could somehow send touch events directly to an AIR app?
Thanks in advance!

What devices are available to test WM_GESTURE and WM_TOUCH code on a desktop machine?

I'm writing some code to handle WM_GESTURE and WM_TOUCH events in Windows 7, but I can't figure out how to test it. I do my development in Boot Camp on a 17" MacBook Pro.
So far, I have determined that the Boot Camp trackpad driver in Windows 7 does not generate those events, and this generic trackpad I found on Amazon.com that claims to be 'multi-touch' works as advertised, but not by creating WM_GESTURE or WM_TOUCH events. I verified this by using Spy++ to report the events; nothing with the WM_GESTURE or WM_TOUCH value was reported.
What kind of hardware is supposed to generate these kinds of events? At this point, I'm assuming it's only for tablet or mobile (Windows CE) hardware, but I'd appreciate any other suggestions.
I suppose there's another way to approach this: I want functionality similar to Cocoa's -[NSResponder swipeWithEvent:] and related methods, which report swipes, rotation, and other gestures on the trackpad. WM_GESTURE appears to be the equivalent on Windows 7.
Another option, which requires only an extra physical mouse and should get you at least 95% of the way there, is the Multi-Touch Vista project, which can emulate up to 256 touch points using physical devices - hence the extra mouse, or two, since it can be awkward to work a mouse with one hand and the trackpad with the other.
There are several monitors out there supporting touch with Windows 7. For example: Acer T230H.
HTH
Wacom makes several touchpads that support multitouch; a particularly inexpensive version is the Bamboo Touch. This gives you touch without having to buy another monitor - although it doesn't give that direct interaction feeling.

OS X GUI API clarification

If I wanted to write my own window manager for OS X (please don't respond with "what's the point"), what APIs should I be looking at?
There is no such thing as a "window manager" in OS X, and no public interface to implement one. The functions that an X11 window manager would perform are split between the GUI toolkit (Carbon/Cocoa), the Dock application and the window server.
Your only real choice if you want to change OS X's windowing behavior is to patch individual applications, the Dock (which has a privileged connection to the window server) and/or the window server. It'd involve a great deal of reverse engineering and almost certainly break in 10.6, but it's certainly possible.
At the hardware level, write your own APIs.
Otherwise, there are various graphics architectures in which to plug in your window manager:
OpenGL
Quartz
QuickTime
X11
