No keyboard input from GLUT while mouse is moving (on OS X)

So I just built an OpenGL application on a Mac for the first time. I'm using GLUT to get keyboard input. The trouble is, I've discovered that if I'm moving the mouse at the same time I press a key, my keyboard function doesn't get called! If I press a key when the mouse isn't moving, it gets called just fine. The same goes for my keyUp function. Why could this be?
I'm also having trouble with the mouse motionFunc: it doesn't seem to be called every frame, which leads to choppy mouse input...

Can you provide a code sample? It sounds like a bug in your event handling code.
That said, GLUT is no longer developed and you should not be using it. There are numerous better alternatives, the most popular being SDL. Others include GLFW and SFML, and you can even use Qt.
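For illustration, here is a minimal sketch of the SDL2 route (the window title and sizes are placeholders): a single event loop receives keyboard and mouse-motion events as separate queue entries, so key presses aren't dropped while the mouse is moving.

#include <SDL.h>

int main(int, char**)
{
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window* win = SDL_CreateWindow("Demo",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        640, 480, SDL_WINDOW_OPENGL);

    bool running = true;
    while (running) {
        SDL_Event e;
        // Drain all pending events once per frame; keyboard and
        // mouse-motion events arrive independently of each other.
        while (SDL_PollEvent(&e)) {
            switch (e.type) {
            case SDL_KEYDOWN:
            case SDL_KEYUP:
                // Delivered even while the mouse is in motion.
                break;
            case SDL_MOUSEMOTION:
                // e.motion.xrel / e.motion.yrel give per-event deltas.
                break;
            case SDL_QUIT:
                running = false;
                break;
            }
        }
        // render the frame here
    }

    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}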

Related

No mousewheel events when using touchpad

I'm having trouble retrieving mouse wheel events when my program is running on my laptop and I'm scrolling with the touchpad.
I was initially using DirectInput to catch input events, but I've read here and there that DirectInput wasn't able to handle scroll events sent by touchpads.
I did some extra research and came across this old topic: C++ DirectInput Mouse Scroll Wheel with a Laptop Touchpad
So I tried to use a PeekMessage loop to catch my mouse input. Everything went fine when using a real mouse, but when I switched to my laptop, ta-da: no WM_MOUSEWHEEL events received. (And this guy predicted it.)
I don't receive any WM_VSCROLL or WM_GESTURE event either.
Additionally, I've made another program based on wxWidgets, and in that case the mouse wheel events are properly caught by the application. I've gone through the source code to see how wxWidgets retrieves Windows events and, unless I'm missing something, it seems to be the exact same code as mine.
Is there some kind of voodoo magic trick to catch mouse wheel events generated by a touchpad?
I can provide more information about my code if needed.
Thanks
EDIT:
I did some extra debug to find what's going on:
First, I was wrong to say that I don't catch the WM_MOUSEWHEEL event at all. In fact, I do receive wheel events in the WindowProc callback.
However, the PeekMessage call doesn't return any of them.
I could change the way I collect mouse events and handle them directly in the WindowProc callback, but then I'd need some awkward plumbing just to handle something that should work the same way with both a real mouse and a touchpad.
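For reference, a minimal sketch of that WindowProc route (OnMouseWheel is a hypothetical hook into my input system):

#include <windows.h>

// Hypothetical hook feeding wheel deltas into the input system.
static void OnMouseWheel(int delta) { /* ... */ }

static LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_MOUSEWHEEL:
    {
        // A physical wheel sends multiples of WHEEL_DELTA (120);
        // touchpads may send smaller values for smooth scrolling.
        int delta = GET_WHEEL_DELTA_WPARAM(wParam);
        OnMouseWheel(delta);
        return 0;
    }
    default:
        return DefWindowProc(hwnd, msg, wParam, lParam);
    }
}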

GLFW application not getting focus under MacOSX

I have an OpenGL application that uses GLFW under OSX 10.12.3.
About 50% of the time when I run it, it works fine. When glfwPollEvents() is called from my main loop, the mouse callback is called correctly for mouse events.
However, the other 50% of the time it doesn't receive ordinary mouse events even though a window is correctly created in the foreground. The main loop is running but the mouse event callback isn't called. If I double click (two clicks within around 0.1s) then the event callback receives 4 events as expected. But it receives nothing for single mouse clicks.
Within the main loop glfwGetWindowAttrib(window, GLFW_FOCUSED) returns 1 when the code is working and 0 when it isn't. Adding glfwFocusWindow(window) after window creation doesn't seem to change anything.
If I switch focus to a different application and switch back then my application starts receiving events correctly.
I have used GLFW 3.1.2 built using homebrew, and GLFW 3.2.1 built directly from a github clone. I get the same behaviour either way.
For a while I thought glfwWindowHint(GLFW_FOCUSED, 1) fixed my problems but after the 10th launch or so the problem returned.
I'd like to avoid posting my code (whose glfw calls come mostly from tutorials anyway) and I'm hoping someone will recognise the symptoms I describe as some obvious mistake I've made. The double click behaviour seems like a big clue.
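For concreteness, here is a minimal sketch of one possible workaround (an assumption, not a confirmed fix; glfwFocusWindow requires GLFW 3.2 or later): keep re-requesting focus for a short time after startup if the window never received it.

#include <GLFW/glfw3.h>

int main()
{
    if (!glfwInit())
        return -1;

    glfwWindowHint(GLFW_FOCUSED, GLFW_TRUE);
    GLFWwindow* window = glfwCreateWindow(640, 480, "Demo", NULL, NULL);
    if (!window)
        return -1;

    glfwMakeContextCurrent(window);

    double start = glfwGetTime();
    while (!glfwWindowShouldClose(window))
    {
        glfwPollEvents();

        // If macOS never delivered the initial focus event, ask for
        // focus again. Limiting the nudge to the first second avoids
        // stealing focus later when the user deliberately switches
        // to another application.
        if (glfwGetTime() - start < 1.0 &&
            !glfwGetWindowAttrib(window, GLFW_FOCUSED))
            glfwFocusWindow(window);

        glfwSwapBuffers(window);
    }

    glfwTerminate();
    return 0;
}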

Secondary touch monitor not moving with the cursor but still pressing buttons (or emitting events for Qt app)

I intend to make a Windows app in Qt with multi-monitor support, with the secondary monitor being a touch screen. I want one person to be able to work with the touch screen independently of a second person, who would be working on the main screen with mouse and keyboard.
I don't have the touch monitor yet, so I don't know how it really works, but I am afraid that touching the monitor would move the cursor (mouse pointer), making work with the mouse very hard.
So my question:
Is it somehow possible to make the touch screen not affect the cursor in any way (not interrupting drag & drop counts, too), while still being able to push buttons and such, at either the Windows or the Qt level?
No button pushing, but generating QTouchEvents (or similar), would be sufficient, too.
Thanks for your responses.
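On the Qt side, a minimal sketch of receiving raw QTouchEvents (Qt 5 API; note that accepting touch events only stops Qt's own mouse-event synthesis, while cursor movement from the touch screen is decided at the Windows driver level, not in Qt):

#include <QWidget>
#include <QTouchEvent>

class TouchWidget : public QWidget
{
public:
    TouchWidget()
    {
        // Opt in to touch events for this widget.
        setAttribute(Qt::WA_AcceptTouchEvents);
    }

protected:
    bool event(QEvent *e) override
    {
        switch (e->type())
        {
        case QEvent::TouchBegin:
        case QEvent::TouchUpdate:
        case QEvent::TouchEnd:
        {
            auto *touch = static_cast<QTouchEvent *>(e);
            for (const QTouchEvent::TouchPoint &point : touch->touchPoints())
                handleTouch(point.pos()); // hypothetical app logic
            // Accepted touch events are not converted into mouse
            // events by Qt.
            e->accept();
            return true;
        }
        default:
            return QWidget::event(e);
        }
    }

private:
    void handleTouch(const QPointF &) { /* ... */ }
};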

SDL2 input focus

I'm currently trying to build and run an Unreal Engine 4 demo app, which uses SDL2, on a Linux armv7 embedded system with an X server running but no window manager.
What I'm seeing is that the app is not responsive to keyboard events, although the mouse works fine.
Digging further it turns out the problem is that UE4 ignores the keyboard event if the SDL_Event.key.windowID does not match the ID of the app's SDL_Window. I verified this by calling SDL_GetKeyboardFocus() and it turns out the window ID for the keyboard focus is 0 whereas the app's window's ID is 5.
Digging a bit further inside SDL2 it looks like the keyboard focus is set to a particular SDL window after a focusIn event is received from the X server. (see X11_DispatchEvent() in SDL_x11events.c).
However, it looks like if you're running X11 without a window manager, this FocusIn event is never generated by the X server, as already answered here:
FocusIn/FocusOut not generated
After hacking X11_RaiseWindow() in SDL_x11window.c by adding:
X11_XSetInputFocus(display, data->xwindow, RevertToParent, CurrentTime);
SDL_SetKeyboardFocus(data->window);
I was finally able to get the keyboard events processed.
Does the above assessment sound correct? If so, is there a clean solution to this? I'm still digging around, but any suggestion would be greatly appreciated.
Thanks
This turned out to be a bug in SDL, which we just pushed a fix for:
https://hg.libsdl.org/SDL/rev/aa4e4768c6c1
This fix will be in SDL 2.0.4, and should bubble over to Unreal shortly thereafter.
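For reference, a paraphrased sketch of the windowID check described in the question (not UE4's actual code): keyboard events are dropped when their windowID doesn't match the application's window, which is exactly what happens when keyboard focus was never assigned.

#include <SDL.h>

// 'window' is assumed to be the application's SDL_Window.
static void pumpEvents(SDL_Window* window)
{
    Uint32 myId = SDL_GetWindowID(window);
    SDL_Event e;
    while (SDL_PollEvent(&e)) {
        if (e.type == SDL_KEYDOWN || e.type == SDL_KEYUP) {
            // Without a window manager, no FocusIn arrives, SDL never
            // assigns keyboard focus, and e.key.windowID stays 0.
            if (e.key.windowID != myId)
                continue; // the key event is dropped here
            // handle the key event...
        }
    }
}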

Cocoa detect when hand is over right side of magic mouse

How can I detect if a user's hand is on the right side of a magic mouse? Not right clicking, just checking what side of the mouse the finger is on.
Unless you write something with IOKit to handle this, it isn't that easy. What the app gets is what the driver (kext) sends it.
You could get something like BetterTouchTool or MagicPrefs, which open up a variety of options, e.g. positions of fingers on the mouse, where fingers are and are not registered, etc.
Writing IOKit kexts isn't a simple process, but you could begin here:
https://developer.apple.com/library/mac/#documentation/devicedrivers/conceptual/IOKitFundamentals/Introduction/Introduction.html
Other than that, you're stuck with what the kext sends to your App as a notification.
Apple's official driver is really limited. This includes the lack of support for advanced gestures like pinch and rotate. The following proof of concept grabs (very crudely) the pinch event, using the Euclidean distance between the two fingers, and sends a combined keystroke in response to the frontmost application (kCGHIDEventTap). Launch the binary and bring a Preview.app window to the front, and you'll be able to pinch in/out using your Magic Mouse... amazing! :-)
Take a look at http://www.iphonesmartapps.org/aladino/?a=multitouch
Extending Functionality of Magic Mouse: Do I Need a kext?
Apple Magic Mouse Api
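For a rough idea of how such a proof of concept can work, here is a minimal sketch (the key code and distance helper are assumptions about the approach, not the linked code): track the Euclidean distance between the two fingers and, when it changes past a threshold, post a combined keystroke to the HID event tap. Key code 0x18 is kVK_ANSI_Equal on a US layout, so Cmd+'=' zooms in Preview.app.

#include <ApplicationServices/ApplicationServices.h>
#include <math.h>

// The pinch detection compares this distance between two touches
// from one callback to the next.
static double fingerDistance(double x1, double y1, double x2, double y2)
{
    return hypot(x2 - x1, y2 - y1); // Euclidean distance
}

// Post Cmd+'=' (zoom in for Preview.app) to the HID event tap.
static void postZoomInKeystroke(void)
{
    CGKeyCode key = 0x18; // kVK_ANSI_Equal on a US layout (assumption)
    CGEventRef keyDown = CGEventCreateKeyboardEvent(NULL, key, true);
    CGEventRef keyUp   = CGEventCreateKeyboardEvent(NULL, key, false);

    CGEventSetFlags(keyDown, kCGEventFlagMaskCommand); // hold Command
    CGEventSetFlags(keyUp,   kCGEventFlagMaskCommand);

    CGEventPost(kCGHIDEventTap, keyDown);
    CGEventPost(kCGHIDEventTap, keyUp);

    CFRelease(keyDown);
    CFRelease(keyUp);
}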
