No mousewheel events when using touchpad - Windows

I'm having trouble retrieving mouse wheel events when my program runs on my laptop and I scroll using the touchpad.
I was initially using DirectInput to catch input events, but I've read here and there that DirectInput can't handle scroll events sent by touchpads.
I did some extra research and came across this old topic: C++ DirectInput Mouse Scroll Wheel with a Laptop Touchpad
So I tried using a PeekMessage loop to catch my mouse input. Everything went fine with a real mouse, but when I switched to my laptop, ta-da: no WM_MOUSEWHEEL events received. (And this guy predicted it.)
I don't receive any WM_VSCROLL or WM_GESTURE event either.
Additionally, I made another program based on wxWidgets, and in that case the mouse wheel events are properly caught by the application. I went through the source code to see how wxWidgets retrieves Windows events and, unless I'm missing something, it seems to be the exact same code as mine.
Is there some kind of voodoo magic trick to catch mouse wheel events generated by a touchpad?
I can provide more information about my code if needed.
Thanks
EDIT:
I did some extra debugging to find out what's going on:
First, I was wrong in saying I don't catch the WM_MOUSEWHEEL event at all. In the WindowProc callback I actually do receive wheel events.
However, the PeekMessage call never returns them.
I could change the way I collect mouse events and handle them directly in the WindowProc callback, but that means adding workarounds for something that should work the same way with both a real mouse and a touchpad.
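If the touchpad driver sends WM_MOUSEWHEEL with SendMessage instead of posting it to the queue, the message is delivered straight to the window procedure during message retrieval and is never returned by PeekMessage, which would match what you're seeing. A minimal sketch of handling it in the WindowProc (the forwarding into your own input system is left as a stub):

// Minimal sketch: catch WM_MOUSEWHEEL in the window procedure as a
// fallback, since a sent (rather than posted) message never shows up
// in the queue that PeekMessage drains.
#include <windows.h>

LRESULT CALLBACK WindowProc(HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_MOUSEWHEEL:
    {
        // Positive delta = scroll up, negative = scroll down,
        // in multiples of WHEEL_DELTA (120).
        int delta = GET_WHEEL_DELTA_WPARAM(wParam);
        (void)delta; // forward the delta to your own input system here
        return 0;
    }
    }
    return DefWindowProc(hWnd, msg, wParam, lParam);
}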

Related

Event Handling of a background program

Is there a way to let my program handle events like mouse movements and keystrokes even though the program is running in the background?
Example: the user is working in Notepad and I want my program to handle the events happening over there.
You can install mouse and keyboard hooks.
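A minimal sketch of that approach, using a low-level mouse hook (WH_MOUSE_LL) so no injected DLL is required; a WH_KEYBOARD_LL hook works the same way for keystrokes:

#include <windows.h>
#include <stdio.h>

// Called for every mouse event system-wide, regardless of which
// application has focus.
LRESULT CALLBACK MouseHookProc(int nCode, WPARAM wParam, LPARAM lParam)
{
    if (nCode == HC_ACTION && wParam == WM_MOUSEMOVE)
    {
        MSLLHOOKSTRUCT* info = (MSLLHOOKSTRUCT*)lParam;
        printf("mouse at %ld,%ld\n", info->pt.x, info->pt.y);
    }
    // Always pass the event on so other applications still receive it.
    return CallNextHookEx(NULL, nCode, wParam, lParam);
}

int main()
{
    HHOOK hook = SetWindowsHookEx(WH_MOUSE_LL, MouseHookProc,
                                  GetModuleHandle(NULL), 0);
    // Low-level hooks require a message loop on the installing thread.
    MSG msg;
    while (GetMessage(&msg, NULL, 0, 0) > 0)
    {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
    UnhookWindowsHookEx(hook);
    return 0;
}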

How to make COM DoDragAndDrop API be touch sensitive on Windows 7/8?

On Windows, a drag and drop action can be performed via the COM DoDragAndDrop API, see http://msdn.microsoft.com/zh-cn/library/windows/desktop/ms678486%28v=vs.85%29.aspx. It performs D&D operations perfectly and has the best system integration.
Recently, I found it is not a touch-friendly API: it cannot handle touch events very well. On Windows 7/8, a Win32 window created by the CreateWindow API can handle touch events the same way it handles mouse events. Actually, it seems touch events are converted into similar mouse events, e.g. a mouse-down event is triggered when a finger taps down, and a mouse-move event is triggered when a finger moves.
However, the DoDragAndDrop COM API doesn't convert touch events into mouse events; the COM service doesn't seem to know about touch events at all. Yet when I tried dragging a file from one folder to another on Win8, it worked. If that D&D operation is also implemented on top of the COM API, I'm getting a contradictory result.
Did I miss something when I use DoDragAndDrop for touch event support? Thanks.
DoDragDrop() does support touch on Win7/8 (and yes, D&D of files is implemented by Windows Explorer using DoDragDrop()), so your problem is related to something else. Did you check whether DoDragDrop() is returning an error code that your code may be ignoring?
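For reference, a sketch of what checking that result might look like (pDataObject and pDropSource stand in for whatever COM objects you already pass):

#include <windows.h>
#include <ole2.h>

void StartDrag(IDataObject* pDataObject, IDropSource* pDropSource)
{
    DWORD effect = DROPEFFECT_NONE;
    HRESULT hr = DoDragDrop(pDataObject, pDropSource,
                            DROPEFFECT_COPY | DROPEFFECT_MOVE, &effect);
    if (hr == DRAGDROP_S_DROP)
    {
        // Drop completed; 'effect' tells you whether it was a copy or move.
    }
    else if (hr == DRAGDROP_S_CANCEL)
    {
        // User cancelled the drag.
    }
    else
    {
        // A genuine failure - this is the code path the answer
        // suggests you may be silently ignoring.
    }
}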

WM_TOUCH is not immediately sent with touch down event

I am working with a touch screen and using the Windows 7 built-in drivers (it never prompted me to install any). It works fine, except for one small issue. When I touch the screen, it will not send WM_LBUTTONDOWN until I lift my finger off the screen. It appears to do this to determine whether I intend to hold down to emulate WM_RBUTTONDOWN. (I also tried to disable the press-and-hold gesture, but in practice it never disables.)
So I thought I would just receive the WM_TOUCH messages instead. But I found that WM_TOUCH (0x240) is also not sent to my window until I lift my finger off the screen, which sort of defeats the purpose of WM_TOUCH altogether.
Both before and after registering to receive WM_TOUCH messages, I received three messages immediately upon touching the screen:
1. Send: 0x02CC (undocumented tablet messages)
2. Post: 0x011B (undocumented)
3. Send: 0x011A (WM_GESTURENOTIFY)
0x011A is WM_GESTURENOTIFY, which my code responds to (perhaps I am not responding correctly?). I reply with the standard response (using sample code from MS) to request full notifications.
Another thing: I do begin getting WM_TOUCH once I register for touch messages, but I continue to get the WM_GESTURENOTIFY message as well. According to the MS documentation, once I register for WM_TOUCH I should no longer get gesture messages.
If anyone can tell me how to get WM_TOUCH messages immediately (i.e. at the moment I'm getting the WM_GESTURENOTIFY messages), and not after I lift my finger off the touch screen, I would appreciate it very much.
Check out this tutorial on touch events:
http://msdn.microsoft.com/en-us/gg464991
What you want to use is the RegisterTouchWindow function, as such:
RegisterTouchWindow(handle, 0);
Windows will now send WM_TOUCH messages instead of WM_GESTURE messages to your window. Keep in mind that you will have to compile against Windows SDK version 7.0 or newer for this to work.
I had almost the same issue and solved it by using:
RegisterTouchWindow( hWnd, TWF_WANTPALM );
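Once the window is registered, a minimal WM_TOUCH handler in the window procedure might look like this (the 16-slot buffer is an arbitrary choice for the sketch; requires _WIN32_WINNT >= 0x0601):

#include <windows.h>

LRESULT CALLBACK WndProc(HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_TOUCH)
    {
        UINT count = LOWORD(wParam); // number of touch points
        TOUCHINPUT inputs[16];
        if (count > 16) count = 16;
        if (GetTouchInputInfo((HTOUCHINPUT)lParam, count, inputs,
                              sizeof(TOUCHINPUT)))
        {
            for (UINT i = 0; i < count; ++i)
            {
                if (inputs[i].dwFlags & TOUCHEVENTF_DOWN)
                {
                    // Touch coordinates are in hundredths of a pixel.
                    LONG x = inputs[i].x / 100;
                    LONG y = inputs[i].y / 100;
                    (void)x; (void)y; // react to the touch-down here
                }
            }
            // The handle must be closed once the input is consumed.
            CloseTouchInputHandle((HTOUCHINPUT)lParam);
        }
        return 0;
    }
    return DefWindowProc(hWnd, msg, wParam, lParam);
}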

VC++ mouse events

I want to write a console program that handles mouse events (only mouse scrolling). How do I do that in VC++? The application will listen only for scroll events.
Description: if the user scrolls down, the desktop window fades out, and it fades back in when the user scrolls up.
Here I just need to know how to listen to mouse events in a console app.
Note: I am developing using win32 API, and for development environment I am using VS2010.
I've never actually done this myself. It seems that a console application responding to mouse events almost belies its nature and intended purpose. Generally, you would only need to respond to keyboard input from a console app and leave the mouse stuff to a GUI app.
That being said, this tutorial indicates that it is in fact possible to capture these mouse events from a Win32 console application. Generally, the suggestion is to use the ReadConsoleInput function and extract the information of interest from the INPUT_RECORD structure that it fills. The only tricky part is that ReadConsoleInput is a blocking call: it will not return until an input event fires, so you'll need to structure your application's code accordingly. Mouse events are covered in detail about 3/4 of the way down the page.
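A minimal sketch of that approach, listening only for wheel events (the desktop fade effect is out of scope here):

#include <windows.h>
#include <stdio.h>

int main()
{
    HANDLE hIn = GetStdHandle(STD_INPUT_HANDLE);
    // ENABLE_MOUSE_INPUT is required, and quick-edit mode must be off
    // (ENABLE_EXTENDED_FLAGS without ENABLE_QUICK_EDIT_MODE disables it),
    // or the console will swallow mouse events itself.
    SetConsoleMode(hIn, ENABLE_MOUSE_INPUT | ENABLE_EXTENDED_FLAGS);

    INPUT_RECORD record;
    DWORD read;
    for (;;)
    {
        // Blocks until an input event arrives.
        if (!ReadConsoleInput(hIn, &record, 1, &read))
            break;

        if (record.EventType == MOUSE_EVENT &&
            record.Event.MouseEvent.dwEventFlags == MOUSE_WHEELED)
        {
            // The high word of dwButtonState holds the signed wheel delta.
            short delta = (short)HIWORD(record.Event.MouseEvent.dwButtonState);
            printf(delta > 0 ? "scroll up\n" : "scroll down\n");
        }
    }
    return 0;
}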

No keyboard input from GLUT while mouse is moving (In OS X)

So I just built an OpenGL application on a Mac for the first time. I'm using GLUT to get keyboard input. The trouble is, I've discovered that if I'm moving the mouse at the same time I push a button on the keyboard, my keyboard function doesn't get called! If I push a button when the mouse isn't moving, it gets called just fine. The same goes for my keyUp function. Why could this be?
I'm also having trouble with the mouse motionFunc - it doesn't seem to get called every frame, which leads to choppy mouse input...
Can you provide a code sample? It sounds like a bug in your event handling code.
That said, GLUT is no longer developed and you should not be using it. There are numerous better alternatives, the most popular being SDL. Others include GLFW and SFML, and you can even use Qt.
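If you do switch, a minimal SDL2 event loop looks like this (a sketch only, with window and GL context setup trimmed to the essentials); keyboard and mouse-motion events arrive through the same queue, so one doesn't starve the other:

#include <SDL.h>

int main(int, char**)
{
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window* window = SDL_CreateWindow("demo",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        640, 480, SDL_WINDOW_OPENGL);

    bool running = true;
    while (running)
    {
        SDL_Event event;
        while (SDL_PollEvent(&event))
        {
            switch (event.type)
            {
            case SDL_KEYDOWN:
                // Key events are delivered even while the mouse moves.
                if (event.key.keysym.sym == SDLK_ESCAPE)
                    running = false;
                break;
            case SDL_MOUSEMOTION:
                // event.motion.x / event.motion.y update every frame.
                break;
            case SDL_QUIT:
                running = false;
                break;
            }
        }
        // Render your OpenGL frame here.
    }

    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}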
