Is it possible to suppress single-click events during a double-click?

Our application supports single-click and double-click events on a window, which do different things. However, we always get a single-click event during the double-click, which causes undesired effects.
Our application is written in Qt, but this is really a question about the underlying Windows/Mac APIs: is it a fundamental limitation that the OS reports a single click as soon as you release the button, since it can't yet know whether you are going to click a second time, or can it be prevented?
If it can't be prevented, is there an accepted best practice for handling it?

Start a timer when you get WM_LBUTTONDOWN (or the Qt equivalent). If you get WM_LBUTTONDBLCLK (or the Qt equivalent) before the timer expires, cancel the timer and execute your double-click action. Otherwise, when the timer expires, execute your single-click action.
On Windows, you can get the double-click time using GetDoubleClickTime().
That's about the best you can do - you can't prevent the single click message being generated in the first place on either platform.
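A minimal Qt sketch of that approach (singleClickAction and doubleClickAction are placeholder names; QApplication::doubleClickInterval() wraps the system double-click time, i.e. GetDoubleClickTime() on Windows):

    #include <QApplication>
    #include <QMouseEvent>
    #include <QTimer>
    #include <QWidget>

    class ClickWidget : public QWidget
    {
    public:
        ClickWidget()
        {
            m_timer.setSingleShot(true);
            // Qt exposes the system double-click time; on Windows this
            // maps to GetDoubleClickTime().
            m_timer.setInterval(QApplication::doubleClickInterval());
            connect(&m_timer, &QTimer::timeout,
                    this, [this] { singleClickAction(); });
        }

    protected:
        void mousePressEvent(QMouseEvent *event) override
        {
            if (event->button() == Qt::LeftButton)
                m_timer.start();   // defer the single-click action
        }

        void mouseDoubleClickEvent(QMouseEvent *event) override
        {
            if (event->button() == Qt::LeftButton) {
                m_timer.stop();    // cancel the pending single-click action
                doubleClickAction();
            }
        }

    private:
        void singleClickAction() { /* handle single click */ }
        void doubleClickAction() { /* handle double click */ }

        QTimer m_timer;
    };

The trade-off is an inherent delay: the single-click action always lags by one double-click interval, which is the price of disambiguating the two gestures.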

How to prevent mouse click event on Windows

I record mouse events on Windows using the robotgo package. The package can capture a bitmap of the clicked area, but the latency of capturing that bitmap is critical here.
For example:
If I click a checkbox that is currently unchecked on the screen, the captured bitmap should show the unchecked state, but it gives me the checked state instead, so I can't reproduce the click with robotgo or trigger it from the bitmap.
My idea is to block the Windows mouse click event until the package has captured the bitmap (or to add some delay before the click event), and then trigger the click on Windows myself.
I did some research online but couldn't find a proper solution. How can I prevent a click event on Windows in Go? Is it possible, or is there another way to make this work?
A low-level mouse hook can eat mouse events. SendInput can generate mouse input events.
You would have to set a flag somewhere so you don't eat your own fake input events.
Keep in mind that SendInput is not perfect (can be detected by other hooks) and playing with the input system like this is usually not the best solution. Adding 500ms (or some other delay) to every mouse click is going to be very annoying for your users.
It is better to use UI Automation to get information about UI element states in other applications...
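For illustration, here is a minimal Win32 sketch of that combination (in C++ rather than Go, since the mechanism is the Windows API itself): a WH_MOUSE_LL hook that swallows left-clicks, plus a SendInput wrapper that tags its own events through dwExtraInfo so the hook can let them through. MAGIC_TAG is an arbitrary marker invented for this example.

    #include <windows.h>

    static const ULONG_PTR MAGIC_TAG = 0xFEEDBEEF;

    LRESULT CALLBACK MouseProc(int nCode, WPARAM wParam, LPARAM lParam)
    {
        if (nCode == HC_ACTION &&
            (wParam == WM_LBUTTONDOWN || wParam == WM_LBUTTONUP)) {
            const MSLLHOOKSTRUCT *info = (const MSLLHOOKSTRUCT *)lParam;
            // Let our own injected clicks through; eat everything else.
            // (As a demo this blocks every real left-click system-wide.)
            if (info->dwExtraInfo != MAGIC_TAG)
                return 1;   // non-zero return discards the event
        }
        return CallNextHookEx(NULL, nCode, wParam, lParam);
    }

    void SendTaggedClick()
    {
        INPUT in[2] = {};
        in[0].type = INPUT_MOUSE;
        in[0].mi.dwFlags = MOUSEEVENTF_LEFTDOWN;
        in[0].mi.dwExtraInfo = MAGIC_TAG;   // mark as our own input
        in[1] = in[0];
        in[1].mi.dwFlags = MOUSEEVENTF_LEFTUP;
        SendInput(2, in, sizeof(INPUT));
    }

    int main()
    {
        HHOOK hook = SetWindowsHookExW(WH_MOUSE_LL, MouseProc,
                                       GetModuleHandleW(NULL), 0);
        // Low-level hooks require a message loop on the registering thread.
        MSG msg;
        while (GetMessageW(&msg, NULL, 0, 0) > 0) {
            TranslateMessage(&msg);
            DispatchMessageW(&msg);
        }
        UnhookWindowsHookEx(hook);
        return 0;
    }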

How to write a program that runs another GUI program inside it

I am not sure how to phrase the question, so here is a picture of the idea that came to mind.
So for example, when you run my "custom launcher" it displays a window with a couple buttons on the side which you can assign values to. When you click on a button, the appropriate program will run in the big panel on the right (in window mode).
This is all from the user's perspective of course. They will just see that the program they want to run appears in that panel. The actual implementation may have nothing to do with "one program running inside another program"
My own use case is limited to windows desktop platforms only, but if it is possible to generalize it that would be nice as well.
Is this actually possible? Can I write such a program that will run another program inside a panel? The program that's launched may be someone else's, such as MS paint or calculator.
Just to expand on my comment above, here is an approach that may work for you: Fake it :)
When you launch the program, intercept all window messages to the program that control its position on screen. That way it 'appears' to be fixed in place, but in reality it's still attached to the normal Windows desktop.
Here's some light reading for you:
Windows Event Hooks
A hook is a mechanism by which an application can intercept events, such as messages, mouse actions, and keystrokes. A function that intercepts a particular type of event is known as a hook procedure. A hook procedure can act on each event it receives, and then modify or discard the event.
I would recommend against it in a commercial application because you are modifying the behavior of software you don't own - that software may make assumptions about what its parent window is, but for experimentation there's the SetParent Win32 function.
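A minimal sketch of that experiment (hPanel and the window title passed to FindWindowW are assumptions for illustration; real code would also handle resizing, focus, and cleanup when the child exits):

    #include <windows.h>

    bool EmbedWindow(HWND hPanel, const wchar_t *title)
    {
        HWND hChild = FindWindowW(NULL, title);   // locate the target app's window
        if (!hChild)
            return false;

        // Strip the caption/border so it looks like part of our UI.
        LONG_PTR style = GetWindowLongPtrW(hChild, GWL_STYLE);
        style = (style & ~(WS_CAPTION | WS_THICKFRAME)) | WS_CHILD;
        SetWindowLongPtrW(hChild, GWL_STYLE, style);

        SetParent(hChild, hPanel);                // re-parent into our panel

        RECT rc;
        GetClientRect(hPanel, &rc);
        MoveWindow(hChild, 0, 0, rc.right, rc.bottom, TRUE);  // fill the panel
        return true;
    }

As noted above, this is fragile: the re-parented program may still assume it owns a top-level window, so treat it as an experiment rather than a shippable technique.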

Stop application stealing input

I have a third party application (I'll call it GreedyApp for brevity), which holds the mouse and keyboard input hostage when its window gets focus, i.e. it hides the standard mouse cursor, replaces it with its own cursor, and confines the cursor to its window. The only way to get input to other windows is to ALT+TAB away from GreedyApp.
I need to allow the user free use of all of the components of the system (the delivered system will be purely touch-screen), so at the minute the rest of the system becomes unusable once GreedyApp gets focus.
So far, I've hijacked user32.dll for GreedyApp, hooked SetCursor, ShowCursor and ClipCursor, and disabled them. The result is that GreedyApp no longer hides the cursor, and the cursor is free to roam wherever the user moves it, but...
The problem I'm left with, is that no matter where on the screen the cursor is pressed, or what keys on the keyboard are pressed (except ALT+TAB), the input is still directed into GreedyApp, and other windows don't receive any input.
I'm not sure how GreedyApp is achieving this, and therefore I don't yet know which API calls to hook to stop it. I thought it might have been using hooks itself, but even after hooking and disabling SetWindowsHookEx the problem persists.
So my question is this:
Either:
A) Is there a (relatively straight-forward) way to find out what API calls an application is making at runtime?
or
B) What method is GreedyApp likely to be using to stop other windows from receiving input?
The application was using RegisterRawInputDevices to get raw mouse and keyboard input, and using the flag RIDEV_CAPTUREMOUSE to stop other applications getting focus.
I've hooked the API call and now strip that flag before passing the parameters on to the real Windows API. The user now has control over the system :)
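A sketch of what that replacement function might look like, assuming the interception itself is already in place (e.g. via an IAT patch or Microsoft Detours, with RealRegisterRawInputDevices pointing at the original API):

    #include <windows.h>
    #include <vector>

    // Pointer to the original function, filled in by whatever hooking
    // mechanism is used (an assumption for this sketch).
    static decltype(&RegisterRawInputDevices) RealRegisterRawInputDevices = nullptr;

    BOOL WINAPI HookedRegisterRawInputDevices(PCRAWINPUTDEVICE pDevices,
                                              UINT uiNumDevices, UINT cbSize)
    {
        // Copy the caller's device array and clear the capture flag,
        // so GreedyApp can no longer monopolize mouse focus.
        std::vector<RAWINPUTDEVICE> devices(pDevices, pDevices + uiNumDevices);
        for (RAWINPUTDEVICE &dev : devices)
            dev.dwFlags &= ~RIDEV_CAPTUREMOUSE;
        return RealRegisterRawInputDevices(devices.data(), uiNumDevices, cbSize);
    }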

WH_KEYBOARD_LL hook doesn't capture input in own process

I'm using a low-level keyboard hook (WH_KEYBOARD_LL) to disable certain input, such as Alt-Tab. I create the hook on a thread with a message pump, so I can properly handle the notifications.
The hook's callback function is able to process keyboard events whenever I'm not focused in the window that created the hook (i.e. my main window), but as soon as I activate that window, no events show up in the hook until I deactivate the window again and the input instead propagates to the window's WindowProc.
Does anybody have any clue what's going on here?
UPDATE: So, it turns out this behavior is caused by also registering for raw input in the same process. Apparently, using raw input causes my low-level keyboard hook to be disabled whenever my process’s window is focused. Does anybody know why and how to work around this?
Windows doesn't call low-level keyboard hooks if the most recently registered hook (aka the first hook to be executed) comes from a process that registered itself for raw keyboard events.
So a workaround is to create a second low-level keyboard hook in another process afterwards. Yes, this will cause both low-level keyboard hooks to be executed even when the focus is on a window from the first process.
Bad for performance, and who knows what Windows will bodge next - so I'm not really endorsing it - but it works.
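For reference, a minimal sketch of the kind of WH_KEYBOARD_LL hook the question describes (here swallowing Alt+Tab):

    #include <windows.h>

    LRESULT CALLBACK KeyboardProc(int nCode, WPARAM wParam, LPARAM lParam)
    {
        if (nCode == HC_ACTION) {
            const KBDLLHOOKSTRUCT *kb = (const KBDLLHOOKSTRUCT *)lParam;
            // LLKHF_ALTDOWN is set while Alt is held; eat Tab in that state.
            if (kb->vkCode == VK_TAB && (kb->flags & LLKHF_ALTDOWN))
                return 1;   // non-zero return discards the keystroke
        }
        return CallNextHookEx(NULL, nCode, wParam, lParam);
    }

    int main()
    {
        HHOOK hook = SetWindowsHookExW(WH_KEYBOARD_LL, KeyboardProc,
                                       GetModuleHandleW(NULL), 0);
        // Low-level hooks require a message loop on the registering thread.
        MSG msg;
        while (GetMessageW(&msg, NULL, 0, 0) > 0) {
            TranslateMessage(&msg);
            DispatchMessageW(&msg);
        }
        UnhookWindowsHookEx(hook);
        return 0;
    }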

Step-through in VS without VS having focus

I find myself working with GUI code where the GUI program needs to keep input focus and remain the topmost window, but whenever I'm debugging with VS, stepping through with F5/F10/F11 requires that VS has focus.
Is it possible to have VS intercept the F-keys whilst the debuggee has focus? If VS doesn't have this functionality, I imagine it should be possible to write a simple program or VS add-in that has a keyboard hook and commands the debugger accordingly (sketched below) - has anyone developed such a program?
I'm working with a GUI test automation framework that sends mouse-clicks and other events by moving the cursor. When the debuggee program is out of focus, any click on its surface brings the main window forward but does not activate any controls, yet the automation framework assumes that the application's focus will never be interrupted. So if I set a breakpoint before a click that is meant to open the File menu, then the click that is sent will only restore the debuggee's focus and not open the File menu (if that makes sense).
I've done some searching but couldn't find anything immediately.
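A rough prototype of such a helper, under heavy assumptions: it grabs F10 system-wide with RegisterHotKey and asks a running Visual Studio instance to step over through its DTE automation object. "VisualStudio.DTE" is the version-independent ProgID; real installs may need a versioned one such as "VisualStudio.DTE.17.0", and grabbing a bare F10 hotkey is heavy-handed for anything beyond an experiment.

    #include <windows.h>
    #include <comdef.h>

    void StepOver()
    {
        CLSID clsid;
        if (FAILED(CLSIDFromProgID(L"VisualStudio.DTE", &clsid)))
            return;
        IUnknown *unk = nullptr;
        if (FAILED(GetActiveObject(clsid, NULL, &unk)))
            return;                                 // no running VS instance
        IDispatch *dte = nullptr;
        unk->QueryInterface(IID_IDispatch, (void **)&dte);
        unk->Release();
        if (!dte)
            return;
        // Invoke DTE.ExecuteCommand("Debug.StepOver") by name.
        DISPID dispid;
        OLECHAR *name = (OLECHAR *)L"ExecuteCommand";
        if (SUCCEEDED(dte->GetIDsOfNames(IID_NULL, &name, 1,
                                         LOCALE_USER_DEFAULT, &dispid))) {
            _variant_t arg(L"Debug.StepOver");
            DISPPARAMS params = { &arg, nullptr, 1, 0 };
            dte->Invoke(dispid, IID_NULL, LOCALE_USER_DEFAULT,
                        DISPATCH_METHOD, &params, nullptr, nullptr, nullptr);
        }
        dte->Release();
    }

    int main()
    {
        CoInitialize(NULL);
        RegisterHotKey(NULL, 1, 0, VK_F10);   // global hotkey: plain F10
        MSG msg;
        while (GetMessageW(&msg, NULL, 0, 0) > 0)
            if (msg.message == WM_HOTKEY)
                StepOver();                   // step without giving VS focus
        CoUninitialize();
        return 0;
    }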
Why do you need to maintain focus? Do you have specific hooks in GotFocus/LostFocus?
I've had problems before with the Paint event firing as soon as F5 was hit, causing the debugger to come to the front again and therefore requiring another repaint. I got around these simply by arranging my windows so they didn't overlap. I'm pretty sure the LostFocus/GotFocus pair also don't fire when the windows are arranged this way.
