We've run into a tricky situation developing our Windows application (the problem applies to both our old WinForms-based app and our new UWP-based app). The application is used on a touch-screen device (15 to 27 inch touch screens, depending on system size). Hence, no touch pad is involved: touch screen only, and the screen has no Windows-level adjustments available that affect this scenario. The touch device is identified as a “10-point multi touch screen” by Windows.
When using the tap gesture (as opposed to our previous use of PointerPressed), the interface gets very picky about the way the user taps. If there is even the slightest "glide" as the finger touches the button (which, our usability studies show, is very often the case), the pan event is triggered instead of the tap event and the button is not "pressed". The pan gesture is used for other purposes in the application.
The problem is especially noticeable on smaller devices (with higher DPI), probably because the finger "does not scale" down as the display does.
Is there a way to adjust the tap gesture for high DPI? Something along the lines of an "if the slide is less than x pixels, then it's not a pan but a tap" kind of value?
Google has little to offer on the subject. On the other hand, we simply cannot be the first to run into this situation!?
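To make the question concrete, here is the kind of knob we are imagining, sketched at the raw Win32 pointer-message level (Windows 8+) instead of the gesture recognizer. The TAP_SLOP_DIPS constant and OnTap() handler are placeholders of ours, not Windows settings, and GetDpiForWindow requires Windows 10 1607 or later:

    // Hypothetical DPI-scaled tap-vs-pan threshold using raw Win32 pointer
    // messages. TAP_SLOP_DIPS is our own constant, not an OS value.
    #include <windows.h>
    #include <windowsx.h>
    #include <cmath>

    static const float TAP_SLOP_DIPS = 8.0f; // allowed finger glide, in device-independent pixels
    static POINT g_downPos;                  // screen position of the initial touch

    void OnTap();                            // placeholder tap handler

    LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
    {
        switch (msg)
        {
        case WM_POINTERDOWN:
            g_downPos.x = GET_X_LPARAM(lParam); // pointer messages carry screen coordinates
            g_downPos.y = GET_Y_LPARAM(lParam);
            return 0;

        case WM_POINTERUP:
        {
            // Scale the slop radius with the window's DPI so the same
            // physical finger glide is tolerated on small, dense screens.
            const float dpiScale = GetDpiForWindow(hwnd) / 96.0f;
            const float dx = (float)(GET_X_LPARAM(lParam) - g_downPos.x);
            const float dy = (float)(GET_Y_LPARAM(lParam) - g_downPos.y);
            if (std::sqrt(dx * dx + dy * dy) <= TAP_SLOP_DIPS * dpiScale)
                OnTap(); // treat the release as a tap despite the small glide
            return 0;
        }
        }
        return DefWindowProc(hwnd, msg, wParam, lParam);
    }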
I am creating a Cocoa application in Xcode 6, and the application uses an NSOpenGLView. The draw method in my extension of NSOpenGLView is getting called repeatedly, but I am not sure at what rate it is being called, or how to set that rate.
Is there a default "framerate" for NSOpenGLView, and is there a way to change it?
Apple has a technote describing how to drive an OpenGL rendering loop. The answer is to use a Core Video display link (CVDisplayLink), which will invoke a callback during the blanking interval.
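A minimal sketch of that setup, using the C CVDisplayLink API (RenderFrame is a placeholder for whatever your view's drawing entry point is):

    #include <CoreVideo/CVDisplayLink.h>

    void RenderFrame(void *context);  // placeholder: draw (or mark dirty) one frame

    static CVReturn DisplayLinkCallback(CVDisplayLinkRef displayLink,
                                        const CVTimeStamp *inNow,
                                        const CVTimeStamp *inOutputTime,
                                        CVOptionFlags flagsIn,
                                        CVOptionFlags *flagsOut,
                                        void *context)
    {
        // Invoked once per display refresh, on a background thread,
        // in time for the next blanking interval.
        RenderFrame(context);
        return kCVReturnSuccess;
    }

    void StartRenderLoop(void *context)
    {
        CVDisplayLinkRef link;
        CVDisplayLinkCreateWithActiveCGDisplays(&link);
        CVDisplayLinkSetOutputCallback(link, DisplayLinkCallback, context);
        CVDisplayLinkStart(link); // fires at the display's refresh rate, e.g. 60 Hz
    }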
Generally in any window system, the window is not redrawn on a periodic schedule; it only happens in response to events that cause a "damaged" or "dirty" state.
The number of things that cause this "damaged" state has gotten a lot smaller in recent years thanks to compositing window managers (OS X uses one such window manager). It used to happen whenever another window moved over top of yours, but in modern window managers it only happens during/after a resize event or when the window is moved.
As you would expect, Cocoa's documentation says the same thing:
- update
Called by Cocoa when the view’s window moves or when the view itself moves or is resized.
Owen Taylor writes an excellent blog; this diagram illustrates what may happen:
If a compositor isn’t redrawing immediately when it receives damage from a client, but is waiting a bit for more damage, then it’s possible it might wait too long and miss the vertical reblank entirely. Then the frame rate could drop way down, even if there was plenty of CPU and GPU available.
I develop audio plugins, which run inside their hosts and work in real time. Each plugin has its own window with controls, which often contains some kind of analysis pane: a pretty big rectangle that gets repainted repeatedly (e.g. 20-50 times per second). This all works well.
The trouble comes when the user adjusts a parameter: the plugin uses WM_MOUSEMOVE to track mouse movements and on each change calls ::InvalidateRect to have the relevant portion of the window redrawn. If you move quickly enough, the window does get repainted quickly, but there seems to be no time left for the host and other windows to redraw, and these usually show some kind of analysis feedback too, so it is really not ideal.
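For context, the pattern looks roughly like this (UpdateParameterFromMouse and g_knobRect are placeholder names, not real plugin code):

    #include <windows.h>
    #include <windowsx.h>

    extern RECT g_knobRect;                       // placeholder: rect of the control being dragged
    void UpdateParameterFromMouse(int x, int y);  // placeholder parameter update

    LRESULT CALLBACK PluginWndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
    {
        switch (msg)
        {
        case WM_MOUSEMOVE:
            if (wParam & MK_LBUTTON) // user is dragging a parameter
            {
                UpdateParameterFromMouse(GET_X_LPARAM(lParam), GET_Y_LPARAM(lParam));
                // Every mouse move immediately invalidates the control, so a
                // WM_PAINT is generated as soon as our message queue drains.
                ::InvalidateRect(hwnd, &g_knobRect, FALSE);
            }
            return 0;
        }
        return DefWindowProc(hwnd, msg, wParam, lParam);
    }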
Now, my questions:
1) Assuming the host and the other windows use ::InvalidateRect too, why is mine prioritized?
2) How can I make ::InvalidateRect lower priority? The window still needs to be invalidated, but it can happen later; the rest of the system must get time for its redrawing too.
Thanks in advance!
We're making a user-space device driver for OS X that moves the cursor using Quartz Events, and we ran into a problem: games, especially ones that run in windowed mode, can't properly capture the mouse pointer (that is, contain it within the boundaries of their windows). For example, the pointer can go outside the game window and click on the desktop or on nearby inactive applications.
We could fix this if only we could detect when an active application calls CGAssociateMouseAndMouseCursorPosition.
How would you do this? Any ideas are appreciated.
I don't know if this can help you, but there is an option called Focus Follows Mouse.
Focus Follows Mouse - the mouse pointer will automatically change focus to a new window inside this one app when you mouse over it, instead of you having to click a window to get focus and then clicking again to do something.
http://wineskin.urgesoftware.com/tiki-index.php?page=Manual+4.6+Advanced+-+Options
I have written a few different mouse logical layers (for bridging different input devices, etc.). I have found that hooking into the OS-level WM_INPUT message is a sure way of getting truly real-time mouse position information. There is also a less rigorous solution of just polling the mouse data you need from one of Windows' very primitive DLLs; they are lightning fast. You could poll on a 10 ms timer and never see performance loss on a modern machine.
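If it helps, registering for WM_INPUT is only a few lines; a sketch (usage page 0x01 / usage 0x02 is the standard HID mouse, and feeding the deltas into your own layer is up to you):

    #include <windows.h>

    // Register to receive WM_INPUT for the HID mouse. RIDEV_INPUTSINK
    // delivers input even when the window is not focused.
    bool RegisterForRawMouse(HWND hwnd)
    {
        RAWINPUTDEVICE rid = {};
        rid.usUsagePage = 0x01;            // generic desktop controls
        rid.usUsage     = 0x02;            // mouse
        rid.dwFlags     = RIDEV_INPUTSINK;
        rid.hwndTarget  = hwnd;
        return RegisterRawInputDevices(&rid, 1, sizeof(rid)) != FALSE;
    }

    // Call this from the window procedure's WM_INPUT case.
    LRESULT OnRawInput(LPARAM lParam)
    {
        RAWINPUT raw;
        UINT size = sizeof(raw);
        GetRawInputData((HRAWINPUT)lParam, RID_INPUT, &raw, &size,
                        sizeof(RAWINPUTHEADER));
        if (raw.header.dwType == RIM_TYPEMOUSE)
        {
            LONG dx = raw.data.mouse.lLastX; // relative motion since last event
            LONG dy = raw.data.mouse.lLastY;
            (void)dx; (void)dy;              // feed these into your mouse layer
        }
        return 0;
    }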
It is possible to use the SetWindowPos API on Windows to keep a window always on top of other windows, and there are many questions on StackOverflow dealing with this.
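For reference, that call typically looks something like this:

    #include <windows.h>

    // Keep an existing window above all non-topmost windows.
    void MakeTopmost(HWND hwnd)
    {
        ::SetWindowPos(hwnd, HWND_TOPMOST, 0, 0, 0, 0,
                       SWP_NOMOVE | SWP_NOSIZE | SWP_NOACTIVATE);
    }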
Is it possible to keep only part of a window always visible? I.e., can you specify a clipping region inside an existing window and keep only that part visible?
A use case would be the following (on Windows):
User clicks on icon to run app.
User highlights a portion of the screen to focus on (similar to the Snipping Tool on Windows 7).
The highlighted part of the screen remains always visible, even when other windows/programs are moved over the selected region.
I know the issues that would spring up when other applications are also set to be topmost. I'm just curious whether this is even possible.
Even if you change part of your window to be transparent to what's below it (with a clipping region), it's still going to receive all the mouse clicks etc. that occur over the transparent part.
Your best bet is to create a new smaller window and make it top-most while hiding the main one.
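A sketch of that approach (the window class and its paint logic are left out; "RegionViewerClass" and the function name are placeholders):

    #include <windows.h>

    // Hide the full-size window and show a small topmost popup covering just
    // the selected screen region. "RegionViewerClass" must be registered
    // elsewhere with RegisterClassEx; its WM_PAINT handler draws the region.
    HWND ShowRegionOnTop(HWND mainWnd, RECT region)
    {
        ShowWindow(mainWnd, SW_HIDE);

        return CreateWindowExW(
            WS_EX_TOPMOST | WS_EX_TOOLWINDOW, // stay on top, no taskbar button
            L"RegionViewerClass", L"",
            WS_POPUP | WS_VISIBLE,
            region.left, region.top,
            region.right - region.left,
            region.bottom - region.top,
            NULL, NULL, GetModuleHandleW(NULL), NULL);
    }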
How would I go about implementing something along the lines of "scrubby sliders", like in Photoshop and quite a few other image-processing applications?
They are slightly hard to describe: basically you have a regular numeric input box, but you can click and hold the mouse button on it, and it then functions like a slider until you release. If you click in the box, you can select text, edit, paste, etc. as usual.
The Photoshop docs describe it, and I put together a quick example video (showing the sliders in Shake).
A related implementation is the jog wheel in Final Cut Pro, which functions similarly but without a numeric readout underneath.
I can't seem to find any mention of how to implement these, although there are probably alternative names for the control. It is for an OS X 10.5 Cocoa application.
It is for a colour-grading application, where a user might need to make anything from tiny adjustments (0.001, for example) to huge ones (say, from -100 to +100) with the same control. A regular slider isn't accurate enough over that range of values.
Copy-and-pasting values into the box is a secondary concern compared to scrubbing them, and the Photoshop/Shake setup handles both really well. The unobviousness of the control is also of low concern, as it's not a "regular desktop application".
I've encountered those. They suck, because they prevent the user from dragging to select the text of the number.
A better idea would be a miniature slider beneath the field that expands to a full-size slider when the user holds down the mouse button on it and collapses back to its miniature size when the user releases the mouse button. This way, the selection behavior is still available, but you also provide the slider—and in a more obvious way.
There's no built-in class in Cocoa for either one. You'll have to implement your own.
I doubt that this exists in the Cocoa framework. As far as I remember, it is not mentioned in the Apple Human Interface Guidelines.
You can develop one yourself by using a custom view and tracking mouse events (-mouseDown:, -mouseUp:, -mouseDragged:).
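The tracking logic itself is only a few lines. Here is a platform-neutral sketch of the value mapping you would drive from -mouseDragged:; the per-pixel step and fine-adjust factor are arbitrary choices for illustration, not anything Photoshop or Shake prescribes:

    // Platform-neutral scrub state: horizontal drag deltas become value
    // changes, with a modifier key switching to fine adjustments.
    struct ScrubState
    {
        double value      = 0.0;
        double minValue   = -100.0;  // matches the coarse range in the question
        double maxValue   =  100.0;
        double perPixel   = 0.1;     // value change per horizontal pixel
        double fineFactor = 0.01;    // applied while a modifier key is held

        // Call from the drag handler (e.g. -mouseDragged:) with the
        // horizontal movement since the previous event.
        void drag(double deltaX, bool fineModifier)
        {
            const double step = perPixel * (fineModifier ? fineFactor : 1.0);
            value += deltaX * step;
            if (value < minValue) value = minValue;
            if (value > maxValue) value = maxValue;
        }
    };

With fineFactor engaged, each pixel of drag changes the value by 0.001, which covers the fine end of the range described in the question while the unmodified drag covers the coarse end.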