Secondary touch monitor not moving the cursor but still pressing buttons (or emitting events for a Qt app) - Windows

I intend to make an app for Windows in Qt with multi-monitor support, with one (secondary) monitor being a touch screen. I want one person to be able to work on the touch screen independently of a second person, who would be working on the main screen with mouse and keyboard.
I don't have the touch monitor yet, so I don't know how it really behaves, but I am afraid that touching the monitor will move the cursor (mouse pointer), making work with the mouse very hard.
So my question:
Is it somehow possible to make the touch screen not affect the cursor in any way (not interrupting drag & drop counts, too), while still being able to press buttons and so on, either at the Windows level or at the Qt level?
No button pressing, just generating QTouchEvents (or similar), would be sufficient, too.
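To make the Qt side of what I mean concrete, a minimal sketch along these lines (Qt 5, with a hypothetical widget just for illustration) would already be enough, assuming the touch monitor's input reaches the app as touch events rather than as mouse input:

```cpp
// Minimal Qt 5 sketch (hypothetical widget, for illustration only): accept raw
// touch events so they arrive as QTouchEvent instead of being synthesized into
// mouse events by Qt. This covers only the Qt layer; whether Windows itself
// moves the system cursor when the monitor is touched is a separate question.
#include <QApplication>
#include <QWidget>
#include <QTouchEvent>
#include <QDebug>

class TouchPanel : public QWidget {
public:
    TouchPanel() {
        setAttribute(Qt::WA_AcceptTouchEvents);   // ask Qt for touch events
    }

protected:
    bool event(QEvent *e) override {
        switch (e->type()) {
        case QEvent::TouchBegin:
        case QEvent::TouchUpdate:
        case QEvent::TouchEnd: {
            auto *touch = static_cast<QTouchEvent *>(e);
            for (const auto &point : touch->touchPoints())
                qDebug() << "touch point at" << point.pos();
            e->accept();   // handled here, so no mouse events are synthesized
            return true;
        }
        default:
            return QWidget::event(e);
        }
    }
};

int main(int argc, char **argv) {
    QApplication app(argc, argv);
    TouchPanel panel;
    panel.show();          // in the real app this would go on the touch monitor
    return app.exec();
}
```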
Thanks for your responses.

Related

Absolute mouse buttons on dock

I have created a piece of hardware that acts as an absolute mouse.
It is very much like a Wacom tablet, but it uses a regular mouse feeding into a device that then increments or decrements the X and Y coordinates in order to specify the exact location of the pointer (yes, I need to do this for another reason).
Everything works fine on multiple operating systems, and it mostly works fine on Mac OS X Yosemite, except for the following:
When I move to the Dock and click on an icon, it will not start the application. I can move the icon, and once the app is already open I can close it or scroll; anything else you would expect from a mouse works, but launching just does not work properly. Also, the back arrow in System Settings/Preferences only operates if I am at exactly the right location on the button, i.e. normally there is no back action, but if I fiddle with it and hit just the right spot, it finally works.
This sounds like an oversight in the operating system; does anyone have any insight into this?

Make a key click a certain place on the screen

I am running a game that has buttons on both sides of the screen, which makes for easy control on a tablet. But on ARC it is difficult to use, because you need to move your mouse across the screen a bunch of times. Does ARC Welder have an option to make a key on the keyboard "tap" a certain place on the screen?
If you are comfortable with the concept of scripting, you could use AutoHotkey to map keyboard events to clicks on specific areas of the screen. This would go through your OS, not ARC, but I think the script can be tied to a specific application so it only runs with that app.
See specifically the Click command.
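For illustration, the same idea can also be expressed directly against the Win32 API instead of AutoHotkey; in the rough C++ sketch below, the hotkey (F8) and the target coordinates are arbitrary placeholders:

```cpp
// Rough Win32 C++ equivalent of the AutoHotkey idea above (not AutoHotkey itself):
// pressing F8 synthesizes a left click at a fixed screen position. The hotkey and
// the coordinates are arbitrary placeholders; adjust them for the actual game.
#include <windows.h>

static void clickAt(int x, int y) {
    SetCursorPos(x, y);                           // move the pointer to the target

    INPUT in[2] = {};
    in[0].type = INPUT_MOUSE;
    in[0].mi.dwFlags = MOUSEEVENTF_LEFTDOWN;      // press
    in[1].type = INPUT_MOUSE;
    in[1].mi.dwFlags = MOUSEEVENTF_LEFTUP;        // release
    SendInput(2, in, sizeof(INPUT));
}

int main() {
    RegisterHotKey(NULL, 1, 0, VK_F8);            // global hotkey, no modifiers

    MSG msg;
    while (GetMessage(&msg, NULL, 0, 0) > 0) {
        if (msg.message == WM_HOTKEY)
            clickAt(100, 500);                    // placeholder "button" position
    }
    UnregisterHotKey(NULL, 1);
    return 0;
}
```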

Win7 and multiple monitors: switch focus when moving mouse to different monitor?

I recently upgraded to a dual monitor setup at work, and while the extra real estate is very nice, there's one annoyance: my intuitive reaction is that there are two "active" windows now, namely the topmost window in each monitor -- and I frequently get surprised when keyboard events go to the actual active window, rather than the one I've moused over and am looking at.
There's a setting in the control panel that gives this effect (ease of access -> make the mouse easier to use -> activate a window by hovering over it with the mouse) but it also acts on windows within the same monitor, which I don't want.
I frequently use my ThinkPad's scrolling function on unfocused windows that I don't want to receive focus, which, come to think of it, probably adds to my confusion: I can scroll my email in the other window, but my keyboard shortcuts don't go there.
Is there any way to achieve this effect or am I just wishing?
Thanks,
Ryan
Yeah, get a Mac :-p
In all seriousness, OS X does provide this functionality. It might be worth searching for an add-on that does the same sort of thing on Windows. I know of WizMouse -- http://antibody-software.com/web/software/software/wizmouse-makes-your-mouse-wheel-work-on-the-window-under-the-mouse/
There might be more though.
AT LAST!!! Windows 10 has this support :-)
SM
You can change the settings to use the classic Windows appearance, etc., and pay attention to the border color of the window: the border changes on the active window.
I use two monitors, and there really isn't much you can do besides change your behavior.
Select things from the taskbar, drag active windows to the same screen, move windows you only need to refer to onto the other monitor, and remember to go back to the window you want to be active.

Touch screen hide cursor

I have an existing Windows XP based application that has 2 screens (and currently 2 PCs, one hosting each screen). One is a touch screen and the other a normal mouse-driven screen. The touch screen is used for quick user actions, e.g. a touch triggers an action. The application uses the mouse pointer within the non-touch application window to determine where to carry out the action. I want to get rid of the PC hosting the touch screen and just have the touch screen hosted on one PC (dual screen). However, if I do this, is there any way to stop the cursor moving to the touch screen? I don't think I have focus issues, because I can use WS_EX_NOACTIVATE within the touch screen application (the touch screen application only has to respond to touch events).
I have seen some internet posts saying that the cursor can be hidden via the touch panel configuration (if supported), but does anyone know whether there is Windows OS support for this? I am free to move to Windows 7 if that provides the answer. I also don't particularly want to capture the events at the device level (before they reach the OS).
Windows XP doesn't have native support for touch screens (because at the time it was written there were almost no touch devices), so touch events coming from a touch screen are treated as mouse events on Windows XP. I don't think there is any way to tell the difference between a touch and a mouse click on Windows XP (at the application level, after the input reaches the OS, rather than at the device level before it does).
Windows 7, on the other hand, introduced real support for touch. Whenever a touch event happens, you get a WM_TOUCH message (once your window is registered for touch input), which is easy to use and, of course, has nothing to do with the mouse.
In conclusion, I think you should upgrade to Windows 7, as it has much better support for touch input. If you decide to go with the Win7 WM_TOUCH route, here's another article that should be helpful to you.
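For reference, a minimal sketch of that path might look roughly like this (Win32 C++, Windows 7 or later); window creation and the rest of the application are omitted, and only the touch handling itself is shown:

```cpp
// Minimal Win32 sketch of the WM_TOUCH path (Windows 7 or later). Window
// creation and the rest of the application are omitted; only the touch
// handling is shown. Note that the window must be registered for touch
// input, otherwise touches are still promoted to mouse/gesture messages.
#define _WIN32_WINNT 0x0601
#include <windows.h>

// Call once after the window has been created.
void EnableTouch(HWND hwnd) {
    RegisterTouchWindow(hwnd, 0);
}

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam) {
    switch (msg) {
    case WM_TOUCH: {
        UINT count = LOWORD(wParam);              // number of touch points
        if (count > 16) count = 16;
        TOUCHINPUT inputs[16];
        if (GetTouchInputInfo((HTOUCHINPUT)lParam, count, inputs, sizeof(TOUCHINPUT))) {
            for (UINT i = 0; i < count; ++i) {
                // Coordinates arrive in hundredths of a pixel, in screen space.
                LONG x = inputs[i].x / 100;
                LONG y = inputs[i].y / 100;
                (void)x; (void)y;                 // ...dispatch to the touch UI here...
            }
            CloseTouchInputHandle((HTOUCHINPUT)lParam);
            return 0;                             // handled: nothing reaches the mouse path
        }
        break;
    }
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}
```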

Cocoa accessibility API, can I click a window in the background without activating it?

I've been searching forever for a solution to this, so I thought I'd seek out the brainpower of greater minds than mine. I'm developing a Cocoa app that uses the Accessibility API to manipulate another program (it's a hotkey app). The app I'm controlling typically has multiple windows open, with some hidden behind others. What I would like to do, if it's possible, is to send mouse events to windows using the Accessibility API in a way that presses a button in the window without bringing it to the foreground (interact with the window but don't activate it). The reason I'm trying to do this is that sending the mouse event to this other window will force it to the foreground and disrupt the user's interaction with the foremost window.
This is possible on Windows - apparently, because apps similar to mine do it there - but I'm getting the feeling that this isn't possible with Cocoa, given the way the window manager works. Am I mistaken?
Accessibility is higher-level than that. You send, for example, AXPress actions to AXButton objects, but “press” is not necessarily a click—pressing the space bar while a view is focused, for example, is also a “press”. AXPress is a high-level action that means “do your thing”, which obviously has meaning for some views (such as buttons) and not others (such as fields).
Accessibility activating the application does make sense when you look at it from its intended purpose: Assistive devices for disabled users. If the user “presses” something by whatever means, they probably intend to activate the application and work in it.
Quartz Event Services will get you almost there: You can create an event tap for the process you want to control, and you can forge events and send them to a tap. The catch is that you can only send events to a tap when the tap fires—i.e., when the application already has an event to deal with. When it doesn't, you're stuck.
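To make the distinction concrete, a rough sketch of the high-level AXPress route described above (using the plain C Accessibility API, callable from a Cocoa app) could look like this; locating the button element inside the target application's accessibility hierarchy, and the accessibility permission the controlling app needs, are assumed and not shown:

```cpp
// Rough sketch of the AXPress route (plain C API from ApplicationServices,
// callable from a Cocoa app). Finding the AXButton inside the target
// application's accessibility hierarchy, and the accessibility permission
// the controlling app needs, are assumed and not shown here.
#include <ApplicationServices/ApplicationServices.h>

// Top-level accessibility element for the target app (caller must CFRelease it).
static AXUIElementRef applicationElement(pid_t pid) {
    return AXUIElementCreateApplication(pid);
}

// Ask a button element to perform its high-level "press" action. No mouse
// events are forged; the target view decides what "press" means (which, as
// noted above, may well include activating the application).
static AXError pressButton(AXUIElementRef buttonElement) {
    return AXUIElementPerformAction(buttonElement, kAXPressAction);
}
```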

Resources