I have an existing Windows XP based application that has two screens (and currently two PCs, one hosting each screen). One is a touch screen and the other a normal mouse-driven screen. The touch screen is used for quick user actions, e.g. a touch triggers an action. The application uses the mouse pointer within the non-touch application window to determine where to carry out the action. I want to get rid of the PC hosting the touch screen and just have both screens hosted on one PC (dual screen). However, if I do this, is there any way to stop the cursor moving to the touch screen? I don't think I have focus issues, because I can use WS_EX_NOACTIVATE within the touch screen application (it only has to respond to touch events).
I have seen some internet posts saying that the cursor can be hidden via the touch panel configuration (if supported), but does anyone know whether there is Windows OS support for this? I have the freedom to move to Windows 7 if that provides the answer. I also don't particularly want to capture the events at the device level (before they reach the OS).
Windows XP doesn't have native support for touch screens (at the time it was written there were almost no touch devices), so touch events coming from a touch screen are treated as mouse events on Windows XP. I don't think there is any way to distinguish between a touch and a mouse click in Windows XP (at the application level after reaching the OS, not at the device level before reaching it).
Windows 7, on the other hand, introduced real support for touch. Once a window has registered for touch input, it receives a WM_TOUCH message whenever a touch event happens, which is very easy to use and, of course, has nothing to do with the mouse.
In conclusion, I think you should upgrade to Windows 7, as it has much better support for touch input. If you decide to go with the Win7 WM_TOUCH route, here's another article that should be helpful to you.
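To give a flavour of it, here is a minimal sketch of a window procedure that opts in to WM_TOUCH. The APIs are standard Win32 from the Windows 7 SDK; the surrounding window creation and message loop are assumed:

```cpp
#define _WIN32_WINNT 0x0601 // Windows 7, needed for the touch APIs
#include <windows.h>

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_CREATE:
        // Opt in to WM_TOUCH; without this, touches arrive as gestures/mouse.
        RegisterTouchWindow(hwnd, 0);
        return 0;

    case WM_TOUCH:
    {
        UINT cInputs = LOWORD(wParam);
        TOUCHINPUT inputs[16]; // enough for this sketch; size to your needs
        if (cInputs > 16) cInputs = 16;
        if (GetTouchInputInfo((HTOUCHINPUT)lParam, cInputs, inputs, sizeof(TOUCHINPUT)))
        {
            for (UINT i = 0; i < cInputs; ++i)
            {
                // Touch coordinates are in hundredths of a pixel, screen-relative.
                POINT pt = { TOUCH_COORD_TO_PIXEL(inputs[i].x),
                             TOUCH_COORD_TO_PIXEL(inputs[i].y) };
                ScreenToClient(hwnd, &pt);
                if (inputs[i].dwFlags & TOUCHEVENTF_DOWN)
                {
                    // React to the touch here, without involving the mouse.
                }
            }
            CloseTouchInputHandle((HTOUCHINPUT)lParam);
            return 0; // handled; don't pass to DefWindowProc
        }
        break;
    }
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}
```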
I intend to make an app for Windows in Qt with multi-monitor support, with one (secondary) monitor being a touch screen. I want one person to be able to work with the touch screen independently of a second person, who would be working on the main screen with mouse and keyboard.
I don't have the touch monitor yet, so I don't know how it really works, but I am afraid that touching the monitor would move the cursor (mouse pointer), making work with the mouse very hard.
So my question:
Is it possible to make the touch screen not affect the cursor in any way (not interrupting drag & drop counts, too), while still being able to push buttons and so on, at either the Windows or Qt level?
No button pushing, but just generating QTouchEvents (or similar), would be sufficient, too.
Thanks for your responses.
I need to do a pre-purchase evaluation of a Flash application that is intended for a touch screen.
Since I don't have the touchscreen yet, I need to run the application on my desktop computer, and the application is unusable without a visible cursor.
I am using Windows.
Is there a way to unhide the cursor without asking the developers to change the application?
I've previously used remote access software (such as Windows Remote Desktop or TeamViewer) for this purpose. Another option is a virtual machine - in both cases you'll be able to see the cursor on the local/host machine.
If you happen to be on a Windows 8 machine, you might give the Windows Simulator a try (http://blogs.msdn.com/b/visualstudio/archive/2011/09/29/first-look-at-windows-simulator.aspx, available for free with Visual Studio Express), which additionally simulates multitouch gestures such as pinch/rotate with only a mouse.
A few other ideas:
1) You can try using the "Show location of pointer when I press the CTRL key" mouse visibility property (Control Panel - Hardware and Sound - Devices and Printers - Mouse - Pointer options). Although not entirely convenient, it might help you if the application doesn't require quick response times.
2) If the application is distributed as a .swf file and the right button hasn't been disabled, sometimes right-clicking (anywhere in the application) to bring up the context menu will cause the cursor to show up and remain visible.
I'm developing an app that will be available at a regional fair, and the public will use it to quickly download "perks" to their pen drives. BUT when you move your mouse to the top-right corner of the screen, a "menu" appears (there's a similar thing on tablets) and it enables the user to quit/switch out of my app, and that I can't allow! How do I block that?
You'll have to wait for Windows 8.1, i.e. 18th October. 8.1 offers a way to manage this. Moreover, it also offers a kiosk mode, through which you can allow only one app to run on top.
This is not under the control of your application; it's a feature of the operating system, and as far as I know there's no way to block it. Typically such changes would be done through group policy settings, but at least in Windows 8 there's no such control available. I'm not sure about Windows 8.1, but I haven't stumbled upon any mention of such a feature there, either.
Also, this is not the only way for the user to switch out of your application. He could also drag the app down from the top edge to close it, or go to the bottom-left corner to open the Start screen, if he's using the mouse. The keyboard would give him even more options, of course.
I have a game developed for Windows 7 in XNA which uses mouse events only! There are buttons around the screen that the user plays with.
Will this same game work on Windows 7 touch screens with single touch? I do not need any multi-touch functionality.
If not, how do I get touch events in XNA on a Windows 7 PC?
Touchscreens do send mouse events.
They might, however, send the pressed/released/moved mouse states a little differently depending on the model. For example, the original single-touch Eee Top sometimes never sent a pressed state (you got a move instead, even if you just tapped), while the later multi-touch versions worked as expected.
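For completeness: on Windows 7 you can also tell whether a given mouse message originated from touch, using the extra-info signature Microsoft documents for exactly this purpose. A Win32 sketch (XNA itself doesn't expose this, so you'd have to reach it via P/Invoke):

```cpp
#include <windows.h>

// Signature documented by Microsoft for mouse messages synthesized
// from touch or pen input on Windows 7.
#define MOUSEEVENTF_FROMTOUCH 0xFF515700

// Call while handling a mouse message (e.g. WM_LBUTTONDOWN) to find out
// whether it was generated by a touch screen rather than a real mouse.
bool IsMouseMessageFromTouch()
{
    return (GetMessageExtraInfo() & MOUSEEVENTF_FROMTOUCH) == MOUSEEVENTF_FROMTOUCH;
}
```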
I have a hidden process that waits for non-standard hardware button messages and runs an application (with CreateProcess). There's no problem with disturbing the user; it's an action the user approved himself. Everything is fine with the usual layout, with the taskbar shown and multiple captioned and non-captioned windows. But the situation differs between XP and 7 when the current application is full-screen. A full-screen application in this case is a borderless window with exactly the same dimensions as the screen. Windows hides the taskbar for such an application even if it's set to always on top.
In XP it's OK: the taskbar is shown in this case, and the application (for example Calculator) too, and the full-screen app remains visible in areas other than the launched app's and the taskbar's. But in Windows 7 nothing visual happens: the full-screen app stays on top, and if I switch to the taskbar, the executed application is there. I tried to solve it with SetForegroundWindow, BringWindowToTop, even an AllowSetForegroundWindow(GetCurrentProcessId()) call for a window handle found with CreateProcess-WaitForInputIdle-EnumThreadWindows, with no change. So did something change since XP, related to full-screen windows, that is officially documented?
Thanks,
Max
I would imagine that, if you have your own hardware device, there is some API for generating "real" user input. Clearly the legacy keyboard and mouse drivers, and now USB HID drivers (many of which are user-mode, I think?), have access to an API to do so.
Synergy+, for example, can generate fake keyboard and mouse events on connected PCs, and the consequence of the faked input is windows switching activation normally.
So, my initial idea is for your user-mode "device" application to synthesize actual keyboard messages. SendInput seems a likely candidate for the API that can "fake" real user input events.
Then, use an API like RegisterHotKey in your "UI" app to respond to the hotkey combination your device app generates.
Now (assuming that SendInput IS generating user input events at the correct level), you should, from within the WM_HOTKEY handler in your UI app, have permission (because everything was "user initiated") to change the foreground window to yourself.
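A rough sketch of both halves of that idea. The hotkey combination (Ctrl+Alt+F9) and the hotkey id are arbitrary choices; the UI app's window setup is assumed:

```cpp
#include <windows.h>

// --- Device-side process: synthesize Ctrl+Alt+F9 as "real" user input.
void SendHotkey()
{
    INPUT in[6] = {};
    WORD keys[3] = { VK_CONTROL, VK_MENU, VK_F9 };
    for (int i = 0; i < 3; ++i)
    {
        in[i].type = INPUT_KEYBOARD;
        in[i].ki.wVk = keys[i];                 // key down: Ctrl, Alt, F9
        in[5 - i].type = INPUT_KEYBOARD;
        in[5 - i].ki.wVk = keys[i];             // key up, in reverse order
        in[5 - i].ki.dwFlags = KEYEVENTF_KEYUP;
    }
    SendInput(6, in, sizeof(INPUT));
}

// --- UI-side window procedure: claim the hotkey and bring ourselves forward.
LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_CREATE:
        RegisterHotKey(hwnd, 1, MOD_CONTROL | MOD_ALT, VK_F9);
        return 0;
    case WM_HOTKEY:
        // The input was "user initiated", so the foreground change
        // should be permitted here.
        SetForegroundWindow(hwnd);
        return 0;
    case WM_DESTROY:
        UnregisterHotKey(hwnd, 1);
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}
```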
Vista introduced the desktop composition feature. In short, all windows draw to in-memory bitmaps, and the Desktop Window Manager then composes these bitmaps and draws them on a full-screen Direct3D surface. Full-screen windows do not participate in desktop composition and get to draw directly on the screen (mostly because the majority of full-screen apps are games that need real-time screen updates).
In particular, this means that when a full-screen app is up and running, it is covering the DWM-composed image, and the user needs to switch to a DWM-managed window for the DWM to start drawing on top of the full-screen app.
I don't have a good solution for your problem, unfortunately. One way to solve it would be to add the WS_CAPTION style to your app and then handle WM_NCPAINT/WM_NCCALCSIZE/WM_NCHITTEST yourself. This would allow you to lie to the DWM that you are a regular windowed application, but visually change your NC area to look like you have no title bar, as sketched below. However, this does require a certain amount of additional code and might be a bit more effort than you want to invest.
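A minimal sketch of that trick, assuming a window created with WS_CAPTION (the rest of the window setup is omitted, and a real implementation would need more care around sizing and hit-testing):

```cpp
#include <windows.h>

// Keep WS_CAPTION so the DWM treats the window as a normal one, but claim
// the whole frame as client area so it still looks borderless.
LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_NCCALCSIZE:
        if (wParam)
            return 0; // leave the proposed rect untouched: no NC area at all
        break;
    case WM_NCPAINT:
        return 0;     // nothing to paint; there is no visible NC area
    case WM_NCHITTEST:
        return HTCLIENT; // every point behaves as client area (no drag/resize)
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}
```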
Another way you can try to solve your problem is to explicitly minimize your full-screen application window when launching the new process. However, you will then have to solve the problem of when to restore it again; one option is sketched below.
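For example, restoring when the launched process exits is one answer to that problem. A sketch, where hFullScreen is assumed to be your full-screen window:

```cpp
#include <windows.h>

void LaunchAndRestore(HWND hFullScreen, wchar_t* cmdLine)
{
    STARTUPINFOW si = { sizeof(si) };
    PROCESS_INFORMATION pi = {};

    ShowWindow(hFullScreen, SW_MINIMIZE); // get out of the way first
    if (CreateProcessW(NULL, cmdLine, NULL, NULL, FALSE, 0, NULL, NULL, &si, &pi))
    {
        // Blocks until the child exits; do this on a worker thread in real code.
        WaitForSingleObject(pi.hProcess, INFINITE);
        CloseHandle(pi.hThread);
        CloseHandle(pi.hProcess);
    }
    ShowWindow(hFullScreen, SW_RESTORE);
}
```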
Btw, you might find the comments on this post from Raymond Chen interesting.
Windows supports multiple desktops, and my guess would be that the full-screen app is using a different desktop than the default one (where your application will be shown). A desktop object in Windows is "a logical display surface and contains user interface objects such as windows, menus, and hooks". For example, screen savers are normally started on a separate desktop.
You can find out which desktop an application is running on using Process Explorer:
1) Set Process Explorer to replace Task Manager and to run always on top.
2) When your full-screen app is shown, launch Process Explorer by pressing Ctrl+Shift+Esc.
3) Within Process Explorer, select the full-screen process and press Ctrl+H to display the handles of that process.
4) Look at the value of the Desktop item in the list. Usually this is set to Default.
If you know which desktop this app is running on, you can start your process on the same desktop by putting the desktop's name into the lpDesktop member of the STARTUPINFO you pass to CreateProcess (you can call OpenDesktop first if you want to verify that the desktop exists and is accessible).
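A sketch of that, assuming the desktop name is "Default" (substitute whatever Process Explorer showed). Note that lpDesktop takes the desktop's name, not a handle:

```cpp
#include <windows.h>

void LaunchOnDesktop(const wchar_t* desktopName, wchar_t* cmdLine)
{
    // Optional sanity check that the desktop exists and we can access it.
    HDESK hDesk = OpenDesktopW(desktopName, 0, FALSE, DESKTOP_CREATEWINDOW);
    if (!hDesk)
        return;
    CloseDesktop(hDesk);

    STARTUPINFOW si = { sizeof(si) };
    // The "WinSta0\\Default" window-station-qualified form also works.
    si.lpDesktop = const_cast<wchar_t*>(desktopName);

    PROCESS_INFORMATION pi = {};
    if (CreateProcessW(NULL, cmdLine, NULL, NULL, FALSE, 0, NULL, NULL, &si, &pi))
    {
        CloseHandle(pi.hThread);
        CloseHandle(pi.hProcess);
    }
}
```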