Windows Multitouch Events and LabVIEW - windows-7

I'm having some problems with multi-touch and LabView.
My objective is to intercept the Windows Touch Messages (generated by multitouch monitors and then interpreted and handled by Windows 7) that are intended for any window owned by a program called LabVIEW.
This will keep Windows from delivering Touch Messages to LabVIEW while letting me use those messages to create custom responses in LabVIEW myself, and it will still let Windows use the Touch Messages as normal for every other program the user may want to interact with.
LabVIEW is not registered with Windows 7 to interpret Windows Touch Messages, so it handles them with the default Windows 7 responses.
I have developed a LabVIEW library that creates the custom multitouch-enabled responses, but it currently requires me to supply my own driver for the multitouch monitor in order to stop Windows 7 from listening to the monitor's touch events and converting them into its own Touch Messages. That is inefficient: I want users to be able to plug in any commercial multitouch monitor and use it with my code, and I don't want to write a custom driver for every monitor type.
So, I want to intercept the Touch Messages intended for LabVIEW (and only those Windows Touch Messages) so that they:
- never reach LabVIEW, and
- can be sent on to my existing program for reinterpretation, via TCP messages over localhost (the best transport I've found so far).
If anybody has any ideas I'd be exceedingly grateful!

LabVIEW does not see the Windows Touch Events, as you already know; the only events you see are the ones available in the Event Structure. However, you can use .NET callbacks to catch other Windows events and then create User Events to feed them back into your Event Structure. Below are a couple of links that might help, followed by a sketch of the Win32 side of the interception:
Capturing Windows System Events without Polling (Windows)
Windows Message Queue Library
Use windows touch screen (multi touch) and distinguish get touch event and mouse click
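One Win32 route that matches what the question asks for is a WH_GETMESSAGE hook injected into LabVIEW's process, which can swallow WM_TOUCH before LabVIEW's message loop dispatches it. Below is a rough C++ sketch of the hook DLL; treat it as a sketch under stated assumptions, not a tested implementation: the window-matching rule and the Forward stub (where the TCP send over localhost would go) are illustrative, and nothing here has been verified against LabVIEW itself.

```cpp
// Sketch of a WH_GETMESSAGE hook DLL. Built as a DLL and installed from a
// small host app with:
//   g_hook = SetWindowsHookEx(WH_GETMESSAGE, GetMsgProc, hThisDll, 0);
#define _WIN32_WINNT 0x0601   // Windows 7, for the WM_TOUCH APIs
#include <windows.h>
#include <cwchar>
#include <vector>

static HHOOK g_hook = nullptr;

// Stub: hand the touch points to the interpreting program, e.g. over a
// localhost TCP socket as the question proposes (socket code omitted).
static void Forward(const std::vector<TOUCHINPUT>& points) { /* ... */ }

// Illustrative only: a real build needs a reliable way to recognize
// LabVIEW's windows (class name, owning process, etc.).
static bool IsLabViewWindow(HWND hwnd)
{
    wchar_t cls[256] = {};
    GetClassNameW(hwnd, cls, 256);
    return wcsstr(cls, L"LabVIEW") != nullptr;   // assumed match rule
}

// Runs inside the hooked process. Note: Windows only posts WM_TOUCH to
// windows that called RegisterTouchWindow, so this DLL must first register
// LabVIEW's windows (legal here, because the hook runs in LabVIEW's process).
static LRESULT CALLBACK GetMsgProc(int code, WPARAM wParam, LPARAM lParam)
{
    if (code == HC_ACTION && wParam == PM_REMOVE)
    {
        MSG* msg = reinterpret_cast<MSG*>(lParam);
        if (msg->message == WM_TOUCH && IsLabViewWindow(msg->hwnd))
        {
            UINT count = LOWORD(msg->wParam);
            std::vector<TOUCHINPUT> points(count);
            HTOUCHINPUT hti = reinterpret_cast<HTOUCHINPUT>(msg->lParam);
            if (GetTouchInputInfo(hti, count, points.data(), sizeof(TOUCHINPUT)))
            {
                Forward(points);
                CloseTouchInputHandle(hti);
                msg->message = WM_NULL;   // swallow: LabVIEW never sees it
            }
        }
    }
    return CallNextHookEx(g_hook, code, wParam, lParam);
}
```

With the touches swallowed at the hook, the Forward stub is free to serialize the TOUCHINPUT array over localhost TCP for reinterpretation, exactly as the question describes.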

Use the Event Structure in a while loop and register only the events you want LabVIEW to handle.

If you are willing to pay for it, there is a commercial toolkit that supports multi-touch and smartphone-style gestures on a number of touchscreen devices via UPDD or Windows 7 touch messages:
https://www.ni.com/en-us/shop/software/products/touchscreen-toolkit-for-labview.html

Related

How to determine display / touch-device associations for Windows 10?

I am currently writing an application that receives touch input through the Windows WM_INPUT messages and the HID API. Every touch point received carries a handle to the device it came from; this is the same device that the WM_POINTER family of messages would report for that touch point. My application needs to know which monitor corresponds to a particular touch device. Is there a programmatic way to determine this? If I were using the WM_POINTER API, I could use MonitorFromPoint or something similar.
In the Control Panel, under "Hardware and Sound", there is a category "Tablet PC Settings". Clicking it opens a dialog box with a "Setup" button, which launches a calibration tool that lets you pair a USB HID touch device with a monitor.
Does anyone know where these settings might be saved to?
Relevant links:
Structure received in WM_INPUT messages:
https://msdn.microsoft.com/en-us/library/windows/desktop/ms645562(v=vs.85).aspx
Structure received in WM_POINTER messages:
https://msdn.microsoft.com/en-us/library/windows/desktop/hh454907(v=vs.85).aspx
Thanks.
A generic way to determine where things are stored in the registry is to watch registry changes. Process Monitor, from the Windows Sysinternals Suite by Mark Russinovich, can watch and log changes to the Windows registry. Start it logging, perform your calibration, then stop it and examine the log for the relevant registry activity.
Here is a link to a similar question I asked:
Associate HID Touch Device with Pnp Monitor.
In short, you can combine the details from the HID API calls with queries of registry keys to link HID Touch Devices to monitors.
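As a starting point for the HID side of that, here is a small C++ sketch (assumptions: console app, error handling trimmed) that enumerates the raw input devices and prints the device interface path behind each WM_INPUT device handle. That path embeds the HID instance ID you would then match against the registry data found with Process Monitor:

```cpp
// Enumerate raw input devices and print each device interface path.
#include <windows.h>
#include <iostream>
#include <string>
#include <vector>

int main()
{
    // First call gets the device count, second fills the list.
    UINT count = 0;
    GetRawInputDeviceList(nullptr, &count, sizeof(RAWINPUTDEVICELIST));
    std::vector<RAWINPUTDEVICELIST> devices(count);
    GetRawInputDeviceList(devices.data(), &count, sizeof(RAWINPUTDEVICELIST));

    for (const auto& dev : devices)
    {
        // dev.hDevice is the same handle that RAWINPUT::header.hDevice
        // reports for touch points arriving via WM_INPUT.
        UINT size = 0;
        GetRawInputDeviceInfoW(dev.hDevice, RIDI_DEVICENAME, nullptr, &size);
        std::wstring name(size, L'\0');
        GetRawInputDeviceInfoW(dev.hDevice, RIDI_DEVICENAME, &name[0], &size);
        std::wcout << name.c_str() << L"\n";   // e.g. \\?\HID#VID_...&PID_...#...
    }
}
```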

Can Windows Store apps generate UI events?

I am considering writing a Kinect v2.0 gesture-to-keyboard/mouse event translator so I can control video games. Since I will be using Microsoft's SDK, cross-platform is out of the question, so it seems natural to distribute this through the Windows Store. However, I know Windows Store apps have significant restrictions. Can a Windows Store app:
Run in the background (possibly with an elevated priority to ensure that the game doesn't miss input)?
Create user input events like "key-down" and "mouse move" that will be read by other applications?
Looking at Microsoft's capability page didn't seem to give me a definite yes or no.
You'll need to write this as a desktop app. Windows Store apps run in a sandboxed context with limited access to the system. They cannot interact with other processes as you'd need, and they cannot inject input events.
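For the desktop-app route, the relevant Win32 call is SendInput. A minimal C++ sketch of the kind of injection a gesture translator would perform (the helper names are illustrative; note that some games read input through paths that filter injected events):

```cpp
#include <windows.h>

// Inject a space-bar press and release, visible to whichever
// application currently has keyboard focus.
void PressSpace()
{
    INPUT in[2] = {};
    in[0].type = INPUT_KEYBOARD;
    in[0].ki.wVk = VK_SPACE;            // key down
    in[1].type = INPUT_KEYBOARD;
    in[1].ki.wVk = VK_SPACE;
    in[1].ki.dwFlags = KEYEVENTF_KEYUP; // key up
    SendInput(2, in, sizeof(INPUT));
}

// Inject a relative mouse move, e.g. mapped from a hand gesture.
void NudgeMouse(LONG dx, LONG dy)
{
    INPUT in = {};
    in.type = INPUT_MOUSE;
    in.mi.dx = dx;
    in.mi.dy = dy;
    in.mi.dwFlags = MOUSEEVENTF_MOVE;   // relative move
    SendInput(1, &in, sizeof(INPUT));
}
```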

Windows phone app that detects call, SMS and alarms and sends notification to PC

I am trying to develop a Windows Phone app that detects calls and sends a notification to a PC over Wi-Fi. Similarly, it should forward received SMS messages to the PC, and when an alarm goes off on the phone, send a notification to the PC. How should I proceed with developing this app?
Thanks in advance.
There is no API exposed in Windows Phone that allows you to spy on incoming calls or SMS (as there was with Windows Mobile). The only option would be to build this as an OEM extension, but unless you work for HTC, Samsung, or Nokia, that isn't going to be an option.
It's not that you can't do this at all; to some extent you can. For example, you can use the Obscured event to detect a call, the lock screen, and so on. Go through this MSDN discussion for details:
Detect lock and calls msdn
As far as the alarm or reminder is concerned, you can design a reminder system within your app; the limit is 50 reminders per app.
Obscured event Windows phone

What devices are available to test WM_GESTURE and WM_TOUCH code on a desktop machine?

I'm writing some code to handle WM_GESTURE and WM_TOUCH events in Windows 7, but I can't figure out how to test it. I do my development in Boot Camp on a 17-inch MacBook Pro.
So far, I have determined that the Boot Camp trackpad driver in Windows 7 does not generate those events, and a generic trackpad I found on Amazon.com that claims to be multi-touch works as advertised, but not by creating WM_GESTURE or WM_TOUCH events. I verified this by using Spy++ to report the events; nothing with the WM_GESTURE or WM_TOUCH value was reported.
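One caveat worth knowing about that Spy++ test: a window only receives WM_TOUCH after it opts in with RegisterTouchWindow (otherwise Windows delivers WM_GESTURE or promoted mouse messages instead), so watching an ordinary window will never show WM_TOUCH even on capable hardware. A minimal Win32 probe window along these lines makes the check unambiguous (a sketch; the class name is arbitrary):

```cpp
// Minimal harness to check whether a device actually produces touch input.
#define _WIN32_WINNT 0x0601
#include <windows.h>
#include <cstdio>

LRESULT CALLBACK WndProc(HWND h, UINT m, WPARAM w, LPARAM l)
{
    switch (m)
    {
    case WM_TOUCH:   printf("WM_TOUCH, %u contact(s)\n", LOWORD(w)); break;
    case WM_GESTURE: printf("WM_GESTURE\n"); break;
    case WM_DESTROY: PostQuitMessage(0); return 0;
    }
    return DefWindowProc(h, m, w, l);
}

int main()
{
    WNDCLASSW wc = {};
    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = GetModuleHandle(nullptr);
    wc.lpszClassName = L"TouchProbe";
    RegisterClassW(&wc);
    HWND hwnd = CreateWindowW(L"TouchProbe", L"Touch probe", WS_OVERLAPPEDWINDOW,
                              CW_USEDEFAULT, CW_USEDEFAULT, 400, 300,
                              nullptr, nullptr, wc.hInstance, nullptr);
    // Opt in to WM_TOUCH; without this call the window gets WM_GESTURE,
    // since the two message streams are mutually exclusive per window.
    RegisterTouchWindow(hwnd, 0);
    ShowWindow(hwnd, SW_SHOW);

    MSG msg;
    while (GetMessage(&msg, nullptr, 0, 0) > 0)
    {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
}
```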
What kind of hardware is supposed to generate these kinds of events? At this point, I'm assuming it's only tablet or mobile (Windows CE) hardware, but I'd appreciate any other suggestions.
I suppose there's another way to approach this: I want to get functionality similar to Cocoa's -[NSResponder swipeWithEvent:] and related methods, which report swipes, rotation, and other gestures on the trackpad. WM_GESTURE appears to be the equivalent on Windows 7.
Another option, which requires only an extra physical mouse to work and should get you at least 95% of the way there, is the Multi-Touch Vista project. It can emulate up to 256 touch points using physical devices; hence the need for an extra mouse, or two, since it can be awkward to work a mouse with one hand and the trackpad with the other at the same time.
There are several monitors out there supporting touch with Windows 7. For example: Acer T230H.
HTH
Wacom makes several touchpads that support multitouch; a particularly inexpensive one is the Bamboo Touch. It gives you touch without having to buy another monitor, although it doesn't give that direct-interaction feeling.

How to detect if a windows app is Tablet PC "Aware"

Does anyone know how I can determine whether an application is able to accept Tablet PC input, i.e. some kind of hit test or Windows message that I can send it?
thanks,
H
The Tablet PC Input Panel (which, as of Vista, is available even on non-tablets, other than the basic editions) will send input using the Text Services Framework (TSF) to an application that supports it. Otherwise it will send normal input messages to the application, which basically means a bunch of simulated keyboard events.
I suspect you are trying to determine whether or not an application supports TSF, which provides a much more integrated experience: a bidirectional correction interface, information about the current selection, context hints, and so on.
I don't know how reliable this is, but if the control in question is a rich edit control, you could try the EM_GETEDITSTYLE message, whose return value has a SES_USECTF flag that indicates whether or not TSF is turned on for that control. For standard edit controls I'm not so sure. This article on MSDN goes into much more detail about using the Text Services Framework and is probably more than you care to implement.
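For what it's worth, the rich-edit check is nearly a one-liner. A C++ sketch, where hwndRichEdit is assumed to be a rich edit control in the application under test (found with FindWindowEx or similar):

```cpp
#include <windows.h>
#include <richedit.h>   // EM_GETEDITSTYLE, SES_USECTF

// Returns true if TSF is enabled for the given rich edit control.
// EM_GETEDITSTYLE carries no pointers, so it is safe to send to a
// control owned by another process.
bool RichEditUsesTsf(HWND hwndRichEdit)
{
    LRESULT style = SendMessage(hwndRichEdit, EM_GETEDITSTYLE, 0, 0);
    return (style & SES_USECTF) != 0;
}
```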
