How to simulate touch events in Windows 7?

I would like to use a touch controller with Windows 7. I will receive touch coordinates as Bluetooth packets, and I want to use those coordinates to generate the corresponding touches in the OS, just as a real mouse click or touch would. How do I start with this? How can I register for touch events on Windows 7?

Download and install the Microsoft Surface SDK 2.0. It includes a small tool called Microsoft Surface Input Simulator which simulates touch events and some other things in Windows.
http://www.microsoft.com/en-us/download/details.aspx?id=26716
EDIT: I know this question is over a year old, but I thought it could help others.

There are some open source virtual touch drivers that you can use to simulate touch input if you don't have a touch input device:
http://multitouchvista.codeplex.com/
http://code.google.com/p/vmulti/
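Note that Windows 7 itself has no public user-mode API for injecting touch input (InitializeTouchInjection and InjectTouchInput only arrived in Windows 8), which is why a virtual HID driver like the projects above is the usual way to turn externally sourced coordinates, such as the Bluetooth packets in the question, into system-level touches.

For the other half of the question, registering for touch events: a Windows 7 application opts in by calling RegisterTouchWindow and then handling WM_TOUCH in its window procedure; without that call the system only delivers gesture messages and mouse emulation. Below is a minimal Win32 C++ sketch; the function names and handler bodies are placeholders, and error handling is omitted.

// Requires Windows 7 headers; link against user32.lib (the Win32 default).
#define _WIN32_WINNT 0x0601
#include <windows.h>

// Call once after the window is created to receive raw WM_TOUCH
// instead of the default WM_GESTURE stream.
void EnableTouch(HWND hwnd)
{
    RegisterTouchWindow(hwnd, 0);
}

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_TOUCH)
    {
        UINT count = LOWORD(wParam);   // number of active touch points
        if (count > 16) count = 16;    // clamp to our buffer size
        TOUCHINPUT inputs[16];
        if (GetTouchInputInfo((HTOUCHINPUT)lParam, count, inputs, sizeof(TOUCHINPUT)))
        {
            for (UINT i = 0; i < count; ++i)
            {
                // Coordinates arrive in hundredths of a pixel, in screen space.
                LONG x = inputs[i].x / 100;
                LONG y = inputs[i].y / 100;
                if (inputs[i].dwFlags & TOUCHEVENTF_DOWN) { /* touch began at (x, y) */ }
                if (inputs[i].dwFlags & TOUCHEVENTF_MOVE) { /* touch moved */ }
                if (inputs[i].dwFlags & TOUCHEVENTF_UP)   { /* touch ended */ }
            }
            CloseTouchInputHandle((HTOUCHINPUT)lParam); // always release the handle
            return 0;
        }
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}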

Related

Does the Unity Game View support Touch Events?

I would like to buy a Microsoft Surface Book, but before I do, I would like to know whether the Unity IDE supports touch events in the Game View of the Editor, or only when a real device is connected. The monitor itself is, after all, a touch device.
Thanks for help!
Regards,
Markus
A few days ago, there was a similar question about touch detection in Unity on Windows tablets.
There is no support for touch in the Editor: Input.touches and Input.GetTouch will not work in the Editor's Game View. They only work when you deploy the game.
Instead, touches are treated as mouse input.
This shouldn't be a problem at all, because Input.mousePosition and OnMouseDown do work there. You can use a preprocessor directive to detect when you are running in Editor mode, fall back to Input.mousePosition and OnMouseDown, and test your app without having to build it.
Everything else should be fine. Just remember that it is not a desktop computer; it is a mobile device and weaker than a desktop machine. Don't expect a game with 4K textures to run smoothly on it.

How to get TouchEvent on Windows desktop for OpenFL app?

I'm on a touch-enabled Windows machine, compiling to the Windows/C++ target using OpenFL and Haxe 3.
I cannot get touch events to work. Here's where I'm adding them:
private function onAdded(e:Event):Void
{
    stage.addEventListener(Event.RESIZE, resize);
    resize(null);
    init();

    addEventListener(Event.ENTER_FRAME, onEnterFrame);

    // Touch listeners: TOUCH_BEGIN on this object,
    // TOUCH_MOVE and TOUCH_END on the stage.
    addEventListener(TouchEvent.TOUCH_BEGIN, onTouchBegin);
    stage.addEventListener(TouchEvent.TOUCH_MOVE, onTouchMove);
    stage.addEventListener(TouchEvent.TOUCH_END, onTouchEnd);
}
My enterFrame() is getting invoked just fine, but no touch (or mouse) input triggers the handlers. Is this a Windows desktop limitation? Would this work once I deploy to iOS and Android? Why not? Is this an NME/OpenFL bug?
You can use the mouse in place of touch on Windows for now with the 1.1 update, until full support for multitouch is implemented. Touch works fine on iOS, Android, etc.
As singmajesty put it on the forum:
Try it the other way around: use a MouseEvent to handle both mouse and touch input on a desktop.
We just (in OpenFL 1.1, released this week) migrated to SDL2 as the backend for Windows. This actually has support for real touch events, so in the future I expect to see multi-touch support on the desktop. It was not that long ago that this was only a thing for mobile devices :)
So if you don't need to track individual touch points separately, mouse events should do it for you today. Otherwise, we're positioned now to be able to wire up support for this in the not-too-distant future :)

Windows Multitouch Events and LabView

I'm having some problems with multi-touch and LabView.
My objective is to intercept the Windows Touch Messages (generated by multitouch monitors and then interpreted and handled by Windows 7), which are intended for any and all windows owned by a program called LabVIEW.
This will prevent Windows from communicating Touch Messages with LabVIEW while allowing me to use the touch messages to create custom responses in LabVIEW myself. And, it will still allow Windows to use the Touch Messages as normal for any and all other programs which the user may want to interact with.
LabVIEW has not been registered with Windows 7 to interpret Windows Touch Messages specifically. It therefore handles them using default Windows 7 responses.
I have developed a library for LabVIEW which creates the custom multitouch-enabled responses, but it requires me to provide my own driver for the multitouch monitor being used, in order to prevent Windows 7 from listening to the monitor's touch event messages and converting them to its own set of Touch Messages. This is inefficient: I want users to be able to plug and play any commercial multitouch monitor with my code, and I don't want to write custom drivers for every monitor type.
So, I want to intercept the Touch Messages intended for LabVIEW (and only those Windows Touch Messages) so that they:

1. Never reach LabVIEW.
2. Can then be sent on to my existing program for reinterpretation via TCP messages over localhost (this seems the best way I've found so far).
If anybody has any ideas I'd be exceedingly grateful!
LabVIEW does not see the Windows Touch Events, as you already know. The only events you see are the ones you can use in the Event Structure. However, there are ways to use .NET callbacks to see other Windows events; you can then create User Events to feed them back to your Event Structure. Below are a couple of links that might help:
Capturing Windows System Events without Polling (Windows)
Windows Message Queue Library
Use windows touch screen (multi touch) and distinguish get touch event and mouse click
Use the event handler structure in a while loop and only register the events you want LabVIEW to handle.
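If you need to go a step lower and swallow the touch messages before LabVIEW's window procedure ever runs, one option is a WH_GETMESSAGE hook installed into LabVIEW's UI thread. The C++ sketch below is only an outline of that approach: it assumes you have already found LabVIEW's window and thread (e.g. via FindWindow and GetWindowThreadProcessId), it shows only the hook side, and ForwardTouch() is a hypothetical stub where you would serialize the points and send them to your program over TCP on localhost.

// Sketch: a WH_GETMESSAGE hook that swallows WM_TOUCH before LabVIEW handles it.
// This must be compiled into a DLL, because SetWindowsHookEx maps the DLL into
// the target (LabVIEW) process.
#define _WIN32_WINNT 0x0601
#include <windows.h>

static HHOOK g_hook = NULL;

static void ForwardTouch(const MSG* msg)
{
    // Hypothetical stub: decode with GetTouchInputInfo((HTOUCHINPUT)msg->lParam, ...)
    // and push the touch points to your TCP listener on localhost.
}

LRESULT CALLBACK GetMsgProc(int code, WPARAM wParam, LPARAM lParam)
{
    if (code == HC_ACTION && wParam == PM_REMOVE)
    {
        MSG* msg = (MSG*)lParam;
        if (msg->message == WM_TOUCH)
        {
            ForwardTouch(msg);
            CloseTouchInputHandle((HTOUCHINPUT)msg->lParam); // release the system buffer
            msg->message = WM_NULL;  // LabVIEW's window proc now sees a no-op
        }
    }
    return CallNextHookEx(g_hook, code, wParam, lParam);
}

// Exported installer: call from your own process with LabVIEW's UI thread id.
extern "C" __declspec(dllexport) BOOL InstallHook(DWORD labviewThreadId, HINSTANCE hThisDll)
{
    g_hook = SetWindowsHookEx(WH_GETMESSAGE, GetMsgProc, hThisDll, labviewThreadId);
    return g_hook != NULL;
}

One caveat, which follows from the question itself: a window that has never called RegisterTouchWindow receives WM_GESTURE and default mouse promotion rather than WM_TOUCH, so there would be nothing for this hook to catch. Since the hook DLL runs inside LabVIEW's process, it is also the natural place to call RegisterTouchWindow on LabVIEW's window first.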
If you are willing to pay for it, there is a commercial toolkit that supports multi-touch and smartphone-style gestures on a number of touchscreen devices via UDPP or Windows 7 messages:
https://www.ni.com/en-us/shop/software/products/touchscreen-toolkit-for-labview.html

How do I go about writing a windows driver for a bluetooth device?

I am looking for some advice/input on writing a device driver. I have never written one for Windows before, let alone for Bluetooth.
Can you recommend a book or website or something to get me started? I have the Windows Driver Kit and the examples therein, but without some place to start I am dead in the water.
The specifics: my friend gave me his Mac Magic Mouse. I have a Windows 7 machine. With the mouse set up as just a generic HID device, it works OK as a two-button mouse with no scrolling; the motion is smooth and the acceleration is what you expect from a Windows mouse.
The mouse actually has a fairly good lpi resolution, making it pretty sensitive. There are Mac drivers available, extracted from Boot Camp, and they kind of work. The cursor will randomly freeze or stop responding to the mouse's movement, which is buffered, and then leap once whatever caused the stall stops. As an added touch, the Mac drivers make the cursor move like it would on a Mac, with that logarithmic acceleration that will completely throw off any Windows user. With the drivers you get vertical and horizontal scrolling, but that's it. There is no multi-touch functionality, and you can't change any of the behaviors, like acceleration. There isn't a MultiClutch for Windows, or other third-party software for a multi-touch mouse.
So I figured I would endeavor to write my own driver and multi-touch functionality in Windows for this thing. I know Apple will never support it properly under Windows, and Microsoft won't write its own drivers until there is a reason to.
Also, if anyone knows of anyone else trying to do the same or similar things, please point me to them.

What devices are available to test WM_GESTURE and WM_TOUCH code on a desktop machine?

I'm writing some code to handle WM_GESTURE and WM_TOUCH events in Windows 7, but I can't figure out how to test it. I do my development in Boot Camp on a 17" MacBook Pro.
So far, I have determined that the Boot Camp trackpad driver in Windows 7 does not generate those events, and this generic trackpad I found on Amazon.com that claims to be 'multi-touch' works as advertised, but not by creating WM_GESTURE or WM_TOUCH events. I verified this by using Spy++ to report the events; nothing with the WM_GESTURE or WM_TOUCH value was reported.
What kind of hardware is supposed to generate these kinds of events? At this point, I'm assuming it's only for tablet or mobile (Windows CE) hardware, but I'd appreciate any other suggestions.
I suppose there's another way to approach this -- I want to get functionality similar to Cocoa's [NSResponder swipeWithGesture:] and related methods, which report back swipes, rotation, and other gestures on the trackpad. WM_GESTURE appears to be the equivalent on Windows 7.
Another option, which requires only an extra physical mouse and should get you at least 95% of the way there, is the Multi-Touch Vista project, which can emulate up to 256 touch points using physical devices; hence the need for an extra mouse, or two, since it can be awkward to work a mouse with one hand and a trackpad with the other at the same time.
There are several monitors out there that support touch with Windows 7, for example the Acer T230H.
HTH
Wacom makes several touchpads that support multitouch; a particularly inexpensive version is the Bamboo Touch. This gives you touch without having to buy another monitor - although it doesn't give that direct interaction feeling.
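Whatever hardware you try, it is worth confirming programmatically that Windows 7 actually recognizes it as a touch digitizer before blaming your WM_GESTURE/WM_TOUCH code. A quick console check in C++, using the standard GetSystemMetrics flags from the Windows 7 SDK:

// Quick check: does Windows report a touch digitizer for the installed hardware?
#define _WIN32_WINNT 0x0601
#include <windows.h>
#include <stdio.h>

int main(void)
{
    int caps = GetSystemMetrics(SM_DIGITIZER);
    if (caps == 0)
    {
        printf("No touch or pen digitizer detected.\n");
        return 0;
    }
    if (caps & NID_READY)            printf("Digitizer is ready for input.\n");
    if (caps & NID_INTEGRATED_TOUCH) printf("Integrated touch digitizer present.\n");
    if (caps & NID_EXTERNAL_TOUCH)   printf("External touch digitizer present.\n");
    if (caps & NID_MULTI_INPUT)      printf("Multi-input supported, up to %d touch points.\n",
                                            GetSystemMetrics(SM_MAXIMUMTOUCHES));
    return 0;
}

If this reports no digitizer, the device's driver is presenting itself as a plain mouse (as with the Boot Camp trackpad above), and no amount of application code will produce WM_TOUCH or WM_GESTURE messages.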
