I'm using the Windows 8 Simulator to do some testing of my website in IE10 using touch. However, when I select the single-point (basic) touch mode (or any touch mode, for that matter), I find that Windows (and IE) still generate hover events. What I really want is to test as if I were on a purely touch-based system with no support for hover.
Is it possible to adjust the Simulator so that it only responds to taps and related events, and not hover?
Related
I'm developing an Adobe Air application for Windows which makes use of a touch screen. When I touch the screen, both on the Windows desktop and within my app, there is a circular flash at the touch point.
How do I disable this, so that there is no visible indication of the touch point other than that provided by my app? It does not seem possible to disable this in the Windows touch settings, and there is no mention of visual feedback that I can find in the AIR API.
Try this... it worked for me on a Windows 7 system.
OK:
1. Click Start, type gpedit.msc and hit Enter.
2. Click through User Configuration -> Administrative Templates -> Windows Components -> Tablet PC -> Cursors.
3. Double-click "Turn off pen feedback" and select Enabled to enable the filter that turns off the ripple.
4. Do this not just for User Configuration; repeat the process for Computer Configuration to make sure it is disabled for everyone using the workstation.
I'm on a touch-enabled Windows machine building with OpenFL. I'm compiling to the Windows/C++ target using OpenFL and Haxe 3.
I cannot get touch events to work. Here's where I'm adding them:
private function onAdded(e:Event):Void
{
    stage.addEventListener(Event.RESIZE, resize);
    resize(null);
    init();

    addEventListener(Event.ENTER_FRAME, onEnterFrame);
    addEventListener(TouchEvent.TOUCH_BEGIN, onTouchBegin);
    stage.addEventListener(TouchEvent.TOUCH_MOVE, onTouchMove);
    stage.addEventListener(TouchEvent.TOUCH_END, onTouchEnd);
}
My onEnterFrame() handler is getting invoked just fine, but neither touch nor mouse input triggers the touch handlers. Is this a Windows desktop limitation? Would this work once I put it on iOS and Android? Why not? Is this an NME/OpenFL bug?
You can use the mouse in place of touch on Windows for now with the 1.1 update, until full multi-touch support is implemented. Touch works fine on iOS, Android, etc.
singmajesty on the forum
Try it the other way: use a MouseEvent to handle both mouse and touch input on a desktop.
We just (in OpenFL 1.1, released this week) migrated to SDL2 as the backend for Windows. This actually has support for real touch events, so in the future I expect to see multi-touch support for the desktop. It just wasn't that long ago that this was only a thing for mobile devices :)
So if you don't need to track individual touch points separately, mouse events should do it for you today. Otherwise, we're positioned now to be able to wire up support for this in the not-too-distant future :)
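For example, here is a minimal sketch of that mouse-event workaround (the class and handler names are illustrative, not from the original code, and on newer OpenFL versions the imports would be openfl.* rather than flash.*):

import flash.display.Sprite;
import flash.events.Event;
import flash.events.MouseEvent;
import flash.events.TouchEvent;

class PointerDemo extends Sprite
{
    public function new()
    {
        super();
        // stage is only available once this sprite has been added to it.
        addEventListener(Event.ADDED_TO_STAGE, onAdded);
    }

    function onAdded(e:Event):Void
    {
        // Desktop (OpenFL 1.1 Windows/C++ target): only mouse events fire for now.
        stage.addEventListener(MouseEvent.MOUSE_DOWN, onPointerDown);
        stage.addEventListener(MouseEvent.MOUSE_UP, onPointerUp);

        // Mobile targets dispatch real touch events to the same handlers.
        stage.addEventListener(TouchEvent.TOUCH_BEGIN, onPointerDown);
        stage.addEventListener(TouchEvent.TOUCH_END, onPointerUp);
    }

    function onPointerDown(e:Dynamic):Void
    {
        // Both event types expose stageX/stageY, so one handler serves both.
        trace("down at " + e.stageX + ", " + e.stageY);
    }

    function onPointerUp(e:Dynamic):Void
    {
        trace("up at " + e.stageX + ", " + e.stageY);
    }
}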
Can someone tell me how to test touch functionality in the Windows Phone 7 emulator from VS2010? I wrote a simple Windows Phone 7 app to test touch, but when I move the mouse and click inside the phone screen, nothing happens.
-Henry
The mouse does work for single-finger gestures and taps; I'm not sure why it wouldn't work for you, except to assume that your code must not be working the way you assume :-)
Just for reference, the other way of testing multi-touch is to have a multi-touch-enabled monitor.
I'm looking to target a website specifically for an iPad, but we don't have any Macs in house for testing. What's the most accurate way to test the site on a PC? I imagine I could use the Safari browser and shrink the window down to approximate the iPad screen size, but I wonder if there's a better method out there.
If you target a website specifically for a particular device, buy that particular device. This doesn't only apply to iPad.
Two caveats I've noticed a lot of websites hit on a touch-oriented device like an iPad or iPhone:
The mouse-hover event isn't generated. So an HTML/CSS/JavaScript menu structure which works without clicking in a WebKit browser (like Safari) on a mouse-oriented device might stop working completely (see the sketch below for a common mitigation).
The scrolling event (coming from a flick of a finger) is not passed to elements inside a page; instead it just scrolls the entire page. A subelement shown with a scroll bar on a non-touch-oriented device might be shown without the scroll bar at all. So, sometimes you lose the ability to scroll inside a subelement.
There might be other caveats. It's really difficult to imagine all the ways a device might behave differently from a mouse-oriented device; so, buy an iPad.
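On the hover caveat specifically, a common mitigation is to also toggle hover-driven menus on a tap. Here is a minimal sketch, written in Haxe compiled to JavaScript to match the code elsewhere in this post; the element id "menu" and the CSS class "open" are hypothetical, not taken from any particular site:

import js.Browser;

class TouchMenuFallback
{
    static function main()
    {
        // Hover-only menus never open on a touch device, so toggle the same
        // "open" class on click (Mobile Safari synthesizes a click from a tap).
        var menu = Browser.document.getElementById("menu");
        if (menu == null) return;

        menu.addEventListener("click", function(_) {
            menu.classList.toggle("open");
        });
    }
}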
By the way, it's of no use to buy a Mac in this situation: Safari on a Mac still behaves (as far as mouse/touch events are concerned) rather differently from Safari on an iPad/iPhone. An iPad works perfectly well alongside a Windows PC.
See this Apple document for some advice on preparing a web page for the iPad.
I'd just use Safari, as the mobile version uses the same rendering engine (though possibly modified to fit the iPad's resources).
It should display the same, or at least close to it.
You can try online device-emulation services.
For example, http://app.crossbrowsertesting.com/ or https://saucelabs.com/. They provide lots of emulated environments for different devices and operating systems. You can test a site that is already on the web, or your local files.
I'm currently working with app.crossbrowsertesting.com for the first time myself. It really does show the problem that the client encountered on his iPad. I've also heard good things about these services from an experienced developer, a friend of mine.
I'm writing some code to handle WM_GESTURE and WM_TOUCH events in Windows 7, but I can't figure out how to test it. I do my development in Boot Camp on a 17" MacBook Pro.
So far, I have determined that the Boot Camp trackpad driver in Windows 7 does not generate those events, and this generic trackpad I found on Amazon.com that claims to be 'multi-touch' works as advertised, but not by creating WM_GESTURE or WM_TOUCH events. I verified this by using Spy++ to report the events; nothing with the WM_GESTURE or WM_TOUCH value was reported.
What kind of hardware is supposed to generate these kinds of events? At this point, I'm assuming it's only for tablet or mobile (Windows CE) hardware, but I'd appreciate any other suggestions.
I suppose there's another way to approach this -- I want to get functionality similar to Cocoa's [NSResponder swipeWithEvent:] and related methods, which report back swipes, rotation, and other gestures on the trackpad. WM_GESTURE appears to be the equivalent on Windows 7.
Another option, which requires only an additional physical mouse to work with and should get you at least 95% of the way there, is the Multi-Touch Vista project, which can emulate up to 256 touch points using physical devices - hence the need for an extra mouse, or two, since it can be awkward to work a mouse with one hand and the trackpad with the other.
There are several monitors out there supporting touch with Windows 7. For example: Acer T230H.
HTH
Wacom makes several touchpads that support multitouch; a particularly inexpensive version is the Bamboo Touch. This gives you touch without having to buy another monitor - although it doesn't give that direct interaction feeling.