I would like to buy a Microsoft Surface Book, but before I do I would like to know: does the Unity editor support touch events in the Game View, or only when a real device is connected? The monitor itself is a touch device.
Thanks for your help!
Regards,
Markus
A few days ago there was a similar question about touch detection in Unity on Windows tablets.
There is no support for touch in the Editor: Input.touches and Input.GetTouch will not work in the Game View. They only work when you deploy the game.
For some reason, touches are treated as mouse input.
This shouldn't be a problem at all, because Input.mousePosition and OnMouseDown do work there. You can use a preprocessor directive to detect when you are running in the Editor and fall back to Input.mousePosition and OnMouseDown, so you can test your app without having to build it.
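For example, here is a minimal sketch of that approach (a single MonoBehaviour with a made-up HandleTap helper; adapt it to however your game handles taps):

using UnityEngine;

public class TapInput : MonoBehaviour
{
    void Update()
    {
#if UNITY_EDITOR
        // In the Editor Game View the touchscreen is reported as mouse input.
        if (Input.GetMouseButtonDown(0))
            HandleTap(Input.mousePosition);
#else
        // In a deployed build, real touch data is available.
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
            HandleTap(Input.GetTouch(0).position);
#endif
    }

    // Hypothetical handler; replace with whatever should happen on a tap.
    void HandleTap(Vector2 screenPosition)
    {
        Debug.Log("Tap at " + screenPosition);
    }
}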
Everything else should be fine. Just remember that it is a mobile computer and weaker than a full desktop; don't expect a game with 4K textures to run smoothly on this device.
Does anyone have any information to share regarding gesture detection using wrist movements (in contrast to swipe movements on the touchscreen) with Android Wear?
Is there official support for this in the API? If not, do you think this can be enabled via custom ROMs?
Considering Android Wear is... well, Android, and Android has all the API functions to get data from the accelerometer of an Android-based smartphone (and lots of helpful development resources), I assume it wouldn't be too hard? Or am I overlooking something?
Someone said the following at http://forum.xda-developers.com/android-wear/development/gesture-detection-using-wrist-t2936656
Use the Asus remote camera app for Android Wear. If you twist your watch, it will take a picture with your smartphone camera!
To which I replied:
So wrist gesture detection is perfectly possible, it seems? But can you or anyone else give me some additional background about Android Wear?
Should I see Android Wear as a simple extension of the Android API? By which I mean: can I utilise all the Android API functions related to interpreting smartphone/tablet sensor data, provided the smartwatch has that sensor (e.g. the accelerometer)?
I was looking in this particular section related to Wearables (developer.android.com/training/building-wearables.html) and I couldn't find any information about wrist gesture detection. Is that simply because everything else from the Android API is automatically also applicable to smartwatch development?
(As one can tell, I'm quite new to mobile development.)
So far no answer, so I'm now asking here in the hope of getting one...
Wrist gestures were added in Android Wear 5.1; however, at the time of this response, Android Wear 5.1 is only available on the LG Watch Urbane. The update is expected on other Wear devices in the coming weeks.
Two common wrist gestures are interpreted by the system as swipe up/down; which one you get depends on the speed of the flick away from or towards you.
If you flick your wrist away from you quickly, then back slowly, that will scroll down.
If you turn your wrist away slowly, then flick it quickly back towards you, that will scroll up.
You can see the wrist gestures in action here - https://www.youtube.com/watch?t=14&v=_R0qbB4hVbU
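As for the other part of the question, reading the raw sensor data yourself: a Wear app is still an Android app, so the standard SensorManager API exposes the accelerometer just as it does on a phone. A minimal sketch (shown here in C# with the Xamarin.Android bindings, since the native Java calls have the same shape; the activity name is made up, and turning the samples into wrist-flick detection is left to your own logic):

using Android.App;
using Android.Content;
using Android.Hardware;
using Android.OS;

[Activity(Label = "WristDemo", MainLauncher = true)]
public class WristDemoActivity : Activity, ISensorEventListener
{
    SensorManager sensorManager;
    Sensor accelerometer;

    protected override void OnCreate(Bundle bundle)
    {
        base.OnCreate(bundle);
        sensorManager = (SensorManager)GetSystemService(Context.SensorService);
        accelerometer = sensorManager.GetDefaultSensor(SensorType.Accelerometer);
    }

    protected override void OnResume()
    {
        base.OnResume();
        // Start receiving accelerometer samples while the activity is visible.
        sensorManager.RegisterListener(this, accelerometer, SensorDelay.Game);
    }

    protected override void OnPause()
    {
        base.OnPause();
        sensorManager.UnregisterListener(this);
    }

    public void OnSensorChanged(SensorEvent e)
    {
        // e.Values holds the x, y and z acceleration; feed these into whatever
        // gesture-recognition logic (thresholds, filtering, a trained model) you use.
        float x = e.Values[0], y = e.Values[1], z = e.Values[2];
    }

    public void OnAccuracyChanged(Sensor sensor, SensorStatus accuracy)
    {
        // Not needed for this sketch.
    }
}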
You can use the IBM Wearables SDK for Android to record and recognise all kinds of gestures, not only wrist gestures. For more info: https://github.com/ibm-wearables-sdk-for-mobile/ibm-wearables-android-sdk
Disclaimer: I am the author of the articles/code described below.
I just created an Accessibility Service that does exactly what you are describing: it gives you the ability to completely control your smartwatch using wrist gestures.
I created it for my own use (I can only use one of my hands, the one in which I wear the watch) and I can fully control it using wrist gestures.
https://jsalatas.ictpro.gr/handsfree-wear-a-wrist-gesture-input-method-for-android-based-smartwatches/
Note that at the current stage I'm not sure whether the model that recognizes the wrist gestures can generalize, or whether it only recognizes my own wrist gestures.
I'm on a touch-enabled Windows machine, compiling to the Windows/C++ target using OpenFL and Haxe 3.
I cannot get touch events to work. Here's where I'm adding them:
private function onAdded(e:Event):Void
{
stage.addEventListener(Event.RESIZE, resize);
resize(null);
init();
addEventListener( Event.ENTER_FRAME, onEnterFrame);
addEventListener( TouchEvent.TOUCH_BEGIN, onTouchBegin );
stage.addEventListener( TouchEvent.TOUCH_MOVE, onTouchMove );
stage.addEventListener( TouchEvent.TOUCH_END, onTouchEnd );
}
My onEnterFrame() is invoked just fine, but neither touch nor mouse input triggers the handlers. Is this a Windows desktop limitation? Would this work once I deploy to iOS and Android? If not, why not? Is this an NME/OpenFL bug?
You can use mouse events in place of touch on Windows for now with the 1.1 update, until full multitouch support is implemented. Touch works fine on iOS, Android, etc.
As singmajesty said on the forum:
Try it the other way, use a MouseEvent to handle both mouse and touch input on a desktop.
We just migrated (in OpenFL 1.1, released this week) to SDL2 as the backend for Windows. This actually has support for real touch events, so in the future I expect to see multi-touch support for the desktop. It was not that long ago that this was only a thing for mobile devices :)
So if you don't need to track individual touch points separately, mouse events should do it for you today. Otherwise, we're now positioned to wire up support for this in the not-too-distant future :)
I would like to use a touch controller on Windows 7. I will receive the touch coordinates as Bluetooth packets and want to use those coordinates to generate the corresponding touches in the OS, just as human mouse clicks or touches do. How should I start with this, and how can I generate touch events on the Windows 7 OS?
Download and install the Microsoft Surface SDK 2.0. It includes a small tool called Microsoft Surface Input Simulator, which simulates touch events and some other input in Windows.
http://www.microsoft.com/en-us/download/details.aspx?id=26716
EDIT: I know this question is over a year old, but I thought it could help others.
There are some open-source virtual touch drivers that you can use to simulate touch input if you don't have a touch input device:
http://multitouchvista.codeplex.com/
http://code.google.com/p/vmulti/
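If you want to generate the input programmatically from your Bluetooth packets rather than through a simulator tool, note that Windows 7 has no public touch-injection API (InjectTouchInput only arrived in Windows 8), so a pragmatic option is to translate each coordinate into synthetic mouse input via the Win32 SendInput function. A hedged C# P/Invoke sketch (the struct layout follows the native INPUT/MOUSEINPUT definitions):

using System;
using System.Runtime.InteropServices;

static class MouseInjector
{
    [StructLayout(LayoutKind.Sequential)]
    struct MOUSEINPUT
    {
        public int dx;
        public int dy;
        public uint mouseData;
        public uint dwFlags;
        public uint time;
        public IntPtr dwExtraInfo;
    }

    [StructLayout(LayoutKind.Sequential)]
    struct INPUT
    {
        public uint type;   // 0 = INPUT_MOUSE
        public MOUSEINPUT mi;
    }

    const uint MOUSEEVENTF_MOVE = 0x0001;
    const uint MOUSEEVENTF_LEFTDOWN = 0x0002;
    const uint MOUSEEVENTF_LEFTUP = 0x0004;
    const uint MOUSEEVENTF_ABSOLUTE = 0x8000;

    [DllImport("user32.dll", SetLastError = true)]
    static extern uint SendInput(uint nInputs, INPUT[] pInputs, int cbSize);

    // x and y must be scaled to the 0..65535 range that MOUSEEVENTF_ABSOLUTE
    // expects (0,0 = top-left of the primary screen, 65535,65535 = bottom-right).
    public static void Click(int x, int y)
    {
        var inputs = new[]
        {
            new INPUT { type = 0, mi = new MOUSEINPUT { dx = x, dy = y,
                dwFlags = MOUSEEVENTF_MOVE | MOUSEEVENTF_ABSOLUTE } },
            new INPUT { type = 0, mi = new MOUSEINPUT { dwFlags = MOUSEEVENTF_LEFTDOWN } },
            new INPUT { type = 0, mi = new MOUSEINPUT { dwFlags = MOUSEEVENTF_LEFTUP } }
        };
        SendInput((uint)inputs.Length, inputs, Marshal.SizeOf(typeof(INPUT)));
    }
}

The loop that parses your Bluetooth packets would then scale each incoming coordinate to that range and call MouseInjector.Click (or split it into separate move/down/up calls to support drags).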
Quick question: if I write code that responds to MouseLeftButtonDown, such as for the pressing of an image, and I leave that code as it is when I ship the app, will it translate directly to the user pressing their finger down on the same spot and thus fire the handler?
Or do I have to change MouseLeftButtonDown to a gesture listener for this to translate, so that MouseLeftButtonDown is only used on non-touch monitors when coding and testing things?
Thanks!
As Matt corrected in the comments, the MouseLeftButtonDown event is not the same as an image tap. However, the result is effectively the same: if your tapping code works in the emulator, it should also work on the device.
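A minimal sketch of the two ways to wire it up (Windows Phone Silverlight; TapImage is a hypothetical Image element declared in the page's XAML):

using System.Windows;
using Microsoft.Phone.Controls;

public partial class MainPage : PhoneApplicationPage
{
    public MainPage()
    {
        InitializeComponent();

        // Raised for mouse clicks in the emulator and for finger presses on a device.
        TapImage.MouseLeftButtonDown += (s, e) => HandlePress();

        // The gesture-oriented equivalent (available from Windows Phone OS 7.1).
        TapImage.Tap += (s, e) => HandlePress();
    }

    // Hypothetical handler standing in for whatever the image press should do.
    private void HandlePress()
    {
        MessageBox.Show("Image pressed");
    }
}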
You should still try to get your app running on a device, though, as things can easily be overlooked in the emulator. For example, performance can drop on your phone, since it is likely to be a lot less powerful than your PC; if your app performs fine in the emulator on your PC, that doesn't necessarily mean you'll get the same speeds on the device.
I am looking for some advice/input on writing a device driver. I have never written one for Windows before, let alone for Bluetooth.
Can you recommend a book, website, or something similar to get me started? I have the Windows Driver Kit and the examples therein, but without somewhere to start I am dead in the water.
The specifics: my friend gave me his Apple Magic Mouse. I have a Windows 7 machine. With the mouse set up as just a generic HID device it works OK as a two-button mouse with no scroll; the motion is smooth and the acceleration is what you expect from a Windows mouse.
The mouse actually has a fairly good lpi resolution, making it pretty sensitive. There are Mac drivers available, extracted from Boot Camp, and they kind of work: the cursor will randomly freeze or stop responding to the mouse's movement, which is buffered, and then leap once whatever caused the stall stops. As an added touch, the Mac drivers make the cursor move like it would on a Mac, with that logarithmic acceleration that will completely throw off any Windows user. With the drivers you get vertical and horizontal scroll, but that's it. There is no multi-touch functionality, and you can't change any of the behaviors, like acceleration. There is no MultiClutch for Windows, or other third-party software for a multi-touch mouse.
So I figured I would endeavor to make my own drivers and multi-touch functionality in Windows for this thing. I know Apple will never support it properly under Windows, and Microsoft won't write their own drivers until there is a reason to.
Also, if anyone knows of anyone else trying to do the same or similar things, please point me to them.