Input.Touch not working on macOS

I'm a total beginner in Unity and created a function that simply sets the text of a text element when a touch is detected like this:
// Update is called once per frame
void Update () {
    if (didTap ()) {
        print ("did tap!");
        tapText.text = "TAP!";
    }
}

private bool didTap () {
    print ("checking didtap");
    if (Input.touchCount > 0) {
        Touch initialTouch = Input.GetTouch (0);
        return initialTouch.phase == TouchPhase.Began;
    }
    return false;
}
When I run this on an actual device (an Android phone) it works perfectly, but when I 'click' in the Unity Editor while the game is running, nothing happens. I'm running macOS 10.12.2. Is there anything I need to configure to have the mouse mimic touch events?

You can mirror Unity's play mode window onto your mobile device by using the Unity Remote app. This lets you interact with your phone's features while still running the game in Unity on your computer.
Unity Remote requires some setup in Unity along with downloading the app. Consider watching this video: Unity Remote Setup Tutorial
Alternatively, see Hellium's answer to this question, in conjunction with reading mouse clicks, to compile code conditionally based on the device in use.

There's an odd difference between "touch" input and "mouse" input in Unity. Input.touchCount will always return 0 while you're running in the Editor, but Input.GetMouseButtonDown(0) works both in the Editor and for the first touch of one or more touches in a mobile build.
You have two options for solving this in code (your other option is the Unity Remote approach ryemoss mentioned):
If you're only ever using one touch, use Input.GetMouseButtonDown(0) and Input.GetMouseButtonUp(0).
Use #if UNITY_IOS, #if UNITY_ANDROID and #if UNITY_EDITOR directives to write separate input readers for each platform and have them all call into the same functionality; a sketch of this follows below.
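For illustration, here's a minimal sketch of the second option, reusing the tapText field from the question (HandleTap is a made-up helper name, not something Unity provides):

// Sketch: one input reader per platform, all funnelled into the same handler.
void Update () {
#if UNITY_EDITOR || UNITY_STANDALONE
    // In the Editor (and desktop builds) read the mouse instead of touches.
    if (Input.GetMouseButtonDown (0)) {
        HandleTap ();
    }
#elif UNITY_IOS || UNITY_ANDROID
    // On device, read the first touch directly.
    if (Input.touchCount > 0 && Input.GetTouch (0).phase == TouchPhase.Began) {
        HandleTap ();
    }
#endif
}

void HandleTap () {
    tapText.text = "TAP!";
}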

Related

How to add bezel to WearOS simulator

Samsung started using Wear OS in their latest smartwatches, e.g. in the Galaxy Watch 4, and I need to test bezel functionality since that model does have it. However, I didn't find any Wear OS devices in the AVD manager that support a bezel.
I've also tried creating a new hardware profile, but didn't find a bezel option there either. All the navigation options it offers are shown below; none of them is related to a bezel.
I've also tried to find a skin for the Galaxy Watch 4, but with no luck so far. The code that doesn't work, according to a Galaxy Watch 4 owner, is below. You can suggest how to fix the code, of course, but I still want to know how to test it without buying a watch.
view.setOnGenericMotionListener { v, ev ->
    if (ev.action == MotionEvent.ACTION_SCROLL &&
        ev.isFromSource(InputDeviceCompat.SOURCE_ROTARY_ENCODER)
    ) {
        val delta = -ev.getAxisValue(MotionEventCompat.AXIS_SCROLL) *
            ViewConfigurationCompat.getScaledVerticalScrollFactor(
                ViewConfiguration.get(this), this
            )
        if (Math.abs(delta) > 2f) {
            val np = if (delta > 0) Util.nextAccount(mAccount) else Util.prevAccount(mAccount)
            Util.d(TAG, mAccount + np.toString())
            switchAccount(np)
        }
        true
    } else {
        false
    }
}
nextAccount and prevAccount are custom functions that switch the view. According to a user, neither of them is ever called.
Here is a Tizen Studio emulator with a bezel that can be rotated by dragging the white dot:
I've finally fixed the problem. In the layout of the view that is supposed to process the rotary events, I added a requestFocus tag:
<ScrollView
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:fillViewport="true"
    android:scrollbars="vertical"
    android:fadeScrollbars="false"
    android:id="@+id/token_scroll"
    >

    <requestFocus />
    ...
To test the bezel, I used the menu on the right of the emulator, as shown in the picture below. Bezel events are processed correctly, at least in the emulator. I'll let you know whether it works on a real Galaxy Watch 4 when I hear from the user.
UPDATE
A Galaxy Watch 4 user has just confirmed that the bezel works after the fix. This confirms that both the fix and the testing method were correct and achieved their goals.

Volume buttons have no effect when playing through the earpiece

In my Xamarin.Forms Android app I'm sending audio through the earpiece instead of the normal speaker. I'm doing something like:
myMediaPlayer.SetAudioAttributes(new AudioAttributes.Builder().SetLegacyStreamType(Stream.VoiceCall).Build());
The deprecated version of the above is
myMediaPlayer.SetAudioStreamType(Stream.VoiceCall);
Either way, this seems to work (though, if there is a better way, please let me know).
HOWEVER, I cannot control the volume. The sound correctly comes out of the earpiece only, but it is at a constant volume, ignoring my volume key presses (though the volume meter is displayed on screen, going up and down as I press the buttons). ... (For clarification, I'm not trying to control the volume programmatically; I simply want the device to adjust the volume as one would expect.)
Any help?
UPDATE
I've also tried this code:
myMediaPlayer.SetAudioAttributes(new AudioAttributes.Builder().SetContentType(AudioContentType.Music).SetUsage(AudioUsageKind.VoiceCommunication).Build());
I have two goals:
Play only out of the ear piece (or headphones if attached)
Be able to control the volume (which I thought would be a given)
UPDATE with Code Sample
The following will play through the earpiece, but not change volume.
public void Play(string url)
{
    var myMediaPlayer = new MediaPlayer();
    myMediaPlayer.SetDataSource(url);

    if (Build.VERSION.SdkInt >= BuildVersionCodes.Lollipop)
        myMediaPlayer.SetAudioAttributes(new AudioAttributes.Builder()
            .SetContentType(AudioContentType.Music)
            .SetUsage(AudioUsageKind.VoiceCommunication)
            .Build());
    else
        myMediaPlayer.SetAudioStreamType(Stream.VoiceCall);

    myMediaPlayer.Prepare();
    myMediaPlayer.Start();
}
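One thing that may be worth checking (an untested sketch, not a confirmed fix): Android routes the hardware volume keys to whichever audio stream the foreground activity registers via its VolumeControlStream property (setVolumeControlStream in Java), so pointing it at the voice-call stream from the activity that hosts playback might make the keys take effect. PlaybackActivity is a placeholder name for that activity.

using Android.App;
using Android.Media;
using Android.OS;

[Activity]
public class PlaybackActivity : Activity // placeholder name for the activity hosting playback
{
    protected override void OnCreate(Bundle savedInstanceState)
    {
        base.OnCreate(savedInstanceState);

        // Ask Android to send the hardware volume keys to the voice-call
        // stream, which is the stream the MediaPlayer above is playing on.
        VolumeControlStream = Stream.VoiceCall;
    }
}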

Input.GetAxis("Mouse X") not working on some Windows configurations

public float GetAxis()
{
    if (inputDevice == InputDevice.MouseKeyboard)
    {
        return Input.GetAxis(this.buttonName);
    }
    return 0f; // handling of other input devices omitted from this snippet
}
This code works perfectly on my Windows 7 x64 PC. My project's Input settings are the ordinary defaults:
Input settings:
But I watched some videos on YouTube of people playing my game, and they can't use the mouse in it. It looks like Input.GetAxis("Mouse X") and Input.GetAxis("Mouse Y") are not returning proper values for them, so they can't control the camera in the game.
Other input is working fine for them.
My Unity version is 5.6.0f3, and I can't upgrade to a current version because the game's code is too complex.
How can I troubleshoot and fix this? I have only built for Windows x86 and x64, not for other platforms.
The input object is constructed like this:
public GenericInput rotateCameraXInput = new GenericInput("Mouse X", "RightAnalogHorizontal");
To read the mouse movement delta, I run this method in LateUpdate():
protected virtual void CameraInput()
{
    if (tpCamera == null || cc.lockCamera)
        return;

    var Y = rotateCameraYInput.GetAxis();
    var X = rotateCameraXInput.GetAxis();
}
I've come across this issue myself while using an RDP session or some kind of remote viewer such as TeamViewer. Mouse X and Mouse Y read the output directly from a device; if the device is not directly plugged into the machine that the player is being run on, the inputs will not be properly retrieved. I'm not sure if this is the case for you, but it is the only instance I can think of where these are not picked up.
Maybe you should add a bit of code that gets the mouse position each frame and outputs the difference; this would bypass the Mouse X/Y axes anyway. For example:
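A rough sketch of that idea (how you scale and consume the delta is up to your camera code; the class name is made up):

using UnityEngine;

public class MouseDeltaReader : MonoBehaviour
{
    Vector3 lastMousePosition;

    void Start()
    {
        lastMousePosition = Input.mousePosition;
    }

    void LateUpdate()
    {
        // Compute the per-frame mouse delta from the absolute cursor position
        // instead of relying on the Mouse X / Mouse Y axes.
        Vector3 delta = Input.mousePosition - lastMousePosition;
        lastMousePosition = Input.mousePosition;

        Debug.Log("Mouse delta: " + delta.x + ", " + delta.y);
    }
}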
Update: it is not a bug; I just missed something in my project.
I have a script that controls the speed of the mouse-controlled camera and a PlayerPrefs variable to change it, and under some conditions that variable was set to 0. In my case it had already been written to the registry, so on my PC everything was working fine.
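In hindsight, clamping the value when reading it would have prevented this. A rough sketch (the key name and defaults are placeholders, not the actual project code):

using UnityEngine;

public class CameraSpeedSettings : MonoBehaviour
{
    public float cameraSpeed = 1f;

    void Awake()
    {
        // Read the speed from PlayerPrefs but clamp it to a sensible minimum,
        // so a stored 0 can never freeze the mouse-controlled camera.
        cameraSpeed = Mathf.Max(PlayerPrefs.GetFloat("CameraSpeed", 1f), 0.01f);
    }
}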
Maybe I should delete this question, because it does not provide enough data.
I found this thread on the Unity forum; some people there encountered the same issue on real Windows PCs, with different Unity versions and different mouse drivers.
This is an old Unity hardware compatibility bug. It looks like it can't be fixed other than by upgrading Unity or using a different input system; a sketch of the latter is below.
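For reference, reading the mouse delta through the newer Input System package looks roughly like this (an untested sketch; it requires the com.unity.inputsystem package and a much newer Unity than 5.6, so it only applies if the project is ever upgraded):

using UnityEngine;
using UnityEngine.InputSystem;

public class NewInputSystemMouseDelta : MonoBehaviour
{
    void Update()
    {
        if (Mouse.current == null)
            return; // no mouse attached

        // delta is the per-frame mouse movement reported by the Input System.
        Vector2 delta = Mouse.current.delta.ReadValue();
        Debug.Log("Mouse delta: " + delta);
    }
}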

Unity UI not working properly ONLY on Windows

I have been working on figuring out what is going on with my game's UI for at least two days now, with no progress.
Note that this is a mobile game, but I was asked to build for Windows for visualization and presentation purposes.
The problem is that when I run my game in the Unity Editor, or on the Android, iOS and Mac platforms, the UI works just fine, but when I run the game on Windows the UI works fine UNTIL I load a specific scene.
This specific scene is a loading screen (between the main menu and a level). When the level finishes async loading, a method called MoveObjects is called in a script in the loading screen to move some objects that were spawned in the loading-screen scene into the level scene (this is not the issue, though, since I already tried without this method and the UI problem persists).
Once the logic of this MoveObjects method is done, a start button is enabled in the loading screen for the player to click and start playing (I did try moving the start button to the level scene, since it not being a child of the currently active scene could have been the issue, but the problem still persists). It is at this point that the UI is partially broken. What I mean by this is that I can see buttons (and some other UI elements, like a scrollbar) changing color/state when the mouse moves over them, but I cannot click on them anymore (the button won't even change to the pressed state).
Also note that I tried creating a development build to see if there were any errors in the console, and I noticed that this problem also affects the old UI system, so I was not able to interact with the development console anymore.
Also note that if I grab and drag the scrollbar before this issue appears, and I keep holding down on the scrollbar until it happens, the mouse gets stuck on the scrollbar, meaning that I cannot interact with the UI anymore, but the scrollbar will still move with the mouse.
I have already checked that these things are not the source of the problem:
Missing EventSystem, GraphicRaycaster or InputModule.
Another UI element blocking the rest of the UI.
Canvas is Screen Space - Overlay so there is no need for a camera reference.
I only have one EventSystem.
Time.timeScale is 1.
I am not sure what else I could try, so if anyone has any suggestions, I would appreciate it. Thanks.
P.S.: I am sorry to say that I cannot share any code, visual material or examples due to confidentiality.
A major source for a non-working UI for me has always been another (invisible) UI object blocking the raycast (a transparent Image, or a large Text object with raycast on).
Here's a snippet I put together based on info found elsewhere; I often use it to track the objects that are masking the raycast in complex UI situations. Place the component on a Text object, and make sure it's at least a few lines tall, as the results will be displayed one under another.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

[RequireComponent(typeof(Text))]
public class DebugShowUnderCursor : MonoBehaviour
{
    Text text;
    EventSystem eventSystem;
    List<RaycastResult> list;

    void Start()
    {
        eventSystem = EventSystem.current;
        text = GetComponent<Text>();
        text.raycastTarget = false;
    }

    public List<RaycastResult> RaycastMouse()
    {
        PointerEventData pointerData = new PointerEventData(EventSystem.current) { pointerId = -1 };
        pointerData.position = Input.mousePosition;
        List<RaycastResult> results = new List<RaycastResult>();
        EventSystem.current.RaycastAll(pointerData, results);
        return results;
    }

    void Update()
    {
        list = RaycastMouse();
        string objects = "";
        foreach (RaycastResult result in list)
            objects += result.gameObject.name + "\n";
        text.text = objects;
    }
}

Handsontable edit on mobile device

I know that Handsontable is not mobile friendly, but is there a workaround so that we can edit on mobile devices with the newest version?
Regards
I use Handsontable version 0.25.1, but I imagine the situation is still the same at the very latest release.
My experience is only with an iPad (iOS 9); I cannot speak for all mobile devices.
I found the rendering acceptable, but if you tap a cell to edit it, nothing happens, which is rather limiting! I made just two small changes to rectify this:
Tracing all of this down in handsontable.full.js, the logic in onCellMouseDown() tests for event.button === 0 (i.e. the left mouse button) before setting up to recognise the double click that activates the mobile text editor. On a touch device, they explicitly call their onMouseDown/Move/Up() from their onTouchStart/Move/End(), passing the mouse handlers the event structure received for the touch event. However, touch events' event structure does not have a button member, so that is undefined, causing bad behaviour.
Directly setting the passed-in event.button = 0 before handing off to the mouse event handlers solves this. Insert the line:
event.button = 0; // set as left mouse button
into onTouchStart() [prior to call to onMouseDown(event), around line 1326] & onTouchEnd() [prior to call to onMouseUp(event), around line 1368].
Now tapping a cell correctly allows editing. It brings up their MobileTextEditor, which I did not like for a number of reasons, including the fact that it often appears behind the on-screen keyboard, so the user does not even know it is there! I changed their Handsontable.TextCell [around line 4346] so that the editor: line now reads:
editor: (isMobileBrowser() && Handsontable.useMobileEditor) ? getEditorConstructor('mobile') : getEditorConstructor('text'), // only use mobile editor if explicitly called for
So it uses the standard in-place text editor, which I prefer, unless you explicitly call hot.updateSettings({useMobileEditor: true}).
