Input.GetAxis("Mouse X") not working on some Windows configurations

public float GetAxis()
{
    if (inputDevice == InputDevice.MouseKeyboard)
    {
        return Input.GetAxis(this.buttonName);
    }
    return 0f; // fallback so every code path returns a value
}
This code works perfectly on my Windows 7 x64 PC, and my project's Input settings are the ordinary defaults (Input Manager screenshot omitted).
But I watched some videos on YouTube of people playing my game, and they can't use the mouse in it. It looks like Input.GetAxis("Mouse X") and Input.GetAxis("Mouse Y") are not returning proper values for them, so they can't control the camera in game.
Other input works fine for them.
My Unity version is 5.6.0f3 and I can't upgrade to the current version because the game's code is too complex.
How do I troubleshoot and fix this? I have only built for Windows x86 and x64, not for other platforms.
The input object was constructed as:
public GenericInput rotateCameraXInput = new GenericInput("Mouse X", "RightAnalogHorizontal");
To read the mouse movement delta, I run this method in LateUpdate():
protected virtual void CameraInput()
{
    if (tpCamera == null || cc.lockCamera)
        return;

    var Y = rotateCameraYInput.GetAxis();
    var X = rotateCameraXInput.GetAxis();
}

I've come across this issue myself while using an RDP session or a remote viewer such as TeamViewer. Mouse X and Mouse Y read output directly from a device; if the device is not plugged directly into the machine the player is being run on, the inputs will not be retrieved properly. I'm not sure whether this is the case for you, but it's the only scenario I can think of where these axes aren't picked up.
Maybe you should add a bit of code that records the mouse position each frame and outputs the difference; that would bypass the Mouse X/Y axes entirely.
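A minimal sketch of that idea (untested; the field name is illustrative) using Unity's Input.mousePosition:

```csharp
// Sketch: compute the mouse delta manually each frame, bypassing the
// "Mouse X"/"Mouse Y" axes. lastMousePosition is an illustrative field name.
private Vector3 lastMousePosition;

void LateUpdate()
{
    Vector3 delta = Input.mousePosition - lastMousePosition;
    lastMousePosition = Input.mousePosition;
    // delta.x and delta.y now play the role of Mouse X / Mouse Y.
}
```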

Update: it is not a bug; I had just missed something in my project.
I have a script that controls the speed of the mouse-controlled camera, with a PlayerPrefs variable to adjust it, and under some conditions that variable was set to 0. On my own PC the value had already been saved to the registry, so everything worked fine there.
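For anyone hitting the same trap, a defensive read of the stored value might look like this (sketch; the PlayerPrefs key name is hypothetical):

```csharp
// Sketch: never let a stored sensitivity of 0 silently disable the camera.
// "MouseSensitivity" is a hypothetical key name.
float sensitivity = PlayerPrefs.GetFloat("MouseSensitivity", 1f);
if (sensitivity <= 0f)
    sensitivity = 1f; // fall back to a sane default
```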
Maybe I should delete this question, since it didn't provide enough data.
I also found this thread on the Unity forum where people encountered the same issue on real Windows PCs, with different Unity versions and different mouse drivers.
That one is an old Unity hardware-compatibility bug; it looks like it can't be fixed other than by upgrading Unity or switching to a different input system.

Related

Volume buttons have no effect when playing through the earpiece

In my Xamarin Forms Android app I'm sending audio through the earpiece instead of the normal speaker. I'm doing something like:
myMediaPlayer.SetAudioAttributes(new AudioAttributes.Builder().SetLegacyStreamType(Stream.VoiceCall).Build());
The deprecated version of the above is
myMediaPlayer.SetAudioStreamType(Stream.VoiceCall);
Either way, this seems to work (though, if there is a better way, please let me know).
HOWEVER, I cannot control the volume. The sound correctly comes out of the ear speaker only, but it plays at a constant volume, ignoring my volume key presses (though the on-screen volume meter goes up and down as I press the buttons). ... (For clarification: I'm not trying to control the volume programmatically; I simply want the device to adjust the volume as one would expect.)
Any help?
UPDATE
I've also tried this code:
myMediaPlayer.SetAudioAttributes(new AudioAttributes.Builder().SetContentType(AudioContentType.Music).SetUsage(AudioUsageKind.VoiceCommunication).Build());
I have two goals:
Play only out of the ear piece (or headphones if attached)
Be able to control the volume (which I thought would be a given)
UPDATE with Code Sample
The following will play through the earpiece, but not change volume.
public void Play(string url)
{
    var myMediaPlayer = new MediaPlayer();
    myMediaPlayer.SetDataSource(url);

    if (Build.VERSION.SdkInt >= BuildVersionCodes.Lollipop)
        myMediaPlayer.SetAudioAttributes(new AudioAttributes.Builder()
            .SetContentType(AudioContentType.Music)
            .SetUsage(AudioUsageKind.VoiceCommunication)
            .Build());
    else
        myMediaPlayer.SetAudioStreamType(Stream.VoiceCall);

    myMediaPlayer.Prepare();
    myMediaPlayer.Start();
}
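One thing worth trying (a sketch, not verified against this exact setup): by default the hardware volume keys adjust the music stream, not the voice-call stream, so the hosting Activity can be told which stream the keys should control via Android's setVolumeControlStream, exposed in Xamarin as the VolumeControlStream property:

```csharp
// In the Activity that hosts playback (sketch, untested): while this
// Activity has focus, the hardware volume keys adjust the voice-call stream.
protected override void OnCreate(Bundle savedInstanceState)
{
    base.OnCreate(savedInstanceState);
    VolumeControlStream = Android.Media.Stream.VoiceCall;
}
```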

Input.Touch not working on OSX

I'm a total beginner in Unity and created a function that simply sets the text of a text element when a touch is detected like this:
// Update is called once per frame
void Update()
{
    if (didTap())
    {
        print("did tap!");
        tapText.text = "TAP!";
    }
}

private bool didTap()
{
    print("checking didtap");
    if (Input.touchCount > 0)
    {
        Touch initialTouch = Input.GetTouch(0);
        return initialTouch.phase == TouchPhase.Began;
    }
    return false;
}
When I run this on an actual device (an Android phone) it works perfectly, but when I 'click' in the Unity editor while the game is running, nothing happens. I'm running OS X 10.12.2. Is there anything I need to configure to have the mouse mimic touch events?
You can mirror Unity's play-mode view onto your mobile device by using the Unity Remote app. This lets you interact with your phone's features while still running Unity on your computer.
Unity Remote requires some setup in Unity along with downloading the app; consider watching this video: Unity Remote Setup Tutorial.
Alternatively, see the answer from Hellium in this question, in conjunction with getting the mouse clicks, to only compile code based on the device used.
There's an odd difference between "touch" input and "mouse" input in Unity. Input.touchCount will always return 0 while you're running in the Editor but Input.GetMouseButtonDown(0) works for both the Editor and the first touch of one or more touches while on a mobile build.
You have two options to solve this issue via code (your other option is the remote ryemoss mentioned):
If you're only ever using one touch, use Input.GetMouseButtonDown(0) and Input.GetMouseButtonUp(0).
Use #if UNITY_IOS, #if UNITY_ANDROID and #if UNITY_EDITOR directives to write separate input readers for your platforms, and then have them all call into the same functionality.
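Those two options can be combined into a single method; a sketch (method name is illustrative) might look like:

```csharp
// Sketch: one input reader per platform, unified behind a single method.
private bool TapBegan()
{
#if UNITY_EDITOR
    // In the Editor, the mouse stands in for touch.
    return Input.GetMouseButtonDown(0);
#elif UNITY_IOS || UNITY_ANDROID
    return Input.touchCount > 0 &&
           Input.GetTouch(0).phase == TouchPhase.Began;
#else
    return Input.GetMouseButtonDown(0);
#endif
}
```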

How to detect which screen is the OSVR headset?

I have a WPF+SharpDX Windows application that displays to the OSVR HDK via a fullscreen window on the screen that is the HDK. This setup works well, but it requires users to state which screen the HDK is on.
I would like to have that automatically detected, but haven't seen anything in the API on which screen is the headset.
Currently I render in a window:
var bounds = dxgiDevice.Adapter.Outputs[_selectedOutput].Description.DesktopBounds;
form.DesktopBounds = new System.Drawing.Rectangle(
    bounds.X, bounds.Y, bounds.Width, bounds.Height);
And _selectedOutput is the value I'm looking for.
I don't support direct mode at this time and I'm using Managed-OSVR. The application will run on Windows 8/8.1/10.
It's been a while since I coded anything for OSVR, but here's from what I remember:
If you're running in extended mode, the OSVR is treated as a regular display. You can rearrange it like any other screen, and the output location can be configured in the OSVR config file.
I used the following (Java) to retrieve the position and size to set up my window:
osvrContext.getRenderManagerConfig().getXPosition()
osvrContext.getRenderManagerConfig().getYPosition()
osvrContext.getDisplayParameters().getResolution(0).getWidth()
osvrContext.getDisplayParameters().getResolution(0).getHeight()
To clarify: I don't know if you can retrieve the id of the display in extended mode. From what I know, it's only defined as a position and size on the desktop.
I hope that helps you somewhat.

SDL2 relative mouse mode reporting motion when mouse has not been moved

I'm new to SDL2, so pardon any ignorance, but I am seeing strange results when using relative mouse mode in SDL 2.0.3. When I call SDL_SetRelativeMouseMode(SDL_TRUE), the cursor is hidden as expected. Inside the event loop, I check for windowEvent.type == SDL_MOUSEMOTION and then read windowEvent.motion.xrel and windowEvent.motion.yrel. But xrel and yrel report values from -4 to 4 when the mouse is not even moving! Furthermore, actually moving my mouse does not seem to correlate at all with the xrel and yrel values being reported.
Should I be doing this differently?
I had the exact same problem on my computer, but using Uint32 SDL_GetRelativeMouseState(int* x, int* y) instead works fine.
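For illustration, a minimal sketch of polling that function once per frame instead of relying on SDL_MOUSEMOTION events (function name is illustrative):

```cpp
#include <SDL.h>

// Sketch: poll relative motion once per frame rather than reading
// xrel/yrel from SDL_MOUSEMOTION events.
void PollMouse()
{
    int dx = 0, dy = 0;
    Uint32 buttons = SDL_GetRelativeMouseState(&dx, &dy);
    // dx/dy hold the accumulated relative motion since the previous call;
    // buttons is the current mouse-button state mask.
    (void)buttons;
}
```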

Range Slider for Cocoa

I was wondering if there is a way to create a range slider for an OS X app in Cocoa. Basically it's just a regular slider with two knobs instead of one, so you can set a range (for a graph, for example). I've seen a lot of custom range sliders for iOS (such as http://www.cocoacontrols.com/platforms/ios/controls/rangeslider), but none for OS X.
The closest thing I could find was SMDoubleSlider, which doesn't seem to work at all in Xcode 4.
Warning: I just learned the hard way that SMDoubleSlider uses private APIs that will not pass App Review. Don't use this class if you intend to distribute your app through the Mac App Store.
Of course it's possible, almost anything is, but you would have to do all the drawing and handling of the slider movement yourself.
I'm not sure what you mean about SMDoubleSlider not working in Xcode 4 -- I tried it out, and it worked fine for me. I didn't try the IB plugin; I just imported the four files (SMDoubleSlider.h and .m, plus the slider-cell files) and alloc/init'ed one to add it to my view. I was able to get and set the values of the high and low sliders.
If you're going to use it, just make sure you abide by the license terms; it is open source, but not public domain.
