Issue with hand controls and thumb controls in A-Frame 1.0.4; similar code works fine in A-Frame 1.0.0

In A-Frame 1.0.4 I am having an issue on the Oculus Quest where the hand controls in my app only work if you enter VR, exit VR, and enter VR again. Also, the thumb controls do not work at all. However, the same code (with one tiny modification to how hand-controls is written) works fine in A-Frame 1.0.0.
Here is the code working fine on the Oculus Quest with A-Frame 1.0.0:
Project Page link: https://glitch.com/~keylime
Live App: https://keylime.glitch.me/
To verify that it's working, put on the Oculus Quest, enter VR, pull the thumbstick towards yourself to launch the teleport, and push the thumbstick away from yourself to trigger a click. To verify the click, point at the box in front of you with the raycaster's ray; when you initiate a click event while pointing at the box, the box will change color.
In the second example the code is almost identical, but when you first enter VR no hand controls work. To get them working you have to exit VR and enter VR again. Once you do, the hand controls work, which means you can pull the trigger on the Oculus Touch controller and it will change the color of the cube if you are pointing at it with your ray. However, at this point the thumb controls are still not working: if you pull the thumbstick toward you, or push it away from you, nothing happens.
Here is the code not working properly on the Oculus Quest with A-Frame 1.0.4:
Project Page link: https://glitch.com/~keynine
Live App: https://keynine.glitch.me

There are redundant controls components: laser-controls, hand-controls, mylasercontrols, daydream-controls. Keep it simple with just two entities, one per hand. Also, the hand-controls API is:
<a-entity hand-controls="hand: left"></a-entity>
<a-entity hand-controls="hand: right"></a-entity>
See the hand-controls docs.

Related

Xamarin Forms buttons stop receiving mouse clicks after clicking on SkiaSharp CanvasView on iOS

I use a SkiaSharp canvas to draw the main game screen, and there are various Xamarin.Forms Buttons around the UI. This all works fine when used directly on an iPhone or iPad with a finger. However, when I connect a mouse (e.g., through a MacBook or otherwise), after mouse-clicking on the SkiaSharp canvas the buttons only work with about a 10% chance (i.e., about 90% of the time they do not receive the mouse click events). The SkiaSharp canvas itself works just fine.
If I bring up the iOS app launch menu from the bottom (which probably somehow temporarily exits mouse navigation in the app), the buttons start working with the mouse again. But if I click the SkiaSharp canvas with the mouse again, the buttons have a high chance of becoming unresponsive again. If I switch to using a finger, everything works fine (even if mouse clicks were not being registered immediately before). However, mouse clicks are still not registered after touching with a finger, so finger touches do not reset the issue with the mouse (but bringing up the menu from the bottom does).
We found this bug by testing the iOS game on a MacBook Pro (iOS apps recently became available in the App Store there), but the same issue also occurs directly with an iPad and mouse combination. It seems to be some sort of interaction issue between using a mouse (on iPad or on MacBook Pro), the SkiaSharp canvas, and Xamarin.Forms buttons.
Does anyone know what the root cause of the problem is, and what the workaround is?
Not an answer as such, but some more information about reproducing the issue: A simpler repro case may be this small project: https://github.com/jrc14/TraceMatching/ .
Don't worry too much about what it's doing, but note that you're meant to click in the grey Skia canvas in the middle to create 'targets' - and that after you've done that, mouse clicks are getting lost.
If you run it on a Mac, you'll see that, though the clicks get lost after you've clicked on the Skia canvas, they will start being received again if you click on something else (another app, or the Mac background).
(Further edit) After some noodling around I did find a workaround. If, once you've finished processing the touch action on the SKCanvasView, you reset its EnableTouchEvents property (i.e., set it to false, then back to true again), it seems that the clicks don't get lost any more.
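A minimal sketch of that workaround, assuming an SKCanvasView field named canvasView whose Touch event is wired to the handler below (the class and member names are illustrative, not from the original post):

using SkiaSharp.Views.Forms;
using Xamarin.Forms;

public partial class GamePage : ContentPage
{
    // Assumed: "canvasView" is an SKCanvasView declared elsewhere (e.g. in XAML)
    // with EnableTouchEvents="True" and Touch="OnCanvasTouch".
    void OnCanvasTouch(object sender, SKTouchEventArgs e)
    {
        // ...normal game touch processing goes here...
        e.Handled = true;

        // Workaround: toggle EnableTouchEvents off and back on once the touch
        // has been processed, so subsequent mouse clicks reach the
        // Xamarin.Forms buttons again.
        canvasView.EnableTouchEvents = false;
        canvasView.EnableTouchEvents = true;
    }
}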

Some of my atlas animations are cropped and only showing half

All my animations were working fine until I started adding a boss fight in level 12.
I got the boss fight working, but the player ship, which I wasn't working on, was only showing its back half, as if it had been cropped! Yet it still animated as normal.
When I tried a clean build and tested level 1, the player ship was still cropped, along with some other animated nodes, like the weapon power-up symbols.
All the other animations are working ok.
Any ideas what could cause this?
I finally worked out what was wrong and how to fix it. I had made lots of atlas files, but an Xcode update didn't like the way I had made them or the location I had put them in. I deleted all the atlas files and recreated them inside Assets.xcassets, using the menu option for creating a sprite atlas.
Now all my animations work as they should.

VR RPG-like dialogue system

So I'm trying to implement a dialogue-like system using Google Cardboard VR, and I have been having trouble working out how to implement it. I tried using Fungus, but the dialogue boxes that show up are not exactly VR-ready.
I got references to the individual cameras (left and right) and set them as the parents of the script-generated dialogue boxes (duplicating the box onto both). I also tried changing the boxes' positions to sit inside each respective lens, but I still did not get the result I needed.
Is there a tutorial for Fungus in VR, or has someone worked on something similar? Is there any other asset (preferably free) that can help with what I am trying to do, or maybe a solution through Unity's native UI?
Keep in mind I need a solution for VR, not normal rendering.
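Not from the question, but for reference: with Unity's native UI, the usual VR approach is a single world-space canvas kept in front of the main camera, rather than UI duplicated under each eye camera. A minimal sketch, assuming a Canvas whose Render Mode is set to World Space; all names here are illustrative:

using UnityEngine;
using UnityEngine.UI;

// Assumed to be attached to a Canvas set to World Space render mode.
public class DialogueBillboard : MonoBehaviour
{
    public Camera vrCamera;        // the single centre camera; both eyes are rendered from it
    public float distance = 2f;    // metres in front of the viewer
    public Text dialogueText;      // UI Text element that shows the current line

    public void Say(string line)
    {
        dialogueText.text = line;
    }

    void LateUpdate()
    {
        // Keep the panel a fixed distance in front of the camera, facing the viewer.
        Transform cam = vrCamera.transform;
        transform.position = cam.position + cam.forward * distance;
        transform.rotation = cam.rotation;
    }
}

Because a world-space canvas is drawn as part of the scene, both eyes see it correctly without parenting a copy to each eye camera.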

Unity: GUI button with event trigger as virtual control pad on Android phone

Let's say I have two GUI buttons with an EventTrigger as virtual keys. The expectation is that whenever a button is pressed, the camera rotates until the button is released.
At the beginning I used the pointer-down and pointer-up functions. It works, but it is extremely sensitive: the camera rotation didn't stop at the moment I released my finger. I solved this problem by using the drag function (whenever dragging is detected, the camera stops rotating, something like that).
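For reference, the pointer-down/pointer-up pattern described above looks roughly like the sketch below, written against Unity's handler interfaces rather than an EventTrigger component; all names and values are illustrative assumptions:

using UnityEngine;
using UnityEngine.EventSystems;

// Hold-to-rotate button: rotates the camera while pressed, stops on release.
public class RotateButton : MonoBehaviour, IPointerDownHandler, IPointerUpHandler
{
    public Transform cameraTransform;     // assumed reference to the camera to rotate
    public float degreesPerSecond = 60f;  // rotation speed while held

    bool held;

    public void OnPointerDown(PointerEventData eventData) { held = true; }
    public void OnPointerUp(PointerEventData eventData)   { held = false; }

    void Update()
    {
        // Rotate only while the button is held down.
        if (held)
            cameraTransform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
    }
}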
However, there is still a bug that I couldn't solve: if I swipe across the button instead of touching and releasing it, the button never releases, and the camera just keeps rotating until I touch the button again. I've tried all the event trigger functions, such as pointer exit, end drag, etc.
I just want the touch input to work as reliably as the keyboard input.
This problem doesn't show up when I debug with Unity Remote, only when I build it onto my phone. So is it a hardware issue? (I'm using a Mi 3.)
Thanks for taking the time to read my broken English.
I'm so sorry that I've asked a stupid question, since I'm still kind of a beginner. The problem was that somehow the build didn't update when I installed it on my phone. It actually works fine.

Core Animation in Interface Builder

So, I'm working on a Mac app, and I'm trying to add a shadow to a button via Core Animation. I used the effects pane in Interface Builder, set the shadow and its color, and made sure to check the "Wants Core Animation Layer" checkbox. But when simulating the interface or building the app, there is no shadow. I would appreciate it if someone knows what's wrong.
Thanks!
Edit: I've tried several things, including cleaning the project and turning Core Animation off and on. Nothing fixes it.
I fixed it. I ended up going through every interface element in the tab view and turning off Core Animation on each one (I had some strange transparency issues going on). Then I went back, added the shadow, and it worked fine.
