Issues with animating UI at different aspect ratios in Unity - user-interface

I am struggling with animating the Unity UI. It's just a simple slide-up menu that I have animated via the anchored position. The problem comes when the aspect ratio is changed: the menu often ends up completely offscreen instead of stopping in the intended position.
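For reference, here is a minimal sketch of the kind of setup being described, sliding a panel by tweening RectTransform.anchoredPosition. The class name, offsets, and duration are assumptions; the fixed pixel targets are exactly the sort of values that can end up offscreen once the anchors resolve against a different aspect ratio.

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical sketch of a slide-up menu animated via anchoredPosition.
// Fixed pixel offsets like these are measured from the panel's anchors,
// so a target chosen for one resolution can land offscreen at another.
public class SlideUpMenu : MonoBehaviour
{
    [SerializeField] RectTransform panel;
    [SerializeField] Vector2 hiddenPosition = new Vector2(0f, -600f); // parked below the screen
    [SerializeField] Vector2 shownPosition  = Vector2.zero;           // intended resting spot
    [SerializeField] float duration = 0.25f;

    public void Show() => StartCoroutine(Slide(hiddenPosition, shownPosition));
    public void Hide() => StartCoroutine(Slide(shownPosition, hiddenPosition));

    IEnumerator Slide(Vector2 from, Vector2 to)
    {
        for (float t = 0f; t < duration; t += Time.unscaledDeltaTime)
        {
            panel.anchoredPosition = Vector2.Lerp(from, to, t / duration);
            yield return null;
        }
        panel.anchoredPosition = to;
    }
}
```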

Related

UI Elements not affected by light

I'm trying to make a menu screen in which all the UI elements (buttons, text...) are completely dark and by touching the screen you create a fire (or just an area light) that makes the UI elements visible.
Sort of like this
I read that the default shader for the UI elements isn't affected by light, but I can't seem to change it.
How do I go about doing this?
The UI elements use an unlit shader by default and are also rendered directly in clip space, so you'll need to do two things: first, put a lit shader on the elements (the Unity Standard shader should do fine); then change the Canvas render mode to World Space. With the canvas in world space you can move it around as if it were a sprite. I would also recommend creating a second, higher-priority camera for the UI with culling turned off. With the canvas in view of the UI camera, you should be able to place a light source near it and see the resulting lighting on the UI.
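As a rough illustration of those two steps, here is a hedged sketch; the class and field names are made up, and the lit material is assumed to use the Standard shader (any lit shader should behave similarly). It switches the canvas to world space and swaps out the default unlit UI material on its Images.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical setup script: puts the canvas into world space and assigns
// a lit material to its Images so a nearby Light affects them.
public class LitUICanvasSetup : MonoBehaviour
{
    [SerializeField] Canvas canvas;          // the menu canvas
    [SerializeField] Material litUIMaterial; // e.g. a material using the Standard shader

    void Awake()
    {
        canvas.renderMode = RenderMode.WorldSpace;

        // Replace the default (unlit) UI material on every Image under the canvas.
        foreach (var image in canvas.GetComponentsInChildren<Image>())
            image.material = litUIMaterial;
    }
}
```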

Animated Sprites blocked by Canvas in Unity

I have managed to animate a series of sprites by dragging a group of sprites into the Hierarchy. However, when I enabled the Canvas, the Canvas was blocking all the sprites. What should I do to display the animation on the Canvas? I have tried adjusting the layers and camera modes, but to no avail.
I'm not sure if I understood your question correctly, but I gather that you want your sprites to be displayed on top of the UI. You can use two cameras and adjust their depths so that one of them renders the animation layer (make this one the top camera) and the other renders the UI layer.
Use Culling Masks in Camera
How to use Multiple Cameras
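A minimal sketch of the overlay camera, assuming the animated sprites live on a layer called "SpriteAnimation" (the layer name and depth value are placeholders) and that the UI is rendered by its own camera (Screen Space - Camera or World Space) rather than Screen Space - Overlay:

```csharp
using UnityEngine;

// Attach to the camera that should draw the sprite animation on top of the UI.
// It clears only the depth buffer, renders just the "SpriteAnimation" layer,
// and uses a higher depth so it draws after (on top of) the UI camera.
[RequireComponent(typeof(Camera))]
public class AnimationOverlayCamera : MonoBehaviour
{
    void Awake()
    {
        var cam = GetComponent<Camera>();
        cam.clearFlags = CameraClearFlags.Depth;
        cam.cullingMask = LayerMask.GetMask("SpriteAnimation");
        cam.depth = 1f; // higher than the UI camera's depth
    }
}
```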

Responsive widths, not heights, in threejs

I have noticed that in three.js, every example of sizing an object relative to its container seems to be based only on the height of the canvas/window it responds to. This works well for worlds where the canvas is the only thing on the page. However, in a new project we are using overlaid divs as part of the design, and as the window gets thinner we would like to be able to scale down the main object in our scene. I have linked to a basic example and taken two screenshots of it. I am wondering if anyone has come up with a good solution for making the width responsive as well.
https://threejs.org/examples/#webgl_geometry_cube

CreateJS Performance issue when rendering in a canvas that is displayed on mobile

I have a very specific issue. I am doing a VR demo with three.js where I want to display 2D data. The data to be displayed is dynamic (text info) and needs to be animated.
Animate CC, providing a nice suite of tools, is an easy choice for this. With three.js, the way I found to add some 2D animation to the world was to create a plane and give it a texture from a canvas I created, which I update on rAF. No problem so far.
The canvas I'm rendering to is also the one I create my stage from. Here is the issue: whatever the animation is (even an empty stage), I see a drop in framerate of about 15 as soon as I add the tick event listener for the stage update. I have tried many things (like not even adding the mesh onto which I draw the canvas to my scene), and as long as the event listener is added, my FPS takes a hit.
Whether the animation is "heavy" or not, I see this drop in framerate. And that drop is a big issue in VR, since staying at 60 FPS is pretty much a must-have at this point.
Any lead on what I could do to make this better? Thank you!

Warping GUI elements in Unity's OnGUI

I am using Unity3D, and I have a function that is called inside OnGUI to lay out the various GUI components of my application. Ordinarily, the labels and buttons are all inside a certain Rect that I supply, which is centered on the screen.
No problem there... however, what I want is to sometimes render the exact same GUI elements, which can be dynamic and thus not just baked into a prefabbed texture, into a trapezoid-shaped area off to the side, looking as if that GUI were actually on a flat plane, pushed away from the center of the screen and rotated slightly. All GUI buttons drawn in the function should still respond normally.
I was rather hoping I could just specify some values in GUI.matrix to map the rectangle to a trapezoid, but my initial exploration seems to show that the GUI elements don't use homogeneous coordinates, and everything still shows up as rectangular.
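For context, a hedged sketch of what that exploration might look like. The translation, angle, and the extra perspective term written into the matrix are arbitrary test values, and (as described above) only the affine part appears to take effect, so the buttons stay rectangular.

```csharp
using UnityEngine;

// Hypothetical test: GUI.matrix accepts a full Matrix4x4, so rotation, scale,
// and translation work, but writing into the perspective row does not produce
// a trapezoid in this exploration.
public class SkewedGUITest : MonoBehaviour
{
    void OnGUI()
    {
        Matrix4x4 m = Matrix4x4.TRS(
            new Vector3(Screen.width * 0.7f, Screen.height * 0.3f, 0f), // push to the side
            Quaternion.Euler(0f, 0f, 10f),                              // slight roll
            new Vector3(0.8f, 0.8f, 1f));                               // shrink a bit

        m.m30 = 0.0005f; // hypothetical perspective term; no visible effect in this test

        GUI.matrix = m;
        if (GUI.Button(new Rect(0, 0, 160, 40), "Warped?"))
            Debug.Log("Still clickable");
        GUI.matrix = Matrix4x4.identity; // restore for any later GUI code
    }
}
```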
Is there any way to do this with Unity, ideally without requiring access to pro-only features?
As of now, the Unity3D GUI system isn't very flexible. The new GUI system is one of the features still not released in Unity 4 (we are all waiting for it).
From my point of view it has several problems, in particular:
You are forced to lay out components following the flow of the code (as in the sketch after this list), instead of having a more declarative (or at least more structured) way to do it.
It's quite inefficient (at least one draw call per button).
It isn't flexible at all. Adding, removing, and enabling/disabling buttons can become quite a painful operation as the number of buttons increases.
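To make the first two points concrete, here is a small sketch of the immediate-mode style in question (the labels and layout values are arbitrary): every control is declared in the order the code runs, every frame, inside OnGUI.

```csharp
using UnityEngine;

// Hypothetical example of the legacy immediate-mode GUI: layout is whatever
// order the code executes in, re-run every frame, with each control drawn
// separately.
public class LegacyGUIMenu : MonoBehaviour
{
    void OnGUI()
    {
        GUILayout.BeginArea(new Rect(Screen.width / 2 - 100, Screen.height / 2 - 60, 200, 120));
        GUILayout.Label("Main menu");
        if (GUILayout.Button("Play"))
            Debug.Log("Play pressed");
        if (GUILayout.Button("Quit"))
            Application.Quit();
        GUILayout.EndArea();
    }
}
```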
"however, what I want is to sometimes render the exact same GUI elements, which can be dynamic and thus not just baked into a prefabbed texture, into a trapezoid-shaped area off to the side, looking as if that GUI were actually on a flat plane, pushed away from the center of the screen and rotated slightly. All GUI buttons drawn in the function should still respond normally."
This is quite hard, if not impossible, to achieve using Unity's GUI classes.
I see two possibilities:
Don't use the GUI classes for this. If your GUI is simple enough, you can implement your own (even 3D) buttons (see the sketch after this list) using, for example:
A mesh (a plane or a trapezoid mesh) with a texture for the button background
TextMesh for drawing 3D text
RayCasting to check if a button has been pressed
Use a library that implements a more advanced GUI system like NGUI
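A hedged sketch of the first option (do-it-yourself 3D buttons), assuming a mesh with a collider and a TextMesh for the label; the class and field names are made up.

```csharp
using UnityEngine;

// Hypothetical 3D button: a quad (or trapezoid mesh) with a collider and a
// TextMesh label, picked by raycasting from the mouse position on click.
public class MeshButton : MonoBehaviour
{
    [SerializeField] TextMesh label;

    public void SetLabel(string text) => label.text = text;

    void Update()
    {
        if (!Input.GetMouseButtonDown(0)) return;

        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        if (Physics.Raycast(ray, out RaycastHit hit) && hit.collider.gameObject == gameObject)
            Debug.Log($"{name} pressed");
    }
}
```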
When I ran into the same problem, I just used normal 3D GameObjects (cubes with textures) and handled clicks with OnMouseDown (PC/Mac) or raycasting (Android/iOS). I guess that's how everyone does it.
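And a minimal sketch of that OnMouseDown variant (it requires a Collider on the same GameObject; the class name is hypothetical):

```csharp
using UnityEngine;

// A textured cube acting as a button: OnMouseDown fires when the user clicks
// the collider on this GameObject (mouse/standalone platforms).
public class CubeButton : MonoBehaviour
{
    void OnMouseDown()
    {
        Debug.Log(name + " clicked");
    }
}
```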
