UI Elements not affected by light

I'm trying to make a menu screen in which all the UI elements (buttons, text...) are completely dark, and by touching the screen you create a fire (or just an area light) that makes the UI elements visible.
I read that the default shader for the UI elements isn't affected by light, but I can't seem to change it.
How do I go about doing this?

By default, the UI elements use an unlit shader and are rendered directly in clip space, so you'll need to do two things: first, put a lit shader on the elements (the Unity Standard shader should do fine); second, change the Canvas render mode to World Space. With the canvas in world space, you can move it around as if it were a sprite. I would also recommend creating a second, higher-priority camera for the UI with culling turned off. With the canvas in view of the UI camera, you should be able to place a light source near it and see the resulting lighting on the UI.
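A minimal sketch of that setup, assuming you build the lit material yourself from the Standard shader (the field names and light values below are illustrative, not Unity defaults):

    using UnityEngine;
    using UnityEngine.UI;

    // Switches this canvas to world space and gives every Graphic a lit
    // material so that scene lights affect it, as described above.
    public class LitCanvasSetup : MonoBehaviour
    {
        public Material litUIMaterial; // a material you create from the Standard shader (assumption)

        void Start()
        {
            Canvas canvas = GetComponent<Canvas>();
            canvas.renderMode = RenderMode.WorldSpace;

            // Assign the lit material to every Image/Text under this canvas.
            foreach (Graphic graphic in GetComponentsInChildren<Graphic>())
            {
                graphic.material = litUIMaterial;
            }
        }

        // Spawn a point light where the player touches, per the fire idea.
        public void SpawnLightAt(Vector3 worldPosition)
        {
            var go = new GameObject("TouchLight");
            go.transform.position = worldPosition;
            Light light = go.AddComponent<Light>();
            light.type = LightType.Point;
            light.range = 5f;      // illustrative values
            light.intensity = 2f;
        }
    }

You would call SpawnLightAt from your touch-handling code; how you map the touch into a world position depends on your camera setup.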

Related

React Three Fiber darken unhovered elements

I am using react-three-fiber and have multiple groups of meshes in my scene.
On interaction (mouse hover or click) I want to darken the scene except for the hovered mesh or group, with the selection changing depending on what I hover over.
Is this best done through shaders or post-processing?
I've got MeshBasicMaterials with baked image textures.
I am quite a noob and will have to learn how to do it either way, but I need some pointers on which way is best to go.

CreateJS Performance issue when rendering in a canvas that is displayed on mobile

I have a very specific issue. I am doing a demo in VR with three.js where I want to display 2D data. The data that will be displayed is dynamic (text info) and needs to be animated.
Animate CC, with its nice suite of tools, is an easy choice for this. With three.js, the way I found to add some 2D animation to the world was to create a plane and give it a texture from a canvas I created, which I update on requestAnimationFrame. No problem so far.
The canvas I'm rendering is also the one I create my stage from. Here's the issue: whatever the animation is (even an empty stage), I see a drop in framerate of about 15 as soon as I add the event listener on tick for the stage update. I tried many things (like not even adding the mesh onto which I draw the canvas to my scene), and if the event listener is added, my FPS takes a hit.
Whether the animation is "heavy" or not, I see this drop in framerate, and that drop is a big issue in VR, since staying at 60 FPS is pretty much a must-have at this point.
Any lead on what I could do to make this better? Thank you!

How to apply animation to different objects

I have a cupboard with 9 boxes. On one of them I have an animation which opens/closes the box. It only changes the X coordinate of the box, but I can't apply this animation to the other boxes, because the animation moves them to the coordinates of the first box.
In debug mode the Keep Original Position XZ parameter is disabled. I can't understand what is wrong.
Should I create 9 similar animations for 9 boxes?
I know that it is possible to animate things using relative positions on the UI when using anchors, but there does not seem to be any clean solution for 3D objects... This post offers what seems to be the "best" solution for now (it uses an empty parent transform to move the animated object correctly...)
You should be able to apply the animation to any object. I would recommend making a prefab of the "box" with the animation attached, then using that prefab for each one. Honestly, I don't have much experience with animating 3D objects, but even my 2D animations are in a 3D space, and each object animates properly with the same animations individually, regardless of its location.
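A rough sketch of that prefab approach (all names here are assumptions): because an animation clip drives the box's local position relative to its parent, instancing one animated prefab under nine differently placed empty parents lets the same clip open each box in place.

    using UnityEngine;

    // Instances one animated box prefab under nine "slot" parents. The
    // clip animates localPosition.x, which is relative to each slot, so
    // every box opens in place instead of snapping to the first box's
    // world coordinates.
    public class Cupboard : MonoBehaviour
    {
        public GameObject boxPrefab;  // prefab with an Animator and the open/close clip (assumption)
        public Transform[] slots;     // nine empty parents positioned in the cupboard

        void Start()
        {
            foreach (Transform slot in slots)
            {
                GameObject box = Instantiate(boxPrefab, slot);
                box.transform.localPosition = Vector3.zero; // start exactly at the slot
            }
        }
    }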

Fabricjs object controls visible outside canvas

I am using fabric.js for an HTML5 interactive canvas app. If an object scales larger than the canvas, its controls become invisible outside the canvas. How can I make them visible outside the canvas, or is there a way to style those controls with CSS?
In fabric.js, controls are rendered on the canvas itself, so having controls outside the canvas is not possible.
You can still approximate the effect in the following way; this is just an outline.
Make the canvas as big as 100% of the window, or in any case much bigger than the drawing area.
Then you can clip the drawing area with a rectangle.
If you need borders to define the drawing area, you can add an overlay image, which fabric.js allows.
If you need to have controls near the drawing area, you will have to position them OVER the non-drawing part of the canvas.
This will give you some additional tasks:
When you create an object, you have to make sure it lands in the drawing area. Whenever you work with object positions, a translation has to be applied, because coordinates are absolute to the canvas, while the user positions objects relative to the fake top-left corner of the drawing area rather than the top-left corner of the canvas.
The best thing you can do is make a larger canvas but limit the drawing area to a part of it, leaving a margin. It's not easy, because you then always need to account for the margin in other calculations, but it is possible.

Warping GUI elements in Unity's OnGUI

I am using Unity3D, and I have a function which is called inside OnGUI to lay out the various GUI components of my application. Ordinarily, the labels and buttons are all inside a certain Rect that I supply, which is centered on the screen.
No problem there... however, what I want is to sometimes render the exact same GUI elements, which can be dynamic and thus not just put into a prefabbed texture, into a trapezoid-shaped area off to the side, looking as if that GUI were actually on a flat plane, pushed away from the center of the screen and rotated slightly. All GUI buttons drawn in the function should still respond normally.
I was rather hoping I could just specify some values in GUI.matrix to map the rectangle to a trapezoid, but my initial exploration seems to show that the GUI elements don't use homogeneous coordinates, and everything still shows up as rectangular.
Is there any way to do this with Unity, ideally without requiring access to Pro-only features?
As of now, Unity3D's GUI system isn't very flexible. The new GUI system is one of the features still not released in Unity 4 (we are all waiting for it).
From my point of view it has several problems, particularly:
You are forced to lay out components following the flow of the code, instead of having a more declarative (or at least more structured) way to do it.
It's quite inefficient (at least one draw call per button).
It isn't flexible at all. Adding, removing, and enabling/disabling buttons can quickly become painful as the number of buttons increases.
"however, what I want is to sometimes render the exact same GUI elements, which can be dynamic and thus not just put into a prefabbed texture, into a trapezoid-shaped area off to the side, looking as if that GUI were actually on a flat plane, pushed away from the center of the screen and rotated slightly. All GUI buttons drawn in the function should still respond normally."
This is quite hard, if not impossible, to achieve using Unity's GUI classes.
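To illustrate the limitation: GUI.matrix applies affine transforms (rotation, scale, translation; it is what GUIUtility.RotateAroundPivot manipulates), and input is transformed along with the drawing, but there is no perspective divide, so a trapezoid is out of reach. A small sketch of what does work:

    using UnityEngine;

    // GUI.matrix can rotate/scale/translate OnGUI content, and buttons
    // keep responding to clicks under the transform, but it cannot
    // produce a perspective "trapezoid".
    public class RotatedGUI : MonoBehaviour
    {
        void OnGUI()
        {
            Matrix4x4 saved = GUI.matrix;

            // Rotate the whole block 15 degrees around its top-left corner.
            GUIUtility.RotateAroundPivot(15f, new Vector2(100f, 100f));

            GUI.Box(new Rect(100f, 100f, 200f, 120f), "Panel");
            if (GUI.Button(new Rect(120f, 140f, 160f, 30f), "Still clickable"))
            {
                Debug.Log("Button pressed");
            }

            GUI.matrix = saved; // restore so later GUI code is unaffected
        }
    }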
I see 2 possibilities:
Don't use GUI classes to do that. If your GUI is simple enough, you can implement your own (even 3D) buttons using, for example:
A mesh (a plane or a trapezoid mesh) with a texture for the button background
TextMesh for drawing 3D text
Raycasting to check whether a button has been pressed
Use a library that implements a more advanced GUI system like NGUI
When I ran into the same problem, I just used normal 3D GameObjects (cubes with textures) and used OnMouseDown (PC/Mac) or raycasting (Android/iOS) on them. I guess that's how everyone does it.
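For reference, a minimal sketch of such a 3D button (the handler body is a placeholder): OnMouseDown fires on any object with a collider, and a manual raycast covers touch input.

    using UnityEngine;

    // A "button" that is just a textured mesh with a collider.
    // OnMouseDown handles mouse input; Update shows the equivalent
    // raycast for touch, as suggested for Android/iOS.
    [RequireComponent(typeof(Collider))]
    public class WorldSpaceButton : MonoBehaviour
    {
        void OnMouseDown()
        {
            Press();
        }

        void Update()
        {
            if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
            {
                Ray ray = Camera.main.ScreenPointToRay(Input.GetTouch(0).position);
                if (Physics.Raycast(ray, out RaycastHit hit) && hit.collider.gameObject == gameObject)
                {
                    Press();
                }
            }
        }

        void Press()
        {
            Debug.Log(name + " pressed"); // replace with your own handler
        }
    }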
