The light component is behind an Image in Unity

I have two different canvases: one for the background and one for the game scene
Principal canvas:
Background canvas:
I'm having this problem: if I put an object in the principal canvas, everything seems to work, but if I add a Light component to this object, I can't see the light (it is as if the background image is in front of the light):
Without the background canvas:
With the background canvas:
Any idea why?
(The problem is not the BGcanvas itself but its Image component: if I disable the Image, I can see the light)

Lights are 3D scene objects.
UI objects are not affected by lights or other scene objects because they exist in a completely different rendering path.


Image and Sprite renderer cover the light component in Unity

As you can see, for some reason the Sprite Renderer is covering the light component, even though the light is in front of it.
Any ideas why?
I found a working solution:
I created a new material and, in the shader dropdown, chose "Particles/Standard Unlit".
I assigned this material to the Image component of the background, and everything works perfectly!

Unity Image Component for 2D Animation

I'm developing a 2D game in Unity (Version 5.1.2) which has an animation.
The animation is generated by flipping through the sprites in the sprite sheet.
My problem is that the animation is playing as it should in the "Scene View" but not in "Game View".
I normally create animations by using the sprite editor and then drag & drop all the sprites on the screen (Scene View).
It creates a Sprite Renderer to switch the sprites, but I would like the Image component to flip through them instead. It seems like only sprites in an Image component are displayed in the "Game View".
Is there any way I can get some assistance on this, please?
It's really strange that you are only seeing the animation in your Scene View. Is the animation your default animation? If not, make sure that you are sending the right parameters to your Animator. A good way to test it is to open the Animator window, check all transitions, and manually fill in the parameters to see how it works while the game is running. Also check whether the transitions between animations have exit times and transition durations, and disable them.

Three.js transparent background with a masked scene

I'm new to coding and every day I learn something new. I want to have a masked scene with other 3D objects, from other scenes, behind and around it. At the moment I'm stuck with the masked scene. Right now I have this working. As you can see, the background is grey, and the other 3D objects from other scenes behind the masked scene can't be seen because of it.
How can I make the background of the masked scene transparent so that the 3D objects from the other scenes show through?
I think it has something to do with the fragmentShader, because I can change the background color by changing this line: vec4 background = vec4(1.0, 1.0, 1.0, 0.0);
Thanks in advance for all the help!
Update
I understand that the background is transparent in that previous link, but I fail to create a new scene with new 3D objects that is visible outside the masked scene. Just helping me add a new scene to that previous example would solve my problem.
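Since the quoted shader line already uses alpha 0.0, the missing piece is often that the renderer itself was not created with an alpha channel. Below is a minimal sketch, assuming a standard three.js setup (`renderer` is a hypothetical name); the executable part models GLSL's mix() in plain JS to show why the background's alpha component decides what shows through outside the mask:

```javascript
// Likely fix (three.js side): give the renderer an alpha channel and
// clear to zero alpha, so transparent fragments let the page show:
//
//   renderer = new THREE.WebGLRenderer({ alpha: true });
//   renderer.setClearColor(0x000000, 0); // clear alpha = 0
//
// Plain-JS model of the shader's compositing step. Outside the mask
// (t = 0.0) the fragment takes the background color, so the background's
// alpha decides whether content behind the canvas is visible.
function mix(a, b, t) {
  return a.map((v, i) => v * (1 - t) + b[i] * t);
}

const sceneTexel = [1.0, 0.2, 0.2, 1.0]; // pixel from the masked scene
const opaqueBg   = [0.5, 0.5, 0.5, 1.0]; // grey, alpha 1 -> blocks everything
const clearBg    = [1.0, 1.0, 1.0, 0.0]; // alpha 0 -> transparent outside mask

console.log(mix(opaqueBg, sceneTexel, 0.0)); // [0.5, 0.5, 0.5, 1] opaque grey
console.log(mix(clearBg, sceneTexel, 0.0));  // [1, 1, 1, 0] fully transparent
```

With `alpha: true` and a zero clear alpha, anything behind the canvas element (or rendered in an earlier pass without clearing) becomes visible wherever the mask leaves alpha at 0.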

Occlusion of real-world objects using three.js

I’m using three.js inside an experimental augmented-reality web browser. (The browser is called Argon. Essentially, Argon uses Qualcomm’s Vuforia AR SDK to track images and objects in the phone camera. Argon sends the tracking information into Javascript, where it uses transparent web pages with three.js to create 3D graphics on top of the phone video feed.) My question is about three.js, however.
The data Argon sends into the web page allows me to align the 3D camera with the physical phone camera and draw 3D graphics such that they appear to align with the real world as expected. I would also like to have some of the things in the physical world occlude the 3D graphics (I have 3D models of the physical objects, because I’ve set the scene up or because they are prepared objects like boxes that are being tracked by Vuforia).
I’m wondering if folks have suggestions on the best way to accomplish this occlusion with three.js. Thanks.
EDIT: it appears that the next version of three.js (R71) will have a simpler way to do this, so if you can use the dev branch (or just wait), you can do this much more easily. See this post: three.js transparent object occlusion
MY ORIGINAL ANSWER (without using the new features in R71):
I think the best way to do this (avoiding extra work such as creating new rendering passes) is to modify the WebGL renderer (src/renderers/WebGLRenderer.js) and add support for a new kind of object; perhaps call them “occluderObjects”.
If you look in the renderer, you will see two current object lists, opaqueObjects and transparentObjects. The renderer sorts the renderable objects into these two lists, so that it can render the opaque objects first and the transparent objects after them. What you need to do is store your new objects in the occluderObjects list rather than those two. You will see that the opaque and transparent objects are sorted based on their material properties. Here, you may want to add a property to any object you want to be an occluder (“myObject.occluder = true”, perhaps), and just pull those objects out.
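The sorting step described above can be sketched in plain JavaScript (this is an illustration, not the actual WebGLRenderer code; the `occluder` flag and the list names follow the suggestion in this answer):

```javascript
// Sketch: split renderable objects into three lists so occluders can get
// their own depth-only pass. Objects marked `occluder = true` are pulled
// out first; the rest are split by their material's transparency.
function sortRenderList(objects) {
  const opaqueObjects = [];
  const transparentObjects = [];
  const occluderObjects = [];
  for (const obj of objects) {
    if (obj.occluder) {
      occluderObjects.push(obj);
    } else if (obj.material && obj.material.transparent) {
      transparentObjects.push(obj);
    } else {
      opaqueObjects.push(obj);
    }
  }
  return { opaqueObjects, transparentObjects, occluderObjects };
}
```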
Once you have the three lists, look what the render() function does with these object lists. You’ll see a couple of places with rendering calls like this:
renderObjects( opaqueObjects, camera, lights, fog, true, material );
Add something like this before that line, to turn off writing into the color buffers, render the occlusion objects into the depth buffer only, and then turn color buffer writes back on before you render the remaining objects.
context.colorMask( false, false, false, false);
renderObjects( occluderObjects, camera, lights, fog, true, material );
context.colorMask(true, true, true, true);
You’ll need to do this in a couple of places, but it should work.
Now you can just mark any objects in your scene as “occluder = true” and they will only render into the depth buffer, allowing the video to show through and occluding any opaque or transparent objects rendered behind them.
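The overall pass order can be sketched against a mocked GL context (`drawFn` stands in for the renderer's renderObjects() call; this is an illustration of the sequence, not the real renderer code):

```javascript
// Sketch of the render sequence: occluders write only to the depth
// buffer, then color writes are re-enabled for everything else, so
// normal objects behind an occluder fail the depth test and disappear.
function renderFrame(gl, occluders, opaque, transparent, drawFn) {
  gl.colorMask(false, false, false, false); // depth buffer only
  occluders.forEach(drawFn);                // occluders write depth, no color
  gl.colorMask(true, true, true, true);     // color writes back on
  opaque.forEach(drawFn);                   // depth-tested against occluders
  transparent.forEach(drawFn);
}
```

For what it's worth, later three.js releases expose `material.colorWrite = false`, which gives the same depth-only behaviour per material (combined with render order so occluders draw first); that is presumably the simpler route the R71 edit above refers to.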

Combine THREE.WebGLRenderer and Kinetic.js Layers

Hey guys,
I'm trying to combine THREE.js and Kinetic.js in my web application. I'm having problems doing this with the THREE.WebGLRenderer. How can I set up my view so that I have a 3D layer rendered by the THREE.WebGLRenderer and a separate layer on top of it for 2D elements, e.g. labels etc., using Kinetic.js?
I've tried to give the WebGLRenderer the canvas element of an instance of a Kinetic.Layer Element. But it does not work.
this.renderer = new THREE.WebGLRenderer({
    antialias: true,
    preserveDrawingBuffer: true,
    canvas: this.layer3D.getCanvas()._canvas
});
Until now I only found examples that do this with the THREE.CanvasRenderer.
Ideas somebody? Thanks a lot.
A canvas can have either a 2D context or a 3D context, but not both, as the two are considered incompatible. When you pass the canvas from a Kinetic layer, it already has a 2D context bound to it.
However you can have another HTML element (ex, DIV) on top of the GL rendered canvas.
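That overlay approach can be sketched in a few lines (`glCanvas` and `overlay` are assumed element references, e.g. the WebGL canvas and the container div the Kinetic stage draws into):

```javascript
// Sketch: stack the 2D (Kinetic) layer directly above the WebGL canvas
// using absolute positioning and z-index.
function stackOverlay(glCanvas, overlay) {
  glCanvas.style.position = 'absolute';
  glCanvas.style.zIndex = '0';
  overlay.style.position = 'absolute';
  overlay.style.top = '0';
  overlay.style.left = '0';
  overlay.style.zIndex = '1'; // 2D layer draws on top of the 3D canvas
}
```

Note that the overlay element will intercept pointer events over the whole canvas unless you forward them or disable them (e.g. `pointer-events: none` on transparent areas).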
Hello, I just want to say this may not be possible. As far as I know, KineticJS is canvas-based, so what you want to do is only possible using the CanvasRenderer.
The workaround I can think of: if the browser supports WebGL, you might be able to place the WebGL element on top of your KineticJS element.
