My scene contains only transparent objects, so with depth testing enabled the objects hide each other. I know depth testing doesn't consider transparency; it just writes z values to the depth buffer. How, then, do I render two transparent objects correctly?
I tried renderer.context.disable(renderer.context.DEPTH_TEST); but nothing changed.
An illustration of my concrete problem:
The cube is a MeshLambertMaterial({color: ..., transparent: true, opacity: 0.6})
and the plane is a MeshLambertMaterial({color: ..., transparent: true, opacity: 0.4}).
The cube is rendered after the plane, but if the cube were opaque, the whole of it would be rendered correctly without anything being discarded (note that the points are opaque too, which is why they remain visible).
So how do I make rendering take transparency into account, independent of draw order, so that two transparent objects don't hide each other?
In three.js, you can turn off depth testing by setting
material.depthTest = false;
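For example, applied to the two materials from the question (a minimal sketch; the colors are placeholders):

    var planeMaterial = new THREE.MeshLambertMaterial({
        color: 0x2194ce,
        transparent: true,
        opacity: 0.4,
        depthTest: false // never discard fragments based on depth
    });
    var cubeMaterial = new THREE.MeshLambertMaterial({
        color: 0xce2121,
        transparent: true,
        opacity: 0.6,
        depthTest: false
    });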
Don't be surprised if you have artifacts when the camera position is changed.
You might also want to read this answer.
three.js r.80
I'm trying to use a Shader material to create an area of transparency around my camera.
To achieve that, I'm checking if a vertex is inside a given radius, if so, I set its color with a custom opacity value < 1, such as 0.4.
This is working fine, but sometimes some transparent geometries block the elements behind them. I took a look at the docs and some other similar questions and figured out that we usually set depthWrite = false and transparent = true only on transparent materials.
My problem is that I have only one material representing all my geometries, but I want depthWrite = false; transparent = true for all vertices inside the radius and depthWrite = true; transparent = false for the ones outside it. Does anyone know if this is achievable, or if there is another possible solution?
Thanks in advance
Answering my own question! Hope this can help someone.
It turns out it wasn't possible the way I was doing it, because I had only one material for both the visible and the transparent parts.
So I decided to use two different shader materials rendering the same geometries, one visible and one transparent, so I could set their depthWrite and transparent properties accordingly.
In the fragment shaders, I then discard the visible portion inside the transparent area and discard the invisible portion in the visible area.
It looks like it's working fine now.
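In case it helps, here is a rough sketch of the idea (not my exact code; the center/radius uniforms and the colors are placeholder names and values). Both materials render the same geometry, and each fragment shader discards the part the other material is responsible for:

    var vertexShader = `
        varying vec3 vWorldPosition;
        void main() {
            vWorldPosition = (modelMatrix * vec4(position, 1.0)).xyz;
            gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
        }
    `;

    var uniforms = {
        center: { value: new THREE.Vector3(0, 0, 0) },
        radius: { value: 10.0 }
    };

    // Opaque material: discards fragments inside the radius.
    var visibleMaterial = new THREE.ShaderMaterial({
        uniforms: uniforms,
        vertexShader: vertexShader,
        fragmentShader: `
            uniform vec3 center;
            uniform float radius;
            varying vec3 vWorldPosition;
            void main() {
                if (distance(vWorldPosition, center) < radius) discard;
                gl_FragColor = vec4(1.0);
            }
        `,
        transparent: false,
        depthWrite: true
    });

    // Transparent material: discards fragments outside the radius.
    var transparentMaterial = new THREE.ShaderMaterial({
        uniforms: uniforms,
        vertexShader: vertexShader,
        fragmentShader: `
            uniform vec3 center;
            uniform float radius;
            varying vec3 vWorldPosition;
            void main() {
                if (distance(vWorldPosition, center) >= radius) discard;
                gl_FragColor = vec4(1.0, 1.0, 1.0, 0.4);
            }
        `,
        transparent: true,
        depthWrite: false
    });

    // Draw the same geometry twice, once with each material.
    scene.add(new THREE.Mesh(geometry, visibleMaterial));
    scene.add(new THREE.Mesh(geometry, transparentMaterial));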
I'm trying to make a very simple shadow shader: a plane with a shader showing a radial gradient in color and alpha.
Beneath this shadow lies another plane with the same kind of shader but linear.
And as a background of all this, a linear gradient from dark blue to light blue.
The problem is that when my camera approaches the ground, the plane of the shadow masks the floor.
Why does it happen and what can I do to prevent that?
https://codesandbox.io/s/epic-sun-po9j3
https://po9j3.csb.app/
You'd need to post code to check for sure, but it likely happens because three.js sorts the order in which it draws things based on each object's center and its distance from the camera.
You can force a different order by setting Object3D.renderOrder
three.js also generally draws opaque things before transparent things, so my guess is your ground plane and your shadow plane are both set to transparent: true. The ground can be set to transparent: false, in which case it will be drawn first.
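For example (a sketch; groundMesh and shadowMesh are placeholder names for your two planes):

    // Opaque things are drawn before transparent things...
    groundMesh.material.transparent = false;
    shadowMesh.material.transparent = true;

    // ...or force an explicit draw order with renderOrder:
    groundMesh.renderOrder = 0;
    shadowMesh.renderOrder = 1;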
You might find this article useful. It shows a similar example.
As for why there is a hole: it's because of the depth buffer. If something in front gets drawn first, the pixels behind it are not drawn. So if the shadow happens to be drawn first, it ends up looking like a hole, because the pixels of the plane behind it are not drawn.
See this
I have been experimenting with converting the custom-attributes BufferGeometry example (https://threejs.org/examples/webgl_buffergeometry_custom_attributes_particles.html) into a fly-through animation, and I find that the sprites have dark backgrounds (as do sprites I create myself) if depthTest is set to true. See the image.
The sprite in the custom attribute example has a transparent background but this appears to be ignored when it is rendered if depthTest is set to true.
I have tried numerous custom blending rules but cannot find a way to remove the background, only to reduce the effect a bit. The background disappears if depthTest is set to false.
Is this a known limitation? Is there a work around?
I am modifying this question to add clearer images with a different ball sprite (also with a transparent background). This image has depthTest set to true for the custom ShaderMaterial used in the https://threejs.org/examples/webgl_buffergeometry_custom_attributes_particles.html three.js example.
By comparison, this uses multiple PointsMaterials from a different three.js example (https://threejs.org/examples/webgl_points_sprites.html), also with depthTest set to true and using the PointsMaterial map parameter for the sprite.
As you can see, the second PointsMaterial example works as expected. Because PointsMaterial only accepts one fixed size and color, I need to create 36 different point geometries to render this image.
I would prefer to use the custom shader in the first example (which has custom size and color attributes and requires only one geometry). Is there a way to define a custom shader to support depthTest like the PointsMaterial does?
WebGL has a depth buffer that's used to find out which pixels are hiding behind other pixels. If a pixel is hiding behind another, it doesn't get rendered. Typically that's what you want: why waste GPU time rendering pixels you're never going to see? Consider the example depth buffer below; white squares are closer to the camera, while darker squares are further away:
If there were a square hiding behind another one, you wouldn't see it in the depth buffer; you'd only see the closest one. The depth buffer doesn't know that your squares have partial transparency and that you want some parts to be see-through. This is an issue in other realtime engines too; this is an example from Unity:
The squares that are closer block part of the squares that are further away in the depth buffer, so the occluded pixels don't get rendered. That's why setting depthTest to false is useful: you're basically telling the renderer to stop testing which pixels are behind others and just render all of them. Here it is again with depthTest = false:
If you must keep depthTest on (for instance, if you want the sprites to hide behind a solid wall), you could set depthWrite to false on the sprite material. It would still check which pixels are behind others, but the pixels from the sprites wouldn't get written to the depth buffer, so no sprite could block another sprite.
depthTest: true; // Tests pixel depth
depthTest: false; // Does not test pixel depth
depthWrite: true; // Writes pixel depth to depthBuffer
depthWrite: false; // Does not write pixel depth to depthBuffer
As far as blending modes go, I prefer THREE.AdditiveBlending because it gives the particles a nice glowing effect when they accumulate on top of each other. But that's up to you.
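Putting that together for the particles example, the material settings might look something like this (a sketch; the uniforms and shaders are the unchanged ones from the example):

    var material = new THREE.ShaderMaterial({
        uniforms: uniforms,
        vertexShader: vertexShader,
        fragmentShader: fragmentShader,
        transparent: true,
        depthTest: true,   // sprites can still hide behind solid geometry
        depthWrite: false, // but they never occlude each other
        blending: THREE.AdditiveBlending
    });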
I have an existing WebGL canvas that is being rendered without using ThreeJS, and is for all intents and purposes a black box to me, apart from two facts: (1) I have access to the underlying webgl canvas DOM element and can position and resize it on the screen, and (2) I know the properties of the camera for the scene, and get updates on every render cycle for that camera.
The problem I need to solve can be simplified to the following: I need to have my own separate ThreeJS canvas that displays both the black box canvas data, and then elements that I draw, like a cube for a simple example. I can already easily overlay the two canvases, set the transparency on my canvas for everything but the cube, and then align the two with the camera events from the black box library. This works quite well.
The issue with this is that when I draw my objects, like a cube, they don't respect the depth buffer of the black box canvas. So I might have a cube that is properly aligned with the backing scene and movements of the scene, but then it isn't properly masked when something in the black box canvas is closer to the camera than the cube. My thought is that I need to solve this in one of two ways: (1) I can have my renderer write to the other canvas with autoClear = false and preserveDrawingBuffer = true, or (2) I can somehow copy the depth buffer from the black box canvas into my canvas, and then set up my renderer so that it respects the new depth buffer.
I haven't been successful with either approach yet, so I'm wondering if this is possible, and if so which of the above approaches, or what other approach, can solve this problem?
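For reference, here is roughly what I mean by approach (1) (a sketch; blackBoxCanvas is a placeholder for the black box's canvas element):

    // Create the three.js renderer on the existing canvas/context instead of a new one.
    var overlayRenderer = new THREE.WebGLRenderer({
        canvas: blackBoxCanvas,
        context: blackBoxCanvas.getContext('webgl'),
        preserveDrawingBuffer: true
    });
    overlayRenderer.autoClear = false; // don't wipe the black box's color/depth data

    // Each frame, after the black box has rendered:
    overlayRenderer.state.reset(); // three.js can't know what GL state the black box left behind
    overlayRenderer.render(overlayScene, overlayCamera);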
--Edit--
See https://jsfiddle.net/zdxyoajb/ for an Angular/TypeScript implementation of the above attempts. In the following animate loop, if I comment out the overlayRenderer lines, the sphere below is red and offset from the center (as it should be), but if I don't comment them out, I get the image below. I also get the following error:
WebGL: INVALID_OPERATION: uniformMatrix4fv: location is not from current program
animate() {
    requestAnimationFrame(() => this.animate());

    // Copy the overlay camera's transform onto the black-box camera.
    this.blackBoxCamera.copy(this.overlayCamera);
    this.blackBoxRenderer.render(this.blackBoxScene, this.blackBoxCamera);

    // Reset three.js's cached GL state before the second renderer draws.
    this.overlayRenderer.state.reset();
    this.overlayRenderer.render(this.overlayScene, this.overlayCamera);
}
Is it possible in three.js to create a transparent/invisible plane that can receive shadows? My aim is to draw some objects on a 3D canvas without a background (so I can see the webpage behind the canvas element). I want the objects to cast shadows on the background, and I thought that if I could make an invisible plane, the webpage background would still be visible, shadows would be cast on the plane, and it would look as if the shadows were on the webpage itself.
Right now I can make a white plane with opacity 0.5 (or similar), but that way the plane is visible. Ideally the plane should be completely invisible (except for the shadows).
EDIT: I created an example (based on this): http://jsfiddle.net/s5YGu/2/
Here I have opacity 0.5, but you can see the plane. If I set the opacity to 0, the whole plane and the shadows disappear.
I made this work by hacking on the basic shader; here is the working shader code: http://pastie.org/5088640
Unlike the opacity: 0.1 approach, it has no drawbacks AFAIK.
Yes, you can achieve the effect you want, and it looks pretty good on my machine at least. :-)
Here is a fiddle: http://jsfiddle.net/ZXdV3/25/
There are a couple of issues. First, you will have to set alpha: true in the renderer constructor args.
Second, although I assume you'd like the plane to be completely transparent, you'll have to settle for material.opacity = 0.1, or so.
Third, if you are placing a canvas over the webpage, and you want the web page to be interactive, you are going to have to suppress pointer events in the canvas (I'm not sure if IE supports this, though): container.style.pointerEvents = 'none';
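Putting those three pieces together (a sketch using the current API; the shadow-map property names were slightly different in r.67, and the sizes/colors are placeholders):

    var renderer = new THREE.WebGLRenderer({ alpha: true }); // (1) transparent canvas
    renderer.setClearColor(0x000000, 0);
    renderer.shadowMap.enabled = true;
    container.style.pointerEvents = 'none'; // (3) clicks pass through to the page
    container.appendChild(renderer.domElement);

    // (2) A nearly invisible plane that still receives shadows.
    var plane = new THREE.Mesh(
        new THREE.PlaneGeometry(100, 100),
        new THREE.MeshPhongMaterial({ color: 0xffffff, transparent: true, opacity: 0.1 })
    );
    plane.rotation.x = -Math.PI / 2;
    plane.receiveShadow = true;
    scene.add(plane);

Note that newer versions of three.js also provide THREE.ShadowMaterial, which is designed for exactly this use case: an invisible surface that shows only the shadows cast onto it.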
three.js r.67