I'm using three.js for a small game which utilizes textures with transparency. Things were going great until I noticed some errors due to depth testing:
[Screenshot: good from one angle]
[Screenshot: bad from another angle]
Some research has brought me to these posts, but I am still trying to figure out how to apply their suggestions in a three.js environment:
Rendering glitch with GL_DEPTH_TEST and transparent textures
Blending transparent textures with depth
As far as I know, {transparent: true} in a THREE.Material already separates transparent materials from opaque ones for proper rendering. However, I cannot get away with disabling depth writes because I don't have control over the rendering order of the transparent geometry (as far as I know). Am I missing something obvious that will help me out of this mess?
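For reference, a minimal sketch of the two setups commonly suggested for this situation in three.js (leafTexture and the material names are placeholders, not from the question):

// Sketch 1: blended transparency. transparent: true puts the mesh in the
// sorted transparent pass; depthWrite: false keeps it from occluding other
// transparent geometry, at the cost of relying on draw order.
const blendedMaterial = new THREE.MeshBasicMaterial({
  map: leafTexture,       // placeholder texture with an alpha channel
  transparent: true,
  depthWrite: false,
  side: THREE.DoubleSide
});

// Sketch 2: alpha cutout. alphaTest discards fragments below the threshold
// in the shader, so depth writes stay on and no sorting is needed. This is
// often the cleanest fix for foliage/sprite-style textures with hard edges.
const cutoutMaterial = new THREE.MeshBasicMaterial({
  map: leafTexture,
  alphaTest: 0.5
});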
Related
I recently tried my app on mobile and noticed some weird behavior: it looks like the camera near plane is clipping the geometry, yet other objects at the same distance aren't clipped... The materials are StandardMaterials, and depthTest and depthWrite are set to true.
I must add that I can't reproduce this issue on my desktop, which makes it difficult to understand what's going on, since everything works perfectly at first sight.
Here are two GIFs showing the problem; you can see the same wall on the left in the second one.
Thanks!
EDIT:
It seems the transparent faces (on mobile) were due to logarithmicDepthBuffer = true (though I don't know why?), and I also had additional artefacts caused by the camera near and far planes being too far from each other, producing depth issues (see Flickering planes)...
EDIT 2:
Well, I wasn't searching for the right terms... I just found this today: https://github.com/mrdoob/three.js/issues/13047#issuecomment-356072043
So logarithmicDepthBuffer uses EXT_frag_depth, which is supported by only 2% of mobile devices according to WebGLStats. A workaround would be to tessellate the geometries or stay with a linear depth buffer...
Additional artefacts were caused by the camera near and far planes being too far from each other, producing depth issues (see Flickering planes)...
You don't need a logarithmic depth buffer to fix this. You've succumbed to the classic temptation to bring your near clip REALLY close to the eye and the far clip very far away. This creates a very non-linear depth precision distribution and is easily mitigated by pushing the near clip plane out by a reasonable amount. Try to sandwich your 3D data as tightly as possible between your near and far clip planes and tolerate some near plane clipping.
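As a concrete sketch, assuming a typical three.js PerspectiveCamera (the values are purely illustrative):

// Illustrative near/far tightening: most depth precision sits near the near
// plane, so moving near from e.g. 0.001 to 1 buys far more precision than
// pulling the far plane in ever would.
camera.near = 1;     // as large as the scene tolerates
camera.far = 2000;   // as small as the scene tolerates
camera.updateProjectionMatrix(); // required after changing near/far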
I'm trying to render a fairly complex lamp using Three.js: https://sayduck.com/3d/xhcn
The product is split into multiple meshes, similar to this one:
The main issue is that I also need to use transparent PNG textures (in order to achieve the complex shape while keeping polygon counts low), like this:
As you can see from the live demo, this gives really weird results, especially when rotating the camera around the lamp. I believe this is due to the z-ordering of the meshes.
I've been reading answers to similar questions on SO, like https://stackoverflow.com/a/15995475/5974754 or https://stackoverflow.com/a/37651610/5974754 to get an understanding of the underlying mechanism of how transparency is handled in Three.js and WebGL.
I think that, in theory, what I need to do is explicitly set a renderOrder each frame for every mesh with a transparent texture (because the distance-to-camera ordering changes as the camera moves around), so that Three.js knows which mesh is currently closest to the camera.
However, even ignoring for the moment that explicitly setting the order each frame seems far from trivial, I am not sure I understand how to set this order theoretically.
My meshes have fairly complex shapes and are quite intertwined, which means that from a given camera angle, some parts of mesh A can be closer to the camera than some parts of mesh B, while elsewhere parts of mesh B are closer.
In this situation, it seems impossible to define a closer mesh, and thus a proper renderOrder.
Have I understood correctly that this is basically reaching the limits of what WebGL can handle?
Otherwise, if this is doable, is the approach with two render scenes (one for opaque meshes first, then one for transparent ones ordered back to front) the right one? How should I go about defining the back to front renderOrder the way that Three.js expects?
Thanks a lot for your help!
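For illustration, a minimal sketch of the per-frame sort the question describes (transparentMeshes is a hypothetical array of the lamp's transparent meshes; note this orders whole objects, so it cannot resolve the intertwined case raised above):

const tmp = new THREE.Vector3();

function sortTransparent(transparentMeshes, camera) {
  transparentMeshes
    .map(mesh => ({
      mesh,
      // squared distance is enough for ordering, so no sqrt is needed
      d2: tmp.setFromMatrixPosition(mesh.matrixWorld)
             .distanceToSquared(camera.position)
    }))
    .sort((a, b) => b.d2 - a.d2) // farthest first (back to front)
    .forEach((entry, i) => { entry.mesh.renderOrder = i; });
}

// call once per frame, before renderer.render(scene, camera)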
I have created a simple human figure. The eyelashes use a texture with transparency.
However, as soon as I turn on transparency for the face texture, transparency appears where it shouldn't be: you can see through the face texture in the areas that lie below the eyelashes.
See the effect by toggling face transparency between these two settings:
mesh.material.materials[3].transparent = false
mesh.material.materials[3].transparent = true
I wish to have transparency turned on for the face texture, so how can I solve this problem?
Demo:
http://dev.udart.dk/transparencyProblemStackOverflow/
(wait for model to load)
Code:
https://github.com/vibber/transparencyProblemStackOverflow/blob/gh-pages/index.html
Transparent geometry gets manually depth-sorted; for more information, see this canonical answer by Toji: Transparent textures behaviour in WebGL.
If you want this scenario to work properly, you'll have to split up your model and render the eyelashes as a separate (sub)mesh. This way three.js can render the rest of the face using the normal z-buffer approach, then apply the eyelashes separately (from the depth-sorted transparent objects queue).
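A minimal sketch of that split (faceGeometry, lashGeometry and the two textures are placeholder names):

// Opaque face: writes depth normally, so it occludes correctly.
const faceMesh = new THREE.Mesh(
  faceGeometry,
  new THREE.MeshLambertMaterial({ map: faceTexture })
);

// Eyelashes as a separate mesh: only this part goes through the sorted
// transparent pass, so the face no longer becomes see-through beneath it.
const lashMesh = new THREE.Mesh(
  lashGeometry,
  new THREE.MeshLambertMaterial({
    map: lashTexture,   // texture with an alpha channel
    transparent: true,
    depthWrite: false
  })
);

scene.add(faceMesh);
scene.add(lashMesh);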
I've implemented a barycentric coordinates wireframe shader something like this, and in general it is working nicely.
But as in Florian Boesch's WebGL demo, some of the wire faces on the far side of the mesh are obscured (maybe this has to do with the order in which the GPU constructs the faces).
I've set the following in the hopes that they would clear things up:
glCullFace(GL_NONE);
glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);
...but no go so far. Is this possible in OpenGL ES 2.0?
I had forgotten to discard on a transparent output, so the depth buffer was being written despite the apparently transparent geometry; thus the mesh was self-obscuring because depth tests failed.
This would be the problem in Florian's demo too, though it may be that he explicitly avoids discard for mobile performance reasons.
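Sketched in GLSL (shown here as the kind of string a three.js ShaderMaterial would take, to stay in the document's JavaScript terms; the 0.5 threshold is illustrative):

const fragmentShader = `
  uniform sampler2D map;
  varying vec2 vUv;
  void main() {
    vec4 color = texture2D(map, vUv);
    if (color.a < 0.5) discard; // skip the fragment entirely: no color write, no depth write
    gl_FragColor = color;
  }
`;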
I need to draw transparent 2D sprites in a 3D world. I tried rendering a quad, texturing it (using slick_util), and rotating it to face the camera, but when there are many of them the transparency doesn't really work. The sprite closest to the camera blocks the ones behind it if it's rendered before them.
I think it's because OpenGL only draws the object that is closest to the viewer without checking the alpha value.
This could be fixed by sorting them from furthest to closest, but I don't know how to do that. And wouldn't I have to use Math.sqrt to get the distance? (I've heard it's slow.)
I wonder if there's an easy way to get transparency in 3D to work correctly (e.g. by enabling something in OpenGL).
Disable depth testing and render transparent geometry back to front.
Or switch to additive blending and hope that looks OK.
Or use depth peeling.
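A sketch of the first option in WebGL-style JavaScript (distSq and drawSprites are hypothetical helpers; the same state calls exist in desktop GL/LWJGL). Note that sorting can compare squared distances, so Math.sqrt is never needed:

// 1. Draw all opaque geometry first, with depth testing on as usual.

// 2. Sort sprites back to front. Comparing squared distances gives the same
//    ordering as comparing distances, so the sqrt can be skipped entirely.
sprites.sort((a, b) => distSq(b, cameraPos) - distSq(a, cameraPos));

// 3. Depth test off, standard alpha blending, back-to-front draw.
gl.disable(gl.DEPTH_TEST);
gl.enable(gl.BLEND);
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
drawSprites(sprites);
gl.enable(gl.DEPTH_TEST);

// Additive alternative (draw order stops mattering, but colors accumulate):
// gl.blendFunc(gl.SRC_ALPHA, gl.ONE);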