I have several planes at the same position and orientation, each with a different color, and depending on the camera position the planes blend in different ways. The picture below shows the problem.
How can this be solved so that the gray object is drawn cleanly and the black one appears to be behind it? The green object is just the gray object's outline.
I'm sure this problem has a name, but I can't find it.
The problem disappears when I set the property
depthTest: false
on the materials.
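For reference, a minimal sketch of that workaround on one of the materials (the material type and color are placeholders; the flickering described above is commonly called z-fighting):
const grayMaterial = new THREE.MeshBasicMaterial({
  color: 0x808080,   // placeholder color
  depthTest: false   // skip the depth test so the coplanar planes stop fighting
});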
Related
I'm trying to make a very simple shadow shader: a plane whose shader draws a radial gradient in both color and alpha.
Beneath this shadow lies another plane with the same kind of shader, but with a linear gradient.
And as a background to all this, a linear gradient from dark blue to light blue.
The problem is that when my camera approaches the ground, the shadow plane masks the floor.
Why does this happen, and what can I do to prevent it?
https://codesandbox.io/s/epic-sun-po9j3
https://po9j3.csb.app/
You'd need to post code to check for sure, but it likely happens because three.js sorts the order in which it draws objects based on their centers and their distance from the camera.
You can force a different order by setting Object3D.renderOrder.
three.js also generally draws opaque things before transparent things, so my guess is that your ground plane and your shadow plane are both set to transparent: true. The ground can be set to transparent: false, in which case it will be drawn first.
You might find this article useful. It shows a similar example.
As for why there is a hole: it's the depth buffer. If something in front gets drawn first, the pixels behind it are not drawn. So if the shadow happens to be drawn first, it ends up looking like a hole, because the pixels of the plane behind it are never drawn.
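A minimal sketch of both fixes, assuming material and mesh names like groundMaterial and shadowMesh (placeholders, not the sandbox's actual names):
groundMaterial.transparent = false;  // opaque things are drawn before transparent things
shadowMaterial.transparent = true;   // transparent things are drawn after, sorted by distance
// Or force an explicit draw order:
groundMesh.renderOrder = 0;          // drawn first
shadowMesh.renderOrder = 1;          // drawn after the ground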
I have been experimenting with converting the custom-attributes BufferGeometry example (https://threejs.org/examples/webgl_buffergeometry_custom_attributes_particles.html) into a fly-through animation, and I find that the sprites (including ones I create myself) have dark backgrounds if depthTest is set to true. See the image.
The sprite in the custom-attributes example has a transparent background, but this appears to be ignored when it is rendered with depthTest set to true.
I have tried numerous custom blending rules but cannot find a way to remove the background, only to reduce the effect a bit. The background disappears if depthTest is set to false.
Is this a known limitation? Is there a work around?
I am modifying this question to add clearer images with a different ball sprite (also with a transparent background). This image has depthTest set to true for the custom ShaderMaterial used in the three.js example at https://threejs.org/examples/webgl_buffergeometry_custom_attributes_particles.html.
By comparison, this uses multiple PointsMaterials from a different three.js example (https://threejs.org/examples/webgl_points_sprites.html), also with depthTest set to true and using the PointsMaterial map parameter for the sprite.
As you can see, the second PointsMaterial example works as expected. Because PointsMaterial only accepts one fixed size and color, I need to create 36 different point geometries to render this image.
I would prefer to use the custom shader in the first example (which has custom size and color attributes and requires only one geometry). Is there a way to define a custom shader that supports depthTest the way PointsMaterial does?
WebGL has a depth buffer that's used to work out which pixels are hiding behind other pixels. If a pixel is hiding behind another, it doesn't get rendered. Typically that's what you want: why waste GPU time rendering pixels you're never going to see? Consider the example depth buffer below; white squares are closer to the camera, while darker squares are further away:
If a square were hiding behind another one, you wouldn't see it in the depth buffer; you'd only see the closest one. The depth buffer doesn't know that your squares have partial transparency and that you want some parts to be see-through. This is an issue in other real-time engines too; here is an example from Unity:
The squares that are closer block parts of the squares that are further away in the depth buffer, so the occluded pixels don't get rendered. That's why setting depthTest to false is useful: you're basically telling the renderer to stop testing which pixels are behind others and just render all of them. Here it is again with depthTest = false:
If you must keep depthTest on (for instance, if you want the sprites to hide behind a solid wall), you can set depthWrite to false on the sprite material. The renderer still checks which pixels are behind others, but the pixels from the sprites aren't written to the depth buffer, so no sprite can block another sprite.
depthTest: true; // Tests pixel depth
depthTest: false; // Does not test pixel depth
depthWrite: true; // Writes pixel depth to depthBuffer
depthWrite: false; // Does not write pixel depth to depthBuffer
As for the blending mode, I prefer THREE.AdditiveBlending because it gives the particles a nice glowing effect when they accumulate on top of one another. But that's up to you.
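Putting that together, a minimal sketch of a sprite material with those settings (spriteTexture and the shader sources are placeholders, not the example's actual code):
const spriteMaterial = new THREE.ShaderMaterial({
  uniforms: { pointTexture: { value: spriteTexture } }, // placeholder sprite texture
  vertexShader: vertexShaderSource,     // per-point size/color, as in the example
  fragmentShader: fragmentShaderSource,
  transparent: true,
  depthTest: true,                      // sprites can still hide behind solid geometry
  depthWrite: false,                    // but never occlude one another
  blending: THREE.AdditiveBlending      // optional: glow where particles overlap
});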
Is there a way to correctly display more than one transparent texture in three.js? There is no problem if you render a transparent texture over a non-transparent textured plane below it, but with more than one transparent plane, the nearest one "deletes" the ones below it, as you can see here:
The left picture shows what I want (achieved by adding depthWrite = false to every transparent material); the right one shows what I get by only setting transparent = true on the materials with RGBA textures.
I already tried using alphaTest, but it isn't what I need, and depthWrite alone sometimes can't satisfy my needs (look at the green line bounding the path in the first screenshot, which isn't covered by the house's shadow).
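For reference, the depthWrite workaround mentioned above looks like this on a material (a sketch; the material type and rgbaTexture are placeholders):
const transparentMaterial = new THREE.MeshLambertMaterial({
  map: rgbaTexture,    // placeholder texture with an alpha channel
  transparent: true,
  depthWrite: false    // nearer transparent planes no longer blank out farther ones
});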
I am trying to cast a shadow on a totally transparent plane in SceneKit on OS X. I have been struggling with this problem for several hours and haven't found a solution.
My goal is to generate a screenshot of several objects with a transparent background and just the shadow on an invisible plane.
Do you have any suggestions for how I can do this with Apple's SceneKit?
Do I have to write my own shader, can I make this work with shader modifiers, or can I use built-in functionality?
UPDATE:
I found an alternative solution, for anyone who needs it:
Create a white plane under the 3D model; note that the plane's color must be pure white.
Set the blend mode of the plane's material to SCNBlendModeMultiply.
Set the lighting model of the plane's material to SCNLightingModelLambert.
This works because any color multiplied by pure white (1, 1, 1) returns itself, and the Lambert lighting model will not take the directional light into account, so the plane always shows the background color, which makes it look transparent. Another benefit of this solution is that you don't need to change the light's shadow rendering mode.
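A minimal sketch of those three steps in Swift (the plane size and the scene variable are placeholders):
let planeMaterial = SCNMaterial()
planeMaterial.diffuse.contents = NSColor.white   // must be pure white
planeMaterial.blendMode = .multiply              // SCNBlendModeMultiply
planeMaterial.lightingModel = .lambert           // SCNLightingModelLambert
let shadowPlane = SCNNode(geometry: SCNPlane(width: 10, height: 10))
shadowPlane.geometry!.materials = [planeMaterial]
shadowPlane.eulerAngles.x = -.pi / 2             // lay the plane flat under the model
scene.rootNode.addChildNode(shadowPlane)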
For people who are used to Xcode's inspector, according to SceneKit: What's New:
First, add a plane under your model, then prevent it from writing to the color buffer.
Second, change your light's shadow rendering mode to deferred. Note that you must use a light that can cast shadows.
Oily Guo, your solution works. Here it is in code:
Configuration of the light source:
light.shadowColor = UIColor(red: 0, green: 0, blue: 0, alpha: 0.4)
light.shadowMode = .deferred
And for the floor (i.e. the SCNFloor underneath your objects):
material.diffuse.contents = UIColor.white
material.colorBufferWriteMask = SCNColorMask(rawValue: 0) // draw nothing into the color buffer; the deferred shadow still renders
I do not have an answer to your question; however, I have a workaround:
Render your scene and keep the image in memory.
Change all the materials on your objects to pure black, with no specular.
Change the plane and the sky to a fully white material, and the lights to white.
Render the scene to another image.
On the second image, apply the CIColorInvert and CIMaskToAlpha Core Image filters.
Using Core Image, apply the alpha mask to the first render.
You'll get an image with a correct alpha channel and transparent shadows. You will need to tweak the materials and lights to get the results you want.
The shadow may become lighter at the edges, and the only way around that is to render it as yet another image and fill it with black after the mask-to-alpha step.
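A rough sketch of the masking steps with Core Image (the image variables and the final CIBlendWithAlphaMask compositing step are my assumptions, not part of the workaround above):
import CoreImage

// colorRender: the normal render; maskRender: the black-objects-on-white render
func applyShadowMask(colorRender: CIImage, maskRender: CIImage) -> CIImage? {
    // Invert so the objects and shadows become the white (opaque) part of the mask
    let invert = CIFilter(name: "CIColorInvert")!
    invert.setValue(maskRender, forKey: kCIInputImageKey)

    // Turn the grayscale mask into an alpha channel (white = opaque)
    let toAlpha = CIFilter(name: "CIMaskToAlpha")!
    toAlpha.setValue(invert.outputImage, forKey: kCIInputImageKey)

    // Keep only the masked pixels of the first render, over a clear background
    let blend = CIFilter(name: "CIBlendWithAlphaMask")!
    blend.setValue(colorRender, forKey: kCIInputImageKey)
    blend.setValue(CIImage.empty(), forKey: kCIInputBackgroundImageKey)
    blend.setValue(toAlpha.outputImage, forKey: kCIInputMaskImageKey)
    return blend.outputImage
}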
When I put an object in the scene and resize it via morph targets, as in this example:
https://threejsdoc.appspot.com/doc/three.js/examples/webgl_morphtargets.html
the object disappears when I move the camera away from its original bounds, even though I am still looking at the resized parts.
That means that when I stretch a box to 100 times its size, I still need to be looking at its original center; when I look only at the resized part, it becomes invisible.
Do you have any experience with this behavior?
How can I make the object visible along its full length?
Meshes are frustum-culled based on their "un-morphed" geometry.
You can prevent a mesh from being frustum-culled by setting:
mesh.frustumCulled = false;
three.js r.68