Three.js canvas renderer seams between polygons - three.js

I'm already feeling fairly comfortable with this library, but this one has worn me out:
When I try to simply render an untextured mesh exported from Blender to .obj (triangulated, with smoothing groups enabled), the WebGL renderer handles it perfectly. However, I also need this to work with the canvas renderer, and that's where the trouble starts: the polygon edges have seams between them and become partially see-through.
Just to make it clear, here are screenshots:
WebGL Renderer
https://www.monosnap.com/image/OVaQO8yLDU9Wl6ufhADDVCEWg
Canvas renderer
https://www.monosnap.com/image/1AYeyHjWkGx9fQ6vg6xLr0EcV
The mesh is quite complex: ~7k triangles.

When using CanvasRenderer, you need to set
material.overdraw = 0.5; // or some number between 0 and 1
This will help to alleviate the problem.
Note: overdraw used to be a boolean; it is now a float.
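For example, a minimal sketch (r63-era API) that applies this to every mesh in a loaded .obj, where object is the root your OBJ loader returns:
// Set overdraw on every mesh material so CanvasRenderer slightly
// expands each triangle and hides the seams between them.
object.traverse( function ( child ) {
    if ( child instanceof THREE.Mesh ) {
        child.material.overdraw = 0.5; // 0 = off, 1 = maximum overdraw
    }
} );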
three.js r.63


ThreeJS Points (Point Cloud) with Lighting using custom Shader Material

Coded using:
three.js v0.130.1
Framework: Angular 12, but that's not relevant to the issue.
Testing in the Chrome browser.
I am building an application that receives more than 100K points. I use these points to render a THREE.Points object on the screen.
I found that default THREE.PointsMaterial does not support lighting (the points are visible with or without adding lights to the scene).
So I tried to implement a custom ShaderMaterial. But I could not find a way to add lighting to the rendered object.
Here is a sample of what my code is doing:
Sample App on StackBlitz showing my current attempt
In this code, I am using sample values for the point cloud data, normals, and color, but everything else is similar to my actual application. I can see the 3D object, but it needs proper lighting based on the normals.
I need help or guidance to implement the following:
1. Add lighting to the custom shader material. I have Googled and tried many things, with no success so far.
2. Use the normals to show the effects of lighting. (In this sample the normals are fixed to the Y-axis direction, but in the actual application I calculate them based on some vector logic.) So calculating the normals is already done; I want to use them to show a light shine/shading effect in the custom shader material.
3. In this sample the color attribute is set to a fixed red color, but in the actual application I am able to apply colors to the color attribute using a UV range from a texture.
Please advise how/if I can get lighting based on normals for Point Cloud. Thanks.
Note: I looked at this Stack Overflow question, but it only deals with changing the alpha/transparency of points, not lighting.
Adding lighting to a custom material is a very complex process, especially since you could use Phong, Lambert, or Physical lighting methods, and there are a lot of calculations that need to pass from the vertex to the fragment shader. For instance, this segment of shader code is just a small part of what you'd need.
Instead of trying to re-create lighting from scratch, I recommend you create a PlaneGeometry with the material you'd like (Phong, Lambert, Physical, etc.) and use an InstancedMesh to create thousands of instances, just like in this example.
Based on that example, the pseudo-code of how you could achieve a similar effect is something like this:
const count = 100000;
const geometry = new THREE.PlaneGeometry();
const material = new THREE.MeshPhongMaterial();
const mesh = new THREE.InstancedMesh( geometry, material, count );
mesh.instanceMatrix.setUsage( THREE.DynamicDrawUsage ); // will be updated every frame
scene.add( mesh );

const dummy = new THREE.Object3D();

function update() {
    // Set the rotation so each plane is always perpendicular to the camera
    dummy.lookAt( camera.position );

    // Update the position of each plane
    for ( let i = 0; i < count; i ++ ) {
        dummy.position.set( x, y, z ); // x, y, z = this point's coordinates
        dummy.updateMatrix();
        mesh.setMatrixAt( i, dummy.matrix );
    }

    mesh.instanceMatrix.needsUpdate = true; // tell the renderer the matrices changed
}
The for() loop would be the most expensive part of each frame, so if you need to update it on each frame, you might want to calculate this in the vertex shader, but that's another question altogether.

How can I prevent certain instances of a instance geometry from disappearing when center position no longer in view?

I'm learning WebGL, three.js, and GLSL shaders. The scene you see below is my attempt at working with instanced geometry. I have 3 patches of "grass". The grass is actually made of instanced cones placed around the central position of the mesh I'm building.
As you can see, if the central position (marked with a white wireframe cone) of the mesh is no longer in the camera view, all instances disappear.
How can I prevent this?
And to be even more specific: are all the grass patches, or all fire instances, or all particles of the same type, supposed to be instanced at once and placed around the scene as we see fit? My assumption is that they are. Right?
If you are using InstancedBufferGeometry, there are two ways you can deal with frustum culling.
One way is to turn frustum culling off for the mesh:
mesh.frustumCulled = false;
The other option is to set the bounding sphere of the mesh's geometry manually, if it is known in your case:
geometry.boundingSphere = new THREE.Sphere( new THREE.Vector3(), radius );
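For example, a rough sketch of sizing that sphere from your per-instance data (here offsets and coneHeight are assumed names from your setup):
var radius = 0;
for ( var i = 0; i < offsets.length; i += 3 ) {
    // distance of this instance's offset from the patch center
    var d = Math.sqrt( offsets[ i ] * offsets[ i ] +
                       offsets[ i + 1 ] * offsets[ i + 1 ] +
                       offsets[ i + 2 ] * offsets[ i + 2 ] );
    if ( d > radius ) radius = d;
}
radius += coneHeight; // margin so no whole cone pokes outside the sphere
geometry.boundingSphere = new THREE.Sphere( new THREE.Vector3( 0, 0, 0 ), radius );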
three.js r.88

Circles on click using WebGL renderer in three.js

I want to place a circle on an object when you click on it.
I updated this example to work with the WebGL renderer (I basically changed THREE.SpriteCanvasMaterial to THREE.SpriteMaterial), but not only is the circle now a square, it also glitches with the surface of the object. Here's a JSFiddle that demonstrates my problem (click an object to test).
I found some similar questions on stackoverflow, but I can't seem to make it work in my example. Any tips?
You are adding sprites into a moving 3D scene. An easy way to do what you want is to add spheres instead:
var particle = new THREE.Mesh(
    new THREE.SphereGeometry( 10, 10, 10 ),
    new THREE.MeshBasicMaterial( { color: 0xff0000 } )
);
particle.position.copy( intersects[ 0 ].point );
scene.add( particle );
If you want real flat circles, you will have to create a circular geometry and align it correctly to the object so it sits atop the rectangular wall. And if you want circles that do not deform when the object is turned, you will need some custom shader work.
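For the flat-circle route, a minimal sketch (assuming intersects comes from a THREE.Raycaster and the clicked object has no rotation or scale; otherwise transform the face normal into world space first):
var hit = intersects[ 0 ];
var circle = new THREE.Mesh(
    new THREE.CircleGeometry( 10, 32 ),
    new THREE.MeshBasicMaterial( { color: 0xff0000 } )
);
circle.position.copy( hit.point );
// Orient the circle so its +Z axis points along the face normal
circle.lookAt( hit.point.clone().add( hit.face.normal ) );
// Nudge it slightly off the surface to avoid z-fighting with the wall
circle.position.add( hit.face.normal.clone().multiplyScalar( 0.1 ) );
scene.add( circle );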

three.js create texture from cubecamera

When using a cube camera one normally sets the envMap of the material to the cubeCamera.renderTarget, e.g.:
var myMaterial = new THREE.MeshBasicMaterial( {
    color: 0xffffff,
    envMap: myCubeCamera.renderTarget,
    side: THREE.DoubleSide
} );
This works great for meshes that are meant to reflect or refract what the cube camera sees. However, I'd like to simply create a texture and apply that to my mesh. In other words, I don't want my object to reflect or refract. I want the face normals to be ignored.
I tried using a THREE.WebGLRenderTarget, but it won't handle a cube camera. And using a single perspective camera with a WebGLRenderTarget does not give me a 360° texture, obviously.
Finally, simply assigning the cubeCamera.renderTarget to the 'map' property of the material doesn't work either.
Is it possible to do what I want?
three.js r.73
Edit: this is not what the author of the question is looking for; I'll keep my answer below for other people.
Your envmap is already a texture, so there's no need to apply it as a map. Also, cubemaps and regular textures are structurally different, so it won't be possible to swap them, and even if you managed it, the result is probably not what you'd expect.
From what you're asking, I understand you want a static envmap instead of one that updates every frame. If that's the case, simply don't call myCubeCamera.updateCubeMap() in your render function. Instead, place that call at the end of your scene initialization, with the cube camera at your desired position; your envmap will then show only that frame.
See examples below:
Dynamic Cubemap Example
Static Cubemap Example
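A rough sketch of the static variant (r73-era API; the position and resolution are placeholder values):
// Update the cube map once at init time instead of in the render loop,
// so the envMap stays static.
var cubeCamera = new THREE.CubeCamera( 0.1, 1000, 256 ); // near, far, resolution
cubeCamera.position.set( 0, 10, 0 ); // assumed capture position
scene.add( cubeCamera );
cubeCamera.updateCubeMap( renderer, scene ); // called once, NOT in render()

var myMaterial = new THREE.MeshBasicMaterial( {
    color: 0xffffff,
    envMap: cubeCamera.renderTarget,
    side: THREE.DoubleSide
} );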
The answer is: Set the refractionRatio on the material to 1.0. Then face normals are ignored since no refraction is occurring.
In a normal situation where the Cube Camera is in the same scene as the mesh, this would be pointless because the mesh would be invisible. But in cases where the Cube Camera is looking at a different scene, then this is a useful feature.
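A rough sketch of that setup (the CubeRefractionMapping line is my assumption about how to enable the refraction path in this era of the API; the essential part is refractionRatio: 1.0):
// With refraction mapping and refractionRatio = 1.0, rays pass straight
// through, so the face normals effectively no longer matter and the cube
// render target behaves like a plain 360° texture.
myCubeCamera.renderTarget.mapping = THREE.CubeRefractionMapping; // assumed plumbing
var myMaterial = new THREE.MeshBasicMaterial( {
    envMap: myCubeCamera.renderTarget,
    refractionRatio: 1.0,
    side: THREE.DoubleSide
} );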

THREE.ShaderMaterial is seen inverted by the shadow Camera

I have a mesh that I am loading from 3D Studio Max into three.js. I modified three.js to hold another typed array for the binormal data. It all seems to be working fine and dandy until shadows are involved. For some reason the shadow map is wrong, and it seems as if it's rendering the mesh with the faces flipped.
In this example, the shadows show up correctly on the floor because the renderer has:
renderer.shadowMapCullFace = THREE.CullFaceBack;
http://dusanbosnjak.com/test/webGL/new/StojadinCeo/stojadinCeo.html
I can get other shadows to show up on my shader, but self shadowing leads to horrible artifacts, and the shadow that my mesh casts on other meshes is always inverted.
I've tried reversing the order in which the face indices come in (acb instead of abc), which flips the faces. This creates the proper shadow cast, but then the mesh shows up flipped.
What I'm thinking of doing at the moment is exporting a flipped mesh and reversing the cull order in the ShaderMaterial, but it would be wonderful to find out why this is happening.
I basically connected the Phong and shadow-mapping shader chunks with what I had.
Edit:
Here is an updated scene with some shadow casters and shadow receivers on imported meshes:
http://dusanbosnjak.com/test/webGL/new/StojadinCeo/stojadinCeo2.html
light = new THREE.SpotLight(0xaaaaaa);
light.position.set(10,10,10);
light.shadowCameraVisible = true;
light.shadowDarkness = .5;
light.castShadow = true;
light.shadowCameraNear = 1;
light.shadowCameraFar = 250;
light.shadowCameraFov = 57;
light.shadowMapWidth = 2048;
light.shadowMapHeight = 2048;
scene.add(light);
The rest of the meshes just have receiveShadow and castShadow set to true.
The shadow shows up on the ShaderMaterial (I copied the shadow fragment chunk).
A THREE.Mesh() with THREE.CubeGeometry() both casts and receives shadows properly, but the shadow cast by the ShaderMaterial mesh is inverted.
I can't really isolate this to 50 lines of code, as it's a whole import/export process from Max.
I don't understand why the shadow camera would render this one particular mesh inverted while the normal camera renders it correctly, if that is what is happening.
You can zoom out and move the car using WASD.
Unless you changed the default settings in three.js, only back-faces cast shadows. A work-around is to set:
renderer.shadowMapCullFace = THREE.CullFaceBack;
or
renderer.shadowMapCullFace = THREE.CullFaceNone;
But these options can lead to other issues.
The best approach is to make sure every mesh has depth. Avoid planes, like the car roof.
For example, you can add an interior liner to the car roof to give it depth.
Shadow mapping in WebGL can be tricky, so read all you can about it so you will be familiar with the issues involved.
three.js r.66
