Twisted normals with the Three.js normal shader - r.58 - three.js

I'm attempting to use the Three.js r.58 normal shader to make a displacement map. I have it displacing correctly, but the lighting doesn't seem to be respecting the post-displacement normals, even when I use computeTangents().
When I turn off the displacement, I see that the default normals are definitely funny. Here's a top view of a sphere, lit from the side (the white dot marks a pointLight):
And here's a demo page:
http://meetar.github.io/three.js-normal-map-0/index0.html
What's causing this? And is there documentation for the Three.js normal shader anywhere?

You are not passing in a normalMap, which is required. Try passing in a flat one.
computeTangents() can do strange things on vertices that have discontinuous UVs -- like at the north pole.
The code is the documentation. :-)

The "twisted" normals are the result of each vertex normal being evaluated as the RGB value (255, 255, 255), which corresponds to the tangent space XYZ coordinates (1.0, 1.0, 1.0). This seems to be the default behavior when a three.js normalmap material is used without passing a normal map. If you pass an all-white normal map, you'll see the same behavior.
To pass a normal map to the normalmap shader, add this line to your uniform declarations:
uniforms[ "tNormal" ].value = new THREE.ImageUtils.loadTexture( 'normalmap.png' );
To pass a "flat" normal map, make your "normalmap.png" solid (128, 128, 255) lavender, which normalizes to tangent-space coordinates (0.0, 0.0, 1.0).
For a great breakdown of normal maps including lots of examples, check out this link: http://wiki.polycount.com/NormalMap/

Related

Wrong rendering using CylinderGeometry and MeshPhongMaterial with vertexColors

I'm using CylinderGeometry to create a cylinder, and loop over its faces to set each vertex color according to the vertex y coordinate through a rainbow colormap. Then I use a MeshPhongMaterial on it, setting vertexColors to true.
I also have an AmbientLight and a DirectionalLight.
The cylinder renders correctly except on the parts lit by the directional light:
Please see the code here: https://codepen.io/thatcodepenaccount/pen/YzrgpYp
I looked at examples such as https://threejs.org/examples/#webgl_buffergeometry where using MeshPhongMaterial with some vertex colors seems to work, but I can't find what I'm doing wrong. Is it because I must use BufferGeometry?
Edit: reducing the light intensity from 10 to 1, as suggested in the comments, fixed it.
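For reference, here's a minimal sketch of the working setup (variable names are illustrative, not the exact CodePen code):
// Color each vertex by its y coordinate, then light it with a DirectionalLight at intensity 1.
const geometry = new THREE.CylinderGeometry( 1, 1, 4, 32 );
const position = geometry.attributes.position;
const colors = [];
const color = new THREE.Color();
for ( let i = 0; i < position.count; i ++ ) {
    const t = ( position.getY( i ) + 2 ) / 4;   // map y from [-2, 2] to [0, 1]
    color.setHSL( 0.7 * ( 1 - t ), 1.0, 0.5 );  // simple rainbow ramp
    colors.push( color.r, color.g, color.b );
}
geometry.setAttribute( 'color', new THREE.Float32BufferAttribute( colors, 3 ) );

const material = new THREE.MeshPhongMaterial( { vertexColors: true } );
scene.add( new THREE.Mesh( geometry, material ) );

scene.add( new THREE.AmbientLight( 0xffffff, 0.3 ) );
const light = new THREE.DirectionalLight( 0xffffff, 1 ); // intensity 1, not 10
light.position.set( 5, 10, 7 );
scene.add( light );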

ThreeJS Points (Point Cloud) with Lighting using custom Shader Material

Coded using:
Using ThreeJS v0.130.1
Framework: Angular 12, but that's not relevant to the issue.
Testing on Chrome browser.
I am building an application that gets more than 100K points. I use these points to render a THREE.Points object on the screen.
I found that default THREE.PointsMaterial does not support lighting (the points are visible with or without adding lights to the scene).
So I tried to implement a custom ShaderMaterial. But I could not find a way to add lighting to the rendered object.
Here is a sample of what my code is doing:
Sample App on StackBlitz showing my current attempt
In this code, I am using sample values for the point cloud data, normals, and color, but everything else is similar to my actual application. I can see the 3D object, but it needs proper lighting based on the normals.
I need help or guidance to implement the following:
Add lighting to the custom shader material. I have Googled and tried many things, but with no success so far.
Use the normals to show lighting effects. (In this sample code the normals are fixed to the Y-axis direction, but in my actual application I calculate them with some vector logic.) So computing the normals is already done; I want to use them to show a light shine/shading effect in the custom shader material.
In this sample the color attribute is set to a fixed red, but in my actual application I apply colors to the color attribute from a texture using a UV range.
Please advise how/if I can get lighting based on normals for Point Cloud. Thanks.
Note: I looked at this Stack Overflow question but it only deals with changing the alpha/transparency of points and not lighting.
Adding lighting to a custom material is a very complex process, especially since you could use Phong, Lambert, or Physical lighting methods, and there are a lot of calculations that need to pass from the vertex to the fragment shader. For instance, this segment of shader code is just a small part of what you'd need.
Instead of trying to re-create lighting from scratch, I recommend you create a PlaneGeometry with the material you'd like (Phong, Lambert, Physical, etc...) and use an InstancedMesh to create thousands of instances, just like in this example.
Based on that example, the pseudo-code of how you could achieve a similar effect is something like this:
const count = 100000;
const geometry = new THREE.PlaneGeometry();
const material = new THREE.MeshPhongMaterial();
const mesh = new THREE.InstancedMesh( geometry, material, count );
mesh.instanceMatrix.setUsage( THREE.DynamicDrawUsage ); // will be updated every frame
scene.add( mesh );

// Re-used dummy object for building each instance's matrix
const dummy = new THREE.Object3D();

function update() {
    // Sets the rotation so the planes always face the camera
    dummy.lookAt( camera.position );
    // Updates the position of each plane (x, y, z come from your point cloud data)
    for ( let i = 0; i < count; i ++ ) {
        dummy.position.set( x, y, z );
        dummy.updateMatrix();
        mesh.setMatrixAt( i, dummy.matrix );
    }
    mesh.instanceMatrix.needsUpdate = true; // tell three.js the instance matrices changed
}
The for() loop would be the most expensive part of each frame, so if you need to update it on each frame, you might want to calculate this in the vertex shader, but that's another question altogether.

Properly scaling textures in three.js / proctree.js

Apologies for the vague title, I'm not sure how to describe my issue.
I'm trying to create a forest in three.js with the very cool proctree.js. The library seems to create a 3D model of the tree's trunk and main branches, then adds simple flat textures for the leaves (or 'twigs').
The resulting tree looks very nice from up close but as you zoom out the leaves visually disappear almost entirely. This is a problem as I'm trying to create a dense looking forest. See the following two screengrabs (or this online viewer):
Is there a way to prevent the leaves from becoming very pixelated and thin looking from a distance? Or, to phrase the question differently, how would one create good-looking leaves that look as dense from a distance as they do from up close?
The material used looks like this:
this.twigMaterial = new THREE.MeshStandardMaterial({
color: this.config.twigColor,
roughness: 1.0,
metalness: 0.0,
map: this.textureLoader.load('assets/twig-1.png'),
alphaTest: 0.9
});
Your problem sounds very similar to this one
I'm pretty certain that the smaller-resolution mipmaps (used when you zoom out) are blending your leaf textures and changing the alpha values that the alphaTest threshold compares against. The further away you are, the more of your texture's area is considered "transparent".
You can modify your texture properties as follows to disable mipmaps:
var texture = this.textureLoader.load('assets/twig-1.png');
texture.minFilter = THREE.LinearFilter;
this.twigMaterial = new THREE.MeshStandardMaterial({
color: this.config.twigColor,
roughness: 1.0,
metalness: 0.0,
map: texture,
alphaTest: 0.9
});
However, this might give your leaves an aliased look. Alternatively, you could create your own mipmaps and pass them to your texture as an array for a more custom look, as sketched below.
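The custom-mipmap route looks roughly like this. It's only a sketch: drawTwigLevel is a hypothetical helper that redraws the twig texture at a given size (e.g. with thicker strokes so distant levels stay dense), and the number of levels shown assumes a 256-pixel base texture:
// Provide the whole mipmap chain by hand so far-away levels keep their leaf coverage.
const base = drawTwigLevel( 256 );               // returns a <canvas>
const texture = new THREE.CanvasTexture( base );
texture.generateMipmaps = false;                 // we supply the chain ourselves
texture.mipmaps = [
    base,                                        // level 0
    drawTwigLevel( 128 ),
    drawTwigLevel( 64 ),
    drawTwigLevel( 32 ),
    drawTwigLevel( 16 ),
    drawTwigLevel( 8 ),
    drawTwigLevel( 4 ),
    drawTwigLevel( 2 ),
    drawTwigLevel( 1 )                           // down to 1x1, or the chain is incomplete
];
this.twigMaterial = new THREE.MeshStandardMaterial({
    color: this.config.twigColor,
    roughness: 1.0,
    metalness: 0.0,
    map: texture,
    alphaTest: 0.9
});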

Three.js - CanvasRenderer problems: flat shading?

I'm trying to use CanvasRenderer (three.js) as a fallback for devices not supporting WebGL. Is there some comparison page with explanation what is different and cannot be used with CanvasRenderer?
I'm experiencing two main issues:
Flat shading: lights are completely missing (is MeshPhongMaterial supported?). I don't see any lighting or shadows (are shadows supported in CanvasRenderer?). All I see is the diffuse texture without any lighting. In WebGL my current setup is PointLight, DirectionalLight, soft shadows, antialiasing, and MeshPhongMaterial (with diffuse, bump, spec and env maps):
this.materialM = new THREE.MeshPhongMaterial({
ambient : 0x050505,
color : this.model.color,
specular : 0xcccccc,
shininess : 100,
bumpScale : BUMP_SCALE,
reflectivity : REFLECTIVITY,
});
Transparent polygon edges: I know this can be tweaked with material.overdraw = 0.5, yet it produces other artifacts (it probably just scales the polygons along their normals?), but I can live with this one.
Any help on 1. or some general overview of what is not possible in CanvasRenderer when comparing to WebGLRenderer is greatly appreciated!
three.js r68
CanvasRenderer has limitations.
MeshPhongMaterial is not supported in CanvasRenderer -- it falls back to MeshLambertMaterial.
MeshLambertMaterial is supported, but not when the material has a texture -- it falls back to MeshBasicMaterial. ( MeshBasicMaterial is rendered without regard to scene lights. )
Shadows are not supported.
material.overdraw = 0.5 is helpful in hiding polygon edges when the material is opaque. It may still leave artifacts if the material is transparent.
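For example, a CanvasRenderer-friendly fallback material could look something like this (a rough sketch against the r68-era API, not a drop-in replacement for your Phong setup):
// Lambert without a texture stays lit under CanvasRenderer; adding a map would drop it to MeshBasicMaterial.
var fallbackMaterial = new THREE.MeshLambertMaterial( {
    color: this.model.color
} );
fallbackMaterial.overdraw = 0.5; // hide polygon edge seams on opaque materials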
three.js r.68

Compute normals from displacement map in three.js r.58?

I'm using the normal shader in three.js r.58, which I understand requires a normal map. However, I'm using a dynamic displacement map, so a pre-computed normal map won't work in this situation.
All the examples I've found of lit displacement maps either use flat shading or pre-computed normal maps. Is it possible to calculate the normals dynamically based on the displaced vertices instead?
Edit: I've posted a demo of a sphere with a displacement map showing flat normals:
Here's a link to the github repo with all of my examples illustrating this problem, and the solutions I eventually found:
https://github.com/meetar/three.js-normal-map-0
This answer is based on your comments above.
You can do what you want, but it is quite sophisticated, and you will of course have to modify the three.js 'normal' shader.
Have a look at http://alteredqualia.com/three/examples/webgl_cubes_indexed.html. Look at the fragment shader, and you will see
vec3 normal = normalize( cross( dFdx( vViewPosition ), dFdy( vViewPosition ) ) );
Alteredqualia is using a derivative normal in the fragment shader ( instead of an attribute normal ) because the vertex positions are changing in the vertex shader, and the normal is not known.
What he is doing is calculating the normal using the cross product of the x and y screen-space derivatives of the fragment position.
This will set the normal as the face normal. It will be discontinuous at hard edges.
three.js r.58
What I was describing above is called a "bump map", and it comes by default with the three.js Phong shader. I combined the normalmap shader with the chunks of the Phong shader responsible for bump mapping:
http://meetar.github.io/three.js-normal-map-0/bump.html
Though the normals are a bit noisy, they are basically correct.
You can also calculate a normal map from the displacement map with JavaScript. This results in smooth normals, and is a good option if your displacement map isn't changing too often.
This method uses the code found in this demo: http://mrdoob.com/lab/javascript/height2normal/
Demo here:
http://meetar.github.io/three.js-normal-map-0/index14.html
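If you want to roll that conversion yourself rather than reuse the demo's code, the basic idea looks roughly like this. It's a sketch, not mrdoob's actual implementation: it reads the red channel as height and encodes the neighbouring-pixel gradients as a tangent-space normal (flip the green channel if your normal-map convention differs):
// Convert a height (displacement) map image into a normal map texture.
function heightToNormal( image ) {
    var canvas = document.createElement( 'canvas' );
    canvas.width = image.width;
    canvas.height = image.height;
    var ctx = canvas.getContext( '2d' );
    ctx.drawImage( image, 0, 0 );
    var src = ctx.getImageData( 0, 0, canvas.width, canvas.height );
    var dst = ctx.createImageData( canvas.width, canvas.height );
    var w = canvas.width, h = canvas.height;
    function heightAt( x, y ) {
        x = ( x + w ) % w; y = ( y + h ) % h;       // wrap around at the edges
        return src.data[ ( y * w + x ) * 4 ] / 255; // red channel as height
    }
    for ( var y = 0; y < h; y ++ ) {
        for ( var x = 0; x < w; x ++ ) {
            var dx = heightAt( x - 1, y ) - heightAt( x + 1, y );
            var dy = heightAt( x, y - 1 ) - heightAt( x, y + 1 );
            var i = ( y * w + x ) * 4;
            dst.data[ i     ] = ( dx * 0.5 + 0.5 ) * 255; // x slope -> red
            dst.data[ i + 1 ] = ( dy * 0.5 + 0.5 ) * 255; // y slope -> green
            dst.data[ i + 2 ] = 255;                      // blue approximated as fully "up"
            dst.data[ i + 3 ] = 255;
        }
    }
    ctx.putImageData( dst, 0, 0 );
    var texture = new THREE.Texture( canvas );
    texture.needsUpdate = true;
    return texture;
}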
