THREE.BufferGeometry - vertex normals and face normals

The documentation for THREE.BufferGeometry says:
normal (itemSize: 3)
Stores the x, y, and z components of the face or vertex normal vector of each vertex in this geometry. Set by .fromGeometry().
When does this attribute hold vertex normals and when face normals?
Is it as simple as: if a THREE.MeshMaterial is used the normals are interpreted as face normals, and when a THREE.LineMaterial is used they are treated as vertex normals? Or is it more complicated than that?
I also understood that THREE.FlatShading can be used for rendering a mesh with flat shading (face normals point straight outward).
geometry = new THREE.BoxGeometry( 1000, 1000, 1000 );
material = new THREE.MeshPhongMaterial({
    color: 0xff0000,
    shading: THREE.FlatShading
});
mesh = new THREE.Mesh( geometry, material );
I would say normals are not necessary anymore. Why do my buffer geometries, created for example from a THREE.BoxGeometry, still hold a normal attribute in that case? Is this information still used for rendering, or would removing it from the buffer geometry be a possible optimization?

BufferGeometry normals are vertex normals, and the shader interpolates the normal value for each fragment from the vertices belonging to that face (in most cases a triangle).
When you convert a THREE.BoxGeometry, which has normals computed by default, they stay set up in the resulting BufferGeometry as well, since the geometry has no way to "know" whether you need normals or any other attribute (the material's shader program decides which attributes are used).
You can remove the normals with geometry.removeAttribute("normal").
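To illustrate the point, a minimal sketch with era-appropriate API; THREE.MeshBasicMaterial is used here only as an example of a material whose program never reads normals, it is not from the original question:
// BoxGeometry computes normals by default; fromGeometry() copies them into the BufferGeometry
var geometry = new THREE.BufferGeometry().fromGeometry(new THREE.BoxGeometry(1000, 1000, 1000));
console.log(geometry.getAttribute('normal')); // BufferAttribute with itemSize 3

// if the material's program does not use normals (e.g. an unlit MeshBasicMaterial),
// dropping the attribute simply shrinks the geometry's memory footprint
geometry.removeAttribute('normal');

var mesh = new THREE.Mesh(geometry, new THREE.MeshBasicMaterial({ color: 0xff0000 }));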

Related

Does THREE.ShaderMaterial create one shader per particle? I'm not sure how it works. If yes, how can I share/update data between each particle's shader?

I would like to create a shader that simulates gravity between 2 particles. For this, each particle must know the position of the other particles, update its position accordingly, and therefore "share" its new position with the other particles.
If I understand correctly, when I do:
material = new THREE.ShaderMaterial({
    depthWrite: false,
    blending: AdditiveBlending,
    vertexColors: true,
    vertexShader: galaxyVortexShader,
    fragmentShader: galaxyFragmentShader,
    uniforms: {
        uTime: { value: 0 },
        uSize: { value: 10 * renderer.getPixelRatio() },
        uPositions: { value: positionsVec3 }
    }
});
I create a shader for each particle? The problem is that I send the positions of all the particles once in "uPositions", but if each particle has its own shader, how can they update their position in the uPositions array to share it with the other particles?
What you're describing is demonstrated in the official protoplanets demo. It basically:
1. Calculates all velocities in a shader whose output is a 64x64 texture.
2. Passes that texture to a second shader that uses it to calculate all positions. This way each particle has access to all velocities.
3. When rendering the planets onscreen, they all have access to both the velocity and position textures, so each vertex can access all the data of its adjacent vertices.
Using 64x64 textures gives you data for 4096 unique particles.
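For reference, a rough sketch of that ping-pong setup using the GPUComputationRenderer helper that ships with three.js (examples/ misc/GPUComputationRenderer.js). The shader strings, particleMaterial and renderer below are placeholders, not the demo's actual code:
const WIDTH = 64; // 64 x 64 = 4096 particles
const gpuCompute = new GPUComputationRenderer(WIDTH, WIDTH, renderer);

// RGBA float textures; fill .image.data with the initial positions/velocities
const dtPosition = gpuCompute.createTexture();
const dtVelocity = gpuCompute.createTexture();

const velocityVariable = gpuCompute.addVariable('textureVelocity', velocityShader, dtVelocity);
const positionVariable = gpuCompute.addVariable('texturePosition', positionShader, dtPosition);

// each compute pass can sample BOTH textures, so every particle "sees" every other one
gpuCompute.setVariableDependencies(velocityVariable, [positionVariable, velocityVariable]);
gpuCompute.setVariableDependencies(positionVariable, [positionVariable, velocityVariable]);
gpuCompute.init();

// per frame: run the compute passes, then hand the results to the render material as uniforms
gpuCompute.compute();
particleMaterial.uniforms.texturePosition.value = gpuCompute.getCurrentRenderTarget(positionVariable).texture;
particleMaterial.uniforms.textureVelocity.value = gpuCompute.getCurrentRenderTarget(velocityVariable).texture;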

THREE.js raycaster intersectObject method returns no intersection when checking if point is inside mesh

I want to check whether a point is inside a mesh or not. To do so, I use a raycaster, set its origin to the point and, if the ray intersects the mesh an odd number of times, the point must be inside. Unfortunately, intersectObject always returns no intersection, even in cases where I know that the point is located inside the mesh.
The point's origin is given in world coordinates and the mesh's matrixWorld is up to date too. Also, I set the mesh.material.side to THREE.DoubleSide, so that the intersection from inside should be detected. I tried setting the recursive attribute to true as well, but as expected, this didn't have any effect (since the mesh is a box geometry). The mesh is coming from the Autodesk Forge viewer interface.
Here is my code:
mesh.material.side = THREE.DoubleSide;
const raycaster = new THREE.Raycaster();
let vertex = new THREE.Vector3();
vertex.fromArray(positions, positionIndex);
vertex.applyMatrix4(matrixWorld);
const rayDirection = new THREE.Vector3(1, 1, 1).normalize();
raycaster.set(vertex, rayDirection);
const intersects = raycaster.intersectObject(mesh);
if (intersects.length % 2 === 1) {
    isPointInside = true;
}
The vertex looks like this (and it obviously lies inside the bounding box):
The mesh is a box-shaped room with the following bounding box:
The mesh looks like this:
The geometry of the mesh holds the vertices in the vb. After applying the world matrix, the mesh vertices are correct in world space. Here is a part of the vb list:
Why does the raycaster not return any intersection? Is the matrixWorld of the mesh taken into account when computing the intersections?
Thanks for any kind of help!
Note that Forge Viewer is based on three.js version R71, and it had to modify/reimplement some parts of the library to handle large and complex models (especially architecture and infrastructure designs), so THREE.Mesh objects might have a slightly different structure. In that case I'd suggest raycasting through Forge Viewer's own mechanisms, e.g. viewer.impl.rayIntersect(ray, ignoreTransparent, dbIds, modelIds, intersections);.
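As a rough sketch of what that could look like (whether the viewer fills the intersections array with every hit along the ray may depend on the viewer version, so treat the even/odd test here as an assumption to verify):
const origin = new THREE.Vector3().fromArray(positions, positionIndex).applyMatrix4(matrixWorld);
const direction = new THREE.Vector3(1, 1, 1).normalize();
const ray = new THREE.Ray(origin, direction);

// let the viewer's own raycasting handle its customized mesh structures
const intersections = [];
viewer.impl.rayIntersect(ray, /* ignoreTransparent */ false, /* dbIds */ null, /* modelIds */ null, intersections);

// if all hits along the ray are reported, an odd count means the origin lies inside a closed mesh
const isPointInside = intersections.length % 2 === 1;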

Why is three.js inconsistent about gouraud interpolation?

I want to shade a THREE.BoxBufferGeometry using a simple THREE.MeshLambertMaterial. The material is supposed to use a Lambert illumination model to pick the colors for each vertex (and it does), and then use Gouraud shading to produce smooth gradients on each face.
The Gouraud part is not happening. Instead, the cube's faces are each shaded with one single, solid color.
I have tried various other BufferGeometries and gotten inconsistent results.
For example, if instead I make an IcosahedronBufferGeometry, I get the same problem: each face is one single, solid color.
geometry = new THREE.IcosahedronBufferGeometry(2, 0); // no Gouraud shading.
geometry = new THREE.IcosahedronBufferGeometry(2, 2); // no Gouraud shading.
On the other hand, if I make a SphereBufferGeometry, the Gouraud is present.
geometry = new THREE.SphereBufferGeometry(2, 3, 2); // yes Gouraud shading.
geometry = new THREE.SphereBufferGeometry(2, 16, 16); // yes Gouraud shading.
But then if I make a cube using a PolyhedronBufferGeometry, the Gouraud shading doesn't appear unless I set the detail to something other than 0.
const verticesOfCube = [
    -1,-1,-1, 1,-1,-1, 1, 1,-1, -1, 1,-1,
    -1,-1, 1, 1,-1, 1, 1, 1, 1, -1, 1, 1,
];
const indicesOfFaces = [
    2,1,0, 0,3,2,
    0,4,7, 7,3,0,
    0,1,5, 5,4,0,
    1,2,6, 6,5,1,
    2,3,7, 7,6,2,
    4,5,6, 6,7,4
];
geometry = new THREE.PolyhedronBufferGeometry(verticesOfCube, indicesOfFaces, 1, 0); // no Gouraud shading
geometry = new THREE.PolyhedronBufferGeometry(verticesOfCube, indicesOfFaces, 1, 1); // yes Gouraud shading
I am aware of the existence of the BufferGeometry methods computeFaceNormals() and computeVertexNormals(). Normals are emphatically important here, as they are used to determine the colors for each face and vertex, respectively. But while they help with the Icosahedron, they have no effect on the Box, no matter whether neither, only one, or both are called, in either possible order.
Here is the code I expect to work:
const geometry = new THREE.BoxBufferGeometry(2, 2, 2);
geometry.computeFaceNormals();
geometry.computeVertexNormals();
const material = new THREE.MeshLambertMaterial({
    color: 0xBE6E37
});
const mesh = new THREE.Mesh(geometry, material);
I should be getting a cube whose faces (the real, triangular ones) are shaded with a gradient. First, the face normals should be computed, and then the vertex normals by averaging the normals of the faces formed by them. Here is a triangular bipyramid on which correct Gouraud shading is being applied:
But the code above produces this instead:
At no point does three.js log any errors or warnings to the console.
So what is it that's going on here? The only explanation I can think of is that the Box is actually comprised of 24 vertices, three at each corner of the cube, and that they form faces such that each vertex's computed normal is an average of at most two faces pointing in the same direction. But I can't find that written down anywhere, and that explanation doesn't fly for the Polyhedron where vertices and faces were explicitly specified in code.
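For what it's worth, that hypothesis is easy to probe from the console; this is just a quick, hedged check, not an answer:
const box = new THREE.BoxBufferGeometry(2, 2, 2);
console.log(box.attributes.position.count);            // 24 -- three copies of each of the 8 corners
console.log(box.attributes.normal.array.slice(0, 12));  // the first face's four normals all point the same way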

How to add lights to a Mesh generated by BufferGeometry and drawn as TriangleStrips?

I'm trying to add lights to a scene where there is a Mesh created from a BufferGeometry. The mesh.drawMode is THREE.TriangleStripDrawMode. I don't know why the light is not being applied to the mesh.
There is an example below:
https://jsbin.com/jofasabeji/edit?js,output
Is there a flag to be activated (like face culling)?
Thanks!
Your geometry is missing vertex normals.
You can specify the normals yourself, or -- if you find the result acceptable -- you can call:
geometry.computeVertexNormals();
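If you prefer to specify the normals yourself instead, a minimal sketch (the flat +Z normal assumes a strip lying in the XY plane; adapt it to your actual surface):
// one normal per vertex, matching the 'position' attribute's count
var vertexCount = geometry.attributes.position.count;
var normals = new Float32Array(vertexCount * 3);
for (var i = 0; i < vertexCount; i++) {
    normals[i * 3 + 0] = 0;
    normals[i * 3 + 1] = 0;
    normals[i * 3 + 2] = 1; // flat strip facing +Z
}
geometry.addAttribute('normal', new THREE.BufferAttribute(normals, 3));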
Alternatively, you can avoid setting vertex normals if you set the material property to flat-shading (and your material supports it):
material.shading = THREE.FlatShading;
Also, you need to set a reasonable intensity for your light:
var light = new THREE.PointLight( 0xffffff, 1 );
three.js r.85

Set transparency of face by index in THREE.js

I've managed to set the colour of a mesh face using:
geometry.faces[i].color.setHex('0xff00ff');
Is there a function to set the transparency to true and opacity to say 0.5?
I'm sure there is one, just have no idea of the syntax.
Actually, you cannot achieve that by changing your geometry, because transparency is controlled by materials.
But there is a way to do this.
First, each face has a materialIndex (see the Face manual).
Next, each mesh drawn in a three.js scene has a material, and there is a special material type, THREE.MeshFaceMaterial (see the MeshFaceMaterial manual), which takes an array of materials as its argument.
When faces are drawn, the three.js renderer takes the face's materialIndex and uses the corresponding material from this array (or, if the mesh contains a single material, just that one).
So you could do something like:
var opacMaterial = new THREE.MeshLambertMaterial({
    transparent: true,
    opacity: 0.7
});
var solidMaterial = new THREE.MeshLambertMaterial({
    transparent: false,
    color: new THREE.Color(1, 0, 0)
});
var mesh = new THREE.Mesh(
    geometry,
    new THREE.MultiMaterial([solidMaterial, opacMaterial])
);
By default, if your geometry has materialIndex == 0 for each face, you will see solidMaterial drawn.
If you want to make a face transparent, do something like this:
geometry.faces[i].materialIndex = 1;
Don't forget to update the geometry in the mesh (see the "How to update geometry in mesh" question).
Also, be aware that if a face's materialIndex is greater than the length of the material array, you will get an awkward error deep inside THREE.js.
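Putting it together, a minimal sketch of flipping one face at runtime (groupsNeedUpdate is the Geometry-era flag for materialIndex changes; verify it against your three.js version):
geometry.faces[i].materialIndex = 1;   // 1 = opacMaterial in the array above
geometry.groupsNeedUpdate = true;      // tell three.js the face/material grouping changed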
