Three.js: How to find the vertices of a face by face index?

In my Three.js project I have a mesh that uses a BufferGeometry.
Using a Raycaster I find the intersection of a ray with this mesh; the intersection gives me the index of the face that was hit.
How can I find the vertex positions of this face?

Instead of using the faceIndex, it's easier to use the face property of an intersection object. You can use it like so:
var vA = new THREE.Vector3();
var vB = new THREE.Vector3();
var vC = new THREE.Vector3();
var face = intersection.face;
var geometry = intersection.object.geometry;
var position = geometry.attributes.position;
vA.fromBufferAttribute( position, face.a );
vB.fromBufferAttribute( position, face.b );
vC.fromBufferAttribute( position, face.c );
If you need the vertices in world space, multiply these three vectors by the world matrix of your object.
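For example, a minimal sketch of that world-space step, reusing the intersection object from above:
intersection.object.updateMatrixWorld(); // make sure the world matrix is current
vA.applyMatrix4( intersection.object.matrixWorld );
vB.applyMatrix4( intersection.object.matrixWorld );
vC.applyMatrix4( intersection.object.matrixWorld );
// equivalently: intersection.object.localToWorld( vA ); and so on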
three.js R102

Related

Create points from shape

I'm trying to draw around 3D elements of an object to make them selectable.
In a 2D shape it's pretty easy using Shape.
const shape = new Shape();
if (!points.length) return shape;
for (const point of points) shape.lineTo(point[0], point[1]);
if (mousePos) shape.lineTo(mousePos[0], mousePos[1]);
return shape;
I was thinking that in 3D I could draw around the entities with the mouse, fill in the gaps with a point cloud, iterate over each point with the raycaster so it snaps to the nearest point on the mesh as seen from the camera, and end up with a points shape that fits the underlying mesh.
My question is - if I have a Shape what is the easiest way to create points to cover the area of the shape so I can find the point closest to the camera?
Use BufferGeometry and PointsMaterial to render the points as particles.
The points need vertices with a z position, so give them one.
import * as T from 'three';
const points = [[x1, y1], [x2, y2], [x3, y3] /* ... */]; // your 2D points as [x, y] pairs
const shape = new T.Shape(/* ... your shape from points ... */);
const z = 3; // any z position you want
// because `THREE.Shape` is 2D, give each point a z position
const vertices = new Float32Array(points.reduce((ac, cv) => [...ac, ...cv, z], []));
const geometry = new T.BufferGeometry();
geometry.setAttribute( 'position', new T.BufferAttribute( vertices, 3 ) );
const material = new T.PointsMaterial({ color: new T.Color('red') });
const pointCloud = new T.Points(geometry, material); // Points, not Mesh, so PointsMaterial renders each vertex
scene.add(pointCloud);
https://threejs.org/docs/index.html?q=shape#api/en/core/BufferGeometry
Edit: points is an array of [x, y] pairs; vertices is a flat array of [x, y, z] values.
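For instance, with that layout, two hypothetical 2D points pushed to z = 3 flatten like this:
const points = [ [ 0, 0 ], [ 1, 2 ] ]; // hypothetical input pairs
const z = 3;
const vertices = new Float32Array( points.reduce( ( ac, cv ) => [ ...ac, ...cv, z ], [] ) );
// -> Float32Array [ 0, 0, 3, 1, 2, 3 ]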

Three.js: Draw a vector on plane

I'm a newbie in three.js (and on Stack Overflow).
I tried to find an answer but I couldn't manage this simple thing.
I'm playing with Helpers and Plane.
I want to create a Plane (and its PlaneHelper) and draw an arbitrary vector on this plane.
All is right if the plane's distance from origin is set to 0.
If I give a distance to the plane, the vector is not on the plane.
Here is the commented code I use for this little experiment.
Projecting both the origin and the point onto the plane, I was convinced that arrowHelper_Point would stay on the plane, but it does not.
Where is my mistake? I can not understand it.
// Define ARROW_LENGTH to display ArrowHelper
const ARROW_LENGTH = 5;
// Point (0,0,0)
var origin = new THREE.Vector3(0, 0, 0);
// Axes helper in (0,0,0)
var axesHelperOrigin = new THREE.AxesHelper(100);
scene.add(axesHelperOrigin);
// Define a plane by the normal, color and distance from (0,0,0)
var vectorNormal = {
normal: new THREE.Vector3(1, 1, 0).normalize(),
color: "rgb(255, 255, 0)",
colorNormal: "rgb(255,100,0)",
colorVector: "rgb(194, 27, 255)",
distance: -3,
};
// Create Plane from the normal and distance
var plane = new THREE.Plane(vectorNormal.normal, vectorNormal.distance);
// Add PlaneHelper to scene
var planeHelper = new THREE.PlaneHelper(plane, 100, vectorNormal.color);
scene.add(planeHelper);
// Add ArrowHelper to display normal
// Find the projection of origin on plane
var originOnPlane = plane.projectPoint(origin);
var arrowHelper_Normal = new THREE.ArrowHelper(vectorNormal.normal, originOnPlane, ARROW_LENGTH, vectorNormal.colorNormal);
scene.add(arrowHelper_Normal);
// Define a point "random"
var point = new THREE.Vector3(5, -2, 6);
// Project the point on plane
var pointOnPlane = plane.projectPoint(point);
// Draw ArrowHelper to display the pointOnPlane, from originOnPlane
var arrowHelper_Point = new THREE.ArrowHelper(pointOnPlane.normalize(), originOnPlane, ARROW_LENGTH, vectorNormal.colorVector);
scene.add(arrowHelper_Point);
EDIT: OK, I think I found the error.
Looking at Get direction between two 3d vectors using Three.js?, I need the vector between the two points:
var dir = new THREE.Vector3();
dir.subVectors(pointOnPlane, originOnPlane).normalize();
And use dir as the arrow direction.
Sorry for asking such an obvious thing.
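Put together, the corrected arrow (same variables as in the snippet above) would look roughly like this:
// use the in-plane direction instead of the normalized projected point
var arrowHelper_Point = new THREE.ArrowHelper( dir, originOnPlane, ARROW_LENGTH, vectorNormal.colorVector );
scene.add( arrowHelper_Point );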

Raycaster does not find intersect even though it should

I am using a raycaster to project a point onto a face, but it doesn't seem to work: taking point (25, 25, 300) and direction (0, 0, -1), the raycaster doesn't find an intersection with a box of size (30, 30, 30) located at (0, 0, 0). Am I doing something wrong?
var geometry = new THREE.BoxGeometry(30, 30, 30);
var material = new THREE.MeshBasicMaterial( );
var mesh = new THREE.Mesh(geometry, material);
var dir = new THREE.Vector3(0,0,-1);
var p = new THREE.Vector3(25,25,300);
var raycaster = new THREE.Raycaster(p, dir);
var intersects = raycaster.intersectObjects(mesh); // returns an empty array
There are two problems with your example. First, you are using the method raycaster.intersectObjects, which takes an array as its argument, when you should be using raycaster.intersectObject, which takes a single object.
Second, the ray misses the mesh: the box spans -15 to 15 on each axis, so a ray starting at (25, 25, 300) and pointing along (0, 0, -1) passes outside it.
Try var p = new THREE.Vector3(15, 15, 300); instead.
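A minimal corrected sketch (the origin here is a hypothetical value chosen to sit clearly inside the ±15 footprint):
var geometry = new THREE.BoxGeometry( 30, 30, 30 );
var material = new THREE.MeshBasicMaterial();
var mesh = new THREE.Mesh( geometry, material );
mesh.updateMatrixWorld(); // the raycaster tests against the world matrix
var dir = new THREE.Vector3( 0, 0, -1 );
var p = new THREE.Vector3( 10, 10, 300 ); // inside the box's x/y extent
var raycaster = new THREE.Raycaster( p, dir );
var intersects = raycaster.intersectObject( mesh ); // single object, so intersectObject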

Applying a matrix in Three.js does not do what I expect

For a project I am working on, I am trying to get a 3D model of a building visible in a browser. For each element of the building I have vertices, indices and a matrix3d; this information comes from an application that uses OpenGL to show the elements in an offline program.
Now I am trying to add these elements to my Three.js scene.
I am at the point where I can add elements to the scene defined by the vertices and indices, and I can see them using materials and lights, but I cannot rotate and translate them into the right place. For example, I add an element like this:
var m242242255255 = new THREE.MeshPhongMaterial({color:0xf2f2ff, transparent:true, opacity:1, side:THREE.DoubleSide});
var geometry = new THREE.BufferGeometry();
geometry.addAttribute('position', new THREE.BufferAttribute(new Float32Array([821,-15,2825.1,-821,-15,2825.1,-821,-39,2825.1,821,-39,2825.1,-821,-39,54,-821,-15,54,821,-15,54,821,-39,54,-875,-54,0,-821,-54,54,-821,-54,2825.1,821,-54,54,-875,-54,2879.1,821,-54,2825.1,875,-54,0,875,-54,2879.1,875,0,0,821,0,54,821,0,2825.1,-821,0,54,875,0,2879.1,-821,0,2825.1,-875,0,0,-875,0,2879.1]), 3));
geometry.setIndex(new THREE.BufferAttribute(new Uint16Array([8,9,10,9,8,11,12,10,13,10,12,8,11,14,13,14,11,8,13,15,12,15,13,14,16,17,18,17,16,19,20,18,21,18,20,16,19,22,21,22,19,16,21,23,20,23,21,22,8,22,16,16,14,8,14,16,20,20,15,14,15,20,23,23,12,15,12,23,22,22,8,12,13,18,17,17,11,13,11,17,19,19,9,11,9,19,21,21,10,9,10,21,18,18,13,10]), 1));
var mesh = new THREE.Mesh(geometry, m242242255255);
mesh.matrixAutoUpdate = false;
mesh.applyMatrix(new THREE.Matrix4().set(0,0,-1,0, -0.42262,-0.90631,0,0, -0.90631,0.42262,0,0, 64754.68,15569.13,-4647.5,1));
mesh.updateMatrix();
scene.add(mesh);
The element shows up in my scene and it looks rotated, but it is not translated to its correct position.
I can apply a translation before adding the mesh to the scene, but it feels like that should not be necessary.
mesh.applyMatrix(new THREE.Matrix4().makeTranslation(-64754.68, -15569.13, -4647.5));
mesh.updateMatrix();
It also looks like the element is rotated around the wrong axis: it is rotated around the x-axis instead of the z-axis. Can someone tell me what I am doing wrong? Should I change the matrix first to be able to use it in Three.js?
Edit:
I just found out that I had to invert my matrix to correct the rotation problem. So I now have:
var geometry = new THREE.BufferGeometry();
geometry.addAttribute('position', new THREE.BufferAttribute(new Float32Array([821,-15,2825.1,-821,-15,2825.1,-821,-39,2825.1,821,-39,2825.1,-821,-39,54,-821,-15,54,821,-15,54,821,-39,54,-875,-54,0,-821,-54,54,-821,-54,2825.1,821,-54,54,-875,-54,2879.1,821,-54,2825.1,875,-54,0,875,-54,2879.1,875,0,0,821,0,54,821,0,2825.1,-821,0,54,875,0,2879.1,-821,0,2825.1,-875,0,0,-875,0,2879.1]), 3));
geometry.setIndex(new THREE.BufferAttribute(new Uint16Array([8,9,10,9,8,11,12,10,13,10,12,8,11,14,13,14,11,8,13,15,12,15,13,14,16,17,18,17,16,19,20,18,21,18,20,16,19,22,21,22,19,16,21,23,20,23,21,22,8,22,16,16,14,8,14,16,20,20,15,14,15,20,23,23,12,15,12,23,22,22,8,12,13,18,17,17,11,13,11,17,19,19,9,11,9,19,21,21,10,9,10,21,18,18,13,10]), 1));
var mesh = new THREE.Mesh(geometry, m242242255255);
mesh.matrixAutoUpdate = false;
var matrix = new THREE.Matrix4();
matrix.set(0,0,-1,0,-0.42262,-0.90631,0,0,-0.90631,0.42262,0,0,64754.68,15569.13,-4647.5,1);
matrix.getInverse(matrix);
mesh.applyMatrix( matrix );
mesh.updateMatrix();
mesh.applyMatrix( new THREE.Matrix4().makeTranslation( 64754.68, 15569.13, -4647.5 ) );
mesh.updateMatrix();
scene.add(mesh);
But I still have a problem with translating using the matrix. How can I avoid updating the mesh twice?
You need to specify your matrix elements by rows, like so:
matrix.set( n11, n12, n13, n14,
n21, n22, n23, n24,
n31, n32, n33, n34,
n41, n42, n43, n44 );
It is done this way so it is human-readable.
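For example, a sketch of what that means for the matrix in the question, assuming the 16 numbers from the OpenGL application are in column-major order (as OpenGL stores them):
var matrix = new THREE.Matrix4();
// fromArray() reads column-major data, so the original values can be passed as-is
matrix.fromArray( [ 0, 0, -1, 0, -0.42262, -0.90631, 0, 0, -0.90631, 0.42262, 0, 0, 64754.68, 15569.13, -4647.5, 1 ] );
// the equivalent set() call, written row by row:
// matrix.set(  0, -0.42262, -0.90631, 64754.68,
//              0, -0.90631,  0.42262, 15569.13,
//             -1,  0,        0,       -4647.5,
//              0,  0,        0,        1 );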
three.js r.76
Suppose you rotate in the x, y, z sequence:
rotationMatrix = new THREE.Matrix4().multiplyMatrices(new THREE.Matrix4().makeRotationY(rV.y), new THREE.Matrix4().makeRotationX(rV.x));
rotationMatrix.premultiply(new THREE.Matrix4().makeRotationZ(rV.z));
matrix.copy(rotationMatrix).setPosition(vector3); // vector3 is the translation
From the documentation on modifying the object's matrix directly:
Note that matrixAutoUpdate must be set to false in this case, and you should make sure not to call updateMatrix. Calling updateMatrix will clobber the manual changes made to the matrix, recalculating the matrix from position, scale, and so on.
You'll find that after you call mesh.updateMatrix(), the mesh's transformation matrix is different from the one you set. You can verify this by comparing matrix.elements to mesh.matrixWorld.elements; they will be the same once you remove the updateMatrix call.
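In short, a sketch of the documented pattern (with the matrix prepared as above):
mesh.matrixAutoUpdate = false;
mesh.matrix.copy( matrix );
// no mesh.updateMatrix() here: it would rebuild the matrix from
// position, quaternion and scale and overwrite the copied values
scene.add( mesh );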

Inconsistent surface behaviour in Three.js

When I set a point light on a THREE.BoxGeometry object, it looks like this:
THREE.BoxGeometry with point light
var light = new THREE.PointLight (0xffffff, 1, 100);
light.position.set (10, 10, 10);
scene.add (light);
var geometry = new THREE.BoxGeometry (1, 1, 1);
var material = new THREE.MeshPhongMaterial ();
var cube = new THREE.Mesh (geometry, material);
scene.add (cube);
When I set the same point light on a THREE.PolyhedronGeometry object, it looks like this:
THREE.PolyhedronGeometry with point light
var light = new THREE.PointLight (0xffffff, 1, 100);
light.position.set (10, 10, 10);
scene.add (light);
var vertices = [-1,-1,-1,1,-1,-1,1,1,-1,-1,1,-1,-1,-1,1,1,-1,1,1,1,1,-1,1,1];
var faces = [2,1,0,0,3,2,0,4,7,7,3,0,0,1,5,5,4,0,1,2,6,6,5,1,2,3,7,7,6,2,4,5,6,6,7,4];
var geometry = new THREE.PolyhedronGeometry (vertices, faces, 1, 0);
var material = new THREE.MeshPhongMaterial ();
var cube = new THREE.Mesh (geometry, material);
scene.add (cube);
I want to know where this behaviour comes from and how I can make the polyhedron's faces look as nice as the box's.
I read that it might be related to geometry.computeFaceNormals(), so I tried it out, but it doesn't make any difference.
When light behaves unexpectedly on a surface, the first candidates to look at are the normals.
This holds for the box faces:
boxGeometry.faces[i].normal.equals(boxGeometry.faces[i].vertexNormals[j]); // true
So the box has a single, simple normal for each face.
The polyhedron, however, has a face normal that differs from the vertex normals:
polyhedronGeo.faces[i].normal.equals(polyhedronGeo.faces[i].vertexNormals[j]); // not true
and some of the vertex normals are not even equal to each other:
polyhedronGeo.faces[i].vertexNormals[j].equals(polyhedronGeo.faces[i].vertexNormals[k]);
// not true for some j, k
That is why the lighting looks soft and shadowy: the shader interpolates the normal across the face from the vertex normals.
To make the polyhedron look like the box, change the vertex normals to match the face normal.
As for
geometry.computeFaceNormals();
it only computes the face normals, not the vertex normals.
There is another function,
geometry.computeVertexNormals();
but that would just recreate the vertex normals as they already are in the polyhedron.
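A sketch of that suggestion, using the same old Geometry API as the comparisons above and assuming polyhedronGeo already has its vertexNormals populated:
polyhedronGeo.faces.forEach( function ( face ) {
  // copy the face normal into all three vertex normals so nothing gets interpolated
  face.vertexNormals[ 0 ].copy( face.normal );
  face.vertexNormals[ 1 ].copy( face.normal );
  face.vertexNormals[ 2 ].copy( face.normal );
} );
polyhedronGeo.normalsNeedUpdate = true; // flag the normals for re-upload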
Thanks Derte. Your answer got me closer to the point. With better keywords I found this: https://github.com/mrdoob/three.js/issues/1982
The answer to my question is this line, enabling flat shading for "free forms":
material.shading = THREE.FlatShading;
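For reference, a minimal sketch of the fix in context; in newer three.js releases the shading constant has been replaced by a boolean material flag:
var material = new THREE.MeshPhongMaterial();
material.shading = THREE.FlatShading; // older releases, as in this thread
// newer releases use a flag instead:
// var material = new THREE.MeshPhongMaterial( { flatShading: true } );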
