Three js: How to normalize a mesh generated by vertices - three.js

I'm somewhat new to Three.js, and my linear algebra days were back in the 90s, so I don't recall much about quaternions. My issue: I have 8 vertices for a cube that I can use to create a custom geometry mesh, but that doesn't set the position / rotation / scale of its world matrix. Therefore it can't be used cleanly by other Three.js modules like controls. I could look up the math and calculate what the position / scale / rotation should be (rotation gets a bit hairy with some fun acos stuff) and create a standard BoxGeometry from that. But it seems like there should be some way to do it via Three.js objects if I can generate the proper matrix to apply. Quaternion's setFromUnitVectors looked interesting, but I'd still have to do some work to generate the vectors. Any ideas would be appreciated, thanks.
Edit: :) So let me try to simplify. I have 8 vertices and I want to create a box geometry. But BoxGeometry doesn't take vertices; it takes width, height, depth (relatively easy to calculate), and then you set the position/scale/rotation. Here's my code thus far:
  5____4
1/___0/|
| 6__|_7
2/___3/
const box = new Box3();
box.setFromPoints(points);
const width = points[1].distanceTo(points[0]);
const height = points[3].distanceTo(points[0]);
const depth = points[4].distanceTo(points[0]);
const geometry = new BoxGeometry(width, height, depth);
mesh = new Mesh(geometry, material);
const center = box.getCenter(new Vector3());
const normalizedCorner = points[0].clone().sub(center);
const quaternion = new Quaternion();
quaternion.setFromUnitVectors(geometry.vertices[0], normalizedCorner);
mesh.setRotationFromQuaternion(quaternion);
mesh.position.copy(center);
The problem is that my rotation is wrong (besides my vectors not being unit vectors). I'm apparently not computing the correct quaternion to rotate my mesh.
Edit: Following WestLangley's suggestion, I'm creating a rotation matrix. However, while it rotates in the correct plane, the angle is off. Here's what I added:
const matrix = new Matrix4();
const widthVector = new Vector3().subVectors(points[6], points[7]).normalize();
const heightVector = new Vector3().subVectors(points[6], points[5]).normalize();
const depthVector = new Vector3().subVectors(points[6], points[2]).normalize();
matrix.set(
widthVector.x, heightVector.x, depthVector.x, 0,
widthVector.y, heightVector.y, depthVector.y, 0,
widthVector.z, heightVector.z, depthVector.z, 0,
0, 0, 0, 1,
);
mesh.quaternion.setFromRotationMatrix(matrix);

Per WestLangley's comments, I wasn't creating my matrix correctly. The correct matrix looks like:
const matrix = new Matrix4();
const widthVector = new Vector3().subVectors(points[7], points[6]).normalize();
const heightVector = new Vector3().subVectors(points[5], points[6]).normalize();
const depthVector = new Vector3().subVectors(points[2], points[6]).normalize();
matrix.set(
widthVector.x, heightVector.x, depthVector.x, 0,
widthVector.y, heightVector.y, depthVector.y, 0,
widthVector.z, heightVector.z, depthVector.z, 0,
0, 0, 0, 1,
);
mesh.quaternion.setFromRotationMatrix(matrix);
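The column-vector construction above can be checked without three.js. Here is a minimal plain-JS sketch of the same idea: build the rotation matrix whose columns are the box's three local axis directions, then convert it to a quaternion. The helpers (normalize, sub, quaternionFromBasis) and the sample point data are my own, standing in for the three.js calls; quaternionFromBasis shows only the trace-positive branch of the standard matrix-to-quaternion conversion.

```javascript
function normalize(v) {
  const len = Math.hypot(v[0], v[1], v[2]);
  return [v[0] / len, v[1] / len, v[2] / len];
}
function sub(a, b) { return [a[0] - b[0], a[1] - b[1], a[2] - b[2]]; }

// Quaternion from a rotation matrix whose columns are the basis vectors
// x, y, z (trace-positive branch only, for brevity).
function quaternionFromBasis(x, y, z) {
  const trace = x[0] + y[1] + z[2];
  const s = 0.5 / Math.sqrt(trace + 1.0);
  return {
    w: 0.25 / s,
    x: (y[2] - z[1]) * s,
    y: (z[0] - x[2]) * s,
    z: (x[1] - y[0]) * s,
  };
}

// Hypothetical axis-aligned unit cube, labeled as in the diagram above:
// corner 6 at the origin, 7 along +x, 5 along +y, 2 along +z.
const points = { 2: [0, 0, 1], 5: [0, 1, 0], 6: [0, 0, 0], 7: [1, 0, 0] };
const widthVector = normalize(sub(points[7], points[6]));
const heightVector = normalize(sub(points[5], points[6]));
const depthVector = normalize(sub(points[2], points[6]));
const q = quaternionFromBasis(widthVector, heightVector, depthVector);
// An axis-aligned box needs no rotation, so q is the identity quaternion.
```

For a rotated set of points, the same construction yields the quaternion that Quaternion.setFromRotationMatrix would produce from the matrix built with matrix.set above.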

Related

Three js: How to get normal of rotated plane

I am trying to get the normal of a rotated plane. My solution is to copy the rotated plane and then read the normal from the copy.
It works when I rotate by only one angle, but it doesn't work when rotating by 2 or 3 angles. jsFiddle
The green one is the copied plane, the purple one the rotated plane.
How can I solve this?
My copy function:
function copyPlane() {
  let copyPlaneGeom = new THREE.PlaneGeometry(3, 3, 3);
  copyPlaneGeom.rotateX(plane.rotation.x);
  copyPlaneGeom.rotateY(plane.rotation.y);
  copyPlaneGeom.rotateZ(plane.rotation.z);
  let copyPlane = new THREE.Mesh(copyPlaneGeom, new THREE.MeshBasicMaterial({color: 0x00ff00}));
  scene.add(copyPlane);
  let normals = copyPlane.geometry.faces[0].normal;
}
I think with that approach you'll always get a vector of (0, 0, 1), because a plane's face normal is always (0, 0, 1) in local space; the object's rotation isn't reflected in it.
Instead, try starting with a Vector3 of 0, 0, 1, and then apply the object's rotation to it:
var originalNormal = new Vector3(0, 0, 1);
// Get the mesh rotation
var objRotation = plane.rotation;
// Apply mesh rotation to vector
originalNormal.applyEuler(objRotation);
Now you have a Vector3 with the updated world normal, instead of the local normal! Read about .applyEuler() here.
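For intuition, here is what .applyEuler() computes for the default 'XYZ' order, sketched in plain JS without three.js (the helper names are mine): the combined rotation matrix is Rx·Ry·Rz, so the vector is rotated about Z first, then Y, then X.

```javascript
// Rotate a vector [x, y, z] about one axis (right-handed, angle in radians).
function rotX(v, a) { const [x, y, z] = v, c = Math.cos(a), s = Math.sin(a); return [x, c * y - s * z, s * y + c * z]; }
function rotY(v, a) { const [x, y, z] = v, c = Math.cos(a), s = Math.sin(a); return [c * x + s * z, y, -s * x + c * z]; }
function rotZ(v, a) { const [x, y, z] = v, c = Math.cos(a), s = Math.sin(a); return [c * x - s * y, s * x + c * y, z]; }

// Equivalent of normal.applyEuler(rotation) for Euler order 'XYZ':
// the combined matrix is Rx * Ry * Rz, so apply Rz, then Ry, then Rx.
function applyEulerXYZ(v, x, y, z) { return rotX(rotY(rotZ(v, z), y), x); }

// The plane's local normal (0, 0, 1) rotated 90° about X points along -Y.
const worldNormal = applyEulerXYZ([0, 0, 1], Math.PI / 2, 0, 0);
```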

Vertex color interpolation artifacts

I display a "curved tube" and color its vertices based on their distance to the plane the curve lies on.
It works mostly fine; however, when I reduce the resolution of the tube, artifacts start to appear in the tube colors.
Those artifacts seem to depend on the camera position. If I move the camera around, sometimes the artifacts disappear. Not sure that makes sense.
Live demo: http://jsfiddle.net/gz1wu369/15/
I do not know if there is actually a problem in the interpolation or if it is just a "screen" artifact.
Afterwards I render the scene to a texture, looking at it from the "top". It then looks like a "deformation" field that I use in another shader, hence the need for continuous color.
I do not know if it is the expected behavior or if there is a problem in my code that sets the vertex colors.
Would using the THREEJS Extrusion tools instead of the tube geometry solve my issue?
const tubeGeo = new THREE.TubeBufferGeometry(closedSpline, steps, radius, curveSegments, false);
const count = tubeGeo.attributes.position.count;
tubeGeo.addAttribute('color', new THREE.BufferAttribute(new Float32Array(count * 3), 3));
const colors = tubeGeo.attributes.color;
const color = new THREE.Color();
for (let i = 0; i < count; i++) {
const pp = new THREE.Vector3(
tubeGeo.attributes.position.array[3 * i],
tubeGeo.attributes.position.array[3 * i + 1],
tubeGeo.attributes.position.array[3 * i + 2]);
const distance = plane.distanceToPoint(pp);
const normalizedDist = Math.abs(distance) / radius;
const t2 = Math.floor(i / (curveSegments + 1));
color.setHSL(0.5 * t2 / steps, .8, .5);
const green = 1 - Math.cos(Math.asin(Math.abs(normalizedDist)));
colors.setXYZ(i, color.r, green, 0);
}
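Independently of the fiddle, the per-vertex color math in the loop above can be sketched on its own. The distanceToPlane helper and the sample plane/radius values are assumptions for illustration; distanceToPlane computes the same signed distance that THREE.Plane.distanceToPoint returns for a plane with a unit normal.

```javascript
// Signed distance from point p to a plane with unit normal n and constant d.
function distanceToPlane(n, d, p) {
  return n[0] * p[0] + n[1] * p[1] + n[2] * p[2] + d;
}

const radius = 2;                               // assumed tube radius
const p = [0, 0, 1];                            // a sample vertex position
const dist = distanceToPlane([0, 0, 1], 0, p);  // plane z = 0
const normalizedDist = Math.abs(dist) / radius; // 0.5 here
// Clamp before asin so vertices farther than one radius don't produce NaN.
const green = 1 - Math.cos(Math.asin(Math.min(normalizedDist, 1)));
```

Since the distance is a continuous function of position, any color discontinuity between adjacent vertices points at the interpolation, not at this computation.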
Low-res tubes with the "Normals" material show a different artifact.
A high-resolution tube hides the artifacts:

Three.js merging mesh/geometry objects

I'm creating a three.js app which consists of a floor (composed of different tiles) and shelving units (more than 5000 of them...). I'm having performance issues and low FPS (lower than 20), and I think it is because I'm creating a separate mesh for every tile and shelving unit. I know that I can leverage geometry/mesh merging to improve performance. This is the code for rendering the floor and shelving units (cells):
// add ground tiles
const tileGeometry = new THREE.PlaneBufferGeometry(
1,
1,
1
);
const edgeGeometry = new THREE.EdgesGeometry(tileGeometry);
const edges = new THREE.LineSegments(edgeGeometry, edgeMaterial);
let initialMesh = new THREE.Mesh(tileGeometry, floorMat);
Object.keys(groundTiles).forEach((key, index) => {
let tile = groundTiles[key];
let tileMesh = initialMesh.clone();
tileMesh.position.set(
tile.leftPoint[0] + tile.size[0] / 2,
tile.leftPoint[1] + tile.size[1] / 2,
0
);
tileMesh.scale.x = tile.size[0];
tileMesh.scale.y = tile.size[1];
tileMesh.name = `${tile.leftPoint[0]}-${tile.leftPoint[1]}`;
// Add tile edges (adds tile border lines)
tileMesh.add(edges.clone());
scene.add(tileMesh);
});
// add shelving units
const cellGeometry = new THREE.BoxBufferGeometry( 790, 790, 250 );
const wireframe = new THREE.WireframeGeometry( cellGeometry );
const cellLine = new THREE.LineSegments(wireframe, shelves_material);
Object.keys(cells).forEach((key, index) => {
let cell = cells[key];
const cellMesh = cellLine.clone();
cellMesh.position.set(
cell["x"] + 790 / 2,
// cell["x"],
cell["y"] + 490 / 2,
cell["z"] - 250
);
scene.add(cellMesh);
});
Also, here is a link to a screenshot from the final result.
I saw this article regarding merging of geometries, but I don't know how to implement it in my case because of the edges, line segments and wireframe objects I'm using..
Any help would be appreciated.
Taking into account #Mugen87's comment, here's a possible approach:
Pretty straightforward merging of the planes
Using a shader material to draw the "borders"
Note: comment out the discard; line to fill the cards with red or whatever material you might want.
JsFiddle demo
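As a sketch of what merging buys you, here is the core idea in plain JS with no three.js: bake every tile's vertices, with its instance offset applied, into one position buffer that can be drawn in a single call. mergeTiles is a made-up helper for illustration; in three.js you would typically reach for the BufferGeometryUtils merge helpers from the examples instead.

```javascript
// tileVerts: one tile's vertices as a flat [x, y, z, ...] array.
// offsets: per-instance [x, y] positions on the floor grid.
function mergeTiles(tileVerts, offsets) {
  const merged = new Float32Array(tileVerts.length * offsets.length);
  offsets.forEach(([ox, oy], i) => {
    for (let j = 0; j < tileVerts.length; j += 3) {
      const k = i * tileVerts.length + j;
      merged[k] = tileVerts[j] + ox;         // translate x
      merged[k + 1] = tileVerts[j + 1] + oy; // translate y
      merged[k + 2] = tileVerts[j + 2];      // z unchanged
    }
  });
  return merged;
}

// Two instances of a tiny 2-vertex "tile", offset to their grid positions:
const merged = mergeTiles([0, 0, 0, 1, 0, 0], [[0, 0], [10, 5]]);
```

The same concatenation works for the edge line segments: merge them into a second buffer rendered as one LineSegments object, instead of cloning edges per tile.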

Is .matrixWorld used internally?

I have two functions whose only difference is that the second one clones originObj.matrixWorld before multiplying it by the 'transform' argument.
The first one does not work but the second one does.
As far as I know, the 'matrix' and 'matrixWorld' properties of Object3D are recalculated every frame, more specifically in renderer.render().
So I thought that assigning an arbitrary matrix to .matrix or .matrixWorld is pointless, since it will be overwritten in renderer.render().
If my understanding were right, both functions should work equally well.
But only the second one works.
What am I misunderstanding?
applyTransform(originObj, target, transform) {
const newTransform = originObj.matrixWorld.multiply(transform);
// decompose newTransformMatrix as position, rotation and scale
const position = new THREE.Vector3();
const quaternion = new THREE.Quaternion();
const scale = new THREE.Vector3();
newTransform.decompose(position, quaternion, scale);
target.position.copy(position);
target.quaternion.copy(quaternion);
target.scale.copy(scale);
}
applyTransform(originObj, target, transform) {
const newTransform = originObj.matrixWorld.clone().multiply(transform);
// decompose newTransformMatrix as position, rotation and scale
const position = new THREE.Vector3();
const quaternion = new THREE.Quaternion();
const scale = new THREE.Vector3();
newTransform.decompose(position, quaternion, scale);
target.position.copy(position);
target.quaternion.copy(quaternion);
target.scale.copy(scale);
}
I'm using threejs r90
There is a built-in function to apply a matrix to an object. The Object3D.applyMatrix method does exactly what your second function does:
Applies the matrix transform to the object and updates the object's position, rotation and scale.
applyMatrix: function ( matrix ) {
this.matrix.multiplyMatrices( matrix, this.matrix );
this.matrix.decompose( this.position, this.quaternion, this.scale );
}
So, just call:
target.applyMatrix(originObj.matrixWorld);
EDIT: Oh, I forgot the transform:
var matrix = new THREE.Matrix4().multiplyMatrices(originObj.matrixWorld, transform);
target.applyMatrix(matrix);
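The difference between the two functions can be reproduced without three.js. A minimal sketch, with 2x2 arrays standing in for Matrix4 and multiplyInPlace mimicking how Matrix4.multiply writes its result into `this`:

```javascript
// Like Matrix4.prototype.multiply, this writes the product back into `a`:
function multiplyInPlace(a, b) {
  const [a0, a1, a2, a3] = a;
  a[0] = a0 * b[0] + a1 * b[2];
  a[1] = a0 * b[1] + a1 * b[3];
  a[2] = a2 * b[0] + a3 * b[2];
  a[3] = a2 * b[1] + a3 * b[3];
  return a;
}

const transform = [0, 1, 1, 0];

// Second function's pattern: clone first, so the original survives.
const matrixWorld = [1, 2, 3, 4];
const newTransform = multiplyInPlace(matrixWorld.slice(), transform);
// matrixWorld is still [1, 2, 3, 4] at this point.

// First function's pattern: no clone, so matrixWorld itself is overwritten.
multiplyInPlace(matrixWorld, transform);
// matrixWorld is now [2, 1, 4, 3] -- the object's cached matrix is corrupted.
```

So the first function mutates originObj.matrixWorld as a side effect, which is why it misbehaves even though matrixWorld gets recomputed on the next render.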

Orbiting a cube in WebGL with glMatrix

https://jsfiddle.net/sepoto/Ln7qvv7w/2/
I have a base set up to display a cube with different colored faces. What I am trying to do is set up a camera and apply a combined X-axis and Y-axis rotation so that the cube spins around both axes concurrently. There seem to be some problems with the matrices I set up, as I can see the blue face doesn't look quite right. There are examples of how this is done using older versions of glMatrix, but the code in those examples no longer works because of changes to vec4 in the glMatrix library. Does anyone know how this can be done using the latest version of glMatrix? I have attached a CDN link to the fiddle.
Thank you!
function drawScene() {
gl.viewport(0,0,gl.viewportWidth, gl.viewportHeight);
gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
mat4.ortho( mOrtho, -5, 5, 5, -5, 2, -200);
mat4.identity(mMove);
var rotMatrix = mat4.create();
mat4.identity(rotMatrix);
rotMatrix = mat4.fromYRotation(rotMatrix, yRot,rotMatrix);
rotMatrix = mat4.fromXRotation(rotMatrix, xRot,rotMatrix);
mat4.multiply(mMove, rotMatrix, mMove);
setMatrixUniforms();
gl.bindBuffer(gl.ARRAY_BUFFER, triangleVertexPositionBuffer);
gl.vertexAttribPointer(shaderProgram.vertexPositionAttribute, triangleVertexPositionBuffer.itemSize, gl.FLOAT, false, 0, 0);
gl.bindBuffer(gl.ARRAY_BUFFER, triangleColorBuffer);
gl.vertexAttribPointer(shaderProgram.vertexColorAttribute, triangleColorBuffer.itemSize, gl.FLOAT, false, 0, 0);
gl.drawArrays(gl.TRIANGLES, 0, triangleVertexPositionBuffer.numItems);
yRot += 0.01;
xRot += 0.01;
}
As the name says, fromYRotation() initializes a matrix to a given rotation. Hence, you need two temporary matrices for the partial rotations, which you can then combine:
var rotMatrix = mat4.create();
var rotMatrixX = mat4.create();
var rotMatrixY = mat4.create();
mat4.fromYRotation(rotMatrixY, yRot);
mat4.fromXRotation(rotMatrixX, xRot);
mat4.multiply(rotMatrix, rotMatrixY, rotMatrixX);
And the reason your blue face was behaving strangely was the missing depth test. Enable it in your initialization method:
gl.enable(gl.DEPTH_TEST);
You don't need to use three matrices:
// you should do allocations outside of the renderloop
var rotMat = mat4.create();
// no need to set the matrix to identity, as
// fromYRotation resets rotMat's contents anyway
mat4.fromYRotation(rotMat, yRot);
mat4.rotateX(rotMat, rotMat, xRot);
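For reference, here is what that two-call sequence computes, sketched with plain 3x3 row-major matrices (the helper names are mine): fromYRotation builds Ry, and glMatrix's rotateX right-multiplies its input by Rx, so the combined matrix is Ry·Rx.

```javascript
function ryMat(a) { const c = Math.cos(a), s = Math.sin(a); return [[c, 0, s], [0, 1, 0], [-s, 0, c]]; }
function rxMat(a) { const c = Math.cos(a), s = Math.sin(a); return [[1, 0, 0], [0, c, -s], [0, s, c]]; }
function mul(A, B) {
  // row-major 3x3 matrix product A * B
  return A.map((row, i) =>
    row.map((_, j) => A[i][0] * B[0][j] + A[i][1] * B[1][j] + A[i][2] * B[2][j]));
}

const yRot = 0, xRot = Math.PI / 2; // sample angles for illustration
// mat4.rotateX(out, a, rad) computes out = a * Rx, hence:
const combined = mul(ryMat(yRot), rxMat(xRot));
// With yRot = 0 the result is just a 90° rotation about X.
```

The order matters: Ry·Rx applies the X rotation in the cube's already-Y-rotated frame, which is what the fiddle's spin intends.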
