From the 3 black points I found the plane:
const { Vector3, Plane } = require('three')
const points = [new Vector3(0, 0, 0), new Vector3(1, 0, 1), new Vector3(1, 2, 0)]
const plane = new Plane().setFromCoplanarPoints(...points)
But how do I get the Z coordinate of the fourth red point (example: (0.75, 0.75, z)) that lies in the plane?
This doesn't seem to work:
const targetPoint = new Vector3()
plane.projectPoint(new Vector3(0.75, 0.75, 0), targetPoint)
/*
Vector3 {
x: 0.5833333333333334,
y: 0.8333333333333334,
z: 0.16666666666666666
}
*/
An answer with TurfJS would also be perfectly OK.
Just for others to know, I solved it with TurfJS using its planepoint method.
The polygon method takes an array of linear rings as its first parameter, and a linear ring must have identical first and last points, so a triangle is described with 4 points. The properties a, b, c represent the ordered heights at the triangle's vertices.
const turf = require('@turf/turf')
const point = turf.point([0.75, 0.75])
const triangle = turf.polygon([[
[0, 0], [1, 0], [1, 2], [0, 0]
]], {
a: 0,
b: 1,
c: 0
})
const zValue = turf.planepoint(point, triangle) // 0.375
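For anyone who wants to stay in three.js: projectPoint returns the closest point on the plane (an orthogonal projection), which is why both x and y changed in the projectPoint output shown in the question. To get z for a given (x, y) instead, you can solve the plane equation normal · p + constant = 0 for z. A sketch with a helper name of my own, valid as long as the plane is not vertical (plane.normal.z !== 0):
function zOnPlane(plane, x, y) {
  const n = plane.normal
  return -(n.x * x + n.y * y + plane.constant) / n.z
}
zOnPlane(plane, 0.75, 0.75) // 0.375, matching the TurfJS result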
The method applyMatrix4 seems like it does nothing...
Why can I not apply this transformation matrix to my vector?
const vec = new THREE.Vector3(1,1,1)
const geometry = new THREE.BoxGeometry(1,1,1)
const material = new THREE.MeshBasicMaterial({ color: 0xff0000 })
const mesh = new THREE.Mesh(geometry, material)
mesh.rotateX(Math.PI)
const rotatedVec = vec.applyMatrix4(mesh.matrix)
console.log(rotatedVec)
Expectation (after rotating the vector 180° around the X axis):
{x: 1, y: -1, z: -1}
Reality (the vector is unchanged):
{x: 1, y: 1, z: 1}
My mesh's matrix has changed - it is not the identity matrix.
[
[1, 0, 0, 0],
[0, -1, 0, 0],
[0, 0, -1, 0],
[0, 0, 0, 1],
]
Object3D.rotateX() only affects the object's quaternion property. It does not update its local matrix. If you say your matrix has changed, I assume you have checked it at a later point when other engine logic triggers a recalculation.
You can solve this issue by adding mesh.updateMatrix(); after you have called Object3D.rotateX().
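A minimal sketch of that fix, assuming the same vec and mesh as in the question:
mesh.rotateX(Math.PI)
mesh.updateMatrix() // recompose the local matrix from position, quaternion and scale
const rotatedVec = vec.clone().applyMatrix4(mesh.matrix)
console.log(rotatedVec) // ~ {x: 1, y: -1, z: -1}, up to floating point error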
Or even better use Vector3.applyQuaternion(). In this way, you don't have to recompute the matrix because you don't need it anyway.
const rotatedVec = vec.applyQuaternion(mesh.quaternion)
I want to animate a Plane's vertices to fill the screen. (Vertices specifically, as this is the effect I want: I'm hoping to animate each vertex with a short delay so that the plane then fills the screen.)
As a proof of concept, I've got a vertex animating off to a random point using the function below:
tileClick() {
  var geo = this.SELECTED.geometry;
  var mat = this.SELECTED.material as THREE.MeshBasicMaterial;
  // Tween the first vertex and flag the geometry/material as dirty on every frame
  TweenMax.TweenLite.to(geo.vertices[0], 0.3, {
    x: -5,
    y: 5,
    onUpdate: () => {
      geo.verticesNeedUpdate = true; // re-upload the moved vertex positions
      geo.colorsNeedUpdate = true;
      geo.elementsNeedUpdate = true;
      mat.needsUpdate = true;
    },
    ease: TweenMax.Elastic.easeOut.config(1, 0.5)
  });
}
However, now I need to work out the corner points of the camera's current view. In pseudo code: camera.view.getBoundingClientRect();
Plnkr of WIP - https://next.plnkr.co/edit/Jm4D2zgLtiKBGghC
I believe what you need is THREE.Vector3.unproject. With this method, you set the vector to x, y, z in normalized device coordinates, and it converts them to x, y, z in world coordinates:
var vector = new THREE.Vector3();
var zNearPlane = -1;
var zFarPlane = 1;
// Top left corner
vector.set( -1, 1, zNearPlane ).unproject( camera );
// Top right corner
vector.set( 1, 1, zNearPlane ).unproject( camera );
// Bottom left corner
vector.set( -1, -1, zNearPlane ).unproject( camera );
// Bottom right corner
vector.set( 1, -1, zNearPlane ).unproject( camera );
Notice that all inputs are in the [-1, 1] range:
x: -1 = left side of screen
x:  1 = right side of screen
y:  1 = top
y: -1 = bottom
z:  1 = far plane
z: -1 = near plane
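Building on that, here is a sketch of one way to get the four corners at the plane's own depth, so its vertices can be tweened to fill the view. I'm assuming plane is the mesh being animated (the name is mine) and camera is the same camera as above:
const ndcDepth = plane.position.clone().project( camera ).z; // depth of the plane in normalized device coordinates
const corners = [
  new THREE.Vector3( -1,  1, ndcDepth ).unproject( camera ), // top left
  new THREE.Vector3(  1,  1, ndcDepth ).unproject( camera ), // top right
  new THREE.Vector3(  1, -1, ndcDepth ).unproject( camera ), // bottom right
  new THREE.Vector3( -1, -1, ndcDepth ).unproject( camera )  // bottom left
];
// geo.vertices are in the mesh's local space, so convert the world-space corners first
corners.forEach( corner => plane.worldToLocal( corner ) );
Each vertex can then be tweened toward its matching corner with the same TweenLite call as in the question.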
I'm trying to understand rotation with matrices in WebGL.
I have this mat4() matrix and I have to apply these transformations:
m = translate(torsoHeight+1*headHeight, 5, 0.0);
m = mult(m, rotate(theta[head1Id], 1, 0, 0))
m = mult(m, rotate(theta[head2Id], 0, 1, 0));
m = mult(m, translate(0.0, -0.5*headHeight, 0.0));
figure[headId] = createNode( m, head, leftUpperArmId, null);
break;
I did not understand exactly how the mult function works. The first parameter is my matrix.
The theta[] is built in this way :
var theta = [0, 0, 0, 0, 0, 0, 180, 0, 180, 0, 0];
and
var headId = 1;
var head1Id = 1;
var head2Id = 10;
Am I right in thinking that the second parameter is another matrix built with the rotate() function? If so, how does the rotate function work?
rotate and translate are functions that create matrices.
rotate looks like its arguments are (angle, vectorX, vectorY, vectorZ); it creates a matrix that rotates points around the given axis vector.
mult is the standard mathematical multiplication for 4x4 matrices.
You should probably dig into linear algebra tutorials such as https://open.gl/transformations
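To make that more concrete, here is a small plain-JavaScript sketch. It is an illustration of the idea, not the actual library code, and it assumes the angle is given in degrees (as the theta values of 180 suggest):
// Roughly what rotate(angle, 1, 0, 0) builds: a 4x4 rotation about the X axis
function rotateXDeg(angleDeg) {
  const a = angleDeg * Math.PI / 180;
  const c = Math.cos(a), s = Math.sin(a);
  return [
    [1, 0,  0, 0],
    [0, c, -s, 0],
    [0, s,  c, 0],
    [0, 0,  0, 1]
  ];
}
// What mult(A, B) computes: the ordinary 4x4 matrix product A * B
function mult4(A, B) {
  const out = [];
  for (let i = 0; i < 4; i++) {
    out[i] = [];
    for (let j = 0; j < 4; j++) {
      let s = 0;
      for (let k = 0; k < 4; k++) s += A[i][k] * B[k][j];
      out[i][j] = s;
    }
  }
  return out;
}
Because vertices are treated as column vectors, m = mult(translate(...), rotate(...)) means a vertex v is transformed as m * v = T * (R * v): the matrix written last in the chain is applied to the vertex first, and the outermost translate is applied last.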
I have vertices(x,y,z coords) of a polygon as input. How can I render a polygon having these vertices in three.js?
There is this documentation, but it seems to involve Bézier curves. I need a simple straight-edged polygon.
You can create a polygon from vertices with the following code:
var geom = new THREE.Geometry();
var v1 = new THREE.Vector3(0,0,0);
var v2 = new THREE.Vector3(0,500,0);
var v3 = new THREE.Vector3(0,500,500);
geom.vertices.push(v1);
geom.vertices.push(v2);
geom.vertices.push(v3);
geom.faces.push( new THREE.Face3( 0, 1, 2 ) );
geom.computeFaceNormals();
var object = new THREE.Mesh( geom, new THREE.MeshNormalMaterial() );
scene.add(object);
Copy and paste this code in and then change x, y, and z coordinates of v1, v2, and v3 (or however many vertices you need) to the coordinates of your vertices.
Essentially, you are creating vertices using THREE.Vector3 to supply the coordinates and then pushing them to the vertices property of an empty THREE.Geometry();
Code is from this answer
THREE.Geometry has been removed, so try the following method instead:
let coordinates = [
  { x: 1, y: 1, z: 10 },
  { x: 2, y: 1, z: 10 },
  { x: 2, y: 2, z: 10 },
  { x: 1, y: 2, z: 10 }
]
let polyShape = new THREE.Shape(coordinates.map((coord) => new THREE.Vector2(coord.x, coord.y)))
const polyGeometry = new THREE.ShapeGeometry(polyShape);
polyGeometry.setAttribute("position", new THREE.Float32BufferAttribute(coordinates.map(coord => [coord.x, coord.y, coord.z]).flat(), 3))
let polygon = new THREE.Mesh(polyGeometry, new THREE.MeshBasicMaterial({ color: 0xff0000 /* any color you want */, side: THREE.DoubleSide }))
scene.add(polygon);
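If you only need flat triangles and want to stay close to the first answer's snippet, it also maps directly onto BufferGeometry. A sketch, assuming a recent three.js release (the variable names are mine):
const geom = new THREE.BufferGeometry();
// Each consecutive triple of numbers is one vertex (x, y, z)
const vertices = new Float32Array([
  0, 0,   0,    // v1
  0, 500, 0,    // v2
  0, 500, 500   // v3
]);
geom.setAttribute("position", new THREE.BufferAttribute(vertices, 3));
geom.setIndex([0, 1, 2]); // the equivalent of THREE.Face3(0, 1, 2)
geom.computeVertexNormals();
const object = new THREE.Mesh(geom, new THREE.MeshNormalMaterial());
scene.add(object);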
I have created some box geometries in my threejs app and I have successfully drawn a cylinder from the center of one to the center of another using the code below:
function cylinderMesh(pointX, pointY, material) {
  var direction = new THREE.Vector3().subVectors(pointY, pointX);
  // Rotation that maps the cylinder's local Y axis (its height axis) onto the pointX -> pointY direction
  var orientation = new THREE.Matrix4();
  orientation.lookAt(pointX, pointY, new THREE.Object3D().up);
  orientation.multiply(new THREE.Matrix4(1, 0, 0, 0,
                                         0, 0, 1, 0,
                                         0, -1, 0, 0,
                                         0, 0, 0, 1));
  var edgeGeometry = new THREE.CylinderGeometry(2, 2, direction.length(), 8, 1);
  var edge = new THREE.Mesh(edgeGeometry, material);
  edge.applyMatrix(orientation);
  // Place the cylinder at the midpoint between the two endpoints
  edge.position.x = (pointY.x + pointX.x) / 2;
  edge.position.y = (pointY.y + pointX.y) / 2;
  edge.position.z = (pointY.z + pointX.z) / 2;
  return edge;
}
scene.add(cylinderMesh(vertex1, vertex2, globalMaterial));
My question is: how do I keep the cylinder "connected" to the two vertices I provide if they move?
I don't want to use a THREE.Line because I can't control the width of the line and I have noticed weird issues with clipping if the camera gets too close.
Any ideas?
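One possible approach (a sketch of my own, not from the original post): give the cylinder a unit height and rebuild its position, scale and orientation from the current endpoints every frame, instead of baking the transform into the geometry once:
const edgeGeometry = new THREE.CylinderGeometry(2, 2, 1, 8, 1); // unit height, scaled later
edgeGeometry.rotateX(Math.PI / 2); // make the height run along the local Z axis
const edge = new THREE.Mesh(edgeGeometry, globalMaterial);
scene.add(edge);

function updateEdge(pointX, pointY) {
  const direction = new THREE.Vector3().subVectors(pointY, pointX);
  edge.position.copy(pointX).addScaledVector(direction, 0.5); // midpoint between the endpoints
  edge.scale.set(1, 1, direction.length());                   // stretch to the current distance
  edge.lookAt(pointY);                                        // aim the local Z axis at the second point
}

Calling updateEdge(vertex1, vertex2) from the render loop (or whenever the boxes move) keeps the cylinder attached to both endpoints.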