Is there a way to rotate a generated linestring geometry around one of its points? I've built a linestring that points north (only adding length to one coordinate), but I now need to rotate it to a given compass heading.
Geometry objects don't seem to have the ability to be rotated around a point (OL2 could?).
What can I do to rotate this geometry?
I eventually went with generating the geometry dynamically and solving the trigonometry myself.
Given the length of the current linestring segment and the angle in radians, I worked out how to offset the coordinates when extending the linestring geometry so the segments are angled correctly.
calculateCoordinateOffset = function(length, angle) {
    // Heading is measured clockwise from north, so sin() yields the
    // east-west (X) offset and cos() the north-south (Y) offset.
    var x = length * Math.sin(angle);
    var y = length * Math.cos(angle);
    return [x, y];
};
Then I add X and Y to the coordinates of the last vertex and append the resulting coordinate to the linestring geometry (addCoordinates()).
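For illustration, a minimal sketch of that extension step, assuming an OpenLayers ol.geom.LineString named line (getCoordinates() and appendCoordinate() are the methods I'd reach for):
var coords = line.getCoordinates();
var last = coords[coords.length - 1];
// Extend by 50 map units on a 45° heading (radians, clockwise from north)
var offset = calculateCoordinateOffset(50, Math.PI / 4);
line.appendCoordinate([last[0] + offset[0], last[1] + offset[1]]);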
Any feedback would be good. My maths is traditionally VERY bad.
There's another answer to this question, but after implementing it I saw that the positions were not true representations of the absolute x, y, z locations of the vertices. I examined the mesh and looked at the _corners array in the geometry (BoxGeometry) property, where I can also find the values. But if the cube is rotating on two different axes and moving on the X and Z axes, the Y value of a corner position never changes, so these are not accurate values of the corners' x, y and z positions. I can visibly see the corner positions moving on all three axes.
How else can I derive accurate, absolute positions of these corners? So far I cannot find properties that hold them. Will I need to apply some math using the rotation values together with these corner positions?
You only need to get the world matrix of your cube (matrixWorld, which contains all of its transformations) and apply it to a Vector3 holding the original vertex position:
const w = 10;
const h = 15;
const d = 7;
const geom = new THREE.BoxBufferGeometry(w, h, d);
const mat = new THREE.MeshBasicMaterial();
const box = new THREE.Mesh(geom, mat);
// Make sure the world matrix is current (the renderer normally does this every frame)
box.updateMatrixWorld(true);
// Get the world matrix transformation of the box
const boxMatrix = box.matrixWorld;
// Set the initial (untransformed) position of the desired corner into a Vector3
const vertex = new THREE.Vector3(w / 2, h / 2, d / 2);
// Apply the matrix transformation to the vector
vertex.applyMatrix4(boxMatrix);
// Read the result: the corner's absolute world-space position
console.log(vertex);
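If you need all eight corners at once, the same transform applies to every signed half-extent combination; a small sketch reusing box, w, h and d from above:
const corners = [];
[-1, 1].forEach(function (sx) {
  [-1, 1].forEach(function (sy) {
    [-1, 1].forEach(function (sz) {
      corners.push(new THREE.Vector3(sx * w / 2, sy * h / 2, sz * d / 2)
        .applyMatrix4(box.matrixWorld));
    });
  });
});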
I have a THREE.Plane plane which is intersected by a number of THREE.Line3 lines[].
Using only this information, how can I acquire 2D coordinates for those points?
Edit, for a better understanding of the problem:
The 2D coordinates are relative to the plane itself: imagine the 3D plane as a Cartesian plane drawn on a blackboard; it is pretty much a 3D drawing of a 2D plane. What I want to find are the X, Y values of points previously projected onto this Cartesian plane. But those points are 3D, just like the plane.
You don't have enough information. In this answer I'll explain why, and provide more detail to achieve what you want, should you be able to supply the necessary information.
First, let's create a plane. Like you, I'm using Plane.setFromNormalAndCoplanarPoint. I'm treating the co-planar point as the origin (0, 0) of the plane's Cartesian space.
let normal = new Vector3(Math.random(), Math.random(), Math.random()).normalize()
let origin = new Vector3(Math.random(), Math.random(), Math.random()).setLength(10)
let plane = new Plane().setFromNormalAndCoplanarPoint(normal, origin)
Now, we create a random 3D point, and project it onto the plane.
let point1 = new Vector3(Math.random(), Math.random(), Math.random()).normalize()
let projectedPoint1 = new Vector3()
plane.projectPoint(point1, projectedPoint1)
The projectedPoint1 variable is now co-planar with your plane. But this plane is infinite, with no discrete X/Y axes. So currently we can only get the distance from the origin to the projected point.
let distance = origin.distanceTo(projectedPoint1)
In order to turn this into a Cartesian coordinate, you need to define at least one axis. To make this truly random, let's compute a random +Y axis:
let tempY = new Vector3(Math.random(), Math.random(), Math.random())
let pY = new Vector3()
plane.projectPoint(tempY, pY)
pY.sub(origin).normalize() // turn it into an in-plane direction from the plane's origin
Now that we have +Y, let's get +X:
let pX = new Vector3().crossVectors(pY, normal)
pX.normalize()
Now we can measure the plane-projected point along the axis vectors to get its Cartesian coordinates.
let rel = projectedPoint1.clone().sub(origin) // the point, relative to the plane's origin
let x = rel.dot(pX)
let y = rel.dot(pY)
Note that because pX and pY are unit vectors, the dot product yields a signed coordinate directly: positive when the point lies on the axis's positive side, negative when it lies on the negative side.
Also, the clone-ing and intermediate variables above are there to make the steps explicit. This is not an efficient way to perform the operation, but I'll leave optimization up to you.
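As a quick sanity check (my own addition, not part of the steps above), rebuilding the 3D point from its 2D coordinates should land back on the projected point:
let rebuilt = origin.clone().addScaledVector(pX, x).addScaledVector(pY, y)
console.log(rebuilt.distanceTo(projectedPoint1)) // ~0, up to floating-point error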
EDIT: My original logic for determining the sign of the value was flawed; it compared normalized vectors for exact equality, which floating-point math almost never satisfies. I've corrected it to take the sign from the dot product against the axis direction.
Even after hours of googling, I can't really get my head around this. Maybe somebody here can help me a bit with this.
I basically want to determine the Y rotation of an object (which is always in the viewport's center) relative to my camera. Imagine the object standing on the center of a record player/turntable that slowly rotates around its Y axis, with the camera always facing the object's center while OrbitControls changes the camera's position around it. Equivalently, imagine the camera not moving and the turntable turning: one revolution takes this Y rotation from 0° to 360°.
For example, this Y rotation would be:
0° when cam's position is [x=0, y=0, z=100], or [x=0, y=100, z=200] (the cam's y position doesn't matter, it always looks down/up to the group's center),
45° when cam's position is [x=100, y=0, z=100] or [x=100, y=200, z=100],
90° when cam's position is [x=100, y=0, z=0] or [x=200, y=100, z=0], etc.
Thing is, both of these can have some pretty random positions and rotations in the world coordinate system, so it's not a given that the object's position is [x=0, y=0, z=0].
Any ideas? Thanks a lot!
I'm not sure if I'm being helpful, but perhaps Object3D.getWorldQuaternion and the relative rotation between the two world quaternions might help?
Something like:
const cameraQuaternion = new THREE.Quaternion();
camera.getWorldQuaternion(cameraQuaternion);
const targetQuaternion = new THREE.Quaternion();
target.getWorldQuaternion(targetQuaternion);
// Relative rotation from target to camera (conjugate equals inverse for unit quaternions)
const delta = targetQuaternion.clone().conjugate().multiply(cameraQuaternion);
// 'YXZ' order makes .y the heading component
const euler = new THREE.Euler().setFromQuaternion(delta, 'YXZ');
console.log(euler.y / Math.PI * 180);
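If the quaternion route doesn't give the angle you expect, here is a position-based sketch (my alternative, matching the headings in your examples: 0° along +Z, growing toward +X, camera height ignored):
const targetPos = new THREE.Vector3();
target.getWorldPosition(targetPos);
const camPos = new THREE.Vector3();
camera.getWorldPosition(camPos);
// Heading of the camera around the target's Y axis
let heading = Math.atan2(camPos.x - targetPos.x, camPos.z - targetPos.z) * 180 / Math.PI;
if (heading < 0) heading += 360; // map to [0°, 360°)
console.log(heading);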
I want to have a DOM node track a particle in my THREE.js simulation. My simulation is built with the Points object, using a bufferGeometry. I'm setting the positions of each vertex in the render loop. Over the course of the simulation I'm moving / rotating both the camera and the Points object (through its parent Object3d).
I can't figure out how to get reliable screen coordinates for any of my particles. I've followed the instructions on other questions, like Three.JS: Get position of rotated object, and Converting World coordinates to Screen coordinates in Three.js using Projection, but none of them seem to work for me. At this point I can see that the calculated projections of the vertices are changing with my camera movements and object rotations, but not in a way that I can actually map to the screen. Also, sometimes two particles that neighbor each other on the screen will yield wildly different projected positions.
Here's my latest attempt:
const { x, y, z } = layout.getNodePosition(nodes[nodeHoverTarget].id)
var m = camera.matrixWorldInverse.clone()
var mw = points.matrixWorld.clone()
var p = camera.projectionMatrix.clone()
var modelViewMatrix = m.multiply(mw)
var position = new THREE.Vector3(x, y, z)
var projectedPosition = position.applyMatrix4(p.multiply(modelViewMatrix))
console.log(projectedPosition)
Essentially I've replicated the operations in my shader to derive gl_Position.
projectedPosition is where I'd like to store the screen coordinates.
I'm sorry if I've missed something obvious... I've tried a lot of things but so far nothing has worked :/
Thanks in advance for any help.
I figured it out...
var position = new THREE.Vector3(x, y, z)
var projectedPosition = position.applyMatrix4(points.matrixWorld).project(camera)
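To pin a DOM node to the particle, the projected result (normalized device coordinates, -1 to 1 on each axis) still needs mapping to pixels. A minimal sketch, assuming the renderer's canvas fills the viewport:
const halfW = renderer.domElement.clientWidth / 2;
const halfH = renderer.domElement.clientHeight / 2;
const screenX = projectedPosition.x * halfW + halfW;
const screenY = -projectedPosition.y * halfH + halfH; // NDC Y points up, CSS Y points down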
I have three 3D (x, y, z) points. I get them by tracking the corners of an object with a Kinect. I now want to translate and rotate a 3D model accordingly.
I get roll and pitch by doing this (I am using openframeworks.cc, so some of the class methods might seem strange):
ofVec3f v10 = pointB - pointA;
ofVec3f v20 = pointC - pointA;
v10.normalize();
v20.normalize();
//create rotation matrix for roll+pitch relative to up vector 0,0,1
ofVec3f normaleVec = v10.crossed(v20);
ofVec3f fromVec = ofVec3f(0,0,1);
ofVec3f toVec = normaleVec;
mMR0.makeRotationMatrix(fromVec,toVec);
To get the heading/yaw I do this:
ofVec3f myV0_flat = avePointA*mMR0.getInverse();
ofVec3f myV1_flat = avePointB*mMR0.getInverse();
//get points relative to origin
ofVec3f myV10_flat = myV1_flat - myV0_flat;
//create rotation matrix for heading relative to flat 2d plane
float angle = atan2(myV10_flat.x,myV10_flat.y)/M_PI*180;
mMR1.makeRotationMatrix(angle,fromVec);
And finally I create the translation matrix and combine all the matrices:
mMT1.makeTranslationMatrix(avePointD); //translate from origin
ofMatrix4x4 mMc;
mMc = mMR0 * mMR1 * mMT1;
But when my 3D model rotates around, it always seems to dip at the same angle.
My question is: how would I calculate roll and pitch separately, so I can see where it dips and fix it?
thx.
s.
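For reference, one way to look at the two tilt components separately, as a rough JavaScript sketch (assumptions: the up axis is +Z, pitch is the normal's tilt seen in the YZ plane, roll its tilt seen in the XZ plane; verify the signs and axis conventions against your setup):
function tiltComponentsFromNormal(n) {
    // n = { x, y, z }, assumed normalized
    var pitch = Math.atan2(n.y, n.z) * 180 / Math.PI; // tilt in the YZ plane
    var roll = Math.atan2(n.x, n.z) * 180 / Math.PI;  // tilt in the XZ plane
    return { pitch: pitch, roll: roll };
}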