WebGL rotate function - rotation

I'm trying to understand how matrix rotation works in WebGL.
I have a mat4() matrix and I have to apply these transformations:
m = translate(torsoHeight+1*headHeight, 5, 0.0);
m = mult(m, rotate(theta[head1Id], 1, 0, 0))
m = mult(m, rotate(theta[head2Id], 0, 1, 0));
m = mult(m, translate(0.0, -0.5*headHeight, 0.0));
figure[headId] = createNode( m, head, leftUpperArmId, null);
break;
I don't understand exactly how the mult function works. The first parameter is my matrix.
The theta[] array is built this way:
var theta = [0, 0, 0, 0, 0, 0, 180, 0, 180, 0, 0];
and
var headId = 1;
var head1Id = 1;
var head2Id = 10;
Am I right in thinking that the second parameter is another matrix built with the rotate() function? In that case, how does the rotate function work?

rotate and translate are functions that create matrices.
rotate looks like its arguments are (angle, vectorX, vectorY, vectorZ), creating a matrix that rotates points around the given vector.
mult is the standard mathematical multiplication for 4x4 matrices.
You should probably dig into linear algebra tutorials such as https://open.gl/transformations
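For reference, here is a sketch of what such a rotate(angle, x, y, z) helper typically builds (the standard axis-angle rotation matrix; the exact library you are using may differ in conventions). With the axis normalized to a unit vector $\hat{u} = (u_x, u_y, u_z)$, $c = \cos\theta$ and $s = \sin\theta$:

$$
R(\theta,\hat{u}) =
\begin{pmatrix}
c + u_x^2(1-c) & u_x u_y(1-c) - u_z s & u_x u_z(1-c) + u_y s & 0 \\
u_y u_x(1-c) + u_z s & c + u_y^2(1-c) & u_y u_z(1-c) - u_x s & 0 \\
u_z u_x(1-c) - u_y s & u_z u_y(1-c) + u_x s & c + u_z^2(1-c) & 0 \\
0 & 0 & 0 & 1
\end{pmatrix}
$$

Assuming mult(a, b) returns the product a·b (the usual convention in these helper libraries), m = mult(m, rotate(...)) replaces m with m·R, so each appended rotation acts in the coordinate frame defined by the transformations already stored in m.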

Related

Creating a rotate3D() function for PMatrix3D in Processing

Some time ago, I coded a little fidgetable logo based on CSS transforms alone.
You can fiddle with it over at https://document.paris/
The result feels nice, it feels natural to click/touch and drag to rotate the logo.
I remember banging my head against the wall until I found out that I could combine CSS transforms quite easily just by chaining them:
transform: matrix3d(currentMatrix) rotate3d(x, y, z, angle);
And most importantly, to get the currentMatrix I would simply do m = $('#logobackground').css('transform'); with jQuery. The browser would magically return the computed matrix instead of the raw CSS, which saved me from dealing with matrices myself or from infinitely stacking rotate3d() properties.
So the hardest part was to calculate the rotate3D arguments (x, y, z, angle) based on mouse inputs. In theory I shouldn't have problems porting this part to Java, so I'll just skip over it.
Now
I'm trying to do the exact same thing with Processing, and there are two problems:
There is no rotate3D() in Processing.
There is no browser to apply/chain transformations and return me the current matrix state automatically.
Here's the plan/implementation I'm working on :
I need a "currentMatrix" to apply every frame to the scene
PMatrix3D currentMatrix = new PMatrix3D();
In the setup() I set it to the "identity matrix" which from what I understand is equivalent to "no transformation".
// set currentMatrix to identity Matrix
currentMatrix.set(1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1);
Every frame I would calculate a transformation matrix and apply it to the currentMatrix.
Then I would apply this matrix to the scene.
// Apply Matrix to the currentMatrix
void mouseRotate() {
  float diag = sqrt(pow(width, 2) + pow(height, 2));
  float x = deltaX() / diag * 10; // deltaX = difference between previous mouseX and current mouseX
  float y = deltaY() / diag * 10; // deltaY = same with the Y axis
  float angle = sqrt( pow(x, 2) + pow(y, 2) );
  currentMatrix.apply( rotate3D(y, x, 0, angle) );
}
// Apply Matrix to the scene
applyMatrix(currentMatrix);
PMatrix3D reference : https://processing.github.io/processing-javadocs/core/processing/core/PMatrix3D.html
ApplyMatrix() reference : https://processing.org/reference/applyMatrix_.html
All I need to do then is to implement the rotate3D css transform as a function which returns a transformation matrix.
Based on what I found on this page https://developer.mozilla.org/en-US/docs/Web/CSS/transform-function/rotate3d()
I implemented this first function :
PMatrix3D rotate3D(float x, float y, float z, float a) {
  PMatrix3D rotationMatrix = new PMatrix3D();
  rotationMatrix.set(
    1+(1-cos(a))*(pow(x,2)-1),  z*sin(a)+x*y*(1-cos(a)),   -y*sin(a)+x*z*(1-cos(a)),  0,
    -z*sin(a)+x*y*(1-cos(a)),   1+(1-cos(a))*(pow(y,2)-1), x*sin(a)+y*z*(1-cos(a)),   0,
    y*sin(a)+x*z*(1-cos(a)),    -x*sin(a)+y*z*(1-cos(a)),  1+(1-cos(a))*(pow(z,2)-1), 0,
    0, 0, 0, 1
  );
  return rotationMatrix;
}
and based on what I found on this page https://drafts.csswg.org/css-transforms-2/#Rotate3dDefined
I implemented this other function :
PMatrix3D rotate3Dbis(float getX, float getY, float getZ, float getA) {
  float sc = sin(getA/2)*cos(getA/2);
  float sq = pow(sin(getA/2), 2);
  float normalizer = sqrt( pow(getX,2) + pow(getY,2) + pow(getZ,2) );
  float x = getX/normalizer;
  float y = getY/normalizer;
  float z = getZ/normalizer;
  PMatrix3D rotationMatrix = new PMatrix3D();
  rotationMatrix.set(
    1-2*(pow(y,2)+pow(z,2))*sq, 2*(x*y*sq-z*sc),            2*(x*z*sq+y*sc),            0,
    2*(x*y*sq+z*sc),            1-2*(pow(x,2)+pow(z,2))*sq, 2*(y*z*sq-x*sc),            0,
    2*(x*z*sq-y*sc),            2*(y*z*sq+x*sc),            1-2*(pow(x,2)+pow(y,2)*sq), 0,
    0, 0, 0, 1
  );
  return rotationMatrix;
}
When testing, they don't produce exactly the same result with the same inputs (although the differences are somewhat "symmetric", which makes me think they are at least in some way equivalent?). Also, rotate3Dbis() has a tendency to produce NaN values, especially when I'm not moving the mouse (x & y = 0).
But most importantly, in the end it doesn't work. Instead of rotating, the drawing just zooms out progressively when I'm using rotate3D(), and rotate3Dbis() doesn't render correctly because of the NaNs.
The overall question:
I'm trying to get guidance from people who understand transformation matrices and trying to narrow down where the issue is. Are my Processing/Java implementations of rotate3D() flawed? Or does the issue come from somewhere else? And are my rotate3D() and rotate3Dbis() functions equivalent?
You might get away with simply rotating on the X and Y axes, as you already mentioned, using the previous and current mouse coordinates:
PVector cameraRotation = new PVector(0, 0);
void setup(){
  size(900, 900, P3D);
  rectMode(CENTER);
  strokeWeight(9);
  strokeJoin(MITER);
}
void draw(){
  // update "camera" rotation
  if (mousePressed){
    cameraRotation.x += -float(mouseY-pmouseY);
    cameraRotation.y +=  float(mouseX-pmouseX);
  }
  background(255);
  translate(width * 0.5, height * 0.5, 0);
  rotateX(radians(cameraRotation.x));
  rotateY(radians(cameraRotation.y));
  rect(0, 0, 300, 450);
}
The Document Paris example you've shared also uses easing. You can have a look at this minimal easing Processing example
Here's a version of the above with easing applied:
PVector cameraRotation = new PVector();
PVector cameraTargetRotation = new PVector();
float easing = 0.01;
void setup(){
  size(900, 900, P3D);
  rectMode(CENTER);
  strokeWeight(9);
  strokeJoin(MITER);
}
void draw(){
  // update "camera" rotation
  if (mousePressed){
    cameraTargetRotation.x += -float(mouseY-pmouseY);
    cameraTargetRotation.y +=  float(mouseX-pmouseX);
  }
  background(255);
  translate(width * 0.5, height * 0.5, 0);
  // ease rotation
  rotateX(radians(cameraRotation.x -= (cameraRotation.x - cameraTargetRotation.x) * easing));
  rotateY(radians(cameraRotation.y -= (cameraRotation.y - cameraTargetRotation.y) * easing));
  fill(255);
  rect(0, 0, 300, 450);
  fill(0);
  translate(0, 0, 3);
  rect(0, 0, 300, 450);
}
Additionally there's a library called PeasyCam which can make this much simpler.
If you do want to implement your own version using PMatrix3D, here are a couple of tips that could save you time:
When you instantiate PMatrix3D() it starts out as the identity matrix. If you have transformations applied and want to go back to the identity, call reset().
If you want to rotate a PMatrix3D around an axis, the rotate(float angleInRadians, float axisX, float axisY, float axisZ) override should help.
Additionally, you could get away without PMatrix3D at all, since resetMatrix() will reset the global transformation matrix and you can call rotate(float angleInRadians, float axisX, float axisY, float axisZ) directly.
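Here is a minimal, untested sketch of that PMatrix3D route, reusing the names from the question (the sensitivity factor and the axis sign convention are my own assumptions, not values from your code):

PMatrix3D currentMatrix = new PMatrix3D(); // a fresh PMatrix3D is already the identity

void mouseRotate() {
  float dx = mouseX - pmouseX;
  float dy = mouseY - pmouseY;
  float len = sqrt(dx * dx + dy * dy);
  if (len > 0) {
    float angle = len * 0.01;                 // arbitrary sensitivity, tune to taste
    // built-in axis-angle rotation: axis roughly perpendicular to the drag,
    // normalized by hand; signs may need flipping depending on the feel you want
    currentMatrix.rotate(angle, dy / len, dx / len, 0);
  }
}

// currentMatrix.reset(); // call this whenever you need to go back to the identity

Then in draw() you keep calling applyMatrix(currentMatrix) as you already do.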
Part of the answer is a fix added to the first rotate3D function.
I needed to normalize the x,y,z values to avoid the weird scaling.
I'm posting the current state of the code (I'm skipping a few parts for the sake of simplicity):
// Mouse movement since last frame on the X axis
float deltaX() {
  return (float)(mouseX - pmouseX);
}
// Mouse movement since last frame on the Y axis
float deltaY() {
  return (float)(mouseY - pmouseY);
}
// Convert user input into an axis and an angle to rotate by
void mouseRotate() {
  double diag = Math.sqrt(Math.pow(width, 2) + Math.pow(height, 2));
  double x = deltaX() / diag * 50;
  double y = -deltaY() / diag * 50;
  double angle = Math.sqrt( x*x + y*y );
  currentMatrix.apply( rotate3D((float)y, (float)x, 0, (float)angle) );
}
// Convert those values into a rotation matrix
PMatrix3D rotate3D(float getX, float getY, float getZ, float getA) {
  float normalizer = sqrt( getX*getX + getY*getY + getZ*getZ );
  float x = 0;
  float y = 0;
  float z = 0;
  if (normalizer != 0) {
    x = getX/normalizer;
    y = getY/normalizer;
    z = getZ/normalizer;
  }
  float x2 = pow(x, 2);
  float y2 = pow(y, 2);
  float z2 = 0;
  float sina = sin(getA);
  float f1cosa = 1 - cos(getA);
  PMatrix3D rotationMatrix = new PMatrix3D(
    1+f1cosa*(x2-1),    z*sina+x*y*f1cosa,  -y*sina+x*z*f1cosa, 0,
    -z*sina+x*y*f1cosa, 1+f1cosa*(y2-1),    x*sina+y*z*f1cosa,  0,
    y*sina+x*z*f1cosa,  -x*sina+y*z*f1cosa, 1+f1cosa*(z2-1),    0,
    0, 0, 0, 1
  );
  return rotationMatrix;
}
// Draw
void draw() {
  mouseRotate();
  applyMatrix(currentMatrix);
  object.render();
}
I thought that using this method would allow me to "stack" cumulative rotations relative to the screen and not relative to the object. But the result always seems to apply the rotation relative to the drawn object.
I am not using a camera because I basically only want to rotate the object on itself. I'm actually a bit lost at the moment on what I should rotate, and when, so that newly applied rotations are relative to the user while previously applied rotations are preserved.
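One thing that may be worth checking (an assumption on my side, not a verified fix): if I read the PMatrix3D reference correctly, apply() post-multiplies, so currentMatrix becomes currentMatrix * step and each new step is expressed in the object's already-rotated frame, while preApply() multiplies on the other side and keeps each new step in the unrotated screen frame. A sketch of the change:

// Hedged sketch: compose the new rotation on the left so it acts in screen space
void mouseRotate() {
  double diag = Math.sqrt(Math.pow(width, 2) + Math.pow(height, 2));
  double x = deltaX() / diag * 50;
  double y = -deltaY() / diag * 50;
  double angle = Math.sqrt(x * x + y * y);
  // preApply: currentMatrix = step * currentMatrix instead of currentMatrix * step
  currentMatrix.preApply( rotate3D((float) y, (float) x, 0, (float) angle) );
}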

Three.js Vector.angleTo: How to also get the bigger value

Given a vector v1=(0,0,1) and two vectors v2=(1,0,0) and v3=(-1,0,0), I would expect v1.angleTo(v2) and v1.angleTo(v3) to return different results, i.e. 1/2 PI and 3/2 PI.
However, both return 1/2 PI:
var v1 = new THREE.Vector3(0, 0, 1);
var v2 = new THREE.Vector3(1, 0, 0);
var v3 = new THREE.Vector3(-1, 0, 0);
v1.angleTo(v2)
1.5707963267948966
v1.angleTo(v3)
1.5707963267948966
It seems that angleTo always returns the smaller angle, i.e. values between 0 and PI.
How can I get the expected value/behavior?
angleTo always returns the smaller angle. See the implementation of angleTo in https://github.com/mrdoob/three.js/blob/master/src/math/Vector3.js.
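Under the hood it boils down to the standard dot-product formula (a sketch of what the linked implementation computes):

$$\theta = \arccos\!\left(\frac{v_1 \cdot v_2}{\lVert v_1 \rVert\,\lVert v_2 \rVert}\right)$$

and arccos only ever returns values in $[0, \pi]$, so the result carries no information about direction.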
If the angle should always be measured in one direction (i.e. either counterclockwise or clockwise), a simple solution for 2D vectors (or n-dimensional vectors lying in a plane parallel to two coordinate axes, as in the example given in the question) is:
var orientation = v1.x * v2.z - v1.z * v2.x;
if(orientation > 0) angle = 2*Math.PI - angle;

OpenSceneGraph osg::Quat: shape not rotating

I have a small function to create a new instance of a WorldObject.
I want to use osg::ref_ptr<osg::PositionAttitudeTransform> for translation and rotation but there is a problem I can't figure out.
I use setTranslation() with a Vec3, which works very well. But the Quat set up with makeRotate() just does nothing.
Here is the code:
osg::ref_ptr<osg::PositionAttitudeTransform> getWorldObjectClone(const char* name, osg::Vec3 position = osg::Vec3(0, 0, 0), osg::Vec3 rotation = osg::Vec3(0, 0, 0))
{
  osg::ref_ptr<osg::PositionAttitudeTransform> tmp = new osg::PositionAttitudeTransform;
  osg::Quat q(0, osg::Vec3(0, 0, 0));
  tmp = dynamic_cast<osg::PositionAttitudeTransform*>(WorldObjects[name]->clone(osg::CopyOp::DEEP_COPY_ALL));
  tmp->setPosition(position);
  q.makeRotate(rotation.x(), 1, 0, 0);
  q.makeRotate(rotation.y(), 0, 1, 0);
  q.makeRotate(rotation.z(), 0, 0, 1);
  tmp->setAttitude(q);
  return tmp;
}
I tried rotation = {90, 0, 0} (degrees) and rotation = {1, 0, 0} (radians), but both have no effect. Is there a mistake in how the code is using the Quat?
The rotation method you are using works with radians.
If you want to rotate 90 degrees around the X axis, you need to call:
q.makeRotate(osg::PI_2, 1, 0, 0 );
// or the equivalent
q.makeRotate(osg::PI_2, osg::X_AXIS);
Keep in mind that every call to makeRotate will reset the full quaternion to the given rotation. If you're trying to concatenate several rotations, you have to multiply the corresponding quaternions.
For instance:
osg::Quat xRot, yRot;
// rotate 90 degrees around x
xRot.makeRotate(osg::PI_2, osg::X_AXIS);
// rotate 90 degrees around y
yRot.makeRotate(osg::PI_2, osg::Y_AXIS);
// concatenate the 2 into a resulting quat
osg::Quat fullRot = xRot * yRot;

Correct transformation order for scene graph

I am working on a quick WebGL engine with a scene graph to quickly prototype my game idea on reddit (https://www.reddit.com/r/gameideas/comments/3dsy8m/revolt/). Now that I have some basic rendering done, I can't figure out the correct order (the one that looks right to most people) that I am supposed to use to transform the nodes in the scene graph.
It's hard to explain what is happening, but I hope you can see that it just isn't rotating the way most people would expect it to in any other engine.
Here is a simplified version of what I am currently doing.
Mat4 = glMatrix 0.9.5
Utils = custom utilities
Node(Render):
#param {parentMatrix}
// Create Local Matrix
self.lMatrix = mat4.create();
mat4.identity(self.lMatrix);
mat4.translate(self.lMatrix, self.position);
mat4.rotate(self.lMatrix, self.rotation[0], [1, 0, 0]);
mat4.rotate(self.lMatrix, self.rotation[1], [0, 1, 0]);
mat4.rotate(self.lMatrix, self.rotation[2], [0, 0, 1]);
var wMatrix = mat4.create();
mat4.identity(wMatrix);
if(parentMatrix){
  mat4.multiply(self.lMatrix, parentMatrix, wMatrix);
}
else{
  mat4.set(self.lMatrix, wMatrix);
}
// Render
var children = this.children,
    numChildren = children.length,
    child;
for(var i = 0; i < numChildren; i++){
  child = children[i];
  child.render(wMatrix);
}
Entity(Render):
#param {parentMatrix}
// Set Transformation matrix
var tMatrix = mat4.create();
mat4.identity(tMatrix);
mat4.translate(tMatrix, self.position);
mat4.rotate(tMatrix, self.rotation[0], [1, 0, 0]);
mat4.rotate(tMatrix, self.rotation[1], [0, 1, 0]);
mat4.rotate(tMatrix, self.rotation[2], [0, 0, 1]);
mat4.scale(tMatrix, self.scale || [1, 1, 1]);
var wMatrix = mat4.create();
mat4.identity(wMatrix);
mat4.multiply(tMatrix, parentMatrix, wMatrix);
Utils.loadTMatrix(wMatrix);
this.texture.bind();
this.mesh.render();
The usual order is SRT, or scale, rotate then translate.
Also I am not sure if you can just do
mat4.rotate(tMatrix, self.rotation[0], [1, 0, 0]);
mat4.rotate(tMatrix, self.rotation[1], [0, 1, 0]);
mat4.rotate(tMatrix, self.rotation[2], [0, 0, 1]);
with Euler angles and get the correct resulting orientation. I don't use Euler angles, so I don't fully grasp the details; somebody please correct me if I'm wrong. See this page for conversions between Euler angles and rotation matrices: http://www.euclideanspace.com/maths/geometry/rotations/conversions/eulerToMatrix/.
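To spell out what "SRT" means in matrix form (assuming the column-vector convention that glMatrix and most WebGL code use), the scale ends up as the matrix closest to the vertex:

$$M_{world} = M_{parent} \cdot T \cdot R_x \cdot R_y \cdot R_z \cdot S$$

so a vertex v is transformed as $M_{parent}(T(R(S\,v)))$: scaled first, rotated next, translated last, and only then placed by the parent's transform.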
I didn't find the solution I was originally hoping for (I was previously caching matrices and hoped to keep doing so), but after recreating my old engine from scratch I found a much easier way.
Engine.prototype.NODE.prototype.render = function(parentMatrix){
  var children = this.children,
      numChildren = children.length,
      child, pos, rot, scale;
  // If there is a parent matrix, set this matrix to a copy of it
  if(parentMatrix){
    this.matrix = mat4.clone(parentMatrix);
  }
  else{
    // Else set it to an identity matrix
    mat4.identity(this.matrix);
  }
  // If the matrix needs updating, reconstruct it
  pos = [this.position.x,
         this.position.y,
         this.position.z];
  rot = [this.rotation.x,
         this.rotation.y,
         this.rotation.z];
  scale = [this.scale.x,
           this.scale.y,
           this.scale.z];
  // Recreate transformation matrix
  mat4.translate(this.matrix, this.matrix, pos);
  mat4.rotate(this.matrix, this.matrix, rot[0], [1, 0, 0]);
  mat4.rotate(this.matrix, this.matrix, rot[1], [0, 1, 0]);
  mat4.rotate(this.matrix, this.matrix, rot[2], [0, 0, 1]);
  mat4.scale(this.matrix, this.matrix, scale);
  // Render children with this matrix
  for(var i = 0; i < numChildren; i++){
    child = children[i];
    child.render(this.matrix);
  }
}
What I am basically doing is: if the node has a parent (i.e. it isn't the root node), I start its matrix off as a clone of the parent's matrix, otherwise I set it to the identity matrix. Then I apply the regular transformations to it. If I find a way to keep caching matrices, I will upload it as soon as possible.

Assertion failure when using the solvePnP function in OpenCV 3.0

What I want to do is get the 4 vertex pixel points (2D coordinates) of a QR code, and pass both them and the QR code's 3D world coordinates as parameters to the solvePnP function.
But when I run it, solvePnP doesn't work! The error is something like this:
Assertion failed (npoints >= 0 && npoints == std::max(ipoints.checkVector(2, CV_32F), ipoints.checkVector(2, CV_64F))) in cv::solvePnP
The solvePnP documentation says it can take either std::vector or cv::Mat types, so I tried switching between both of those data types, but it still fails.
My source code is below:
Point3d pt[4];
pt[0] = Point3d(0, 0, 0);
pt[1] = Point3d(0, 178, 0);
pt[2] = Point3d(178, 178, 0);
pt[3] = Point3d(178, 0, 0);
vector<Point3f> objectPoints;
for (int i = 0; i < 4; i++)
  objectPoints.push_back(pt[i]); // 3d world coordinates
Point2d point[4];
After this step I get the 4 vertex coordinates of the QR code into point[], and then:
vector<Point2f> imagePoints;
for (int i = 0; i < 4; i++)
  imagePoints.push_back(point[i]); // 2d image coordinates
//Mat objPts(4, 1, CV_64F, pt);
//Mat imgPts(4, 1, CV_64F, point);

// camera parameters
double Intrinsic[] = { fx, 0, cx, 0, fy, cy, 0, 0, 1 };
Mat Camera_Matrix(3, 3, CV_64FC1, Intrinsic);
double Distort[] = { k1, k2, p1, p2 };
Mat DistortCoeffs(4, 1, CV_64FC1, Distort);

// estimate camera pose
Mat rvec, tvec; // rotation & translation vectors
solvePnP(objectPoints, imagePoints, Camera_Matrix, DistortCoeffs, rvec, tvec);
Please help!
In your code, the arrays pt and point are of type Point3d and Point2d (double precision), but objectPoints and imagePoints are vectors of Point3f and Point2f (single precision). Keep the precision of the points consistent with the vectors you build from them.
By the way, contrary to the documentation, it seems that the solvePnP function requires the object points and image points to be a vector or a cv::Mat with N rows and 3/2 columns. I tried using a cv::Mat with 3/2 rows and N columns as input, but the same assertion failure appears.
You may follow the official example to help debug. It is located in samples/cpp/tutorial_code/calib3d/real_time_pose_estimation/src.
