How to convert a quaternion to a rotation matrix in world coordinates - rotation

I am trying to transform a point cloud for registration.
I have a quaternion, but when I apply the transformation using the Point Cloud Library, it seems to transform the point cloud in its local coordinate frame instead of the world frame, so the registration fails. I want to know the correct formula for converting a quaternion to a rotation matrix.
https://pointclouds.org/documentation/tutorials/matrix_transform.html#
Here is my transformation code
float qw, qx, qy, qz, tx, ty, tz;
qw = kinectmatrix4f[num][0];
qx = kinectmatrix4f[num][1];
qy = kinectmatrix4f[num][2];
qz = kinectmatrix4f[num][3];
tx = kinectmatrix4f[num][4];
ty = kinectmatrix4f[num][5];
tz = kinectmatrix4f[num][6];
//qx = -qx;
qy = -qy;
//init transformation matrix
Eigen::Matrix4f transform_matrix = Eigen::Matrix4f::Identity();
transform_matrix <<
1 - 2 * pow(qy, 2) - 2 * pow(qz, 2), 2 * qx*qy - 2 * qz*qw, 2 * qx*qz + 2 * qy*qw, 0,
2 * qx*qy + 2 * qz*qw, 1 - 2 * pow(qx, 2) - 2 * pow(qz, 2), 2 * qy*qz - 2 * qx*qw, 0,
2 * qx*qz - 2 * qy*qw, 2 * qy*qz + 2 * qx*qw, 1 - 2 * pow(qx, 2) - 2 * pow(qy, 2), 0,
0, 0, 0, 1;
printf("Transforming point cloud %i in rough\n", num);
std::cout << transform_matrix << std::endl;
pcl::transformPointCloud(*cloud_input, *cloud_input, transform_matrix);
However, when I try the rotation in Unity with the following code, it rotates correctly:
Quaternion rot=new Quaternion(0.4f,0.5f,0.9f,1);
transform.rotation=rot;

I'm not quite sure what you are asking for, but since you are using Eigen here, you can simply convert the quaternion to a rotation matrix with Eigen::Quaternionf::toRotationMatrix():
Eigen::Quaternionf quat(qw, qx, qy, qz);
//init transformation matrix
Eigen::Matrix4f transform_matrix = Eigen::Matrix4f::Identity();
// convert Quaternion to rotation matrix
Eigen::Matrix3f rot_mat = quat.toRotationMatrix();
// assign the rotation part of transform matrix
transform_matrix.block(0, 0, 3, 3) = rot_mat;
// you should create another point cloud to acquire your result
pcl::PointCloud<pcl::PointXYZ>::Ptr transformed_cloud(new pcl::PointCloud<pcl::PointXYZ>);
pcl::PointCloud<pcl::PointXYZ>::Ptr transformed_cloud2(new pcl::PointCloud<pcl::PointXYZ>);
// do transform
pcl::transformPointCloud(*cloud_input, *transformed_cloud, transform_matrix);
// if above transformed_cloud is not what you expected, try inverse the transform matrix to check if your rotation is "back-ward"
pcl::transformPointCloud(*cloud_input, *transformed_cloud2, transform_matrix.inverse());
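If the registration also needs the translation (tx, ty, tz) from the question, it goes into the last column of the same 4x4 matrix. A minimal sketch, continuing from the snippet above and assuming tx, ty, tz are a translation expressed in the target (world) frame; note also that Eigen::Quaternionf's constructor takes components in (w, x, y, z) order, while Unity's Quaternion constructor takes (x, y, z, w):
// continuing from the snippet above; quat and transform_matrix are already defined
transform_matrix.block(0, 0, 3, 3) = quat.normalized().toRotationMatrix(); // normalize in case the quaternion is not unit length
transform_matrix.block(0, 3, 3, 1) = Eigen::Vector3f(tx, ty, tz);          // translation goes in the last column
pcl::transformPointCloud(*cloud_input, *transformed_cloud, transform_matrix);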

Related

Controlling a virtual character's joints rotation with OpenNI + Kinect

I'm starting a project where I need to control a virtual character. The character is being rendered in multiple 3D engines, such as Three.JS and iOS SceneKit.
I'm getting the quaternions of every joint of the skeleton with OpenNI.
The code that saves and passes them looks like this:
float confidence = context.getJointOrientationSkeleton(userId, jointName, joint);
joints[jointIndex]=joint.m00;
joints[jointIndex+1]=joint.m01;
joints[jointIndex+2]=joint.m02;
joints[jointIndex+3]=joint.m10;
joints[jointIndex+4]=joint.m11;
joints[jointIndex+5]=joint.m12;
joints[jointIndex+6]=joint.m20;
joints[jointIndex+7]=joint.m21;
joints[jointIndex+8]=joint.m22;
jointIndex+=9;
This repeats for every joint of the skeleton.
The last row and last column are always [0, 0, 0, 1], so I just append them on the client once I receive the data and build a 4x4 matrix.
I want to be able to make the right rotations with this data, but the rotations I'm getting are definitely wrong.
This is how I'm processing the matrix: (pseudo-code)
row1 = [m00 m01 m02 0]
row2 = [m10 m11 m12 0]
row3 = [m20 m21 m22 0]
row4 = [0 0 0 1]
matrix4by4 = matrix4x4(rows:[row1,row2,row3,row4])
I then extracted the quaternion with two methods, and both produced wrong rotations; I cannot find what's wrong or what I'm missing.
First method
There's an iOS function that takes a 3x3 or 4x4 matrix and converts it into a quaternion:
boneRotation = simd_quatf.init(matrix4by4).vector //X,Y,Z,W
Second method
I found the following code on the web:
let tr = m00 + m11 + m22
var qw: Float = 0
var qx: Float = 0
var qy: Float = 0
var qz: Float = 0
if (tr > 0) {
var S = sqrt(tr+1.0) * 2 // S=4*qw
qw = 0.25 * S
qx = (m21 - m12) / S
qy = (m02 - m20) / S
qz = (m10 - m01) / S
} else if ((m00 > m11) && (m00 > m22)) {
var S = sqrt(1.0 + m00 - m11 - m22) * 2 // S=4*qx
qw = (m21 - m12) / S
qx = 0.25 * S
qy = (m01 + m10) / S
qz = (m02 + m20) / S
} else if (m11 > m22) {
var S = sqrt(1.0 + m11 - m00 - m22) * 2 // S=4*qy
qw = (m02 - m20) / S
qx = (m01 + m10) / S
qy = 0.25 * S
qz = (m12 + m21) / S
} else {
let S = sqrt(1.0 + m22 - m00 - m11) * 2 // S=4*qz
qw = (m10 - m01) / S
qx = (m02 + m20) / S
qy = (m12 + m21) / S
qz = 0.25 * S
}
boneRotation = vector4(qx, qy, qw, qz)
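For reference, the same rotation-matrix-to-quaternion conversion exists in GLM (the library used in the answers further down this page) as glm::quat_cast. A minimal C++ sketch, assuming m00..m22 are the row-major rotation entries stored by the OpenNI snippet above; the helper name is made up for illustration:
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

glm::quat quatFromJointMatrix(float m00, float m01, float m02,
                              float m10, float m11, float m12,
                              float m20, float m21, float m22)
{
    // glm::mat3 is column-major, so each vec3 below is one column of the matrix
    glm::mat3 rot(glm::vec3(m00, m10, m20),
                  glm::vec3(m01, m11, m21),
                  glm::vec3(m02, m12, m22));
    // returns a glm::quat with x, y, z, w accessors; component order differs between libraries
    return glm::quat_cast(rot);
}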
I started testing only with shoulder and elbow rotation to help me visualize what could be wrong or missing, and made a video.
Here's how it's behaving: https://www.youtube.com/watch?v=xUtNiwH_AGk
What could I be missing? For example, the shoulder's rotation axes are shown in the original question as three images: the X-axis, Y-axis, and Z-axis.
Thank you in advance :-)

Interpolation Between 2 4x4 Matrices

For skeletal animation from COLLADA files, I need to linearly interpolate between two matrices. I saw somewhere that I can use quaternions to interpolate between matrices, but that only works for the rotational component, and I need to preserve the translation as well. Here is my code, which works except for the translation part:
float total = (orderedBones[i]->Animation->keyFrames[nextKeyFrame] - orderedBones[i]->Animation->keyFrames[nextKeyFrame - 1]) * 100.0;
float progress = orderedBones[i]->Animation->accumTime - orderedBones[i]->Animation->keyFrames[nextKeyFrame - 1] * 100.0;
float interpolation = progress / total;
glm::quat firstQuat = glm::quat_cast(orderedBones[i]->Animation->Matrices[nextKeyFrame - 1]);
glm::quat secondQuat = glm::quat_cast(orderedBones[i]->Animation->Matrices[nextKeyFrame]);
glm::quat finalQuat = glm::slerp(firstQuat, secondQuat, interpolation);
orderedBones[i]->Animation->interpoltaedMatrix = glm::mat4_cast(finalQuat);
Is there any way that I can do this?
I ended up solving my question through a bit more web surfing. For future reference, here's how to do it.
The transformation is stored in a 4x4 matrix like this:
r r r t
r r r t
r r r t
0 0 0 1
where r is the rotational component and t is the translation component. Because of this, we can represent the translation component as a vector. Two vectors can be linearly interpolated, so we interpolate the two translation vectors and then put the result back into the interpolated rotation matrix. Here's the final code, but it's a bit messy:
float total = (orderedBones[i]->Animation->keyFrames[nextKeyFrame] - orderedBones[i]->Animation->keyFrames[nextKeyFrame - 1]) * ANIMATION_MULTIPLICATION_CONST;
float progress = orderedBones[i]->Animation->accumTime - orderedBones[i]->Animation->keyFrames[nextKeyFrame - 1] * ANIMATION_MULTIPLICATION_CONST;
float interpolation = progress / total;
glm::quat firstQuat = glm::quat_cast(orderedBones[i]->Animation->Matrices[nextKeyFrame - 1]);
glm::quat secondQuat = glm::quat_cast(orderedBones[i]->Animation->Matrices[nextKeyFrame]);
glm::quat finalQuat = glm::slerp(firstQuat, secondQuat, interpolation);
orderedBones[i]->Animation->interpoltaedMatrix = glm::mat4_cast(finalQuat);
glm::vec4 transformComp1 = glm::vec4(
orderedBones[i]->Animation->Matrices[nextKeyFrame - 1][0][3],
orderedBones[i]->Animation->Matrices[nextKeyFrame - 1][1][3],
orderedBones[i]->Animation->Matrices[nextKeyFrame - 1][2][3],
orderedBones[i]->Animation->Matrices[nextKeyFrame - 1][3][3]);
glm::vec4 transformComp2 = glm::vec4(
orderedBones[i]->Animation->Matrices[nextKeyFrame][0][3],
orderedBones[i]->Animation->Matrices[nextKeyFrame][1][3],
orderedBones[i]->Animation->Matrices[nextKeyFrame][2][3],
orderedBones[i]->Animation->Matrices[nextKeyFrame][3][3]);
glm::vec4 finalTrans = (float)(1.0 - interpolation) * transformComp1 + transformComp2 * interpolation;
// good for now, although in future the 2 transformation components need to be interpolated
orderedBones[i]->Animation->interpoltaedMatrix[0][3] = finalTrans.x;
orderedBones[i]->Animation->interpoltaedMatrix[1][3] = finalTrans.y;
orderedBones[i]->Animation->interpoltaedMatrix[2][3] = finalTrans.z;
orderedBones[i]->Animation->interpoltaedMatrix[3][3] = finalTrans.w;
Hope that answers anybody else's questions :)
This function is working for me:
glm::mat4 interpolate(glm::mat4& _mat1, glm::mat4& _mat2, float _time)
{
glm::quat rot0 = glm::quat_cast(_mat1);
glm::quat rot1= glm::quat_cast(_mat2);
glm::quat finalRot = glm::slerp(rot0, rot1, _time);
glm::mat4 finalMat = glm::mat4_cast(finalRot);
finalMat[3] = _mat1[3] * (1 - _time) + _mat2[3] * _time;
return finalMat;
}
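For example, a hypothetical call that blends halfway between an identity pose and a 90-degree keyframe (keyA and keyB are made-up matrices, not from the original code):
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

void demoInterpolate()
{
    glm::mat4 keyA(1.0f);                                 // identity pose
    glm::mat4 keyB = glm::mat4_cast(glm::angleAxis(glm::radians(90.0f), glm::vec3(0.0f, 0.0f, 1.0f)));
    keyB[3] = glm::vec4(2.0f, 0.0f, 0.0f, 1.0f);          // plus a translation of (2, 0, 0)
    glm::mat4 halfway = interpolate(keyA, keyB, 0.5f);    // roughly a 45-degree rotation and half the translation
}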

DirectX 9 rotating around "joints"

I'm having trouble figuring out how to generate the matrices.
Hopefully the picture explains it, but basically I have an initial position, and I'm trying to rotate the main joint by 90 degrees, then rotate the last joint by a further 90 degrees. I then apply a translation afterwards to get a final matrix (see code), which is applied to a set of points that are relative to their joint.
The last rotation doesn't seem to work; it is fine if I leave out the line matrixPositions[2].appliedRotationMatrix *= (matrixRotX * matrixRotY * matrixRotZ); (the leg is straight down). Am I missing something obvious? Can you not do matrix multiplication this way for rotations?
D3DXMATRIX matrixRotX, matrixRotY, matrixRotZ;
D3DXMatrixRotationX(&matrixRotX, 0);
D3DXMatrixRotationY(&matrixRotY, 0);
D3DXMatrixRotationZ(&matrixRotZ, -PI/2);
matrixPositions[0].appliedRotationMatrix *= (matrixRotX * matrixRotY * matrixRotZ);
D3DXMATRIX matTranslationIn1;
D3DXMatrixTranslation(&matTranslationIn1, (matrixPositions[0].position.x-matrixPositions[1].position.x), (matrixPositions[0].position.y-matrixPositions[1].position.y), (matrixPositions[0].position.z-matrixPositions[1].position.z));
D3DXMATRIX matTranslationOut1;
D3DXMatrixTranslation(&matTranslationOut1, -(matrixPositions[0].position.x-matrixPositions[1].position.x), -(matrixPositions[0].position.y-matrixPositions[1].position.y), -(matrixPositions[0].position.z-matrixPositions[1].position.z));
matrixPositions[1].appliedRotationMatrix *= (matTranslationIn1 * (matrixRotX * matrixRotY * matrixRotZ) * matTranslationOut1);
D3DXMatrixTranslation(&matTranslationIn1, (matrixPositions[0].position.x-matrixPositions[2].position.x), (matrixPositions[0].position.y-matrixPositions[2].position.y), (matrixPositions[0].position.z-matrixPositions[2].position.z));
D3DXMatrixTranslation(&matTranslationOut1, -(matrixPositions[0].position.x-matrixPositions[2].position.x), -(matrixPositions[0].position.y-matrixPositions[2].position.y), -(matrixPositions[0].position.z-matrixPositions[2].position.z));
matrixPositions[2].appliedRotationMatrix *= (matTranslationIn1 * (matrixRotX * matrixRotY * matrixRotZ) * matTranslationOut1);
matrixPositions[2].appliedRotationMatrix *= (matrixRotX * matrixRotY * matrixRotZ);
D3DXMATRIX matrix[3];
for (int x = 0; x < 3; x++)
{
D3DXMatrixIdentity( &matrix[x]);
D3DXMATRIX matTranslation;
D3DXMatrixTranslation(&matTranslation, matrixPositions[x].position.x, matrixPositions[x].position.y, matrixPositions[x].position.z);
matrix[x] = matrix[x] * matrixPositions[x].appliedRotationMatrix * matTranslation;
}
There are two main steps for what you need:
1. Rotate joints 0, 1 and 2 around the origin by 90 degrees.
2. Rotate joint 2 around joint 1 by 90 degrees.
I wrote some pseudo-code below. It is almost complete, but you will still need to make some updates to use it; see the comments in the code for details.
void Rotatation()
{
// Build up the rotation matrix for step 1
D3DXVECTOR3 rotAxis(0, 0, 1);
float angle = -(D3DX_PI / 2);
D3DXMATRIX rotMatrix;
D3DXMatrixRotationAxis(&rotMatrix, &rotAxis, angle);
// rotate joints 0, 1 and 2 by applying the matrix above
for (int i = 0; i < 3; i++)
{
joints[i].matrix *= rotMatrix;
}
// Build up the rotation matrix for joint 2
// Since joint 2 does not rotate around the origin (the rotation axis should pass through the rotation center),
// first translate the rotation center to the origin, then rotate joint 2, and finally move it back.
// After the rotation in step 1, joint 1 is now located at (0, 2, 0),
// so translate it to the origin.
D3DXMATRIX transMat;
D3DXMatrixTranslation(&transMat, 0, 2, 0);
// Now joint 2 can rotate around the z-axis, so the rotation matrix is the same as in step 1.
// After the rotation, move back; this matrix is the inverse of transMat.
D3DXMATRIX inverseTransMat;
D3DXMatrixTranslation(&inverseTransMat, 0, -2, 0);
// Combine the 3 matrices above
D3DXMATRIX rotMatjoin2 = transMat * rotMatrix * inverseTransMat;
// rotate joint 2
joints[2].matrix *= rotMatjoin2;
}
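To avoid hard-coding the (0, 2, 0) pivot, the same translate-rotate-translate-back pattern can use whatever position joint 1 currently has. A hedged sketch, assuming a hypothetical joints[1].position field of type D3DXVECTOR3 that holds the joint's current position, and relying on the fact that D3DX composes with row vectors, so the leftmost matrix is applied first:
D3DXVECTOR3 pivot = joints[1].position;               // hypothetical field; not in the code above
D3DXMATRIX toOrigin, backFromOrigin, pivotRot;
D3DXMatrixTranslation(&toOrigin, -pivot.x, -pivot.y, -pivot.z);
D3DXMatrixTranslation(&backFromOrigin, pivot.x, pivot.y, pivot.z);
// move the pivot to the origin, rotate, then move back
pivotRot = toOrigin * rotMatrix * backFromOrigin;
joints[2].matrix *= pivotRot;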

Why is my cube distorted?

Using a quaternion, if I rotate my cube around an axis by 90 degrees, I get a different front-facing cube side, which appears as a straight-on square of a solid color. My cube has differently colored sides, so changing the axis it is rotated around gives me these different colors, as expected.
When I try to rotate by an arbitrary amount, I get quite the spectacular mess, and I don't know why since I'd expect the quaternion process to work well regardless of the angle:
I am creating a quaternion from 2 vectors using this:
inline QuaternionT<T> QuaternionT<T>::CreateFromVectors(const Vector3<T>& v0, const Vector3<T>& v1)
{
if (v0 == -v1)
return QuaternionT<T>::CreateFromAxisAngle(vec3(1, 0, 0), Pi);
Vector3<T> c = v0.Cross(v1);
T d = v0.Dot(v1);
T s = std::sqrt((1 + d) * 2);
QuaternionT<T> q;
q.x = c.x / s;
q.y = c.y / s;
q.z = c.z / s;
q.w = s / 2.0f;
return q;
}
I think the above method is fine since I've seen plenty of sample code correctly using it.
With the above method, I do this:
Quaternion quat1=Quaternion::CreateFromVectors(vec3(0,1,0), vec3(0,0,1));
It works, and it is a 90-degree rotation.
But suppose I want more like a 45-degree rotation?
Quaternion quat1=Quaternion::CreateFromVectors(vec3(0,1,0), vec3(0,1,1));
This gives me the mess above. I also tried normalizing quat1 which provides different though similarly distorted results.
I am using the quaternion as a Modelview rotation matrix, using this:
inline Matrix3<T> QuaternionT<T>::ToMatrix() const
{
const T s = 2;
T xs, ys, zs;
T wx, wy, wz;
T xx, xy, xz;
T yy, yz, zz;
xs = x * s; ys = y * s; zs = z * s;
wx = w * xs; wy = w * ys; wz = w * zs;
xx = x * xs; xy = x * ys; xz = x * zs;
yy = y * ys; yz = y * zs; zz = z * zs;
Matrix3<T> m;
m.x.x = 1 - (yy + zz); m.y.x = xy - wz; m.z.x = xz + wy;
m.x.y = xy + wz; m.y.y = 1 - (xx + zz); m.z.y = yz - wx;
m.x.z = xz - wy; m.y.z = yz + wx; m.z.z = 1 - (xx + yy);
return m;
}
Any idea what's going on here?
What does your frustum look like? If you have a distorted "lens", such as an exceptionally wide-angle field of view, then angles that actually show depth, such as an arbitrary rotation, might not look the way you expect (just as a fisheye lens on a camera makes perspective look unrealistic).
Make sure you are using a realistic frustum if you want to see realistic images.
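If you are on the fixed-function pipeline, here is a minimal sketch of a moderate (roughly 60-degree) vertical field of view; glFrustumf is the OpenGL ES 1.x call (desktop GL uses glFrustum), the header include is platform-specific, and the near/far values are just examples:
#include <cmath>
// include your platform's GL header, e.g. <OpenGLES/ES1/gl.h> on iOS

void setPerspective(float fovyDegrees, float aspect, float zNear, float zFar)
{
    const float top   = zNear * std::tan(fovyDegrees * 3.14159265f / 360.0f); // tan(fovy / 2)
    const float right = top * aspect;
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glFrustumf(-right, right, -top, top, zNear, zFar);   // symmetric frustum
    glMatrixMode(GL_MODELVIEW);
}
// e.g. setPerspective(60.0f, viewWidth / (float)viewHeight, 0.1f, 100.0f);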

CGPathRef / Bezier Curves in OpenGL-ES

I am considering porting an iPhone project from Core Animation to OpenGL ES.
I need to render a button that is constructed from CGPathRefs.
But it seems that GL has no provision for Bézier curves.
Can anyone provide some code that renders a Bézier curve in GL?
This will accept a series of points and draw a rounded Bézier line. It must use point sprites. If you send it a line of three points and a number of point sprites to draw, it will create a Bézier line. The code is based on something I found somewhere, but I cannot remember where.
It requires:
CGPoint origin - First Point
CGPoint control - Mid Point
CGPoint destination - End Point
int segments - Number of points to render.
To calculate the number of points, I use:
count = MAX(ceilf(sqrtf(([[currentStroke objectAtIndex:i+2] CGPointValue].x - [[currentStroke objectAtIndex:i] CGPointValue].x)
* ([[currentStroke objectAtIndex:i+2] CGPointValue].x - [[currentStroke objectAtIndex:i] CGPointValue].x)
+ ((invertedYThirdCoord - invertedYBegCoord) * (invertedYThirdCoord - invertedYBegCoord))) / 2), 1)*4;
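In plain C++, the same heuristic (roughly two points per pixel of chord length, never fewer than four) can be written as a small helper; begX/begY/endX/endY stand in for the NSArray lookups above and are assumptions:
#include <algorithm>
#include <cmath>

int segmentCount(float begX, float begY, float endX, float endY)
{
    const float dx = endX - begX;
    const float dy = endY - begY;
    return std::max(static_cast<int>(std::ceil(std::sqrt(dx * dx + dy * dy) / 2.0f)), 1) * 4;
}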
Anyway, the code (in C++):
CGPoint vertices[segments];
CGPoint midPoint;
float x, y;
float t = 0.0;
for(int i = 0; i < (segments); i++)
{
x = pow(1 - t, 2) * origin.x + 2.0 * (1 - t) * t * control.x + t * t * destination.x;
y = pow(1 - t, 2) * origin.y + 2.0 * (1 - t) * t * control.y + t * t * destination.y;
vertices[i] = CGPointMake(x, y);
t += 1.0 / (segments);
}
midPoint = CGPointMake(x, 288 - y);
glVertexPointer(2, GL_FLOAT, 0, vertices);
glDrawArrays(GL_POINTS, 0, segments);
After this, render as normal.
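Since the snippet relies on point sprites, here is roughly the GL state it assumes, as a hedged sketch for OpenGL ES 1.1 with the OES_point_sprite extension; the point size is an arbitrary example and a brush texture is assumed to be bound already:
glEnable(GL_TEXTURE_2D);                                    // brush texture assumed to be bound
glEnable(GL_POINT_SPRITE_OES);
glTexEnvi(GL_POINT_SPRITE_OES, GL_COORD_REPLACE_OES, GL_TRUE);
glPointSize(16.0f);                                         // example brush diameter in pixels
glEnableClientState(GL_VERTEX_ARRAY);                       // required for glVertexPointer / glDrawArrays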
