Applying a "Spread" value to an XMFLOAT4X4 - rotation

I'm attempting to add a small value to a World Matrix in order to replicate the accuracy of a fired weapon [pistol, assault rifle].
Currently, my World Matrix resides at a Parent Object's position, with the ability to rotate about the Y axis exclusively.
I've done this in Unity3D, with the following running whenever the object needs to be created [once per]:
var coneRotation = Quaternion.Euler(Random.Range(-spread, spread), Random.Range(-spread, spread), 0);
var go = Instantiate(obj, parent.transform.position, transform.rotation * coneRotation) as GameObject;
and am attempting to replicate the results using Direct3D11.
This lambda currently returns a random value in [-1.5, 1.5]:
auto randF = [&](float lower_bound, float upper_bound) -> float
{
    return lower_bound + static_cast<float>(rand()) / (static_cast<float>(RAND_MAX / (upper_bound - lower_bound)));
};
My first thought was to simply multiply a random x and y into the forward vector of an object upon initialization, and move it in this fashion: position = position + forward * speed * dt; [speed being 1800], though the rotation is incorrect (not to mention the bullets fire upwards).
I've also attempted to make a Quaternion [as in Unity3D]: XMVECTOR quaternion = XMVectorSet(random_x, random_y, 0.0f, 0.0f); and to create a Rotation Matrix from it using XMMatrixRotationQuaternion.
Afterwards I call XMStoreFloat4x4(&world_matrix, XMLoadFloat4x4(&world_matrix) * rotation); and restore the position portion of the matrix [accessing world_matrix._41/._42/._43] (world_matrix being the matrix of the "bullet" itself, not the parent).
[I've also tried to reverse the order of the multiplication]
I've read that XMMatrixRotationQuaternion doesn't take an Euler-angle quaternion, and that XMQuaternionToAxisAngle does, though I'm not entirely certain how to use it.
What would be the proper way to accomplish something like this?
Many thanks!

Your code XMVECTOR quaternion = XMVectorSet(random_x, random_y, 0.0f, 0.0f); is not creating a valid quaternion. First, if you don't set the w component to 1, then the 4-vector doesn't actually represent a 3D rotation. Second, a quaternion's vector components are not Euler angles.
You want to use XMQuaternionRotationRollPitchYaw, which constructs a quaternion rotation from Euler angle input, or XMQuaternionRotationRollPitchYawFromVector, which takes the three Euler angles as a vector. These functions do what Unity's Quaternion.Euler method does.
Of course, if you want a rotation matrix and not a quaternion, then you can use XMMatrixRotationRollPitchYaw or XMMatrixRotationRollPitchYawFromVector to directly construct a 4x4 rotation matrix from Euler angles (which actually uses quaternions internally anyhow). Based on your code snippet, it looks like you already have a base rotation as a quaternion that you want to concatenate with your spread quaternion, so you probably don't want this option for this case.
Note: You should look at using the C++11 standard <random> facilities rather than your home-rolled lambda wrapper around the terrible C rand function.
Something like:
std::random_device rd;
std::mt19937 gen(rd());
// spread should be in radians here (not degrees which is what Unity uses)
std::uniform_real_distribution<float> dis(-spread, spread);
XMVECTOR coneRotation = XMQuaternionRotationRollPitchYaw( dis(gen), dis(gen), 0 );
XMVECTOR rot = XMQuaternionMultiply( parentRot, coneRotation );
XMMATRIX transform = XMMatrixAffineTransformation( g_XMOne, g_XMZero, rot, parentPos );
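If your spread is specified in degrees as in Unity, XMConvertToRadians does the conversion, and storing the whole transform replaces the manual ._41/._42/._43 fix-up from the question. A short sketch continuing the snippet above (spreadDegrees and world_matrix are assumed names):
float spreadRadians = XMConvertToRadians(spreadDegrees); // Unity's Quaternion.Euler takes degrees
std::uniform_real_distribution<float> dis(-spreadRadians, spreadRadians);
// ... build `transform` exactly as above ...
XMStoreFloat4x4(&world_matrix, transform); // position is already baked in; no manual fix-up needed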
BTW, if you are used to Unity or XNA Game Studio C# math libraries, you might want to check out the SimpleMath wrapper for DirectXMath in DirectX Tool Kit.
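To give a flavor of it, here is a sketch of the same construction in SimpleMath (parentRotation and parentPosition are assumed names; CreateFromYawPitchRoll takes radians, and the multiplication order is worth double-checking against your conventions):
using namespace DirectX::SimpleMath;

Quaternion cone = Quaternion::CreateFromYawPitchRoll(dis(gen), dis(gen), 0.0f);
Quaternion rot = cone * parentRotation; // verify the ordering against your setup
Matrix world = Matrix::CreateFromQuaternion(rot) * Matrix::CreateTranslation(parentPosition);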

Related

Matrix multiplication does not work in my vertex shader

Currently, I am calculating the World View Projection Matrix in my application instead of the GPU. I want to move this calculation to the GPU, but I am currently unable to do so.
Case 1 (see below) works very well, but case 2 doesn't and I have no idea what I've done wrong.
In my camera class, I calculate the View and the Projection matrices like this:
ViewMatrix = SharpDX.Matrix.LookAtLH(_cameraPosition, _lookAtPosition, SharpDX.Vector3.UnitY);
ProjectionMatrix = SharpDX.Matrix.PerspectiveFovLH((float)Math.PI / 4.0f, renderForm.ClientSize.Width / (float)renderForm.ClientSize.Height, 0.1f, 500.0f);
Then, I calculate the World matrix for each of my models during the render process:
SharpDX.Matrix worldMatrix = SharpDX.Matrix.Translation(_position);
Case 1: Calculation of matrix in my application
When rendering a model, I calculate the World View Projection Matrix like this:
SharpDX.Matrix matrix = SharpDX.Matrix.Multiply(worldMatrix, camera.ViewMatrix);
matrix = SharpDX.Matrix.Multiply(matrix, camera.ProjectionMatrix);
matrix.Transpose();
And in my vertex shader, I calculate the final position of my vertices by calling:
output.pos = mul(input.pos, WVP);
And everything works fine!
Case 2: Calculation of matrix in HLSL
Instead of calculating anything in my application, I just write the three matrices World, View and Projection into my vertex shader's constant buffer and calculate everything in HLSL:
matrix mat = mul(World, View);
mat = mul(mat, Projection);
mat = transpose(mat);
output.pos = mul(input.pos, mat);
It does not work; I don't see anything in my scene, so I assume some calculations are wrong. I checked my code several times.
Either I am blind or stupid. What did I do wrong?
In HLSL you don't need to calculate the transpose. You should also use float4x4, so it's easy to see which dimensions you are using.
Your matrices just should look like:
float4x4 worldViewProj = mul(world, mul(view, proj));
float4 pos = mul(input.pos, worldViewProj);
Keep in mind that points are of the form float4(x, y, z, 1) and vectors are float4(x, y, z, 0).
In linear algebra, multiplication of a vector is
p' = M · p
so you need the transpose to move M to the other side:
p'^T = p^T · M^T (where ^T denotes the transpose)
HLSL is a bit different.
The easiest way is to multiply all the matrices from left to right and then use the mul function with your vector on the left, just as in the example above.
For more information read this: HLSL mul()
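To spell out why dropping the transpose works in Case 2 (assuming SharpDX's row-major matrices go into a constant buffer with HLSL's default column_major packing): the shader then effectively sees each matrix transposed, and the transposes fold into exactly the column-vector product:
mul(W^T, mul(V^T, P^T)) = W^T · V^T · P^T = (P · V · W)^T
mul(v, (P · V · W)^T) = v · (P · V · W)^T = ((P · V · W) · v^T)^T
which is the same point Case 1 computes by transposing the combined matrix on the CPU before uploading it.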
I spent several days experimenting with the HLSL shaders and my render functions. It turned out that I had transposed one matrix which I shouldn't have. As a result, my whole scene was messed up. It was not much fun to debug, but it works now! :-)

LookAt Rotation Using Euler Axis Angles

I'm using the Blender Game Engine and Python. I made a script that makes an empty follow my cursor in 3D space (I use the keyboard for height for now).
Now I want to implement a LookAt function for a general object rather than a camera, using Python. I want the object to look exactly at the point I'm hovering over on screen (the empty's position). For now I'm using a cube, so basically one face of the cube should always face the empty.
So I thought of using matrices or quaternions, but the problem is that all I have is a direction vector, and I chose the x axis as the local look direction. So either way I need to calculate the Euler angles and convert them to axis-angle rotations (theta * [axis]).
The resources I have in the Blender Game Engine are: mathutils (which provides quaternions, Euler-based rotations via axis-angles, and matrices), though it doesn't have any updated documentation, which is just annoyingly horrible! I have to print help() to get some sort of info!
Now I've been able to make the object look at the empty when rotating only about the Z axis. I used a little trick that handles the angle sign for me using simple trigonometry, so the sign is handled and I don't need any matrix trickery or quaternions. The problem begins when I try a second rotation: I want to rotate about the Y axis for the up-down look (as is known, in 3D we need two rotations to face something; a third would only turn the view upside-down, "rolling the camera"), since the remaining local axis is the look direction itself.
Here's my script:
import bge
from mathutils import Vector, Matrix
import math
# Basic stuff
cont = bge.logic.getCurrentController()
own = cont.owner
scene = bge.logic.getCurrentScene()
c = scene.objects["Cube"]
e = scene.objects["Empty"]
# axes (we're using localOrientation)
x = Vector((1.0,0.0,0.0))
y = Vector((0.0,1.0,0.0))
z = Vector((0.0,0.0,1.0))
vec = Vector(e.worldPosition - c.worldPosition) # direction vector
# Converting the direction vector into Euler angles.
# Using trigonometry we get: tan(psi) = cos(phi2)/cos(phi1)
# where phi1 is the angle between the vector and the x axis (Euler angle)
# and phi2 is the angle between the vector and the y axis.
# psi is the z rotation angle.
# get cos(euler_angle)
phi1 = vec.dot(x)/vec.length # = cos p1
phi2 = vec.dot(y)/vec.length # = cos p2
phi3 = vec.dot(z)/vec.length # = cos p3
# get the rotation/steer angles
zAngle = math.atan(phi2/phi1)
yAngle = math.atan2(phi3,phi1)
xAngle = math.atan(phi2/phi3)
# use only 2 of them, as the third must adapt (also: x is the look direction; rotating about it would roll the view)
r = c.localOrientation.to_euler()
r.z = zAngle
r.y = -yAngle
#r.x = xAngle
c.localOrientation = r
Separately, each axis works perfectly, but when combined there are little jump glitches when I pass through the global Y axis.
Also, it seems that the "local" orientation in Blender is just the same as the worldOrientation, which is also annoying because I'm no longer sure what frame of reference I'm working in. If anyone knows, please help!
Edit 1:
Apparently there's a built-in logic block that handles this for me, and when I press "3D" it tracks AND succeeds in rotating about BOTH axes. Still, I want to know what's wrong with my script! What did the 3D button do that I didn't?
Edit 2:
I stopped relying on trig tricks and found out that when I use local orientation I ALWAYS get gimbal lock in one axis. That's probably what happened behind the scenes. Thanks to anyone interested; if you have any good trick I'd still be glad to hear it =]!
I have a youtube tutorial on how to make the camera look at specific objects. It may help.
https://www.youtube.com/watch?v=hwbObDkiJrE
But the concept, when using the GUI, is to open the Object -> Relations panel and, for the object that should do the LookAt, make it the child of the object you want it to follow (the parent). You then select 'Vertex' as the relationship. This will then affect the rotation angles of the child object only.
Try this,
bpy.data.objects['child'].parent = bpy.data.objects['parent']
bpy.data.objects['child'].parent_type = 'VERTEX'
and actually there is more info here
https://blender.stackexchange.com/questions/26108/how-do-i-parent-objects
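Beyond the parenting trick: the underlying math problem (aiming an object along a direction vector) can be solved without Euler angles at all, which sidesteps the gimbal lock from Edit 2, by building an orthonormal basis straight from the direction vector. A minimal sketch in C++ (the thread's code is Python, but the same construction maps onto a mathutils.Matrix; the z-up world-up choice and the helper names are assumptions):
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b)   { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static Vec3 cross(Vec3 a, Vec3 b) { return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x }; }
static float dot(Vec3 a, Vec3 b)  { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3 normalize(Vec3 a)     { float l = std::sqrt(dot(a, a)); return { a.x/l, a.y/l, a.z/l }; }

// Build a 3x3 orientation whose local x axis looks from `eye` toward `target`
// (x as the look direction, matching the question's convention).
static void lookAt(Vec3 eye, Vec3 target, float out[3][3])
{
    Vec3 worldUp = { 0.f, 0.f, 1.f };             // Blender is z-up
    Vec3 look = normalize(sub(target, eye));      // local +X
    Vec3 side = normalize(cross(worldUp, look));  // local +Y (degenerate if look is parallel to worldUp)
    Vec3 up   = cross(look, side);                // local +Z
    // columns of the rotation matrix are the local axes expressed in world space
    out[0][0] = look.x; out[0][1] = side.x; out[0][2] = up.x;
    out[1][0] = look.y; out[1][1] = side.y; out[1][2] = up.y;
    out[2][0] = look.z; out[2][1] = side.z; out[2][2] = up.z;
}
Assigning the resulting matrix as the object's orientation (rather than going through to_euler()) should avoid the per-axis jump glitches entirely.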

Unprojecting screen coords to world in OpenGL ES 2.0

Long time listener, first time caller.
So I have been playing around with the Android NDK and I'm at a point where I want to unproject a tap to world coordinates, but I can't make it work.
The problem is that the x and y values of both the near and far points are the same, which doesn't seem right for a perspective projection. Everything in the scene draws OK, so I'm a bit confused why it wouldn't unproject properly. Anyway, here is my code; please help, thanks!
//x and y are the normalized screen coords
ndk_helper::Vec4 nearPoint = ndk_helper::Vec4(x, y, 1.f, 1.f);
ndk_helper::Vec4 farPoint = ndk_helper::Vec4(x, y, 1000.f, 1.f);
ndk_helper::Mat4 inverseProjView = this->matProjection * this->matView;
inverseProjView = inverseProjView.Inverse();
nearPoint = inverseProjView * nearPoint;
farPoint = inverseProjView * farPoint;
nearPoint = nearPoint * (1 / nearPoint.w_);
farPoint = farPoint * (1 / farPoint.w_);
Well, after looking at the vector/matrix math code in ndk_helper, this isn't a surprise. In short: don't use it. After scanning through it for a couple of minutes, it has some obvious mistakes that look like simple typos. In particular, the Vec4 class is mostly useless for the kind of vector operations you need for graphics. Most of the operations assume that a Vec4 is a vector in 4D space, not a vector containing homogeneous coordinates in 3D space.
If you want, you can check it out here, but be prepared for a few face palms:
https://android.googlesource.com/platform/development/+/master/ndk/sources/android/ndk_helper/vecmath.h
For example, this is the implementation of the multiplication used in the last two lines of your code:
Vec4 operator*( const float& rhs ) const
{
    Vec4 ret;
    ret.x_ = x_ * rhs;
    ret.y_ = y_ * rhs;
    ret.z_ = z_ * rhs;
    ret.w_ = w_ * rhs;
    return ret;
}
This multiplies a vector in 4D space by a scalar, but is completely wrong if you're operating with homogeneous coordinates. Which explains the results you are seeing.
I would suggest that you either write your own vector/matrix library that is suitable for graphics type operations, or use one of the freely available libraries that are tested, and used by others.
BTW, the specific values you are using for your test look somewhat odd. You definitely should not be getting the same results for the two vectors, but it's probably not what you had in mind anyway. For the z coordinate of your input vectors, you are using the distances of the near and far planes in eye coordinates. But then you apply the inverse view-projection matrix to those vectors, which transforms them from clip/NDC space back into world space. So your input vectors for this calculation should be in clip/NDC space, which means the z-coordinate values corresponding to the near/far planes should be -1 and 1.
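For reference, a minimal sketch of the same unprojection done with GLM instead of ndk_helper (assuming x and y are already NDC values in [-1, 1]; with the standard GL clip volume the near and far planes sit at NDC z = -1 and +1):
#include <glm/glm.hpp>

// Returns the world-space point corresponding to an NDC position.
glm::vec3 unproject(const glm::mat4& proj, const glm::mat4& view,
                    float x, float y, float ndcZ)
{
    glm::mat4 invProjView = glm::inverse(proj * view);
    glm::vec4 p = invProjView * glm::vec4(x, y, ndcZ, 1.0f);
    return glm::vec3(p) / p.w; // perspective divide back to world space
}

// usage: near/far endpoints of the pick ray
// glm::vec3 nearPoint = unproject(matProjection, matView, x, y, -1.0f);
// glm::vec3 farPoint  = unproject(matProjection, matView, x, y,  1.0f);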

LibGDX Matrix4 3D rotation problems given direction Vector3

By now I am so confused that I'm not sure of my vector math anymore. I have a Matrix4, MatrixA, representing an object's (sensor cube's) world transform. I want to place this object so that its forward direction points in the same direction as a given normalized Vector3, VecA. I also want to translate the object (e.g. 4 units) in VecA's direction from a given point, VecB (the translation part works, using the same direction vector, VecA).
I have tried all the ways I can think of, including rotate() + translate(), setToWorld(), setToLookAt(), setToRotation(), and manually editing the values (column 3) of the Matrix4 (this gave the best results in terms of rotation, but I get a skewed cube).
I know my direction vector (VecA) is OK (by printing its value and also visually confirming it via the working translation using the same vector).
Can someone please tell me how I should do to achieve my desired results, thanks!
Assuming your unrotated "forward direction" is (0,0,1), your unrotated "up direction" is (0,1,0), and you don't want to rotate the up direction (if possible), then something like this (untested code) should be what you need:
Vector3 vx = new Vector3(), vy = new Vector3(), vz = new Vector3();
Matrix4 m = new Matrix4();
...
vecB.set(vecA).scl(4.f); // if I understand correctly, this is what you want
vz.set(vecA).nor();
vx.set(vz).crs(0, 1, 0).nor();
vy.set(vz).crs(vx).nor();
m.idt();
m.val[Matrix4.M00] = vx.x; m.val[Matrix4.M01] = vx.y; m.val[Matrix4.M02] = vx.z;
m.val[Matrix4.M10] = vy.x; m.val[Matrix4.M11] = vy.y; m.val[Matrix4.M12] = vy.z;
m.val[Matrix4.M20] = vz.x; m.val[Matrix4.M21] = vz.y; m.val[Matrix4.M22] = vz.z;
m.trn(vecB);
It is possible that you need to switch the crs arguments though (e.g. vy.set(vx).crs(vz).nor(), in case the rotation is upside-down). Alternatively you could use a Quaternion to specify the rotation and use m.set(vecB, rotationQuaternion);.

Rotate vector using Java 3D

I'm attempting to use Java 3D to rotate a vector. My goal is to create a transform that will make the vector parallel with the y-axis. To do this, I calculated the angle between the original vector and an identical vector except with a z value of 0 (original x, original y, 0 for the z-value). I then did the same thing for the y-axis (original x, 0 for the y-value, original z). I then used each angle to create two Transform3D objects, multiplied them together, and applied the result to the vector. My code is as follows:
Transform3D yRotation = new Transform3D();
Transform3D zRotation = new Transform3D();
//create new normal vector
Vector3f normPoint = new Vector3f (normal.getX(), normal.getY(), normal.getZ());
//****Z rotation methods*****
Vector3f newNormPointZ = new Vector3f(normal.getX(), normal.getY(),0.0F);
float zAngle = normPoint.angle(newNormPointZ);
zRotation.rotZ(zAngle);
//****Y rotation methods*****
Vector3f newNormPointY = new Vector3f(normal.getX(),0.0F, normal.getZ());
float yAngle = normPoint.angle(newNormPointY);
yRotation.rotY(yAngle);
//combine the two rotations
yRotation.mul(zRotation);
System.out.println("before trans normal = " +normPoint.x + ", "+normPoint.y+", "+normPoint.z);
//PRINT STATEMENT RETURNS: before trans normal = 0.069842085, 0.99316376, 0.09353002
//perform transform
yRotation.transform(normPoint);
System.out.println("normal trans = " +normPoint.x + ", "+normPoint.y+", "+normPoint.z);
//PRINT STATEMENT RETURNS: normal trans = 0.09016449, 0.99534255, 0.03411238
I was hoping the transform would produce x and z values at or very close to 0. While the logic makes sense to me, I'm obviously missing something.
If your goal is to rotate a vector to be parallel to the y axis, why can't you just set it manually, using the magnitude of the vector, to <0, MAGNITUDE, 0>?
Also, you should know that rotating a vector to be directly pointing +Y or -Y can cause some rotation implementations to break, since they operate according to the "world up" vector, or, <0,1,0>. You can solve this by building your own rotation system and using the "world out" vector <0,0,1> when rotating directly up.
If you have some other purpose for this, fastgraph helped me with building rotation matrices.
It's best to understand the math of what's going on so that you know what to do in the future.
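One concrete way to see the math: the two separately measured angles in the question are not independent Euler angles, so composing rotZ and rotY from them misses the target. A single axis-angle rotation works instead, rotating about axis = v x y by the full angle between v and y. A sketch in C++ (the thread uses Java 3D, where the resulting axis and angle could go into an AxisAngle4f passed to Transform3D.setRotation; the tiny vector helpers here are minimal stand-ins):
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3 cross(Vec3 a, Vec3 b)  { return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x }; }
static float dot(Vec3 a, Vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }
static float length(Vec3 a)        { return std::sqrt(dot(a, a)); }
static Vec3 scale(Vec3 a, float s) { return { a.x*s, a.y*s, a.z*s }; }
static Vec3 add(Vec3 a, Vec3 b)    { return { a.x+b.x, a.y+b.y, a.z+b.z }; }

int main()
{
    Vec3 v  = { 0.069842085f, 0.99316376f, 0.09353002f }; // the normal from the question
    Vec3 up = { 0.f, 1.f, 0.f };

    // one rotation taking v onto +Y: axis = normalize(v x up), angle = angle between v and up
    Vec3 axis = cross(v, up);
    float axisLen = length(axis);
    if (axisLen < 1e-6f) return 0; // v already (anti)parallel to Y; handle separately
    axis = scale(axis, 1.f / axisLen);
    float angle = std::acos(dot(v, up) / length(v));

    // verify with Rodrigues' formula:
    // v' = v cos(a) + (axis x v) sin(a) + axis (axis . v)(1 - cos(a))
    float c = std::cos(angle), s = std::sin(angle);
    Vec3 vr = add(add(scale(v, c), scale(cross(axis, v), s)),
                  scale(axis, dot(axis, v) * (1.f - c)));
    std::printf("aligned: %f, %f, %f\n", vr.x, vr.y, vr.z); // ~ (0, |v|, 0)
    return 0;
}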
