Matrix multiplication does not work in my vertex shader

Currently, I am calculating the World View Projection Matrix in my application instead of the GPU. I want to move this calculation to the GPU, but I am currently unable to do so.
Case 1 (see below) works very well, but case 2 doesn't and I have no idea what I've done wrong.
In my camera class, I calculate the View and the Projection matrices like this:
ViewMatrix = SharpDX.Matrix.LookAtLH(_cameraPosition, _lookAtPosition, SharpDX.Vector3.UnitY);
ProjectionMatrix = SharpDX.Matrix.PerspectiveFovLH((float)Math.PI / 4.0f, renderForm.ClientSize.Width / (float)renderForm.ClientSize.Height, 0.1f, 500.0f);
Then, I calculate the World matrix for each of my models during the render process:
SharpDX.Matrix worldMatrix = SharpDX.Matrix.Translation(_position);
Case 1: Calculation of matrix in my application
When rendering a model, I calculate the World View Projection Matrix like this:
SharpDX.Matrix matrix = SharpDX.Matrix.Multiply(worldMatrix, camera.ViewMatrix);
matrix = SharpDX.Matrix.Multiply(matrix, camera.ProjectionMatrix);
matrix.Transpose();
And in my vertex shader, I calculate the final position of my vertices by calling:
output.pos = mul(input.pos, WVP);
And everything works fine!
Case 2: Calculation of matrix in HLSL
Instead of calculating anything in my application, I just write the three matrices World, View and Projection into my vertex shader's constant buffer and calculate everything in HLSL:
matrix mat = mul(World, View);
mat = mul(mat, Projection);
mat = transpose(mat);
output.pos = mul(input.pos, mat);
It does not work: I don't see anything in my scene, so I assume some of the calculations are wrong. I have checked my code several times.
Either I'm blind, or I'm missing something obvious. What did I do wrong?

In HLSL you don't need to calculate the transpose. You should also use float4x4 instead of matrix, so it's easy to see which dimensions you're using.
Your matrix setup should just look like:
float4x4 worldViewProj = mul(world, mul(view, proj));
float4 pos = mul(input.pos, worldViewProj);
Keep in mind that points are of type float4(x,y,z,1) and vectors are float4(x,y,z,0).
In linear algebra, transforming a vector is written
p' = M · p
so to move the vector to the other side of M you need the transpose:
p'^T = p^T · M^T (where T denotes the transpose)
HLSL is a bit different.
The easiest way is to multiply all matrices from left to right and then call mul with your vector on the left, just as in the example above.
For more information read this: HLSL mul()
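To see the row-vector versus column-vector point concretely, here is a small NumPy sketch (NumPy standing in for the shader math; the translation matrix is just an arbitrary example):

```python
import numpy as np

# Arbitrary example transform: translate by (1, 2, 3).
M = np.eye(4)
M[:3, 3] = [1.0, 2.0, 3.0]

p = np.array([5.0, 6.0, 7.0, 1.0])  # a point, so w = 1

# Column-vector convention: p' = M . p
p_col = M @ p

# Row-vector convention (what HLSL's mul(vector, matrix) uses):
# p'^T = p^T . M^T -- same point, transposed matrix.
p_row = p @ M.T

# Both give the translated point (6, 8, 10, 1).
assert np.allclose(p_col, p_row)
```

This is also why transposing on the CPU and then transposing again in the shader cancels out: one convention flip is enough.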

I spent several days experimenting with the HLSL shaders and with my render functions. It turned out that I transposed one matrix which I shouldn't have. As a result, my whole scene was messed up. It was not much fun to debug, but it works now! :-)

Related

Slant/Skew a Texture - Monogame

I am trying to Slant/Skew a texture to create some shadows for my game.
I have read over this helpful answer that shows this can be done by passing a matrix to spriteBatch.Begin().
Because my linear algebra skills are not very developed, I am having some troubles meeting my desired results. I am hoping to skew my shadow so it looks similar to the following. Where the shadow is slanted by an angle, but the bottom of the shadow lines up with the (feet in this case) bottom of the sprite.
I originally tried the skew matrix provided in the solution above:
Matrix skew = Matrix.Identity;
skew.M12 = (float)Math.Tan(MathHelper.ToRadians(36.87f));
But this ends up skewing the shadow relative to the world's origin. I see the solution also notes this, and provides the following to transform about the sprite instead.
Matrix myMatrix = Matrix.CreateTranslation(-100, -100, 0)
* Matrix.CreateScale(2f, 0.5f, 1f)
* Matrix.CreateTranslation(100, 100, 0);
Though I'm not sure where to apply this myMatrix matrix. I have tried applying it to the shadow sprite and to the castingShadow sprite, and also multiplying them together and applying the result to the shadow, with no luck.
I have also tried using other methods like Matrix.CreateRotationX(MathHelper.ToRadians(0.87f)) with no luck.
There is actually a Matrix.CreateShadow() method too, but it requires a Plane, which my game has no notion of.
Can anyone can help me figure out the required Matrix for this slanting, or point me in the direction of some resources?
Thanks!
Okay, so I found a transform to use to get the desired slant.
Thanks to @David Gouveia and @AndreRussell from this post
Matrix matrix = Matrix.CreateRotationX(MathHelper.ToRadians(60)) *
Matrix.CreateRotationY(MathHelper.ToRadians(30)) *
Matrix.CreateScale(1,1,0);
EDIT:
So the above solution solved how I wanted to slant my texture, but had some weird positioning side effects. To address this, I ended up with a transform like the following:
Matrix slant = Matrix.CreateTranslation(-loc.X + angleX, -loc.Y, 0f) *
Matrix.CreateRotationX(MathHelper.ToRadians(angleX)) *
Matrix.CreateRotationY(MathHelper.ToRadians(30)) *
Matrix.CreateScale(1.4f, 1f, 0) *
Matrix.CreateTranslation(loc.X + angleX, loc.Y, 0f);
Where angleX was set based on the "sun" X position and loc vector is where I want the object and object's shadow to appear.
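The translate/transform/translate-back pattern used above can be checked in isolation. The following is a NumPy sketch (not XNA/MonoGame code) with an arbitrary pivot, showing that the pivot point stays fixed under the composed skew:

```python
import numpy as np

def shear_about(pivot_x, pivot_y, shear):
    """Skew along X by `shear`, keeping (pivot_x, pivot_y) fixed,
    via the same translate -> transform -> translate-back pattern."""
    to_origin = np.array([[1.0, 0.0, -pivot_x],
                          [0.0, 1.0, -pivot_y],
                          [0.0, 0.0, 1.0]])
    skew = np.array([[1.0, shear, 0.0],   # x' = x + shear * y
                     [0.0, 1.0, 0.0],
                     [0.0, 0.0, 1.0]])
    back = np.array([[1.0, 0.0, pivot_x],
                     [0.0, 1.0, pivot_y],
                     [0.0, 0.0, 1.0]])
    return back @ skew @ to_origin

M = shear_about(100.0, 100.0, np.tan(np.radians(36.87)))
pivot = M @ np.array([100.0, 100.0, 1.0])
# The pivot (the sprite's feet) maps back to itself: (100, 100).
```

With the pivot at the feet of the sprite, points above the feet lean over while the feet line stays anchored, which is the effect described in the question.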

GLSL Shader: FFT-Data as Circle Radius

I'm trying to create a shader that converts FFT data (passed as a texture) to a bar graph and then maps it onto a circle in the center of the screen. Here is an image of what I'm trying to achieve: link to image
I experimented a bit with Shadertoy and came up with this shader: link to shadertoy
With all the complex shaders I've seen on Shadertoy, I thought this should be doable with some maths.
Can anybody here give me a hint on how to do it?
It’s very doable — you just have to think about the ranges you’re sampling in. In your Shadertoy example, you have the following:
float r = length(uv);
float t = atan(uv.y, uv.x);
fragColor = vec4(texture2D(iChannel0, vec2(r, 0.1)));
So r is going to vary roughly from 0…1 (extending past 1 in the corners), and t—the angle of the uv vector—is going to vary from 0…2π.
Currently, you’re sampling your texture at (r, 0.1)—in other words, every pixel of your output will come from the V position 10% down your source texture and varying across it. The angle you’re calculating for t isn’t being used at all. What you want is for changes in the angle (t) to move across your texture in the U direction, and for changes in the distance-from-center (r) to move across the texture in the V direction. In other words, this:
float r = length(uv);
float t = atan(uv.y, uv.x) / 6.283; // normalize it to a [0,1] range - 6.283 = 2*pi
fragColor = vec4(texture2D(iChannel0, vec2(t, r)));
For the source texture you provided above, you may find your image appearing “inside out”, in which case you can subtract r from 1.0 to flip it.
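The same polar mapping can be sketched on the CPU to check the ranges. This is plain Python rather than GLSL, and it wraps the negative half of the angle explicitly (in the shader, the texture's repeat wrapping can do that for you):

```python
import math

def polar_uv(x, y):
    """Map centered screen coords to (u, v) = (angle, radius)."""
    r = math.hypot(x, y)                      # distance from center
    t = math.atan2(y, x) / (2.0 * math.pi)    # angle, now in [-0.5, 0.5]
    if t < 0.0:
        t += 1.0                              # wrap to [0, 1)
    return t, r

# A point straight up from the center, half-way out:
u, v = polar_uv(0.0, 0.5)   # u = 0.25 (a quarter turn), v = 0.5
```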

Applying a "Spread" value to an XMFLOAT4X4

I'm attempting to add a small value to a World Matrix in order to replicate the accuracy of a fired weapon [pistol, assault rifle]
Currently, my World Matrix resides at a Parent Objects' position, with the ability to rotate about the Y axis exclusively.
I've done this in Unity3D, running whenever the object needs to be created [once per]:
var coneRotation = Quaternion.Euler(Random.Range(-spread, spread), Random.Range(-spread, spread), 0);
var go = Instantiate(obj, parent.transform.position, transform.rotation * coneRotation) as GameObject;
and am attempting to replicate the results using Direct3D11.
This lambda returns a random value between [-1.5, 1.5] currently:
auto randF = [&](float lower_bound, float upper_bound) -> float
{
    return lower_bound + static_cast<float>(rand()) / (static_cast<float>(RAND_MAX / (upper_bound - lower_bound)));
};
My first thought was to simply multiply random x and y values into the forward vector of an object upon initialization, and to move it in this fashion: position = position + forward * speed * dt; [speed being 1800]. However, the rotation is incorrect (not to mention the bullets fire upward).
I've also attempted to make a Quaternion [as in Unity3D]: XMVECTOR quaternion = XMVectorSet(random_x, random_y, 0) and creating a Rotation Matrix using XMMatrixRotationQuaternion.
Afterwards I call XMStoreFloat4x4(&world_matrix, XMLoadFloat4x4(&world_matrix) * rotation);, and restore the position portion of the matrix [accessing world_matrix._41/._42/._43] (world_matrix being the matrix of the "bullet" itself, not the parent).
[I've also tried to reverse the order of the multiplication]
I've read that the XMMatrixRotationQuaternion doesn't return as an Euler Quaternion, and XMQuaternionToAxisAngle does, though I'm not entirely certain how to use it.
What would be the proper way to accomplish something like this?
Many thanks!
Your code XMVECTOR quaternion = XMVectorSet(random_x, random_y, 0); is not creating a valid quaternion. First, if you did not set the w component to 1, then the 4-vector quaternion doesn't actually represent a 3D rotation. Second, a quaternion's vector components are not Euler angles.
You want to use XMQuaternionRotationRollPitchYaw which constructs a quaternion rotation from Euler angle input, or XMQuaternionRotationRollPitchYawFromVector which takes the three Euler angles as a vector. These functions are doing what Unity's Quaternion.Euler method is doing.
Of course, if you want a rotation matrix and not a quaternion, then you can use XMMatrixRotationRollPitchYaw or XMMatrixRotationRollPitchYawFromVector to directly construct a 4x4 rotation matrix from Euler angles--which actually uses quaternions internally anyhow. Based on your code snippet, it looks like you already have a base rotation as a quaternion you want to concatenate with your spread quaternion, so you probably don't want to use this option for this case.
Note: You should look at using the C++11 standard <random> rather than your home-rolled lambda wrapper around the terrible C rand function.
Something like:
std::random_device rd;
std::mt19937 gen(rd());
// spread should be in radians here (not degrees which is what Unity uses)
std::uniform_real_distribution<float> dis(-spread, spread);
XMVECTOR coneRotation = XMQuaternionRotationRollPitchYaw( dis(gen), dis(gen), 0 );
XMVECTOR rot = XMQuaternionMultiply( parentRot, coneRotation );
XMMATRIX transform = XMMatrixAffineTransformation( g_XMOne, g_XMZero, rot, parentPos );
BTW, if you are used to Unity or XNA Game Studio C# math libraries, you might want to check out the SimpleMath wrapper for DirectXMath in DirectX Tool Kit.
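For readers without DirectXMath at hand, the quaternion plumbing above can be sketched in a few lines. This is an illustrative Python version; the 1.5-degree spread value and the axis/order conventions are assumptions for the sketch, not the exact DirectXMath ones:

```python
import math
import random

def quat_mul(a, b):
    """Hamilton product of two quaternions stored as (x, y, z, w)."""
    ax, ay, az, aw = a
    bx, by, bz, bw = b
    return (aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw,
            aw * bw - ax * bx - ay * by - az * bz)

def cone_quat(pitch, yaw):
    """Small rotation about X (pitch) then Y (yaw), angles in radians --
    a sketch of what XMQuaternionRotationRollPitchYaw(pitch, yaw, 0)
    builds, up to axis-order convention."""
    qx = (math.sin(pitch / 2), 0.0, 0.0, math.cos(pitch / 2))
    qy = (0.0, math.sin(yaw / 2), 0.0, math.cos(yaw / 2))
    return quat_mul(qy, qx)

spread = math.radians(1.5)  # assumed half-angle of the spread cone
cone = cone_quat(random.uniform(-spread, spread),
                 random.uniform(-spread, spread))
# Concatenate with the parent rotation, like XMQuaternionMultiply:
parent = (0.0, 0.0, 0.0, 1.0)  # identity, standing in for the real one
rot = quat_mul(parent, cone)
```

Note that with zero spread, cone_quat returns the identity quaternion and the parent rotation passes through unchanged, which is exactly the behavior the Unity snippet relies on.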

Unprojecting Screen coords to world in OpenGL es 2.0

Long time listener, first time caller.
So I have been playing around with the Android NDK and I'm at a point where I want to Unproject a tap to world coordinates but I can't make it work.
The problem is that the x and y values for both the near and far points are the same, which doesn't seem right for a perspective projection. Everything in the scene draws OK, so I'm a bit confused why it wouldn't unproject properly. Anyway, here is my code; please help, and thanks!
//x and y are the normalized screen coords
ndk_helper::Vec4 nearPoint = ndk_helper::Vec4(x, y, 1.f, 1.f);
ndk_helper::Vec4 farPoint = ndk_helper::Vec4(x, y, 1000.f, 1.f);
ndk_helper::Mat4 inverseProjView = this->matProjection * this->matView;
inverseProjView = inverseProjView.Inverse();
nearPoint = inverseProjView * nearPoint;
farPoint = inverseProjView * farPoint;
nearPoint = nearPoint *(1 / nearPoint.w_);
farPoint = farPoint *(1 / farPoint.w_);
Well, after looking at the vector/matrix math code in ndk_helper, this isn't a surprise. In short: Don't use it. After scanning through it for a couple of minutes, it has some obvious mistakes that look like simple typos. And particularly the Vec4 class is mostly useless for the kind of vector operations you need for graphics. Most of the operations assume that a Vec4 is a vector in 4D space, not a vector containing homogenous coordinates in 3D space.
If you want, you can check it out here, but be prepared for a few face palms:
https://android.googlesource.com/platform/development/+/master/ndk/sources/android/ndk_helper/vecmath.h
For example, this is the implementation of the multiplication used in the last two lines of your code:
Vec4 operator*( const float& rhs ) const
{
    Vec4 ret;
    ret.x_ = x_ * rhs;
    ret.y_ = y_ * rhs;
    ret.z_ = z_ * rhs;
    ret.w_ = w_ * rhs;
    return ret;
}
This multiplies a vector in 4D space by a scalar, but is completely wrong if you're operating with homogeneous coordinates. Which explains the results you are seeing.
I would suggest that you either write your own vector/matrix library that is suitable for graphics type operations, or use one of the freely available libraries that are tested, and used by others.
BTW, the specific values you are using for your test look somewhat odd. You definitely should not be getting the same results for the two vectors, but it's probably not what you had in mind anyway. For the z coordinate in your input vectors, you are using the distances of the near and far planes in eye coordinates. But then you apply the inverse view-projection matrix to those vectors, which transforms them back from clip/NDC space into world space. So your input vectors for this calculation should be in clip/NDC space, which means the z-coordinate values corresponding to the near/far plane should be at -1 and 1.
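To illustrate the fix, here is a NumPy sketch of unprojection that feeds in NDC z = -1 and +1, assuming an OpenGL-style projection matrix and an identity view matrix for simplicity (the fov/near/far values are arbitrary):

```python
import numpy as np

def perspective(fov_y, aspect, near, far):
    """OpenGL-style projection matrix (column-vector convention)."""
    f = 1.0 / np.tan(fov_y / 2.0)
    return np.array([
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), 2.0 * far * near / (near - far)],
        [0.0, 0.0, -1.0, 0.0]])

def unproject(ndc_x, ndc_y, ndc_z, inv_proj_view):
    p = inv_proj_view @ np.array([ndc_x, ndc_y, ndc_z, 1.0])
    return p / p[3]   # the perspective divide is essential

proj = perspective(np.radians(90.0), 1.0, 0.1, 100.0)
view = np.eye(4)                      # camera at the origin, for simplicity
inv_pv = np.linalg.inv(proj @ view)

near_pt = unproject(0.5, 0.5, -1.0, inv_pv)  # near plane is z = -1 in NDC
far_pt  = unproject(0.5, 0.5,  1.0, inv_pv)  # far plane is z = +1 in NDC
# near_pt[:3] is about (0.05, 0.05, -0.1); far_pt[:3] about (50, 50, -100).
# The two points now differ in x and y, defining the pick ray.
```

Feeding in z = 1 and z = 1000 instead, as in the question, does not correspond to the near and far planes at all, which is why the results look wrong even with a correct matrix library.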

Setting the projectionMatrix of a Perspective Camera in Three.js

I'm trying to set the ProjectionMatrix of a Three.js Perspective Camera to match a projection Matrix I calculated with a different program.
So I set the camera's position and rotation like this:
self.camera.position.x = 0;
self.camera.position.y = 0;
self.camera.position.z = 142 ;
self.camera.rotation.x = 0.0;// -0.032
self.camera.rotation.y = 0.0;
self.camera.rotation.z = 0;
Next I created a 4x4 Matrix (called Matrix4 in Three.js) like this:
var projectionMatrix = new THREE.Matrix4(-1426.149, -145.7176, -523.0170, 225.07519, -42.40711, -1463.2367, -23.6839, 524.3322, -0.0174, -0.11928, -0.99270, 0.43826, 0, 0, 0, 1);
and changed the camera's projection Matrix entries like this:
for ( var i = 0; i < 16; i++) {
    self.camera.projectionMatrix.elements[i] = projectionMatrix.elements[i];
}
when I now render the scene I just get a black screen and can't see any of the objects I inserted. Turning the angle of the Camera doesn't help either. I still can't see any objects.
If I insert a
self.camera.updateProjectionMatrix();
after setting the camera's projection Matrix to the values of my projectionMatrix the camera is set back to the original Position (x=0,y=0,z=142 and looking at the origin where I created some objects) and the values I set in the camera's matrix seem to have been overwritten. I checked that by printing the cameras projection Matrix to the console. If I do not call the updateProjectionMatrix() function the values stay as I set them.
Does somebody have an idea how to solve this problem?
If I do not call the updateProjectionMatrix() function the values stay as I set them.
Correct; updateProjectionMatrix() calculates those 16 numbers you pasted into your projection matrix based on a bunch of parameters. Those parameters are the position and rotation you set above, plus the parameters you passed (or left at their defaults) for the camera. (The position and rotation actually make up the matrixWorld and its inverse.)
In the case of a perspective camera, you don't have much: near, far, fov and aspect. Left, right, top and bottom are derived from these; with an orthographic camera you set them directly. These are then used to compose the projection matrix.
Scratchapixel has a REALLY good tutorial on this subject. The next lesson, on the OpenGL projection matrix, is actually more relevant to WebGL. Left, right, top and bottom are made from your FOV and your aspect ratio. Add near and far and you've got yourself a projection matrix.
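To make that derivation concrete, here is the frustum math in a few lines of Python (a sketch of the calculation described above, not Three.js source; the fov/aspect/near values are arbitrary):

```python
import math

# How the four frustum sides fall out of fov, aspect and near:
fov_deg, aspect, near = 45.0, 16 / 9, 0.1

top = near * math.tan(math.radians(fov_deg) / 2.0)
bottom = -top
right = top * aspect
left = -right
# For these values, top is about 0.0414 and right about 0.0736;
# together with near and far they fully determine the projection matrix.
```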
Now, in order for this thing to work, you either have to know what you're doing, or get really lucky. Pasting these numbers from somewhere else and getting it to work is little short of winning the lottery. Best case scenario, your scale is all wrong and you're clipping your scene. Worst case, you've pasted a completely different kind of matrix, with a different XYZ convention, and there's no way you'll get it to work, or at least make sense of it.
Out of curiosity, what are you trying to do? Are you trying to match your camera to a camera from somewhere else?
