DirectX 9 - Quads becoming stretched when rotated in orthographic view

I'm working on some code to render a list of sprites as textured quads from one large vertex buffer in DirectX 9. My issue is that these textured quads become stretched when rotated and rendered in an orthographic view. The odd thing is that no stretching occurs at rotations of 0 or π radians, and the stretching is at its worst at π/2 or 3π/2 radians.
It seems plausible that this is down to the view or projection matrices, or an issue when generating the rotation matrix; however, I have been unable to find a solution despite plenty of tweaking and testing. For reference, the texture used is 32x32 pixels.
Here's the code I use initially to set the projection and view matrices for all sprites:
D3DXMATRIX projection;
D3DXMatrixOrthoOffCenterLH(&projection, -(float)ScreenWidth / 2.0f,
(float)ScreenWidth / 2.0f,
(float)ScreenHeight / 2.0f,
-(float)ScreenHeight / 2.0f,
1.0f, 1000.0f);
HR(mFX->SetMatrix(mProjectionHandle, &projection));
D3DXMATRIX view;
D3DXVECTOR3 pos(0.0f, 0.0f, -100.0f);
D3DXVECTOR3 up(0.0f, 1.0f, 0.0f);
D3DXVECTOR3 target(0.0f, 0.0f, 0.0f);
D3DXMatrixLookAtLH(&view, &pos, &target, &up);
HR(mFX->SetMatrix(mViewHandle, &view));
And the code for creating the rotation matrix:
D3DXMatrixRotationZ(&_rotation,
(2.0f * D3DX_PI) - _spriteComponent.GetRotation());
And finally the shader code:
outVS.col.a = gAlpha;
outVS.posH = mul(float4(posL, 1.0f), gWorld);
outVS.posH = mul(outVS.posH, gView);
outVS.posH = mul(outVS.posH, gProjection);
outVS.posH = mul(outVS.posH, gScaling);
outVS.posH = mul(outVS.posH, gRotation);
outVS.posH = mul(outVS.posH, gTranslation);
outVS.tex0 = tex0;
return outVS;
I have also tried using the D3DXMatrixOrthoLH() function when setting the projection matrix and the same issue occurs. Any ideas what could be causing this odd scaling?

Related

glm::ortho in Vulkan

I have a simple triangle with these vertices:
{ { 0.0f, -0.1f } },
{ { 0.1f, 0.1f } },
{ { -0.1f, 0.1f } }
Matrix:
ubo.model = glm::mat4(1.0f);
ubo.view = glm::lookAt(glm::vec3(0.0f, 0.0f, 1.0f), glm::vec3(0.0f, 0.0f, 0.0f), glm::vec3(0.0f, 1.0f, 0.0f));
ubo.proj = glm::perspective(glm::radians(45.0f), swapChainExtent.width / (float)swapChainExtent.height, 0.1f, 100.0f);
ubo.proj[1][1] *= -1;
This code works fine and I see the triangle. But if I try to use an orthographic projection:
ubo.model = glm::mat4(1.0f);
ubo.view = glm::lookAt(glm::vec3(0.0f, 0.0f, 1.0f), glm::vec3(0.0f, 0.0f, 0.0f), glm::vec3(0.0f, 1.0f, 0.0f));
ubo.proj = glm::ortho(0.0F, swapChainExtent.width, swapChainExtent.height, 0.0F);
ubo.proj[1][1] *= -1;
I do not see anything. :(
I tried googling this problem and found no solution. What's my mistake?
Update:
Solved:
rasterizer.frontFace = VK_FRONT_FACE_CLOCKWISE;
...
ubo.proj = glm::ortho(0.0F, swapChainExtent.width, swapChainExtent.height, 0.1F, 1000.0F);
First, I don't know what Z range this overload of glm::ortho produces; maybe your vertices don't fit in that range. There is an overloaded version of this function which lets you provide a Z/depth range. Try providing a range that covers your Z/depth values, or try moving the vertices further from or closer to the camera, or provide a generous range like -1000 to +1000.
And another problem: how big is your swapchain? If it is, for example, 800 x 600 pixels, then you are specifying a rendering area from [0, 0] to [800, 600] (in pixels), but providing vertices that lie in an area smaller than a single pixel, from [-0.1, -0.1] to [0.1, 0.1] (still in pixels). It's no surprise you don't see anything: your whole triangle is smaller than a single pixel.
These two problems together are probably why you see nothing. When you fix the depth, you still see nothing because the triangle is too small; when you enlarge the triangle (without changing the depth), it is view-frustum culled. Change the size of your triangle and then try changing the depth values of the vertices.

OpenGL Matrix scale then Translate is still scaling my position

I am trying to position my text model mesh on screen. Using the code below, it draws the mesh as the code suggests, with the left edge of the mesh at the center of the screen. But I would like to position it at the left edge of the screen, and this is where I get stuck. If I uncomment the Matrix.translateM line, I would expect the position to now be at the left of the screen, but it seems that the translation is being scaled (!?)
A few scenarios I have tried:
a.) Matrix.scaleM only (no Matrix.translateM) = the left of the mesh is positioned at 0.0f (center of screen), with the correct scale.
b.) Matrix.translateM only (no Matrix.scaleM) = the left of the mesh is positioned at -1.77f (left of screen) correctly, but the scale is incorrect.
c.) Matrix.translateM then Matrix.scaleM, or Matrix.scaleM then Matrix.translateM = the scale is correct, but the position is incorrect. The translation appears to be scaled, leaving the mesh much closer to the center than to the left of the screen.
I am using OpenGL ES 2.0 in Android Studio programming in Java.
Screen bounds (as setup from Matrix.orthoM)
left: -1.77, right: 1.77 (center is 0.0), top: -1.0, bottom: 1.0 (center is 0.0)
Mesh height is 1.0f, so if no Matrix.scaleM, the mesh takes the entire screen height.
float ratio = (float) 1920.0f / 1080.0f;
float scale = 64.0f / 1080.0f; // 64px height to projection matrix
Matrix.setIdentityM(modelMatrix, 0);
Matrix.scaleM(modelMatrix, 0, scale, scale, scale); // these two lines
//Matrix.translateM(modelMatrix, 0, -ratio, 0.0f, 0.0f); // these two lines
Matrix.setIdentityM(mMVPMatrix, 0);
Matrix.orthoM(mMVPMatrix, 0, -ratio, ratio, -1.0f, 1.0f, -1.0f, 1.0f);
Matrix.multiplyMM(mMVPMatrix, 0, mMVPMatrix, 0, modelMatrix, 0);
Thanks, Ed Halferty and Matic Oblak, you are both correct. As Matic suggested, I have now put the Matrix.TranslateM first, then Matrix.scaleM second. I have also ensured that the MVPMatrix is indeed modelviewprojection, and not projectionviewmodel.
Also, now with Matrix.translateM for the model mesh to -1.0f, it is to the left edge of the screen, which is better than -1.77f in any case.
Correct position + scale, thanks!
float ratio = (float) 1920.0f / 1080.0f;
float scale = 64.0f / 1080.0f;
Matrix.setIdentityM(modelMatrix, 0);
Matrix.translateM(modelMatrix, 0, -1.0f, 0.0f, 0.0f);
Matrix.scaleM(modelMatrix, 0, scale, scale, scale);
Matrix.setIdentityM(mMVPMatrix, 0);
Matrix.orthoM(mMVPMatrix, 0, -ratio, ratio, -1.0f, 1.0f, -1.0f, 1.0f);
Matrix.multiplyMM(mMVPMatrix, 0, modelMatrix, 0, mMVPMatrix, 0);

How to scale and rotate textures in opengl es?

I am using OpenGL ES for my iPhone game. To scale and rotate my object I do this:
glScalef( scaleX , scaleY ,1);
glRotatef(rotationZ, 0.0f, 0.0f, 1.0f);
I am using an ortho screen set up with glOrthof(-1, 1, -1, 1, -1, 1). My problem is that when I rotate objects, the image gets skewed. I understand why this is happening: I am scaling with respect to the screen size, so rotation changes the image's apparent proportions.
What can I do to prevent it from getting skewed?
glViewport(0,0, (GLint)screenWidth, (GLint)screenHeight);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrthof(-1,1,-1,1,-1,1);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glPushMatrix();
glTranslatef(positionX, positionY,0.0f);
glScalef(scaleX , scaleY ,1);
glRotatef(rotationZ, 0.0f, 0.0f, 1.0f);
Use an ortho projection that matches the aspect ratio of your screen rather than just passing a bunch of ones. Unless you have a square screen, your left/right range shouldn't equal your top/bottom range, or you will see skew.

How to position a textured quad in screen coordinates?

I am experimenting with different matrices, studying their effect on a textured quad. So far I have implemented Scaling, Rotation, and Translation matrices fairly easily - by using the following method against my position vectors:
for (int a = 0; a < noOfVertices; a++)
{
    myVectorPositions[a] = SlimDX.Vector3.TransformCoordinate(myVectorPositions[a], myPerspectiveMatrix);
}
However, what I want to do is position my vectors using world-space coordinates, not object-space ones.
At the moment my position vectors are declared thusly:
myVectorPositions[0] = new Vector3(-0.1f, 0.1f, 0.5f);
myVectorPositions[1] = new Vector3(0.1f, 0.1f, 0.5f);
myVectorPositions[2] = new Vector3(-0.1f, -0.1f, 0.5f);
myVectorPositions[3] = new Vector3(0.1f, -0.1f, 0.5f);
On the other hand (and as part of learning about matrices) I have read that I need to apply a matrix to get to screen coordinates. I've been looking through the SlimDX API docs and can't seem to pin down the one I should be using.
In any case, hopefully the above makes sense and what I am trying to achieve is clear. I'm aiming for a simple 1024 x 768 window as my application area, and want to position my textured quad at (10, 10). How do I go about this? Most confused right now.
I am not familiar with SlimDX, but in native DirectX, if you want to draw a quad in screen coordinates, you should define the vertex format as pre-transformed; that is, you specify the screen coordinates directly instead of using the D3D transform engine to transform your vertices. The vertex format definition is as below:
#define SCREEN_SPACE_FVF (D3DFVF_XYZRHW | D3DFVF_DIFFUSE)
and you can define your vertex like this
ScreenVertex Vertices[] =
{
// Triangle 1
{ 150.0f, 150.0f, 0, 1.0f, 0xffff0000, }, // x, y, z, rhw, color
{ 350.0f, 150.0f, 0, 1.0f, 0xff00ff00, },
{ 350.0f, 350.0f, 0, 1.0f, 0xff00ffff, },
// Triangle 2
{ 150.0f, 150.0f, 0, 1.0f, 0xffff0000, },
{ 350.0f, 350.0f, 0, 1.0f, 0xff00ffff, },
{ 150.0f, 350.0f, 0, 1.0f, 0xff00ffff, },
};
By default, normalized device coordinates in 3D systems run from -1 to 1 (where (-1, -1) is the bottom-left corner and (1, 1) the top-right).
To position in pixel units, you need to convert your pixel values into this space. So, for example, pixel (10, 30) on a 1024 x 768 screen is:
position.x = 10.0f * (1.0f / 1024.0f); // maps to 0/1
position.x *= 2.0f; //maps to 0/2
position.x -= 1.0f; // Maps to -1/1
Now for y you do
position.y = 30.0f * (1.0f / 768.0f); // maps to 0/1
position.y = 1.0f - position.y; //Inverts y
position.y *= 2.0f; //maps to 0/2
position.y -= 1.0f; // Maps to -1/1
Also, if you want to apply transforms to your quads, it is better to send the transformation to the shader (and do the vector transformation in the vertex shader) rather than doing the multiplications on the vertices on the CPU, since then you will not need to update your vertex buffer every time.

glMaterialfv not working for me

This is OpenGL on iPhone 4.
I'm drawing a scene using lighting and materials. Here is a snippet of my code:
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glFrustumf(-1, 1, -1, 1, -1, 1);
CGFloat ambientLight[] = { 0.5f, 0.5f, 0.5f, 1.0f };
CGFloat diffuseLight[] = { 1.0f, 1.0f, 1.0f, 1.0f };
CGFloat direction[] = { 0.0f, 0.0f, -20.0f, 0 };
glEnable(GL_LIGHT0);
glLightfv(GL_LIGHT0, GL_AMBIENT, ambientLight);
glLightfv(GL_LIGHT0, GL_DIFFUSE, diffuseLight);
glLightfv(GL_LIGHT0, GL_POSITION, direction);
glShadeModel(GL_FLAT);
glEnable(GL_LIGHTING);
glDisable(GL_COLOR_MATERIAL);
float blankColor[4] = {0,0,0,1};
float whiteColor[4] = {1,1,1,1};
float blueColor[4] = {0,0,1,1};
glMaterialfv(GL_FRONT, GL_DIFFUSE, blueColor);
glEnable(GL_CULL_FACE);
glVertexPointer(3, GL_FLOAT, 0, verts.pdata);
glEnableClientState(GL_VERTEX_ARRAY);
glNormalPointer(GL_FLOAT, 0, normals.pdata);
glEnableClientState(GL_NORMAL_ARRAY);
glDrawArrays (GL_TRIANGLES, 0, verts.size/3);
The problem is that instead of seeing the BLUE diffuse color I see white. It fades as I rotate the model, but I can't understand why it's not using my blue color.
BTW, if I change glMaterialfv(GL_FRONT, GL_DIFFUSE, blueColor) to glMaterialfv(GL_FRONT_AND_BACK, GL_DIFFUSE, blueColor) then I do see the blue color. If I call glMaterialfv(GL_FRONT, GL_DIFFUSE, blueColor) and then glMaterialfv(GL_BACK, GL_DIFFUSE, blueColor), I see white again. So it looks like GL_FRONT_AND_BACK works, but the other combinations show white. Can anyone explain this to me?
This is because of the winding order (clockwise vs. counter-clockwise):
10.090 How does face culling work? Why doesn't it use the surface normal?
OpenGL face culling calculates the signed area of the filled primitive in window coordinate space. The signed area is positive when the window coordinates are in a counter-clockwise order and negative when clockwise. An app can use glFrontFace() to specify the ordering, counter-clockwise or clockwise, to be interpreted as a front-facing or back-facing primitive. An application can specify culling either front or back faces by calling glCullFace(). Finally, face culling must be enabled with a call to glEnable(GL_CULL_FACE); .
OpenGL uses your primitive's window space projection to determine face culling for two reasons. To create interesting lighting effects, it's often desirable to specify normals that aren't orthogonal to the surface being approximated. If these normals were used for face culling, it might cause some primitives to be culled erroneously. Also, a dot-product culling scheme could require a matrix inversion, which isn't always possible (i.e., in the case where the matrix is singular), whereas the signed area in DC space is always defined.
However, some OpenGL implementations support the GL_EXT_cull_vertex extension. If this extension is present, an application may specify a homogeneous eye position in object space. Vertices are flagged as culled based on the dot product of the current normal with a vector from the vertex to the eye. If all vertices of a primitive are culled, the primitive isn't rendered. In many circumstances, using this extension…
