Why are my primitives drawn on top of each other instead of inside each other? - xna-4.0

I'm trying to draw a cube primitive and axes going through it. The axes must not rotate with the cube. I have no idea why it doesn't just work like that. If I remove axises.ApplyPasses(); and call cube.ApplyPasses(); instead, the axes go through the cube, but then they rotate along with it.
The Primitive class contains fields for PrimitiveType, VertexData, a BasicEffect object, etc. No math is done there.
*.ApplyPasses() is shorthand for Effect.CurrentTechnique.Passes[0].Apply().
What I got:
What I'm trying to do:
Cube creation:
Primitive<VertexPositionColor> cube = new Primitive<VertexPositionColor>(_graphics.GraphicsDevice);
cube.VertexOffset = 0;
cube.Type = PrimitiveType.TriangleList;
cube.VertexData = new VertexPositionColor[]
{
new VertexPositionColor(new Vector3(-0.5f, -0.5f, 0.5f), Color.Red),
new VertexPositionColor(new Vector3(-0.5f, 0.5f, 0.5f), Color.Red),
new VertexPositionColor(new Vector3(0.5f, 0.5f, 0.5f), Color.Red),
new VertexPositionColor(new Vector3(0.5f, 0.5f, 0.5f), Color.Red),
new VertexPositionColor(new Vector3(0.5f, -0.5f, 0.5f), Color.Red),
new VertexPositionColor(new Vector3(-0.5f, -0.5f, 0.5f), Color.Red),
new VertexPositionColor(new Vector3(-0.5f, -0.5f, 0.5f), Color.Green),
new VertexPositionColor(new Vector3(-0.5f, -0.5f, -0.5f), Color.Green),
new VertexPositionColor(new Vector3(-0.5f, 0.5f, -0.5f), Color.Green),
new VertexPositionColor(new Vector3(-0.5f, 0.5f, -0.5f), Color.Green),
new VertexPositionColor(new Vector3(-0.5f, 0.5f, 0.5f), Color.Green),
new VertexPositionColor(new Vector3(-0.5f, -0.5f, 0.5f), Color.Green),
new VertexPositionColor(new Vector3(-0.5f, -0.5f, 0.5f), Color.White),
new VertexPositionColor(new Vector3(0.5f, -0.5f, 0.5f), Color.White),
new VertexPositionColor(new Vector3(0.5f, -0.5f, -0.5f), Color.White),
new VertexPositionColor(new Vector3(0.5f, -0.5f, -0.5f), Color.White),
new VertexPositionColor(new Vector3(-0.5f, -0.5f, -0.5f), Color.White),
new VertexPositionColor(new Vector3(-0.5f, -0.5f, 0.5f), Color.White),
new VertexPositionColor(new Vector3(-0.5f, -0.5f, 0.5f), Color.White),
new VertexPositionColor(new Vector3(0.5f, -0.5f, 0.5f), Color.White),
new VertexPositionColor(new Vector3(0.5f, -0.5f, -0.5f), Color.White),
new VertexPositionColor(new Vector3(0.5f, -0.5f, -0.5f), Color.Blue),
new VertexPositionColor(new Vector3(0.5f, 0.5f, -0.5f), Color.Blue),
new VertexPositionColor(new Vector3(-0.5f, 0.5f, -0.5f), Color.Blue),
new VertexPositionColor(new Vector3(-0.5f, 0.5f, -0.5f), Color.Blue),
new VertexPositionColor(new Vector3(-0.5f, -0.5f, -0.5f), Color.Blue),
new VertexPositionColor(new Vector3(0.5f, -0.5f, -0.5f), Color.Blue),
new VertexPositionColor(new Vector3(0.5f, -0.5f, -0.5f), Color.Black),
new VertexPositionColor(new Vector3(0.5f, -0.5f, 0.5f), Color.Black),
new VertexPositionColor(new Vector3(0.5f, 0.5f, 0.5f), Color.Black),
new VertexPositionColor(new Vector3(0.5f, 0.5f, 0.5f), Color.Black),
new VertexPositionColor(new Vector3(0.5f, 0.5f, -0.5f), Color.Black),
new VertexPositionColor(new Vector3(0.5f, -0.5f, -0.5f), Color.Black),
new VertexPositionColor(new Vector3(0.5f, -0.5f, -0.5f), Color.Black),
new VertexPositionColor(new Vector3(0.5f, -0.5f, 0.5f), Color.Black),
new VertexPositionColor(new Vector3(0.5f, 0.5f, 0.5f), Color.Black),
new VertexPositionColor(new Vector3(0.5f, 0.5f, 0.5f), Color.Yellow),
new VertexPositionColor(new Vector3(-0.5f, 0.5f, 0.5f), Color.Yellow),
new VertexPositionColor(new Vector3(-0.5f, 0.5f, -0.5f), Color.Yellow),
new VertexPositionColor(new Vector3(-0.5f, 0.5f, -0.5f), Color.Yellow),
new VertexPositionColor(new Vector3(0.5f, 0.5f, -0.5f), Color.Yellow),
new VertexPositionColor(new Vector3(0.5f, 0.5f, 0.5f), Color.Yellow)
};
cubeMatrixes = new Matrixes();
cubeMatrixes.World = Matrix.Identity;
cubeMatrixes.View = Matrix.CreateLookAt(new Vector3(0.0f, 1.0f, 2.0f), Vector3.Zero, Vector3.Up);
cubeMatrixes.Projection = Matrix.CreatePerspectiveFieldOfView(MathHelper.ToRadians(110.0f), 4.0f / 3.0f, 0.001f, 10000.0f);
Axes creation:
Primitive<VertexPositionColor> axises = new Primitive<VertexPositionColor>(_graphics.GraphicsDevice);
axises.Type = PrimitiveType.LineList;
axises.VertexOffset = 0;
axises.VertexData = new VertexPositionColor[]
{
new VertexPositionColor(new Vector3(-5, 0, 0), Color.Red),
new VertexPositionColor(new Vector3(5, 0, 0), Color.Red),
new VertexPositionColor(new Vector3(0, -5, 0), Color.Green),
new VertexPositionColor(new Vector3(0, 5, 0), Color.Green),
new VertexPositionColor(new Vector3(0, 0, -5), Color.Blue),
new VertexPositionColor(new Vector3(0, 0, 5), Color.Blue),
};
axisesMatrixes = new Matrixes();
axisesMatrixes.World = Matrix.Identity;
axisesMatrixes.View = Matrix.CreateLookAt(new Vector3(0.0f, 1.0f, 5.0f), Vector3.Zero, Vector3.Up);
axisesMatrixes.Projection = Matrix.CreatePerspectiveFieldOfView(MathHelper.ToRadians(45.0f), 4.0f / 3.0f, 0.001f, 10000.0f);
Translating:
cube.Effect.World
= Matrix.CreateRotationX(MathHelper.ToRadians(cubeAngle.X))
* Matrix.CreateRotationY(MathHelper.ToRadians(cubeAngle.Y))
* Matrix.CreateRotationZ(MathHelper.ToRadians(cubeAngle.Z))
* cubeMatrixes.World;
cube.Effect.View = cubeMatrixes.View;
cube.Effect.Projection = cubeMatrixes.Projection;
axises.Effect.World
= Matrix.CreateRotationX(MathHelper.ToRadians(0.0f))
* Matrix.CreateRotationY(MathHelper.ToRadians(-60.0f))
* Matrix.CreateRotationZ(MathHelper.ToRadians(0.0f))
* axisesMatrixes.World;
axises.Effect.View
= axisesMatrixes.View;
axises.Effect.Projection
= axisesMatrixes.Projection;
Drawing:
protected override void Draw(GameTime gameTime)
{
GraphicsDevice.Clear(Color.CornflowerBlue);
axises.ApplyPasses();
_graphics.GraphicsDevice.DrawUserPrimitives(axises.Type, axises.VertexData, axises.VertexOffset, axises.Count);
cube.ApplyPasses();
_graphics.GraphicsDevice.DrawUserPrimitives(cube.Type, cube.VertexData, cube.VertexOffset, cube.Count);
base.Draw(gameTime);
}

The answer to this question is that I created separate projection matrices.
When I changed to a single projection matrix for both the axes and the cube, everything worked as expected.
Translating:
cube.Effect.World
= Matrix.CreateRotationX(MathHelper.ToRadians(cubeAngle.X))
* Matrix.CreateRotationY(MathHelper.ToRadians(cubeAngle.Y))
* Matrix.CreateRotationZ(MathHelper.ToRadians(cubeAngle.Z))
* cubeMatrixes.World;
cube.Effect.View = cubeMatrixes.View;
cube.Effect.Projection = cubeMatrixes.Projection;
axises.Effect.World
= Matrix.CreateRotationX(MathHelper.ToRadians(0.0f))
* Matrix.CreateRotationY(MathHelper.ToRadians(-60.0f))
* Matrix.CreateRotationZ(MathHelper.ToRadians(0.0f))
* axisesMatrixes.World;
axises.Effect.View
= axisesMatrixes.View;
axises.Effect.Projection
= cubeMatrixes.Projection; // instead of axisesMatrixes.Projection
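One way to see why the two setups could never line up: the scaling a perspective matrix applies depends on the field of view, and the two draws used 110 and 45 degrees. A hedged sketch of that scale factor (this mirrors the M22 term Matrix.CreatePerspectiveFieldOfView produces; the function itself is illustrative, not part of the asker's code):

```cpp
#include <cmath>

// Vertical scale applied by a perspective projection: cot(fovY / 2).
// (The horizontal term additionally divides by the aspect ratio.)
// Identical geometry pushed through a 110-degree and a 45-degree
// projection lands on screen at very different sizes, so the axes and
// the cube looked like two unrelated scenes composited together.
double perspectiveScale(double fovDegrees) {
    const double PI = 3.14159265358979323846;
    double halfAngle = fovDegrees * PI / 360.0; // fovDegrees / 2, in radians
    return 1.0 / std::tan(halfAngle);
}
```

perspectiveScale(110.0) is about 0.70 while perspectiveScale(45.0) is about 2.41, so the same object was drawn over three times larger in one draw than in the other.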

Related

How to change the triangle into a square

How do I change the triangle into a square in my D3D11 app? (I don't have much experience, so I don't know how to do it.) I got the code from this tutorial, http://www.directxtutorial.com/Lesson.aspx?lessonid=11-4-5, to render a green triangle, but I need to make it a square.
Instead of drawing a single triangle, you draw TWO triangles that share two vertices. The key challenge is making sure you specify them with the correct winding order for your rendering setup.
// Single multi-colored triangle
static const Vertex s_vertexData[3] =
{
{ { 0.0f, 0.5f, 0.5f, 1.0f },{ 1.0f, 0.0f, 0.0f, 1.0f } }, // Top / Red
{ { 0.5f, -0.5f, 0.5f, 1.0f },{ 0.0f, 1.0f, 0.0f, 1.0f } }, // Right / Green
{ { -0.5f, -0.5f, 0.5f, 1.0f },{ 0.0f, 0.0f, 1.0f, 1.0f } } // Left / Blue
};
...
context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
context->Draw(3, 0);
SimpleTriangle sample on GitHub
You can draw the two triangles two different ways.
The FIRST and most straightforward is using an index buffer (IB), which makes it easier to support arbitrary cull settings.
// Two triangles forming a quad with the same color at all corners
static const Vertex s_vertexData[4] =
{
{ { -0.5f, -0.5f, 0.5f, 1.0f }, { 0.f, 1.f } },
{ { 0.5f, -0.5f, 0.5f, 1.0f }, { 1.f, 1.f } },
{ { 0.5f, 0.5f, 0.5f, 1.0f }, { 1.f, 0.f } },
{ { -0.5f, 0.5f, 0.5f, 1.0f }, { 0.f, 0.f } },
};
static const uint16_t s_indexData[6] =
{
3,1,0,
2,1,3,
};
...
context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
context->DrawIndexed(6, 0, 0);
SimpleTexture sample on GitHub
The SECOND method is a bit trickier: use just 4 vertices without an index buffer by drawing them as a triangle strip. The problem here is that this constrains your winding-order choices a fair bit. If you turn off backface culling, it's really simple using the same 4 vertices:
// Create rsState with D3D11_RASTERIZER_DESC.CullMode
// set to D3D11_CULL_NONE
...
context->RSSetState(rsState);
context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLESTRIP);
context->Draw(4, 0);
As you are new to DirectX, you may want to take a look at DirectX Tool Kit.
PS: For the special case of wanting a quad that fills an entire render viewport (such as a full-screen quad), with Direct3D Hardware Feature Level 10.0 or better hardware, you can skip using an IB or VB at all and just generate the quad inside the vertex shader itself. See GitHub.

OpenGL 2.0 Setting different color for a cube

I am trying to draw a cube with different colors on each face using OpenGL ES 2.0. I can only draw the cube in one color so far. I know I need to use VertexAttribPointer in this case instead of Uniform, but I probably added it wrongly; the screen shows nothing after I implement my code. Here is my code, can anybody give me a hand? Thank you so much!!!
public class MyCube {
private FloatBuffer vertexBuffer;
private ShortBuffer drawListBuffer;
private ShortBuffer[] ArrayDrawListBuffer;
private FloatBuffer colorBuffer;
private int mProgram;
//For Projection and Camera Transformations
private final String vertexShaderCode =
// This matrix member variable provides a hook to manipulate
// the coordinates of the objects that use this vertex shader
"uniform mat4 uMVPMatrix;" +
"attribute vec4 vPosition;" +
"void main() {" +
// the matrix must be included as a modifier of gl_Position
// Note that the uMVPMatrix factor *must be first* in order
// for the matrix multiplication product to be correct.
" gl_Position = uMVPMatrix * vPosition;" +
"}";
// Use to access and set the view transformation
private int mMVPMatrixHandle;
private final String fragmentShaderCode =
"precision mediump float;" +
"uniform vec4 vColor;" +
"void main() {" +
" gl_FragColor = vColor;" +
"}";
// number of coordinates per vertex in this array
static final int COORDS_PER_VERTEX = 3;
float cubeCoords[] = {
-0.5f, 0.5f, 0.5f, // front top left 0
-0.5f, -0.5f, 0.5f, // front bottom left 1
0.5f, -0.5f, 0.5f, // front bottom right 2
0.5f, 0.5f, 0.5f, // front top right 3
-0.5f, 0.5f, -0.5f, // back top left 4
0.5f, 0.5f, -0.5f, // back top right 5
-0.5f, -0.5f, -0.5f, // back bottom left 6
0.5f, -0.5f, -0.5f, // back bottom right 7
};
// Set color with red, green, blue and alpha (opacity) values
float color[] = { 0.63671875f, 0.76953125f, 0.22265625f, 1.0f };
float red[] = { 1.0f, 0.0f, 0.0f, 1.0f };
float blue[] = { 0.0f, 0.0f, 1.0f, 1.0f };
private short drawOrder[] = {
0, 1, 2, 0, 2, 3,//front
0, 4, 5, 0, 5, 3, //Top
0, 1, 6, 0, 6, 4, //left
3, 2, 7, 3, 7 ,5, //right
1, 2, 7, 1, 7, 6, //bottom
4, 6, 7, 4, 7, 5};//back (order to draw vertices)
final float[] cubeColor =
{
// Front face (red)
1.0f, 0.0f, 0.0f, 1.0f,
1.0f, 0.0f, 0.0f, 1.0f,
1.0f, 0.0f, 0.0f, 1.0f,
1.0f, 0.0f, 0.0f, 1.0f,
1.0f, 0.0f, 0.0f, 1.0f,
1.0f, 0.0f, 0.0f, 1.0f,
// Top face (green)
0.0f, 1.0f, 0.0f, 1.0f,
0.0f, 1.0f, 0.0f, 1.0f,
0.0f, 1.0f, 0.0f, 1.0f,
0.0f, 1.0f, 0.0f, 1.0f,
0.0f, 1.0f, 0.0f, 1.0f,
0.0f, 1.0f, 0.0f, 1.0f,
// Left face (blue)
0.0f, 0.0f, 1.0f, 1.0f,
0.0f, 0.0f, 1.0f, 1.0f,
0.0f, 0.0f, 1.0f, 1.0f,
0.0f, 0.0f, 1.0f, 1.0f,
0.0f, 0.0f, 1.0f, 1.0f,
0.0f, 0.0f, 1.0f, 1.0f,
// Right face (yellow)
1.0f, 1.0f, 0.0f, 1.0f,
1.0f, 1.0f, 0.0f, 1.0f,
1.0f, 1.0f, 0.0f, 1.0f,
1.0f, 1.0f, 0.0f, 1.0f,
1.0f, 1.0f, 0.0f, 1.0f,
1.0f, 1.0f, 0.0f, 1.0f,
// Bottom face (cyan)
0.0f, 1.0f, 1.0f, 1.0f,
0.0f, 1.0f, 1.0f, 1.0f,
0.0f, 1.0f, 1.0f, 1.0f,
0.0f, 1.0f, 1.0f, 1.0f,
0.0f, 1.0f, 1.0f, 1.0f,
0.0f, 1.0f, 1.0f, 1.0f,
// Back face (magenta)
1.0f, 0.0f, 1.0f, 1.0f,
1.0f, 0.0f, 1.0f, 1.0f,
1.0f, 0.0f, 1.0f, 1.0f,
1.0f, 0.0f, 1.0f, 1.0f,
1.0f, 0.0f, 1.0f, 1.0f,
1.0f, 0.0f, 1.0f, 1.0f
};
public MyCube() {
// initialize vertex byte buffer for shape coordinates
ByteBuffer bb = ByteBuffer.allocateDirect(
// (# of coordinate values * 4 bytes per float)
cubeCoords.length * 4);
bb.order(ByteOrder.nativeOrder());
vertexBuffer = bb.asFloatBuffer();
vertexBuffer.put(cubeCoords);
vertexBuffer.position(0);
// initialize byte buffer for the draw list
ByteBuffer dlb = ByteBuffer.allocateDirect(
// (# of coordinate values * 2 bytes per short)
drawOrder.length * 2);
dlb.order(ByteOrder.nativeOrder());
drawListBuffer = dlb.asShortBuffer();
drawListBuffer.put(drawOrder);
drawListBuffer.position(0);
// initialize byte buffer for the color list
ByteBuffer cb = ByteBuffer.allocateDirect(
// (# of color values * 4 bytes per float)
cubeColor.length * 4);
cb.order(ByteOrder.nativeOrder());
colorBuffer = cb.asFloatBuffer();
colorBuffer.put(cubeColor);
colorBuffer.position(0);
int vertexShader = MyRenderer.loadShader(GLES20.GL_VERTEX_SHADER,
vertexShaderCode);
int fragmentShader = MyRenderer.loadShader(GLES20.GL_FRAGMENT_SHADER,
fragmentShaderCode);
// create empty OpenGL ES Program
mProgram = GLES20.glCreateProgram();
// add the vertex shader to program
GLES20.glAttachShader(mProgram, vertexShader);
// add the fragment shader to program
GLES20.glAttachShader(mProgram, fragmentShader);
// creates OpenGL ES program executables
GLES20.glLinkProgram(mProgram);
}
private int mPositionHandle;
private int mColorHandle;
private final int vertexCount = cubeCoords.length / COORDS_PER_VERTEX;
private final int vertexStride = COORDS_PER_VERTEX * 4; // 4 bytes per vertex
public void draw(float[] mvpMatrix) { // pass in the calculated transformation matrix
// Add program to OpenGL ES environment
GLES20.glUseProgram(mProgram);
// get handle to vertex shader's vPosition member
mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");
// get handle to fragment shader's vColor member
mColorHandle = GLES20.glGetUniformLocation(mProgram, "vColor");
// Enable a handle to the cube vertices
GLES20.glEnableVertexAttribArray(mPositionHandle);
// Prepare the cube coordinate data
GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX,
GLES20.GL_FLOAT, false,
vertexStride, vertexBuffer);
// Set color for drawing the triangle
//mColorHandle = GLES20.glGetUniformLocation(mProgram, "vColor");
// Enable a handle to the cube colors
GLES20.glEnableVertexAttribArray(mColorHandle);
// Prepare the cube color data
GLES20.glVertexAttribPointer(mColorHandle, 4, GLES20.GL_FLOAT, false, 16, colorBuffer);
// Set the color for each of the faces
//GLES20.glUniform4fv(mColorHandle, 1, blue, 0);
//***When I add this line of code above, it can show a cube totally in blue.***
// get handle to shape's transformation matrix
mMVPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMVPMatrix");
// Pass the projection and view transformation to the shader
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
// Draw the cube
GLES20.glDrawElements(GLES20.GL_TRIANGLES, drawOrder.length, GLES20.GL_UNSIGNED_SHORT, drawListBuffer);
// Disable vertex array
GLES20.glDisableVertexAttribArray(mPositionHandle);
GLES20.glDisableVertexAttribArray(mColorHandle);
GLES20.glDisableVertexAttribArray(mMVPMatrixHandle);
}
}
Remove the uniform vColor declaration from the fragment shader. Define a new per-vertex attribute input variable in the vertex shader, write its value to a varying variable output by the vertex shader, and read that varying back in as an input in the fragment shader.
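In shader terms, that change looks roughly like this (aColor and vFragColor are illustrative names, not from the question; on the Java side you would then fetch the location with glGetAttribLocation(mProgram, "aColor") instead of glGetUniformLocation):

```glsl
// Vertex shader: take the color in as a per-vertex attribute and pass it on.
uniform mat4 uMVPMatrix;
attribute vec4 vPosition;
attribute vec4 aColor;      // new per-vertex color attribute
varying vec4 vFragColor;    // handed to the fragment shader, interpolated

void main() {
    vFragColor = aColor;
    gl_Position = uMVPMatrix * vPosition;
}
```

```glsl
// Fragment shader: read the interpolated varying instead of a uniform.
precision mediump float;
varying vec4 vFragColor;

void main() {
    gl_FragColor = vFragColor;
}
```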

C++ std::vector doesn't return raw data from subclass

Hey guys, I have a problem with std::vector and I need your help.
I am currently programming a rendering engine with the new Vulkan API, and I want to support different vertex layouts for different meshes.
The problem is that std::vector::data() doesn't return the raw data I need. Here are my vertex structs:
struct Vertex
{
};
struct VertexColor : public Vertex
{
public:
VertexColor(Vec3f pos, Vec3f col) : position(pos), color(col) {}
Vec3f position;
Vec3f color;
};
This is what I actually have, and it works:
std::vector<VertexColor> cubeVertexBuffer = {
VertexColor{ Vec3f( -1.0f, -1.0f, 1.0f ), Vec3f( 0.0f, 0.0f, 0.0f ) },
VertexColor{ Vec3f( 1.0f, -1.0f, 1.0f ), Vec3f( 1.0f, 0.0f, 0.0f ) },
VertexColor{ Vec3f( 1.0f, 1.0f, 1.0f ), Vec3f( 1.0f, 1.0f, 0.0f ) },
VertexColor{ Vec3f( -1.0f, 1.0f, 1.0f ), Vec3f( 0.0f, 1.0f, 0.0f ) },
VertexColor{ Vec3f( -1.0f, -1.0f, -1.0f), Vec3f( 0.0f, 0.0f, 1.0f ) },
VertexColor{ Vec3f( 1.0f, -1.0f, -1.0f ), Vec3f( 0.0f, 1.0f, 0.0f ) },
VertexColor{ Vec3f( 1.0f, 1.0f, -1.0f ), Vec3f( 1.0f, 1.0f, 0.0f ) },
VertexColor{ Vec3f( -1.0f, 1.0f, -1.0f ), Vec3f( 1.0f, 0.0f, 0.0f ) }
};
uint32_t size = 8 * sizeof(VertexColor);
Vertex* vertices = (Vertex*)malloc(size);
memcpy(vertices, cubeVertexBuffer.data(), size);
std::vector<uint32_t> cubeIndexBuffer = { 1,2,0, 2,3,0,
0,3,4, 3,7,4,
5,1,4, 1,0,4,
2,6,3, 6,7,3,
5,6,1, 6,2,1,
6,5,7, 5,4,7 };
Cube::cubeMesh = new Mesh(vertices, size, cubeIndexBuffer);
What I want:
std::vector<Vertex> cubeVertexBuffer = {
VertexColor{ Vec3f( -1.0f, -1.0f, 1.0f ), Vec3f( 0.0f, 0.0f, 0.0f ) },
VertexColor{ Vec3f( 1.0f, -1.0f, 1.0f ), Vec3f( 1.0f, 0.0f, 0.0f ) },
VertexColor{ Vec3f( 1.0f, 1.0f, 1.0f ), Vec3f( 1.0f, 1.0f, 0.0f ) },
VertexColor{ Vec3f( -1.0f, 1.0f, 1.0f ), Vec3f( 0.0f, 1.0f, 0.0f ) },
VertexColor{ Vec3f( -1.0f, -1.0f, -1.0f), Vec3f( 0.0f, 0.0f, 1.0f ) },
VertexColor{ Vec3f( 1.0f, -1.0f, -1.0f ), Vec3f( 0.0f, 1.0f, 0.0f ) },
VertexColor{ Vec3f( 1.0f, 1.0f, -1.0f ), Vec3f( 1.0f, 1.0f, 0.0f ) },
VertexColor{ Vec3f( -1.0f, 1.0f, -1.0f ), Vec3f( 1.0f, 0.0f, 0.0f ) }
};
std::vector<uint32_t> cubeIndexBuffer = { 1,2,0, 2,3,0,
0,3,4, 3,7,4,
5,1,4, 1,0,4,
2,6,3, 6,7,3,
5,6,1, 6,2,1,
6,5,7, 5,4,7 };
Cube::cubeMesh = new Mesh(cubeVertexBuffer, cubeIndexBuffer);
As I mentioned, I need the whole vertex data in a raw format, contiguous in memory, for mapping to the GPU, but std::vector's .data() function doesn't return the "real data". I don't know how to use inheritance with std::vector to get the "raw data of the subclass" out of it.
Hope you can help me!
Thanks
EDIT: I checked the memory, and in the std::vector that I put my VertexColor(..) data into, no data is set in memory at all. Is it because the Vertex struct does not have any members?
What you are experiencing is called slicing. When you declare a vector<Vertex>, it holds instances of Vertex, and each element is only large enough for the data members of Vertex. Since Vertex is a base class of VertexColor, it is possible to assign a VertexColor object to a Vertex, but this copies only the data members of Vertex. Vertex has no members at all, so what do you expect the content of a vector<Vertex> to be?
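The slicing is visible directly with sizeof. A hedged sketch (the types here are stand-ins mirroring the question, with Vec3f simplified to a plain aggregate):

```cpp
#include <cstddef>
#include <vector>

// Stand-in types mirroring the question's structs.
struct SliceVec3f { float x, y, z; };

struct SliceVertex {};  // empty base class, as in the question

struct SliceVertexColor : SliceVertex {
    SliceVertexColor(SliceVec3f pos, SliceVec3f col) : position(pos), color(col) {}
    SliceVec3f position;
    SliceVec3f color;
};

// Pushing a SliceVertexColor into a vector<SliceVertex> compiles, but each
// element only has room for SliceVertex (one byte for an empty class), so
// position and color are sliced away on the copy.
inline std::size_t slicedElementSize() {
    std::vector<SliceVertex> v;
    v.push_back(SliceVertexColor({0, 0, 0}, {1, 1, 1}));
    return sizeof(v[0]); // sizeof(SliceVertex), not sizeof(SliceVertexColor)
}
```

Every element of the vector is one meaningless byte; none of the 24 bytes of position/color data survive the copy, which matches what the EDIT observed in memory.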
Inheritance is the wrong design approach in this case. From what I see, Mesh should be a class template (template<typename V> class Mesh) so it can support different vertex types, restricted to standard-layout types (for example via static_assert(std::is_standard_layout<V>::value, ...) or an enable_if on is_standard_layout).
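A minimal sketch of that templated approach (names are illustrative, not the asker's real API, and a static_assert stands in for the enable_if mentioned above):

```cpp
#include <cstddef>
#include <cstdint>
#include <type_traits>
#include <utility>
#include <vector>

struct MeshVec3f { float x, y, z; };

// Each vertex layout is its own standard-layout struct; no base class.
struct ColorVertex {
    MeshVec3f position;
    MeshVec3f color;
};

template <typename V>
class Mesh {
    static_assert(std::is_standard_layout<V>::value,
                  "vertex type must be standard-layout to upload its raw bytes");
public:
    Mesh(std::vector<V> vertices, std::vector<std::uint32_t> indices)
        : vertices_(std::move(vertices)), indices_(std::move(indices)) {}

    // Contiguous raw bytes, suitable for memcpy into a mapped GPU buffer.
    const void* vertexData() const { return vertices_.data(); }
    std::size_t vertexBytes() const { return vertices_.size() * sizeof(V); }
    std::size_t indexCount() const { return indices_.size(); }

private:
    std::vector<V> vertices_;
    std::vector<std::uint32_t> indices_;
};
```

Because the vector's element type is the concrete vertex struct, .data() now points at exactly the interleaved bytes the GPU expects, with no slicing and no manual malloc/memcpy staging copy.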
Firstly, never inherit from std::vector; it is not designed to be a base class, and doing so is generally considered bad practice.
Also have a look at Vec3f: does it have any virtual methods or a virtual destructor? You might be carrying more data in your classes than just the members you defined. The structures you use should be plain old C++ objects without any polymorphic properties (i.e. plain C-style structures).
Also, check your member byte alignment. You can remove all padding from structures with:
#pragma pack(push)
#pragma pack(1)
struct ...
{
...
}
#pragma pack(pop)
Let me know if you get it going.
Additionally, vector<Vertex> and vector<VertexColor> are not interchangeable, since their elements have different sizes. In my experience the 3D APIs expect the data in a very specific format, and you can't supply it generically unless the API itself defines the format for you.
Slicing can also occur, as Jens suggests.

Use different GL_ELEMENT_ARRAY_BUFFER for different attributes in one shader?

In OpenGL (OpenGL ES 2.0), can I use more than one GL_ELEMENT_ARRAY_BUFFER for different GL_ARRAY_BUFFER buffers? I'm reading the "OpenGL ES 2.0 Programming Guide", Chapter 6, "Vertex attributes, arrays and buffer objects"; in its source example there are several GL_ARRAY_BUFFERs (for position, normal, texture coords) and one GL_ELEMENT_ARRAY_BUFFER ("used to store element indices").
While writing this question, I realized that I can't pass more than one index array to glDrawElements, so when buffers are used, presumably only the last bound GL_ELEMENT_ARRAY_BUFFER is used for drawing. But then what about saving memory (which is the point of glDrawElements)? I will illustrate the problem I faced.
There are 2 arrays (as GL_ARRAY_BUFFERs): 8 vertex positions and 6 normals.
GLfloat gVertexPositions[] =
{
0.5f, 0.5f, 0.5f,
0.5f, -0.5f, 0.5f,
-0.5f, -0.5f, 0.5f,
-0.5f, 0.5f, 0.5f,
0.5f, 0.5f, -0.5f,
0.5f, -0.5f, -0.5f,
-0.5f, -0.5f, -0.5f,
-0.5f, 0.5f, -0.5f
};
GLfloat gVertexNormals[] =
{
1.0f, 0.0f, 0.0f, // top
-1.0f, 0.0f, 0.0f, // bottom
0.0f, 1.0f, 0.0f, // right
0.0f, -1.0f, 0.0f, // left
0.0f, 0.0f, 1.0f, // back
0.0f, 0.0f, -1.0f // front
};
2 arrays of indices (as GL_ELEMENT_ARRAY_BUFFERs)
GLubyte gVertexPositionIndices[] =
{
0, 1, 2, // top
2, 3, 0,
0, 4, 1, // right
1, 4, 5,
5, 4, 7, // bottom
6, 5, 7,
2, 6, 7, // left
2, 7, 3,
1, 4, 2, // front
2, 4, 5,
0, 3, 4, // back
7, 4, 3
};
GLubyte gVertexNormalIndices[] =
{
0, 0, 0,
0, 0, 0,
2, 2, 2,
2, 2, 2,
1, 1, 1,
1, 1, 1,
3, 3, 3,
3, 3, 3,
5, 5, 5,
5, 5, 5,
4, 4, 4,
4, 4, 4
};
I set vertex attribute state
glBindAttribLocation(program, ATTRIB_POSITION, "a_position");
glBindAttribLocation(program, ATTRIB_NORMAL, "a_normal");
//.....
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vboIds[VBO_POSITION_INDICES]);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(GLubyte) * 36, gVertexPositionIndices, GL_STATIC_DRAW);
glBindBuffer(GL_ARRAY_BUFFER, vboIds[VBO_POSITION_DATA]);
glBufferData(GL_ARRAY_BUFFER, sizeof(float) * 3 * 8, gVertexPositions, GL_STATIC_DRAW);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vboIds[VBO_NORMAL_INDICES]);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(GLubyte) * 36, gVertexNormalIndices, GL_STATIC_DRAW);
glBindBuffer(GL_ARRAY_BUFFER, vboIds[VBO_NORMAL_DATA]);
glBufferData(GL_ARRAY_BUFFER, sizeof(float) * 3 * 6, gVertexNormals, GL_STATIC_DRAW);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vboIds[VBO_POSITION_INDICES]);
glBindBuffer(GL_ARRAY_BUFFER, vboIds[VBO_POSITION_DATA]);
glEnableVertexAttribArray(ATTRIB_POSITION);
glBindBuffer(GL_ARRAY_BUFFER, vboIds[VBO_NORMAL_DATA]);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vboIds[VBO_NORMAL_INDICES]);
glEnableVertexAttribArray(ATTRIB_NORMAL);
glVertexAttribPointer(ATTRIB_POSITION, 3, GL_FLOAT, GL_FALSE, 0, 0);
glVertexAttribPointer(ATTRIB_NORMAL, 3, GL_FLOAT, GL_FALSE, 0, 0);
Then draw
glDrawElements(GL_TRIANGLES, 36, GL_UNSIGNED_BYTE, 0);
And the screen is empty (because the last bound GL_ELEMENT_ARRAY_BUFFER, the normal indices, is used for the "a_position" attribute, and there every triplet has identical indices, producing degenerate triangles).
All I want is for the program to make 36 vertices, setting their positions from gVertexPositions using gVertexPositionIndices and their normals from gVertexNormals using gVertexNormalIndices. I doubt that this is possible, but I want to know for sure. And what would be the right way if it is impossible? Do I have to use 8*3 floats for positions, 36 bytes for indices, and 36*3 floats for normals? So I can save memory only for the position attribute?
I hope I'm not too late, but what I do is set the normals on a per-vertex basis within the vertex array, then use the appropriate strides and buffer offsets when declaring the vertex attributes. Here's my code declaring the vertices and normals:
GLfloat BlockVertexData[144] = {
//Right side...
//Vertices... //Normals...
1.0f, 2.0f, 0.5f, 1.0f, 0.0f, 0.0f, //B
1.0f, 2.0f, -0.5f, 1.0f, 0.0f, 0.0f, //F
1.0f, -2.0f, 0.5f, 1.0f, 0.0f, 0.0f, //D
1.0f, -2.0f, -0.5f, 1.0f, 0.0f, 0.0f, //H
//Front side...
-1.0f, 2.0f, 0.5f, 0.0f, 0.0f, 1.0f, //A
1.0f, 2.0f, 0.5f, 0.0f, 0.0f, 1.0f, //B
-1.0f, -2.0f, 0.5f, 0.0f, 0.0f, 1.0f, //C
1.0f, -2.0f, 0.5f, 0.0f, 0.0f, 1.0f, //D
//Left side...
-1.0f, 2.0f, 0.5f, -1.0f, 0.0f, 0.0f, //A
-1.0f, 2.0f, -0.5f, -1.0f, 0.0f, 0.0f, //E
-1.0f, -2.0f, 0.5f, -1.0f, 0.0f, 0.0f, //C
-1.0f, -2.0f, -0.5f, -1.0f, 0.0f, 0.0f, //G
//Back side...
-1.0f, 2.0f, -0.5f, 0.0f, 0.0f, -1.0f, //E
1.0f, 2.0f, -0.5f, 0.0f, 0.0f, -1.0f, //F
-1.0f, -2.0f, -0.5f, 0.0f, 0.0f, -1.0f, //G
1.0f, -2.0f, -0.5f, 0.0f, 0.0f, -1.0f, //H
//Top side...
-1.0f, 2.0f, -0.5f, 0.0f, 1.0f, 0.0f, //E
1.0f, 2.0f, -0.5f, 0.0f, 1.0f, 0.0f, //F
-1.0f, 2.0f, 0.5f, 0.0f, 1.0f, 0.0f, //A
1.0f, 2.0f, 0.5f, 0.0f, 1.0f, 0.0f, //B
//Bottom side...
-1.0f, -2.0f, -0.5f, 0.0f, -1.0f, 0.0f, //G
1.0f, -2.0f, -0.5f, 0.0f, -1.0f, 0.0f, //H
-1.0f, -2.0f, 0.5f, 0.0f, -1.0f, 0.0f, //C
1.0f, -2.0f, 0.5f, 0.0f, -1.0f, 0.0f //D
};
GLuint BlockIndicesData[36] = {
//Right side...
2, 0, 3, 0, 1, 3,
//Front side...
6, 4, 7, 4, 5, 7,
//Left side...
11, 10, 8, 8, 9, 11,
//Back side...
15, 14, 12, 12, 13, 15,
//Top side...
19, 18, 16, 16, 17, 19,
//Bottom side...
23, 22, 20, 20, 21, 23 };
And here's the code declaring the attributes:
// The stride shows that there are 6 floats in each row.
GLsizei stride = 6 * sizeof(GLfloat);
GLuint attribute;
attribute = glGetAttribLocation(program, "VertexPosition");
glEnableVertexAttribArray(attribute);
glVertexAttribPointer(attribute, 3, GL_FLOAT, GL_FALSE, stride, 0);
attribute = glGetAttribLocation(program, "VertexNormal");
glEnableVertexAttribArray(attribute);
// The sixth parameter is the buffer offset: 3 position floats precede the normal in each row.
glVertexAttribPointer(attribute, 3, GL_FLOAT, GL_FALSE, stride, (GLvoid *)(sizeof(GLfloat) * 3));
I know that this may occupy more memory, but maybe someone can come up with a better solution. This is what I can think of to solve your problem.
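If you'd rather not hand-write the interleaved array, the de-indexing the question asks about can also be done once on the CPU at load time: walk the two index lists in lockstep and emit one interleaved vertex per index pair. A hedged sketch (the function name is illustrative):

```cpp
#include <vector>

// Expand separately indexed positions and normals into one de-indexed,
// interleaved array (x, y, z, nx, ny, nz per vertex) that a single
// glDrawArrays call, or one shared index buffer, can consume.
std::vector<float> interleaveDeindexed(const float* positions,
                                       const float* normals,
                                       const unsigned char* posIdx,
                                       const unsigned char* nrmIdx,
                                       int indexCount) {
    std::vector<float> out;
    out.reserve(indexCount * 6);
    for (int i = 0; i < indexCount; ++i) {
        const float* p = positions + 3 * posIdx[i];   // look up position
        const float* n = normals + 3 * nrmIdx[i];     // look up normal
        out.insert(out.end(), {p[0], p[1], p[2], n[0], n[1], n[2]});
    }
    return out;
}
```

Fed gVertexPositions/gVertexNormals and the two 36-entry index arrays, this yields 36 * 6 floats; the source arrays stay deduplicated on disk, and the expansion cost is paid once rather than per frame.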

glFrustum() in OpenGL ES: why do shapes appear the same size?

I am trying to understand perspective view in OpenGL.
What I am trying to do is render two identical triangles, but at different z coordinates, so I assume they should appear at different sizes. Here is my code:
CUSTOMVERTEX Vertices[] =
{
{ 0.5f, 1.0f, 0.5f, 1.0f, 0.0f, 0.0f, 1.0f }, // x, y, z, color
{ 0.0f, 0.0f, 0.5f, 0.0f, 1.0f, 0.0f, 1.0f },
{ 1.0f, 0.0f, 0.5f, 0.0f, 0.0f, 1.0f, 1.0f },
};
and for drawing
glDrawArrays(GL_TRIANGLES,0, 3);
glTranslatef(0.0f,-1.0f,-1.5f);
glDrawArrays(GL_TRIANGLES,0, 3);
and here is how I initialize some attributes:
glShadeModel(GL_SMOOTH);
glClearDepthf(1.0f);
glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LEQUAL);
glEnable(GL_CULL_FACE);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glFrustumf(-1.0f, 1.0f, -1.0f, 1.0f, 0.0f, 100.0f);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
but the triangles appear the same size, just at different locations since I translated in Y.
Can someone please explain this to me?
You cannot use 0.0 for the perspective projection's near-Z value. It must be a positive number greater than zero, preferably on the order of 1.0 or so. (With a non-positive near value, glFrustum generates GL_INVALID_VALUE and leaves the current matrix unchanged, so you end up rendering with an identity projection and no perspective divide, which is exactly why both triangles come out the same size.)
As Nicol said, you should use numbers greater than 0 for frustum construction. I strongly suggest you read this article to understand why that is so.
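With a valid near plane, the expected shrinking falls straight out of the perspective divide. A small illustrative sketch (not a GL call) of the math glFrustum sets up, assuming the question's symmetric left/right of -1..1:

```cpp
// For this symmetric frustum, clip-space x is near * x and w is -z, so
// the visible coordinate after the perspective divide is near * x / -z:
// the same geometry shrinks as it moves away from the camera. With
// near == 0 that term would collapse, and glFrustum rejects it anyway.
double projectedX(double nearPlane, double x, double z) {
    return nearPlane * x / -z; // assumes z < 0, i.e. in front of the camera
}
```

With near = 1, a vertex at x = 0.5 projects to about 0.33 at z = -1.5 but only about 0.17 at z = -3.0, which is the size difference the question was expecting to see.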
