How to set up vertex attributes in OpenGL (VBO)?

I am trying to create a VBO for a simple rectangle. The GL is set up to use the core profile (GL: 3.2, GLSL: 1.5, inside an NSView in Cocoa).
I spent hours trying to figure out how to draw a simple rectangle in OpenGL. It seems really hard to find any decent tutorials on the core profile. The best I could find was this tutorial.
I adapted it to my needs and came up with the following code:
GLfloat vertices[] = {  1.0, -1.0,  1.0,
                        1.0,  1.0,  1.0,
                       -1.0,  1.0, -1.0,
                       -1.0, -1.0, -1.0 };
glGenBuffers(1, &vertexVBO);
glBindBuffer(GL_ARRAY_BUFFER, vertexVBO);
glBufferData(GL_ARRAY_BUFFER, sizeof(GLfloat)*3*4, vertices, GL_STATIC_DRAW);
glVertexAttribPointer(VERTEX_POS, 3, GL_FLOAT, GL_FALSE, 0, 0); // VERTEX_POS = 0
glEnableVertexAttribArray(VERTEX_POS); // fails
However, the call to glEnableVertexAttribArray throws GL_INVALID_OPERATION. The documentation suggests that this error is produced when the call is made between glBegin and glEnd, but that is not the case here; as far as I know, glBegin and glEnd are not even supported in the core profile.
Thus, I am at a loss. How can I draw this stupid rectangle (or at least initialize it)?

In the core profile, vertex attribute state is stored in a vertex array object (VAO), so you need to generate and bind a VAO before setting attribute pointers.
GLuint vao_name;
glGenVertexArrays(1, &vao_name);
glBindVertexArray(vao_name);
// ...
glVertexAttribPointer(...);
glEnableVertexAttribArray(...);
Also, the documentation you're linking to is outdated; use this one instead.
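Putting it together with the code from the question, a minimal initialization might look like this (a sketch; VERTEX_POS, vertexVBO, and vertices are taken from the question, the VAO name is new):
GLuint vao;
glGenVertexArrays(1, &vao);  // in core profile, attribute state lives in a VAO
glBindVertexArray(vao);      // must be bound before any attribute calls

glGenBuffers(1, &vertexVBO);
glBindBuffer(GL_ARRAY_BUFFER, vertexVBO);
glBufferData(GL_ARRAY_BUFFER, sizeof(GLfloat) * 3 * 4, vertices, GL_STATIC_DRAW);

glVertexAttribPointer(VERTEX_POS, 3, GL_FLOAT, GL_FALSE, 0, 0);
glEnableVertexAttribArray(VERTEX_POS); // no longer raises GL_INVALID_OPERATION

// later, to draw the rectangle:
glBindVertexArray(vao);
glDrawArrays(GL_TRIANGLE_FAN, 0, 4);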

Related

Ruby, OpenGL: change texture luminosity

I have some problems with OpenGL and luminosity. Let me explain my problem:
I drew this "sprite" (it's only a plane here) with code like this:
sprite.set_active
left, right, top, bottom = 0.0, 1.0, 1.0, 0.0
glPushMatrix
glTranslate(@position.x - 16, @position.y, @position.z)
glRotate(-90 - @window.camera.horizontal_angle, 0, 1, 0)
glScale(chara.width, chara.height, 32.0)
begin
  glEnable(GL_BLEND)
  glBegin(GL_QUADS)
  glColor4f(1.0, 1.0, 1.0, 1.0)
  glTexCoord2d(left, top);     glVertex3f(0, 1, 0.5)
  glTexCoord2d(right, top);    glVertex3f(1, 1, 0.5)
  glTexCoord2d(right, bottom); glVertex3f(1, 0, 0.5)
  glTexCoord2d(left, bottom);  glVertex3f(0, 0, 0.5)
  glEnd
  glDisable(GL_BLEND)
rescue
end
glPopMatrix
My problem is with this line:
glColor4f(1.0, 1.0, 1.0, 1.0)
Well, I can pass a number less than 1.0 to get a darker sprite, but I can't do the opposite. How can I do that? How can I make the sprite totally white, for example?
To get full control over your fragment processing, the best approach is using the programmable pipeline, where you can implement exactly what you want with GLSL code.
But there are some options that could work for this case in the fixed pipeline. The simplest one is using a different GL_TEXTURE_ENV_MODE. The default value is GL_MODULATE, which means that the color you specified with glColor4f() is multiplied with the color from the texture. As you found, that allows you to make the texture darker, but not brighter.
You could try using GL_ADD instead. As the name suggests, this will produce the final output as the sum of the texture color and the color from glColor4f(). For example:
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_ADD);
glColor4f(0.2f, 0.2f, 0.2f, 0.0f);
would add 0.2 to the color components read from the texture.
There is more complex functionality in the fixed pipeline that gives you more control over how texture values are used to generate colors. You can find it by looking for "texture combiners". But in my personal opinion, you're much better off moving to the programmable pipeline if you need something complex enough to require texture combiners.
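For completeness, a brightening setup with combiners might look something like this (a sketch in C syntax; the same constants carry over to the ruby-opengl bindings). GL_RGB_SCALE multiplies the combiner result, so with a scale of 2 a glColor4f() of 1.0 means "twice as bright" and 0.5 means "unchanged":
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_MODULATE);   // result = arg0 * arg1
glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_RGB, GL_TEXTURE);       // arg0 = texture color
glTexEnvi(GL_TEXTURE_ENV, GL_SRC1_RGB, GL_PRIMARY_COLOR); // arg1 = glColor4f() color
glTexEnvf(GL_TEXTURE_ENV, GL_RGB_SCALE, 2.0f);            // scale result (1, 2, or 4 allowed)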

Alpha blending in OpenGL ES not working

I'm trying to vary the transparency of a texture drawn onto a quad. The code below works fine, except that the alpha set with glColor4f has no effect. What are the possible reasons for this? Is it likely to be a GL setting somewhere else in the program?
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_COLOR, GL_ONE_MINUS_SRC_ALPHA);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, textureId);
glColor4f(1.0f, 1.0f, 1.0f, 0.3f);
glVertexAttribPointer(vertexHandle, 3, GL_FLOAT, GL_FALSE, 0, quadVertices);
glVertexAttribPointer(normalHandle, 3, GL_FLOAT, GL_FALSE, 0, quadNormals);
glVertexAttribPointer(textureCoordHandle, 2, GL_FLOAT, GL_FALSE, 0, quadTexCoords);
glEnableVertexAttribArray(vertexHandle);
glEnableVertexAttribArray(normalHandle);
glEnableVertexAttribArray(textureCoordHandle);
glUniformMatrix4fv(mvpMatrixHandle, 1, GL_FALSE, (GLfloat*)&modelViewProjectionButton.data[0] );
glDrawElements(GL_TRIANGLES, NUM_QUAD_INDEX, GL_UNSIGNED_SHORT, quadIndices);
glDisableVertexAttribArray(vertexHandle);
glDisableVertexAttribArray(normalHandle);
glDisableVertexAttribArray(textureCoordHandle);
Edit:
I managed to do it as per the answer below. If anyone's interested, I put a uniform variable called alpha in my shader, like this:
precision mediump float;   // GLSL ES requires a default float precision in fragment shaders

uniform sampler2D texSampler2D;
uniform float alpha;
varying vec2 texCoord;

void main()
{
    gl_FragColor = texture2D(texSampler2D, texCoord);
    gl_FragColor = gl_FragColor * alpha;
}
and then when I'm drawing the scene I use it like this (for example, to set an alpha of 0.5):
GLint alphaLocation = glGetUniformLocation(shaderProgramID, "alpha");
glUniform1f(alphaLocation, 0.5);
You are obviously using OpenGL ES 2.0 (because you are using glVertexAttribPointer and glUniformMatrix4fv), which actually makes this a bit puzzling. OpenGL ES 2.0 does not define glColor<N>(); that was part of the fixed-function API and should not be defined in a compliant OpenGL ES 2.0 implementation.
Even if it is defined, there is no mechanism in GLSL ES to get the "current color" in a shader. Desktop GL has the pre-declared variable gl_Color in compatibility GLSL profiles, but GLSL ES does not.
You will need to use a GLSL uniform if you want to define the color this way instead of using a per-vertex attribute.
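For a constant color like the glColor4f() call in the question, a uniform is the natural fit. A minimal sketch (u_color is a made-up name; the fragment shader would declare uniform lowp vec4 u_color; and multiply it into gl_FragColor):
glUseProgram(shaderProgramID);
GLint colorLoc = glGetUniformLocation(shaderProgramID, "u_color");
glUniform4f(colorLoc, 1.0f, 1.0f, 1.0f, 0.3f); // replaces glColor4f(1.0f, 1.0f, 1.0f, 0.3f)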

My triangle doesn't render when I use OpenGL Core Profile 3.2

I have a Cocoa (OS X) project that is currently very simple; I'm just trying to grasp the general concepts behind using OpenGL. I was able to get a triangle to display in my view, but when I went to write my vertex and fragment shaders, I realized I was running the legacy OpenGL profile. So I switched to the OpenGL 3.2 core profile by setting the properties in the pixel format of the view in question before generating the context, but now the triangle doesn't render, even without my vertex or fragment shaders.
I have a controller class for the view that's instantiated in the nib. On -awakeFromNib it sets up the pixel format and the context:
NSOpenGLPixelFormatAttribute attr[] =
{
    NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersion3_2Core,
    0
};
NSOpenGLPixelFormat *glPixForm = [[NSOpenGLPixelFormat alloc] initWithAttributes:attr];
[self.mainView setPixelFormat:glPixForm];
self.glContext = [self.mainView openGLContext];
Then I generate the VAO:
glGenVertexArrays(1, &vertexArrayID);
glBindVertexArray(vertexArrayID);
Then the VBO:
glGenBuffers(1, &buffer);
glBindBuffer(GL_ARRAY_BUFFER, buffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(g_vertex_buffer_data), g_vertex_buffer_data, GL_STATIC_DRAW);
g_vertex_buffer_data, the actual data for that buffer, is defined as follows:
static const GLfloat g_vertex_buffer_data[] = {
-1.0f, -1.0f, 0.0f,
1.0f, -1.0f, 0.0f,
0.0f, 1.0f, 0.0f,
};
Here's the code for actually drawing:
[_glContext setView:self.mainView];
[_glContext makeCurrentContext];
glViewport(0, 0, [self.mainView frame].size.width, [self.mainView frame].size.height);
glEnableVertexAttribArray(0);
glBindBuffer(GL_ARRAY_BUFFER, self.vertexBuffer);
glVertexAttribPointer(
    0,         // attribute 0, matching glBindAttribLocation(programID, 0, "position")
    3,         // three floats per vertex
    GL_FLOAT,
    GL_FALSE,  // not normalized
    0,         // tightly packed, no stride
    (void*)0   // start at offset 0 in the bound VBO
);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glDrawArrays(GL_TRIANGLES, 0, 3); // Starting from vertex 0; 3 vertices total -> 1 triangle
glDisableVertexAttribArray(0);
glFlush();
This code draws the triangle fine if I comment out the NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersion3_2Core, in the NSOpenGLPixelFormatAttribute array, but as soon as I enable OpenGL Core Profile 3.2, it just displays black. Can anyone tell me what I'm doing wrong here?
EDIT: This issue still happens whether I turn my vertex and fragment shaders on or not, but here are my shaders in case it is helpful:
Vertex shader:
#version 150
in vec3 position;
void main() {
    gl_Position = vec4(position, 1.0); // set w explicitly; writing only .xyz leaves it undefined
}
Fragment shader:
#version 150
out vec3 color;
void main() {
    color = vec3(1, 0, 0);
}
And right before linking the program, I make this call to bind the attribute location:
glBindAttribLocation(programID, 0, "position");
EDIT 2:
I don't know if this helps at all, but I just stepped through my program, running glGetError() and it looks like everything is fine until I actually call glDrawArrays(), then it returns GL_INVALID_OPERATION. I'm trying to figure out why this could be occurring, but still having no luck.
I figured this out, and it's sadly a very stupid mistake on my part.
I think the issue is that you need both a vertex shader and a fragment shader when using the 3.2 core profile; you can't just render without them. The reason it wasn't working with my shaders was...wait for it...after linking my shader program, I forgot to store the programID in the ivar in my class, so later when I called glUseProgram() I was just calling it with a zero parameter.
I guess one of the main sources of confusion was that I expected the 3.2 core profile to work without any vertex or fragment shaders.
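For anyone hitting the same thing, the fix boils down to something like this (a sketch; _programID, vertexShaderID, and fragmentShaderID are made-up names):
GLuint programID = glCreateProgram();
glAttachShader(programID, vertexShaderID);
glAttachShader(programID, fragmentShaderID);
glBindAttribLocation(programID, 0, "position"); // must happen before linking
glLinkProgram(programID);
_programID = programID; // the step I forgot: store the ID for later

// later, before glDrawArrays():
glUseProgram(_programID); // previously this was effectively glUseProgram(0)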

Help understanding gluLookAt()

Imagine you're standing on the ground looking up at a cube in the sky. As you tilt your head, the cube moves. I'm trying to replicate this using OpenGL ES on the iPhone by manipulating the tilt of the camera while looking at a simple 3D cube drawn around the origin. I'm using the gluLookAt() function from Cocos2d, which is supposed to emulate the OpenGL version, and it seems that when I try to tinker with any of the values, my cube disappears.
My question is: can you provide a gluLookAt() usage here that will get me started manipulating the camera so I can figure out how this works? I'm really just interested in learning how to tilt the camera along the y-axis.
Here is my current code:
Viewport Configuration
glBindFramebufferOES(GL_FRAMEBUFFER_OES, _viewFramebuffer);
glViewport(0, 0, _backingWidth, _backingHeight);
Projection Matrix
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
// Maybe this should be a perspective projection?? If so,
// can you provide an example using gluPerspective()?
glOrthof(-_backingWidth, _backingWidth,-_backingHeight, _backingHeight, -1, 1);
ModelView Matrix
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
gluLookAt() // What goes here?
Drawing Code
static const GLfloat cubeVertices[] = {
-1.0, -1.0, 1.0,
1.0, -1.0, 1.0,
-1.0, 1.0, 1.0,
1.0, 1.0, 1.0,
-1.0, -1.0, -1.0,
1.0, -1.0, -1.0,
-1.0, 1.0, -1.0,
1.0, 1.0, -1.0,
};
static const GLushort cubeIndices[] = {
0, 1, 2, 3, 7, 1, 5, 4, 7, 6, 2, 4, 0, 1
};
static const GLubyte cubeColors[] = {
255, 255, 0, 255,
0, 255, 255, 255,
0, 0, 0, 0,
255, 0, 255, 255,
255, 255, 0, 255,
0, 255, 255, 255,
0, 0, 0, 0,
255, 0, 255, 255
};
glVertexPointer(3, GL_FLOAT, 0, cubeVertices);
glEnableClientState(GL_VERTEX_ARRAY);
glColorPointer(4, GL_UNSIGNED_BYTE, 0, cubeColors);
glEnableClientState(GL_COLOR_ARRAY);
glDrawElements(GL_TRIANGLE_STRIP, 14, GL_UNSIGNED_SHORT, cubeIndices);
I'm not completely sure what exactly you want, but here are some explanations:
gluLookAt expects 3 vectors (each as 3 doubles): first the position of the camera (eye point), then the point you are looking at (center point), and finally an up-vector that specifies the up-direction (this need not be the perfectly orthogonal upward direction, as it is re-orthogonalized anyway).
So if you stand at (0,0,5) and look at your cube (that is at the center) and want the y-axis to be the up-direction, you would call gluLookAt(0.0, 0.0, 5.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0) to see your cube in full beauty.
If you want to tilt your head to the side, you just need to change the up-vector and rotate it to the side a bit. Or if you want to look up without tilting your head to the side, you still use the y-axis as the up-vector, but you change the center point to a point above and in front of you (perhaps rotated about the eye position). This won't work if you want to look straight up; in that case you need to change the up-vector to something orthogonal to the y-axis (in addition to setting the center point to a point straight above you, of course).
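For example, to tilt the head sideways while standing at (0,0,5) and looking at the cube at the origin, rotate the up-vector in the xy-plane (a sketch; tilt is a made-up variable in radians, sin/cos from <math.h>):
float tilt = 0.3f; // about 17 degrees of sideways head tilt
gluLookAt(0.0, 0.0, 5.0,                // eye point
          0.0, 0.0, 0.0,                // center point: the cube at the origin
          sin(tilt), cos(tilt), 0.0);   // up-vector; tilt = 0 gives (0, 1, 0)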
But I think you want a perspective projection. Your current ortho is at least quite inappropriate for your coordinates, as it specifies a coordinate system in which coordinates are measured in pixels, so your [-1,1]-cube is about the size of a pixel on the screen. Try gluPerspective(60.0, ((double)_backingWidth)/_backingHeight, 0.1, 100.0). If you really want an orthographic projection without any realistic perspective distortion, you can use glOrtho, but in that case you should keep the size proportions of the glOrtho parameters and your model's coordinates roughly in sync (and therefore not specify a screen-sized ortho while using coordinates in the [-1,1] range).
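Applied to your projection setup, that would look roughly like this (a sketch, assuming a gluPerspective() emulation is available alongside the gluLookAt() you are already using):
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
// 60 degree vertical field of view; near and far planes must both be positive
gluPerspective(60.0, (double)_backingWidth / (double)_backingHeight, 0.1, 100.0);
glMatrixMode(GL_MODELVIEW);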

OpenGL 2.0 ES coordinates

I'm taking my first steps with OpenGL ES 2.0, trying things out on my iPod touch, and I was wondering how to solve this coordinate issue.
To explain better, I was trying to draw a quad and rotate/translate it using a vertex shader (also because, from what I've read, that seems to be the only way to do it).
Since I'm working with an iPod I have a 1.5 : 1 ratio and a viewport set by
glViewport(0, 0, backingWidth, backingHeight);
So 0,0 is the center, and the bounds for clipping should be at -1.0 and 1.0 on each axis, etc. (right?)
To draw a square I had to use different values for x and y coordinates because of the aspect ratio:
static const GLfloat lineV[] = {
-0.5f, 0.33f, 0.5f, 0.33f,
0.5f, 0.33f, 0.5f,-0.33f,
0.5f,-0.33f, -0.5f,-0.33f,
-0.5f,-0.33f, -0.5f, 0.33f,
-0.5f, 0.33f, 0.5f,-0.33f,
0.5f, 0.33f, -0.5f,-0.33f,
};
It's a square with both diagonals (I know that using indices would be more efficient, but that's not the point).
Then I tried writing a vertex shader to rotate the object while moving it:
uniform float rotation;
attribute vec4 position;

void main()
{
    mat4 m = mat4( cos(rotation), sin(rotation), 0.0, 0.0,
                  -sin(rotation), cos(rotation), 0.0, 0.0,
                   0.0,           0.0,           1.0, 0.0,
                   0.0,           0.0,           0.0, 1.0);
    mat4 m2 = mat4(1.0);
    m2[1][3] = sin(rotation) * 0.8;
    gl_Position = position * (m * m2);
}
It works, but since the clip-space units are not the same size on the x and y axes, the quad is distorted while it rotates. How should I prevent that? I thought it might be possible to change the view frustum to have different bounds (not -1.0 to 1.0 on both axes), so that enlarging the range on the y-axis would fix the problem.
In addition, is there a better way to use matrices? I mean, I was used to glRotatef, without having to specify the whole matrix. Do convenience functions/constructors exist to accomplish this task?
The first two arguments to glViewport() are not the center; they are the coordinates of the bottom-left corner.
You should probably set up a projection that takes your aspect ratio into account, typically using gluPerspective() (if GLU is available in ES).
No GLUT or support functions are provided, from what I've seen. Basically, I solved it by using equal coordinates when building the vertices and using a vertex shader to scale on the y-axis by the right aspect ratio.
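A sketch of that approach (u_aspect and program are made-up names): build the square with equal x/y coordinates, then squash y in the vertex shader so equal clip-space extents come out square on screen.
// C side: upload the aspect ratio once (and again whenever the view resizes)
GLfloat aspect = (GLfloat)backingWidth / (GLfloat)backingHeight; // e.g. 320.0/480.0 in portrait
glUseProgram(program);
glUniform1f(glGetUniformLocation(program, "u_aspect"), aspect);

// vertex shader side (GLSL ES), after computing gl_Position:
//   uniform float u_aspect;
//   ...
//   gl_Position.y *= u_aspect; // equal input coordinates now render as a square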
