Texturing OpenGL/WebGL rectangle - opengl-es

I have drawn a rectangle, which spins. I have also set a texture on it, but it looks very ugly (very low resolution).
The vertex array for the rectangle is the following:
1.0, 1.0, 0.0,
-1.0, 1.0, 0.0,
1.0, -1.0, 0.0,
-1.0, -1.0, 0.0
These are the texture coordinates mapping onto the rectangle above:
0.0, 0.0,
1.0, 0.0,
1.0, 1.0,
0.0, 1.0,
The texture size is 512x512 (it is more than large enough, so the problem shouldn't come from the texture size itself).
The full source code is located here:
http://pastebin.com/qXJFNe1c
I clearly understand that this is my fault, but I can't figure out where exactly the fault is.
PS
I don't think this problem is strictly WebGL-specific; some pure OpenGL developers could probably give me a piece of advice as well.
If you want to test it live, you can check it here:
http://goo.gl/YpXyPl

When testing I get "[.WebGLRenderingContext]RENDER WARNING: texture bound to texture unit 0 is not renderable. It maybe non-power-of-2 and have incompatible texture filtering or is not 'texture complete'" (77.246.234.123:8893/plane/:1), but it does render a texture at least, so maybe that's about something else.
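If that warning keeps showing up, it is usually a sign the texture is not "texture complete" when the first frame is drawn, e.g. the image hasn't finished loading yet, or the default mipmapped minification filter is used without mipmaps ever being generated. A minimal sketch of a load callback that avoids both (the function name and arguments are illustrative, not taken from the pastebin):
function handleLoadedTexture(gl, texture, image) {
    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
    // 512x512 is a power of two, so mipmapping is allowed
    gl.generateMipmap(gl.TEXTURE_2D);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
    gl.bindTexture(gl.TEXTURE_2D, null);
}
Only start drawing (or rebind the texture) after this callback has run.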
Also, the viewport seems to be 300 x 150. The canvas rendering width/height does not match the width/height within the HTML page. Try this:
function updateCanvasSize(canvas)
{
    if (canvas.width != canvas.clientWidth || canvas.height != canvas.clientHeight)
    {
        canvas.width = canvas.clientWidth;
        canvas.height = canvas.clientHeight;
        gl.viewportWidth = canvas.width;
        gl.viewportHeight = canvas.height;
        // might want to update the projection matrix to fix the new aspect ratio too
    }
}
and in a draw or update function (I doubt this would impact performance)...
var canvas = document.getElementById("canvas-main"); //or store "canvas" globally
updateCanvasSize(canvas);
The incorrect canvas size occurs because at the time webGLStart is called, the canvas is actually 300x150 for whatever reason. Either 1. you actually want a fixed-size render target (give the canvas width/height in pixels and all is well), or 2. you want it to take up the whole window, in which case the user may resize the window and that needs to be handled (JS has a resize event you could use instead of polling, as sketched below).
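For option 2, the resize event could be hooked up roughly like this (a sketch reusing updateCanvasSize from above; drawScene is an assumed name for whatever render function the page uses):
window.addEventListener("resize", function () {
    var canvas = document.getElementById("canvas-main");
    updateCanvasSize(canvas);
    drawScene(); // or simply let the next animation frame pick up the new size
});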

Related

Strange behavior of alpha without blending in WebGL

I found strange behavior in WebGL when rendering with blending turned off. I reproduced it with this very simple tutorial.
Just change the lines:
gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
to
gl_FragColor = vec4(0.0, 0.0, 0.0, 0.5);
and
gl.clearColor(0.0, 0.0, 0.0, 1.0);
to
gl.clearColor(1.0, 1.0, 1.0, 1.0);
So, since blending is turned off, I expected to see black shapes on a white background (the pixel alpha of 0.5 shouldn't have any influence). But I see gray shapes on a white background. I believe I missed something, but I can't understand what. Any ideas?
P.S. gl.disable(gl.BLEND) doesn't change the result.
This is basically already answered here
Alpha rendering difference between OpenGL and WebGL
What you're seeing is that WebGL canvases are, by default, blended with the background. Either the background color of the canvas or whatever it's a child of. The default background color for HTML is white so if you draw with [0.0, 0.0, 0.0, 0.5] that's 50% alpha black blended with the white webpage.
See the link above for how to fix it.
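In short, one option is to request a context without an alpha channel when the canvas is created, so the browser never composites the canvas with the page behind it (a sketch of just the context-creation call; premultiplied alpha and always clearing with alpha 1.0 are other common approaches):
var gl = canvas.getContext("webgl", { alpha: false });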

Ruby, openGL : change texture luminosity

I have some problems with OpenGL and luminosity. Let me explain my problem:
I drew this "sprite" (it's only a plane here) with code like this:
sprite.set_active
left, right, top, bottom = 0.0, 1.0, 1.0, 0.0
glPushMatrix
glTranslate(@position.x - 16, @position.y, @position.z)
glRotate(-90 - @window.camera.horizontal_angle, 0, 1, 0)
glScale(chara.width, chara.height, 32.0)
begin
  glEnable(GL_BLEND)
  glBegin(GL_QUADS)
  glColor4f(1.0, 1.0, 1.0, 1.0)
  glTexCoord2d(left, top); glVertex3f(0, 1, 0.5)
  glTexCoord2d(right, top); glVertex3f(1, 1, 0.5)
  glTexCoord2d(right, bottom); glVertex3f(1, 0, 0.5)
  glTexCoord2d(left, bottom); glVertex3f(0, 0, 0.5)
  glEnd
  glDisable(GL_BLEND)
rescue
end
glPopMatrix
My problem is with this line:
glColor4f(1.0, 1.0, 1.0, 1.0)
I can use a value lower than 1.0 to get a darker sprite, but I can't do the opposite. How can I do that? How can I make the sprite completely white, for example?
To get full control over your fragment processing, the best approach is using the programmable pipeline, where you can implement exactly what you want with GLSL code.
But there are some options that could work for this case in the fixed pipeline. The simplest one is using a different GL_TEXTURE_ENV_MODE. The default value is GL_MODULATE, which means that the color you specified with glColor4f() is multiplied with the color from the texture. As you found, that allows you to make the texture darker, but not brighter.
You could try using GL_ADD instead. As the name suggests, this will produce the final output as the sum of the texture color and the color from glColor4f(). For example:
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_ADD);
glColor4f(0.2f, 0.2f, 0.2f, 0.0f);
would add 0.2 to the color components read from the texture.
There is more complex functionality in the fixed pipeline that gives you more control over how texture values are used to generate colors. You can find it by looking for "texture combiners". But in my personal opinion, you're much better off moving to the programmable pipeline if you need something complex enough to require texture combiners.
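For comparison, the programmable-pipeline route recommended above can be as small as a fragment shader like the sketch below (the sampler and brightness uniform names are illustrative, not taken from your code):
#version 120
uniform sampler2D tex;        // the sprite texture
uniform float brightness;     // 1.0 = unchanged, above 1.0 = brighter, below 1.0 = darker

void main()
{
    vec4 texel = texture2D(tex, gl_TexCoord[0].st);
    // scale the color channels and clamp them back to the displayable range
    gl_FragColor = vec4(clamp(texel.rgb * brightness, 0.0, 1.0), texel.a);
}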

OpenGL getting background pixels info

I have an OpenGL application with a fully transparent window, and I need to draw a picture into it with pixel transparency depending on the background. Is there any way of getting the background pixel data that is BELOW my transparent window (wallpaper, desktop, other windows, etc.) so I can dynamically change pixels in shaders?
For now I have code like this:
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_COLOR, GL_DST_COLOR);
glClearColor(1.0, 0, 0, 1.0);
glClear(GL_COLOR_BUFFER_BIT);
[self.shader useShader];
[self drawTriangle];
useShader just calls the glUseProgram procedure and drawTriangle just draws a test triangle.
The shader is:
#version 120
void main()
{
gl_FragColor = gl_SecondaryColor + vec4(0.0, 1.0, 0.0, 0.0);
}
So if I clear the window with (1.0, 0, 0, 1.0) I get the yellow triangle as expected, but when I switch to (0, 0, 0, 0) it turns green. Is there any way of getting the color data underneath?
Is there any way of getting the background pixel data that is BELOW my transparent window
With just OpenGL? No. In fact you can't even read back destination framebuffer pixels in a shader.
You'll have to use operating-system-specific functions to retrieve the screen contents below the window as an image, load it into a texture, and pass that to your rendering.

How to set up vertex attributes in OpenGL?

I am trying to create a VBO for a simple rectangle. The GL is set up to use the core profile (GL: 3.2, GLSL: 1.5, inside an NSView in Cocoa).
I spent hours trying to figure out how to draw a simple rectangle in OpenGL. It seems really hard to find any decent tutorials on the core profile. The best I could find was this tutorial.
I adapted it to my needs and came up with the following code:
GLfloat vertices[] = {  1.0, -1.0,  1.0,
                        1.0,  1.0,  1.0,
                       -1.0,  1.0, -1.0,
                       -1.0, -1.0, -1.0 };
glGenBuffers(1, &vertexVBO);
glBindBuffer(GL_ARRAY_BUFFER, vertexVBO);
glBufferData(GL_ARRAY_BUFFER, sizeof(GLfloat)*3*4, vertices, GL_STATIC_DRAW);
glVertexAttribPointer(VERTEX_POS, 3, GL_FLOAT, GL_FALSE, 0, 0); // VERTEX_POS = 0
glEnableVertexAttribArray(VERTEX_POS); // fails
However, this throws an INVALID_OPERATION error when calling glEnableVertexAttribArray. The documentation suggests that this error is produced if the call is made between glBegin and glEnd, but that is not the case here; as far as I know, glBegin and glEnd are not even supported in the core profile.
Thus, I am at a loss. How can I draw this stupid rectangle (or at least initialize it)?
You need to bind a vertex array object (VAO) before setting up attribute pointers. In the core profile there is no default VAO, and the attribute enable/pointer state lives in the currently bound VAO, which is why glEnableVertexAttribArray fails with INVALID_OPERATION when none is bound.
GLuint vao_name;
glGenVertexArrays(1, &vao_name);
glBindVertexArray(vao_name);
// ...
glVertexAttribPointer(...);
glEnableVertexAttribArray(...);
Also, the documentation you're linking to is outdated — use this one instead.

OpenGL 2.0 ES coordinates

I'm taking my first steps with OpenGL ES 2.0, trying things out on my iPod touch. I was wondering how to solve this coordinate issue.
To explain better: I was trying to draw a quad and rotate/translate it using a vertex shader (also because, from what I've read, that seems to be the only way to do it).
Since I'm working with an iPod I have a 1.5:1 aspect ratio and a viewport set by
glViewport(0, 0, backingWidth, backingHeight);
So (0, 0) is the center, and the clipping bounds should be at -1.0 to 1.0 on each axis, etc. (right?)
To draw a square I had to use different values for the x and y coordinates because of the aspect ratio:
static const GLfloat lineV[] = {
    -0.5f,  0.33f,   0.5f,  0.33f,
     0.5f,  0.33f,   0.5f, -0.33f,
     0.5f, -0.33f,  -0.5f, -0.33f,
    -0.5f, -0.33f,  -0.5f,  0.33f,
    -0.5f,  0.33f,   0.5f, -0.33f,
     0.5f,  0.33f,  -0.5f, -0.33f,
};
It's a square with both diagonals (I know that using indices would be more efficient, but that's not the point).
Then I tried writing a vertex shader to rotate the object while moving it:
void main()
{
    m = mat4( cos(rotation), sin(rotation), 0.0, 0.0,
             -sin(rotation), cos(rotation), 0.0, 0.0,
              0.0,           0.0,           1.0, 0.0,
              0.0,           0.0,           0.0, 1.0);
    m2 = mat4(1.0);
    m2[1][3] = sin(rotation)*0.8;
    gl_Position = position*(m*m2);
}
It works, but since the coordinates are not uniform the quad gets distorted while it rotates. How should I prevent that? I wondered whether it was possible to change the view frustum to have different bounds (not -1.0 to 1.0 on both axes), so that stretching on the y-axis would fix the problem.
In addition, is there a better way to use matrices? I mean, I was used to using glRotatef without having to specify the whole matrix. Do convenience functions/constructors exist to accomplish this task?
The first two arguments to glViewport() are not the center; they are the coordinates of the bottom-left corner.
You should probably set up a projection that takes your aspect ratio into account, typically using gluPerspective() (if GLU is available in ES).
No GLUT or support functions are provided, from what I've seen. Basically I solved it by using equal coordinates when building the vertices and using a vertex shader to scale on the y-axis by the right aspect ratio.
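For reference, that per-vertex correction might look roughly like the sketch below (modeled on the shader above; the aspect uniform, set from the host code to the viewport's width divided by its height, is an assumption):
attribute vec4 position;
uniform float rotation;
uniform float aspect;   // viewport width / height, about 0.67 in portrait on the iPod

void main()
{
    // rotate in square coordinates first...
    mat4 m = mat4( cos(rotation), sin(rotation), 0.0, 0.0,
                  -sin(rotation), cos(rotation), 0.0, 0.0,
                   0.0,           0.0,           1.0, 0.0,
                   0.0,           0.0,           0.0, 1.0);
    vec4 p = m * position;
    // ...then squeeze y by the aspect ratio so the square stays square on screen
    p.y *= aspect;
    gl_Position = p;
}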
