How to use a UBO in OpenGL ES 2.0 shaders - opengl-es

1. I need to port a shader from desktop OpenGL to OpenGL ES 2.0, but I have a problem: I don't know how to carry over this structure, the UBO.
2. If the port succeeds, what should I assign to the program?
1.1 The code to be ported to OpenGL ES 2.0:
layout(binding = 0) uniform UniformBufferObject
{
    mat4 model;
    mat4 normal;
    mat4 view;
    mat4 proj;
    vec3 eyepos;
    material_struct material;
    light_struct lights[8];
} ubo;
2.1 I want to transform the vertex data. How should I assign this UBO to the program?
// vertex
int mPositionHandle = GLES20.glGetAttribLocation(_program, "vPosition");
GLES20.glEnableVertexAttribArray(mPositionHandle);
GLES20.glVertexAttribPointer(mPositionHandle, 3, GLES20.GL_FLOAT, false, 0, _buffer);
// color
int mColorHandle = GLES20.glGetAttribLocation(_program, "aColor");
GLES20.glEnableVertexAttribArray(mColorHandle);
GLES20.glVertexAttribPointer(mColorHandle, 4, GLES20.GL_FLOAT, false, 0, _color);
// UBO???
At present the vertex data, indices and colors are all in place, but the vertex coordinates are too large; I would like to map them into the range (-1, 1).

Uniform blocks are not provided in OpenGL ES 2.0; see the GLSL ES 1.00 specification.
Uniform blocks are supported from OpenGL ES 3.0 / GLSL ES 3.00 onwards.
See GLSL ES 3.00 specification - 4.3.7 Interface Blocks; page 43.
However, the binding layout qualifier is only provided since OpenGL ES 3.1 and GLSL ES 3.10.
See GLSL ES 3.10 specification - 4.4 Layout Qualifiers; page 51.
In OpenGL ES 3.0 the binding of a uniform block can be set with glUniformBlockBinding, and the uniform block index of the program can be retrieved with glGetUniformBlockIndex. In both cases the program has to have been linked successfully first. Note that the uniform block index is not to be confused with the uniform location; these are different things.
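For the OpenGL ES 3.0 route, a minimal sketch using the Android GLES30 bindings (uboData is a placeholder for a ByteBuffer holding the block contents in std140 layout; the program is assumed to be linked already):
int blockIndex = GLES30.glGetUniformBlockIndex(_program, "UniformBufferObject");
GLES30.glUniformBlockBinding(_program, blockIndex, 0);   // bind the block to binding point 0
int[] ubo = new int[1];
GLES30.glGenBuffers(1, ubo, 0);
GLES30.glBindBuffer(GLES30.GL_UNIFORM_BUFFER, ubo[0]);
GLES30.glBufferData(GLES30.GL_UNIFORM_BUFFER, uboData.capacity(), uboData, GLES30.GL_DYNAMIC_DRAW);
GLES30.glBindBufferBase(GLES30.GL_UNIFORM_BUFFER, 0, ubo[0]);  // attach the buffer to binding point 0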
In OpenGL ES 2.0 the only possibility is to use conventional Uniform variables.
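In practice that means declaring the block members as individual uniforms in the GLSL ES 1.00 shader (uniform mat4 model; uniform mat4 view; uniform mat4 proj; and so on) and uploading them one by one. A minimal sketch with the Android GLES20 bindings, assuming modelMatrix, viewMatrix and projMatrix are float[16] arrays on the application side:
// after the program has been linked and made current
int modelLoc = GLES20.glGetUniformLocation(_program, "model");
int viewLoc  = GLES20.glGetUniformLocation(_program, "view");
int projLoc  = GLES20.glGetUniformLocation(_program, "proj");
GLES20.glUniformMatrix4fv(modelLoc, 1, false, modelMatrix, 0);
GLES20.glUniformMatrix4fv(viewLoc, 1, false, viewMatrix, 0);
GLES20.glUniformMatrix4fv(projLoc, 1, false, projMatrix, 0);
Scaling the large vertex coordinates into the (-1, 1) range can then be handled by the model and projection matrices rather than by rewriting the vertex data.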

Related

OpenGL ES 3.2 doesn't recognize gl_in in geometry shader

I have the following shader code:
#version 320 es
layout(points) in;
layout(points, max_vertices=1) out;
uniform mat4 transform;
void main() {
    gl_Position = gl_in[0].gl_Position * transform;
    EmitVertex();
    EndPrimitive();
}
But when creating the shader program I get the following error:
'gl_in' : undeclared identifier
'gl_in' : left of '[' is not of type array, matrix, or vector
'gl_Position' : field selection requires structure, vector, or matrix on left hand side
'assign' : cannot convert from 'const highp float' to 'Position 4-component vector of highp float
But https://www.khronos.org/registry/OpenGL/specs/es/3.2/GLSL_ES_Specification_3.20.html explicitly states the existence of gl_in (as a built-in variable).
It turned out to be related to Intel UHD graphics not explicitly supporting OpenGL ES. When I checked which GLSL version was actually being requested, it was OpenGL 2.0, which does not support gl_in as a built-in.
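A quick way to check which context/GLSL version was actually created is to query it at runtime (a sketch using the Android GLES20 bindings; equivalent glGetString calls exist in every binding):
String glVersion = GLES20.glGetString(GLES20.GL_VERSION);
String glslVersion = GLES20.glGetString(GLES20.GL_SHADING_LANGUAGE_VERSION);
android.util.Log.d("GLInfo", "GL_VERSION = " + glVersion + ", GLSL = " + glslVersion);
// geometry shaders require an OpenGL ES 3.2 context (or the EXT_geometry_shader extension)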

How to draw the data (from glReadPixels) onto the default (display) framebuffer in OpenGL ES 2.0

Sorry if I am asking something that has already been answered; so far I could not find it. I read details about FBOs and got a fair idea of off-screen buffering. http://mattfife.com/?p=2813 is a nice little article on FBOs. In all the examples, including this one, I don't see details on how to draw the data retrieved through a glReadPixels call onto the default display framebuffer. Sorry if I am missing anything silly; I did my due diligence but could not find any example.
Note: I am using OpenGLES 2.0, hence I cannot use calls such as glDrawPixels, etc.
Basically my requirement is off-screen buffering, because I am working on subtitles/captions, where scrolling a caption means re-rendering the lines repeatedly until they move out of the caption display area.
I got a suggestion to use FBO and bind the texture created to the main default framebuffer.
My actual need is caption/ subtitle (which can be in scrolling mode)
Suppose the first time I had below on display,
This is Line Number - 1
This is Line Number - 2
This is Line Number - 3
After scrolling, then I want to have,
This is Line Number - 2
This is Line Number - 3
This is Line Number - 4
So the second time I render, will I have to update the content of the off-screen FBO? That would mean re-writing line 2 and line 3 at a new position, removing line 1 and adding line 4.
Create a framebuffer with a texture attachment (see Attaching Images). Note that glFramebufferTexture2D is supported by OpenGL ES 2.0.
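A minimal sketch of that setup with the Android GLES20 bindings (width and height would be the size of the caption area):
int[] fbo = new int[1];
int[] tex = new int[1];
GLES20.glGenTextures(1, tex, 0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex[0]);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glGenFramebuffers(1, fbo, 0);
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo[0]);
GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
        GLES20.GL_TEXTURE_2D, tex[0], 0);
// render the caption lines while this framebuffer is bound, then switch back with
// GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);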
The color plane of the framebuffer can be loaded to the CPU by glReadPixels, the same way as when you use a Renderbuffer. But the rendering is stored in a 2D Texture.
Bind the texture and the default framebuffer and render a screen space quad with the texture on it.
Render a quad (GL_TRIANGLE_FAN) with the vertex coordinates (-1, -1), (1, -1), (1, 1), (-1, 1) and use the following simple OpenGL ES 2.0 shaders:
Vertex shader
attribute vec2 pos;
varying vec2 uv;
void main()
{
    uv = pos * 0.5 + 0.5;
    gl_Position = vec4(pos, 0.0, 1.0);
}
Fragment shader
precision mediump float;
varying vec2 uv;
uniform sampler2D u_texture;
void main()
{
    gl_FragColor = texture2D(u_texture, uv);
}
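Host-side, drawing that quad into the default framebuffer could look roughly like this (a sketch with the Android GLES20 bindings; quadProgram, posLoc, texLoc, quadBuffer and tex are assumed to have been created beforehand):
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);    // back to the default (display) framebuffer
GLES20.glUseProgram(quadProgram);
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex[0]);    // the texture the captions were rendered into
GLES20.glUniform1i(texLoc, 0);                         // u_texture = texture unit 0
GLES20.glEnableVertexAttribArray(posLoc);
GLES20.glVertexAttribPointer(posLoc, 2, GLES20.GL_FLOAT, false, 0, quadBuffer);  // (-1,-1) (1,-1) (1,1) (-1,1)
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_FAN, 0, 4);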

How to efficiently interpolate between many color attributes in GLSL ES 2.0

I'm working on a project with OpenGL ES 2.0. Every vertex in my mesh has a fixed number of color attributes (let's say 5). The final per-vertex color is computed as an interpolation between two selected color attributes.
In my implementation, the choice of the two colors is based on two given indices. I'm aware that an if statement can be a big performance hit, so I chose to put all the attributes into one array and use indexing to retrieve the wanted colors. Still, I see a significant performance drop.
attribute vec4 a_position;
//The GLSL ES 2.0 specification states that attributes cannot be declared as arrays.
attribute vec4 a_color;
attribute vec4 a_color1;
attribute vec4 a_color2;
attribute vec4 a_color3;
attribute vec4 a_color4;
uniform mat4 u_projTrans;
uniform int u_index;
uniform int u_index1;
uniform float u_interp;
varying vec4 v_color;
void main()
{
    vec4 colors[5];
    colors[0] = a_color;
    colors[1] = a_color1;
    colors[2] = a_color2;
    colors[3] = a_color3;
    colors[4] = a_color4;
    v_color = mix(colors[u_index], colors[u_index1], u_interp);
    gl_Position = u_projTrans * a_position;
}
Is there a better more efficient way of computing the final color interpolation? Or at least a better way to choose interpolation colors?
The indices you use here are uniforms. That means every vertex in each rendering command uses the same indices. If that's the case... why do you bother fetching this stuff in the VS at all?
You should only have 2 color input values. You then use glVertexAttribPointer to pick the two arrays that will be interpolated between.
Your "significant performance drop" likely has nothing to do with how you fetch such values, and everything to do with the fact that you're sending lots of per-vertex data that never gets used for anything.

Can we use structs for uniforms in GLSL ES?

I am trying to use structs for the uniforms in my vertex shader in ES:
struct temp{
    mat4 mvp;
};
uniform temp MP;
in vec2 inPos;
void main() {
    vec4 vert = MP.mvp * vec4(inPos.x, inPos.y, 0, 1);
    gl_Position = vert;
}
glGetUniformLocation(program, "MP.mvp");
It does not display anything on screen, nor does it produce any GL error. Is this allowed in GLSL ES 3.00? It works with desktop OpenGL.
From The OpenGL ES Shading Language v3.00, in section 4.3.5 Uniform Variables it states:
The uniform qualifier can be used with any of the basic data types, or when declaring a variable whose type is a structure, or an array of any of these.
If glGetUniformLocation(program, "MP.mvp") is returning 0 or greater then what you've shared so far looks legit. You'll probably have to post more code to get to the bottom of it.
If your struct only has a mat4 in it then I would recommend just eliminating the struct altogether.
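For comparison, a struct-free version of the same shader could look like this (a sketch; the uniform name mvp is arbitrary):
#version 300 es
uniform mat4 mvp;   // plain uniform instead of a struct member
in vec2 inPos;
void main() {
    gl_Position = mvp * vec4(inPos.x, inPos.y, 0.0, 1.0);
}
with the location then looked up as glGetUniformLocation(program, "mvp").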

How can I get Alpha blending transparency working in OpenGL ES 2.0?

I'm in the midst of porting some code from OpenGL ES 1.x to OpenGL ES 2.0, and I'm struggling to get transparency working as it did before; all my triangles are being rendered fully opaque.
My OpenGL setup has these lines:
// Draw objects back to front
glDisable(GL_DEPTH_TEST);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDepthMask(false);
And my shaders look like this:
attribute vec4 Position;
uniform highp mat4 mat;
attribute vec4 SourceColor;
varying vec4 DestinationColor;
void main(void) {
    DestinationColor = SourceColor;
    gl_Position = Position * mat;
}
and this:
varying lowp vec4 DestinationColor;
void main(void) {
    gl_FragColor = DestinationColor;
}
What could be going wrong?
EDIT: If I manually set the alpha to 0.5 in the fragment shader (or indeed in the vertex shader), as suggested by keaukraine below, then everything is rendered transparent. Furthermore, if I change the color values I'm passing to OpenGL to floats instead of unsigned bytes, then the code works correctly.
So it looks as though something is wrong with the code that was passing the color information into OpenGL, and I'd still like to know what the problem was.
My vertices were defined like this (unchanged from the OpenGL ES 1.x code):
typedef struct
{
    GLfloat x, y, z, rhw;
    GLubyte r, g, b, a;
} Vertex;
And I was using the following code to pass them into OpenGL (similar to the OpenGL ES 1.x code):
glBindBuffer(GL_ARRAY_BUFFER, glTriangleVertexBuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(Vertex) * nTriangleVertices, triangleVertices, GL_STATIC_DRAW);
glUniformMatrix4fv(matLocation, 1, GL_FALSE, m);
glVertexAttribPointer(positionSlot, 4, GL_FLOAT, GL_FALSE, sizeof(Vertex), (void*)offsetof(Vertex, x));
glVertexAttribPointer(colorSlot, 4, GL_UNSIGNED_BYTE, GL_FALSE, sizeof(Vertex), (void*)offsetof(Vertex, r));
glDrawArrays(GL_TRIANGLES, 0, nTriangleVertices);
glBindBuffer(GL_ARRAY_BUFFER, 0);
What is wrong with the above?
Your colour vertex attribute values are not being normalized. This means that the vertex shader sees values for that attribute in the range 0-255.
Change the fourth argument of glVertexAttribPointer to GL_TRUE and the values will be normalized (scaled to the range 0.0-1.0) as you originally expected.
see http://www.khronos.org/opengles/sdk/docs/man/xhtml/glVertexAttribPointer.xml
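Concretely, only the colour attribute pointer from the code above needs to change (a sketch, keeping the original names):
// GL_TRUE tells OpenGL to normalize the GLubyte colour values from 0-255 to 0.0-1.0
glVertexAttribPointer(colorSlot, 4, GL_UNSIGNED_BYTE, GL_TRUE, sizeof(Vertex), (void*)offsetof(Vertex, r));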
I suspect the DestinationColor varying to your fragment shader always contains 0xFF for the alpha channel? If so, that is your problem. Try changing that so that the alpha actually varies.
Update: We found 2 good solutions:
Use floats instead of unsigned bytes for the colour values that are supplied to the DestinationColor varying in the fragment shader.
Or, as GuyRT pointed out, you can change the fourth argument of glVertexAttribPointer to GL_TRUE to tell OpenGL ES to normalize the values when they are converted from integers to floats.
To debug this situation, you can try setting constant alpha and see if it makes a difference:
varying lowp vec4 DestinationColor;
void main(void) {
    gl_FragColor = DestinationColor;
    gl_FragColor.a = 0.5; /* try other values from 0 to 1 to test blending */
}
Also, you should ensure that you're picking an EGL config with an alpha channel.
And don't forget to specify the precision for floats in fragment shaders! Read the OpenGL ES 2.0 GLSL specification (http://www.khronos.org/registry/gles/specs/2.0/GLSL_ES_Specification_1.0.17.pdf, section 4.5.3), and please see this answer: https://stackoverflow.com/a/6336285/405681
