GLSL - Checking for set attributes - opengl-es

I have a vertex shader with attributes that may or may not be set in any given frame. How can I check whether or not these attributes have been set?
What I'd like to do:
attribute vec3 position;
attribute vec3 normal;
attribute vec4 color;
attribute vec2 textureCoord;
uniform mat4 perspective;
uniform mat4 modelview;
uniform mat4 camera;
uniform sampler2D textureSampler;
varying lowp vec4 vColor;
void main() {
    gl_Position = perspective * camera * modelview * vec4(position, 1.0);
    if ((bool)textureCoord) { // syntax error
        vColor = texture2D(textureSampler, textureCoord);
    } else {
        vColor = (bool)color ? color : vec4((normal.xyz + 1.0) / 2.0, 1.0);
    }
}

I have a vertex shader with attributes that may or may not be set in any given frame.
No, you don't. :)
With attributes, it's impossible for one not to be "set". Every vertex shader instance receives a valid value for every declared attribute.
If the attribute array is not enabled with glEnableVertexAttribArray, then the attribute's current default value (as specified by glVertexAttrib, or (0, 0, 0, 1) initially) will be passed instead.
In your case, you can either:
compile your shader in different versions, with and without texturing (conditional compilation is your friend; google for UberShader),
use a uniform variable like "useTexturing" to branch at runtime and save on shader switches (see the sketch below).
Pick one.
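For the second option, a minimal sketch built from the shader in the question; useTexturing is the uniform suggested above, set per draw call from the host with glUniform1i:

attribute vec3 position;
attribute vec4 color;
attribute vec2 textureCoord;
uniform mat4 perspective;
uniform mat4 modelview;
uniform mat4 camera;
uniform sampler2D textureSampler;
uniform bool useTexturing; // set with glUniform1i(location, 0 or 1)
varying lowp vec4 vColor;

void main() {
    gl_Position = perspective * camera * modelview * vec4(position, 1.0);
    if (useTexturing) {
        vColor = texture2D(textureSampler, textureCoord);
    } else {
        vColor = color; // or derive a color from the normal, as in the question
    }
}

(Two caveats: bool uniforms are set with glUniform1i, where any nonzero value reads as true; and sampling a texture in the vertex shader, as the question's code does, requires MAX_VERTEX_TEXTURE_IMAGE_UNITS > 0, which ES 2.0 does not guarantee. Usually the sampling is moved to the fragment shader.)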

Related

Declare external global variables for glsl validator / webgl / three.js

I'm building a project with three.js and importing GLSL files externally (with glsl-ify-loader) for use in a Three ShaderMaterial.
When using ShaderMaterial, Three prepends global variables like projectionMatrix and modelViewMatrix to my shader code before compilation, when it concatenates the shader. So when I write my shader, all I need is (as a simple example):
varying vec3 vNormal;
void main () {
    vNormal = normal;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
... or similar.
My problem is that I'm using the GLSL validator on my shader files, which consequently thinks that the pre-declared Three variables are undeclared.
In JS, with ESLint you can put /* global aGlobalVariableHere */ to appease the lint gods.
Is there any way of doing this with the GLSL validator? I can't find any resources that suggest how I could go about it.
You could use THREE.RawShaderMaterial (see docs) instead of ShaderMaterial. They're identical, except that Raw doesn't prepend any uniforms or attributes to your shader at all; you have to declare them manually. Then your linter will no longer act surprised:
Top of vertex shader:
precision highp float;
uniform mat4 modelMatrix;
uniform mat4 viewMatrix;
uniform mat4 projectionMatrix;
uniform vec3 cameraPosition;
// ...
attribute vec3 position;
attribute vec3 normal;
attribute vec2 uv;
// ...
You can read this page to see what uniforms and attributes get automatically added, so you can declare them yourself in your shader code as needed.

Threejs normal values in shader are set to 0

I'm trying to get this tutorial to work, but I ran into two issues, one of which can be found here. The other is the following.
For convenience, this is the code that is supposed to work, and here's a jsfiddle.
Vertex-shader:
uniform mat4 projectionMatrix;
uniform mat4 modelViewMatrix;
attribute vec3 position;
uniform vec3 normal;
varying vec3 vNormal;
void main() {
    test = 0.5;
    vNormal = normal;
    gl_Position = projectionMatrix *
                  modelViewMatrix *
                  vec4(position, 1.0);
}
Fragment-shader:
varying mediump vec3 vNormal;
void main() {
    mediump vec3 light = vec3(0.5, 0.2, 1.0);
    // ensure it's normalized
    light = normalize(light);
    // calculate the dot product of
    // the light to the vertex normal
    mediump float dProd = max(0.0, dot(vNormal, light));
    // feed into our frag colour
    gl_FragColor = vec4(dProd, // R
                        dProd, // G
                        dProd, // B
                        1.0);  // A
}
The values for normal in the vertex shader, or at least the values for vNormal in the fragment shader, seem to be 0. The sphere that is supposed to show up stays black. As soon as I set gl_FragColor manually, the sphere changes color. Can anybody tell me why this is not working?
In your vertex shader, vec3 normal should be an attribute (since each vertex has a normal), not a uniform:
attribute vec3 normal;
Here is the working version of your code.
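For reference, a sketch of what the corrected vertex shader looks like (the stray test = 0.5; line from the question is dropped here as well, since test is never declared and the shader would not compile with it):

uniform mat4 projectionMatrix;
uniform mat4 modelViewMatrix;
attribute vec3 position;
attribute vec3 normal; // was mistakenly declared as a uniform
varying vec3 vNormal;

void main() {
    vNormal = normal;
    gl_Position = projectionMatrix *
                  modelViewMatrix *
                  vec4(position, 1.0);
}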

OpenGL ES shaders: wrong uniform locations

The vertex shader looks like this:
uniform mat4 projectionMatrix;
uniform mat4 modelMatrix;
uniform mat4 viewMatrix;
attribute vec4 vPosition;
attribute vec4 vColor;
varying vec4 vDestinationColor;
void main(void)
{
    gl_Position = projectionMatrix * modelMatrix * viewMatrix * vPosition;
    vDestinationColor = vColor;
}
Objective-C code:
_projectionMatrixSlot = glGetUniformLocation(_programHandle, "projectionMatrix");
_modelMatrixSlot = glGetUniformLocation(_programHandle, "modelMatrix");
_viewMatrixSlot = glGetUniformLocation(_programHandle, "viewMatrix");
_positionAttribSlot = glGetAttribLocation(_programHandle, "vPosition");
_colorAttribSlot = glGetAttribLocation(_programHandle, "vColor");
Here _projectionMatrixSlot, _viewMatrixSlot, and _modelMatrixSlot all equal 4294967295,
while _positionAttribSlot and _colorAttribSlot are fine.
The compiler is free to throw away variables that are not used in the code. So even if a uniform is declared in the shader, as long as it is not actually used, its reported location can be -1 (which is 4294967295 when stored in an unsigned 32-bit integer).
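A hypothetical shader to illustrate, with one uniform that survives linking and one that the compiler may eliminate:

uniform mat4 projectionMatrix; // used below: active, gets a real location
uniform mat4 unusedMatrix;     // never read: may be optimized away, so
                               // glGetUniformLocation can return -1 for it
attribute vec4 vPosition;

void main(void)
{
    gl_Position = projectionMatrix * vPosition;
}

Because of this, host code should check each returned location against -1 before calling glUniform*.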
You may have attached the wrong vertex shader, not the one you posted here.

Compute normals in shader issue

I have the following vertex shader to rotate normals. Before I implemented this, I also passed in the mesh's rotation matrix to calculate the normals, and back then the lighting was just fine.
#version 150
uniform mat4 projection;
uniform mat4 modelview;
in vec3 position;
in vec3 normal;
in vec2 texcoord;
out vec3 fposition;
out vec3 fnormal;
out vec2 ftexcoord;
void main()
{
    mat4 mvp = projection * modelview;
    fposition = vec3(mvp * vec4(position, 1.0));
    fnormal = normalize(mat3(transpose(inverse(modelview))) * normal);
    ftexcoord = texcoord;
    gl_Position = mvp * vec4(position, 1.0);
}
But with this shader, the lighting computed in the fragment shader turns with the camera. I haven't changed the fragment shader, so the issue should be in the code above.
What am I doing wrong in computing the normals?
The steps you use to create the normal matrix might be out of order.
Try:
fnormal = normalize(transpose(inverse(mat3(modelview))) * normal);
Edit:
Since you are inverting the full mat4, the translation values (which get dropped when a mat4 is converted to a mat3) are probably affecting the calculation of the inverse matrix.
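Side by side, the two orderings being contrasted here (only the position of the mat3 truncation changes):

// question's version: invert and transpose the full mat4, then truncate
fnormal = normalize(mat3(transpose(inverse(modelview))) * normal);
// suggested version: truncate to mat3 first, then invert and transpose
fnormal = normalize(transpose(inverse(mat3(modelview))) * normal);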

Desktop GLSL without ftransform()

I'm porting a codebase of mine from fixed-function OpenGL 1.x to OpenGL 2.x (technically OpenGL ES 2.0, but I'm still coding on the desktop, just keeping in mind the limitations that ES 2.0 imposes, which are similar to those of the 3.1 'new' profile).
The problem is that, for anything other than 2D, a shader that takes the model-view-projection matrix as a uniform doesn't seem to work. Normally I get a black screen, but if I set the Z value of all my vertices to 0, things show up.
Putting my shaders in RenderMonkey works when I have ES 2.0 mode enabled, but on standard desktop GL it's just a black screen (no compiler errors/warnings):
vert shader:
uniform mat4 mvp_matrix;
uniform mat4 obj_matrix;
uniform vec4 u_color;
attribute vec3 a_vertex;
attribute vec2 a_texcoord0;
varying vec4 v_color;
varying vec2 v_texcoord0;
void main(void)
{
    v_color = u_color;
    gl_Position = mvp_matrix * (obj_matrix * vec4(a_vertex, 1.0));
    v_texcoord0 = a_texcoord0;
}
frag shader:
uniform sampler2D t_texture0;
varying vec2 v_texcoord0;
varying vec4 v_color;
void main(void)
{
    vec4 color = texture2D(t_texture0, v_texcoord0);
    gl_FragColor = color * v_color;
}
I am passing in the matrices as glUniformMatrix4fv(location, 1, GL_FALSE, mvpMatrix);
This shader works like gold for anything drawn in 2D. What am I doing wrong here? Or am I required to use ftransform() on desktop GL?
One thing I think needs a bit of clarification:
A model matrix transforms an object from object coordinates to world coordinates.
A view matrix transforms the world coordinates to eye coordinates.
A projection matrix converts eye coordinates to clip coordinates.
Based on standard naming conventions, the mvpMatrix is projection * view * model, in that order. There are no other matrices that you need to multiply by. Projection is your projection matrix (either ortho or perspective), view is the camera transform matrix (NOT the modelview), and model is the position, scale, and rotation of your object.
I believe the issue lies either in multiplying matrices that don't need to be multiplied together or in multiplying matrices in the wrong order (matrix multiplication isn't commutative).
If you haven't already solved this, I would recommend sending all three matrices over separately and reading the values back later to make sure there are no issues in how the matrices are sent.
Vertex shader:
attribute vec4 a_vertex;
attribute vec2 a_texcoord0;
varying vec2 v_texcoord0;
uniform mat4 projection;
uniform mat4 view;
uniform mat4 model;
void main(void)
{
    gl_Position = projection * view * model * a_vertex;
    v_texcoord0 = a_texcoord0;
}
Fragment Shader:
uniform sampler2D t_texture0;
uniform vec4 u_color;
varying vec2 v_texcoord0;
void main(void)
{
    vec4 color = texture2D(t_texture0, v_texcoord0);
    gl_FragColor = color * u_color;
}
Also, I moved the color uniform to the fragment shader; passing it through as a varying is unnecessary when all the vertices have the same color.
