OpenGL ES and Emscripten: shader version unsupported - opengl-es

I am trying to target OpenGL ES #version 300 es through Emscripten, but the compilation gives the error "unsupported shader version".
I am using Emscripten version 2.0.6.
The emcmake command I am using (taken as reference from here):
$emcmake cmake ../ -DUSE_EMSCRIPTEN=1 -DCMAKE_FIND_ROOT_PATH=/ -DCMAKE_CXX_FLAGS="-sFULL_ES3 -sUSE_WEBGL2 -s MAX_WEBGL_VERSION=2 --preload-file vertex.vs --preload-file fragment.fs --preload-file test.jpg -O1 -std=c++17 --profiling -pthread -s FETCH=1 -s WASM=1 -s NO_EXIT_RUNTIME=1 -s INITIAL_MEMORY=1000MB -s USE_PTHREADS=1 -s PTHREAD_POOL_SIZE=20 -s \"EXTRA_EXPORTED_RUNTIME_METHODS=['ccall']\" -s DISABLE_EXCEPTION_CATCHING=0 --bind -s USE_SDL=2"
My vertex shader code:
#version 300 es
in vec3 position;
in vec4 color;
in vec3 normal;
in vec2 uv;
out vec4 v_color;
out vec3 v_normal;
out vec2 v_uv;
uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;
void main()
{
v_color = color;
v_normal = normal;
v_uv = uv;
gl_Position = projection * view * transpose(model) * vec4(position.xyz, 1.0);
}
What could be the reason here?
Edit: I guess this explains it, not sure though.
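A common cause of this error is that the page ends up with a WebGL 1 (ES 2.0) context, which rejects GLSL ES 3.00 sources such as #version 300 es; -s MAX_WEBGL_VERSION=2 only makes WebGL 2 available, it does not by itself make the application request it. Since the build uses USE_SDL=2, here is a minimal sketch (assumed SDL2 setup, not taken from the original project) of explicitly asking for an ES 3.0 context, which Emscripten maps to WebGL 2:
#include <SDL.h>

// Sketch: request an OpenGL ES 3.0 context before creating the window,
// so that "#version 300 es" shaders are accepted.
SDL_Window* createEs3Window()
{
    SDL_Init(SDL_INIT_VIDEO);

    SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_ES);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 0);

    SDL_Window* window = SDL_CreateWindow("demo",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        800, 600, SDL_WINDOW_OPENGL);
    SDL_GL_CreateContext(window);
    return window;
}
If the context that actually gets created is still WebGL 1, the shader compiler will report exactly this kind of "unsupported shader version" error, because WebGL 1 only accepts GLSL ES 1.00 (#version 100 or no version directive at all).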

Related

How to make a uniform variable visible in both shaders?

I have a variable "uIsTeapot" in my vertex shader:
uniform float uIsTeapot;
The vertex shader works with it very well, but the fragment shader does not see it. If I use
if (uIsTeapot == 0.0)
then an error occurs:
"uIsTeapot": undeclared identifier
even though I defined it in the vertex shader. If I define uIsTeapot as a uniform in both shaders, then the program says "Could not initialise shaders", since the program does not pass verification:
if (!gl.getProgramParameter(shaderProgram, gl.LINK_STATUS)) {
alert("Could not initialise shaders");
}
EDITED:
I added mediump to the variables and now the program compiles without errors, but only one object shows up on screen even though I draw two objects.
Vertex shader
attribute vec3 aVertexPosition;
attribute vec3 aVertexNormal;
attribute vec2 aTextureCoord;
attribute vec4 aVertexColor;
uniform mat4 uMVMatrix;
uniform mat4 uPMatrix;
uniform mat3 uNMatrix;
uniform vec3 uAmbientColor;
uniform vec3 uPointLightingLocation;
uniform vec3 uPointLightingColor;
uniform mediump float uIsTeapot;
//varying float vIsTeapot;
varying vec2 vTextureCoord;
varying vec3 vLightWeighting;
varying vec4 vColor;
void main(void) {
if (uIsTeapot == 1.0) {
vec4 mvPosition = uMVMatrix * vec4(aVertexPosition, 1.0);
gl_Position = uPMatrix * mvPosition;
vec3 lightDirection = normalize(uPointLightingLocation - mvPosition.xyz);
vec3 transformedNormal = uNMatrix * aVertexNormal;
float directionalLightWeighting = max(dot(transformedNormal, lightDirection), 0.0);
directionalLightWeighting = 100.0;
vLightWeighting = uAmbientColor + uPointLightingColor * directionalLightWeighting;
vTextureCoord = aTextureCoord;
}
if (uIsTeapot == 0.0) {
gl_Position = uPMatrix * uMVMatrix * vec4(aVertexPosition, 1.0);
vColor = aVertexColor;
} }
Fragment shader
precision mediump float;
varying vec2 vTextureCoord;
varying vec3 vLightWeighting;
varying vec4 vColor;
uniform sampler2D uSampler;
uniform mediump float uIsTeapot;
//varying float vIsTeapot;
void main(void) {
if (uIsTeapot == 1.0) {
vec4 textureColor = texture2D(uSampler, vec2(vTextureCoord.s, vTextureCoord.t));
gl_FragColor = vec4(textureColor.rgb * vLightWeighting, textureColor.a);
} else {
gl_FragColor = vColor;
}
}
You need to declare it in both shaders, with the same precision.
What do you mean by:
"then program don't work right."
Can you post your vertex shader and your fragment shader? I suspect you are relying on the default precision, and it is likely that you are using highp in the vertex shader and mediump in the fragment shader.
Try using:
uniform mediump float uIsTeapot;
... in both vertex shader and fragment shader.
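Independent of the fix, the actual reason for a "Could not initialise shaders" failure is available from the program info log, which typically names the offending uniform (here, the precision mismatch on uIsTeapot). A minimal sketch of reading it, shown with the GLES C API; the WebGL equivalents are gl.getProgramParameter(program, gl.LINK_STATUS) and gl.getProgramInfoLog(program):
#include <GLES2/gl2.h>
#include <cstdio>
#include <vector>

// Sketch: after glLinkProgram, print the info log instead of a generic alert,
// so the real cause of the link failure is visible.
bool checkLinkStatus(GLuint program)
{
    GLint linked = GL_FALSE;
    glGetProgramiv(program, GL_LINK_STATUS, &linked);
    if (linked == GL_TRUE)
        return true;

    GLint logLength = 0;
    glGetProgramiv(program, GL_INFO_LOG_LENGTH, &logLength);
    std::vector<char> log(logLength > 1 ? logLength : 1, '\0');
    glGetProgramInfoLog(program, static_cast<GLsizei>(log.size()), nullptr, log.data());
    std::fprintf(stderr, "Program link failed: %s\n", log.data());
    return false;
}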

GLSL : Use of undeclared identifier 'gl_FragData'

I'm debugging a really big project that shades a tree. The author used GLSL files for the tree shaders, but I have trouble compiling the GLSL files.
Here is the error log:
compile: code/trees/TreeRender/Shader/Graph40.vert.glsl
ERROR: 0:4: '' : syntax error: #version
ERROR: 0:15: 'layout' : syntax error: syntax error
The Graph40.vert.glsl:
#version 120 core // the original file uses version 400
#define VERT_POSITION 0
#define VERT_NORMAL 1
#define VERT_COLOR 2
#define VERT_TEXTURE 3
uniform mat4x4 matModel;
uniform mat4x4 matView;
uniform mat4x4 matProjection;
layout(location = VERT_POSITION) in vec4 Position;
layout(location = VERT_NORMAL) in vec4 Normal;
layout(location = VERT_COLOR) in vec4 Color;
layout(location = VERT_TEXTURE) in vec4 Texture;
out vec4 VertPosition;
out vec4 VertNormal;
out vec4 VertColor;
out vec4 VertTexture;
void main()
{
VertPosition = Position;
VertNormal = Normal;
VertColor = Color;
VertTexture = Texture;
gl_Position = matProjection * matView * matModel * vec4(Position.xyz, 1);
}
Another error log:
compile: code/trees/TreeRender/Shader/Default.vert.glsl
ERROR: 0:11: 'matModel' : syntax error: syntax error
The Default.vert.glsl:
#define VERT_POSITION 0
#define VERT_NORMAL 1
#define VERT_COLOR 2
#define VERT_TEXTURE 3
uniform mat4x4 matModel;
uniform mat4x4 matView;
uniform mat4x4 matProjection;
layout(location = VERT_POSITION) in vec4 Position;
layout(location = VERT_NORMAL) in vec4 Normal;
layout(location = VERT_COLOR) in vec4 Color;
layout(location = VERT_TEXTURE) in vec4 Texture;
out vec4 VertPosition;
out vec4 VertNormal;
out vec4 VertColor;
out vec4 VertTexture;
void main()
{
VertPosition = Position;
VertNormal = Normal;
VertColor = Color;
VertTexture = Texture;
gl_Position = matProjection * matView * matModel * vec4(Position.xyz, 1);
}
I tried to Google the error, but found no feasible solution.
I use Mac OS X with Xcode 7.0; OpenGL and GLUT are the default versions, and GLEW is 1.13.0.
Is that because my versions don't match the ones the original author used? I checked the original project: he used GLEW 1.9.0 and GLUT 3.7.6.
/////update//////
The original GLSL files have:
#version 400 core
but then there is this error:
ERROR: 0:4: '' : version '400' is not supported
ERROR: 0:4: '' : syntax error: #version
so I commented that line out, but the other errors are still there.
I checked my OpenGL version using OpenGL Extension viewer: it's 4.1 on my Mac, and older versions are also supported and should work too. But when I change to #version 410 core, I get the same error, saying that 410 is not supported.
///////////update////////////
It turned out that the version my Mac supports is NOT the version my context is using. I printed GL_VERSION in my code, and it's 2.1 that I was actually getting. I have now changed it to 4.1, according to [this][1]. But there are still errors:
trees/TreeRender/Shader/DefaultDepth.frag.glsl
helloERROR: 0:20: Use of undeclared identifier 'gl_FragData'
The DefaultLight.frag.glsl:
#version 400 core
in vec4 VertPosition;
in vec4 VertNormal;
in vec4 VertColor;
in vec4 VertTexture;
uniform vec3 lightPos;
void main()
{
float moment1 = gl_FragCoord.z;
float moment2 = moment1 * moment1;
gl_FragData[0] = vec4(moment1, moment2, 0.0, 1.0);
}
Version 120 does not support core profiles (profiles only exist in OpenGL 3.2+).
Layout qualifiers are also only available in OpenGL 3.2+.
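On macOS the legacy default context is OpenGL 2.1 (GLSL 1.20), which is why #version 400 / 410 was rejected; a 3.2+ core profile context has to be requested explicitly. A small sketch of doing that with Apple's GLUT, as used in the question (GLUT_3_2_CORE_PROFILE is Apple-specific), and printing what was actually created:
#include <GLUT/glut.h>
#include <cstdio>

// Sketch (macOS + Apple GLUT): request a 3.2+ core profile context instead of
// the legacy 2.1 default, then print the versions the driver actually provides.
int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_3_2_CORE_PROFILE | GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH);
    glutCreateWindow("core profile test");

    std::printf("GL_VERSION:   %s\n", (const char*)glGetString(GL_VERSION));
    std::printf("GLSL version: %s\n", (const char*)glGetString(GL_SHADING_LANGUAGE_VERSION));
    return 0;
}
Note that the later "Use of undeclared identifier 'gl_FragData'" error is the flip side of the same change: in a core profile fragment shader the gl_FragData / gl_FragColor builtins are not available, and a user-declared output (e.g. out vec4 fragData0;) is written to instead.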

Can't get OpenGL code to work properly

I'm very new to OpenGL, GLSL and WebGL. I'm trying to get this sample code to work in a tool like http://shdr.bkcore.com/ but I can't get it to work.
Vertex Shader:
void main()
{
gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
gl_TexCoord[0] = gl_MultiTexCoord0;
}
Fragment Shader:
precision highp float;
uniform float time;
uniform vec2 resolution;
varying vec3 fPosition;
varying vec3 fNormal;
uniform sampler2D tex0;
void main()
{
float border = 0.01;
float circle_radius = 0.5;
vec4 circle_color = vec4(1.0, 1.0, 1.0, 1.0);
vec2 circle_center = vec2(0.5, 0.5);
vec2 uv = gl_TexCoord[0].xy;
vec4 bkg_color = texture2D(tex0,uv * vec2(1.0, -1.0));
// Offset uv with the center of the circle.
uv -= circle_center;
float dist = sqrt(dot(uv, uv));
if ( (dist > (circle_radius+border)) || (dist < (circle_radius-border)) )
gl_FragColor = bkg_color;
else
gl_FragColor = circle_color;
}
I figured that this code must be from an outdated version of the language, so I changed the vertex shader to:
precision highp float;
attribute vec2 position;
attribute vec3 normal;
varying vec2 TextCoord;
attribute vec2 textCoord;
uniform mat3 normalMatrix;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
varying vec3 fNormal;
varying vec3 fPosition;
void main()
{
gl_Position = vec4(position, 0.0, 1.0);
TextCoord = vec2(textCoord);
}
That seemed to fix the error messages about undeclared identifiers and not being able to "convert from 'float' to highp 4-component something-or-other", but I have no idea whether, functionally, this does the same thing as the original intended.
Also, when I convert to this version of the Vertex Shader I have no idea what I'm supposed to do with this line in the Fragment Shader:
vec2 uv = gl_TexCoord[0].xy;
How do I convert this line to fit in with the converted vertex shader and how can I be sure that the vertex shader is even converted correctly?
gl_TexCoord is from desktop OpenGL, and not part of OpenGL ES. You'll need to create a new user-defined vec2 varying to hold the coordinate value.
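As a sketch of what that looks like against the converted vertex shader from the question (identifier names are assumptions, and the GLSL ES sources are carried here as C++ string literals the way they would be fed to the shader compiler), the texture coordinate is written to a user-defined varying in the vertex stage and read from it in the fragment stage:
// Sketch: a user-defined varying vTexCoord replaces gl_TexCoord[0].xy.
static const char* kVertexSrc = R"(
precision highp float;
attribute vec2 position;
attribute vec2 textCoord;
varying vec2 vTexCoord;
void main() {
    vTexCoord = textCoord;
    gl_Position = vec4(position, 0.0, 1.0);
}
)";

static const char* kFragmentSrc = R"(
precision highp float;
uniform sampler2D tex0;
varying vec2 vTexCoord;   // was: gl_TexCoord[0].xy
void main() {
    gl_FragColor = texture2D(tex0, vTexCoord * vec2(1.0, -1.0));
}
)";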

Threejs normal values in shader are set to 0

I'm trying to get this tutorial to work but I ran into two issues, one of which can be found here. The other one is the following.
For convenience this is the code that is supposed to work and here's a jsfiddle.
Vertex-shader:
uniform mat4 projectionMatrix;
uniform mat4 modelViewMatrix;
attribute vec3 position;
uniform vec3 normal;
varying vec3 vNormal;
void main() {
test = 0.5;
vNormal = normal;
gl_Position = projectionMatrix *
modelViewMatrix *
vec4(position,1.0);
}
Fragment-shader:
varying mediump vec3 vNormal;
void main() {
mediump vec3 light = vec3(0.5, 0.2, 1.0);
// ensure it's normalized
light = normalize(light);
// calculate the dot product of
// the light to the vertex normal
mediump float dProd = max(0.0, dot(vNormal, light));
// feed into our frag colour
gl_FragColor = vec4(dProd, // R
dProd, // G
dProd, // B
1.0); // A
}
The values for normal in the vertex shader or at least the values for vNormal in the fragment shader seem to be 0. The sphere that is supposed to show up stays black. As soon as I change the values for gl_FragColor manually the sphere changes colors. Can anybody tell me why this is not working?
In your vertex shader, the vec3 normal should be an attribute (since each vertex has its own normal), not a uniform:
attribute vec3 normal;
Here is the working version of your code.
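The distinction matters because a uniform holds a single value for the whole draw call (and an unset uniform defaults to zero, matching the all-black sphere), while an attribute is fetched per vertex from a buffer. A generic GLES sketch of what the attribute setup looks like at the API level (not three.js-specific; three.js binds its geometry attributes for you):
#include <GLES2/gl2.h>

// Sketch: per-vertex normals come from a vertex buffer through an attribute;
// a uniform (glUniform3f) would give every vertex the same single value.
void bindNormalAttribute(GLuint program, GLuint normalBuffer)
{
    GLint normalLoc = glGetAttribLocation(program, "normal");
    if (normalLoc < 0)
        return;  // attribute not active in the linked program

    glBindBuffer(GL_ARRAY_BUFFER, normalBuffer);
    glEnableVertexAttribArray((GLuint)normalLoc);
    glVertexAttribPointer((GLuint)normalLoc, 3, GL_FLOAT, GL_FALSE, 0, (const void*)0);
}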

OpenGL ES shaders: wrong uniform locations

The vertex shader looks like this:
uniform mat4 projectionMatrix;
uniform mat4 modelMatrix;
uniform mat4 viewMatrix;
attribute vec4 vPosition;
attribute vec4 vColor;
varying vec4 vDestinationColor;
void main(void)
{
gl_Position = projectionMatrix * modelMatrix * viewMatrix * vPosition;
vDestinationColor = vColor;
}
Objective-C code:
_projectionMatrixSlot = glGetUniformLocation(_programHandle, "projectionMatrix");
_modelMatrixSlot = glGetUniformLocation(_programHandle, "modelMatrix");
_viewMatrixSlot = glGetUniformLocation(_programHandle, "viewMatrix");
_positionAttribSlot = glGetAttribLocation(_programHandle, "vPosition");
_colorAttribSlot = glGetAttribLocation(_programHandle, "vColor");
Here _projectionMatrixSlot, _viewMatrixSlot, and _modelMatrixSlot all equal 4294967295,
while _positionAttribSlot and _colorAttribSlot are fine.
The compiler is free to throw away variables that are not used in the code. Therefore, even if a uniform is declared in the shader, as long as it is not actually used, glGetUniformLocation reports -1 for it; stored into an unsigned variable, that -1 shows up as 4294967295 (the maximum 32-bit unsigned value).
You may have attached the wrong vertex shader, not the one you posted here.
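Either way, it is worth checking the return value before caching it in an unsigned field. A small sketch (GLES C API, hypothetical helper) of making the failure visible:
#include <OpenGLES/ES2/gl.h>   // iOS GLES 2 header, matching the question's setup
#include <cstdio>

// Sketch: glGetUniformLocation returns a signed -1 for names that are not
// active uniforms; stored in an unsigned slot it prints as 4294967295.
GLint uniformLocationChecked(GLuint program, const char* name)
{
    GLint location = glGetUniformLocation(program, name);
    if (location == -1) {
        // The name is misspelled, the wrong shader is attached, or the
        // linker removed the uniform because nothing in the program uses it.
        std::fprintf(stderr, "'%s' is not an active uniform\n", name);
    }
    return location;
}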
