Lighting problems after glDisable(GL_CULL_FACE) in OpenGL ES 2.0 - opengl-es

I have been working on a skull model with OpenGL ES 2.0. The skull appeared with some faces missing:
After that I decided to disable GL_CULL_FACE. The problem with the missing faces was solved, but the skull now appears without any lighting, even though I haven't changed the shader code:
I have tried other lighting algorithms and the skull stays black. I can't understand the relationship between glDisable(GL_CULL_FACE) and lighting. The shader code is this:
Vertex Shader:
uniform mediump mat4 MODELVIEWMATRIX;
uniform mediump mat4 PROJECTIONMATRIX;
uniform mediump mat3 NORMALMATRIX;
uniform mediump vec3 LIGHTPOSITION;
varying lowp vec3 lightcolor;
attribute mediump vec3 POSITION;
attribute lowp vec3 NORMAL;
lowp vec3 normal;
attribute mediump vec2 TEXCOORD0;
varying mediump vec2 texcoord0;
void main( void ) {
mediump vec3 position = vec3( MODELVIEWMATRIX * vec4( POSITION, 1.0 ) );
normal = normalize( NORMALMATRIX * NORMAL );
mediump vec3 lightdirection = normalize( LIGHTPOSITION - position );
lowp float ndotl = max( dot( normal, lightdirection ), 0.0 );
lightcolor = ndotl * vec3( 1.0 );
gl_Position = PROJECTIONMATRIX * vec4( position, 1.0 );
texcoord0 = TEXCOORD0;
}
Fragment Shader:
varying mediump vec2 texcoord0;
uniform sampler2D DIFFUSE;
varying lowp vec3 lightcolor;
void main( void ) {
gl_FragColor = texture2D( DIFFUSE, texcoord0 ) * vec4( lightcolor, 1.0 ) + vec4( 0.1);
}
Thank you.
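A note on the culling/lighting relationship: glDisable(GL_CULL_FACE) does not change any lighting math by itself; it only makes back faces visible, and those faces have normals pointing away from the light, so their N·L term is clamped to zero and they render dark (a flipped winding order or inverted normals in the model can make this affect most of the mesh). One way to check this, shown here only as a minimal sketch and not a guaranteed fix, is to move the diffuse term into the fragment shader and flip the interpolated normal for back-facing fragments with gl_FrontFacing; it assumes the vertex shader passes the view-space normal and light direction through hypothetical varyings v_normal and v_lightdir instead of lightcolor:
varying mediump vec2 texcoord0;
varying mediump vec3 v_normal;    // assumed: normalize( NORMALMATRIX * NORMAL ) from the vertex shader
varying mediump vec3 v_lightdir;  // assumed: normalize( LIGHTPOSITION - position ) from the vertex shader
uniform sampler2D DIFFUSE;
void main( void ) {
    mediump vec3 n = normalize( v_normal );
    // With culling disabled, back faces are rasterized too; flip their normal
    // so they are not uniformly dark.
    if ( !gl_FrontFacing ) {
        n = -n;
    }
    lowp float ndotl = max( dot( n, normalize( v_lightdir ) ), 0.0 );
    gl_FragColor = texture2D( DIFFUSE, texcoord0 ) * vec4( vec3( ndotl ), 1.0 ) + vec4( 0.1 );
}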

Related

Material shader smooth gradient between two colors

I have the following shaders used in a custom material shader in three.js and applied to metaballs. The position of each metaball (2 in this example) is passed in an array allPos, and each is compared by distance with the vertex normal to find the closest one. Its index is used to assign a color blobColor, which is then passed to the fragment shader.
vertex shader
uniform vec3 colChoice[2];
varying vec3 vNormal;
varying vec3 blobColor;
varying vec3 otherColor;
uniform vec3 allPos[2];
varying float mixScale;
void main() {
    vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
    vNormal = normalize( normalMatrix * normal );
    float prevdist = 1000000000000000000000000000000.0;
    for (int i = 0; i < 2; i++) {
        float distV = distance(allPos[i], normal.xyz);
        if (distV < prevdist) {
            prevdist = distV;
            mixScale = distV;
            blobColor = colChoice[i];
            otherColor = colChoice[i-1];
        }
    }
    gl_Position = projectionMatrix * mvPosition;
}
fragment shader
"varying vec3 blobColor;",
"varying vec3 otherColor;",
"varying vec3 vNormal;",
"varying float mixScale;",
void main() {
finalColor = (0.45*vNormal) + mix(otherColor,blobColor,mixScale);
gl_FragColor = vec4( finalColor, 1.0 );
}
which gives me something like this:
It is a bit rough for now and I would like to apply a smooth gradient between the two colors. Any suggestions?
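One possible direction, sketched here rather than offered as a definitive answer: mixScale currently carries a raw distance, so mix() receives values far outside [0, 1]; remapping it with smoothstep over an assumed blend radius produces a smooth transition between the two colors. blendRadius is a made-up tuning parameter, and the mix order may need to be swapped depending on which color should dominate near the closest blob:
varying vec3 blobColor;
varying vec3 otherColor;
varying vec3 vNormal;
varying float mixScale;
void main() {
    float blendRadius = 2.0;                              // hypothetical falloff distance, tune to the scene scale
    float t = smoothstep( 0.0, blendRadius, mixScale );   // 0 near the closest blob, 1 beyond blendRadius
    vec3 finalColor = (0.45 * vNormal) + mix( blobColor, otherColor, t );
    gl_FragColor = vec4( finalColor, 1.0 );
}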

How to add lighting to an instancing shader

How can I add lighting (ambient + directional) to a shader that is used with InstancedBufferGeometry?
For example, I want to add lighting to this:
https://threejs.org/examples/?q=inst#webgl_buffergeometry_instancing_dynamic
Here is my vertex shader:
precision highp float;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
uniform mat3 normalMatrix;
attribute vec3 position;
attribute vec3 offset;
attribute vec3 normal;
attribute vec2 uv;
attribute vec4 orientation;
varying vec2 vUv;
// lighting
struct DirectionalLight {
vec3 direction;
vec3 color;
int shadow;
float shadowBias;
float shadowRadius;
vec2 shadowMapSize;
};
uniform DirectionalLight directionalLights[ NUM_DIR_LIGHTS ];
uniform vec3 ambientLightColor;
varying vec3 vLightFactor;
//
void main() {
vec3 vPosition = position;
vec3 vcV = cross(orientation.xyz, vPosition);
vPosition = vcV * (2.0 * orientation.w) + (cross(orientation.xyz, vcV) * 2.0 + vPosition);
vUv = uv;
gl_Position = projectionMatrix * modelViewMatrix * vec4( offset + vPosition, 1.0 );
// lighting
vec4 ecPosition = modelViewMatrix*vec4(offset + vPosition,1.0);
vec3 ecNormal= -(normalize(normalMatrix*normal));
vec3 fromLight = normalize(directionalLights[0].direction);
vec3 toLight = -fromLight;
vec3 reflectLight = reflect(toLight,ecNormal);
vec3 viewDir = normalize(-ecPosition.xyz);
float ndots = dot(ecNormal, toLight);
float vdotr = max(0.0,dot(viewDir,reflectLight));
vec3 ambi = ambientLightColor;
vec3 diff = directionalLights[0].color * ndots;
vLightFactor = ambi + diff;
//
}
Here is my fragment shader:
precision highp float;
uniform sampler2D map;
varying vec2 vUv;
// lighting
varying vec3 vLightFactor;
//
void main() {
gl_FragColor = texture2D(map, vUv) * vec4(vLightFactor,1.0);
}
Here is my material:
var uniforms = Object.assign(
THREE.UniformsLib['lights'],
{
map: { value: texture }
}
);
var material = new THREE.RawShaderMaterial({
lights: true,
uniforms: uniforms,
vertexShader: document.getElementById( 'vertexShader' ).textContent,
fragmentShader: document.getElementById( 'fragmentShader' ).textContent,
});
Thanks
When rendering, each mesh of the scene is usually transformed by the model matrix, the view matrix and the projection matrix.
The model matrix defines the location, orientation and relative size of a mesh in the scene. It transforms the vertex positions of the mesh from model space to world space.
The view matrix describes the direction and position from which the scene is looked at; it transforms from world space to view (eye) space.
Note that the model view matrix modelViewMatrix is the combination of the view matrix and the model matrix. In your case, however, the model matrix is probably the identity matrix, so modelViewMatrix is probably equal to the view matrix. I assume that because you do not apply the model transformation through a model matrix, but through the orientation and offset vectors.
The light can either be calculated in view space or in world space.
If the light is calculated in view space, the light positions and light directions have to be transformed from world space to view space. Commonly this is done on the CPU (before every frame), and the light uniforms are set up with view-space coordinates. Since the view position is (0, 0, 0) in view space, the view vector is the normalized and negated vertex position (in view space).
If the light calculations are done in world space, then the view vector has to be calculated as the difference between the view position (eye position) and the vertex position (both in world space).
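As a sketch only (these uniforms are not part of the code in this question): with a hypothetical wLightPosition uniform holding the light position in world space, and the eye position in world space (three.js supplies a cameraPosition uniform for ShaderMaterial; with RawShaderMaterial it has to be declared and set manually), the world-space variant would look roughly like this:
uniform vec3 cameraPosition;    // world-space eye position (assumed to be provided)
uniform vec3 wLightPosition;    // hypothetical: light position in world space
// ... with wPosition and wNormal computed as in the snippets below ...
vec3 wViewDir = normalize( cameraPosition - wPosition );
vec3 wToLight = normalize( wLightPosition - wPosition );
float NdotL   = max( 0.0, dot( wNormal, wToLight ) );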
You can do the light calculations in view space, because the light direction and position are set up in view space (see three.js - Light). You have to transform the normal vector to world space first and then convert it from world space to view space. This is done similarly to the vertex position: add the normal vector to the vertex position, transform this position to world space, and the normal vector in world space is then the difference between the transformed position and the vertex position in world space.
vec3 wNPosition = position + normal;
vec3 wNV = cross(orientation.xyz, wNPosition);
wNPosition = wNV * 2.0 * orientation.w + cross(orientation.xyz, wNV) * 2.0 + wNPosition;
vec3 wNormal = normalize( wNPosition - vPosition );
Under this assumption, your shader code might look like this:
vec3 wPosition = position;
vec3 wV = cross(orientation.xyz, wPosition);
wPosition = offset + wV * 2.0 * orientation.w + cross(orientation.xyz, wV) * 2.0 + wPosition;
vec4 ecPosition = modelViewMatrix * vec4(wPosition, 1.0);
vUv = uv;
gl_Position = projectionMatrix * ecPosition;
// transform normal vector to world space
vec3 wNPosition = position + normal;
vec3 wNV = cross(orientation.xyz, wNPosition);
wNPosition = offset + wNV * 2.0 * orientation.w + cross(orientation.xyz, wNV) * 2.0 + wNPosition;
vec3 ecNormal = normalize(mat3(modelViewMatrix) * (wNPosition - wPosition));
// ambient light
vLightFactor = ambientLightColor;
// diffuse light
vec3 ecToLight = normalize(directionalLights[0].direction);
float NdotL = max(0.0, dot(ecNormal, ecToLight));
vLightFactor += NdotL * directionalLights[0].color;
If you want to add specular light you have to do it like this:
// specular light
vec3 ecFromLight = -ecToLight;
vec3 ecReflectLight = reflect( ecFromLight, ecNormal );
vec3 ecViewDir = normalize(-ecPosition.xyz);
float VdotR = max(0.0, dot(ecViewDir, ecReflectLight));
float kSpecular = 4.0 * pow( VdotR, 0.3 * shininess ); // <--- set up shininess parameter
vLightFactor += kSpecular * directionalLights[0].color;
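As an aside (a sketch, not part of the original code): the repeated cross-product pattern used for the position and the normal above is the standard quaternion rotation v' = v + 2w(q×v) + 2(q×(q×v)); factoring it into a helper can make those transforms easier to read:
// Rotate a vector by a unit quaternion stored as vec4(x, y, z, w).
vec3 quatRotate( vec4 q, vec3 v )
{
    vec3 t = cross( q.xyz, v );
    return v + t * 2.0 * q.w + cross( q.xyz, t ) * 2.0;
}
// usage:
//   vec3 wPosition  = offset + quatRotate( orientation, position );
//   vec3 wNPosition = offset + quatRotate( orientation, position + normal );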
Extension to the answer: per-fragment lighting
While Gouraud shading calculates the light in the vertex shader, Phong shading calculates the light in the fragment shader.
(see further GLSL fixed function fragment program replacement)
Vertex shader:
precision highp float;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
uniform mat3 normalMatrix;
attribute vec3 position;
attribute vec3 offset;
attribute vec3 normal;
attribute vec2 uv;
attribute vec4 orientation;
varying vec2 vUv;
varying vec3 ecPosition;
varying vec3 ecNormal;
void main()
{
vec3 wPosition = position;
vec3 wV = cross(orientation.xyz, wPosition);
wPosition = offset + wV * 2.0 * orientation.w + cross(orientation.xyz, wV) * 2.0 + wPosition;
vec4 vPos = modelViewMatrix * vec4(wPosition, 1.0);
ecPosition = vPos.xyz;
vUv = uv;
gl_Position = projectionMatrix * vPos;
// transform normal vector to world space
vec3 wNPosition = position + normal;
vec3 wNV = cross(orientation.xyz, wNPosition);
wNPosition = offset + wNV * 2.0 * orientation.w + cross(orientation.xyz, wNV) * 2.0 + wNPosition;
ecNormal = normalize(mat3(modelViewMatrix) * (wNPosition - wPosition));
}
Fragment shader:
precision highp float;
varying vec2 vUv;
varying vec3 ecPosition;
varying vec3 ecNormal;
uniform sampler2D map;
uniform mat4 modelViewMatrix;
struct DirectionalLight {
vec3 direction;
vec3 color;
int shadow;
float shadowBias;
float shadowRadius;
vec2 shadowMapSize;
};
uniform DirectionalLight directionalLights[ NUM_DIR_LIGHTS ];
uniform vec3 ambientLightColor;
void main()
{
// ambient light
vec3 lightFactor = ambientLightColor;
// diffuse light
vec3 ecToLight = normalize(directionalLights[0].direction);
float NdotL = max(0.0, dot(ecNormal, ecToLight));
lightFactor += NdotL * directionalLights[0].color;
// specular light
vec3 ecFromLight = -ecToLight;
vec3 ecReflectLight = reflect( ecFromLight, ecNormal );
vec3 ecViewDir = normalize(-ecPosition.xyz);
float VdotR = max(0.0, dot(ecViewDir, ecReflectLight));
float kSpecular = 4.0 * pow( VdotR, 0.3 * shininess ); // <--- set up shininess parameter
lightFactor += kSpecular * directionalLights[0].color;
gl_FragColor = texture2D(map, vUv) * vec4(vec3(lightFactor), 1.0);
}
see also
Transform the modelMatrix
How does this faking the light work on aerotwist?
GLSL fixed function fragment program replacement
What I mean is that the fragment shader should look like this:
precision highp float;
varying vec2 vUv;
varying vec3 ecPosition;
varying vec3 ecNormal;
uniform sampler2D map;
uniform mat4 modelViewMatrix;
struct DirectionalLight {
vec3 direction;
vec3 color;
int shadow;
float shadowBias;
float shadowRadius;
vec2 shadowMapSize;
};
uniform DirectionalLight directionalLights[ NUM_DIR_LIGHTS ];
uniform vec3 ambientLightColor;
void main()
{
// ambient light
vec3 lightFactor = ambientLightColor;
// diffuse light
vec3 ecFromLight = normalize(directionalLights[0].direction);
//vec3 ecToLight = -ecFromLight;
float NdotL = max(0.0, dot(ecNormal, ecFromLight));
lightFactor += NdotL * directionalLights[0].color;
// specular light
/*
float shininess = 10.01;
vec3 ecReflectLight = reflect( ecFromLight, ecNormal );
vec3 ecViewDir = normalize(-ecPosition.xyz);
float VdotR = max(0.0, dot(ecViewDir, ecReflectLight));
float kSpecular = 4.0 * pow( VdotR, 0.3 * shininess ); // <--- set up shininess parameter
lightFactor += kSpecular * directionalLights[0].color;
*/
gl_FragColor = texture2D(map, vUv) * vec4(lightFactor, 1.0);
}

How to make a uniform variable visible in both shaders?

I have a variable "uIsTeapot" in the vertex shader.
uniform float uIsTeapot;
The vertex shader works with it very well, but the fragment shader doesn't see it. If I use
if (uIsTeapot == 0.0)
then an error occurs:
"uIsTeapot": undeclared identifier
but I defined it in the vertex shader. If I define uIsTeapot as a uniform in both shaders, then the program says "Could not initialise shaders", since it doesn't pass this verification:
if (!gl.getProgramParameter(shaderProgram, gl.LINK_STATUS)) {
alert("Could not initialise shaders");
}
EDITED:
I added mediump to the variables and now the program compiles without errors, but the result is one object on the screen, even though I draw two objects.
Vertex shader
attribute vec3 aVertexPosition;
attribute vec3 aVertexNormal;
attribute vec2 aTextureCoord;
attribute vec4 aVertexColor;
uniform mat4 uMVMatrix;
uniform mat4 uPMatrix;
uniform mat3 uNMatrix;
uniform vec3 uAmbientColor;
uniform vec3 uPointLightingLocation;
uniform vec3 uPointLightingColor;
uniform mediump float uIsTeapot;
//varying float vIsTeapot;
varying vec2 vTextureCoord;
varying vec3 vLightWeighting;
varying vec4 vColor;
void main(void) {
    if (uIsTeapot == 1.0) {
        vec4 mvPosition = uMVMatrix * vec4(aVertexPosition, 1.0);
        gl_Position = uPMatrix * mvPosition;
        vec3 lightDirection = normalize(uPointLightingLocation - mvPosition.xyz);
        vec3 transformedNormal = uNMatrix * aVertexNormal;
        float directionalLightWeighting = max(dot(transformedNormal, lightDirection), 0.0);
        directionalLightWeighting = 100.0;
        vLightWeighting = uAmbientColor + uPointLightingColor * directionalLightWeighting;
        vTextureCoord = aTextureCoord;
    }
    if (uIsTeapot == 0.0) {
        gl_Position = uPMatrix * uMVMatrix * vec4(aVertexPosition, 1.0);
        vColor = aVertexColor;
    }
}
Fragment shader
precision mediump float;
varying vec2 vTextureCoord;
varying vec3 vLightWeighting;
varying vec4 vColor;
uniform sampler2D uSampler;
uniform mediump float uIsTeapot;
//varying float vIsTeapot;
void main(void) {
    if (uIsTeapot == 1.0) {
        vec4 textureColor = texture2D(uSampler, vec2(vTextureCoord.s, vTextureCoord.t));
        gl_FragColor = vec4(textureColor.rgb * vLightWeighting, textureColor.a);
    } else {
        gl_FragColor = vColor;
    }
}
You need to declare it in both shaders, with the same precision.
What do you mean by:
then program don't work right.
Can you post your vertex shader and your fragment shader? I suspect you are relying on the default precision, and it is likely that you are using highp in the vertex shader and mediump in the fragment shader.
Try using:
uniform mediump float uIsTeapot;
... in both vertex shader and fragment shader.
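As an optional, defensive variant (an assumption on my part, not something your code strictly requires): once the declaration matches in both shaders, you can also avoid exact float equality and branch on a threshold instead, which is generally more robust:
// identical declaration in both the vertex and the fragment shader
uniform mediump float uIsTeapot;
// ...
if (uIsTeapot > 0.5) {
    // teapot path
} else {
    // non-teapot path
}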

Can't get OpenGL code to work properly

I'm very new to OpenGL, GLSL and WebGL. I'm trying to get this sample code to work in a tool like http://shdr.bkcore.com/ but I can't get it to work.
Vertex Shader:
void main()
{
gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
gl_TexCoord[0] = gl_MultiTexCoord0;
}
Fragment Shader:
precision highp float;
uniform float time;
uniform vec2 resolution;
varying vec3 fPosition;
varying vec3 fNormal;
uniform sampler2D tex0;
void main()
{
    float border = 0.01;
    float circle_radius = 0.5;
    vec4 circle_color = vec4(1.0, 1.0, 1.0, 1.0);
    vec2 circle_center = vec2(0.5, 0.5);
    vec2 uv = gl_TexCoord[0].xy;
    vec4 bkg_color = texture2D(tex0, uv * vec2(1.0, -1.0));
    // Offset uv with the center of the circle.
    uv -= circle_center;
    float dist = sqrt(dot(uv, uv));
    if ( (dist > (circle_radius+border)) || (dist < (circle_radius-border)) )
        gl_FragColor = bkg_color;
    else
        gl_FragColor = circle_color;
}
I figured that this code must be from an outdated version of the language, so I changed the vertex shader to:
precision highp float;
attribute vec2 position;
attribute vec3 normal;
varying vec2 TextCoord;
attribute vec2 textCoord;
uniform mat3 normalMatrix;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
varying vec3 fNormal;
varying vec3 fPosition;
void main()
{
gl_Position = vec4(position, 0.0, 1.0);
TextCoord = vec2(textCoord);
}
That seemed to fix the error messages about undeclared identifiers and not being able to "convert from 'float' to highp 4-component something-or-other", but I have no idea whether, functionally, this does the same thing the original intended.
Also, when I convert to this version of the Vertex Shader I have no idea what I'm supposed to do with this line in the Fragment Shader:
vec2 uv = gl_TexCoord[0].xy;
How do I convert this line to fit in with the converted vertex shader and how can I be sure that the vertex shader is even converted correctly?
gl_TexCoord is from desktop OpenGL, and not part of OpenGL ES. You'll need to create a new user-defined vec2 varying to hold the coordinate value.
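In the converted vertex shader above, the TextCoord varying already plays that role, so the fragment shader only needs to declare the same varying and read it in place of gl_TexCoord; a minimal sketch of the relevant lines:
// fragment shader
varying vec2 TextCoord;   // must match the name and type declared in the vertex shader
// ...
vec2 uv = TextCoord;      // replaces gl_TexCoord[0].xy
vec4 bkg_color = texture2D(tex0, uv * vec2(1.0, -1.0));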

Texture atlas tiling

I'm trying simple texture splatting on iOS OpenGL ES 2.0 (iPad). I have 4 tiled textures in a PVRTC-compressed atlas (2x2 tiles). Using 4 single textures on 4 texture units was terribly slow.
vertex shader:
attribute lowp vec4 position;
attribute lowp vec2 tex0;
varying lowp vec2 surfCoord;
uniform mat4 projection_modelview;
uniform lowp float uv_coef;
varying lowp vec2 texCoord1;
varying lowp vec2 texCoord2;
varying lowp vec2 texCoord3;
varying lowp vec2 texCoord4;
void main()
{
gl_Position = projection_modelview * position;
vec2 texCoord = fract(vec2(position.x / uv_coef, position.y / uv_coef));
texCoord1 = texCoord * 0.5;
texCoord2 = texCoord1 + vec2(0.5, 0);
texCoord3 = texCoord1 + vec2(0, 0.5);
texCoord4 = texCoord1 + vec2(0.5, 0.5);
surfCoord = tex0;
}
fragment shader:
uniform sampler2D texture0; // surface alpha map
uniform sampler2D texture1; // atlas
varying lowp vec2 surfCoord;
varying lowp vec2 texCoord1;
varying lowp vec2 texCoord2;
varying lowp vec2 texCoord3;
varying lowp vec2 texCoord4;
void main()
{
lowp vec4 surfTexel = texture2D(texture0, surfCoord);
lowp vec4 texel1 = texture2D(texture1, texCoord1);
lowp vec4 texel2 = texture2D(texture1, texCoord2);
lowp vec4 texel3 = texture2D(texture1, texCoord3);
lowp vec4 texel4 = texture2D(texture1, texCoord4);
texel1 *= surfTexel.r;
texel2 = mix(texel1, texel2, surfTexel.g);
texel3 = mix(texel2, texel3, surfTexel.b);
gl_FragColor = mix(texel3, texel4, surfTexel.a);
}
It shows this:
(source: inputwish.com)
My problem is probably the texture coordinate interpolators, but I don't know how to resolve it. I don't see a mistake in my shaders. Any advice, please?
The mistake is using fract in the vertex shader; it's too early. It should be done in the fragment shader:
lowp vec4 texel1 = texture2D(texture1, fract(texCoord) * 0.5);
lowp vec4 texel2 = texture2D(texture1, fract(texCoord) * 0.5 + vec2(0.5, 0.0));
lowp vec4 texel3 = texture2D(texture1, fract(texCoord) * 0.5 + vec2(0.0, 0.5));
lowp vec4 texel4 = texture2D(texture1, fract(texCoord) * 0.5 + vec2(0.5, 0.5));
... and mix them together.
Anyway, it's slow because of dependent texture reads on the iPad (iOS).
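For completeness, a sketch of the plumbing that change implies (the precision qualifiers are my assumption): texCoord becomes a varying that carries the unwrapped coordinate from the vertex shader, and the wrapping and atlas offsets happen per fragment:
// vertex shader: no fract() here, just pass the unwrapped coordinate through
varying mediump vec2 texCoord;
// ...
texCoord = vec2(position.x / uv_coef, position.y / uv_coef);

// fragment shader: declare the same varying, wrap per fragment, then offset into the 2x2 atlas
varying mediump vec2 texCoord;
// ...
lowp vec2 wrapped = fract(texCoord) * 0.5;
lowp vec4 texel1 = texture2D(texture1, wrapped);
lowp vec4 texel2 = texture2D(texture1, wrapped + vec2(0.5, 0.0));
lowp vec4 texel3 = texture2D(texture1, wrapped + vec2(0.0, 0.5));
lowp vec4 texel4 = texture2D(texture1, wrapped + vec2(0.5, 0.5));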

Resources