three.js: access canvas size in custom shader

Do the built-in uniforms for the vertex shader in three.js, namely:
uniform mat4 modelMatrix;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform mat3 normalMatrix;
uniform vec3 cameraPosition;
contain information about the window size, or at least the aspect ratio?
For now I'm passing the size as an additional vec2 uniform, and I'm just wondering if it's redundant.
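None of these uniforms carry the canvas size in pixels, so a resolution uniform is not redundant when pixel dimensions are needed. For a perspective camera, though, the aspect ratio can be recovered from projectionMatrix, since its first two diagonal entries are f/aspect and f. A minimal sketch in plain JavaScript, assuming a standard perspective matrix laid out column-major as in THREE.Matrix4.elements; makePerspective and aspectFromProjection are illustrative helpers here, not three.js API:

```javascript
// Sketch: for a standard perspective projection matrix, element [0]
// is f / aspect and element [5] is f (column-major layout, as in
// THREE.Matrix4.elements), so aspect = elements[5] / elements[0].
function makePerspective(fovDegrees, aspect, near, far) {
  const f = 1.0 / Math.tan((fovDegrees * Math.PI) / 360);
  // column-major 4x4
  return [
    f / aspect, 0, 0, 0,
    0, f, 0, 0,
    0, 0, (far + near) / (near - far), -1,
    0, 0, (2 * far * near) / (near - far), 0,
  ];
}

function aspectFromProjection(elements) {
  return elements[5] / elements[0];
}

const proj = makePerspective(45, 16 / 9, 0.1, 1000);
console.log(aspectFromProjection(proj)); // ≈ 1.7778
```

In GLSL the same value would be projectionMatrix[1][1] / projectionMatrix[0][0]; but this only yields the aspect ratio, not the size, so a resolution uniform remains the usual approach for pixel-space effects.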

Related

How to change transparency of texture in shader

I want to create a shader that fades a texture's alpha from invisible to visible.
I've seen the method of changing gl_FragColor.a, but it doesn't work for me: whatever value I set there, the whole screen is always black.
precision mediump float;
uniform vec4 vColor;
uniform sampler2D uTexture;
varying vec2 vTexCoordinate;
uniform float alphaMod;
void main() {
    // Sample the texture, then override its alpha with the fade value.
    gl_FragColor = texture2D(uTexture, vTexCoordinate);
    gl_FragColor.a = alphaMod;
}
With a small alphaMod the object should be barely visible, but for now it is completely invisible.
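One thing worth checking: writing gl_FragColor.a has no visible effect unless alpha blending is enabled on the pipeline. In three.js that means setting transparent: true on the material; in raw OpenGL (ES) it means glEnable(GL_BLEND) plus glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA). The blend equation itself can be sketched in plain JavaScript; blend is an illustrative helper, not a GL call:

```javascript
// Sketch of the standard alpha-blend equation
// (SRC_ALPHA, ONE_MINUS_SRC_ALPHA). With blending disabled, the source
// color simply replaces the destination and the alpha written by the
// fragment shader is ignored.
function blend(src, dst) {
  const a = src.a;
  return {
    r: src.r * a + dst.r * (1 - a),
    g: src.g * a + dst.g * (1 - a),
    b: src.b * a + dst.b * (1 - a),
    a: a + dst.a * (1 - a),
  };
}

// A nearly transparent white fragment over an opaque black background
// should come out barely visible, not fully opaque or fully black.
const out = blend({ r: 1, g: 1, b: 1, a: 0.1 }, { r: 0, g: 0, b: 0, a: 1 });
console.log(out.r); // 0.1
```

Also make sure the objects with alpha are drawn after the opaque ones; blending reads whatever is already in the framebuffer.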

Three.js THREE.InstancedBufferGeometry multiple sampler2D vertical tearings

I have created a RawShaderMaterial for an InstancedBufferGeometry object that renders fine with one sampler2D uniform. As soon as it uses a second sampler2D uniform, it renders with a lot of vertical tearing.
Here is the fragment shader:
precision highp float;
uniform sampler2D utexture1;
uniform sampler2D utexture2;
varying float vindex;
varying vec2 vUv;
varying vec3 mapcolor;
vec4 gettexture() {
    vec4 color;
    if (vindex == 0.0) {
        color = texture2D(utexture1, vUv) * vec4(mapcolor, 1.0);
    } else if (vindex == 1.0) {
        color = texture2D(utexture1, vUv) * vec4(mapcolor, 1.0);
    } else if (vindex == 2.0) {
        color = texture2D(utexture2, vUv) * vec4(mapcolor, 1.0);
    } else if (vindex == 3.0) {
        color = texture2D(utexture2, vUv) * vec4(mapcolor, 1.0);
    }
    return color;
}
void main() {
    gl_FragColor = gettexture();
}
Note: the two textures used for the sampler2D uniforms have the same size (512x512), and both are loaded before the material is created.
Does anyone know where this vertical tearing comes from?
Thank you in advance for your help!
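One frequent cause of artifacts in shaders like this: vindex is an interpolated varying, and exact float comparisons such as vindex == 1. can fail where interpolation leaves the value fractionally off, so color stays uninitialized for those fragments. Comparing against midpoint thresholds is more robust. (Sampling textures inside non-uniform control flow can also cause artifacts, because mip-level derivatives are undefined there.) A plain-JavaScript sketch of the selection logic; pickByEquality and pickByThreshold are illustrative helpers, not shader code:

```javascript
// Sketch: selecting a texture by an interpolated float index.
// Exact equality breaks as soon as interpolation perturbs the value;
// a midpoint threshold does not.
function pickByEquality(vindex) {
  if (vindex === 0 || vindex === 1) return "utexture1";
  if (vindex === 2 || vindex === 3) return "utexture2";
  return undefined; // like the shader's uninitialized `color`
}

function pickByThreshold(vindex) {
  return vindex < 1.5 ? "utexture1" : "utexture2";
}

const interpolated = 1.0000001; // slightly off after interpolation
console.log(pickByEquality(interpolated)); // undefined
console.log(pickByThreshold(interpolated)); // "utexture1"
```

In the shader itself this would mean replacing the equality chain with range checks, e.g. if (vindex < 1.5) sample utexture1, else sample utexture2.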

Sampler2d and samplerCube arrays in the same shader

I am trying to build a shader (in GLSL 1.0) that uses an array of sampler2D textures and an array of samplerCube textures. Strangely, I can't link it (and I can't get an error message) if I alternate the usage of the two texture types. By trial and error, I located the code that causes the issue:
This fails to compile:
varying highp vec2 vTextureCoord;
varying highp vec2 vTextureCoord2;
varying lowp vec3 vLighting;
varying lowp vec4 vColor;
varying lowp vec3 vNormal;
#ifdef Textures
uniform sampler2D uSampler[Textures];
uniform lowp vec2 texFlag[Textures];
uniform lowp mat3 texMat[Textures];
uniform lowp float tex_coord_set[Textures];
uniform samplerCube uSamplerC[Textures];
#endif
uniform lowp vec3 fogColor;
varying lowp float fogBlend;
void main(void) {
#ifdef Textures
    mediump vec4 pixColor = vColor;
    mediump vec3 SpecNormal = reflect(vec3(0.0, 0.0, 1.0), vNormal);
    mediump vec4 texelColor;
    pixColor += textureCube(uSamplerC[0], vNormal);
    pixColor += texture2D(uSampler[0], vec2(0.0, 0.0));
    pixColor += textureCube(uSamplerC[1], vNormal);
    pixColor += texture2D(uSampler[1], vec2(0.0, 0.0));
#else
    mediump vec4 pixColor = vColor;
#endif
But if I put:
pixColor += textureCube(uSamplerC[0], vNormal);
pixColor += textureCube(uSamplerC[1], vNormal);
pixColor += texture2D(uSampler[0], vec2(0.0,0.0));
pixColor += texture2D(uSampler[1], vec2(0.0,0.0));
(putting all samplers of one type first) it seems to link perfectly. Why?
(Note: this code, in its current form, is pointless; I used it only to expose the issue.)
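It is hard to say why the interleaved version fails without the driver's own diagnostics; when a program fails to link silently, the first step is usually to query LINK_STATUS and print the info log, which often names the offending sampler. A sketch, with checkLink as an illustrative helper and a mock object standing in for a real WebGLRenderingContext:

```javascript
// Sketch: surface silent link failures by querying LINK_STATUS and
// the program info log. `gl` would normally come from
// canvas.getContext("webgl"); the mock below only demonstrates the
// failure path.
function checkLink(gl, program) {
  if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
    throw new Error("Link failed: " + gl.getProgramInfoLog(program));
  }
  return program;
}

const mockGl = {
  LINK_STATUS: 0x8b82, // the real GL enum value
  getProgramParameter: () => false,
  getProgramInfoLog: () => "sampler type mismatch",
};

try {
  checkLink(mockGl, {});
} catch (e) {
  console.log(e.message); // "Link failed: sampler type mismatch"
}
```

With a real context, the same checkLink call after gl.linkProgram would print whatever the driver reports about the interleaved sampler usage.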

Can't get OpenGL code to work properly

I'm very new to OpenGL, GLSL, and WebGL. I'm trying to get this sample code running in a tool like http://shdr.bkcore.com/, but without success.
Vertex Shader:
void main()
{
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
    gl_TexCoord[0] = gl_MultiTexCoord0;
}
Fragment Shader:
precision highp float;
uniform float time;
uniform vec2 resolution;
varying vec3 fPosition;
varying vec3 fNormal;
uniform sampler2D tex0;
void main()
{
    float border = 0.01;
    float circle_radius = 0.5;
    vec4 circle_color = vec4(1.0, 1.0, 1.0, 1.0);
    vec2 circle_center = vec2(0.5, 0.5);
    vec2 uv = gl_TexCoord[0].xy;
    vec4 bkg_color = texture2D(tex0, uv * vec2(1.0, -1.0));
    // Offset uv with the center of the circle.
    uv -= circle_center;
    float dist = sqrt(dot(uv, uv));
    if ((dist > (circle_radius + border)) || (dist < (circle_radius - border)))
        gl_FragColor = bkg_color;
    else
        gl_FragColor = circle_color;
}
I figured that this code must be from an outdated version of the language, so I changed the vertex shader to:
precision highp float;
attribute vec2 position;
attribute vec3 normal;
varying vec2 TextCoord;
attribute vec2 textCoord;
uniform mat3 normalMatrix;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
varying vec3 fNormal;
varying vec3 fPosition;
void main()
{
    gl_Position = vec4(position, 0.0, 1.0);
    TextCoord = vec2(textCoord);
}
That seemed to fix the error messages about undeclared identifiers and not being able to "convert from 'float' to highp 4-component something-or-other", but I have no idea whether, functionally, this does the same thing as the original intended.
Also, when I convert to this version of the Vertex Shader I have no idea what I'm supposed to do with this line in the Fragment Shader:
vec2 uv = gl_TexCoord[0].xy;
How do I convert this line to fit in with the converted vertex shader and how can I be sure that the vertex shader is even converted correctly?
gl_TexCoord is from desktop OpenGL, and not part of OpenGL ES. You'll need to create a new user-defined vec2 varying to hold the coordinate value.
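Concretely, the conversion the answer describes could look something like the following, with the shaders embedded as JavaScript strings the way WebGL code typically carries them. The attribute and varying names (position, texCoord, vTexCoord) are arbitrary choices, and the matrix uniforms are assumed to be supplied by the host code:

```javascript
// Sketch: the desktop-GL builtins gl_MultiTexCoord0 / gl_TexCoord[0]
// become a user-defined attribute plus a varying that both stages share.
const vertexShaderSource = `
attribute vec4 position;
attribute vec2 texCoord;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
varying vec2 vTexCoord;
void main() {
    gl_Position = projectionMatrix * modelViewMatrix * position;
    vTexCoord = texCoord; // replaces gl_TexCoord[0] = gl_MultiTexCoord0
}
`;

const fragmentShaderSource = `
precision highp float;
uniform sampler2D tex0;
varying vec2 vTexCoord;
void main() {
    vec2 uv = vTexCoord; // replaces vec2 uv = gl_TexCoord[0].xy
    gl_FragColor = texture2D(tex0, uv * vec2(1.0, -1.0));
}
`;
```

Note that the original vertex shader also applied the model-view-projection transform; keeping projectionMatrix * modelViewMatrix * position (rather than passing the position through untransformed) preserves that behavior.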

Compute normals in shader issue

I have the following vertex shader to rotate normals. Before I implemented it, I also passed the mesh's rotation matrix to calculate the normals, and at that time the lighting was fine.
#version 150
uniform mat4 projection;
uniform mat4 modelview;
in vec3 position;
in vec3 normal;
in vec2 texcoord;
out vec3 fposition;
out vec3 fnormal;
out vec2 ftexcoord;
void main()
{
    mat4 mvp = projection * modelview;
    fposition = vec3(mvp * vec4(position, 1.0));
    fnormal = normalize(mat3(transpose(inverse(modelview))) * normal);
    ftexcoord = texcoord;
    gl_Position = mvp * vec4(position, 1.0);
}
But with this shader, the lighting computed in the fragment shader turns with the camera. I haven't changed the fragment shader, so the issue should be in the code above.
What am I doing wrong in computing the normals?
The steps you use to create the normal matrix might be out of order.
Try:
fnormal = normalize(transpose(inverse(mat3(modelview))) * normal);
Edit:
Since you are inverting the mat4, the translation values (which get truncated when a mat4 is converted to a mat3) are probably affecting the calculation of the inverse matrix.
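The suggested ordering can be sanity-checked numerically: the normal matrix is the inverse transpose of the upper-left 3x3 of the modelview, and for a pure rotation it equals the rotation itself (since a rotation's inverse is its transpose). A sketch in plain JavaScript with illustrative 3x3 helpers on row-major arrays; transpose3 and inverse3 are not from any library:

```javascript
// Sketch: normalMatrix = transpose(inverse(mat3(modelview))).
// Taking mat3() innermost drops the translation column *before*
// inverting, which is the point of the suggested ordering.
function transpose3(m) {
  return [m[0], m[3], m[6], m[1], m[4], m[7], m[2], m[5], m[8]];
}

function inverse3(m) {
  const [a, b, c, d, e, f, g, h, i] = m;
  const det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g);
  return [
    (e * i - f * h) / det, (c * h - b * i) / det, (b * f - c * e) / det,
    (f * g - d * i) / det, (a * i - c * g) / det, (c * d - a * f) / det,
    (d * h - e * g) / det, (b * g - a * h) / det, (a * e - b * d) / det,
  ];
}

// A pure rotation about Z: the normal matrix should equal R itself.
const t = 0.7;
const R = [
  Math.cos(t), -Math.sin(t), 0,
  Math.sin(t), Math.cos(t), 0,
  0, 0, 1,
];
const normalMatrix = transpose3(inverse3(R));
```

For a modelview with non-uniform scale the two orderings genuinely differ, which is why inverting the full mat4 (translation included) and truncating afterwards can corrupt the result.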
