How to upgrade an older demo to the latest version of three.js? - three.js

I have a WebGL demo running with an old version of Three.js (r52).
I want to adapt the code to a new version of Three.js (r1xx).
But I get a black screen after switching to the new version of the Three.js library and upgrading some of the API usage (e.g. moving the custom attribute values to BufferGeometry).
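For reference, a minimal sketch of how the custom attributes can be moved onto a BufferGeometry in newer releases; the attribute names match the shader below, while the data arrays and the shaderMaterial variable are placeholders rather than the actual demo code:

// Sketch: custom attributes now live on the BufferGeometry itself
// (geometry.setAttribute in recent releases, addAttribute in slightly older ones).
// positions, sizes, colors and times are flat placeholder arrays.
const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(new Float32Array(positions), 3));
geometry.setAttribute('size', new THREE.BufferAttribute(new Float32Array(sizes), 1));
geometry.setAttribute('customColor', new THREE.BufferAttribute(new Float32Array(colors), 3));
geometry.setAttribute('time', new THREE.BufferAttribute(new Float32Array(times), 1));
const points = new THREE.Points(geometry, shaderMaterial);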
This is the shader code passed to ShaderMaterial:
<script type="x-shader/x-vertex" id="vertexshader">
attribute float size;
attribute vec3 customColor;
attribute float time;
uniform float globalTime;
varying vec3 vColor;
varying float fAlpha;
void main() {
vColor = customColor;
vec3 pos = position;
float animTime = min(1.4, max(1.0, globalTime - time));
vec3 animated = vec3( pos.x * animTime, pos.y * animTime, pos.z * animTime );
vec4 mvPosition = modelViewMatrix * vec4( animated, 1.0 );
fAlpha = 1.0 - (globalTime*0.5);
gl_PointSize = size * ( 300.0 / length( mvPosition.xyz ) );
gl_Position = projectionMatrix * mvPosition;
}
</script>
<script type="x-shader/x-fragment" id="fragmentshader">
uniform vec3 color;
uniform sampler2D texture;
varying vec3 vColor;
varying float fAlpha;
void main() {
// fog
float depth = gl_FragCoord.z / gl_FragCoord.w;
float near = 30.0;
float far = 290.0;
float fog = 0.0 + smoothstep( near, far, depth );
vec4 outColor = texture2D( texture, gl_PointCoord );
if ( outColor.a < 0.25 ) discard; // alpha be gone
gl_FragColor = vec4( color * vColor, fAlpha );
gl_FragColor = gl_FragColor * outColor;
gl_FragColor = mix( gl_FragColor, vec4( vec3(0.0,0.0,0.0), gl_FragColor.w ), fog );
}
</script>
It seems that the ShaderMaterial is not working correctly.
Does anyone know what I might have missed?
The working old version:
https://loooog.github.io/globe/legacy
The broken new version:
https://loooog.github.io/globe/
The code can be inspected in the Chrome dev console.
Update: the ShaderMaterial issue has been solved, but the line material is still not working:

//var line = new THREE.Line(lineGeometry, lineMaterial, THREE.LineSegments);
var line = new THREE.LineSegments(lineGeometry, lineMaterial);
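For comparison, a minimal sketch of how a line geometry is usually built for LineSegments in recent releases, since the old Geometry.vertices array no longer exists; the coordinates below are placeholders:

// Sketch: LineSegments expects a BufferGeometry with a flat position attribute,
// two consecutive vertices per segment (placeholder coordinates).
const lineGeometry = new THREE.BufferGeometry();
const linePositions = new Float32Array([
    0, 0, 0,    10, 10, 10,   // segment 1
    0, 5, 0,   -10,  5,  0,   // segment 2
]);
lineGeometry.setAttribute('position', new THREE.BufferAttribute(linePositions, 3));
var line = new THREE.LineSegments(lineGeometry, lineMaterial);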

Related

Why does the color set by vertex and fragment shaders in three.js depend on the camera position of the scene?

I made 1000 points and gave them coordinates that form a ring shape. I then gave the points a shader material that points to the vertex and fragment shaders below.
Vertex Shader:
const vertexShader = `
  uniform float uTime;
  uniform float uRadius;
  varying vec3 vColor;
  varying float vDistance;

  void main() {
    vDistance = distance(position, vec3(0.0));
    // Do Not Touch
    gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4( position, 1.0 );
    gl_PointSize = 5.0;
  }
`
export default vertexShader
Fragment Shader:
const fragmentShader = `
  uniform float uDistance[1000];
  uniform float uResolutionWidth;
  uniform float uResolutionHeight;
  varying float vDistance;

  void main() {
    vec2 resolution = vec2(uResolutionWidth, uResolutionHeight);
    vec2 st = gl_FragCoord.xy/resolution;
    float pct = distance(st, vec2(1.0));
    vec3 color = vec3(mix(vec3(1.0, 0.0, 0.0), vec3(0.0, 0.0, 1.0), pct));
    gl_FragColor = vec4( color, 1.0 );
  }
`
export default fragmentShader
What I wanted to do was assign a color to each point based on its distance from the origin. However, I realized that what I actually did is assign a color based on the point's distance to the camera, or at least that's what it looks like.
EDIT:
I tried to pass along a varying, like so:
varying vec3 vRealPosition;

void main() {
  vDistance = distance(position, vec3(0.0));
  vColor = mix(vec3(1.0, 0.0, 0.0), vec3(0.0, 0.0, 1.0), vDistance);
  vRealPosition = position;
  // Do Not Touch
  gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4( position, 1.0 );
  gl_PointSize = 5.0;
}
But when I use it in the fragment shader, all the points are just blue:
varying vec3 vRealPosition;

void main() {
  vec2 resolution = vec2(uResolutionWidth, uResolutionHeight);
  vec2 st = gl_FragCoord.xy/resolution;
  float pct = distance(vRealPosition, vec3(0.0));
  vec3 color = vec3(mix(vec3(1.0, 0.0, 0.0), vec3(0.0, 0.0, 1.0), pct));
  gl_FragColor = vec4( vColor, 1.0 );
}
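For comparison, one way the intended effect (a color that depends only on the distance from the origin) can be expressed is to compute the color entirely in the vertex shader and pass it through a varying. A sketch, where dividing by uRadius to normalise the distance is an assumption:

const vertexShader = `
  uniform float uRadius;
  varying vec3 vColor;

  void main() {
    // 0.0 at the centre, 1.0 at the rim (assuming uRadius is the ring radius)
    float d = distance(position, vec3(0.0)) / uRadius;
    vColor = mix(vec3(1.0, 0.0, 0.0), vec3(0.0, 0.0, 1.0), clamp(d, 0.0, 1.0));
    gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4( position, 1.0 );
    gl_PointSize = 5.0;
  }
`

const fragmentShader = `
  varying vec3 vColor;

  void main() {
    gl_FragColor = vec4( vColor, 1.0 );
  }
`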

Upgrading to Three.js 0.130.1: Points rendered with ShaderMaterial and BufferGeometry not rendering

We were using Three.js 0.115 and everything was working. Since we got vulnerability reports for versions < 0.125, we decided to upgrade to the latest version. Now we are getting issues with the shader material.
We have an application that renders a point cloud with a BufferGeometry (position, size and color buffer attributes) and a ShaderMaterial.
function vertexShader() {
  return `
    attribute float size;
    attribute vec3 customColor;
    varying vec3 vColor;
    attribute float visibility;
    varying float vVisible;

    void main() {
      vColor = customColor;
      vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
      gl_PointSize = size * ( 300.0 / -mvPosition.z );
      gl_Position = projectionMatrix * mvPosition;
      vVisible = visibility;
    }`;
}
function fragmentShader() {
  return `
    uniform vec3 color;
    uniform sampler2D pointTexture;
    varying vec3 vColor;
    varying float vVisible;

    void main() {
      gl_FragColor = vec4( color * vColor, 1.0 );
      gl_FragColor = gl_FragColor * texture2D( pointTexture, gl_PointCoord );
      if ( gl_FragColor.a < ALPHATEST ) discard;
      if (vVisible < 0.5) discard;
    }`;
}
And in our JavaScript init code:
const material = new THREE.ShaderMaterial({
  uniforms: {
    color: { value: new THREE.Color(0xffffff) },
    texture: { value: new THREE.TextureLoader().load(circle) },
    resolution: { value: new THREE.Vector2() },
  },
  vertexShader: vertexShader(),
  fragmentShader: fragmentShader(),
  alphaTest: 0.9,
  blending: THREE.AdditiveBlending
});
There is no error in the console, but the points are not rendered.
We use raycasting to detect points, and that works without any issue.
Any idea why rendering of the points fails after upgrading to the latest version of three.js?
Is this something to do with ShaderMaterial?
Thanks for the help :)
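For reference, a minimal sketch of how the geometry and uniforms are usually wired up so that the names line up with the attributes and samplers declared in the shaders above; the data arrays are placeholders, not the actual application code:

// Sketch: attribute names on the BufferGeometry must match the shader
// (size, customColor, visibility), and the sampler uniform name must match
// the one declared in the fragment shader (pointTexture). Arrays are placeholders.
const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.Float32BufferAttribute(positions, 3));
geometry.setAttribute('size', new THREE.Float32BufferAttribute(sizes, 1));
geometry.setAttribute('customColor', new THREE.Float32BufferAttribute(colors, 3));
geometry.setAttribute('visibility', new THREE.Float32BufferAttribute(visibilities, 1));

const material = new THREE.ShaderMaterial({
  uniforms: {
    color: { value: new THREE.Color(0xffffff) },
    pointTexture: { value: new THREE.TextureLoader().load(circle) },
  },
  vertexShader: vertexShader(),
  fragmentShader: fragmentShader(),
  alphaTest: 0.9,
  blending: THREE.AdditiveBlending
});

const points = new THREE.Points(geometry, material);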

three.js animated texture in particles with custom shadermaterial

How do I make each particle animate and cycle through all the tiles in the sprite sheet?
Here's my shader program:
<script type="x-shader/x-vertex" id="vertexshader">
attribute vec2 offset;
varying vec2 vOffset;
void main()
{
vOffset = offset;
gl_PointSize = 25.0;
gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
</script>
<script type="x-shader/x-fragment" id="fragmentshader">
uniform sampler2D texture;
uniform vec2 repeat;
varying vec2 vOffset;
void main()
{
vec2 uv = vec2( gl_PointCoord.x, 1.0 - gl_PointCoord.y );
vec4 tex = texture2D( texture, uv * repeat + vOffset );
if ( tex.a < 0.5 ) discard;
gl_FragColor = tex;
}
</script>
It's basically the example from here: http://jsfiddle.net/myy7x4zd/4/
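One possible approach, sketched under the assumption that the geometry already carries the vec2 offset attribute from the fiddle and that the sprite sheet is a tilesX × tilesY grid: advance each particle's offset every frame so it steps through the tiles.

// Sketch: cycle every particle through the atlas tiles once per frame.
// tilesX / tilesY and the `geometry` variable are assumptions about the setup,
// with repeat expected to be (1 / tilesX, 1 / tilesY).
const tilesX = 4, tilesY = 4;
let frame = 0;

function animateTiles() {
  frame = (frame + 1) % (tilesX * tilesY);
  const offsets = geometry.attributes.offset;        // the per-particle vec2 attribute
  for (let i = 0; i < offsets.count; i++) {
    const tile = (frame + i) % (tilesX * tilesY);    // stagger particles so they differ
    offsets.setXY(i, (tile % tilesX) / tilesX, Math.floor(tile / tilesX) / tilesY);
  }
  offsets.needsUpdate = true;                        // re-upload the attribute each frame
}

animateTiles() would then be called from the render loop (e.g. inside requestAnimationFrame).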

Shadow artifacts in OpenGL

I am trying to render an object and two lights, one of which casts shadows. Everything works OK, but I noticed some obvious artifacts, as shown in the image below: some shadows seem to overflow into bright areas.
Below are the shaders that render the depth information into a framebuffer:
<script id="shadow-shader-vertex" type="x-shader/x-vertex">
attribute vec4 aVertexPosition;
uniform mat4 uObjMVP;
void main() {
gl_Position = uObjMVP * aVertexPosition;
}
</script>
<script id="shadow-shader-fragment" type="x-shader/x-vertex">
precision mediump float;
void main() {
//pack gl_FragCoord.z
const vec4 bitShift = vec4(1.0, 256.0, 256.0 * 256.0, 256.0 * 256.0 * 256.0);
const vec4 bitMask = vec4(1.0/256.0, 1.0/256.0, 1.0/256.0, 0.0);
vec4 rgbaDepth = fract(gl_FragCoord.z * bitShift);
rgbaDepth -= rgbaDepth.gbaa * bitMask;
gl_FragColor = rgbaDepth;
}
</script>
In the shaders above, uObjMVP is the MVP matrix used when rendering from the position of the shadow-casting light (the warm light; the cold light does not cast shadows).
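The framebuffer setup itself isn't shown; for reference, a typical WebGL 1 setup for this kind of packed-depth pass looks roughly like the sketch below (the texture size and variable names are assumptions):

// Sketch: an RGBA colour texture receives the packed depth, plus a renderbuffer
// for actual depth testing during the shadow pass. Size and names are assumptions.
const SHADOW_MAP_SIZE = 1024;

const shadowTexture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, shadowTexture);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, SHADOW_MAP_SIZE, SHADOW_MAP_SIZE, 0, gl.RGBA, gl.UNSIGNED_BYTE, null);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);

const depthRenderbuffer = gl.createRenderbuffer();
gl.bindRenderbuffer(gl.RENDERBUFFER, depthRenderbuffer);
gl.renderbufferStorage(gl.RENDERBUFFER, gl.DEPTH_COMPONENT16, SHADOW_MAP_SIZE, SHADOW_MAP_SIZE);

const shadowFramebuffer = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, shadowFramebuffer);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, shadowTexture, 0);
gl.framebufferRenderbuffer(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT, gl.RENDERBUFFER, depthRenderbuffer);
gl.bindFramebuffer(gl.FRAMEBUFFER, null);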
And here are the shaders to draw everything:
<script id="shader-vertex" type="x-shader/x-vertex">
//position of a vertex.
attribute vec4 aVertexPosition;
//vertex normal.
attribute vec3 aNormal;
//mvp matrix
uniform mat4 uObjMVP;
uniform mat3 uNormalMV;
//shadow mvp matrix
uniform mat4 uShadowMVP;
//interplate normals
varying vec3 vNormal;
//for shadow calculation
varying vec4 vShadowPositionFromLight;
void main() {
gl_Position = uObjMVP * aVertexPosition;
//convert normal direction from object space to view space
vNormal = uNormalMV * aNormal;
vShadowPositionFromLight = uShadowMVP * aVertexPosition;
}
</script>
<script id="shader-fragment" type="x-shader/x-fragment">
precision mediump float;
uniform sampler2D uShadowMap;
varying vec3 vNormal;
varying vec4 vShadowPositionFromLight;
struct baseColor {
vec3 ambient;
vec3 diffuse;
};
struct directLight {
vec3 direction;
vec3 color;
};
baseColor mysObjBaseColor = baseColor(
vec3(1.0, 1.0, 1.0),
vec3(1.0, 1.0, 1.0)
);
directLight warmLight = directLight(
normalize(vec3(-83.064, -1.99, -173.467)),
vec3(0.831, 0.976, 0.243)
);
directLight coldLight = directLight(
normalize(vec3(37.889, 47.864, -207.187)),
vec3(0.196, 0.361, 0.608)
);
vec3 ambientLightColor = vec3(0.3, 0.3, 0.3);
float unpackDepth(const in vec4 rgbaDepth) {
const vec4 bitShift = vec4(1.0, 1.0/256.0, 1.0/(256.0*256.0), 1.0/(256.0*256.0*256.0));
float depth = dot(rgbaDepth, bitShift);
return depth;
}
float calVisibility() {
vec3 shadowCoord = (vShadowPositionFromLight.xyz/vShadowPositionFromLight.w)/2.0 + 0.5;
float depth = unpackDepth(texture2D(uShadowMap, shadowCoord.xy));
return (shadowCoord.z > depth + 0.005) ? 0.4 : 1.0;
}
vec3 calAmbientLight(){
return ambientLightColor * mysObjBaseColor.ambient;
}
vec3 calDiffuseLight(const in directLight light, const in float visibility){
vec3 inverseLightDir = light.direction * -1.0;
float dot = max(dot(inverseLightDir, normalize(vNormal)), 0.0);
return light.color * mysObjBaseColor.diffuse * dot * visibility;
}
void main() {
vec3 ambientLight = calAmbientLight();
float visibility = calVisibility();
vec3 warmDiffuseLight = calDiffuseLight(warmLight, visibility);
// cold light does not cast shadow and hence visilibility is always 1.0
vec3 coldDiffuseLight = calDiffuseLight(coldLight, 1.0);
gl_FragColor = vec4(coldDiffuseLight + warmDiffuseLight + ambientLight, 1.0);
}
</script>
If I simply draw the depth information onto the canvas,
void main() {
    // vec3 ambientLight = calAmbientLight();
    // float visibility = calVisibility();
    // vec3 warmDiffuseLight = calDiffuseLight(warmLight, visibility);
    // // cold light does not cast shadow and hence visibility is always 1.0
    // vec3 coldDiffuseLight = calDiffuseLight(coldLight, 1.0);
    // gl_FragColor = vec4(coldDiffuseLight + warmDiffuseLight + ambientLight, 1.0);
    vec3 shadowCoord = (vShadowPositionFromLight.xyz/vShadowPositionFromLight.w)/2.0 + 0.5;
    gl_FragColor = vec4(unpackDepth(texture2D(uShadowMap, shadowCoord.xy)), 0.0, 0.0, 1.0);
}
I would get this image
Thanks in advance.

Why doesn't a particle system with a shader work? three.js

Hi, can anyone help me with this? I have a shader that works with THREE.Mesh but doesn't with THREE.ParticleSystem.
I want each particle to have a portion of a given map (texture) and to change its position accordingly, something like this: http://www.chromeexperiments.com/detail/webcam-displacement/?f=webgl
<script id="vs" type="x-shader/x-vertex">
uniform sampler2D map;
varying vec2 vUv;
void main() {
vUv = uv;
vec4 color = texture2D( map, vUv );
float value = ( color.r + color.g + color.b ) / 3.0;
vec4 pos = vec4( position.xy, value * 100.0, 1.0 );
gl_PointSize = 20.0;
gl_Position = projectionMatrix * modelViewMatrix * pos;
}
</script>
<script id="fs" type="x-shader/x-fragment">
uniform sampler2D map;
varying vec2 vUv;
void main() {
gl_FragColor = texture2D( map, vUv );
}
</script>
ParticleSystem doesn't really support UVs, as there aren't faces, just single points. Texture mapping of particles is done with gl_PointCoord (IIRC), but that gives you the same mapping for every particle. In order to give a different portion of the same texture to each particle, you should use BufferGeometry, which in the latest version of three.js supports all attributes, including custom ones (and it is very efficient and fast!). You'd then supply a vec2 offset attribute for each particle; you get the correct UV from this offset and gl_PointCoord.
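A minimal sketch of that suggestion, assuming a tilesX × tilesY texture atlas; the particle count, positions and texture path are placeholders:

// Sketch: each particle carries a vec2 "offset" selecting its tile in the atlas,
// and gl_PointCoord supplies the UV inside that tile. Counts, positions and the
// texture path are placeholders; THREE.Points is the current name for a particle system.
const tilesX = 8, tilesY = 8;
const count = 1000;

const positions = new Float32Array(count * 3);
const offsets = new Float32Array(count * 2);
for (let i = 0; i < count; i++) {
  positions.set([Math.random() * 100, Math.random() * 100, 0], i * 3);
  const tile = i % (tilesX * tilesY);
  offsets.set([(tile % tilesX) / tilesX, Math.floor(tile / tilesX) / tilesY], i * 2);
}

const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
geometry.setAttribute('offset', new THREE.BufferAttribute(offsets, 2));

const material = new THREE.ShaderMaterial({
  uniforms: {
    map: { value: new THREE.TextureLoader().load('atlas.png') },   // placeholder path
    repeat: { value: new THREE.Vector2(1 / tilesX, 1 / tilesY) },
  },
  vertexShader: `
    attribute vec2 offset;
    varying vec2 vOffset;
    void main() {
      vOffset = offset;
      gl_PointSize = 20.0;
      gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
    }
  `,
  fragmentShader: `
    uniform sampler2D map;
    uniform vec2 repeat;
    varying vec2 vOffset;
    void main() {
      vec2 uv = vec2( gl_PointCoord.x, 1.0 - gl_PointCoord.y );
      gl_FragColor = texture2D( map, uv * repeat + vOffset );
    }
  `,
});

const points = new THREE.Points(geometry, material);
scene.add(points);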
