Shadow won't update when geometry is changed using VertexShader - three.js

I'm using a custom shader to curve a plane. My custom shader extends the Lambert shader so it supports lights and shadows. It all works as expected, but when the vertexShader changes the geometry of the plane, the shadow doesn't update. Is there anything I'm missing to flag that the geometry has updated in my vertexShader and the shadow needs to change?
[Here is a screenshot of the problem. The plane is curved with a vertexShader, but the shadow doesn't update][1]
[1]: http://i.stack.imgur.com/6kfCF.png
Here is the demo/code: http://dev.cartelle.nl/curve/
If you drag the "bendAngle" slider you can see that the shadow doesn't update.
One work-around I thought of was to get the bounding box of my curved plane, then use those points to create a new Mesh/Box and have that object cast the shadow. But I wasn't sure how to get the coordinates of the new curved geometry: when I checked geometry.boundingBox after the shader was applied, it just gave me the original coordinates every time.
Thanks
Johnny

If you are modifying the geometry positions in the vertex shader, and you are casting shadows, you need to specify a custom depth material so the shadows will respond to the modified positions.
In your custom depth material's vertex shader, you modify the vertex positions in the same way you modified them in the material's vertex shader.
An example of a custom depth material can be seen in this three.js example (although in that example the vertices are modified on the CPU, not in the vertex shader).
In your case, you would create a vertex shader for the custom depth material using a pattern like so:
<script type="x-shader/x-vertex" id="vertexShaderDepth">

    uniform float bendAngle;
    uniform vec2 bounds;
    uniform float bendOffset;
    uniform float bendAxisAngle;

    vec3 bendIt( vec3 ip, float ba, vec2 b, float o, float a ) {
        // your code here
        return ip;
    }

    void main() {
        vec3 p = bendIt( position, bendAngle, bounds, bendOffset, bendAxisAngle );
        vec4 mvPosition = modelViewMatrix * vec4( p, 1.0 );
        gl_Position = projectionMatrix * mvPosition;
    }

</script>
And fragment shader like this:
<script type="x-shader/x-fragment" id="fragmentShaderDepth">

    vec4 pack_depth( const in float depth ) {
        const vec4 bit_shift = vec4( 256.0 * 256.0 * 256.0, 256.0 * 256.0, 256.0, 1.0 );
        const vec4 bit_mask = vec4( 0.0, 1.0 / 256.0, 1.0 / 256.0, 1.0 / 256.0 );
        vec4 res = fract( depth * bit_shift );
        res -= res.xxyz * bit_mask;
        return res;
    }

    void main() {
        gl_FragData[ 0 ] = pack_depth( gl_FragCoord.z );
    }

</script>
Then in your JavaScript, you specify the custom depth material:
uniforms = {};
uniforms.bendAngle = { type: "f", value: properties.bendAngle };
uniforms.bendOffset = { type: "f", value: properties.offset };
uniforms.bendAxisAngle = { type: "f", value: properties.bendAxisAngle };
uniforms.bounds = { type: "v2", value: new THREE.Vector2( - 8, 16 ) };
var vertexShader = document.getElementById( 'vertexShaderDepth' ).textContent;
var fragmentShader = document.getElementById( 'fragmentShaderDepth' ).textContent;
myObject.customDepthMaterial = new THREE.ShaderMaterial( {
    uniforms: uniforms,
    vertexShader: vertexShader,
    fragmentShader: fragmentShader
} );
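Note that these depth-material uniforms are a separate object from the uniforms of your Lambert-based material, so when the bendAngle slider moves, both materials need the new value. A minimal sketch of that wiring (curveMaterial is a placeholder name for whatever ShaderMaterial you built on top of the Lambert shader; it is not defined above):

// keep the visible plane and its shadow in sync when the slider moves
function setBendAngle( angle ) {
    curveMaterial.uniforms.bendAngle.value = angle;                // bends the plane
    myObject.customDepthMaterial.uniforms.bendAngle.value = angle; // bends the shadow
}

Since uniforms in three.js are plain { value: ... } objects, you could instead share the very same uniform objects between the two materials and assign the new value once.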
three.js r.74

Related

Forge Viewer Autodesk v7 issues with recolouring THREE.BufferGeometry when using THREE.ShaderMaterial

EDIT: The Forge Viewer I'm using ships a customized version of Three.js release r71 (source), which is why I'm using outdated code. The current release of Three.js is r121.
I've created a THREE.Group() that contains various THREE.PointCloud(geometry, material) objects. One of the point clouds is composed of a THREE.BufferGeometry() and a THREE.ShaderMaterial().
When I add a colour attribute to the BufferGeometry, only red (1,0,0), white (1,1,1), or yellow (1,1,0) seem to work. This image is when I set the colour to red (1,0,0). This image is when I set the colour to blue (0,0,1).
My question is, how do I resolve this? Is the issue in the shaders? Is the issue with how I build the BufferGeometry? Is it a bug? Thanks.
My shaders:
var vShader = `uniform float size;
    varying vec3 vColor;
    void main() {
        vColor = color;
        vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
        gl_PointSize = size * ( size / ( length( mvPosition.xyz ) + 0.00001 ) );
        gl_Position = projectionMatrix * mvPosition;
    }`
var fShader = `varying vec3 vColor;
    uniform sampler2D sprite;
    void main() {
        gl_FragColor = vec4( vColor, 1.0 ) * texture2D( sprite, gl_PointCoord );
        // note: this discards any fragment whose red channel is below 0.2
        if ( gl_FragColor.x < 0.2 ) discard;
    }`
My material:
var materialForBuffers = new THREE.ShaderMaterial( {
    uniforms: {
        size: { type: 'f', value: this.pointSize },
        sprite: { type: 't', value: THREE.ImageUtils.loadTexture( "../data/white.png" ) },
    },
    vertexShader: vShader,
    fragmentShader: fShader,
    transparent: true,
    vertexColors: true,
} );
How the color is added:
const colors = new Float32Array( [ 1.0, 0.0, 0.0 ] ); // one RGB triplet per point; the array length must be 3 × the point count
geometryForBuffers.addAttribute( 'color', new THREE.BufferAttribute( colors, 3 ) );
Link to code
It looks like you may already be using parts of that sample code, but if not, please refer to https://github.com/petrbroz/forge-point-clouds/blob/develop/public/scripts/extensions/pointcloud.js (live demo: https://forge-point-clouds.autodesk.io). That sample already uses the color geometry attribute to specify the colors of individual points.
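For reference, a minimal sketch of that approach under the r71-era API the viewer exposes (the buffer contents below are illustrative values, not taken from the sample): supply one RGB triplet per point, matching the point count of the position attribute.

// illustrative: per-point colors on a BufferGeometry (r71-era API, as used above)
var positions = new Float32Array( [ 0, 0, 0,  1, 0, 0,  0, 1, 0 ] ); // three points
var colors    = new Float32Array( [ 1, 0, 0,  0, 0, 1,  0, 1, 0 ] ); // red, blue, green

var geometry = new THREE.BufferGeometry();
geometry.addAttribute( 'position', new THREE.BufferAttribute( positions, 3 ) );
geometry.addAttribute( 'color',    new THREE.BufferAttribute( colors, 3 ) ); // one triplet per point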

Setting texture to geometry in Three.js

I have this geometry: Picture
I want to add the same kind of effect, where the mountains get a snow texture higher up, and so on:
Texture splatting with Three.js
A little background on what information I pass to the shaders from Three.js:
// importing grass and snow textures:
var grassTexture = THREE.ImageUtils.loadTexture( 'images/grass-512.jpg' );
grassTexture.wrapS = grassTexture.wrapT = THREE.RepeatWrapping;
var snowTexture = THREE.ImageUtils.loadTexture( 'images/snow-512.jpg' );
snowTexture.wrapS = snowTexture.wrapT = THREE.RepeatWrapping;

this.customUniforms = {
    grassTexture: { value: grassTexture },
    snowTexture: { value: snowTexture },
};

var customMaterial = new THREE.ShaderMaterial({
    uniforms: this.customUniforms,
    side: THREE.DoubleSide,
    vertexShader: document.getElementById( 'vertexShader' ).textContent,
    fragmentShader: document.getElementById( 'fragmentShader' ).textContent,
});

// creating the mesh; geometry is the model in the picture.
mesh = new THREE.Mesh( geometry, customMaterial );
Vertex and fragment shaders:
//vertexShader:
varying vec2 vUV;
void main(){
vUV = uv;
gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
I get a fully red model with this:
// fragmentShader:
void main() {
    gl_FragColor = vec4( 1.0, 0.0, 0.0, 1.0 );
}
I want the higher parts textured with snowTexture and the lower parts with grassTexture.
uniform sampler2D grassTexture;
uniform sampler2D snowTexture;
varying vec2 vUV;

// Something like this?:
vec4 grass = texture2D( grassTexture, vUV );
vec4 snow = texture2D( snowTexture, vUV );
gl_FragColor = vec4( 0.0, 0.0, 0.0, 1.0 ) + grass + snow;
This is really not that hard; let me walk you through the logic.
In your case you don't want to use a displacement map, so you need to pass a varying height from your vertexShader to your fragmentShader, mapping each vertex's up-coordinate into the range [0,1].
// vertexShader:
varying vec2 vUV;
varying float height;

void main() {
    vUV = uv;
    float maxPosition = 30.0; // this is an example value.
    height = max( 0.0, min( 1.0, position.y / maxPosition ) ); // assuming +y is up
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
Now you can access height from your fragmentShader and use that information to select where you want your transitions to occur.
uniform sampler2D grassTexture;
uniform sampler2D snowTexture;
varying vec2 vUV;
varying float height;

void main() {
    vec4 grass = ( 1.0 - smoothstep( 0.48, 0.52, height ) ) * texture2D( grassTexture, vUV );
    vec4 snow = ( smoothstep( 0.48, 0.52, height ) - 0.0 ) * texture2D( snowTexture, vUV );
    gl_FragColor = vec4( 0.0, 0.0, 0.0, 1.0 ) + grass + snow;
}
The linked answer uses the smoothstep function to make a gradual transition between the textures. We can build such transitions with the pattern ( a - b ) * textureColor.
Here, a controls where the texture starts contributing to the fragment color, and b controls where it stops contributing.
The grass texture should contribute from the lowest height, so its a is the constant 1.0; it should stop contributing around height 0.5, so its b is a smoothstep that fades the grass out across 0.48–0.52.
The snow texture, on the other hand, only starts contributing around 0.5, so its a is that same smoothstep, fading the snow in across 0.48–0.52; it never stops contributing above that, so its b is the constant 0.0.
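The same pattern stacks if you ever want more than two bands. Purely as an illustration (rockTexture is a hypothetical extra uniform, not part of the question), a three-band version could look like:

// illustration only: three height bands built from the same ( a - b ) pattern
vec4 grass = ( 1.0 - smoothstep( 0.28, 0.32, height ) ) * texture2D( grassTexture, vUV );
vec4 rock  = ( smoothstep( 0.28, 0.32, height ) - smoothstep( 0.68, 0.72, height ) ) * texture2D( rockTexture, vUV );
vec4 snow  = ( smoothstep( 0.68, 0.72, height ) - 0.0 ) * texture2D( snowTexture, vUV );
gl_FragColor = vec4( 0.0, 0.0, 0.0, 1.0 ) + grass + rock + snow;

At every height the three weights sum to 1.0, so the bands blend into each other without over- or under-shooting.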
Hope this clears things up for you.

Render only alpha values from texture

What I want to do is load a texture with only alpha values from a PNG picture, while using the color of the material for the RGB. For context, I use this for GPU picking, to find sprites that are clicked on. This way I can tell whether a sprite was clicked or whether the user clicked on the transparent part of the sprite.
I tried using THREE.AlphaFormat as the format, and I tried all the types, but what I get is a sprite with correct alpha where the color from the texture is still combined with the color of the material.
Here is the code I have tried so far:
var type = THREE.UnsignedByteType;
var spriteMap = new THREE.TextureLoader().load( url );
spriteMap.format = THREE.AlphaFormat;
spriteMap.type = type;

var spriteMaterial = new THREE.SpriteMaterial( { map: spriteMap, color: idcolor.getHex() } );
var sprite = new THREE.Sprite( spriteMaterial );
sprite.position.set( this.position.x, this.position.y, this.position.z );
sprite.scale.set( this.scale.x, this.scale.y, this.scale.z );

Selection.GpuPicking.pickingScene.add( sprite );
Any ideas on how to achieve this?
three.js r.91
I didn't manage to do what I wanted by combining texture and material. My solution was to create a plane and add my own custom shaders handling the Sprite functionality. I copied the sprite shaders from the three.js library and removed the code I didn't need, since I only needed the correct alpha and a single visible color.
My code for creating a sprite with the color from the material and alpha values from the texture:
// Create the position and scale you want for your sprite, and pass the url to your texture
var spriteMap = new THREE.TextureLoader().load( url, function ( texture ) {
    var geometry = new THREE.PlaneGeometry( 1.0, 1.0 );
    uniforms = {
        color: { type: "v3", value: color },
        map: { value: texture },
        opacity: { type: "f", value: 1.0 },
        alphaTest: { type: "f", value: 0.0 },
        scale: { type: "v3", value: scale }
    };
    var material = new THREE.ShaderMaterial( {
        uniforms: uniforms,
        vertexShader: vertexShader,     // input the custom shader here
        fragmentShader: fragmentShader, // input the custom shader here
        transparent: true,
    } );
    var mesh = new THREE.Mesh( geometry, material );
    mesh.position.set( position.x, position.y, position.z );
    mesh.scale.set( scale.x, scale.y, scale.z ); // note: scale.z, not sprite.scale.z
    scene.add( mesh );
} );
This is my vertex shader:
uniform vec3 scale;
varying vec2 vUV;

void main() {
    float rotation = 0.0;
    vUV = uv;
    vec3 alignedPosition = position * scale;
    vec2 rotatedPosition;
    rotatedPosition.x = cos( rotation ) * alignedPosition.x - sin( rotation ) * alignedPosition.y;
    rotatedPosition.y = sin( rotation ) * alignedPosition.x + cos( rotation ) * alignedPosition.y;
    vec4 mvPosition = modelViewMatrix * vec4( 0.0, 0.0, 0.0, 1.0 );
    mvPosition.xy += rotatedPosition;
    gl_Position = projectionMatrix * mvPosition;
}
My fragment shader:
varying vec2 vUV;
uniform vec3 color;
uniform sampler2D map;
uniform float opacity;
uniform float alphaTest;

void main() {
    vec4 texel = texture2D( map, vUV );
    // show the color from the material, but use the alpha from the texture
    gl_FragColor = vec4( color, texel.a * opacity );
    if ( gl_FragColor.a < alphaTest ) discard;
}
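To close the loop on the GPU-picking use case from the question: an id colour rendered this way is typically read back through a small render target. A rough sketch only (pickingTarget and the mouse handling are assumptions, not part of the answer above):

// sketch: read back the id color under the mouse (three.js r91 API)
var pickingTarget = new THREE.WebGLRenderTarget( window.innerWidth, window.innerHeight );
var pixelBuffer = new Uint8Array( 4 );

function pick( mouseX, mouseY ) {
    renderer.render( Selection.GpuPicking.pickingScene, camera, pickingTarget );
    // flip Y: DOM coordinates grow downward, GL coordinates grow upward
    renderer.readRenderTargetPixels( pickingTarget, mouseX, pickingTarget.height - mouseY, 1, 1, pixelBuffer );
    return ( pixelBuffer[ 0 ] << 16 ) | ( pixelBuffer[ 1 ] << 8 ) | pixelBuffer[ 2 ]; // matches idcolor.getHex()
}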

How to maintain the glow effect of a json model rotating in three.js scene?

I added a JSON model with a glow effect into the scene,
as follows:
I try to rotate the JSON model automatically.
However, it looks weird while it is rotating:
the glow effect of the model does not work.
I assume this is because the position of the JSON model does not change while the model is rotating. As a result, the viewVector.value of the ShaderMaterial stays constant during the rotation (I do not change the position of the camera).
if ( jsonMesh ) {
    jsonMesh.rotation.y += 0.1;
    jsonMesh.material.uniforms.viewVector.value =
        new THREE.Vector3().subVectors( camera.position, jsonMesh.position );
}
This is the THREE.ShaderMaterial.
VertexShader and FragmentShader
<script id="vertexShader" type="x-shader/x-vertex">
    uniform vec3 viewVector;
    uniform float c;
    uniform float p;
    varying float intensity;
    void main() {
        vec3 vNormal = normalize( normalMatrix * normal );
        vec3 vNormel = normalize( normalMatrix * viewVector );
        intensity = pow( c - dot( vNormal, vNormel ), p );
        gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
    }
</script>
<script id="fragmentShader" type="x-shader/x-fragment">
    uniform vec3 glowColor;
    varying float intensity;
    void main() {
        vec3 glow = glowColor * intensity;
        gl_FragColor = vec4( glow, 1.0 );
    }
</script>
The THREE.ShaderMaterial:
var customMaterial = new THREE.ShaderMaterial( {
    uniforms: {
        "c": { type: "f", value: 1.0 },
        "p": { type: "f", value: 1.4 },
        glowColor: { type: "c", value: new THREE.Color( 0xffff00 ) },
        viewVector: { type: "v3", value: camera.position }
    },
    vertexShader: document.getElementById( 'vertexShader' ).textContent,
    fragmentShader: document.getElementById( 'fragmentShader' ).textContent,
    side: THREE.FrontSide,
    blending: THREE.AdditiveBlending,
    transparent: true
} );
How should I modify the code in this case?
Here is the Demo and source code.
You can use built-in three.js functions for this. Instead of using the camera position, I chose to show you how to set a light-source position in the world; that way you can match the light source in your custom shader to any light sources you plan to add later to your 3D world. Feel free to change the worldLightPoint value to camera.position instead of new THREE.Vector3(100, 100, 100); in that case the effect will remain constant relative to the camera position.
var v = new THREE.Vector3();
// var worldLightPoint = camera.position;
var worldLightPoint = new THREE.Vector3( 100, 100, 100 );

function update() {
    controls.update();
    stats.update();
    if ( jsonMesh ) {
        jsonMesh.rotation.y += 0.1;
        // transform the fixed world-space point into the mesh's rotating local space
        jsonMesh.material.uniforms.viewVector.value = jsonMesh.worldToLocal( v.copy( worldLightPoint ) );
    }
}

Vertex Displacement Shader has radial distortion. How do I fix?

I'm writing a vertex displacement shader. I successfully mapped the vertices of a plane to the brightness values of a video with a GLSL shader and Three.js, but the shader maps the values in a radial fashion, which might be appropriate for texturing a sphere but not for this plane. There is radial distortion coming from the center outward. How do I fix it?
RuttEtraShader = {

    uniforms: {
        "tDiffuse": { type: "t", value: null },
        "opacity":  { type: "f", value: 1.0 }
    },

    vertexShader: [
        'uniform sampler2D tDiffuse;',
        'varying vec3 vColor;',
        'varying vec2 vUv;',
        'void main() {',
        '    vec4 newVertexPos;',
        '    vec4 dv;',
        '    float df;',
        '    vUv = uv;',
        '    dv = texture2D( tDiffuse, vUv.xy );',
        '    df = 1.33 * dv.x + 1.33 * dv.y + 16.33 * dv.z;',
        '    newVertexPos = vec4( normalize( position ) * df * 10.3, 0.0 ) + vec4( position, 1.0 );',
        '    vColor = vec3( dv.x, dv.y, dv.z );',
        '    gl_Position = projectionMatrix * modelViewMatrix * newVertexPos;',
        '}'
    ].join( "\n" ),

    fragmentShader: [
        'varying vec3 vColor;',
        'void main() {',
        '    gl_FragColor = vec4( vColor.rgb, 1.0 );',
        '}'
    ].join( "\n" )

};
texture = new THREE.Texture( video );
texture.minFilter = THREE.LinearFilter;
texture.magFilter = THREE.LinearFilter;
texture.format = THREE.RGBFormat;
texture.generateMipmaps = true;

videoMaterial = new THREE.ShaderMaterial( {
    uniforms: {
        "tDiffuse": { type: "t", value: texture },
    },
    vertexShader: RuttEtraShader.vertexShader,
    fragmentShader: RuttEtraShader.fragmentShader,
    depthWrite: false,
    depthTest: true,
    transparent: true,
    overdraw: false
} );
videoMaterial.renderToScreen = true;

geometry = new THREE.PlaneGeometry( 720, 480, 720, 480 );
geometry.overdraw = false;
geometry.dynamic = true;

mesh = new THREE.Mesh( geometry, videoMaterial );
mesh.position.set( 0, 0, 0 );
mesh.visible = true;
scene.add( mesh );
Your displacement radiates from the center because normalize( position ) is a direction vector measured from the point (0,0,0) of the plane's local (object) space, and for a THREE.PlaneGeometry that origin is the center of the plane.
Without redefining any of your matrices, you may be able to solve this simply by using newVertexPos = vec4( normalize( position - origin ) * df * 10.3, 0.0 ) + vec4( position, 1.0 );, where origin is the point you want to radiate from, expressed in the same local space. For your 720x480 plane, vec3 origin = vec3( -360.0, -240.0, 0.0 ) would have everything radiate from the lower-left corner.
You will still have a very noticeable radial displacement if you only make this change, however. Another thing you might consider is non-uniform scaling of your displacement: instead of * df * 10.3, use * df * vec3( 1.0, 1.0, 10.3 ). This makes the Z displacement much more pronounced relative to the in-plane components.
You can mix and match both approaches to find what looks best. I suspect that increasing the scale of the Z displacement by itself would produce the results you are looking for, but it is still helpful to understand why the displacement radiates from the center.
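Putting both suggestions together, the vertex shader might look like this sketch (origin and the vec3 scale are illustrative values, not part of the original shader):

// sketch only: both fixes applied to the question's vertex shader
uniform sampler2D tDiffuse;
uniform vec3 origin; // e.g. vec3( -360.0, -240.0, 0.0 ) for the lower-left corner
varying vec3 vColor;
varying vec2 vUv;

void main() {
    vUv = uv;
    vec4 dv = texture2D( tDiffuse, vUv.xy );
    float df = 1.33 * dv.x + 1.33 * dv.y + 16.33 * dv.z;
    // displace away from `origin` instead of the plane's center,
    // and scale Z much harder than the in-plane axes
    vec3 displacement = normalize( position - origin ) * df * vec3( 1.0, 1.0, 10.3 );
    vec4 newVertexPos = vec4( displacement, 0.0 ) + vec4( position, 1.0 );
    vColor = dv.xyz;
    gl_Position = projectionMatrix * modelViewMatrix * newVertexPos;
}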
I have a feeling your plane may be situated at Z=0, so the Z displacement will always be 0 unless you move it back/forward. If you add 1.0 to position.z it will move it to the far plane and if you subtract 1.0 from position.z it will move it to the near plane.
