Glow effect shader works on desktop but not on mobile - three.js

I'm working on a personal project that generates a planet procedurally. The problem is with a glow effect I'm trying to achieve using GLSL: the intended effect works on desktop but not on mobile.
The following links illustrate the problem:
Intended Effect
iPhone6S result
The planet is composed of four IcosahedronBufferGeometry meshes: earth, water, clouds and the glow effect. If I disable the glow effect, everything works as intended on mobile, so the problem must lie within the glow effect.
Here is the code for the glow effect (vertex and fragment shaders):
Vertex shader:
varying float intensity;

void main() {
    /* Calculates dot product of the view vector (cameraPosition) and the normal */
    /* High value exponent = less intense since dot product is always less than 1 */
    vec3 vNormal = vec3(modelMatrix * vec4(normal, 0.0));
    intensity = pow(0.2 - dot(normalize(cameraPosition), vNormal), 2.8);
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
Fragment shader:
varying float intensity;

void main() {
    vec3 glow = vec3(255.0/255.0, 255.0/255.0, 255.0/255.0) * intensity;
    gl_FragColor = vec4( glow, 1.0);
}
THREE.js code:
var glowMaterial, glowObj, glowUniforms, sUniforms;
sUniforms = sharedUniforms();

/* Uniforms */
glowUniforms = {
    lightPos: {
        type: sUniforms["lightPos"].type,
        value: sUniforms["lightPos"].value,
    }
};

/* Material */
glowMaterial = new THREE.ShaderMaterial({
    uniforms: THREE.UniformsUtils.merge([
        THREE.UniformsLib["ambient"],
        THREE.UniformsLib["lights"],
        glowUniforms
    ]),
    vertexShader: glow_vert,
    fragmentShader: glow_frag,
    lights: true,
    side: THREE.FrontSide,
    blending: THREE.AdditiveBlending,
    transparent: true
});

/* Add object to scene */
glowObj = new THREE.Mesh(new THREE.IcosahedronBufferGeometry(35, 4), glowMaterial);
scene.add(glowObj);
There are no error or warning messages in the console on either desktop or mobile (checked with the remote web inspector). As shown above, on mobile the glow renders as plain white, while on desktop the intensity/color/opacity of the material follows the dot product computed in the vertex shader as expected.
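One hedged guess, not confirmed anywhere above: in GLSL ES, pow(x, y) is undefined when x is negative, and 0.2 - dot(normalize(cameraPosition), vNormal) can easily go negative. Desktop drivers often tolerate this, while mobile GPUs may return NaN or clamp the result, which could explain the plain white sphere. A minimal sketch of a clamped variant of the intensity line, assuming that undefined pow() call is the culprit:

/* Hypothetical fix (assumption): keep the base of pow() non-negative so the result is defined on all GPUs */
intensity = pow(clamp(0.2 - dot(normalize(cameraPosition), vNormal), 0.0, 1.0), 2.8);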

Related

Changing color of occluded part of mesh within shader

I am working with THREE.JS and want to be able to have a mesh that changes the occluded part of itself into a different color.
Simple Example
The above image is a simple example, where the wall is in front of the Mesh and obstructing part of the Mesh but not all of it. The visible part of the mesh should be colored green whilst the occluded part should be colored red. Note that the wall is not transparent; the occluded part of the mesh should still be rendered using depthTest = false.
I've tried messing around with some basic shaders, but I don't really know how to get started. Currently, the core parts of my code look like this:
// My cube's material
const overlayMat = new THREE.ShaderMaterial({
    uniforms: {
        "maskTexture": { value: null },
    },
    vertexShader: `
        varying vec2 vUv;
        void main() {
            vUv = uv;
            gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
        }`,
    fragmentShader: `
        varying vec2 vUv;
        uniform sampler2D maskTexture;
        void main() {
            vec4 maskColor = texture2D(maskTexture, vUv);
            float visibilityFactor = 1.0 - maskColor.g > 0.0 ? 1.0 : 0.5;
            // Attempt to set the green value to the visibility factor
            gl_FragColor = vec4(1.0, visibilityFactor, 0.0, 1.0);
        }`,
    depthTest: false,
    depthWrite: false,
    transparent: true
});

// My mask
let renderTargetMaskBuffer = new THREE.WebGLRenderTarget(innerWidth, innerHeight, {
    minFilter: THREE.LinearFilter,
    magFilter: THREE.LinearFilter,
    format: THREE.RGBAFormat
});

// Inside my animate function:
function animate() {
    // ...
    overlayMat.uniforms["maskTexture"].value = renderTargetMaskBuffer.depthTexture;
    renderer.render(scene, camera);
}
This does not work; the cube remains one constant color.
Full code (with the "wall" that occludes part of the mesh) (JSFiddle)
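Nothing in the snippet above ever attaches a depth texture to the render target, so maskTexture stays null. A minimal sketch of one way to wire it up, assuming the goal is to render the occluders (the wall) into the target first and then sample that depth in the overlay pass; occluderScene is a hypothetical scene containing only the wall:

// Hypothetical setup: give the render target a depth texture so renderTargetMaskBuffer.depthTexture is defined
renderTargetMaskBuffer.depthTexture = new THREE.DepthTexture(innerWidth, innerHeight);

// Inside the animate loop, before the main render:
renderer.setRenderTarget(renderTargetMaskBuffer);
renderer.render(occluderScene, camera);   // occluderScene: hypothetical scene holding only the wall
renderer.setRenderTarget(null);

overlayMat.uniforms["maskTexture"].value = renderTargetMaskBuffer.depthTexture;
renderer.render(scene, camera);

Note that sampling a full-screen depth texture with the cube's own UVs (vUv) will not line up; screen-space coordinates (gl_FragCoord.xy divided by the resolution) would be needed in the fragment shader.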

Forge Viewer Autodesk v7 issues with recolouring THREE.BufferGeometry when using THREE.ShaderMaterial

EDIT: The Forge Viewer I'm using has a customized version of Three.js release r71 (source), which is why I'm using outdated code. The current release of Three.js is r121.
I've created a THREE.Group() that contains various THREE.PointCloud(geometry, material) objects. One of the point clouds is composed of a THREE.BufferGeometry() and a THREE.ShaderMaterial().
When I add a colour attribute to the BufferGeometry, only red (1,0,0), white (1,1,1) or yellow (1,1,0) seem to work. This image is with the colour set to red (1,0,0); this image is with the colour set to blue (0,0,1).
My question is, how do I resolve this? Is the issue in the shaders? Is the issue with how I build the BufferGeometry? Is it a bug? Thanks.
My shaders:
var vShader = `uniform float size;
varying vec3 vColor;
void main() {
    vColor = color;
    vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
    gl_PointSize = size * ( size / (length(mvPosition.xyz) + 0.00001) );
    gl_Position = projectionMatrix * mvPosition;
}`

var fShader = `varying vec3 vColor;
uniform sampler2D sprite;
void main() {
    gl_FragColor = vec4(vColor, 1.0 ) * texture2D( sprite, gl_PointCoord );
    if (gl_FragColor.x < 0.2) discard;
}`
My material:
var materialForBuffers = new THREE.ShaderMaterial( {
    uniforms: {
        size: { type: 'f', value: this.pointSize },
        sprite: { type: 't', value: THREE.ImageUtils.loadTexture("../data/white.png") },
    },
    vertexShader: vShader,
    fragmentShader: fShader,
    transparent: true,
    vertexColors: true,
});
How the color is added:
const colors = new Float32Array( [ 1.0, 0.0, 0.0 ] );
geometryForBuffers.addAttribute('color', new THREE.BufferAttribute( colors, 3 ));
Link to code
It looks like you may already be using parts of that sample code, but if not, please refer to https://github.com/petrbroz/forge-point-clouds/blob/develop/public/scripts/extensions/pointcloud.js (live demo: https://forge-point-clouds.autodesk.io). That sample already uses the color geometry attribute to specify the colours of individual points.
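One hedged observation about the posted fragment shader (not part of the linked sample): if (gl_FragColor.x < 0.2) discard; tests only the red channel, so a point coloured pure blue (0,0,1) is discarded entirely, which would match the symptom that only colours with a strong red component show up. A sketch of a discard test based on the sprite's alpha instead, assuming white.png has an alpha channel and the intent is just to cut away its transparent parts:

var fShader = `varying vec3 vColor;
uniform sampler2D sprite;
void main() {
    vec4 tex = texture2D( sprite, gl_PointCoord );
    // Discard based on the sprite's alpha rather than the output red channel,
    // so dark or blue vertex colours are not thrown away
    if ( tex.a < 0.2 ) discard;
    gl_FragColor = vec4( vColor, 1.0 ) * tex;
}`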

Does a shader material keep being rendered in Three.js, when no input value changed?

I'm programming shader materials on a smartphone with Three.js, so saving GPU performance is critical to me.
Situation:
Lots of planes serve as UI elements in my application. They play shader animations at startup and during many user interactions. After an animation finishes, those UI elements are in a "static" state (a "static" image composed by the shader), so there is no need to run the shader composition (multitexturing, program combination) again and again. If I can't stop that pointless rendering, it will consume a large share of GPU performance.
Questions:
As far as I know, when any input value of the shader material is changed, the WebGLRenderer will update the output material regardless of whether "needsUpdate" is false.
But if no uniform value has changed, will the WebGLRenderer render it again, or skip the rendering? This is question 1.
I've read the WebGLRenderer.js source with limited understanding, and I'm still not sure how shader materials are rendered. (From the code I believe it keeps on rendering, but I'm not certain.)
Question 2:
If the WebGLRenderer keeps rendering the shader material even when no input value has changed, is there a way to stop it from rendering the "unchanged" shader material, to save GPU performance?
Here's my simple test:
fragmentShader:
uniform float col;
void main() {
    gl_FragColor = vec4(col, 0.58, 0.06, 1.0);
}
vertexShader:
void main() {
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position.x+5.0, position.y, position.z+25.0, 1.0);
}
Javascript code:
var cube1, geo1, mat1;

function newbox2() {
    geo1 = new THREE.BoxGeometry(12, 4, 2);
    mat1 = new THREE.ShaderMaterial({
        uniforms: {
            col: { type: 'f', value: 1.0 },
        },
        vertexShader: "void main() { gl_Position = projectionMatrix * modelViewMatrix * vec4(position.x+5.0, position.y, position.z+25.0, 1.0); }",
        fragmentShader: "uniform float col; void main() { gl_FragColor = vec4(col, 0.58, 0.06, 1.0); }"
    });
    cube1 = new THREE.Mesh(geo1, mat1);
    scene.add(cube1);
    cube1.position.set(5, 5, 10);
}
Thanks.
If you have any kind of motion, animation, orbit controls and such, and a loop, the renderer will render everything every frame. Even your camera changing position is considered a change in the shader "input values" (uniforms).
The only way to achieve something like this is to not render again.
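A minimal sketch of on-demand rendering along those lines: render only when something is flagged as dirty, instead of unconditionally in the animation loop. The needsRender flag and the controls object (e.g. OrbitControls, which fires a 'change' event) are assumptions, not part of the question's code:

var needsRender = true;

// Anything that changes a uniform or moves the camera sets the flag
controls.addEventListener('change', function () { needsRender = true; });

function animate() {
    requestAnimationFrame(animate);
    if (!needsRender) return;   // skip the GPU work entirely while nothing has changed
    needsRender = false;
    renderer.render(scene, camera);
}
animate();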

Can't change opacity anymore when upgrading from Three.js r52 to r55

Basically I'm upgrading my app from r52 to r55. The app uses animations (tweens) to update lines, and also a ParticleSystem. Everything worked just fine in r52: scaling, moving and changing opacity.
I used these WebGLRenderer constructor settings:
clearColor: 0x1f1f1f
clearAlpha: 1
antialias: true
sortObjects: false
And a simple shader I took from the examples:
<script type="x-shader/x-vertex" id="vertexshader">
    attribute float size;
    attribute vec3 customColor;
    attribute float customOpacity;
    varying vec3 vColor;
    varying float vOpacity;
    void main() {
        vColor = customColor;
        vOpacity = customOpacity;
        vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
        gl_PointSize = size * ( 300.0 / length( mvPosition.xyz ) );
        gl_Position = projectionMatrix * mvPosition;
    }
</script>

<script type="x-shader/x-fragment" id="fragmentshader">
    uniform vec3 color;
    uniform sampler2D texture;
    varying vec3 vColor;
    varying float vOpacity;
    void main() {
        gl_FragColor = vec4( color * vColor, vOpacity );
        gl_FragColor = gl_FragColor * texture2D( texture, gl_PointCoord );
    }
</script>
I initialized the particle ShaderMaterial using:
blending : THREE.AdditiveBlending
depthTest : false
transparent : false
and the ParticleSystem by manually setting:
system.sortParticles = true
system.matrixAutoUpdate = true
system.visible = true
system.dynamic = true
So here is how it renders in Three.js r52:
Now I've read the Migration wiki page and concluded I only had to change a few names; nothing in the WebGLRenderer constructor, the materials or the shader attributes needed to change.
I've upgraded to r55 and now the visuals are broken:
Lines and particles are now all bright (opacity is not taken into account).
Moreover, the particles' alpha mask is now broken (if you look carefully the color is different, and there is a "square cut" where particles overlap, something I had in r52 and fixed by simply tuning the WebGLRenderer settings).
What could have changed? I tried changing settings in the WebGLRenderer constructor, alphas, background colors... but it didn't help.
Likely, you need to set your shader material to transparent:
material.transparent = true;
three.js r.55
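Equivalently, the flag can be set at construction time. A sketch matching the settings listed above with only the transparency flipped; the uniforms and attributes objects are assumed to be the ones already used for the particle system (attributes being the r55-era custom-attributes object):

var material = new THREE.ShaderMaterial({
    uniforms: uniforms,         // assumed: color and texture uniforms used by the shaders above
    attributes: attributes,     // assumed: size, customColor, customOpacity (r55-era custom attributes)
    vertexShader: document.getElementById('vertexshader').textContent,
    fragmentShader: document.getElementById('fragmentshader').textContent,
    blending: THREE.AdditiveBlending,
    depthTest: false,
    transparent: true           // was false; per the answer above, r55 needs this for the opacity to be honoured
});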

Shadow won't update when geometry is changed using VertexShader

I'm using a custom shader to curve a plane. My custom shader extends the Lambert shader so it supports lights and shadows. It all works as expected, but when the vertexShader changes the geometry of the plane, the shadow doesn't update. Is there anything I'm missing to flag that the geometry has updated in my vertexShader and the shadow needs to change?
[Here is a screenshot of the problem. The plane is curved with a vertexShader, but the shadow doesn't update][1]
[1]: http://i.stack.imgur.com/6kfCF.png
Here is the demo/code: http://dev.cartelle.nl/curve/
If you drag the "bendAngle" slider you can see that the shadow doesn't update.
One workaround I considered was to get the bounding box of the curved plane, then use those points to create a new mesh/box and let that object cast the shadow. But I wasn't sure how to get the coordinates of the new curved geometry; when I checked geometry.boundingBox after the shader was applied, it just gave me the original coordinates every time.
Thanks
Johnny
If you are modifying the geometry positions in the vertex shader, and you are casting shadows, you need to specify a custom depth material so the shadows will respond to the modified positions.
In your custom depth material's vertex shader, you modify the vertex positions in the same way you modified them in the material's vertex shader.
An example of a custom depth material can be seen in this three.js example (although in that example the vertices are not modified in the vertex shader; they are modified on the CPU).
In your case, you would create a vertex shader for the custom depth material using a pattern like so:
<script type="x-shader/x-vertex" id="vertexShaderDepth">
    uniform float bendAngle;
    uniform vec2 bounds;
    uniform float bendOffset;
    uniform float bendAxisAngle;
    vec3 bendIt( vec3 ip, float ba, vec2 b, float o, float a ) {
        // your code here
        return ip;
    }
    void main() {
        vec3 p = bendIt( position, bendAngle, bounds, bendOffset, bendAxisAngle );
        vec4 mvPosition = modelViewMatrix * vec4( p, 1.0 );
        gl_Position = projectionMatrix * mvPosition;
    }
</script>
And fragment shader like this:
<script type="x-shader/x-fragment" id="fragmentShaderDepth">
    vec4 pack_depth( const in float depth ) {
        const vec4 bit_shift = vec4( 256.0 * 256.0 * 256.0, 256.0 * 256.0, 256.0, 1.0 );
        const vec4 bit_mask = vec4( 0.0, 1.0 / 256.0, 1.0 / 256.0, 1.0 / 256.0 );
        vec4 res = fract( depth * bit_shift );
        res -= res.xxyz * bit_mask;
        return res;
    }
    void main() {
        gl_FragData[ 0 ] = pack_depth( gl_FragCoord.z );
    }
</script>
Then in your javascript, you specify the custom depth material:
uniforms = {};
uniforms.bendAngle = { type: "f", value: properties.bendAngle };
uniforms.bendOffset = { type: "f", value: properties.offset };
uniforms.bendAxisAngle = { type: "f", value: properties.bendAxisAngle };
uniforms.bounds = { type: "v2", value: new THREE.Vector2( - 8, 16 ) };
var vertexShader = document.getElementById( 'vertexShaderDepth' ).textContent;
var fragmentShader = document.getElementById( 'fragmentShaderDepth' ).textContent;
myObject.customDepthMaterial = new THREE.ShaderMaterial( {
    uniforms: uniforms,
    vertexShader: vertexShader,
    fragmentShader: fragmentShader
} );
three.js r.74
