Perpendicular falloff material - three.js

I want to make a falloff semi-transparent shader: opaque where normals are perpendicular to the camera direction, and transparent where normals face the camera. Here is the code I'm using so far:
vec3 vertexNormal = normalize( normalMatrix * normal );
vec3 viewDir = vec3( 0.0, 0.0, 1.0 );
float dotProd = dot(vertexNormal, viewDir);
alpha = abs( 1.0 - dotProd );
It works, but when an object is not in the center of the camera view the falloff is no longer consistent: the side farther from the view center has a larger falloff:
Falloff larger towards edge of camera view
Is there a way to get a consistent falloff thickness across the whole camera view (each sphere would still be distorted by perspective, but the falloff contour would look the same everywhere)?
Thanks in advance!

Unless you're using an orthographic camera, your view dir is incorrect.
Try
vec4 vp = modelViewMatrix * vec4( position, 1.0 );
vec3 viewDir = -normalize( vp.xyz );
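To see why the constant view direction only holds for an orthographic camera, the same math can be checked on the CPU. A minimal sketch in plain JavaScript (the helper names and sample positions are made up for illustration):

```javascript
function normalize([x, y, z]) {
  const len = Math.hypot(x, y, z);
  return [x / len, y / len, z / len];
}
function dot(a, b) { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }

// alpha = abs(1.0 - dotProd), as in the question
function falloffAlpha(normal, viewDir) {
  return Math.abs(1.0 - dot(normalize(normal), normalize(viewDir)));
}

// A vertex off to the side in view space; the camera sits at the
// origin in view space, and this normal points straight back at it.
const viewPos = [5, 0, -10];
const normal = normalize(viewPos.map(c => -c));

// Constant view dir, only valid for an orthographic camera:
const aConst = falloffAlpha(normal, [0, 0, 1]);           // ~0.106: spurious falloff

// Per-vertex view dir, as in the answer (-normalize(vp.xyz)):
const aTrue = falloffAlpha(normal, viewPos.map(c => -c)); // 0.0: fully transparent
```

With the per-vertex direction a camera-facing normal is fully transparent wherever it sits on screen, which is exactly the consistency the question asks for.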

Related

How can I color points in Three JS using OpenGL and Fragment Shaders to depend on the points' distance to the scene origin

To clarify, I am using React, React Three Fiber, and Three.js.
I have 1000 points mapped into the shape of a disc, and I would like to give them texture via a ShaderMaterial, which takes a vertexShader and a fragmentShader. I want the colors of the points to transition in a gradient from blue to red: the farthest points blue, and the points closest to the origin red.
This is the vertexShader:
const vertexShader = `
  uniform float uTime;
  uniform float uRadius;

  varying float vDistance;

  void main() {
    vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
    vDistance = length( mvPosition.xyz );
    gl_Position = projectionMatrix * mvPosition;
    gl_PointSize = 5.0;
  }
`
export default vertexShader
And here is the fragmentShader:
const fragmentShader = `
  uniform float uDistance[1000];

  varying float vDistance;

  void main() {
    // Distance of the fragment from the center of the point
    float d = 1.0 - length( gl_PointCoord - vec2( 0.5, 0.5 ) );

    // Fade the fragment's alpha based on that distance
    float alpha = smoothstep( 0.45, 0.55, d );

    // Blend the point color between red and blue based on the
    // point's distance from the origin
    vec3 color = mix( vec3( 1.0, 0.0, 0.0 ), vec3( 0.0, 0.0, 1.0 ), vDistance );

    gl_FragColor = vec4( color, alpha );
  }
`
export default fragmentShader
At first I tried solving the problem by passing in an array of normalized distances for every point, but I now realize a point has no way of knowing which array index holds its own distance.
The main thing I am confused about is how gl_FragColor works. In the example linked, the idea is that every point gets a vDistance from the vertex shader and uses that value to pick a unique color for itself in the fragment shader.
So far I have only succeeded in making all of the points the same color; they do not seem to differ based on distance at all.
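The "all points the same color" symptom follows from the arithmetic: vDistance = length(mvPosition.xyz) is the distance to the camera, typically tens of units, so the mix factor is far outside [0, 1] for every point and every output clamps to the same end of the gradient. A quick check of that arithmetic in plain JavaScript (assuming, as its name suggests, that uRadius holds the disc radius and could serve as the normalizer; the numbers are illustrative):

```javascript
// GLSL-style helpers: mix() is linear interpolation, colors clamp on output.
function mix(a, b, t) { return a.map((v, i) => v + (b[i] - v) * t); }
const clamp01 = v => Math.min(1, Math.max(0, v));

const red = [1, 0, 0];
const blue = [0, 0, 1];

// With vDistance = distance to the CAMERA (say ~25 units), the mix
// factor is >> 1, so every point saturates to the same blue:
const raw = mix(red, blue, 25.0).map(clamp01);  // [0, 0, 1] for all points

// Dividing by the disc radius (and measuring from the origin, e.g.
// length(position) in model space) restores the gradient:
const t = clamp01(7.5 / 15.0);                  // e.g. distance 7.5, radius 15
const graded = mix(red, blue, t);               // [0.5, 0, 0.5]
```

So the likely fix is to compute vDistance from `position` (model space) rather than mvPosition, and to pass `vDistance / uRadius` (or divide in the fragment shader) so the mix factor stays in [0, 1].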

Three.js shader pointLight position

I'm trying to write my own shader and add light sources; I more or less figured it out and got it working.
But there is a problem: the position of the light source is determined incorrectly, and when the camera rotates or moves something unimaginable happens.
So I get the position in the vertex shader:
vec3 vGlobalPosition = ( modelMatrix * vec4( position, 1.0 ) ).xyz;
Now I'm trying to make an illuminated area:
float lightDistance = pointLights[ i ].distance;
vec3 lightPosition = pointLights[ i ].position;
float diffuseCoefficient = max(
    1.0 - ( distance( lightPosition, vGlobalPosition ) / lightDistance ), 0.0 );
gl_FragColor.rgb += color.rgb * diffuseCoefficient;
But as I wrote earlier, if you rotate the camera the illuminated area moves around.
When I set the light position manually, everything became normal:
vec3 lightPosition = vec3(2000,0,2000);
...
The question is: how do I get the right position of the light source? I need a global position; I don't know what space the position stored in the light source is in.
Added an example: http://codepen.io/korner/pen/XMzEaG
Your problem lies with vPos. Currently you do:
vPos = (modelMatrix * vec4(position, 1.0)).xyz
Instead you need to multiply the position with modelViewMatrix:
vPos = (modelViewMatrix * vec4(position, 1.0)).xyz;
You need to use modelViewMatrix because PointLight.position is relative to the camera.
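The space mismatch can be illustrated off-GPU: a view transform is rigid, so distances agree when both points live in the same space, but comparing a view-space light against a world-space vertex gives camera-dependent results. A sketch in plain JavaScript (the yaw rotation stands in for the camera's view matrix; none of this is three.js API):

```javascript
// A yaw rotation standing in for the rotational part of the view matrix.
function rotY(a) {
  const c = Math.cos(a), s = Math.sin(a);
  return p => [c * p[0] + s * p[2], p[1], -s * p[0] + c * p[2]];
}
const dist = (a, b) => Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);

const worldLight = [2000, 0, 2000];
const worldVertex = [1900, 0, 2100];
const d0 = dist(worldLight, worldVertex);      // true world-space distance

const view = rotY(0.7);                        // rotate the camera

// Both points in view space (modelViewMatrix vertex, view-space light):
const dView = dist(view(worldLight), view(worldVertex));  // equals d0

// Mixed spaces (world-space vertex vs view-space light):
const dMixed = dist(view(worldLight), worldVertex);       // changes as the camera moves
```

That is exactly why the lit area wandered when the camera rotated: the vertex was in world space while PointLight.position arrives in view space, so the comparison changed with every camera move.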

THREE.JS GLSL sprite always front to camera

I'm creating a glow effect for car stop lights and found a shader that makes the glow always face the camera:
uniform vec3 viewVector;
uniform float c;
uniform float p;
varying float intensity;
void main() {
  vec3 vNormal = normalize( normalMatrix * normal );
  vec3 vNormel = normalize( normalMatrix * -viewVector );
  intensity = pow( c - dot( vNormal, vNormel ), p );
  gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
This solution is quite simple and almost works: it reacts to camera movement, which is great. BUT this element is a child of a car. The car itself moves around, and when it rotates, the material stops pointing directly at the camera.
I don't want to use SpritePlugin or LensFlarePlugin because they slow down my game by 20fps so I'll stick to this lightweight solution.
I found a solution for Direct3D where you remove the rotation data from the transformation matrix, but I don't know how to do this in THREE.js.
I guess that instead of adding calculations for the car's transformation there must be a way to simplify this shader.
How to simplify this shader so the material always faces the camera?
From the link below: "To do spherical billboarding, just remove all rotations by setting the identity matrix". How do I do that with ShaderMaterial in THREE.js?
http://www.geeks3d.com/20140807/billboarding-vertex-shader-glsl/
The problem here, I think, is intercepting the transformation matrix from ShaderMaterial before it's passed to the shader, but I'm not sure.
Probably irrelevant, but here's the fragment shader too:
uniform vec3 glowColor;
varying float intensity;
void main() {
  vec3 glow = glowColor * intensity;
  gl_FragColor = vec4( glow, 1.0 );
}
edit: for now I found a workaround: eliminating the parent's rotation influence by setting the opposite quaternion. Not perfect, and it happens on the CPU rather than the GPU:
this.quaternion._x = -this.parent.quaternion._x;
this.quaternion._y = -this.parent.quaternion._y;
this.quaternion._z = -this.parent.quaternion._z;
this.quaternion._w = -this.parent.quaternion._w;
Are you looking for an implementation of billboarding (making a 2D sprite always face the camera)? If so, all you need to do is this:
"vec3 billboard(vec2 v, mat4 view){",
" vec3 up = vec3(view[0][1], view[1][1], view[2][1]);",
" vec3 right = vec3(view[0][0], view[1][0], view[2][0]);",
" vec3 p = right * v.x + up * v.y;",
" return p;",
"}"
v is the offset from the center: basically the 4 vertices of a plane that faces the z-axis, e.g. (1.0, 1.0), (1.0, -1.0), (-1.0, 1.0), and (-1.0, -1.0).
Use it like so:
"vec3 worldPos = billboard(a_offset, u_view);"
// then do whatever else.
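The two vectors being read there are just the camera's right and up axes: in GLSL, view[c][r] indexes column c, row r of a column-major matrix, so those expressions pull out rows 0 and 1 of the view rotation, which are the camera axes expressed in world space. A CPU-side sketch of the same extraction (3x3 row-major here, purely for illustration):

```javascript
// Rebuild each corner from the camera's own right/up axes, so the quad
// always spans the plane facing the camera.
function billboard([vx, vy], view) {  // view: 3x3 rotation, row-major
  const right = view[0];              // camera right axis in world space
  const up = view[1];                 // camera up axis in world space
  return [
    right[0] * vx + up[0] * vy,
    right[1] * vx + up[1] * vy,
    right[2] * vx + up[2] * vy,
  ];
}

// Identity view: the corner lands in the world XY plane, facing +Z.
const I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]];
const flat = billboard([1, -1], I);        // [1, -1, 0]

// Camera yawed 90 degrees (rows are the rotated camera axes): the same
// corner now spans the YZ plane, so the quad still faces the camera.
const yaw90 = [[0, 0, -1], [0, 1, 0], [1, 0, 0]];
const turned = billboard([1, -1], yaw90);  // [0, -1, -1]
```

Because the axes come from the view matrix alone, the parent car's rotation never enters the result, which is what makes this approach immune to the problem in the question.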

Three.js, custom shader and png texture with transparency

I have an extremely simple PNG texture: a grey circle with a transparent background.
I use it as a uniform map for a THREE.ShaderMaterial:
var uniforms = THREE.UniformsUtils.merge( [basicShader.uniforms] );
uniforms['map'].value = THREE.ImageUtils.loadTexture( "img/particle.png" );
uniforms['size'].value = 100;
uniforms['opacity'].value = 0.5;
uniforms['psColor'].value = new THREE.Color( 0xffffff );
Here is my fragment shader (just part of it):
gl_FragColor = vec4( psColor, vOpacity );
gl_FragColor = gl_FragColor * texture2D( map,vec2( gl_PointCoord.x, 1.0 - gl_PointCoord.y ) );
gl_FragColor = gl_FragColor * vec4( vColor, 1.0 );
I applied the material to some particles (a THREE.PointCloud mesh) and it works quite well:
But if I turn the camera more than 180 degrees, I see this:
I understand that the fragment shader is not correctly taking into account the alpha value of the PNG texture.
What is the best approach in this case, to get the right color and opacity (from custom attributes) and still get the alpha right from the PNG?
And why is it behaving correctly on one side?
Transparent objects must be rendered from back to front -- from furthest to closest. This is because of the depth buffer.
But PointCloud particles are not sorted based on distance from the camera. That would be too inefficient. The particles are always rendered in the same order, regardless of the camera position.
You have several work-arounds.
The first is to discard fragments for which the alpha is low. You can use a pattern like so:
if ( textureColor.a < 0.5 ) discard;
Another option is to set material.depthTest = false or material.depthWrite = false. You might not like the side effects, however, if you have other objects in the scene.
three.js r.71
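A third option, implied by the back-to-front rule above, is to sort the particles yourself each frame before drawing. A minimal sketch in plain JavaScript (plain arrays here; with three.js you would reorder the geometry's position attribute, and the names are illustrative):

```javascript
// Squared distance is enough for ordering; it avoids the sqrt.
const dist2 = (p, c) => (p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 + (p[2] - c[2]) ** 2;

function sortBackToFront(positions, cameraPos) {
  // Copy, then sort so the particle farthest from the camera comes first.
  return [...positions].sort((a, b) => dist2(b, cameraPos) - dist2(a, cameraPos));
}

const camera = [0, 0, 10];
const particles = [[0, 0, 0], [0, 0, 9], [0, 0, -5]];
const ordered = sortBackToFront(particles, camera);
// [[0, 0, -5], [0, 0, 0], [0, 0, 9]]: farthest first
```

This is the work the renderer skips for point clouds; doing it per frame costs CPU time, which is why the discard and depthWrite workarounds are usually tried first.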

How to get fullscreen texture coordinates for a fullscreen texture from a previous rendering pass?

I do two rendering passes in a WebGL application using three.js (contrived example here):
renderer.render(depthScene, camera, depthTarget);
renderer.render(scene, camera);
The first rendering pass is to the render target depthTarget which I want to access in the second rendering pass as a texture uniform:
uniform sampler2D tDepth;
float unpack_depth( const in vec4 rgba_depth ) { ... }
void main() {
  vec2 screenTexCoord = vec2( 1.0, 1.0 );
  float depth = 1.0 - unpack_depth( texture2D( tDepth, screenTexCoord ) );
  gl_FragColor = vec4( vec3( depth ), 1.0 );
}
My question is how do I get the value for screenTexCoord? It is not gl_FragCoord.xy.
To avoid a possible misunderstanding: I don't want to render the texture from the first pass to a quad. I want to use the texture from the first pass while rendering the geometry in the second pass.
EDIT:
According to the WebGL specification gl_FragCoord contains window coordinates which are normalized device coordinates (ndc) scaled by the viewport. The ndc are within [-1, 1] so the following should yield coordinates within [0, 1] for texture lookup:
vec2 ndcXY = gl_FragCoord.xy / vec2( viewWidth, viewHeight );
vec2 screenTexCoord = (ndcXY+1.0)/2.0;
But I must be wrong somewhere, because the updated example still does not show the (packed) depth?!
I finally figured it out myself. The correct way to calculate the texture coordinates is simply:
vec2 screenTexCoord = gl_FragCoord.xy / vec2( viewWidth, viewHeight );
See a working example here.
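The final mapping can be sanity-checked outside the shader: gl_FragCoord.xy is already in window (pixel) coordinates, so dividing by the viewport size lands directly in [0, 1], with no NDC remapping step needed. A trivial JavaScript sketch (viewWidth and viewHeight mirror the uniforms used above):

```javascript
// Map window-space fragment coordinates to [0, 1] texture coordinates.
function screenTexCoord([fragX, fragY], viewWidth, viewHeight) {
  return [fragX / viewWidth, fragY / viewHeight];
}

// gl_FragCoord samples at pixel centers, hence the half-pixel offsets:
const center = screenTexCoord([400.5, 300.5], 800, 600); // ~[0.5006, 0.5008]
const corner = screenTexCoord([0.5, 0.5], 800, 600);     // ~[0.0006, 0.0008]
```

The earlier attempt failed because it applied the NDC-to-[0,1] remap a second time to values that were already in pixel space.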
