How do I scale the size of points in Three JS based on distance to camera? - three.js

I am using React, Three.js, and React Three Fiber. I have a bunch of points in my scene arranged in the shape of a sphere, but I have a problem where the points on the far side are more visible than the ones closest to the camera. My best guess is that either the color is being calculated incorrectly based on distance, because the points become white if I zoom out too much, or it's the fact that they do not become smaller as I zoom out.
Here is how I set up the scene and camera with R3F:
function App() {
  return (
    <Canvas className="App" style={{ width: innerWidth, height: innerHeight }}>
      <OrbitControls makeDefault enableDamping={true} enablePan={false} />
      <PerspectiveCamera makeDefault position={[0, 4, 21]} aspect={innerWidth / innerHeight} near={1} far={1000} fov={60} />
      <ExampleShader />
      <SphereFloaters />
      <axesHelper />
    </Canvas>
  )
}
For the sphere component, I am setting up a bufferGeometry on the JavaScript side and using a shaderMaterial to pass along a vertex and a fragment shader:
...
<shaderMaterial
  depthWrite={false}
  depthTest={false}
  transparent={true}
  fragmentShader={fragmentShader}
  vertexShader={vertexShader}
  uniforms={uniforms}
  blending={THREE.AdditiveBlending}
/>
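Worth noting, as an assumption on my part rather than something verified against the original scene: with depthTest={false}, every point is drawn regardless of occlusion, and THREE.AdditiveBlending makes overlapping points sum toward white, which would match both symptoms described above. A minimal variant to test, enabling depth writes/tests and using normal blending, keeping everything else the same:
<shaderMaterial
  depthWrite={true}
  depthTest={true}
  transparent={true}
  fragmentShader={fragmentShader}
  vertexShader={vertexShader}
  uniforms={uniforms}
  blending={THREE.NormalBlending}
/>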
I am setting the color gradient inside the vertex shader, as well as setting the point size from an attribute, in the following manner. I should add that I don't think the problem is here, because the points should still appear smaller as I zoom out, since I am using a PerspectiveCamera:
// set the vertex color
float d = length((position)/vec3(96.0, 35.0, 96.0));
d = clamp(d, 0.0, 1.0);
vColor = mix(vec3(227.0, 155.0, 0.0), vec3(100.0, 50.0, 255.0), d) / 255.;
...
gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4( transformedPosition, 1.0 );
gl_PointSize = pointSizes;
vRealPosition = gl_Position;
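For the actual size-versus-distance question in the title: a common pattern (essentially what three.js's built-in PointsMaterial does when sizeAttenuation is enabled) is to divide the point size by the view-space depth. A minimal sketch, where uScale is a hypothetical new uniform you would add (e.g. half the canvas height in pixels):
// compute the view-space position once, so its depth can drive attenuation
vec4 mvPosition = viewMatrix * modelMatrix * vec4( transformedPosition, 1.0 );
gl_Position = projectionMatrix * mvPosition;
// -mvPosition.z is the distance in front of the camera;
// dividing by it makes points shrink as they recede
gl_PointSize = pointSizes * ( uScale / -mvPosition.z );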
And the fragment shader:
void main() {
  // distance of this fragment from the center of the point sprite
  float d = length(gl_PointCoord.xy - 0.5);
  // fade alpha out toward the edge for a soft, round point
  float alpha = smoothstep(0.5, 0.01, d);
  gl_FragColor = vec4( vColor, alpha );
}

Related

Three JS Perspective Camera not making distant objects appear smaller [duplicate]


How can I make waves from the center of a Plane Geometry in Three.JS using the vertex shader?

I've been learning Three.js and I can't seem to wrap my head around shaders. I have an idea of what I want, and I know the mathematical tools within GLSL and what they do in simple terms, but I don't understand how they work together.
I have a plane geometry with a shader material. I want to create waves that radiate from the center of the plane in the vertex shader, but I am unsure how to accomplish this.
Also, if there is a course or documentation you can point me to that explains simple concepts regarding vertex and fragment shaders, that would be great!
This is what I have done so far:
varying vec2 vUv;
varying float vuTime;
varying float vElevation;
uniform float uTime;

void main() {
  vec4 modelPosition = modelMatrix * vec4(position, 1.0);
  float elevation = sin(modelPosition.x * 10.0 - uTime) * 0.1;
  modelPosition.y += elevation;

  vec4 viewPosition = viewMatrix * modelPosition;
  vec4 projectedPosition = projectionMatrix * viewPosition;
  gl_Position = projectedPosition;

  vuTime = uTime;
  vUv = uv;
  vElevation = elevation;
}
I have set up a simple animation using the sin function and a time variable passed to the shader, which creates a simple wave effect without the use of noise. I am trying to create a circular wave stemming from the center of the plane geometry.
What I THINK I have to do is use PI to offset the position away from the center while the wave is moving with uTime. To get to the center of the plane geometry, I need to offset the position by a 0.5 float.
That is my understanding right now, and I would love to know if I'm correct in my thinking, or what a correct way of accomplishing this is.
I am also passing the varying variables to the fragment shader to control the color at the elevation.
Thanks for any help you guys provide; I appreciate it!
In your shader code, try to change this line
float elevation = sin(modelPosition.x * 10.0 - uTime) * 0.1;
to this
float elevation = sin(length(modelPosition.xz) * 10.0 - uTime) * 0.1;
You can use either UV coords or position. Here is a complete example using UVs:
let scene = new THREE.Scene();
let camera = new THREE.PerspectiveCamera(60, innerWidth / innerHeight, 1, 1000);
camera.position.set(0, 10, 10).setLength(10);
let renderer = new THREE.WebGLRenderer();
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

let controls = new THREE.OrbitControls(camera, renderer.domElement);

scene.add(new THREE.GridHelper(10, 10, "magenta", "yellow"));

let g = new THREE.PlaneGeometry(10, 10, 50, 50);
let m = new THREE.ShaderMaterial({
  wireframe: true,
  uniforms: {
    time: { value: 0 },
    color: { value: new THREE.Color("aqua") }
  },
  vertexShader: `
    #define PI 3.1415926
    #define PI2 PI*2.
    uniform float time;
    void main() {
      vec3 pos = position;
      // radial distance from the center of UV space (0.5, 0.5) drives the wave
      pos.z = sin((length(uv - 0.5) - time) * 6. * PI2);
      gl_Position = projectionMatrix * modelViewMatrix * vec4(pos, 1.);
    }
  `,
  fragmentShader: `
    uniform vec3 color;
    void main() {
      gl_FragColor = vec4(color, 1.);
    }
  `
});
let o = new THREE.Mesh(g, m);
o.rotation.x = -Math.PI * 0.5;
scene.add(o);

let clock = new THREE.Clock();
renderer.setAnimationLoop(() => {
  let t = clock.getElapsedTime();
  m.uniforms.time.value = t * 0.1;
  renderer.render(scene, camera);
});
body {
  overflow: hidden;
  margin: 0;
}
<script src="https://threejs.org/build/three.min.js"></script>
<script src="https://threejs.org/examples/js/controls/OrbitControls.js"></script>

Glow effect shader works on desktop but not on mobile

I'm currently working on a personal project to generate a planet using procedural methods. The problem is that I am trying to achieve a glow effect using GLSL. The intended effect works on desktop but not on mobile.
The following links illustrate the problem:
Intended Effect
iPhone6S result
The planet is composed as follows: four IcosahedronBufferGeometry meshes making up earth, water, cloud, and the glow effect. I have tried disabling the glow effect, and then it works as intended on mobile. Therefore, the conclusion is that the problem lies within the glow effect.
Here is the code for the glow effect (vertex and fragment shaders):
Vertex shader:
varying float intensity;

void main() {
  /* Calculates dot product of the view vector (cameraPosition) and the normal */
  /* High value exponent = less intense since dot product is always less than 1 */
  vec3 vNormal = vec3(modelMatrix * vec4(normal, 0.0));
  intensity = pow(0.2 - dot(normalize(cameraPosition), vNormal), 2.8);
  gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
Fragment Shader
varying float intensity;

void main() {
  vec3 glow = vec3(255.0/255.0, 255.0/255.0, 255.0/255.0) * intensity;
  gl_FragColor = vec4( glow, 1.0 );
}
THREE.js Code
var glowMaterial, glowObj, glowUniforms, sUniforms;
sUniforms = sharedUniforms();

/* Uniforms */
glowUniforms = {
  lightPos: {
    type: sUniforms["lightPos"].type,
    value: sUniforms["lightPos"].value,
  }
};

/* Material */
glowMaterial = new THREE.ShaderMaterial({
  uniforms: THREE.UniformsUtils.merge([
    THREE.UniformsLib["ambient"],
    THREE.UniformsLib["lights"],
    glowUniforms
  ]),
  vertexShader: glow_vert,
  fragmentShader: glow_frag,
  lights: true,
  side: THREE.FrontSide,
  blending: THREE.AdditiveBlending,
  transparent: true
});

/* Add object to scene */
glowObj = new THREE.Mesh(new THREE.IcosahedronBufferGeometry(35, 4), glowMaterial);
scene.add(glowObj);
There are no error/warning messages in the console, on either desktop or mobile, using the remote web inspector. As shown above, on mobile the glow is plain white, while on desktop the intensity/color/opacity of the material driven by the dot product in the vertex shader works as intended.
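One hedged observation, since nothing in the post confirms it: in GLSL ES, pow(x, y) is undefined for x < 0.0, and 0.2 - dot(normalize(cameraPosition), vNormal) can easily go negative. Desktop drivers tend to be forgiving about this, while mobile GPUs may return NaN or clamp, which could produce exactly the plain white glow seen here. A minimal guarded version of the vertex shader (the max() clamp and the extra normalize are my additions):
varying float intensity;

void main() {
  // normalize after transforming: modelMatrix can change the normal's length
  vec3 vNormal = normalize(vec3(modelMatrix * vec4(normal, 0.0)));
  // clamp the base: pow() is undefined for negative bases in GLSL ES
  float base = max(0.2 - dot(normalize(cameraPosition), vNormal), 0.0);
  intensity = pow(base, 2.8);
  gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}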

Lost fragments in my shader

I'm trying to build a tile system in Three.js: green for ground, blue for water.
I'm using a shader on a PlaneBufferGeometry.
Here is what I have so far :
Relevant code :
JS: variable chunk and function DoPlaneStuff() (both at the beginning)
HTML: vertex and fragment shader
var chunk = {
  // number of width and height segments for PlaneBuffer
  segments: 32,
  // Heightmap: 0 = water, 1 = ground
  heightmap: [
    [1, 0, 0],
    [1, 1, 0],
    [1, 0, 1],
  ],
  // size of the plane
  size: 40
};

function DoPlaneStuff() {
  var uniforms = {
    heightmap: {
      type: "iv1",
      // transform the 2d array into a flat array
      value: chunk.heightmap.reduce((p, c) => p.concat(c), [])
    },
    hmsize: {
      type: "f",
      value: chunk.heightmap[0].length
    },
    coord: {
      type: "v2",
      value: new THREE.Vector2(-chunk.size / 2, -chunk.size / 2)
    },
    size: {
      type: "f",
      value: chunk.size
    }
  };
  console.info("UNIFORMS GIVEN :", uniforms);
  var shaderMaterial = new THREE.ShaderMaterial({
    uniforms: uniforms,
    vertexShader: document.getElementById("v_shader").textContent,
    fragmentShader: document.getElementById("f_shader").textContent
  });
  var plane = new THREE.Mesh(
    new THREE.PlaneBufferGeometry(chunk.size, chunk.size, chunk.segments, chunk.segments),
    shaderMaterial
  );
  plane.rotation.x = -Math.PI / 2;
  scene.add(plane);
}
// --------------------- END OF RELEVANT CODE

window.addEventListener("load", Init);

function Init() {
  Init3dSpace();
  DoPlaneStuff();
  Render();
}

var camera_config = {
  dist: 50,
  angle: (5 / 8) * (Math.PI / 2)
}

var scene, renderer, camera;

function Init3dSpace() {
  scene = new THREE.Scene();
  renderer = new THREE.WebGLRenderer({
    antialias: true,
    logarithmicDepthBuffer: true
  });
  camera = new THREE.PerspectiveCamera(
    50,
    window.innerWidth / window.innerHeight,
    0.1,
    1000
  );
  this.camera.position.y = camera_config.dist * Math.sin(camera_config.angle);
  this.camera.position.x = 0;
  this.camera.position.z = 0 + camera_config.dist * Math.cos(camera_config.angle);
  this.camera.rotation.x = -camera_config.angle;
  var light = new THREE.HemisphereLight(0xffffff, 10);
  light.position.set(0, 50, 0);
  scene.add(light);
  renderer.setSize(window.innerWidth, window.innerHeight);
  document.body.appendChild(renderer.domElement);
}

function Render() {
  renderer.render(scene, camera);
}
body {
  overflow: hidden;
  margin: 0;
}
<script src="//cdnjs.cloudflare.com/ajax/libs/three.js/r70/three.min.js"></script>
<!-- VERTEX SHADER -->
<script id="v_shader" type="x-shader/x-vertex">
// size of the plane
uniform float size;
// coordinates of the geometry
uniform vec2 coord;
// heightmap size (=width and height of the heightmap)
uniform float hmsize;
uniform int heightmap[9];
varying float colorValue;
void main() {
int xIndex = int(floor(
(position.x - coord.x) / (size / hmsize)
));
int yIndex = int(floor(
(-1.0 * position.y - coord.y) / (size / hmsize)
));
// Get the index of the corresponding tile in the array
int index = xIndex + int(hmsize) * yIndex;
// get the value of the tile
colorValue = float(heightmap[index]);
gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
</script>
<!-- FRAGMENT SHADER -->
<script id="f_shader" type="x-shader/x-fragment">
varying float colorValue;
void main() {
// default color is something is not expected: RED
gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
// IF WATER
if (colorValue == 0.0) {
// BLUE
gl_FragColor = vec4( 0.0, 0.0, 1.0, 1.0 );
}
// IF GROUND
if (colorValue == 1.0) {
// GREEN
gl_FragColor = vec4( 0.1, 0.6, 0.0, 1.0 );
}
}
</script>
As you can see it's almost working, but I have these red lines splitting the green and blue areas, and I can't figure out why.
I call these red fragments the "lost ones" because they don't map to any tile, and I can't work out why they appear.
I could only notice that with a greater value of chunk.segments (the number of height and width segments for the geometry) the red lines get thinner.
I would like to know how to have a gradient fill between the green and blue zones instead of red.
The red lines are formed by triangles that have some vertices lying in a ground tile and other vertices in a water tile. The GPU then interpolates the colorValue along the triangle, producing a smooth gradient with values from 0 to 1, instead of a sharp step that you probably expect.
There are several solutions for this. You can change the condition in your shader to choose the color based on the midpoint: if colorValue < 0.5, output blue, otherwise green. That won't work well if you decide you want more tile types later on, though. A better solution would be to generate your geometry in a way that all vertices of each triangle lie in a single tile; that will involve doubling up vertices that lie on the tile boundaries. You can also add the flat interpolation qualifier to colorValue, but it's harder to control which vertex's attribute the triangle will end up using.
... I just noticed that you do want a gradient instead of a sharp step. That's even easier: move the color selection code from the fragment shader to the vertex shader and just output the resulting interpolated color in the fragment shader.
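For illustration, a minimal sketch of that last suggestion, keeping the tile lookup exactly as in the question and moving the color choice into the vertex shader (vColor is a new varying introduced here):
// vertex shader: pick the color per vertex; the rasterizer interpolates it
uniform float size;
uniform vec2 coord;
uniform float hmsize;
uniform int heightmap[9];
varying vec3 vColor;

void main() {
  int xIndex = int(floor((position.x - coord.x) / (size / hmsize)));
  int yIndex = int(floor((-1.0 * position.y - coord.y) / (size / hmsize)));
  float colorValue = float(heightmap[xIndex + int(hmsize) * yIndex]);
  // 0.0 = water (blue), 1.0 = ground (green); boundary triangles now blend
  vColor = mix(vec3(0.0, 0.0, 1.0), vec3(0.1, 0.6, 0.0), colorValue);
  gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}

// fragment shader: just output the interpolated color
varying vec3 vColor;

void main() {
  gl_FragColor = vec4( vColor, 1.0 );
}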

Shadow won't update when geometry is changed using VertexShader

I'm using a custom shader to curve a plane. My custom shader extends the Lambert shader, so it supports lights and shadows. It all works as expected, but when the vertex shader changes the geometry of the plane, the shadow doesn't update. Is there anything I'm missing to flag that the geometry has been updated in my vertex shader and that the shadow needs to change?
[Here is a screenshot of the problem. The plane is curved with a vertexShader, but the shadow doesn't update][1]
[1]: http://i.stack.imgur.com/6kfCF.png
Here is the demo/code: http://dev.cartelle.nl/curve/
If you drag the "bendAngle" slider you can see that the shadow doesn't update.
One work-around I thought of was to get the bounding box of my curved plane, then use those points to create a new Mesh/Box and have that object cast the shadow. But then I wasn't sure how to get the coordinates of the new curved geometry: when I checked geometry.boundingBox after the shader was applied, it just gave me the original coordinates every time.
Thanks
Johnny
If you are modifying the geometry positions in the vertex shader, and you are casting shadows, you need to specify a custom depth material so the shadows will respond to the modified positions.
In your custom depth material's vertex shader, you modify the vertex positions in the same way you modified them in the material's vertex shader.
An example of a custom depth material can be seen in this three.js example (although vertices are not modified in the vertex shader in that example; they are modified on the CPU).
In your case, you would create a vertex shader for the custom depth material using a pattern like so:
<script type="x-shader/x-vertex" id="vertexShaderDepth">
uniform float bendAngle;
uniform vec2 bounds;
uniform float bendOffset;
uniform float bendAxisAngle;
vec3 bendIt( vec3 ip, float ba, vec2 b, float o, float a ) {
// your code here
return ip;
}
void main() {
vec3 p = bendIt( position, bendAngle, bounds, bendOffset, bendAxisAngle );
vec4 mvPosition = modelViewMatrix * vec4( p, 1.0 );
gl_Position = projectionMatrix * mvPosition;
}
</script>
And fragment shader like this:
<script type="x-shader/x-fragment" id="fragmentShaderDepth">
vec4 pack_depth( const in float depth ) {
const vec4 bit_shift = vec4( 256.0 * 256.0 * 256.0, 256.0 * 256.0, 256.0, 1.0 );
const vec4 bit_mask = vec4( 0.0, 1.0 / 256.0, 1.0 / 256.0, 1.0 / 256.0 );
vec4 res = fract( depth * bit_shift );
res -= res.xxyz * bit_mask;
return res;
}
void main() {
gl_FragData[ 0 ] = pack_depth( gl_FragCoord.z );
}
</script>
Then in your javascript, you specify the custom depth material:
uniforms = {};
uniforms.bendAngle = { type: "f", value: properties.bendAngle };
uniforms.bendOffset = { type: "f", value: properties.offset };
uniforms.bendAxisAngle = { type: "f", value: properties.bendAxisAngle };
uniforms.bounds = { type: "v2", value: new THREE.Vector2( - 8, 16 ) };
var vertexShader = document.getElementById( 'vertexShaderDepth' ).textContent;
var fragmentShader = document.getElementById( 'fragmentShaderDepth' ).textContent;
myObject.customDepthMaterial = new THREE.ShaderMaterial( {
uniforms: uniforms,
vertexShader: vertexShader,
fragmentShader: fragmentShader
} );
three.js r.74
