Three JS Perspective Camera not making distant objects appear smaller [duplicate] - three.js

I am using React, Three.js, and React Three Fiber. I have a bunch of points in my scene in the shape of a sphere, but the points on the far side are more visible than the ones closest to the camera. My best guess is that either the distance-based color calculation is wrong, because the points turn white if I zoom out too much, or the points simply don't get smaller as I zoom out.
Here is how I make the scene and camera with R3F:
function App() {
  return (
    <Canvas className="App" style={{width: innerWidth, height: innerHeight}}>
      <OrbitControls makeDefault enableDamping={true} enablePan={false}/>
      <PerspectiveCamera makeDefault position={[0, 4, 21]} aspect={innerWidth/innerHeight} near={1} far={1000} fov={60}/>
      <ExampleShader/>
      <SphereFloaters/>
      <axesHelper/>
    </Canvas>
  )
}
For the sphere component I set up a bufferGeometry on the JavaScript side and use a shaderMaterial to pass along a vertex and fragment shader:
...
<shaderMaterial
  depthWrite={false}
  depthTest={false}
  transparent={true}
  fragmentShader={fragmentShader}
  vertexShader={vertexShader}
  uniforms={uniforms}
  blending={THREE.AdditiveBlending}
/>
I set the color gradient in the vertex shader and read the point size from an attribute, as shown below. I don't think the problem is here, though, because with a PerspectiveCamera the points should still appear more distant as I zoom out:
// set the vertex color
float d = length((position)/vec3(96.0, 35.0, 96.0));
d = clamp(d, 0.0, 1.0);
vColor = mix(vec3(227.0, 155.0, 0.0), vec3(100.0, 50.0, 255.0), d) / 255.;
...
// standard MVP transform; the point size comes straight from a per-point attribute
gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4( transformedPosition, 1.0 );
gl_PointSize = pointSizes;
vRealPosition = gl_Position;
And the fragment shader:
void main() {
  float d = length(gl_PointCoord.xy - 0.5);
  float alpha = smoothstep(0.5, 0.01, d);
  gl_FragColor = vec4( vColor, alpha );
}
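Editor's note: two things in the snippets above line up with the symptoms. A gl_PointSize taken straight from an attribute is a fixed size in pixels; the perspective projection only affects gl_Position, never the sprite size, so points cannot shrink as the camera moves away. And with THREE.AdditiveBlending, overlapping fragments sum and clip toward white, which is what happens when zooming out packs many points into the same pixels. The usual remedy, and roughly what three.js's own PointsMaterial does when sizeAttenuation is enabled, is to divide the point size by the view-space depth. A minimal vertex-shader sketch, where the size attribute and uScale uniform are stand-in names for pointSizes and a scaling constant:
attribute float size;   // per-point base size (stand-in for pointSizes)
uniform float uScale;   // e.g. half the canvas height in pixels; tune to taste

void main() {
  vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
  gl_Position = projectionMatrix * mvPosition;
  // view space looks down -z, so -mvPosition.z is the distance in front of the
  // camera; dividing by it makes far points render smaller, like real perspective
  gl_PointSize = size * ( uScale / -mvPosition.z );
}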

Related

How do I scale the size of points in Three JS based on distance to camera?


How can I make waves from the center of a Plane Geometry in Three.JS using the vertex shader?

I've been learning Three.js and I can't seem to wrap my head around shaders. I have an idea of what I want, and I know the mathematical tools within the GLSL language and what they do in simple terms, but I don't understand how they work together.
I have a plane geometry with a shader material. I want to create waves radiating from the center of the plane in the vertex shader, but I am unsure how to accomplish this.
Also, if there is a course or documentation you can provide that explains simple concepts regarding vertex and fragment shaders, that would be great!
This is what I have done so far:
varying vec2 vUv;
varying float vuTime;
varying float vElevation;

uniform float uTime;

void main(){
  vec4 modelPosition = modelMatrix * vec4(position, 1.0);
  float elevation = sin(modelPosition.x * 10.0 - uTime) * 0.1;
  modelPosition.y += elevation;

  vec4 viewPosition = viewMatrix * modelPosition;
  vec4 projectedPosition = projectionMatrix * viewPosition;
  gl_Position = projectedPosition;

  vuTime = uTime;
  vUv = uv;
  vElevation = elevation;
}
I have set up a simple animation using the sin function and a time variable passed to the shader, which creates a simple wave effect without the use of noise. I am trying to create a circular wave radiating from the center of the plane geometry.
What I THINK I have to do is use PI to offset the position away from the center while the wave moves with uTime, and to reach the center of the plane geometry I need to offset the position by a 0.5 float.
That is my understanding right now, and I would love to know if I'm correct in my thinking, or what the correct way of accomplishing this is.
I also am passing the varying variable to the fragment shader to control the color at the elevation.
Thanks for any help you guys provide; I appreciate it!
In your shader code, try to change this line
float elevation = sin(modelPosition.x * 10.0 - uTime) * 0.1;
to this
float elevation = sin(length(modelPosition.xz) * 10.0 - uTime) * 0.1;
You can use either UV coords or position.
let scene = new THREE.Scene();
let camera = new THREE.PerspectiveCamera(60, innerWidth / innerHeight, 1, 1000);
camera.position.set(0, 10, 10).setLength(10);
let renderer = new THREE.WebGLRenderer();
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

let controls = new THREE.OrbitControls(camera, renderer.domElement);

scene.add(new THREE.GridHelper(10, 10, "magenta", "yellow"));

let g = new THREE.PlaneGeometry(10, 10, 50, 50);
let m = new THREE.ShaderMaterial({
  wireframe: true,
  uniforms: {
    time: {value: 0},
    color: {value: new THREE.Color("aqua")}
  },
  vertexShader: `
    #define PI 3.1415926
    #define PI2 PI*2.
    uniform float time;
    void main(){
      vec3 pos = position;
      pos.z = sin((length(uv - 0.5) - time) * 6. * PI2);
      gl_Position = projectionMatrix * modelViewMatrix * vec4(pos, 1.);
    }
  `,
  fragmentShader: `
    uniform vec3 color;
    void main(){
      gl_FragColor = vec4(color, 1.);
    }
  `
});
let o = new THREE.Mesh(g, m);
o.rotation.x = -Math.PI * 0.5;
scene.add(o);

let clock = new THREE.Clock();

renderer.setAnimationLoop(() => {
  let t = clock.getElapsedTime();
  m.uniforms.time.value = t * 0.1;
  renderer.render(scene, camera);
});
body {
  overflow: hidden;
  margin: 0;
}
<script src="https://threejs.org/build/three.min.js"></script>
<script src="https://threejs.org/examples/js/controls/OrbitControls.js"></script>

Glow effect shader works on desktop but not on mobile

I'm currently working on a personal project to generate a planet using procedural methods. The problem is with a glow effect I am trying to achieve using GLSL: the intended effect works on desktop but not on mobile.
The following links illustrate the problem:
Intended Effect
iPhone6S result
The planet is composed of four IcosahedronBufferGeometry meshes: earth, water, clouds, and the glow effect. If I disable the glow effect, the scene renders as intended on mobile, so the conclusion is that the problem lies within the glow effect.
Here is the code for the glow effect (vertex and fragment shaders):
Vertex shader:
varying float intensity;

void main() {
  /* Calculates dot product of the view vector (cameraPosition) and the normal */
  /* High value exponent = less intense since dot product is always less than 1 */
  vec3 vNormal = vec3(modelMatrix * vec4(normal, 0.0));
  intensity = pow(0.2 - dot(normalize(cameraPosition), vNormal), 2.8);
  gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
Fragment Shader
varying float intensity;

void main() {
  vec3 glow = vec3(255.0/255.0, 255.0/255.0, 255.0/255.0) * intensity;
  gl_FragColor = vec4( glow, 1.0);
}
THREE.js Code
var glowMaterial, glowObj, glowUniforms, sUniforms;
sUniforms = sharedUniforms();

/* Uniforms */
glowUniforms = {
  lightPos: {
    type: sUniforms["lightPos"].type,
    value: sUniforms["lightPos"].value,
  }
};

/* Material */
glowMaterial = new THREE.ShaderMaterial({
  uniforms: THREE.UniformsUtils.merge([
    THREE.UniformsLib["ambient"],
    THREE.UniformsLib["lights"],
    glowUniforms
  ]),
  vertexShader: glow_vert,
  fragmentShader: glow_frag,
  lights: true,
  side: THREE.FrontSide,
  blending: THREE.AdditiveBlending,
  transparent: true
});

/* Add object to scene */
glowObj = new THREE.Mesh(new THREE.IcosahedronBufferGeometry(35, 4), glowMaterial);
scene.add(glowObj);
There are no errors or warnings in the console on either desktop or mobile (checked via a remote web inspector). As the screenshots show, on mobile the glow is plain white, while on desktop the intensity/color/opacity of the material follows the dot product computed in the vertex shader as intended.
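Editor's note, since no answer is included above: one likely suspect on the GLSL side is that pow(x, y) is undefined for x < 0, and 0.2 - dot(normalize(cameraPosition), vNormal) goes negative over much of the sphere, so desktop and mobile drivers are free to return different results. Clamping the base before the pow is a cheap way to test that theory:
// pow() has undefined results in GLSL when the base is negative,
// so clamp it to zero first; drivers differ on what the raw call returns
intensity = pow( max( 0.2 - dot( normalize( cameraPosition ), vNormal ), 0.0 ), 2.8 );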

ThreeJS/GLSL Projection Mapping a Gradient

I'm trying to apply a gradient texture derived from CSS to a model. The idea is that a user could adjust the stops/colors of a gradient and then apply it to the model so that it matches the current camera view as the model is rotated around. I've had a very hard time understanding how to implement something like this tutorial.
I've created a very simple example with a hard coded gradient image and Suzanne the monkey, which you can find here:
https://github.com/abogartz/projection-mapping
(To run this, you can use the provided Browser-Sync setup or just run a simple server on index.html)
Right now, the Suzanne model applies the texture as per its own UVs. This results in a gradient that is not linear across the face:
What I would like is to use "projection mapping" instead, where the gradient starts from the leftmost vertex and ends at the rightmost, no matter how the camera is rotated (I'll save the camera matrix on a user action and use that as a uniform later).
The result should be more like this (of course with lighting, etc.):
My current shader looks like this:
<script id='fragmentShader' type='x-shader/x-fragment'>
  uniform vec2 u_mouse;
  uniform vec2 u_resolution;
  uniform float u_time;
  uniform sampler2D u_gradient_tex;
  varying vec2 vUv;

  void main() {
    gl_FragColor = texture2D(u_gradient_tex, vUv);
  }
</script>
Obviously, the vUv varying is not what I want, so how do I calculate the projection coordinate instead?
Check this out; I created this for you so you can see the rough idea of how you would implement this:
http://glslsandbox.com/e#37464.0
I've done this with a circle but you could do it with a model as well.
Essentially all you need to do is change gl_FragColor = texture2D(u_gradient_tex,vUv); to something like gl_FragColor = texture2D(u_gradient_tex, gl_FragCoord.xy * some_scaling_factor);
This changes the texture mapping to depend on the fragment coordinate rather than the model's UVs.
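Concretely, since the question's shader already declares u_resolution, the scaling factor can simply be the inverse of the canvas size in pixels (assuming that is what u_resolution holds), so the gradient spans the full viewport width. A minimal sketch:
void main() {
  // normalize the pixel coordinate to 0..1 across the viewport,
  // then sample the gradient by horizontal screen position
  vec2 screenUv = gl_FragCoord.xy / u_resolution;
  gl_FragColor = texture2D(u_gradient_tex, vec2(screenUv.x, 0.5));
}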
I don't think there is an "easy" way to do what you want. If you want the gradient to always stretch from the left edge of the model to the right edge regardless of orientation, then you need to compute the leftmost and rightmost vertex positions from that perspective/camera angle. Otherwise the gradient has no anchor (where the left edge is) and no width (how far to stretch to fit).
Typical projection mapping is somewhat described here
Programatically generate simple UV Mapping for models
You need the position of the projector, then you project from that projector to the points on your mesh to generate UV coordinates. In your case the projector can always be the camera, so you can ignore that part. You'd use planar mapping, but you'd need to compute the leftmost vertex's position and the rightmost, so you can align the projection to match the silhouette of your 3D model. A rough sketch of that idea is below.
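In this vertex-shader sketch, uMinX and uMaxX are hypothetical uniforms holding the silhouette's left and right edges in normalized device coordinates, found on the CPU by projecting the mesh's vertices with the saved camera matrix:
uniform float uMinX;        // leftmost projected vertex, in NDC (-1..1)
uniform float uMaxX;        // rightmost projected vertex, in NDC (-1..1)
varying float vGradientU;   // 0 at the silhouette's left edge, 1 at its right

void main() {
  vec4 clipPos = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
  // perspective divide gives the on-screen x in NDC; remap it against the bounds
  vGradientU = ( clipPos.x / clipPos.w - uMinX ) / ( uMaxX - uMinX );
  gl_Position = clipPos;
}
The fragment shader can then sample the gradient with texture2D(u_gradient_tex, vec2(vGradientU, 0.5)).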
If all you want is a single model silhouette, you can just set the background to your CSS gradient, clear to black, then draw the model with 0,0,0,0 to cut a hole:
var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera( 75, 1, 0.1, 1000 );
var renderer = new THREE.WebGLRenderer({alpha: true});
document.body.appendChild( renderer.domElement );
renderer.setClearColor(0x000000);

var geometry = new THREE.BoxGeometry( 1, 1, 1 );
var material = new THREE.MeshBasicMaterial( {
  color: 0x000000,
  opacity: 0,
} );
var cube = new THREE.Mesh( geometry, material );
scene.add( cube );

camera.position.z = 2;

function resize() {
  var canvas = renderer.domElement;
  var width = canvas.clientWidth;
  var height = canvas.clientHeight;
  if (canvas.width !== width || canvas.height !== height) {
    renderer.setSize(width, height, false);
    camera.aspect = width / height;
    camera.updateProjectionMatrix();
  }
}

function render(time) {
  time *= 0.001;  // convert to seconds
  resize();
  cube.position.z = Math.sin(time);
  cube.rotation.x = time * 0.817;
  cube.rotation.y = time * 0.923;
  renderer.render( scene, camera );
  requestAnimationFrame( render );
}
requestAnimationFrame( render );
body { margin: 0; }
canvas {
  width: 100vw;
  height: 100vh;
  display: block;
  background: linear-gradient(to right, rgba(255,0,0,1) 0%, rgba(255,191,0,1) 23%, rgba(34,255,0,1) 41%, rgba(0,64,255,1) 55%, rgba(170,0,255,1) 75%, rgba(255,0,0,1) 100%);
}
<script src="https://cdnjs.cloudflare.com/ajax/libs/three.js/r83/three.min.js"></script>

Shadow won't update when geometry is changed using VertexShader

I'm using a custom shader to curve a plane. My custom shader extends the Lambert shader so it supports lights and shadows. It all works as expected, but when the vertexShader changes the geometry of the plane, the shadow doesn't update. Is there anything I'm missing to flag that the geometry has updated in my vertexShader and the shadow needs to change?
[Here is a screenshot of the problem. The plane is curved with a vertexShader, but the shadow doesn't update][1]
[1]: http://i.stack.imgur.com/6kfCF.png
Here is the demo/code: http://dev.cartelle.nl/curve/
If you drag the "bendAngle" slider you can see that the shadow doesn't update.
One workaround I considered was to get the bounding box of my curved plane, then use those points to create a new mesh/box and have that object cast the shadow. But I wasn't sure how to get the coordinates of the new curved geometry; when I checked geometry.boundingBox after the shader was applied, it just gave me the original coordinates every time.
Thanks
Johnny
If you are modifying the geometry positions in the vertex shader, and you are casting shadows, you need to specify a custom depth material so the shadows will respond to the modified positions.
In your custom depth material's vertex shader, you modify the vertex positions in the same way you modified them in the material's vertex shader.
An example of a custom depth material can be seen in this three.js example (although vertices are not modified in the vertex shader in that example; they are modified on the CPU).
In your case, you would create a vertex shader for the custom depth material using a pattern like so:
<script type="x-shader/x-vertex" id="vertexShaderDepth">
uniform float bendAngle;
uniform vec2 bounds;
uniform float bendOffset;
uniform float bendAxisAngle;
vec3 bendIt( vec3 ip, float ba, vec2 b, float o, float a ) {
// your code here
return ip;
}
void main() {
vec3 p = bendIt( position, bendAngle, bounds, bendOffset, bendAxisAngle );
vec4 mvPosition = modelViewMatrix * vec4( p, 1.0 );
gl_Position = projectionMatrix * mvPosition;
}
</script>
And fragment shader like this:
<script type="x-shader/x-fragment" id="fragmentShaderDepth">
vec4 pack_depth( const in float depth ) {
const vec4 bit_shift = vec4( 256.0 * 256.0 * 256.0, 256.0 * 256.0, 256.0, 1.0 );
const vec4 bit_mask = vec4( 0.0, 1.0 / 256.0, 1.0 / 256.0, 1.0 / 256.0 );
vec4 res = fract( depth * bit_shift );
res -= res.xxyz * bit_mask;
return res;
}
void main() {
gl_FragData[ 0 ] = pack_depth( gl_FragCoord.z );
}
</script>
Then in your JavaScript, you specify the custom depth material:
uniforms = {};
uniforms.bendAngle = { type: "f", value: properties.bendAngle };
uniforms.bendOffset = { type: "f", value: properties.offset };
uniforms.bendAxisAngle = { type: "f", value: properties.bendAxisAngle };
uniforms.bounds = { type: "v2", value: new THREE.Vector2( - 8, 16 ) };

var vertexShader = document.getElementById( 'vertexShaderDepth' ).textContent;
var fragmentShader = document.getElementById( 'fragmentShaderDepth' ).textContent;

myObject.customDepthMaterial = new THREE.ShaderMaterial( {
  uniforms: uniforms,
  vertexShader: vertexShader,
  fragmentShader: fragmentShader
} );
three.js r.74
