Three.js/WebGL vertex.y does not update - three.js

In an effort to learn vertex/fragment shaders, I decided to create a simple rain effect by updating the y position of a point in the vertex shader and resetting it to animate through again, using a Three.js PointCloud. I got it to animate across the screen once, but it gets stuck after resetting the y position.
// per-point attributes (declared here so the shader compiles)
attribute float opacity;
attribute float texture;

uniform float size;
uniform float delta;

varying float vOpacity;
varying float vTexture;

void main() {
    vOpacity = opacity;
    vTexture = texture;
    gl_PointSize = 164.0;

    vec3 p = position;
    p.y -= delta * 50.0;

    vec4 mvPosition = modelViewMatrix * vec4( p, 1.0 );
    vec4 nPos = projectionMatrix * mvPosition;
    if (nPos.y < -200.0) {
        nPos.y = 100.0;
    }
    gl_Position = nPos;
}
Any ideas? Thanks

A shader does not change the vertex position permanently. That means

gl_Position = nPos;

will not propagate back to the position attribute of your geometry. The shader only runs on the graphics card and has no access to the browser's memory. You can change your code to this:

nPos.y = mod(nPos.y, 300.0) - 200.0;

Now the y coordinate changes the way you want it to (going from 100 to -200, then back to 100).
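Applied to the shader from the question, the wrapped version might look like this (a sketch; it assumes the delta uniform keeps growing every frame, so the wrap is recomputed each frame from the unchanged position attribute):

void main() {
    vOpacity = opacity;
    vTexture = texture;
    gl_PointSize = 164.0;

    vec3 p = position;
    p.y -= delta * 50.0;

    vec4 nPos = projectionMatrix * modelViewMatrix * vec4( p, 1.0 );

    // wrap instead of a one-shot reset, so the drop keeps cycling
    nPos.y = mod(nPos.y, 300.0) - 200.0;
    gl_Position = nPos;
}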

Related

Why can't I use Three JS to rotate a mesh if I used a vertex shader to set gl_Position?

I have a bunch of points mapped to a disc and I want to make them rotate about the origin of the scene. I have tried to make the points rotate using the standard approach:
<mesh ref={mesh}>
..
useFrame((state) => {
    const { clock } = state;
    gPoints.current.material.uniforms.uTime.value = clock.elapsedTime;
    mesh.current.rotation.y += 0.01;
});
However, it does not rotate anything. I pass predetermined starting coordinates along to the vertex shader as a uniform:
const vertexShader = `
    uniform float uTime;
    uniform float uDistance[1000];
    uniform float uPointSizes[1000];

    varying vec3 vColor;
    varying float vDistance;
    varying vec4 vRealPosition;

    void main() {
        float minPointSize = 1.0;
        float maxPointSize = 10.0;

        vDistance = distance(position, vec3(0.0));
        vColor = mix(vec3(100.0/255.0, 15.0/255.0, 0.0), vec3(200.0/255.0, 50.0/255.0, 100.0/255.0), vDistance/20.0);

        float pointSize = uPointSizes[gl_VertexID];

        // Do Not Touch
        gl_Position = projectionMatrix * viewMatrix * vec4( position, 1.0 );
        gl_PointSize = pointSize;
        vRealPosition = gl_Position;
    }
`
export default vertexShader
Alternatively, I have tried getting the points to rotate about the origin inside the vertex shader, but I can't get it to work.
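For what it's worth, the snippet above builds gl_Position from viewMatrix alone; since modelViewMatrix is viewMatrix * modelMatrix, leaving the model matrix out discards the mesh's own transform, which is why mesh.current.rotation.y has no visible effect. Either switch to modelViewMatrix, or rotate in the shader. A minimal sketch of the in-shader route (assuming the uTime uniform updated in useFrame above):

// Rotate about the scene's Y axis by uTime before projecting.
// GLSL mat3 constructors are column-major; these columns form a standard Y rotation.
float c = cos(uTime);
float s = sin(uTime);
mat3 rotY = mat3(
    c,   0.0, -s,
    0.0, 1.0, 0.0,
    s,   0.0, c
);

// using modelViewMatrix (not viewMatrix) keeps object transforms working too
gl_Position = projectionMatrix * modelViewMatrix * vec4( rotY * position, 1.0 );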

How to convert a square texture into a trapezoid texture with progressive distortion in GLSL

I'm in a Three.js project and I'm trying to convert a square with a square texture inside into a trapezoid.
I managed to create the shape, but although the texture fits/covers the shape, it does so with an undesired distortion.
I'm using a PlaneBufferGeometry with a ShaderMaterial, and I'm trying to obtain this distortion in the shader part (although it would be OK if it were done in the Three.js geometry part).
This is my vertex:
uniform sampler2D uTexture;
varying vec2 vUv;

void main(){
    float scaleTOP = 0.5;
    float scaleBOTTOM = 1.0;
    float scaleLEFT = 1.0;
    float scaleRIGHT = 1.0;

    float scaleX = mix(scaleBOTTOM, scaleTOP, uv.y);
    float posX = position.x * scaleX;

    float scaleY = mix(scaleLEFT, scaleRIGHT, uv.x);
    float posY = position.y * scaleY;

    vec3 finalPosition = vec3(posX, posY, position.z);
    gl_Position = projectionMatrix * modelViewMatrix * vec4( finalPosition, 1.0 );

    // Varyings:
    vUv = uv;
}
And this is my fragment:
uniform sampler2D uTexture;
varying vec2 vUv;

void main() {
    vec4 tex = texture2D( uTexture, vUv );
    gl_FragColor = vec4(tex.r, tex.g, tex.b, 1.0);
}
Unfortunately, I manage to distort the square into the trapezoid, but the texture is not distorted in the way I want. See the figure for the intended result:
Figure: the intended trapezoid result (image omitted).
My vertex and fragment shaders were OK.
The problem was that the Three.js geometry I was using had only 2 polygons. I was using:
this.bg_geometry = new THREE.PlaneBufferGeometry(width, height, 1, 1)
That's it... only one segment in each direction, which creates just two triangles; they can actually be seen in the figure I posted.
I changed the geometry to:
this.bg_geometry = new THREE.PlaneBufferGeometry(width, height, 100, 100)
...and now the texture is distorted as desired.
Anyway, many thanks to @prisoner849, who put me on the right track: pass 4 points as a uniform uPoints, in the order TL, TR, BL, BR, to set the shape of the plane.
My vertex shader now looks like this:
uniform vec3 uPoints[4];
varying vec2 vUv;

void main(){
    vec3 baselineBottom = (uPoints[3] - uPoints[2]) * uv.x + uPoints[2];
    vec3 baselineTop = (uPoints[1] - uPoints[0]) * uv.x + uPoints[0];
    vec3 finalPosition = (baselineTop - baselineBottom) * uv.y + baselineBottom;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( finalPosition, 1.0 );
    vUv = uv;
}
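On the JavaScript side, the corner points can be supplied like this (a sketch; the corner values are made-up assumptions that form a trapezoid narrower at the top):

this.bg_material = new THREE.ShaderMaterial({
    uniforms: {
        uTexture: { value: texture },
        uPoints: { value: [
            new THREE.Vector3(-0.5,  1.0, 0.0), // TL
            new THREE.Vector3( 0.5,  1.0, 0.0), // TR
            new THREE.Vector3(-1.0, -1.0, 0.0), // BL
            new THREE.Vector3( 1.0, -1.0, 0.0), // BR
        ] },
    },
    vertexShader,
    fragmentShader,
});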

Threejs: compute projected coordinate in fragment shader

I'm struggling with handling coordinates in a fragment shader.
In brief, I just want to draw a circle with the fragment shader, using an (x, y, z) center given in world space. But because of the camera position and the z of the circle's center, I cannot get the actual projected x and y coordinates right.
Let's suppose that my camera is placed at (0, 0, 1000), with a perspective projection:
fov: 45deg
aspect: screen_width/screen_height
nearZ: 1
farZ: 10000
The camera looks at (0, 0, 0). In this case, with three.js I can get the camera's projectionMatrix and modelViewMatrix (e.g. PerspectiveCamera.projectionMatrix), and by default I can also use viewMatrix in the fragment shader of a ShaderMaterial.
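For reference, that camera setup as a minimal three.js sketch (variable names assumed):

const camera = new THREE.PerspectiveCamera(
    45,                                      // fov in degrees
    window.innerWidth / window.innerHeight,  // aspect
    1,                                       // nearZ
    10000                                    // farZ
);
camera.position.set(0, 0, 1000);
camera.lookAt(0, 0, 0);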
So, to calculate the projected coordinate in the fragment shader of a circle placed at (300, 300, -1000), I wrote my vertex shader and fragment shader as below.
My vertex shader only serves to pass projectionMatrix and modelViewMatrix along as P and MV.
// vertexShader
varying mat4 P;
varying mat4 MV;

void main(){
    P = projectionMatrix;
    MV = modelViewMatrix;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
And then, I just calculate x and y using P and MV like below.
// fragmentShader
varying mat4 P;
varying mat4 MV;

uniform float x;
uniform float y;
uniform float z;
uniform float r;
uniform vec2 u_resolution;

float circle(vec2 _st, vec2 _center, float _radius){
    vec2 dist = _st - _center + u_resolution;
    return 1. - smoothstep(_radius - (_radius * 0.01),
                           _radius + (_radius * 0.01),
                           length(dist));
}

void main(){
    vec2 coord = (P * MV * vec4(x, y, z, 1.0)).xy;
    float point = circle(gl_FragCoord.xy, coord, r); // ignore r scaling.
    gl_FragColor = vec4(vec3(point), point);
}
But the result doesn't match what I expected, and I also found some weird behaviors:
No matter what the z uniform is, there is no change at all.
The pixel ratio could be a reason (e.g. a retina display has a pixel ratio of 2), but from my experiments it has nothing to do with this.
Did I make a mistake somewhere? Or have I misunderstood something? (There may be a mistake in the circle function, but I don't think it causes the critical problem.)
Let's assume that x, y and z define the center of a circle in world space. You want to draw the circle in a plane which is parallel to the viewport, in a screen-space pass where you draw a quad over the entire viewport.
You have to transform the center of the circle from world-space coordinates to normalized device coordinates. The best solution would be to do this on the CPU and to set a uniform with the result.
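In three.js, that CPU-side projection is a one-liner with Vector3.project, e.g. (a sketch; the material variable and the uniform name cpt are assumptions matching the shaders below):

// project() applies the camera's view and projection matrices plus the
// perspective divide, yielding normalized device coordinates in [-1, 1]
const center = new THREE.Vector3(300, 300, -1000);
const ndc = center.clone().project(camera);
material.uniforms.cpt = { value: ndc };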
According to the code in your question, this can be done in the vertex shader, too. But after the transformation by the model view matrix and the projection matrix, you have to do a perspective divide to transform the point from clip space to normalized device space:
uniform float x;
uniform float y;
uniform float z;

varying vec3 cpt;

void main(){
    vec4 cpt_h = projectionMatrix * modelViewMatrix * vec4(x, y, z, 1.0);
    cpt = cpt_h.xyz / cpt_h.w; // assign the varying; don't redeclare it as a local
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
If u_resolution is the width and the height of the viewport, then the x and y coordinates of the fragment in normalized device space can be calculated by:
vec2 coord = gl_FragCoord.xy / u_resolution.xy * 2.0 - 1.0;
But I recommend transforming the center point of the circle to window (pixel) coordinates; then the radius can be set in pixels, too:
vec2 cpt_p = (cpt.xy * 0.5 + 0.5) * u_resolution.xy;
To calculate the length of a vector you can use the GLSL function length.
The final fragment shader may look like this:
varying vec3 cpt;

uniform vec2 u_resolution;
uniform float u_pixel_ratio; // device pixel ratio
uniform float r;             // e.g. 100.0 means a radius of 100 pixels

float circle( vec2 _st, vec2 _center, float _radius )
{
    // thickness of the circle outline in pixels
    const float thickness = 20.0;

    // distance to the center point in pixels
    float dist = length(_st - _center);

    return 1.0 - smoothstep(0.0, thickness / 2.0, abs(_radius - dist));
}

void main(){
    vec2 cpt_p = (cpt.xy * 0.5 + 0.5) * u_resolution.xy * u_pixel_ratio;
    float point = circle(gl_FragCoord.xy, cpt_p, r);
    gl_FragColor = vec4(point);
}
e.g. a circle with a radius of 50.0 and a thickness of 20.0 (result image omitted).
If you want to apply a perspective distortion to the circle, so that the size of the circle decreases with distance, then you have to set the radius r in world coordinates.
In the vertex shader, calculate a point on the circle, and compute the distance of that point to the circle's center in normalized device space.
This is the radius which you have to pass from the vertex shader to the fragment shader, in addition to the center point of the circle.
uniform float x;
uniform float y;
uniform float z;
uniform float r; // radius in world space

varying vec3 cpt;
varying float radius;

void main(){
    vec4 cpt_v = modelViewMatrix * vec4(x, y, z, 1.0);
    vec4 rpt_v = vec4(cpt_v.x, cpt_v.y + r, cpt_v.zw);

    vec4 cpt_h = projectionMatrix * cpt_v;
    vec4 rpt_h = projectionMatrix * rpt_v;

    cpt = cpt_h.xyz / cpt_h.w;
    vec3 rpt = rpt_h.xyz / rpt_h.w; // divide the projected point, not the view-space one
    radius = length(rpt - cpt);

    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
varying vec3 cpt;
varying float radius;

uniform vec2 u_resolution;
uniform float u_pixel_ratio; // device pixel ratio

float circle( vec2 _st, vec2 _center, float _radius )
{
    const float thickness = 20.0;
    float dist = length(_st - _center);
    return 1.0 - smoothstep(0.0, thickness / 2.0, abs(_radius - dist));
}

void main()
{
    vec2 cpt_p = (cpt.xy * 0.5 + 0.5) * u_resolution.xy * u_pixel_ratio;
    float radius_p = radius * 0.5 * u_resolution.y * u_pixel_ratio;
    float point = circle(gl_FragCoord.xy, cpt_p, radius_p);
    gl_FragColor = vec4(point);
}

Shader Z space perspective ShaderMaterial BufferGeometry

I'm changing the z coordinate of vertices on my geometry, but I find that the mesh stays the same size, when I'm expecting it to get smaller. Tweening between vertex positions works as expected in X/Y space, however.
This is how I'm calculating my gl_Position by tweening the amplitude uniform in my render function:
<script type="x-shader/x-vertex" id="vertexshader">
    uniform float amplitude;
    uniform float direction;
    uniform vec3 cameraPos;
    uniform float time;

    attribute vec3 tweenPosition;

    varying vec2 vUv;

    void main() {
        vec3 pos = position;
        vec3 morphed = vec3( 0.0, 0.0, 0.0 );
        morphed += ( tweenPosition - position ) * amplitude;
        morphed += pos;

        vec4 mvPosition = modelViewMatrix * vec4( morphed * vec3(1, -1, 0), 1.0 );
        vUv = uv;
        gl_Position = projectionMatrix * mvPosition;
    }
</script>
I also tried something like this, based on the perspective calculation on webglfundamentals:
vec4 newPos = projectionMatrix * mvPosition;
float zToDivideBy = 1.0 + newPos.z * 1.0;
gl_Position = vec4(newPos.xyz, zToDivideBy);
This is my loop to calculate another vertex set that I'm tweening between:
for (var i = 0; i < positions.length; i++) {
if ((i+1) % 3 === 0) {
// subtracting from z coord of each vertex
tweenPositions[i] = positions[i]- (Math.random() * 2000);
} else {
tweenPositions[i] = positions[i]
}
}
I get the same results with this -- objects further away in Z-Space do not scale / attenuate / do anything different. What gives?
morphed * vec3(1, -1, 0)
z is always zero in your code: the multiplication is componentwise, so
[x, y, z] * [1, -1, 0] = [x, -y, 0]
and every vertex is flattened to z = 0 in view space before projection, which leaves the perspective divide nothing to attenuate.
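A minimal sketch of the fix (assuming the y flip is intentional): keep the z component intact so that more distant vertices actually end up with a larger w after projection.

vec4 mvPosition = modelViewMatrix * vec4( morphed * vec3(1.0, -1.0, 1.0), 1.0 );
gl_Position = projectionMatrix * mvPosition;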

What's causing these artefacts?

I'm trying to map a 3D texture to a voxel terrain; the pixel color from the texture should align with the terrain geometry, and I don't want any gradients.
It's kind of working, but depending on the camera angle I can see artefacts, and I would like to know what's causing them and how to fix them.
Is it some kind of bleeding? I've already tried changing my texture as described here:
ogr3d tilemap, but without success.
I don't think it's a mipmapping bug, because I turned mipmapping off; if I turn it on, it looks way worse.
The geometry is grid-aligned.
I set min and mag filtering to nearest and wrapping to clampToEdge.
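In three.js terms, those sampler settings look like this (a sketch; the texture variable name is assumed to match the texture_0 uniform below):

texture_0.minFilter = THREE.NearestFilter;
texture_0.magFilter = THREE.NearestFilter;
texture_0.wrapS = THREE.ClampToEdgeWrapping;
texture_0.wrapT = THREE.ClampToEdgeWrapping;
texture_0.generateMipmaps = false; // mipmapping was ruled out above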
fragment:
vec2 computeSliceOffset(float slice, float slicesPerRow, vec2 sliceSize) {
    return sliceSize * vec2(mod(slice, slicesPerRow), floor(slice / slicesPerRow));
}

vec4 sampleAs3DTexture(sampler2D tex, vec3 texCoord, float size, float numRows, float slicesPerRow) {
    float slice = texCoord.z * size;
    float sliceZ = floor(slice);
    float zOffset = fract(slice);

    vec2 sliceSize = vec2(1.0 / slicesPerRow, 1.0 / numRows);
    vec2 sliceOffset = computeSliceOffset(sliceZ, slicesPerRow, sliceSize);
    vec2 uv = texCoord.xy * sliceSize;

    vec4 sliceColor = texture2D(tex, sliceOffset + uv);
    return sliceColor;
}

vec3 texCoord = mod(worldPosition.xyz, 128.0) / 128.0;
vec4 myColor = sampleAs3DTexture(texture_0, texCoord, 64.0, 8.0, 8.0);
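One common cause of artefacts with an atlas emulating a 3D texture is slice bleeding: when texCoord.xy reaches 0.0 or 1.0, the sample lands exactly on the border between two slices, and depending on rounding even nearest filtering can pick a texel from the neighbouring slice. A hedged sketch of a half-texel clamp (uAtlasSize is an assumed uniform holding the atlas width in pixels):

vec4 sampleAs3DTextureClamped(sampler2D tex, vec3 texCoord, float size, float numRows, float slicesPerRow, float uAtlasSize) {
    float sliceZ = floor(texCoord.z * size);
    vec2 sliceSize = vec2(1.0 / slicesPerRow, 1.0 / numRows);
    vec2 sliceOffset = computeSliceOffset(sliceZ, slicesPerRow, sliceSize);

    // pull the per-slice UVs half a texel inward so the sample
    // can never cross into a neighbouring slice
    vec2 halfTexel = vec2(0.5 / uAtlasSize);
    vec2 uv = clamp(texCoord.xy * sliceSize, halfTexel, sliceSize - halfTexel);

    return texture2D(tex, sliceOffset + uv);
}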
