How can I move only specific vertices in my vertex shader? (And how do I choose them?) - three.js

I created a square like this:
THREE.PlaneBufferGeometry(1, 1, 1, 50);
For its material I used a shader material:
THREE.ShaderMaterial()
In my vertex shader I call a 2D noise function that moves each vertex of my square.
But in the end I only want the left side of my square to move. I think it would work if I only moved the first 50 vertices, or every second vertex.
Here's the code of my vertex shader:
uniform float time;
varying vec2 vUv;

// noiseFunction(vec2) is defined elsewhere in the shader source
void main() {
    vUv = uv;
    vec3 pos = position.xyz;
    pos.x += noiseFunction(vec2(pos.y, time));
    gl_Position = projectionMatrix * modelViewMatrix * vec4(pos, 1.0);
}
Does anyone know how I can select only the left-side vertices of my square? Thanks

The position vector holds the vertex position in local space, which means the center of the quad is at (0, 0).
Therefore, if you want to apply the displacement only to the vertices on the left side, you need to check whether the x coordinate of the vertex is negative.
void main() {
    vUv = uv;
    vec3 pos = position.xyz;
    if ( pos.x < 0.0 ) {
        pos.x += noiseFunction(vec2(pos.y, time));
    }
    // to avoid conditional branching, remove the entire if-block
    // and replace it with the line below:
    // pos.x += noiseFunction(vec2(pos.y, time)) * max(sign(-pos.x), 0.0);
    gl_Position = projectionMatrix * modelViewMatrix * vec4(pos, 1.0);
}
I've used an if-statement to make the intent clear, but in practice you should avoid it: the multiply-by-mask version prevents conditional branching on the GPU.
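For completeness, the JavaScript side can be wired up roughly like this. This is only a sketch: the names vertexShaderSource, fragmentShaderSource, scene, camera and renderer are assumptions, and the uniform syntax is for recent three.js versions:
var material = new THREE.ShaderMaterial({
    uniforms: { time: { value: 0.0 } },
    vertexShader: vertexShaderSource,   // contains noiseFunction and main()
    fragmentShader: fragmentShaderSource
});
var mesh = new THREE.Mesh(new THREE.PlaneBufferGeometry(1, 1, 1, 50), material);
scene.add(mesh);

function animate(t) {
    material.uniforms.time.value = t * 0.001; // drive the noise over time
    renderer.render(scene, camera);
    requestAnimationFrame(animate);
}
requestAnimationFrame(animate);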

Related

How to color only the faces where the normals are perpendicular to the camera

I am trying to do the math for a shader that needs to darken the faces whose normals are perpendicular to the camera (dot product is 0). So basically, how do I get this dot product?
How do I fix the following?
uniform float time;
uniform vec3 eye_dir;
varying float darkening;
void main() {
    float product = dot(normalize(eye_dir), normalize(normal.xyz));
    darkening = product;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
// in THREE.js
this.camera.getWorldDirection(this.eyeDir);
...
cell.material.uniforms.eye_dir = new Uniform(this.eyeDir);
To do what you want, you have to calculate the vector from the fragment to the camera. The easiest way to do this is in view space (camera space), because in view space the position of the camera is (0, 0, 0).
Transform the position from model space to view space by the modelViewMatrix, and the normal from model space to view space by the normalMatrix. See WebGLProgram.
Since the dot product is 1.0 when the vectors point in the same direction, the darkening is 1.0 - abs(dotproduct).
varying float darkening;

void main() {
    vec4 view_pos = modelViewMatrix * vec4(position, 1.0);
    vec3 view_dir = normalize(-view_pos.xyz); // == normalize(vec3(0.0) - view_pos.xyz)
    vec3 view_nv = normalize(normalMatrix * normal.xyz);
    float NdotV = dot(view_dir, view_nv);
    darkening = 1.0 - abs(NdotV);
    gl_Position = projectionMatrix * view_pos;
}
Note that the dot product of eye_dir and normal makes no sense at all, because eye_dir is a vector in world space and normal is a vector in model (object) space.
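A matching fragment shader can then apply the factor, e.g. by scaling a base color. A minimal sketch, assuming a plain white base color:
varying float darkening;

void main() {
    // 1.0 - darkening keeps head-on faces bright and fades faces whose
    // normals are perpendicular to the view direction towards black
    gl_FragColor = vec4(vec3(1.0 - darkening), 1.0);
}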

Difficulty with proper layering in THREE.js scene

I am working on a hex-based game. I am currently trying to add a "fog of war" effect where certain tiles lie under an alpha mask to show that information is unknown. Unfortunately I'm running into some problems achieving the effect that I want. The way I'm implementing the fog is to create a mesh over all the tiles that has no alpha if the tile is "visible" and .7 if it is not. I then adjust the mesh position based on the camera position so it always stays in perspective.
Unfortunately, the first way I tried this has an undesired effect at low viewing angles: because I'm shifting the fog to lie over tiles even as the perspective changes, at low angles it will also cover the tops of mountains and trees.
The second thing I tried was the two-scene solution from "How to change the zOrder of object with Threejs?". I put the fog and the unseen tiles in one scene and the seen tiles in another, then rendered the seen tiles on top of the unseen ones. That solved the darkness problem for far tiles, but it introduces another problem for near tiles.
I'm a little stumped about what to do. I'm fairly new to THREE.js (at least the more advanced parts of the library), so I'm wondering if there's something I'm missing that might work.
For reference, here's my vertex shader for the fog:
varying vec4 vColor;

void main() {
    vec3 cRel = cameraPosition - position;
    float dx = (20.0 * cRel.x) / cRel.y;
    float dz = (20.0 * cRel.z) / cRel.y;
    gl_Position = projectionMatrix *
                  modelViewMatrix *
                  vec4(position.x + dx, position.y, position.z + dz, 1.0);
    if (color.x == 1.0 && color.y == 1.0 && color.z == 1.0) {
        vColor = vec4(0.0, 0.0, 0.0, 0.0);
    } else {
        vColor = vec4(color, 0.7);
    }
}
and my fragment shader:
varying vec4 vColor;

float expGradient(float val, float max) {
    return (max + 1.0 / 10.0) * val / (val + 1.0 / 10.0);
}

void main() {
    gl_FragColor = vec4(vColor.xyz, expGradient(vColor.w, 0.7));
}
I'm using the color (1.0, 1.0, 1.0) to signify that a tile is "seen".
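(A sketch of the material setup this implies; the shader-source variable names here are assumptions, not from the question:)
var fogMaterial = new THREE.ShaderMaterial({
    vertexShader: fogVertexShader,     // the vertex shader above
    fragmentShader: fogFragmentShader, // the fragment shader above
    vertexColors: true, // exposes the 'color' attribute used in the vertex shader
    transparent: true   // required so the alpha written to gl_FragColor is blended
});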

Three.js/WebGL vertex.y does not update

In an effort to learn vertex/fragment shaders I decided to create a simple rain effect by updating the y position of a point in the vertex shader and resetting it to animate through again, using a Three.js PointCloud. I got it to animate across the screen once, but it gets stuck after resetting the y position.
uniform float size;
uniform float delta;
varying float vOpacity;
varying float vTexture;
// custom per-vertex attributes supplied by the PointCloud geometry
attribute float opacity;
attribute float texture;

void main() {
    vOpacity = opacity;
    vTexture = texture;
    gl_PointSize = 164.0;
    vec3 p = position;
    p.y -= delta * 50.0;
    vec4 mvPosition = modelViewMatrix * vec4(p, 1.0);
    vec4 nPos = projectionMatrix * mvPosition;
    if (nPos.y < -200.0) {
        nPos.y = 100.0;
    }
    gl_Position = nPos;
}
Any ideas? Thanks
A shader does not change the vertex position permanently. That means
gl_Position = nPos;
will not propagate back to the position attribute of your geometry: the shader runs only on the graphics card and has no access to the browser's memory. You can change your code to this:
nPos.y = mod(nPos.y, 300.0) - 200.0;
Now the y coordinate should change as you want it to (going from 100 down to -200, then back to 100).
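Put together, the tail of the vertex shader would then read like this (a sketch; the mod() wrap replaces the if-block entirely):
vec3 p = position;
p.y -= delta * 50.0;
vec4 mvPosition = modelViewMatrix * vec4(p, 1.0);
vec4 nPos = projectionMatrix * mvPosition;
// mod() keeps y in [0.0, 300.0), so the point cycles from 100
// down to -200 and then wraps back around
nPos.y = mod(nPos.y, 300.0) - 200.0;
gl_Position = nPos;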

glClipPlane - Is there an equivalent in webGL?

I have a 3D mesh. Is there any way to render a sectional view (clipping), like glClipPlane in OpenGL?
I am using Three.js r65.
The latest shader that I have added is:
Fragment Shader:
uniform float time;
uniform vec2 resolution;
varying vec2 vUv;
void main( void )
{
    vec2 position = -1.0 + 2.0 * vUv;
    float red = abs( sin( position.x * position.y + time / 2.0 ) );
    float green = abs( cos( position.x * position.y + time / 3.0 ) );
    float blue = abs( cos( position.x * position.y + time / 4.0 ) );
    if ( position.x > 0.2 && position.y > 0.2 )
    {
        discard;
    }
    gl_FragColor = vec4( red, green, blue, 1.0 );
}
Vertex Shader:
varying vec2 vUv;

void main()
{
    vUv = uv;
    vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
    gl_Position = projectionMatrix * mvPosition;
}
Unfortunately, the OpenGL ES specification against which WebGL is specified has no clip planes, and the vertex shader stage lacks the gl_ClipDistance output by which plane clipping is implemented in modern OpenGL.
However, you can use the fragment shader to implement per-fragment clipping: test the position of the incoming fragment against your set of clip planes, and if the fragment does not pass the test, discard it.
Update
Let's have a look at how clip planes are defined in fixed function pipeline OpenGL:
void ClipPlane( enum p, double eqn[4] );

The value of the first argument, p, is a symbolic constant, CLIP_PLANEi, where i is an integer between 0 and n - 1, indicating one of n client-defined clip planes. eqn is an array of four double-precision floating-point values. These are the coefficients of a plane equation in object coordinates: p1, p2, p3, and p4 (in that order). The inverse of the current model-view matrix is applied to these coefficients, at the time they are specified, yielding

p' = (p'1, p'2, p'3, p'4) = (p1, p2, p3, p4) inv(M)

(where M is the current model-view matrix; the resulting plane equation is undefined if M is singular and may be inaccurate if M is poorly conditioned) to obtain the plane equation coefficients in eye coordinates. All points with eye coordinates transpose( (x_e, y_e, z_e, w_e) ) that satisfy

(p'1, p'2, p'3, p'4) · (x_e, y_e, z_e, w_e) ≥ 0

lie in the half-space defined by the plane; points that do not satisfy this condition do not lie in the half-space.
So what you do is: you add uniforms by which you pass the clip plane parameters p', and another out/in pair of variables between the vertex and fragment shader to pass the vertex eye-space position. Then, in the fragment shader, the first thing you do is perform the clip plane equation test; if it doesn't pass, you discard the fragment.
In the vertex shader
in vec3 vertex_position;
out vec4 eyespace_pos;

uniform mat4 modelview;

void main()
{
    /* ... */
    eyespace_pos = modelview * vec4(vertex_position, 1.0);
    /* ... */
}
In the fragment shader
in vec4 eyespace_pos;

uniform vec4 clipplane;

void main()
{
    if ( dot( eyespace_pos, clipplane ) < 0.0 ) {
        discard;
    }
    /* ... */
}
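Wired into three.js, the clip plane coefficients are just another uniform on a ShaderMaterial. A minimal sketch, assuming the eye-space coefficients were computed on the CPU and that the shader sources use three.js's built-in modelViewMatrix and position instead of the custom names above:
// a sketch: clipplane holds the eye-space plane coefficients (p'1, p'2, p'3, p'4)
var material = new THREE.ShaderMaterial({
    uniforms: {
        clipplane: { value: new THREE.Vector4( -1, 0, 0, 0 ) }
    },
    vertexShader: vertexShaderSource,    // assumed to pass the eye-space position
    fragmentShader: fragmentShaderSource // assumed to contain the discard test above
});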
In newer versions of three.js (later than r76), clipping is supported in THREE.WebGLRenderer. There is an array property called clippingPlanes where you can add your custom clipping planes (THREE.Plane instances).
For three.js you can check these two examples:
1) WebGL clipping (code base here on GitHub)
2) WebGL clipping advanced (code base here on GitHub)
A simple example
To add a clipping plane to the renderer you can do:
var normal = new THREE.Vector3( -1, 0, 0 );
var constant = 0;
var plane = new THREE.Plane( normal, constant );
renderer.clippingPlanes = [plane];
Here is a fiddle demonstrating this.
You can also clip at the object level by adding a clipping plane to the object's material. For this to work you have to set the renderer's localClippingEnabled property to true.
// set renderer
renderer.localClippingEnabled = true;
// add clipping plane to material
var normal = new THREE.Vector3( -1, 0, 0 );
var constant = 0;
var color = 0xff0000;
var plane = new THREE.Plane( normal, constant );
var material = new THREE.MeshBasicMaterial({ color: color });
material.clippingPlanes = [plane];
var mesh = new THREE.Mesh( geometry, material );
Note: in r77 some of the clipping functionality in THREE.WebGLRenderer was moved to a separate THREE.WebGLClipping class; check here for reference in the three.js master branch.

moving from one point to point on sphere

I'm working with a GPU-based particle system.
There are 1 million particles, computed by passing in the x, y, z positions as RGB values on a 1024×1024 texture. The same is done for their velocities.
I'm trying to make them move from an arbitrary point to a point on a sphere.
My current shader, which I'm using for the computation, moves from one point to another directly.
I'm not using the mass or velocity texture at the moment:
float mass = texture2D( posArray, texCoord.st ).a; // needed for the alpha channel below
vec3 p = texture2D( posArray, texCoord.st ).rgb;
// vec3 v = texture2D( velArray, texCoord.st ).rgb;

// map into 'cinder space'
p = (p * -1.0) + 0.5;

// vec3 acc = -0.0002 * p; // centripetal force
// vec3 ayAcc = 0.00001 * normalize( cross( vec3(0, 1, 0), p ) ); // angular force
// vec3 new_v = v + mass * (acc + ayAcc);

vec3 new_p = p + ((moveToPos - p) / duration);

// map out of 'cinder space'
new_p = (new_p - 0.5) * -1.0;

gl_FragData[0] = vec4(new_p.x, new_p.y, new_p.z, mass);
// gl_FragData[1] = vec4(new_v.x, new_v.y, new_v.z, 1.0);
moveToPos is the mouse pointer as a float (0.0 to 1.0), and the coordinate system is translated from (0.5, 0.5 → -0.5, -0.5) to (0.0, 0.0 → 1.0, 1.0).
I'm completely new to vector maths, and the calculations are confusing me. I know I need to use the formula:
x = R sin ϕ cos θ
y = R sin ϕ sin θ
z = R cos ϕ
but calculating the angles from moveToPos (x, y, z) to p (x, y, z) remains a problem.
I wrote the original version of this GPU-particles shader a few years back (now at https://github.com/num3ric/Cinder-Particles). Here is one possible approach to your problem.
I would start with a fragment shader applying a spring force to the particles, so that they are more or less constrained to the surface of a sphere. Something like this:
uniform sampler2D posArray;
uniform sampler2D velArray;
varying vec4 texCoord;

void main(void)
{
    float mass = texture2D( posArray, texCoord.st ).a;
    vec3 p = texture2D( posArray, texCoord.st ).rgb;
    vec3 v = texture2D( velArray, texCoord.st ).rgb;

    float x0 = 0.5; // distance from the sphere center to be maintained
    float x = distance( p, vec3(0.0) ); // current distance
    vec3 acc = -0.0002 * (x - x0) * p; // apply spring force (Hooke's law)

    vec3 new_v = v + mass * acc;
    new_v = 0.999 * new_v; // friction to slow down velocities over time
    vec3 new_p = p + new_v;

    // render to the positions texture
    gl_FragData[0] = vec4(new_p.x, new_p.y, new_p.z, mass);
    // render to the velocities texture
    gl_FragData[1] = vec4(new_v.x, new_v.y, new_v.z, 1.0);
}
Then I would pass a new vec3 uniform for the mouse position intersecting a sphere of the same radius (computed outside the shader, in Cinder).
Now, combine this with the previous soft spring constraint: you could add a tangential force towards this attraction point. Start with a simple (mousePos - p) acceleration, and then figure out a way to make this force exclusively tangential using cross products, as sketched below.
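For example (a sketch, not from the original answer; mousePos is the new uniform, and the 0.0001 gain is an arbitrary tuning constant):
vec3 radial = normalize(p); // outward direction at the particle
vec3 pull = mousePos - p;   // raw attraction towards the target point
// crossing twice projects away the radial part, leaving only the
// component tangent to the sphere surface
vec3 tangential = cross(radial, cross(pull, radial));
vec3 acc = -0.0002 * (x - x0) * p + 0.0001 * tangential;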
I'm not sure how the spherical coordinates approach would work here.
x = R sin ϕ cos θ
y = R sin ϕ sin θ
z = R cos ϕ
Where do you get ϕ and θ? The textures store the positions and velocities in Cartesian coordinates. Plus, converting back and forth is not really an option.
My explanation could be too advanced if you are not comfortable with vectors. Unfortunately, shaders and particle animation are very mathematical by nature.
Here is a solution that I've worked out. It works, but if I move the center point of the spheres outside their own bounds, I lose particles.
#define NPEOPLE 5

uniform sampler2D posArray;
uniform sampler2D velArray;
uniform vec3 centerPoint[NPEOPLE];
uniform float radius[NPEOPLE];
uniform float duration;
varying vec4 texCoord;

void main(void) {
    float personToGet = texture2D( posArray, texCoord.st ).a;
    vec3 p = texture2D( posArray, texCoord.st ).rgb;
    float mass = texture2D( velArray, texCoord.st ).a;
    vec3 v = texture2D( velArray, texCoord.st ).rgb;

    // map into 'cinder space'
    p = (p * -1.0) + 0.5;

    vec3 vec_p = p - centerPoint[int(personToGet)];
    float len_vec_p = sqrt( (vec_p.x * vec_p.x) + (vec_p.y * vec_p.y) + (vec_p.z * vec_p.z) );
    vec_p = ( ( radius[int(personToGet)] /* mass */ ) / len_vec_p ) * vec_p;

    vec3 new_p = vec_p + centerPoint[int(personToGet)];
    new_p = p + ((new_p - p) / duration);

    // map out of 'cinder space'
    new_p = (new_p - 0.5) * -1.0;

    vec3 new_v = v;

    gl_FragData[0] = vec4(new_p.x, new_p.y, new_p.z, personToGet);
    gl_FragData[1] = vec4(new_v.x, new_v.y, new_v.z, mass);
}
I'm passing in arrays of five vec3s and five floats, mapped as five center points and radii.
The particles are set up with a random position at the beginning and move towards the sphere indexed by the number stored in the alpha channel of the position texture.
My aim is to pass in blob data from OpenCV and map the spheres to people on a camera feed.
It's really uninteresting visually at the moment, so I will need to use the velocity texture to add to the behaviour of the particles.
