Using a custom vertex shader with THREE.Points - three.js

I see that three.js has a PointsMaterial to draw a geometry as points rather than as triangles. However, I want to manipulate the vertices using my own vertex shader, via a ShaderMaterial. In raw WebGL, I think I could just call gl.drawArrays with gl.POINTS instead of gl.TRIANGLES. How can I tell the renderer to draw the geometry as points? Is there a better way to go about this?

A little addition: I had no joy until I added gl_PointSize to my vertex shader:
void main() {
    gl_PointSize = 100.;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1. );
}
I found the answer in the GPU particle system example.

Found my solution right after asking the question. Just create a THREE.Points object instead of a THREE.Mesh, using whatever geometry and ShaderMaterial you want:
new THREE.Points(geometry, new THREE.ShaderMaterial(parameters));
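Putting the two answers together, a minimal sketch could look like the following (it assumes an existing scene; the shader strings, geometry and point size are only placeholders):

var material = new THREE.ShaderMaterial({
    vertexShader: [
        'void main() {',
        '    gl_PointSize = 10.;',
        '    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1. );',
        '}'
    ].join('\n'),
    fragmentShader: [
        'void main() {',
        '    gl_FragColor = vec4( 1.0 );',
        '}'
    ].join('\n')
});

// every vertex of the geometry is rendered as one point
var points = new THREE.Points(new THREE.SphereGeometry(5, 16, 16), material);
scene.add(points);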

Related

Texture coordinates in a simple 2D world (THREE.Points) without UV

I want to load an image in THREE.Points, and I'm using a BufferGeometry without a uv attribute and a ShaderMaterial.
Can anyone explain to me what uv is and why I need it? I have a 2D world (all points have z = 0), so can I use a simple vec2 instead of storing uv coordinates in an attribute?
Thanks in advance.
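One possible approach, sketched here under the assumption that the points all lie inside a known axis-aligned square in the z = 0 plane, is to derive the texture coordinate from the vertex position instead of storing a uv attribute; the uniform names below (uBoundsMin, uBoundsSize) are placeholders you would have to supply yourself. (If instead you want the image mapped across each individual point sprite, gl_PointCoord in the fragment shader is the usual tool.)

uniform vec2 uBoundsMin;   // lower-left corner of the 2D world (placeholder)
uniform vec2 uBoundsSize;  // width and height of the 2D world (placeholder)
varying vec2 vUv;

void main() {
    // map the xy position into [0, 1] and hand it to the fragment shader
    vUv = ( position.xy - uBoundsMin ) / uBoundsSize;
    gl_PointSize = 10.;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1. );
}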

Manually specify view and model matrices in Three.js?

Is there any way to manually specify view and model matrices?
I know Three.js is not supposed to be used in this way, but I am currently developing some educational materials to teach a typical computer graphics pipeline and would like to explicitly supply model/view/projection matrices to a shader. While I understood which matrices are model/view/projection matrices in Three.js from this issue, I haven't been able to find a good way to manually control them.
So far, I have been able to specify the projection matrix using camera.projectionMatrix.makePerspective() and the model matrix using applyMatrix(). Actually, applyMatrix() is not ideal from an educational point of view, because it internally decomposes the matrix into position, quaternion and scale, and probably reconstructs the model matrix from those values before supplying it to the shader.
One possible solution is to use ShaderMaterial() and specify all three matrices as uniforms. However, I may want to avoid it because they are also passed to a shader implicitly and the name "material" might confuse students.
Does anybody have suggestions to do this kind of stuff in Three.js?
However, I may want to avoid it because they are also passed to a shader implicitly and the name "material" might confuse students.
I'm not sure if this is the best approach. A Material in three.js is indeed more than a shader. It consists of two shaders, but other things as well. For example, if you set myMaterial.transparent = true; you trigger a completely different flow in WebGLRenderer, which in turn issues different WebGL calls. Setting the blending mode, for example, is not something a shader does.
It would probably be worth explaining this abstraction, rather than renaming it.
...matrices in Three.js from this issue, I haven't been able to find a good way to manually control them.
With RawShaderMaterial you should be able to write the whole shader from scratch.
uniform mat4 uMyProjectionMatrix;
uniform mat4 uMyModelMatrix;
uniform mat4 uMyViewMatrix;
uniform mat4 uMyModelViewMatrix;
attribute vec3 aMyPosition;

void main() {
    gl_Position = uMyProjectionMatrix * uMyViewMatrix * uMyModelMatrix * vec4( aMyPosition, 1. );
}
It is entirely up to you to define what those are; whether the projection matrix is orthographic or not, for example.
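On the JavaScript side, feeding those matrices in could look roughly like this (a sketch only; the exact uniform and attribute syntax varies between three.js versions, and how you build the matrices is entirely up to you):

var material = new THREE.RawShaderMaterial({
    uniforms: {
        uMyProjectionMatrix: { value: new THREE.Matrix4() },
        uMyViewMatrix: { value: new THREE.Matrix4() },
        uMyModelMatrix: { value: new THREE.Matrix4() }
    },
    vertexShader: myVertexShader,
    fragmentShader: myFragmentShader
});

// the attribute name must match what the geometry provides, e.g. alias the
// built-in position attribute (older releases use addAttribute):
geometry.setAttribute('aMyPosition', geometry.getAttribute('position'));

// fill the matrices yourself each frame, either by copying or element by
// element with .set() if you want students to see the raw numbers
material.uniforms.uMyProjectionMatrix.value.copy(camera.projectionMatrix);
material.uniforms.uMyViewMatrix.value.copy(camera.matrixWorldInverse);
material.uniforms.uMyModelMatrix.value.copy(mesh.matrixWorld);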
With ShaderMaterial you get these automagically:
void main() {
    gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4( position, 1. );
}
projectionMatrix and viewMatrix are derived from the camera's properties, as you can see in the linked issue (by the way, I have no idea why that's not in the documentation; I found myself referring to that particular issue a bunch of times :) ).
Both of these can be modified. Automagically if you do
myCamera.far = newFar;
myCamera.fov = newFov;
myCamera.updateProjectionMatrix(); // this will be the new projectionMatrix in GLSL
but nothing should be preventing you from doing
myCamera.projectionMatrix.elements[3] = mySkewLogic;
The same applies to the modelMatrix:
myObject.position.x = newX;
myObject.updateMatrixWorld();
// or
myObject.matrixWorld.elements[3] = someXTranslationLogic;
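One caveat: by default three.js rebuilds the matrices from position/quaternion/scale during the render call, so a hand-edited matrix can get overwritten. For fully manual control over the local matrix, a common pattern (sketched here, not tied to a particular three.js version) is:

myObject.matrixAutoUpdate = false;     // keep three.js from recomputing the local matrix
myObject.matrix.set(                   // arguments are given in row-major order
    1, 0, 0, 2,                        // e.g. a translation of +2 along x
    0, 1, 0, 0,
    0, 0, 1, 0,
    0, 0, 0, 1
);
myObject.matrixWorldNeedsUpdate = true; // make sure matrixWorld is rebuilt from it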

Sphere Impostor on ThreeJS

I am developing a sphere impostor shader in GLSL with ThreeJS. My algorithm is based on the publication by Sigg et al. named "GPU-Based Ray-Casting of Quadratic Surfaces".
When using a classic geometry approach, you need dozens or even hundreds of triangles to represent each sphere, which can cause memory overload if you need to show thousands of spheres. The sphere impostor lets you store only a position and a radius per sphere in the geometry, which gives much better performance than the classic technique.
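Just to illustrate the data layout this implies (not the author's actual code; positions and radii stand for your own flat arrays of per-sphere data):

var geometry = new THREE.BufferGeometry();
// one point per sphere: xyz centre plus a custom per-sphere radius attribute
geometry.setAttribute('position', new THREE.BufferAttribute(new Float32Array(positions), 3));
geometry.setAttribute('radius', new THREE.BufferAttribute(new Float32Array(radii), 1));
// (older three.js releases use addAttribute instead of setAttribute)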
So far I have succeeded in developing the shader, even using ThreeJS shader chunks to ensure full ThreeJS compatibility. You can find a demo page here. However, there is one last thing not working in this implementation.
When moving the objects in the scene, the object using the sphere impostor seems to lag behind a normal mesh. You can also notice that sometimes the spheres are "cut", as in this picture.
This second bug makes me think that the sprite is placed correctly in the scene by the vertex shader, but that the fragment shader is computing wrong coordinates. I suspect two pieces of code where the problem could be:
Two varyings passed from the vertex shader to the fragment shader that should have the same value for every pixel of a sprite. I don't know how to verify this.
varying float projMatrix11;
varying float projMatrix22;
I don't know if I'm updating my shader uniforms correctly:
group.traverse(function(o) {
    if (!o.material) { return; }
    var u = o.material.uniforms;
    if (!u) { return; }
    modelViewMatrixInverse.getInverse(
        o.modelViewMatrix
    );
    if (u.projectionMatrixInverse) {
        u.projectionMatrixInverse.value = projectionMatrixInverse;
    }
    if (u.projectionMatrixTranspose) {
        u.projectionMatrixTranspose.value = projectionMatrixTranspose;
    }
    if (u.modelViewMatrixInverse) {
        u.modelViewMatrixInverse.value = modelViewMatrixInverse;
    }
    if (u.viewport) {
        u.viewport.value = viewport;
    }
});
I wasn't able to debug the problem and hope someone who knows ThreeJS better than I do can give me some clues about it.
I really hope we can solve this problem, so we may be able to propose this feature to the whole ThreeJS community ;)
Note: I delayed the requestAnimationFrame calls to make debugging easier.
EDIT: After digging further, the problem may come from how I'm updating my custom uniforms. One of them uses the modelViewMatrix to compute its inverse. But the modelViewMatrix is updated only during the render call of the WebGLRenderer, so the frame delay may come from there. How can I update a uniform that depends on other uniforms and keep them synchronized in ThreeJS?
I found the answer myself; I'll explain it here in case someone runs into the same problem.
The problem is that I was updating the modelViewMatrixInverse uniform from the modelViewMatrix provided by ThreeJS. That uniform is only updated during the render() call of the WebGLRenderer, so my modelViewMatrixInverse was one frame stale at each render call. That's why my custom shader was always one frame behind the ThreeJS native shaders.
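One way to keep such a dependent uniform in sync (a sketch only, not necessarily how the author finally solved it) is to rebuild the model-view matrix yourself right before rendering, from the same world matrices the renderer will use that frame, instead of reading the one three.js filled in during the previous render(). getInverse() matches the three.js of that era; newer releases use copy().invert():

// make sure the world matrices are current for this frame
object.updateMatrixWorld();
camera.updateMatrixWorld();
camera.matrixWorldInverse.getInverse(camera.matrixWorld);   // the view matrix

// modelView = view * model, built from this frame's data
var modelViewMatrix = new THREE.Matrix4()
    .multiplyMatrices(camera.matrixWorldInverse, object.matrixWorld);

object.material.uniforms.modelViewMatrixInverse.value.getInverse(modelViewMatrix);

renderer.render(scene, camera);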

Easy way to use light in fragment shader

I have a Collada object which I load into my scene with Three.js.
Now I want to change some vertex positions of the model in the vertex shader, which is no problem.
But to do this I have to skip the exported Collada material and use a ShaderMaterial.
The problem is that I now have to calculate the complete lighting of the scene in my fragment shader.
Before, with the Collada material, the complete lighting was calculated by the framework, using a directional light and a hemisphere light.
So my question is whether there is a solution where I can leave the fragment shader untouched and all the colors are calculated as if I were using no ShaderMaterial.
I tried to use THREE.ShaderLib and pass only the fragmentShader of the phong shader together with my own vertexShader, but this only gave errors saying that the two shaders do not declare the same varyings.
Unfortunately, you are correct. You have to create your own ShaderMaterial, and it will be tedious to incorporate scene lighting.
If your scene lighting is not that important, you can hack in some ambient light and a single light at the camera location in your fragment shader.
If your scene lighting is important, then you need to set the ShaderMaterial parameter lights: true, and you will have access to the scene light uniforms in your vertex and fragment shaders.
three.js r.63
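As a rough sketch of that second route (the exact uniform syntax varies across three.js versions, and the shader code, which must itself consume the light uniforms, is left out):

var material = new THREE.ShaderMaterial({
    lights: true,  // ask the renderer to fill in the scene light uniforms
    uniforms: THREE.UniformsUtils.merge([
        THREE.UniformsLib['lights'],
        { diffuse: { value: new THREE.Color(0xcccccc) } }
    ]),
    vertexShader: myVertexShader,      // your modified vertex shader
    fragmentShader: myFragmentShader   // must declare and use the light uniforms
});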

Cameras and Modelview in OpenGL ES (WebGL)

I'm having a little trouble with my OpenGL transformations -- I have a vertex shader that sets gl_Position to projection * view * model * vertex. I have code that generates a view matrix by inverting the model matrix of a camera in space, but when I set the object the camera is looking at to rotate in space, it seems as if the camera is rotating instead.
What could be causing this?
Apparently I had projection * model * view * vertex instead. Oops!
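In other words (a minimal illustration; projection, view, model and aVertex stand in for whatever your uniforms and attribute are actually called):

// wrong: the model rotation is applied after the view transform,
// so rotating the object looks like rotating the camera
gl_Position = projection * model * view * vec4( aVertex, 1.0 );

// right: model (object -> world), then view (world -> eye), then projection
gl_Position = projection * view * model * vec4( aVertex, 1.0 );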
