Add clipping to THREE.ShaderMaterial - three.js

I'm trying to create a shader that takes into account the clipping planes I'm defining in the scene. These clipping planes work fine for all of the 'vanilla' materials I'm using: THREE.MeshLambertMaterial, THREE.MeshPhongMaterial, and THREE.MeshPhysicalMaterial, but THREE.ShaderMaterial is missing this implementation. This is an example of what I mean: https://jsfiddle.net/fraguada/27LrLsv5/
In this example there are two cubes, one with a THREE.MeshStandardMaterial and one with a material defined by THREE.ShaderMaterial. The cube with a THREE.MeshStandardMaterial clips ok. The cube with THREE.ShaderMaterial does not clip.
(I'm not typically defining the vertex/fragment shader in script tags as I show in the jsfiddle, instead I'm defining them in a similar manner to this: https://github.com/mrdoob/three.js/blob/dev/examples/js/shaders/BasicShader.js.)
So, a few questions:
Should THREE.ShaderMaterial include clipping planes out of the box? (There is a clipping property, but I'm not sure what it enables.)
If not, how could I modify this shader to include the necessary params and shader chunks to enable clipping?

Actually, clipping is done inside the Three.js shaders.
To make it work, you have to handle it inside your shader by adding these four "shader chunks":
clipping_planes_pars_vertex.glsl at the top of your vertex shader ;
clipping_planes_vertex.glsl inside the main() of your vertex shader ;
clipping_planes_pars_fragment.glsl at the top of your fragment shader ;
clipping_planes_fragment.glsl inside the main() of your fragment shader.
You can access those chunks by simply adding #include <(chunk name)> to your shaders.
Then, set material.clipping = true; and it should work.
Check this fiddle.
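Putting the pieces together, a minimal sketch could look like the following (the plane, the flat red fragment colour and the renderer variable are placeholders for your own setup; renderer.localClippingEnabled must be true for per-material clipping planes to take effect):

const clipPlane = new THREE.Plane( new THREE.Vector3( 0, -1, 0 ), 0 );

const material = new THREE.ShaderMaterial( {
    vertexShader: `
        #include <clipping_planes_pars_vertex>
        void main() {
            #include <begin_vertex>
            #include <project_vertex>
            #include <clipping_planes_vertex>
        }
    `,
    fragmentShader: `
        #include <clipping_planes_pars_fragment>
        void main() {
            #include <clipping_planes_fragment>
            gl_FragColor = vec4( 1.0, 0.0, 0.0, 1.0 );
        }
    `,
    clipping: true,                // tells three.js to feed the clipping uniforms to this material
    clippingPlanes: [ clipPlane ]
} );

renderer.localClippingEnabled = true;   // required for per-material clipping planes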
Note
To make your shader work, I also added begin_vertex.glsl and project_vertex.glsl.
I checked the current phong shader implementation to understand where to put those chunks.
Note 2
This code should work with a shader implemented with an array of strings but note that you can also reference shader chunks with THREE.ShaderChunk[(chunk name)].
This should be more suitable in your case.
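For example, the vertex shader from the sketch above could be assembled from an array of strings in the BasicShader.js style mentioned in the question:

var vertexShader = [
    THREE.ShaderChunk[ 'clipping_planes_pars_vertex' ],
    'void main() {',
    THREE.ShaderChunk[ 'begin_vertex' ],
    THREE.ShaderChunk[ 'project_vertex' ],
    THREE.ShaderChunk[ 'clipping_planes_vertex' ],
    '}'
].join( '\n' );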

Related

GLES fragment shader, get 0-1 range when using TextureRegion - libGDX

I have a fragment shader in which I use v_texCoords as a base for some effects. This works fine if I use a single Texture, as v_texCoords always ranges from 0 to 1, so the center point is always (0.5, 0.5), for example. If I am drawing from part of a TextureRegion, though, my shader messes up because v_texCoords no longer ranges from 0 to 1. Are there any methods or variables I can use to get a consistent 0-1 range in my fragment shader? I want to avoid setting uniforms, as this would mean I need to flush the batch for every draw.
Thanks!
Nothing like this exists at the shader level - TextureRegions are entirely a libgdx construct that doesn't exist at all at the OpenGL ES API level.
Honestly, for what you are trying to do I'd simply suggest not overloading the texture coordinate for two orthogonal purposes, and just adding a separate vertex attribute which provides the 0-to-1 number, as sketched below.
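A rough vertex-shader sketch of that idea, where a_localCoord is a hypothetical extra attribute that always runs 0-1 across whatever is being drawn (in libGDX this implies a custom mesh or batch, since SpriteBatch's vertex layout is fixed):

attribute vec4 a_position;
attribute vec2 a_texCoord0;
attribute vec2 a_localCoord;   // hypothetical: always 0-1 across the region, independent of atlas UVs
uniform mat4 u_projTrans;
varying vec2 v_texCoords;
varying vec2 v_localCoord;

void main() {
    v_texCoords = a_texCoord0;
    v_localCoord = a_localCoord;   // use v_localCoord instead of v_texCoords for the centered effects
    gl_Position = u_projTrans * a_position;
}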

How can I get light emission per vertex and per-vertex lighting in Three.js?

I want to see a chart with colors specified per vertex, and to get a little bit of shading too.
But if I use MeshBasicMaterial I only get the vertex colors, with no dynamic shading.
On the other hand, if I use MeshPhongMaterial I just get shading, but without the emissiveness from my vertex colors.
As the Three.js MeshPhongMaterial supports vertexColors, giving you a nice combination of dynamic lighting and vertex colors, I'm not quite sure I understand your question. Perhaps that is something you should investigate more?
However, as an alternative to writing a custom shader you could try rendering your model in multiple passes.
This will not give you as much control over the way the vertex colors and phong lighting are combined as a shader would, but often a simple add/multiply blend can give pretty decent results.
Algorithm:
- create two meshes for the BufferGeometry, one with the BasicMaterial and one with the PhongMaterial
- for the PhongMaterial, set:
  depthFunc = THREE.EqualDepth
  transparent = true
  blending = THREE.AdditiveBlending (or THREE.MultiplyBlending)
- render the first mesh
- render the second mesh at the exact same spot
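A rough three.js sketch of those steps (geometry and scene are assumed to already exist; older three.js releases use THREE.VertexColors instead of true for vertexColors):

const basePass = new THREE.Mesh(
    geometry,
    new THREE.MeshBasicMaterial( { vertexColors: true } )    // the "emissive" vertex-color pass
);
const lightPass = new THREE.Mesh(
    geometry,                                                 // the same BufferGeometry
    new THREE.MeshPhongMaterial( {
        depthFunc: THREE.EqualDepth,       // only draw where the first pass already wrote depth
        transparent: true,
        blending: THREE.AdditiveBlending   // or THREE.MultiplyBlending
    } )
);
lightPass.position.copy( basePass.position );   // render at the exact same spot
scene.add( basePass, lightPass );               // transparent objects are drawn after opaque ones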

Approach to write a fragment shader for each triangle in a mesh

I have a mesh that consists of several triangles (on the order of 100). I would like to define a different fragment shader for each of them, so that each triangle can show a different kind of reflection behaviour.
How should I approach this problem? Should I start defining a GLSL program and try to distinguish between the different triangles? This answer (glDrawElements and flat shading) suggests that this is not the right approach. Even Using a different vertex and fragment shader for each object in webgl does not seem to be the right approach, since I do not want multiple objects, just one with different materials (fragment shaders) on it.
My suggestion would be to create a super shader which can handle all the different scenarios you desire.
In order to set this up you'll need attributes that dictate which part of the shader to use.
e.g. pass a per-vertex attribute through to the fragment shader as a varying (GLSL ES does not allow bool attributes, and attributes are not visible in the fragment shader):
// vertex shader
attribute float shadingMode;   // 0.0 = flat, 1.0 = phong
varying float vShadingMode;
// inside main(): vShadingMode = shadingMode;

// fragment shader
varying float vShadingMode;
if (vShadingMode < 0.5) {
    // perform flat shading
} else {
    // perform phong shading
}
Then set up your buffers so that all the vertices in each triangle have the appropriate shading attribute applied, as sketched below.
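A hypothetical three.js-flavoured sketch of that buffer setup, assuming non-indexed geometry where every three consecutive vertices form one triangle (older releases use addAttribute instead of setAttribute):

const vertexCount = geometry.attributes.position.count;
const modes = new Float32Array( vertexCount );
for ( let tri = 0; tri < vertexCount / 3; tri ++ ) {
    const mode = ( tri % 2 === 0 ) ? 0.0 : 1.0;   // example assignment: 0.0 = flat, 1.0 = phong
    modes[ tri * 3 ] = modes[ tri * 3 + 1 ] = modes[ tri * 3 + 2 ] = mode;
}
geometry.setAttribute( 'shadingMode', new THREE.BufferAttribute( modes, 1 ) );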

Three.js doesn't use different shader programs for different mesh objects, why?

I've been trying to figure out how three.js works and have used a shader debugger on it.
I added two simple planes with a basic material (a single color without any shading model), which rotate during rendering.
First of all, my question was: why is three.js using a single shader program (see the WebGL context function .useProgram()) for both meshes?
I supposed the objects are the same, and that for performance reasons a single shader program is used for similar objects.
But... I then changed my three.js application source code, so that there are now a plane and a cube in the scene, both rotating.
And let's look in the shader debugger again:
Here you can see that three.js is again using one shader program, even though the objects are now different. This is the part that is not clear to me.
Looking at that shader, it seems to be a very generic and huge shader program, and there are also two other shader programs that were compiled but not used.
So, why is three.js using a single shader program? What are the (correct, or maybe not) reasons for this?
Most of the work done in a shader is related to the material part of the mesh, not the geometry.
In webgl (or opengl for that matter) the geometry as you understand it (if it is a cube, a sphere, or whatever) is pretty irrelevant.
It would be a little bit more relevant if you talked about how the geometry is constructed. But these days, when faces of more than 3 vertices are gone and triangle strips are seldom used, there are few different kinds of geometry: face3 geometries, line geometries, particle geometries, and buffer geometries.
Most of the time, the key difference to use a different shader will be in the material.
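If you want to observe this yourself, a rough sketch (scene, camera and renderer are assumed to already exist; renderer.info.programs lists the programs that were actually compiled):

const plane = new THREE.Mesh( new THREE.PlaneGeometry( 1, 1 ), new THREE.MeshBasicMaterial( { color: 0xff0000 } ) );
const cube  = new THREE.Mesh( new THREE.BoxGeometry( 1, 1, 1 ), new THREE.MeshBasicMaterial( { color: 0x00ff00 } ) );
scene.add( plane, cube );
renderer.render( scene, camera );
// Different geometries, same material type and settings: typically one compiled program.
console.log( renderer.info.programs.length );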

Relation between shader and UV texture mapping

I'm using OpenGL ES + GLKit. I've never been this low-level before, so I still have a lot to learn. I've developed Unity games before, where you just give it a .obj file and the corresponding texture and it's done. (The UV mapping happens to be inside the .obj file?)
I want to develop a kind of special Toon Shader with some different characteristics for use with 3D model. So I need to write a vertex shader (.vsh) and fragment shader (.fsh) right?
However, as I understand it, in order to apply a texture to a model with the correct UV coordinates, you have to do this in the shader (am I right?), with a "texture shader".
So, if I want to both apply the texture with UV mapping and then apply my special toon shader, do I have to write both in the same shader? Is there no way to create a plug-and-play toon shader that I can use with anything?
As a side question, in what format are UV coordinates stored, and how can I get them into a shader program? What kind of attribute variable?
So I need to write a vertex shader (.vsh) and fragment shader (.fsh) right?
Yes.
However, in order to apply a texture to a model with the correct UV coordinates, you have to do this in the shader?
True.
Is there no way to create a plug-and-play toon shader that I can use with anything?
Check Uber-Shaders.
How can I get UV coordinates into a shader program? What kind of attribute variable?
You define your attributes in the shader yourself. Check this GLSL tutorial.
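For illustration, a minimal GLSL ES skeleton of that idea (attribute and uniform names are placeholders; the UVs come from the .obj data and are uploaded as a vertex attribute by your application code, and your toon math would replace the pass-through in the fragment shader):

// vertex shader (.vsh)
attribute vec4 a_position;
attribute vec2 a_texCoord;            // UVs read from the .obj file and uploaded by your app
uniform mat4 u_modelViewProjection;
varying vec2 v_texCoord;

void main() {
    v_texCoord = a_texCoord;
    gl_Position = u_modelViewProjection * a_position;
}

// fragment shader (.fsh)
precision mediump float;
uniform sampler2D u_texture;
varying vec2 v_texCoord;

void main() {
    vec4 texColor = texture2D( u_texture, v_texCoord );
    // ... toon shading math would go here, operating on texColor ...
    gl_FragColor = texColor;
}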

Resources