Is it possible to somehow distinguish one spotlight from another in a fragment shader? For example, I created and added to the scene two spotlights, A and B, with the same characteristics. Now, in the fragment shader, I would like to determine whether the current light is A or B while iterating over the spotLights array.
I'm trying to create a shader that takes into account the clipping planes I'm defining in the scene. These clipping planes work fine for all of the 'vanilla' materials I'm using: THREE.MeshLambertMaterial, THREE.MeshPhongMaterial, and THREE.MeshPhysicalMaterial, but THREE.ShaderMaterial is missing this implementation. This is an example of what I mean: https://jsfiddle.net/fraguada/27LrLsv5/
In this example there are two cubes, one with a THREE.MeshStandardMaterial and one with a material defined by THREE.ShaderMaterial. The cube with a THREE.MeshStandardMaterial clips ok. The cube with THREE.ShaderMaterial does not clip.
(I'm not typically defining the vertex/fragment shader in script tags as I show in the jsfiddle, instead I'm defining them in a similar manner to this: https://github.com/mrdoob/three.js/blob/dev/examples/js/shaders/BasicShader.js.)
So, a few questions:
Should THREE.ShaderMaterial include clipping planes out of the box? (There is a clipping property, but I'm not sure what it enables.)
If not, how could I modify this shader to include the necessary params and shader chunks to enable clipping?
Actually, clipping is done inside the Three.js shaders.
To make it work, you have to handle it inside your shader by adding those 4 "shader chunks":
clipping_planes_pars_vertex.glsl at the top of your vertex shader;
clipping_planes_vertex.glsl inside the main() of your vertex shader;
clipping_planes_pars_fragment.glsl at the top of your fragment shader;
clipping_planes_fragment.glsl inside the main() of your fragment shader.
You can access those chunks by simply adding #include <(chunk name)> to your shaders.
Then, set material.clipping = true; and it should work.
Check this fiddle.
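For reference, a minimal ShaderMaterial shader pair with those chunks in place could look roughly like this (the flat red output is just a stand-in for your own shading; begin_vertex and project_vertex are included as well, as explained in the note below):

// vertex shader
#include <clipping_planes_pars_vertex>

void main() {
    #include <begin_vertex>
    #include <project_vertex>
    #include <clipping_planes_vertex>
}

// fragment shader
#include <clipping_planes_pars_fragment>

void main() {
    #include <clipping_planes_fragment>
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); // placeholder color
}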
Note
To make your shader work, I also added begin_vertex.glsl and project_vertex.glsl.
I checked the current phong shader implementation to understand where to put those chunks.
Note 2
This code should work with a shader implemented as an array of strings, but note that you can also reference the shader chunks with THREE.ShaderChunk[(chunk name)].
That should be more suitable in your case.
I have a fragment shader in which I use v_texCoords as a base for some effects. This works fine if I use a single Texture, as v_texCoords always ranges from 0-1, so the center point is always (0.5, 0.5), for example. If I am drawing from part of a TextureRegion though, my shader messes up because v_texCoords no longer ranges from 0-1. Are there any methods or variables I can use to get a consistent 0-1 range in my fragment shader? I want to avoid setting uniforms, as this would mean I need to flush the batch for every draw.
Thanks!
Nothing like this exists at the shader level - TextureRegions are entirely a libgdx construct that doesn't exist at the OpenGL ES API level.
Honestly, for what you are trying to do, I'd simply suggest not overloading the texture coordinate for two orthogonal purposes, and instead adding a separate vertex attribute which provides the 0-to-1 number.
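A rough sketch of what that could look like (the a_localCoord / v_localCoord names are placeholders; in libgdx this implies a custom Mesh or batch, since the stock SpriteBatch vertex layout has no room for an extra attribute):

// vertex shader
attribute vec4 a_position;
attribute vec2 a_texCoord0;   // atlas UVs, as before
attribute vec2 a_localCoord;  // extra attribute, filled with 0-1 corner values on the CPU
uniform mat4 u_projTrans;
varying vec2 v_texCoords;
varying vec2 v_localCoord;

void main() {
    v_texCoords = a_texCoord0;
    v_localCoord = a_localCoord;
    gl_Position = u_projTrans * a_position;
}

// fragment shader
#ifdef GL_ES
precision mediump float;
#endif
uniform sampler2D u_texture;
varying vec2 v_texCoords;
varying vec2 v_localCoord;

void main() {
    // v_localCoord spans 0-1 regardless of which TextureRegion is drawn,
    // so (0.5, 0.5) is always the center of the quad
    gl_FragColor = texture2D(u_texture, v_texCoords); // apply effects based on v_localCoord here
}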
I am doing a particle system in WebGL using Three.js, and I want to do all the computation of the particles in the shaders. To achieve that, the positions (for example) of the particles are stored in a texture which is sampled by the vertex shader of each particle (POINT primitive).
The position texture is in fact two render targets which are swapped each frame after being updated off screen. Each pixel of this texture represent a particle.
To update a position, I read one of the render targets (texture2D), do some computation, and write to the other render target (fragment output).
To perform the "do some computation" step, I need some per particle attributes, like its velocity (and a lot of others). Since this step is done in the fragment shader, I can't use the vertex attributes buffers, so I have to store these properties in separate textures and sample each of them in the fragment shader.
It works, but sampling textures is slow as far as I know, and I wonder if there are better ways to do this, like having one vertex per particle, each rendering a single fragment of the position texture.
I know that OpenGL 4 has some alternative ways to deal with this, like UBOs or SSBOs, but I'm not sure about WebGL.
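For what it's worth, the update pass described above typically boils down to a fragment shader along these lines (all uniform and varying names here are made up for illustration):

// fragment shader of the off-screen update pass
precision highp float;
uniform sampler2D u_positions;   // previous frame's position render target
uniform sampler2D u_velocities;  // per-particle attribute stored as a texture
uniform float u_deltaTime;
varying vec2 v_uv;               // one texel per particle

void main() {
    vec3 position = texture2D(u_positions, v_uv).xyz;
    vec3 velocity = texture2D(u_velocities, v_uv).xyz;
    position += velocity * u_deltaTime;
    gl_FragColor = vec4(position, 1.0); // written into the other render target
}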
I have a mesh that consists of several triangles (on the order of 100). I would like to define a different fragment shader for each of them, so as to be able to show a different kind of reflection behaviour for each triangle.
How should I approach this problem? Should I start by defining a single GLSL program and try to distinguish between the different triangles? This answer (glDrawElements and flat shading) suggests that this is not the right approach. Even Using a different vertex and fragment shader for each object in webgl does not seem right, since I do not want multiple objects, but just one with different materials (fragment shaders) on it.
My suggestion would be to create a super shader which can handle all the different scenarios you desire.
In order to set this up you'll need attributes that dictate which part of the shader to use.
e.g. the flag has to be a per-vertex attribute in the vertex shader (attributes cannot be booleans in GLSL ES), passed on to the fragment shader as a varying:
// vertex shader
attribute float shadingMode;   // e.g. 0.0 = flat, 1.0 = phong
varying float vShadingMode;
// inside main(): vShadingMode = shadingMode;

// fragment shader
varying float vShadingMode;
if (vShadingMode < 0.5) {
    // perform flat shading
} else {
    // perform phong shading
}
Then set up your buffers so that the vertices in each triangle have the appropriate shading attribute value applied.
I'm using OpenGL ES + GLKit. I've never been this low-level before in my life, so I still have to learn a lot of things. I've developed Unity games before, and there you just give it a .obj file and a corresponding texture and it's done. (The UV mapping happens to be inside the .obj file?)
I want to develop a kind of special Toon Shader with some different characteristics for use with 3D model. So I need to write a vertex shader (.vsh) and fragment shader (.fsh) right?
However, I just know that in order to apply a texture to a model with the correct UV coordinates, you have to do this in the shader (am I right?), with a "texture shader".
So, if I want to both apply the texture with UV mapping and then apply my special Toon Shader, do I have to write both in the same shader? Is there no way I can create a plug-and-play Toon Shader so I can use it with anything?
As a side question, in which file format are the UV coordinates stored, and how can I take them into a shader program? What kind of attribute variable?
So I need to write a vertex shader (.vsh) and fragment shader (.fsh) right?
Yes.
However, I just know that in order to apply a texture to a model with the correct UV coordinates
True.
Is there no way I can create a plug-and-play Toon Shader so I can use it with anything?
Check Uber-Shaders.
and how can I take them into a shader program? What kind of attribute variable?
You define your attributes in the shader yourself. Check this GLSL tutorial.
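To make the last two points concrete, here is a minimal sketch (all names, and the crude 3-band ramp, are just illustrative): the UVs arrive as an attribute vec2 (in an .obj they come from the "vt" entries), and the texture lookup and the toon ramp simply live in the same shader program:

// vertex shader: UVs are a per-vertex attribute (attribute vec2)
attribute vec4 a_position;
attribute vec3 a_normal;
attribute vec2 a_texCoord;   // UV coordinate for this vertex
uniform mat4 u_modelViewProjection;
varying vec2 v_texCoord;
varying vec3 v_normal;

void main() {
    v_texCoord = a_texCoord;
    v_normal = a_normal;
    gl_Position = u_modelViewProjection * a_position;
}

// fragment shader: texturing and the toon ramp share one program
precision mediump float;
uniform sampler2D u_texture;
uniform vec3 u_lightDir;     // normalized, in the same space as the normals
varying vec2 v_texCoord;
varying vec3 v_normal;

void main() {
    float diffuse = max(dot(normalize(v_normal), u_lightDir), 0.0);
    float toon = floor(diffuse * 3.0) / 3.0;   // quantize lighting into 3 bands
    vec4 texColor = texture2D(u_texture, v_texCoord);
    gl_FragColor = vec4(texColor.rgb * toon, texColor.a);
}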