What is the current solution in r136 for blending lights, shadows, and color in a ShaderMaterial? I already found the solution for fog support.
I found some examples from a previous revision (r108), like this codesandbox.
Actually, I'm looking for this kind of result: codesandbox.
Should I copy the MeshPhongMaterial shaders as a code base for my own shaders?
Custom shaders are mandatory in my projects, which is why I'm not using the built-in materials.
Any ideas or examples?
Thanks!
This question is huge and does not have a single answer. Creating lights, shadows, and color varies from material to material, and involves so many elements that it would require a full course to learn.
However, you can look at the segments of shader code used by Three.js in the folder called /ShaderChunk. If you search for "light", you'll see shader segments (or "chunks") for each material, like toon, lambert, physical, etc. Some materials need parameters defined at the beginning of the shader code (those are the _pars files), some are calculated in the vertex shader, some in the fragment shader, and some need the code split between _begin and _end.
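As a starting point, here is a minimal sketch (not the full Phong pipeline) of pulling those chunks into a ShaderMaterial. The chunk and uniform names come from the r136-era /ShaderChunk and UniformsLib and may change between releases:

```js
import * as THREE from 'three';

const material = new THREE.ShaderMaterial({
  lights: true, // ask the renderer to fill in the light uniforms below
  uniforms: THREE.UniformsUtils.merge([
    THREE.UniformsLib.lights,
    { diffuse: { value: new THREE.Color(0xff8800) } },
  ]),
  vertexShader: /* glsl */ `
    varying vec3 vNormal;
    void main() {
      vNormal = normalMatrix * normal; // view-space normal
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: /* glsl */ `
    #include <common>
    #include <lights_pars_begin>
    uniform vec3 diffuse;
    varying vec3 vNormal;
    void main() {
      // crude Lambert term for the first directional light, just to show the wiring
      vec3 outgoing = diffuse * ambientLightColor;
      #if NUM_DIR_LIGHTS > 0
        float dotNL = max(dot(normalize(vNormal), directionalLights[0].direction), 0.0);
        outgoing += diffuse * directionalLights[0].color * dotNL;
      #endif
      gl_FragColor = vec4(outgoing, 1.0);
    }
  `,
});
```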
Shadows are even more complex because they require a separate render pass to build the shadow map. Like I said, rebuilding your own lights, shadows, and color is a huge undertaking, and it would need a full course to learn. I hope this answer at least points you in the right direction.
Related
I'd like to have a dynamic GLSL shader texture to be used as a reference map (for displacement and other stuff) on multiple, different materials on different Meshes.
My approach would be to do the computation once, using a THREE.WebGLRenderTarget: set up an ortho camera and a 1×1 plane with a THREE.ShaderMaterial, then access WebGLRenderTarget.texture, which I'd embed in a "master" object, whenever and wherever I need it.
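In code, the idea would be something like this minimal sketch (the helper name is mine, not an official API; the camera/plane setup mirrors the full-screen quad used by three.js's own postprocessing passes):

```js
import * as THREE from 'three';

// Render a fragment shader once into a WebGLRenderTarget and reuse its
// .texture on any number of materials.
function makeProceduralTexture(renderer, width, height, fragmentShader) {
  const target = new THREE.WebGLRenderTarget(width, height);
  const scene = new THREE.Scene();
  const camera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);
  scene.add(new THREE.Mesh(
    new THREE.PlaneGeometry(2, 2), // exactly fills the ortho frustum
    new THREE.ShaderMaterial({ fragmentShader }) // default vertex shader is fine here
  ));

  renderer.setRenderTarget(target);
  renderer.render(scene, camera);
  renderer.setRenderTarget(null); // restore rendering to the canvas

  return target.texture; // share this across materials, e.g. as a displacementMap
}
```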
Is there any "official" object I can / may use for this? I seen the postprocessing objects are pretty similar (EG ShaderPass) but I'm unsure if and how to use them.
Thank you.
I've tried to figure out how three.js works, and I've tried a shader debugger on it.
I added two simple planes with a basic material (a single color, no shading model), which rotate during rendering.
First of all, my question was: why does three.js use a single shader program (see the WebGL context function .useProgram()) for both meshes?
I supposed that the objects are the same, and that this is why, for performance reasons, a single shader program is used for similar objects.
But... I changed my three.js application source code, and now there are a plane and a cube in the scene, which are rotating.
And let's look in the shader debugger again:
Here you can see that three.js is again using one shader program, even though the objects are now different. This is the part that isn't clear to me.
Looking at that shader, it seems to be a very large, generic shader program, and there are also two other shader programs that were compiled but not used.
So why does three.js use a single shader program? What are the real reasons?
Most of the work done in a shader is related to the material part of the mesh, not the geometry.
In WebGL (or OpenGL, for that matter), the geometry as you understand it (whether it is a cube, a sphere, or whatever) is pretty irrelevant.
It would be a little more relevant if you talked about how the geometry is constructed. But these days, when faces of more than 3 vertices are gone and triangle strips are seldom used, there are few different kinds of geometry: face3 geometries, line geometries, particle geometries, and buffer geometries.
Most of the time, the key factor in whether a different shader is used will be the material.
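If you want to see this for yourself, renderer.info (a real three.js property) lists the compiled programs. A minimal sketch:

```js
import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer();
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(50, 1, 0.1, 100);
camera.position.z = 5;

// different geometries, same material type => one compiled shader program
const material = new THREE.MeshBasicMaterial({ color: 0x2194ce });
scene.add(new THREE.Mesh(new THREE.PlaneGeometry(1, 1), material));
scene.add(new THREE.Mesh(new THREE.BoxGeometry(1, 1, 1), material.clone()));

renderer.render(scene, camera);
console.log(renderer.info.programs.length); // 1
```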
I'm trying to highlight meshes (animated characters etc) in my game on a mouse-over event.
They have multiple textures and sometimes skin.
I thought I would wrap them in a ShaderMaterial and, on a hit test, change uniforms to brighten them with a fragment shader.
To do this, can I somehow just manipulate the regular shading?
Can I mix multiple materials, making my shader take color values from the standard shader and just tweak them?
Or do I need an entirely separate render pass that I blend in with the composer?
Or maybe just something else entirely, like ambient light applied to just one object/shader?
Thanks for any suggestions.
Repost; see comments for details/discussion:
"You could change the whole material/shader on mouse over, although I guess this is somewhat performance intensive, depending on the number of switches the user usually does and what the rest of your app is doing. What I used once is the emissive color of the regular Phong material, with material.emissive.setRGB() for example. This will give you some nice effects, too."
There are some examples of this that you can probably learn a lot from. Take a look at their source:
Mouse over meshes
Interactive cubes
In addition to what GuyGood said, if you do decide to use .setRGB() on your material, you need to use red, green, and blue values ranging from 0 to 1, as documented in the Three.js documentation.
Or, if you prefer (like I do), the .setHex() function also exists.
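For illustration, a minimal sketch of the emissive trick from the quoted comment, assuming the mesh uses a MeshPhongMaterial (the raycasting/picking code is omitted):

```js
// Toggle a highlight by writing the material's emissive color,
// which is added on top of the lit color, so textures stay visible.
function setHighlight(mesh, on) {
  if (on) {
    mesh.material.emissive.setRGB(0.3, 0.3, 0.3); // RGB values range from 0 to 1
  } else {
    mesh.material.emissive.setHex(0x000000); // the .setHex() variant
  }
}
```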
I am currently learning to apply more materials with lighting to my application, but I got confused about how to scale it. I'm using WebGL and learning from learningwebgl.com (which they say is the same as the NeHe OpenGL tutorials), and it only shows simple shader programs in which each sample has one program with the lighting embedded in it.
Say I have a setup with multiple lights, like some point lights/spotlights, and I have multiple meshes with different materials, but every mesh needs to react to those lights. What should I do? Make individual shader programs that apply colors/textures to the meshes and then switch to a lighting program? Or always keep all the lighting code (as functions) in my shader strings by default, append it to loaded shaders, and simply pass variables to enable them?
Also, I am focusing on per-fragment lighting, so presumably everything happens in the fragment shaders.
There are generally 2 approaches:
1. Have an uber shader
In this case you make a big shader with every option possible and lots of branching, or ways to effectively nullify parts of the shader (like multiplying by 0).
A simple example might be to have an array of light uniforms in the shader. For lights you don't want to have an effect, you just set their color to 0,0,0,0 or their power to 0, so they are still calculated but contribute nothing to the final scene.
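For illustration, a sketch of that idea in plain WebGL GLSL, embedded in JS strings (MAX_LIGHTS and all the names are made up for the example):

```js
const uberVertexShader = `
  attribute vec3 position;
  attribute vec3 normal;
  uniform mat4 modelMatrix, viewMatrix, projectionMatrix;
  uniform mat3 normalMatrix; // inverse-transpose of the model matrix's upper 3x3
  varying vec3 vNormal;
  varying vec3 vWorldPos;
  void main() {
    vNormal = normalMatrix * normal;
    vWorldPos = (modelMatrix * vec4(position, 1.0)).xyz;
    gl_Position = projectionMatrix * viewMatrix * vec4(vWorldPos, 1.0);
  }
`;

const uberFragmentShader = `
  precision mediump float;
  #define MAX_LIGHTS 4
  uniform vec3 lightPositions[MAX_LIGHTS];
  uniform vec4 lightColors[MAX_LIGHTS]; // set a slot to vec4(0.0) to disable it
  varying vec3 vNormal;
  varying vec3 vWorldPos;
  void main() {
    vec3 color = vec3(0.0);
    for (int i = 0; i < MAX_LIGHTS; i++) {
      vec3 toLight = normalize(lightPositions[i] - vWorldPos);
      float diffuse = max(dot(normalize(vNormal), toLight), 0.0);
      // a disabled light still runs the math but adds nothing
      color += diffuse * lightColors[i].rgb * lightColors[i].a;
    }
    gl_FragColor = vec4(color, 1.0);
  }
`;
```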
2. Generate shaders on the fly
In this case, for each model you figure out what options that shader needs and generate the appropriate shader with exactly the features you need.
A variation of #2 is the same but all the various shaders needed are generated offline.
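A common way to implement #2 is to prepend #define flags to a single shader template and let the GLSL preprocessor compile in only the features a model needs; the option names below are hypothetical:

```js
const fragmentTemplate = `
  precision mediump float;
  #ifdef USE_TEXTURE
    uniform sampler2D map;
    varying vec2 vUv;
  #endif
  uniform vec3 diffuse;

  void main() {
    vec3 color = diffuse;
    #ifdef USE_TEXTURE
      color *= texture2D(map, vUv).rgb;
    #endif
    // ...more optional features guarded the same way...
    gl_FragColor = vec4(color, 1.0);
  }
`;

function buildFragmentShader(options) {
  let defines = '';
  if (options.useTexture) defines += '#define USE_TEXTURE\n';
  return defines + fragmentTemplate;
}

// e.g. an untextured model gets the smallest possible shader:
const source = buildFragmentShader({ useTexture: false });
```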
Most game engines use technique #2, as it's far more efficient to use the smallest shader possible for each situation than to run an uber shader, but many smaller projects and even game prototypes often use an uber shader because it's easier than generating shaders. Especially if you don't know all the options you'll need yet.
As the title says, I would like to reuse a given ShaderMaterial for different meshes, but with a different set of uniforms for each mesh (in fact, some uniforms may vary between meshes, but not necessarily all of them): is that possible?
It seems a waste of resources to me to have to create a full ShaderMaterial for each mesh in this situation, the idea being to have a single vertex/fragment shader program but to configure it through different uniforms whose values change depending on the mesh. If I create a new ShaderMaterial for each mesh, I will end up with a lot of duplication (vertex + fragment programs, plus all the other data members of the Material/ShaderMaterial classes).
If the engine were able to call a callback before drawing a mesh, I could change the uniforms and achieve what I want. Another possibility would be a "LiteShaderMaterial" that holds a pointer to the shared ShaderMaterial plus only the uniforms specific to my mesh.
Note that my question is related to this one, Many meshes with the same geometry and material, can I change their colors?, but is still different, as I'm mostly concerned about the waste of resources. Performance-wise, I don't think it would differ much between having multiple ShaderMaterials or a single one, as the engine should be smart enough to notice that all the materials share the same programs and not resend them to the graphics card.
Thanks
When cloning a ShaderMaterial, the attributes and vertex/fragment programs are copied by reference. Only the uniforms are copied by value, which is what you want.
This should work efficiently.
You can prove it to yourself by creating a ShaderMaterial and then using ShaderMaterial.clone() to clone it for each mesh. Then assign each material unique uniform values.
In the console, type "renderer.info". It should show 1 program.
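In today's API terms, the pattern looks roughly like this (uColor is a made-up uniform name):

```js
import * as THREE from 'three';

const scene = new THREE.Scene();
const baseMaterial = new THREE.ShaderMaterial({
  uniforms: { uColor: { value: new THREE.Color(0xff0000) } },
  vertexShader: /* glsl */ `
    void main() {
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: /* glsl */ `
    uniform vec3 uColor;
    void main() { gl_FragColor = vec4(uColor, 1.0); }
  `,
});

const geometry = new THREE.SphereGeometry(1, 16, 16);
for (let i = 0; i < 10; i++) {
  const material = baseMaterial.clone(); // shaders shared by reference, uniforms copied
  material.uniforms.uColor.value.setHSL(i / 10, 1.0, 0.5); // unique color per mesh
  const mesh = new THREE.Mesh(geometry, material);
  mesh.position.x = i * 2.5;
  scene.add(mesh);
}
// after a render, renderer.info.programs.length should still be 1
```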
three.js r.64
You can safely create multiple ShaderMaterial instances with the same parameters, with clone or otherwise. Three.js will do some extra checks as a consequence of material.needsUpdate being initially true for each instance, but then it will be able to reuse the same program for all instances.
In newer releases, another option is to use a single ShaderMaterial but to apply changes to uniforms in the objects' onBeforeRender functions. This avoids unnecessary calls to initMaterial in the renderer, but whether this makes it a faster solution overall would have to be tested. It can be a risky solution if you push too far what is modified before rendering, as in the worst case the single material could have to be recompiled multiple times during the render. I recommend this guide for further tips.
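As a rough sketch of that second option (uColor and the userData field are my own names):

```js
import * as THREE from 'three';

// One shared material; each mesh rewrites the uniform just before it is drawn.
const sharedMaterial = new THREE.ShaderMaterial({
  uniforms: { uColor: { value: new THREE.Color() } },
  vertexShader: /* glsl */ `
    void main() {
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: /* glsl */ `
    uniform vec3 uColor;
    void main() { gl_FragColor = vec4(uColor, 1.0); }
  `,
});

function makeMesh(geometry, color) {
  const mesh = new THREE.Mesh(geometry, sharedMaterial);
  mesh.userData.color = new THREE.Color(color);
  // called by the renderer immediately before this mesh is drawn
  mesh.onBeforeRender = () => {
    sharedMaterial.uniforms.uColor.value.copy(mesh.userData.color);
  };
  return mesh;
}
```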