Compute normals from displacement map in three.js r.58? - three.js

I'm using the normal shader in three.js r.58, which I understand requires a normal map. However, I'm using a dynamic displacement map, so a pre-computed normal map won't work in this situation.
All the examples I've found of lit displacement maps either use flat shading or pre-computed normal maps. Is it possible to calculate the normals dynamically based on the displaced vertices instead?
Edit: I've posted a demo of a sphere with a displacement map showing flat normals.
Here's a link to the GitHub repo with all of my examples illustrating this problem, and the solutions I eventually found:
https://github.com/meetar/three.js-normal-map-0

This answer is based on your comments above.
You can do what you want, but it is quite sophisticated, and you will of course have to modify the three.js 'normal' shader.
Have a look at http://alteredqualia.com/three/examples/webgl_cubes_indexed.html. Look at the fragment shader, and you will see
vec3 normal = normalize( cross( dFdx( vViewPosition ), dFdy( vViewPosition ) ) );
Alteredqualia is using a derivative normal in the fragment shader ( instead of an attribute normal ) because the vertex positions are changing in the vertex shader, and the normal is not known.
What he is doing is calculating the normal using the cross product of the x and y screen-space derivatives of the fragment position.
This will set the normal as the face normal. It will be discontinuous at hard edges.
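For reference, a minimal sketch of that approach as a THREE.ShaderMaterial; the displacement step, the extension handling, and the debug output are placeholders, not the exact code from the example above:

// Minimal sketch: face normals from screen-space derivatives of the view-space position.
// Requires the OES_standard_derivatives extension (WebGL 1); how it gets enabled depends
// on your three.js revision, so treat this as an outline rather than drop-in code.
var material = new THREE.ShaderMaterial( {

    vertexShader: [
        "varying vec3 vViewPosition;",
        "void main() {",
        "    // displace 'position' here however you like before projecting it",
        "    vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );",
        "    vViewPosition = mvPosition.xyz;",
        "    gl_Position = projectionMatrix * mvPosition;",
        "}"
    ].join( "\n" ),

    fragmentShader: [
        "#extension GL_OES_standard_derivatives : enable",
        "varying vec3 vViewPosition;",
        "void main() {",
        "    // cross product of the screen-space derivatives gives the face normal",
        "    vec3 normal = normalize( cross( dFdx( vViewPosition ), dFdy( vViewPosition ) ) );",
        "    // debug visualization of the normal; replace with your lighting",
        "    gl_FragColor = vec4( normal * 0.5 + 0.5, 1.0 );",
        "}"
    ].join( "\n" )

} );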
three.js r.58

What I was describing above is called a "bump map", and bump mapping comes built into the three.js phong shader. I combined the normalmap shader with the chunks of the phong shader responsible for bump mapping:
http://meetar.github.io/three.js-normal-map-0/bump.html
Though the normals are a bit noisy, they are basically correct.
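If you only need the built-in behaviour rather than a custom shader, something along these lines should work; bumpMap and bumpScale are standard MeshPhongMaterial properties, while the texture path and scale value are placeholders:

// Sketch: let the phong shader's bump mapping perturb the normals from the
// same height map you use for displacement.
var heightMap = THREE.ImageUtils.loadTexture( "displacement.png" ); // placeholder path

var material = new THREE.MeshPhongMaterial( {
    bumpMap: heightMap,   // height map drives the per-fragment normal perturbation
    bumpScale: 1.0        // tune to taste; too high exaggerates the noise
} );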

You can also calculate a normal map from the displacement map with JavaScript. This results in smooth normals, and is a good option if your displacement map isn't changing too often.
This method uses the code found in this demo: http://mrdoob.com/lab/javascript/height2normal/
Demo here:
http://meetar.github.io/three.js-normal-map-0/index14.html
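The core of that approach is just a finite-difference pass over the height map; a rough sketch (not mrdoob's exact code) looks like this:

// Rough sketch of height-map -> normal-map conversion with central differences.
// 'imageData' is the ImageData of the height map drawn to a 2D canvas; the result
// can be drawn back to a canvas and used as a normal-map texture.
function heightToNormal( imageData ) {
    var w = imageData.width, h = imageData.height, src = imageData.data;
    var out = new Uint8ClampedArray( src.length );
    var strength = 0.1; // smaller = stronger bumps

    function heightAt( x, y ) {
        x = Math.max( 0, Math.min( w - 1, x ) );
        y = Math.max( 0, Math.min( h - 1, y ) );
        return src[ ( y * w + x ) * 4 ] / 255; // red channel as height
    }

    for ( var y = 0; y < h; y ++ ) {
        for ( var x = 0; x < w; x ++ ) {
            var dx = heightAt( x + 1, y ) - heightAt( x - 1, y );
            var dy = heightAt( x, y + 1 ) - heightAt( x, y - 1 );
            var len = Math.sqrt( dx * dx + dy * dy + strength * strength );

            var i = ( y * w + x ) * 4;
            out[ i     ] = ( -dx / len * 0.5 + 0.5 ) * 255;      // pack [-1,1] into [0,255]
            out[ i + 1 ] = ( -dy / len * 0.5 + 0.5 ) * 255;
            out[ i + 2 ] = ( strength / len * 0.5 + 0.5 ) * 255;
            out[ i + 3 ] = 255;
        }
    }

    for ( var j = 0; j < out.length; j ++ ) src[ j ] = out[ j ];
    return imageData;
}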

Related

How can I get per-vertex emission and per-vertex lighting in Three.js?

I want to see a chart with color specified per vertex, and to get a little bit of shading too.
But if I use MeshBasicMaterial I only get the vertex colors with no dynamic shading.
On the other hand, if I use MeshPhongMaterial I just get shading, but without the emissiveness from my vertex colors.
As the THREE.JS PhongMaterial supports vertexColors, giving you a nice combination of dynamic lighting and vertex colors, I'm not quite sure I understand your question. Perhaps that is something you should investigate more?
However, as an alternative to writing a custom shader you could try rendering your model in multiple passes.
This will not give you as much control over the way the vertex colors and phong lighting are combined as a shader would, but often a simple add/multiply blend can give pretty decent results.
Algorithm:
- create two meshes for the same BufferGeometry, one with the BasicMaterial and one with the PhongMaterial
- for the PhongMaterial, set
  depthFunc = THREE.EqualDepth
  transparent = true
  blending = THREE.AdditiveBlending (or THREE.MultiplyBlending)
- render the first mesh
- render the second mesh at the exact same spot (see the sketch below)
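A minimal sketch of that setup, assuming your three.js build exposes Material.depthFunc and the THREE.EqualDepth constant (as the steps above use) and that 'geometry' already carries a per-vertex color attribute:

// Pass 1: unlit vertex colors. Pass 2: phong lighting blended on top.
var basicMesh = new THREE.Mesh( geometry, new THREE.MeshBasicMaterial( {
    vertexColors: THREE.VertexColors
} ) );

var phongMesh = new THREE.Mesh( geometry, new THREE.MeshPhongMaterial( {
    transparent: true,
    depthFunc: THREE.EqualDepth,       // only shade pixels the first pass already wrote
    blending: THREE.AdditiveBlending   // or THREE.MultiplyBlending
} ) );

// same transform, so both meshes render at the exact same spot;
// the opaque basicMesh is drawn first, the transparent phongMesh after it
scene.add( basicMesh );
scene.add( phongMesh );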

Manual selection lod of mipmaps in a fragment shader using three.js

I'm writing a physically based shader using GLSL ES in three.js. For the addition of specular global illumination I use a cubemap DDS texture with a mipmap chain inside (precalculated with CubeMapGen, as explained here). I need to access this texture in the fragment shader, and I would like to select the mipmap index manually. The correct function for doing this is
vec4 textureCubeLod(samplerCube sampler, vec3 coord, float lod)
but it's available only in vertex shader. In my fragment shader I'm using the similar function
vec4 textureCube(samplerCube sampler, vec3 coord, float bias)
but it doesn't work well, because the bias parameter is just added to the automatically calculated level of detail. So when I zoom in or out on the scene, the mipmap LOD changes, but for my shader it must stay the same (it must depend only on the roughness parameter, as explained in the link above).
I would like to select the mipmap level in the fragment shader manually, depending only on the roughness of the material (for example using the formula mipMapIndex = roughness*numMipMap), so it stays constant with distance and does not change automatically when zooming. How can I solve this?
It won't work with WebGL at the moment, because there is no support for this feature. You can experiment with the texture LOD extension, though, in recent builds of Chrome Canary, but it still needs some tweaking. Go to about:flags and look for this:
Enable WebGL Draft Extensions
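If the draft extension is available in your browser, the rough usage pattern looks like this; the extension is EXT_shader_texture_lod, and the uniform and function names below are made up for the sketch:

// Check for the draft extension before relying on it.
var gl = renderer.getContext();
if ( gl.getExtension( "EXT_shader_texture_lod" ) === null ) {
    console.warn( "EXT_shader_texture_lod not available" );
}

// Fragment shader chunk: pick the mip level from roughness, not from distance.
var fragmentChunk = [
    "#extension GL_EXT_shader_texture_lod : enable",
    "uniform samplerCube envMap;   // prefiltered cube map with the mip chain",
    "uniform float roughness;      // material roughness, 0..1",
    "uniform float numMips;        // number of mip levels in the chain",
    "",
    "vec3 specularIBL( vec3 reflectDir ) {",
    "    float lod = roughness * numMips;   // mipMapIndex = roughness * numMipMap",
    "    return textureCubeLodEXT( envMap, reflectDir, lod ).rgb;",
    "}"
].join( "\n" );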

different OpenGL ES vertex color interpolation modes?

I'm trying to implement a simple linear gradient like in Photoshop. The color interpolation between vertices seems to go by (additive?) numerical value rather than what you would expect from "paint blending." Here is a visual example with green and red:
The one on the left is roughly what I get, and I want to achieve the one on the right.
Is there any easy way to achieve this?
As @Andon commented, using the texture system is a good way to do this. Here's what you need:
Assign one (or more, but you only need one for this trick) texture coordinate(s) to each vertex when you set up attributes in your vertex buffer.
In your vertex shader, read that attribute and write it to a varying so it gets interpolated for use in the fragment shader.
In your fragment shader, read the varying -- this tells you how far along the gradient ramp you should be in the current fragment; i.e. a blending factor.
At this point, you have two choices:
Use a 1D texture image that looks like the gradient you want, and look it up with the texture2D shader function and the varying texture coordinate you got. This will fetch the corresponding texel color so you can output it to gl_FragColor.
Calculate the color blend in the fragment shader. If you pass in the endpoint colors in your shader as uniforms, you combine them according to the blending factor using whatever math you can do in GLSL (including things like Photoshop blend modes).
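A sketch of the second option as plain GLSL ES, with two uniform colors and a single attribute as the blend factor (all names here are placeholders):

var vertexShaderSource = [
    "attribute vec4 aPosition;",
    "attribute float aGradientCoord;   // 0.0 at one end of the gradient, 1.0 at the other",
    "uniform mat4 uMvpMatrix;",
    "varying float vBlend;",
    "void main() {",
    "    vBlend = aGradientCoord;       // interpolated per fragment by the rasterizer",
    "    gl_Position = uMvpMatrix * aPosition;",
    "}"
].join( "\n" );

var fragmentShaderSource = [
    "precision mediump float;",
    "uniform vec3 uColorA;",
    "uniform vec3 uColorB;",
    "varying float vBlend;",
    "void main() {",
    "    // plain linear mix; swap in whatever blend-mode math you prefer here",
    "    gl_FragColor = vec4( mix( uColorA, uColorB, vBlend ), 1.0 );",
    "}"
].join( "\n" );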

Duplicate existing Three.js materials to add displacement calculations

I need to do vertex displacements using a texture map in Three.js.
Is there an existing material that supports that?
If not, what is the best way to duplicate an existing Three.js shader so that I can add in some vertex displacement calculations? I would like to keep existing functionalities such as shadows and wireframe on the material.
Vertex displacements using a texture map are supported by THREE.ShaderTerrain[ "terrain" ] and THREE.ShaderLib[ "normalmap" ].
Examples of their use can be found in http://threejs.org/examples/webgl_terrain_dynamic.html and http://threejs.org/examples/webgl_materials_normalmap.html.
If these do not suit your needs, then you will have to write your own shader. Doing so is not easy. It is best to modify an existing shader.
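For example, the normalmap shader can be cloned and given a displacement texture roughly like this; the uniform names are taken from the normalmap shader of that era, so verify them against the THREE.ShaderLib source in your revision (normalTexture and displacementTexture are placeholders):

// Sketch: clone the built-in normalmap shader and turn on its displacement path.
var shader = THREE.ShaderLib[ "normalmap" ];
var uniforms = THREE.UniformsUtils.clone( shader.uniforms );

uniforms[ "tNormal" ].value = normalTexture;              // your normal map
uniforms[ "enableDisplacement" ].value = true;
uniforms[ "tDisplacement" ].value = displacementTexture;  // your (dynamic) height map
uniforms[ "uDisplacementScale" ].value = 10.0;            // placeholder values
uniforms[ "uDisplacementBias" ].value = 0.0;

var material = new THREE.ShaderMaterial( {
    uniforms: uniforms,
    vertexShader: shader.vertexShader,
    fragmentShader: shader.fragmentShader,
    lights: true
} );

// note: this shader also expects per-vertex tangents, e.g. via geometry.computeTangents()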
three.js r.61

OpenGL ES 2.0 Per Fragment Lighting for Untextured Generated Geometry

I'm currently generating geometry rather than importing it as a model. This makes it necessary to calculate all normals within the application.
I've implemented Gouraud shading (per vertex lighting) successfully, and now wish to implement Phong shading (per fragment/pixel).
I've had a look at relevant tutorials online and there are two camps: one offers a simple Gouraud-to-Phong reshuffling of shader code which, while offering improved lighting, isn't truly per-pixel. The second does things the right way by utilising normal maps embedded within textures, but these are generated within a modelling toolkit such as RenderMonkey.
My questions are:
How should I go about programmatically generating normals for my generated geometry, considering it as a vertex set? In other words, given a set of discrete polygonal points, will it be necessary to manually calculate interpolated normals?
Should I store generated normals within a texture as exemplified online, and if so, how would I go about doing this in code rather than through modelling software?
Computing the lighting in the fragment shader on the interpolated per-vertex normals will definitely yield better results (assuming you properly re-normalize the interpolated normals in the fragment shader), and it is truly per-pixel. The strength of the difference may vary depending on the model tessellation and the lighting variation, though. Have you just tried it out?
As long as you don't have any changing normals inside a face (think of bump mapping) and only interpolate the per-vertex normals, a normal map is completely unnecessary, as you get interpolated normals from the rasterizer anyway. Whereas normal mapping can give nicer effects if you really have per-pixel normal variations (like a very rough surface), it is not necessarily the right way to do per-pixel lighting.
It makes a huge difference whether you compute the lighting per vertex and interpolate the colors, or compute the lighting per fragment (even if you just interpolate the per-vertex normals; that is what classical Phong shading is about), especially when you have quite large triangles or very shiny surfaces (very high-frequency lighting variation).
As said, if you don't have high-frequency normal variations (changing normals inside a triangle), you don't need a normal map, nor do you need to interpolate the per-vertex normals yourself. You just generate per-vertex normals like you did for the per-vertex lighting (e.g. by averaging adjacent face normals). The rasterizer does the interpolation for you.
You should first try out simple per-pixel lighting before delving into techniques like normal mapping. If your geometry is not very finely tessellated or you have very shiny surfaces, you will surely see the difference from simple per-vertex lighting. Once this works you can try normal-mapping techniques, but for them to work you surely need to first understand the meaning of per-pixel lighting and Phong shading in contrast to Gouraud shading.
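A stripped-down GLSL ES pair showing that difference, with the lighting math moved into the fragment shader and the interpolated normal re-normalized (single directional light; all names are placeholders):

var vertexShaderSource = [
    "attribute vec4 aPosition;",
    "attribute vec3 aNormal;",
    "uniform mat4 uMvpMatrix;",
    "uniform mat3 uNormalMatrix;",
    "varying vec3 vNormal;",
    "void main() {",
    "    vNormal = uNormalMatrix * aNormal;   // just pass the per-vertex normal on",
    "    gl_Position = uMvpMatrix * aPosition;",
    "}"
].join( "\n" );

var fragmentShaderSource = [
    "precision mediump float;",
    "uniform vec3 uLightDir;      // normalized, in the same space as vNormal",
    "uniform vec3 uDiffuseColor;",
    "varying vec3 vNormal;",
    "void main() {",
    "    // re-normalize: interpolation shortens the normal across the triangle",
    "    vec3 n = normalize( vNormal );",
    "    float diff = max( dot( n, uLightDir ), 0.0 );",
    "    gl_FragColor = vec4( uDiffuseColor * diff, 1.0 );",
    "}"
].join( "\n" );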
Normal maps are not a requirement for per-pixel lighting.
The only requirement, by definition, is that the lighting solution is evaluated for every output pixel/fragment. You can store the normals on the vertexes just as well (and more easily).
Normal maps can either provide full normal data (rgb maps) or simply modulate the stored vertex normals (du/dv maps, appear red/blue). The latter form is perhaps more common and relies on vertex normals to function.
How to generate the normals depends on your code and geometry. Typically, you compute face normals with cross products of edge vectors and average them over the surrounding faces or vertices for smooth normals, or just create a unit vector pointing in whatever direction is "out" for your geometry.
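As a sketch, smooth per-vertex normals for an indexed triangle list can be generated by accumulating face normals (cross products of edge vectors) per vertex and normalizing the sums; three.js's computeVertexNormals does essentially this:

// Sketch: smooth vertex normals for an indexed triangle mesh.
// positions: flat [x, y, z, ...] array, indices: triangle index array.
function computeSmoothNormals( positions, indices ) {
    var normals = new Float32Array( positions.length ); // starts zeroed

    for ( var i = 0; i < indices.length; i += 3 ) {
        var ia = indices[ i ] * 3, ib = indices[ i + 1 ] * 3, ic = indices[ i + 2 ] * 3;

        // two edge vectors of the triangle
        var e1x = positions[ ib ] - positions[ ia ], e1y = positions[ ib + 1 ] - positions[ ia + 1 ], e1z = positions[ ib + 2 ] - positions[ ia + 2 ];
        var e2x = positions[ ic ] - positions[ ia ], e2y = positions[ ic + 1 ] - positions[ ia + 1 ], e2z = positions[ ic + 2 ] - positions[ ia + 2 ];

        // face normal = cross( e1, e2 ); accumulate it on each corner vertex
        var nx = e1y * e2z - e1z * e2y, ny = e1z * e2x - e1x * e2z, nz = e1x * e2y - e1y * e2x;
        [ ia, ib, ic ].forEach( function ( j ) {
            normals[ j ] += nx; normals[ j + 1 ] += ny; normals[ j + 2 ] += nz;
        } );
    }

    // normalize the accumulated sums to get the smooth per-vertex normals
    for ( var v = 0; v < normals.length; v += 3 ) {
        var len = Math.sqrt( normals[ v ] * normals[ v ] + normals[ v + 1 ] * normals[ v + 1 ] + normals[ v + 2 ] * normals[ v + 2 ] ) || 1.0;
        normals[ v ] /= len; normals[ v + 1 ] /= len; normals[ v + 2 ] /= len;
    }

    return normals;
}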
