Easy way to use light in fragment shader - three.js

I have a Collada object which I load into my scene with Three.js.
Now I want to change some vertex positions of the model in the vertex shader, which is no problem.
But for that I have to skip the exported Collada material and use a ShaderMaterial instead.
The problem now is that I have to calculate the complete lighting of the scene in my fragment shader.
Before, with the Collada material, the complete lighting was calculated by the framework, using a directional light and a hemisphere light.
So my question is whether there is a solution that lets me leave the fragment shader untouched, so that all the colors are calculated as if I were not using a ShaderMaterial.
I tried using THREE.ShaderLib and passing only the fragment shader of the Phong shader together with my own vertex shader, but that only produced errors because the two shaders do not declare the same varyings.

Unfortunately, you are correct. You have to create your own ShaderMaterial, and it will be tedious to incorporate scene lighting.
If your scene lighting is not that important, you can hack in some ambient light and a single light at the camera location in your fragment shader.
If your scene lighting is important, then you need to set the ShaderMaterial parameter lights: true, and you will have access to the scene light uniforms in your vertex and fragment shaders.
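A minimal sketch of that setup (r6x-era API; the extra uniform and the script-tag IDs below are placeholders, not from the question):

var uniforms = THREE.UniformsUtils.merge( [
    THREE.UniformsLib[ 'lights' ],
    { amplitude: { type: 'f', value: 1.0 } } // your own uniforms (hypothetical)
] );

var material = new THREE.ShaderMaterial( {
    uniforms: uniforms,
    vertexShader: document.getElementById( 'vertexShader' ).textContent,
    fragmentShader: document.getElementById( 'fragmentShader' ).textContent,
    lights: true // three.js now fills in the scene light uniforms each frame
} );

Your shaders can then declare the same light uniforms the built-in materials use (for example the directionalLightColor / directionalLightDirection and hemisphere light arrays) and compute the lighting themselves.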
three.js r.63

Related

transforming a shader pass to a shader material

I would like to transform the shader pass below into a shader material.
https://github.com/felixturner/bad-tv-shader
Is it possible, and how should I proceed? I assume it should be easy if the ShaderMaterial is applied to a plane geometry.
I tried this transformation in this codesandbox: https://codesandbox.io/s/r3f-wavey-image-shader-forked-4fb238
The only thing I have done so far is copy the code of the shader pass into an object.
Thanks.
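A rough sketch of the conversion, assuming the BadTVShader object from the linked repo (it already has the { uniforms, vertexShader, fragmentShader } shape that ShaderMaterial expects; the import path below is an assumption):

import * as THREE from 'three';
import { BadTVShader } from './BadTVShader';

const badTVMaterial = new THREE.ShaderMaterial({
  uniforms: THREE.UniformsUtils.clone(BadTVShader.uniforms),
  vertexShader: BadTVShader.vertexShader,
  fragmentShader: BadTVShader.fragmentShader,
});

// Assign the material to the plane mesh, point the shader's tDiffuse uniform
// at the texture you want to distort, and advance the time uniform each frame:
// badTVMaterial.uniforms.tDiffuse.value = texture;
// badTVMaterial.uniforms.time.value += delta;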

Custom geometry transformation distorts material

My custom geometry's mesh material gets distorted after I apply a matrix transformation to the geometry.
In the sample jsfiddle, I've included my custom TorusGeometry. My custom geometry is placed next to a built-in CylinderGeometry.
You can see the difference in material between these two meshes. If I remove the geometry transformation, both meshes' materials look fine.
I guess I'm messing up the normals during the matrix transformation, but I'm not sure how to fix it.
https://jsfiddle.net/arundhaj/nrdg2faL/
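If the distortion really does come from stale normals, recomputing them after the transform may be enough (names below are hypothetical, not taken from the fiddle):

customTorusGeometry.applyMatrix4( transform ); // applyMatrix() in older three.js releases
customTorusGeometry.computeVertexNormals();    // rebuild normals from the transformed vertices
customTorusGeometry.normalsNeedUpdate = true;  // only needed for the old THREE.Geometry class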

Three.js Merge objects and textures

My question is related to this article:
http://blog.wolfire.com/2009/06/how-to-project-decals/
If my understanding is correct, a mesh made from the intersection of the original mesh and a cube is added to the scene to make a decal appear.
I need to save the final texture. So I was wondering if there is a way to 'merge' the texture of the original mesh and the added decal mesh?
You'd need to do some tricky stuff to convert from the model geometry space into UV coordinate space so you could draw the new pixels into the texture map. If you want to be able to use more than one material that way, you'd also probably need to implement some kind of "material map" similar to how some deferred rendering systems work. Otherwise you're limited to at most one material per face, which wouldn't work for detailed decals with alpha.
I guess you could copy the UV coordinates from the original mesh into the decal mesh, and then use that information to reproject the decal texture into the original texture.
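A rough sketch of that bake step, assuming the original material uses a single texture whose image is already loaded and the decal geometry uses the old THREE.Geometry API with the copied UVs (all names here are hypothetical):

var canvas = document.createElement( 'canvas' );
canvas.width = originalTexture.image.width;
canvas.height = originalTexture.image.height;
var ctx = canvas.getContext( '2d' );

// start from the original texture
ctx.drawImage( originalTexture.image, 0, 0 );

// fill each decal triangle at the pixel position given by its copied UVs;
// a full solution would warp the decal image per triangle instead of a flat fill
decalGeometry.faceVertexUvs[ 0 ].forEach( function ( faceUvs ) {
    ctx.beginPath();
    faceUvs.forEach( function ( uv, i ) {
        var x = uv.x * canvas.width;
        var y = ( 1 - uv.y ) * canvas.height; // canvas y runs downwards
        if ( i === 0 ) { ctx.moveTo( x, y ); } else { ctx.lineTo( x, y ); }
    } );
    ctx.closePath();
    ctx.fillStyle = 'rgba(255, 0, 0, 0.8)'; // placeholder decal colour
    ctx.fill();
} );

var bakedTexture = new THREE.Texture( canvas );
bakedTexture.needsUpdate = true; // use this as the merged texture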

Three.js - CanvasRenderer problems: flat shading?

I'm trying to use CanvasRenderer (three.js) as a fallback for devices that don't support WebGL. Is there a comparison page explaining what is different and what cannot be used with CanvasRenderer?
I'm experiencing two main issues:
1. Flat shading: lights are completely missing (is MeshPhongMaterial supported?). I don't see any lighting or shadows (are shadows supported in CanvasRenderer?). All I see is the diffuse texture without any lighting. In WebGL my current setup uses a PointLight, a DirectionalLight, soft shadows, antialiasing and a MeshPhongMaterial (with diffuse, bump, spec and env maps):
this.materialM = new THREE.MeshPhongMaterial({
    ambient : 0x050505,
    color : this.model.color,
    specular : 0xcccccc,
    shininess : 100,
    bumpScale : BUMP_SCALE,
    reflectivity : REFLECTIVITY
});
2. Transparent polygon edges: I know this can be tweaked with material.overdraw = 0.5, but that produces other artifacts (it probably just scales polygons along their normals?). I can live with this one.
Any help on 1., or a general overview of what is not possible with CanvasRenderer compared to WebGLRenderer, is greatly appreciated!
three.js r68
CanvasRenderer has limitations.
MeshPhongMaterial is not supported in CanvasRenderer -- it falls back to MeshLambertMaterial.
MeshLambertMaterial is supported, but not when the material has a texture -- it falls back to MeshBasicMaterial. ( MeshBasicMaterial is rendered without regard to scene lights. )
Shadows are not supported.
material.overdraw = 0.5 is helpful in hiding polygon edges when the material is opaque. It may still leave artifacts if the material is transparent.
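A minimal fallback sketch along those lines (r68-era API; Detector.js from the three.js examples is assumed to be loaded):

var renderer, material;

if ( Detector.webgl ) {
    renderer = new THREE.WebGLRenderer( { antialias: true } );
    material = this.materialM; // the MeshPhongMaterial from the question
} else {
    renderer = new THREE.CanvasRenderer();
    // Phong falls back to Lambert, and Lambert-with-texture falls back to Basic,
    // so an untextured Lambert keeps at least the scene lighting
    material = new THREE.MeshLambertMaterial( {
        color: this.model.color,
        overdraw: 0.5 // hides polygon edge gaps on opaque materials
    } );
}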
three.js r.68

How to apply texture to mesh without specifying UV's in geometry using three.js?

Is it possible to apply a texture to a mesh without specifying UVs in the geometry in three.js?
There are classes such as THREE.CubeGeometry, THREE.SphereGeometry, etc. that automatically generate the UV coordinates for you. However, if you are creating your own geometry from scratch (i.e., specifying vertex locations, creating faces, etc.) then the answer is no. Either you need to set the UV coordinates manually when creating the geometry, or you need to write a custom shader which determines the UV coordinates for any given point. Think about it this way: if you don't specify UV coordinates, the points on your geometry have no idea which point on your texture they should display.
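For a custom geometry, one simple option is to generate planar UVs from the bounding box, along these lines (a minimal sketch assuming the old THREE.Geometry API of that era):

function assignPlanarUVs( geometry ) {
    geometry.computeBoundingBox();
    var box = geometry.boundingBox;
    var size = new THREE.Vector3().subVectors( box.max, box.min );
    geometry.faceVertexUvs[ 0 ] = [];
    geometry.faces.forEach( function ( face ) {
        var uvs = [ face.a, face.b, face.c ].map( function ( index ) {
            var v = geometry.vertices[ index ];
            // project onto the XY plane; pick other axes for other orientations
            return new THREE.Vector2(
                ( v.x - box.min.x ) / size.x,
                ( v.y - box.min.y ) / size.y
            );
        } );
        geometry.faceVertexUvs[ 0 ].push( uvs );
    } );
    geometry.uvsNeedUpdate = true;
}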

Resources