I'm trying to get shadows to work in a custom shader in Three.js. I've tried adding these to my code:
In uniforms:
THREE.UniformsLib["shadowmap"]
In the fragment shader:
THREE.ShaderChunk["shadowmap_pars_fragment"]
THREE.ShaderChunk["shadowmap_fragment"]
In the vertex shader:
THREE.ShaderChunk["shadowmap_pars_vertex"]
THREE.ShaderChunk["shadowmap_vertex"]
which works. The object can receive shadows.
However, it cannot cast shadows. Does anyone know what other bits of code are needed?
I believe that you need to mark each object as casting and receiving shadows
I think it's just:
obj.castShadow = true;
obj.receiveShadow = true;
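Beyond those two flags (a hedged aside, not part of the answer above), shadow maps usually also have to be switched on for the renderer and for the shadow-casting light; the exact property names have changed between releases, but assuming an existing renderer and light the sketch looks like this:
renderer.shadowMapEnabled = true; // newer releases use renderer.shadowMap.enabled = true
light.castShadow = true; // the light itself must be allowed to cast shadows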
Related
I want a material with:
Textures
Not receiving lights
Receiving shadows
I tried with the following library materials:
MeshBasicMaterial: Does not support shadows
MeshLambertMaterial: If you disable lights (material.lights = false), it also disables shadows
ShadowMaterial: Does not support textures
Is a custom ShaderMaterial the only way to achieve it?
In three.js, as in real life, shadows are the absence of light. So for a built-in three.js material to receive shadows, it must respond to light.
However, you can modify a built-in material's shader to achieve the effect you want with just a few lines of code. Here is an example to get you started:
THREE.ShaderLib[ 'lambert' ].fragmentShader = THREE.ShaderLib[ 'lambert' ].fragmentShader.replace(
`vec3 outgoingLight = reflectedLight.directDiffuse + reflectedLight.indirectDiffuse + totalEmissiveRadiance;`,
`#ifndef CUSTOM
vec3 outgoingLight = reflectedLight.directDiffuse + reflectedLight.indirectDiffuse + totalEmissiveRadiance;
#else
vec3 outgoingLight = diffuseColor.rgb * ( 1.0 - 0.5 * ( 1.0 - getShadowMask() ) ); // shadow intensity hardwired to 0.5 here
#endif`
);
Then, to use it:
var material = new THREE.MeshLambertMaterial( { map: texture } );
material.defines = material.defines || {};
material.defines.CUSTOM = "";
In spite of its name, this material will behave like MeshBasicMaterial, but will darken when it is in shadow. And furthermore, MeshLambertMaterial will still work as expected.
three.js r.88
In a past version, maybe r.72, you could cast and receive shadows with MeshBasicMaterial. It was simple. Then the concept of ambient light changed in three.js, and MeshBasicMaterial could no longer support shadows.
THREE.ShadowMaterial was introduced to compensate for that limitation. It works great! But it really only works on PlaneGeometry because, by its nature, THREE.ShadowMaterial is transparent, so shadows cast both inside and outside the Object3D that uses ShadowMaterial are visible.
The idea is that you use two meshes, one with the MeshBasicMaterial, and the other with ShadowMaterial.
var shape = new THREE.BoxGeometry(1, 1, 1),
    basicMaterial = new THREE.MeshBasicMaterial({
        color: 0xff0000
    }),
    mesh = new THREE.Mesh(shape, basicMaterial),
    shadowMaterial = new THREE.ShadowMaterial({ opacity: 0.2 }),
    mesh2 = new THREE.Mesh(shape, shadowMaterial);

mesh2.receiveShadow = true; // the ShadowMaterial mesh is the one that receives the shadow
scene.add(mesh);
scene.add(mesh2);
You can see an example of the problem here: https://jsfiddle.net/7d47oLkh/
The shadows cast at the bottom of the box are incorrect for this use case.
So the answer is no: there is no easy way to have a full-bright basic material that also receives and casts shadows in three.js.
Using Three.js (r67) with the WebGL renderer, I can't seem to get a plane with a ShaderMaterial to display its texture. No matter what I do, the material just stays black.
My code at the moment is quite basic:
var grassT = new Three.Texture(grass); // grass is an already loaded image.
grassT.wrapS = grassT.wrapT = Three.ClampToEdgeWrapping;
grassT.flipY = false;
grassT.minFilter = Three.NearestFilter;
grassT.magFilter = Three.NearestFilter;
grassT.needsUpdate = true;
var terrainUniforms = {
grassTexture : { type: "t", value: grassT},
}
Then I just have this relevant part in the vertex shader:
vUv = uv;
And on the fragment shader side:
gl_FragColor = texture2D(grassTexture, vUv);
This results in:
Black material.
No error in console.
gl_FragColor value is always (0.0, 0.0, 0.0, 1.0).
What I tried / checked:
Everything works fine if I just apply custom plain colors.
All is ok if I use vertexColors with plain colors too.
My texture width / height is indeed a power of 2.
The image is on the same server as the code.
Tested other images with the same result.
The image is actually loading in the browser debugger.
UVs for the mesh are correct.
Played around with wrapT, wrapS, minFilter, magFilter
Adapted the mesh size so the texture has a 1:1 ratio.
Preloaded the image with the requirejs image plugin and created the texture with new THREE.Texture() instead of using THREE.ImageUtils.loadTexture().
Played around with needsUpdate : true;
Tried to add defines['USE_MAP'] during material instantiation.
Tried to add material.dynamic = true.
I have a correct rendering loop (interaction with the terrain is working).
What I still wonder :
It's a multiplayer game using a custom port with express + socket.io. Am I hit by any WebGL security policy?
I have no lighting logic at the moment; is that a problem?
Maybe the shader material needs other "defines" at instantiation?
I guess I'm overlooking something simpler, this is why I'm asking...
Thanks.
I am applying various effects to the same shader. I have a custom API that merges all the different effects' uniforms simply by using Three.UniformsUtils.merge(). However, this function calls the clone() method on the texture, and this resets needsUpdate to false before the texture reaches the renderer.
It appears that you should set your texture's needsUpdate property to true once you reach the material level. If the uniform you set at the texture level gets merged, and therefore cloned, later in the process, it will lose its needsUpdate property.
The issue is also detailed here: https://github.com/mrdoob/three.js/issues/3393
In my case, the following wasn't working (grassT is my texture):
grassT.needsUpdate = true;
while the following works perfectly later on in the code:
material.uniforms.grassTexture.value.needsUpdate = true;
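To make the merging issue concrete, here is a minimal sketch of the pattern (the shader sources and the lights entry are placeholders, not the asker's exact code):
var uniforms = THREE.UniformsUtils.merge([
    THREE.UniformsLib["lights"],
    { grassTexture: { type: "t", value: grassT } }
]);
var material = new THREE.ShaderMaterial({
    uniforms: uniforms,
    vertexShader: vertexShaderSource,
    fragmentShader: fragmentShaderSource
});
// The merge cloned the texture uniform, so flag it again at the material level:
material.uniforms.grassTexture.value.needsUpdate = true;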
Image loading is asynchronous. Most likely, you are rendering your scene before the texture image loads.
You must set the texture.needsUpdate flag to true after the image loads. three.js has a utility that will do that for you:
var texture = THREE.ImageUtils.loadTexture( "texture.jpg" );
Once rendered, the renderer sets the texture.needsUpdate flag back to false.
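If you render only once rather than in a continuous loop, you can also trigger a render from the load callback; a minimal sketch, assuming renderer, scene and camera already exist (the second argument is the optional mapping):
var texture = THREE.ImageUtils.loadTexture( "texture.jpg", undefined, function () {
    renderer.render( scene, camera ); // render again once the image is available
} );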
three.js r.68
I'm loading a JPEG file for a light map:
var texture = new THREE.ImageUtils.loadTexture("textures/metal.jpg");
Then I apply the texture to THREE.MeshPhongMaterial
var frontMaterial = new THREE.MeshPhongMaterial( {
color: 0xfade7e,
specular: 0xffffff,
ambient: 0xaa0000,
lightMap:texture
} )
The full error message is: WebGLRenderingContext: GL ERROR: GL_INVALID_OPERATION: glDrawElements: attempt to access out of range vertices in attribute 2
Is something wrong here? The error occurs in all browsers. Three.js r.56.
As explained by @alteredq in this thread, a light map requires a second set of UVs.
The point of lightmaps is that they can live independently of other textures, thus giving other textures a chance to be much higher detail. Lightmaps use their own set of UV coordinates (usually auto-generated by some light baking solution, as opposed to the artist-created primary UV set).
Using lightmaps with the same UVs as everything else doesn't make much sense, as then you could achieve basically the same result for less texture cost simply by baking the light map together with the color map (this is e.g. what Rage uses; it looks fantastic but needs a boatload of textures).
Also, lightmaps should be multiplicative, not additive. A big use case for lightmaps is pre-baked shadows and ambient occlusion, so you need to be able to darken things.
So the answer to your question is that geometry.faceVertexUvs[0] contains the usual set of UVs; you need to add geometry.faceVertexUvs[1] to your geometry.
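If you just need the error to go away while testing, a minimal workaround (not the proper baked, dedicated UV set discussed above) is to reuse the primary UVs as the second set:
geometry.faceVertexUvs[1] = geometry.faceVertexUvs[0]; // second UV set used by the light map
geometry.uvsNeedUpdate = true;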
three.js r.56
This error occurs because the Three.js buffers are outdated. When you add textures (map, bumpMap, ...) to a Mesh, you must recompose the buffers and update the UVs like this:
(ob is a THREE.Mesh, mt is a Material, tex is a Texture.)
tex.needsUpdate = true;
mt.map = tex;
ob.material = mt;
ob.geometry.buffersNeedUpdate = true;
ob.geometry.uvsNeedUpdate = true;
mt.needsUpdate = true;
That's all, folks!
Hope it helps.
Regards.
Sayris
I'm creating a sphere and adding distortion to it; that's working fine.
When I look at the wireframe, the distortion is clearly there, but with the wireframe turned off there is no shading and the distortion isn't visible.
What I'm looking for is what to place in my custom fragment shader. I used this:
// calc the dot product and clamp
// 0 -> 1 rather than -1 -> 1
vec3 light = vec3(0.5,0.2,1.0);
// ensure it's normalized
light = normalize(light);
// calculate the dot product of
// the light to the vertex normal
float dProd = max(0.0, dot(vNormal, light));
// feed into our frag colour
gl_FragColor = vec4(dProd, dProd, dProd, 1.0);
But that just creates a very ugly false light.
Any ideas anybody?
Thanks in advance,
Wezy
If you want to use Three.js' lights (and I can't think of a reason why not), you need to include the corresponding shader chunks:
Take a look at how the WebGLRenderer composes its shaders in THREE.ShaderLib
Pick a material there that is close to what you want and copy its definition (uniforms and both shader codes) to your code, renaming it to something custom
Delete the chunks you don't need, and add your custom code in the appropriate places; a rough sketch follows.
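For example, a sketch only (the exact uniforms and chunk names vary between releases): start from a built-in definition, clone its uniforms, and hand the shaders to a ShaderMaterial with lights enabled.
var base = THREE.ShaderLib["lambert"];
var material = new THREE.ShaderMaterial({
    uniforms: THREE.UniformsUtils.clone(base.uniforms),
    vertexShader: base.vertexShader,     // extend with your custom chunks here
    fragmentShader: base.fragmentShader, // and here
    lights: true                         // tells the renderer to feed the light uniforms
});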
I have this object I'm loading with THREE.OBJLoader, and then I create a mesh with it like so:
mesh = THREE.SceneUtils.createMultiMaterialObject(
geometry,
[
new THREE.MeshBasicMaterial({color: 0xFEC1EA}),
new THREE.MeshBasicMaterial({
color: 0x999999,
wireframe: true,
transparent: true,
opacity: 0.85
})
]
);
In my scene I then add a DirectionalLight. It works and I can see my object; however, it's as if the DirectionalLight were an ambient one: no face is getting darker or lighter as it should.
The object is filled with the color, but no lighting is applied to it.
If someone can help me with that it would be much appreciated :)
What could I be missing?
Jsfiddle here: http://jsfiddle.net/5hcDs/
Ok folks, thanks to Maël Nison and mrdoob I was able to understand the few things I was missing, being the total 3D noob that I am... I believe people starting to get into 3D may find a little recap useful:
Basic 3D concepts
A 3D face is made of some points (vertices) and a vector called a normal, indicating the direction of the face (which side is the front and which is the back).
Not having normals can be really bad, because by default lighting is applied to the front side only. Hence the black model when trying to apply a LambertMaterial or PhongMaterial.
An OBJ file is a way to describe 3D information. Want more info on this? Read this Wikipedia article (en). Also, the French page provides a cube example which can be useful for testing.
Three.js tips and tricks
When normals are not present, lighting can't be applied, hence the black model render. Three.js can actually compute vertex and face normals with geometry.computeVertexNormals() and/or geometry.computeFaceNormals(), depending on what's missing.
When you do so, there's a chance Three.js' normal calculation will be wrong and your normals will be flipped; to fix this you can simply loop through your geometry's faces array like so:
/* Compute normals */
geometry.computeFaceNormals();
geometry.computeVertexNormals();
/* The next 3 lines seem not to be mandatory */
mesh.geometry.dynamic = true;
mesh.geometry.__dirtyVertices = true;
mesh.geometry.__dirtyNormals = true;
mesh.flipSided = true;
mesh.doubleSided = true;
/* Flip normals*/
for(var i = 0; i<mesh.geometry.faces.length; i++) {
mesh.geometry.faces[i].normal.x = -1*mesh.geometry.faces[i].normal.x;
mesh.geometry.faces[i].normal.y = -1*mesh.geometry.faces[i].normal.y;
mesh.geometry.faces[i].normal.z = -1*mesh.geometry.faces[i].normal.z;
}
You have to use a MeshPhongMaterial: MeshBasicMaterial does not take light into account when computing the fragment color.
However, when using a MeshPhongMaterial, your mesh becomes black. I've never used the OBJ loader, but are you sure your model normals are right?
By the way: you probably want to use a PointLight instead, and its position should probably be set to the camera position (light.position = camera.position should do the trick, as it will let the light move when the camera position is changed by the controls).
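A minimal sketch of that suggestion, assuming a scene and camera already exist (the color is just the one from the question):
var material = new THREE.MeshPhongMaterial({ color: 0xFEC1EA });
var light = new THREE.PointLight(0xffffff);
light.position = camera.position; // the light follows the camera
scene.add(light);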