ThreeJS predefined shader attributes / uniforms - three.js

I have started with ThreeJS's WebGL renderer after doing some "regular" WebGL with no additional libraries + GLSL shaders. I am trying to write custom shaders now in my ThreeJS program and I noticed that ThreeJS takes care of a lot of the standard stuff such as the projection and model / view matrices. My simple vertex shader now looks like this:
// All of these seem to be predefined:
// vec3 position;
// mat4 projectionMatrix;
// mat4 modelViewMatrix;
// mat3 normalMatrix;
// vec3 normal;

// I added this
varying vec3 vNormal;

void main() {
    vNormal = normalMatrix * vec3( normal );
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
My question is: Which other variables (I'm assuming they're uniforms) are predefined for vertex and fragment shaders that I could use? Does ThreeJS help out with light vectors / light color for instance (of course assuming I've added one or more lights to my ThreeJS scene)?
Update (Oct. 9, 2014): This question has been getting quite a few views, and the user Killah mentioned that the existing answers did not lead to a solution anymore with the current version of three.js. I added and accepted my own answer, see it below.

For uniforms, the short answer is the following:
In the vertex shader
"uniform mat4 modelMatrix;",
"uniform mat4 modelViewMatrix;",
"uniform mat4 projectionMatrix;",
"uniform mat4 viewMatrix;",
"uniform mat3 normalMatrix;",
"uniform vec3 cameraPosition;",
and in the fragment shader
"uniform mat4 viewMatrix;",
"uniform vec3 cameraPosition;",
For the complete answer, involving uniforms and attributes, your custom shaders have the string variables prefixVertex and prefixFragment prepended:
var vertexGlsl = prefixVertex + vertexShader;
var fragmentGlsl = prefixFragment + fragmentShader;
var glVertexShader = THREE.WebGLShader( gl, gl.VERTEX_SHADER, vertexGlsl );
var glFragmentShader = THREE.WebGLShader( gl, gl.FRAGMENT_SHADER, fragmentGlsl );
The prefixVertex and prefixFragment variable definitions can be found in WebGLProgram.js or in the non-minified version of three.js.
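As a quick sanity check, here is a minimal sketch that relies entirely on those injected declarations; nothing is declared by hand except the varying:
// Minimal sketch: position, normal and all matrices below come from the
// injected prefixes; only vNormal is declared by us.
var material = new THREE.ShaderMaterial( {
    vertexShader: [
        'varying vec3 vNormal;',
        'void main() {',
        '    vNormal = normalMatrix * normal;',
        '    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );',
        '}'
    ].join( '\n' ),
    fragmentShader: [
        'varying vec3 vNormal;',
        'void main() {',
        '    gl_FragColor = vec4( normalize( vNormal ) * 0.5 + 0.5, 1.0 );',
        '}'
    ].join( '\n' )
} );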
EDIT: Updated to three.js r.73

The uniforms you can use in your shaders all depend on how you set up your material: have you enabled lights? Vertex colors? Skinning? ...
Three.js creates a program that depends heavily on preprocessor defines (#ifdef blocks in the code) which are injected at the top of the program depending on the parameters mentioned above.
I found the best way to know what is going on is to print the shaders that three.js generates: since you already know GLSL, you will easily understand what the code means and which uniforms you can use. Look for buildProgram in the three.js sources, then (r57):
var glFragmentShader = getShader( "fragment", prefix_fragment + fragmentShader );
var glVertexShader = getShader( "vertex", prefix_vertex + vertexShader );
After those lines, add:
console.log("fragment shader:", prefix_fragment + fragmentShader);
console.log("vertex shader:", prefix_vertex + vertexShader);
And you will be able to see the content of the shaders.
[EDIT]
Rereading your question, I realize my answer was a bit off the mark, since you write your own shaders...
You can have a look at lines 6463 and 6490 of WebGLRenderer (https://github.com/mrdoob/three.js/blob/master/src/renderers/WebGLRenderer.js#L6463): you will see the standard uniforms / attributes that three.js injects into your shaders. There is also a wiki entry about this (https://github.com/mrdoob/three.js/wiki - "Which default attributes / uniforms / varyings are available in custom shaders?"), but it points to the same lines outlined above.

This question has been getting quite a few views, and the user Killah mentioned that the existing answers did not lead to a solution anymore with the current version of three.js.
This is why I tried solving the problem again, and I'd like to outline a couple of options that I found:
The quickest and easiest way (while not very elegant) is to just put a random error in your shader. You will get a console error with the entire shader code, including everything that three.js adds.
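For example, a sketch (the identifier is deliberately undeclared, so compilation fails and the full prefixed source ends up in the console):
// Deliberately broken fragment shader: the compile error makes three.js
// print the complete shader source, including the injected prefix.
var material = new THREE.ShaderMaterial( {
    fragmentShader: 'void main() { gl_FragColor = THIS_BREAKS_ON_PURPOSE; }'
} );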
The better solution is to output the shader source from where it's compiled, namely THREE.WebGLShader (as of the current three.js version, r68). I've done a quick copy and paste that should output all shader sources before they're compiled.
Add this after including three.js and before your own code:
THREE.WebGLShader = ( function () {

    var addLineNumbers = function ( string ) {
        var lines = string.split( '\n' );
        for ( var i = 0; i < lines.length; i ++ ) {
            lines[ i ] = ( i + 1 ) + ': ' + lines[ i ];
        }
        return lines.join( '\n' );
    };

    return function ( gl, type, string ) {
        var shader = gl.createShader( type );
        console.log( string );
        gl.shaderSource( shader, string );
        gl.compileShader( shader );
        if ( gl.getShaderParameter( shader, gl.COMPILE_STATUS ) === false ) {
            console.error( 'THREE.WebGLShader: Shader couldn\'t compile.' );
        }
        if ( gl.getShaderInfoLog( shader ) !== '' ) {
            console.warn( 'THREE.WebGLShader: gl.getShaderInfoLog()', gl.getShaderInfoLog( shader ) );
            console.warn( addLineNumbers( string ) );
        }
        return shader;
    };

} )();
Note that this snippet is just copied (and very slightly changed) from the three.js sources and should be removed before actually using the code. Just for debugging!
There is one more option that is less invasive: you can inspect your ShaderMaterial after creating and rendering it at least once, like so:
var material = new THREE.ShaderMaterial( {
    uniforms: {
        uColorMap: { type: 't', value: THREE.ImageUtils.loadTexture( './img/_colors.png' ) },
        uSpecularMap: { type: 't', value: THREE.ImageUtils.loadTexture( './img/_spec.png' ) },
        uNormalMap: { type: 't', value: THREE.ImageUtils.loadTexture( './img/_normal.png' ) }
    },
    vertexShader: document.getElementById( 'vShader' ).innerText,
    fragmentShader: document.getElementById( 'fShader' ).innerText
} );
Then, after rendering the object at least once:
console.log(material.program.attributes);
console.log(material.program.uniforms);
Hope this helps everyone! Feel free to add your comments if you know more and / or better ways to get your shader code.

An answer to this question can now be found in the three.js documentation: https://threejs.org/docs/#api/renderers/webgl/WebGLProgram

Related

How can I render the PMREM environment map when copying MeshStandardMaterial into a ShaderMaterial

I am trying to rebuild MeshStandardMaterial using a ShaderMaterial. I'm keeping most of the #include <logic> statements, which makes it slightly difficult to set breakpoints.
I'd like to know if there is a straightforward way to render the PMREM cubemap, in this particular material template and have it show up the way it's supposed to.
I'm roughly using:
material.defines.USE_ENVMAP = ''
material.defines.ENVMAP_MODE_REFLECTION = ''
material.defines.ENVMAP_TYPE_CUBE_UV = ''
material.defines.ENVMAP_BLENDING_MULTIPLY = ''
material.defines.TEXTURE_LOD_EXT = ''
material.defines.USE_UV = ''
material.extensions.derivatives = true
material.extensions.shaderTextureLOD = true
Which, as far as I can tell, are all of the defines that appear when adding a texture to material.envMap. The shader compiles, the PMREM texture is generated, and it can be read in the shader (gl_FragColor = vec4( texture2D( envmap, vUv ).xyz, 1. ) works, for example). These are the uniforms I cloned:
{
    envmap: UniformsUtils.clone( UniformsLib.envmap ),
    fog: UniformsUtils.clone( UniformsLib.fog ),
    lights: UniformsUtils.clone( UniformsLib.lights ),
    displacementmap: UniformsUtils.clone( UniformsLib.displacementmap )
}
The maxMipLevel uniform seems to have a value of 0 when MeshStandardMaterial is used; I'm not sure what else is being used.
I get absolutely no effect from placing a texture in material.uniforms.envmap.value and using these defines. If I turn off the light in the scene, my object renders as black, with no reflections.
This doesn't seem like it should require that many inputs, but I get 0.0 out of it:
radiance += getLightProbeIndirectRadiance( /*specularLightProbe,*/ geometry.viewDir, geometry.normal, material.specularRoughness, maxMipLevel );
In my case it was a missing uniform:
https://github.com/mrdoob/three.js/blob/dev/src/renderers/shaders/ShaderLib.js#L99
envMapIntensity: { value: 1 } // temporary
It's not part of UniformsLib.envmap.
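So if you are assembling the uniforms yourself, a sketch of the fix (merging the same groups cloned above, plus the missing uniform) could look like:
// Sketch: envMapIntensity lives in ShaderLib.standard, not in
// UniformsLib.envmap, so it has to be added explicitly.
const uniforms = UniformsUtils.merge( [
    UniformsLib.envmap,
    UniformsLib.fog,
    UniformsLib.lights,
    UniformsLib.displacementmap,
    { envMapIntensity: { value: 1 } }
] );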

Is it possible to let Fog interact with the material's opacity?

I am working on a project that displays buildings. The requirement is to let the buildings gradually fade out (become transparent) based on the distance between the camera and the buildings. Also, this effect has to follow the camera's movement.
I considered using THREE.Fog(), but fog seems to only change the material's color.
The buildings are in tiles; each tile is a single geometry (I merged all the buildings into one) using
var bigGeometry = new THREE.Geometry();
bigGeometry.merge(smallGeometry);
The purple/blue area is the ground, which has ground.material.fog = false; so the ground does not interact with the fog.
My question is:
Is it possible to let the fog interact with the building material's opacity instead of its color? (more white translates to more transparent)
Or should I use a shader to control the material's opacity based on distance to the camera? I have no idea how to do this.
I also considered adding an alphaMap. If so, each building tile would have to map an alphaMap, and all these alphaMaps would have to react to the camera's movement. That's a ton of work.
So any suggestions?
Best Regards,
Arthur
NOTE: I suspect there are probably easier/prettier ways to solve this than opacity. In particular, note that partially-opaque buildings will show other buildings behind them. To address that, consider using a gradient or some other scene background, and choosing a fog color to match that, rather than using opacity. But for the sake of trying it...
Here's how to alter an object's opacity based on its distance. This doesn't actually require THREE.Fog; I'm not sure how you would use the fog data directly. Instead I'll use THREE.NodeMaterial, which (as of three.js r96) is fairly experimental. The alternative would be to write a custom shader with THREE.ShaderMaterial, which is also fine.
const material = new THREE.StandardNodeMaterial();
material.transparent = true;
material.color = new THREE.ColorNode( 0xeeeeee );

// Calculate alpha of each fragment roughly as:
//   alpha = 1.0 - saturate( distance / cutoff )
//
// Technically this is distance from the origin, for the demo, but
// distance from a custom THREE.Vector3Node would work just as well.
const distance = new THREE.Math2Node(
    new THREE.PositionNode( THREE.PositionNode.WORLD ),
    new THREE.PositionNode( THREE.PositionNode.WORLD ),
    THREE.Math2Node.DOT
);
const normalizedDistance = new THREE.Math1Node(
    new THREE.OperatorNode(
        distance,
        new THREE.FloatNode( 50 * 50 ),
        THREE.OperatorNode.DIV
    ),
    THREE.Math1Node.SAT
);
material.alpha = new THREE.OperatorNode(
    new THREE.FloatNode( 1.0 ),
    normalizedDistance,
    THREE.OperatorNode.SUB
);
Demo: https://jsfiddle.net/donmccurdy/1L4s9e0c/
I am the OP. After spending some time reading about how to use Three.js's ShaderMaterial, I got some code that works as desired.
Here's the code: https://jsfiddle.net/yingcai/4dxnysvq/
The basic idea is:
Create a uniform that contains controls.target (a Vector3 position).
Pass the vertex position attribute to a varying in the vertex shader, so that the fragment shader can access it.
Get the distance between each vertex position and controls.target, and calculate an alpha value based on that distance.
Assign the alpha value to the vertex color.
Another important thing: because the fade-out mask should follow the camera's movement, don't forget to update the control position in the uniforms every frame.
// Create a uniform that contains the control position value.
uniforms = {
    texture: {
        value: new THREE.TextureLoader().load( "https://threejs.org/examples/textures/water.jpg" )
    },
    control: {
        value: controls.target
    }
};

// In the render() method:
// update the uniform's value every frame.
uniforms.control.value = controls.target;
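For reference, here is a minimal sketch of the shader pair those steps describe, using the uniform names from the snippet above (the varying names and the fade distances are my own illustrative choices, and the material needs transparent: true):
varying vec2 vUv;
varying vec3 vWorldPosition; // illustrative name, not from the fiddle
void main() {
    vUv = uv;
    vWorldPosition = ( modelMatrix * vec4( position, 1.0 ) ).xyz;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
and the fragment shader:
uniform sampler2D texture;
uniform vec3 control; // controls.target, updated every frame
varying vec2 vUv;
varying vec3 vWorldPosition;
void main() {
    float dist = distance( vWorldPosition, control );
    float alpha = 1.0 - smoothstep( 20.0, 80.0, dist ); // arbitrary fade range
    gl_FragColor = vec4( texture2D( texture, vUv ).rgb, alpha );
}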
I had the same issue - a few years later - and solved it with the .onBeforeCompile function, which is maybe more convenient to use.
There is a great tutorial here.
The code itself is simple and could easily be adapted for other materials. It just uses the fogFactor as the alpha value in the material.
Here is the material function:
alphaFog() {
    const material = new THREE.MeshPhysicalMaterial();
    material.onBeforeCompile = function ( shader ) {
        const alphaFog = `
            #ifdef USE_FOG
                #ifdef FOG_EXP2
                    float fogFactor = 1.0 - exp( - fogDensity * fogDensity * vFogDepth * vFogDepth );
                #else
                    float fogFactor = smoothstep( fogNear, fogFar, vFogDepth );
                #endif
                gl_FragColor.a = saturate( 1.0 - fogFactor );
            #endif
        `;
        shader.fragmentShader = shader.fragmentShader.replace(
            '#include <fog_fragment>', alphaFog
        );
        material.userData.shader = shader;
    };
    material.transparent = true;
    return material;
}
and afterwards you can use it like
const cube = new THREE.Mesh(geometry, this.alphaFog());
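Note that the injected chunk only has an effect if the scene actually has fog (USE_FOG is only defined then), e.g.:
// Without scene fog the #ifdef USE_FOG block above compiles to nothing.
scene.fog = new THREE.Fog( 0xffffff, 10, 100 );          // linear fog: color, near, far
// or: scene.fog = new THREE.FogExp2( 0xffffff, 0.02 );  // exponential (FOG_EXP2)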

ThreeJS material with shadows but no lights

I want a material with:
Textures
Not receiving lights
Receiving shadows
I tried with the following library materials:
MeshBasicMaterial: Does not support shadows
MeshLambertMaterial: If you disable lights (material.lights = false) it also disables shadows
ShadowMaterial: Does not support textures
Is a custom ShaderMaterial the only way to achieve it?
In three.js, as in real life, shadows are the absence of light. So for a built-in three.js material to receive shadows, it must respond to light.
However, you can modify a built-in material's shader to achieve the effect you want with just a few lines of code. Here is an example to get you started:
THREE.ShaderLib[ 'lambert' ].fragmentShader = THREE.ShaderLib[ 'lambert' ].fragmentShader.replace(
`vec3 outgoingLight = reflectedLight.directDiffuse + reflectedLight.indirectDiffuse + totalEmissiveRadiance;`,
`#ifndef CUSTOM
vec3 outgoingLight = reflectedLight.directDiffuse + reflectedLight.indirectDiffuse + totalEmissiveRadiance;
#else
vec3 outgoingLight = diffuseColor.rgb * ( 1.0 - 0.5 * ( 1.0 - getShadowMask() ) ); // shadow intensity hardwired to 0.5 here
#endif`
);
Then, to use it:
var material = new THREE.MeshLambertMaterial( { map: texture } );
material.defines = material.defines || {};
material.defines.CUSTOM = "";
In spite of its name, this material will behave like MeshBasicMaterial, but will darken when it is in shadow. And furthermore, MeshLambertMaterial will still work as expected.
three.js r.88
In a past version, maybe r72, you could cast and receive shadows with MeshBasicMaterial. It was simple. Then the concept of ambient light changed in three.js, and MeshBasicMaterial could no longer support shadows.
THREE.ShadowMaterial was introduced to compensate for the limitation. It works great! But it really only works on PlaneGeometry, because by its nature THREE.ShadowMaterial is transparent, so the shadows cast both inside and outside the object3d with the ShadowMaterial are visible.
The idea is that you use two meshes, one with the MeshBasicMaterial, and the other with ShadowMaterial.
var shape = new THREE.BoxGeometry( 1, 1, 1 ),
    basicMaterial = new THREE.MeshBasicMaterial( { color: 0xff0000 } ),
    mesh = new THREE.Mesh( shape, basicMaterial ),
    shadowMaterial = new THREE.ShadowMaterial( { opacity: 0.2 } ),
    mesh2 = new THREE.Mesh( shape, shadowMaterial );
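Both meshes share the same geometry; a sketch of wiring them together (assuming a light with castShadow enabled is already in the scene):
// Sketch: parent the shadow-catching mesh to the basic one so both
// share the same transform; only the ShadowMaterial mesh receives shadows.
mesh2.receiveShadow = true;
mesh.add( mesh2 );
scene.add( mesh );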
You can see an example of the problem, here: https://jsfiddle.net/7d47oLkh/
The shadows cast at the bottom of the box are incorrect for the use-case.
The answer is, NO. There is no easy way to support full-bright basic materials that also accept and cast a shadow in three.js.

Three.js specific render-time uniforms

I want to implement a per-object motion-blur effect based on calculating the previous pixel position inside shaders.
The first step of this technique is to build a velocity map of the moving objects. This step requires, as uniform variables, the projection and model-view matrices of the current frame and the same matrices of the previous frame.
How could I include those matrices as uniforms for some special shader? I imagined a solution along these lines:
uniforms = {
    some_uniform_var: { type: "m4", value: initialMatrix, getter: function () {
        // `this` points to the object
        return this.worldMatrix;
    } }
}
But THREE.js does not currently offer this. We could do some sort of monkey patching, but I cannot find the best way to do it.
Any suggestions?
My current solution to this problem consists of several parts. I'm using EffectComposer to make several passes over the rendered scene, one of them a VelocityPass. It takes the current and previous model-view matrices plus the projection matrix, and produces two positions. Both are then used to calculate the speed of a point.
The fragment shader looks like this:
"void main() {",
"    vec2 a = ( pos.xy / pos.w ) * 0.5 + 0.5;",
"    vec2 b = ( prevPos.xy / prevPos.w ) * 0.5 + 0.5;",
"    vec2 oVelocity = a - b;",
"    gl_FragColor = vec4( oVelocity, 0.0, 1.0 );",
"}"
There are several issues with this approach.
Three.js has a certain point where it injects matrices into object-related shaders: the very end of the setProgram closure, which lives in WebGLRenderer. That's why I took the whole renderer file, renamed the renderer to THREE.MySuperDuperWebGLRenderer, and added a couple of lines of code to it:
A closure to access closures defined in userspace:
function materialPerObjectSetup( material, object ) {
    if ( material.customUniformUpdate ) {
        material.customUniformUpdate( object, material, _gl ); // Yes, I had to pass it...
    }
}
And a call to it in renderBuffer and renderBufferDirect:
var program = setProgram( camera, lights, fog, material, object );
materialPerObjectSetup(material, object);
Now - the userspace part:
velocityMat = new THREE.ShaderMaterial( THREE.VelocityShader );
velocityMat.customUniformUpdate = function ( obj, mat, _gl ) {
    // console.log("gotcha");
    var new_m = obj.matrixWorld;
    var p_uniforms = mat.program.uniforms;
    var mvMatrix = camera.matrixWorldInverse.clone().multiplyMatrices( camera.matrixWorldInverse, obj._oldMatrix );
    _gl.uniformMatrix4fv( p_uniforms.prevModelViewMatrix, false, mvMatrix.elements );
    _gl.uniformMatrix4fv( p_uniforms.prevProjectionMatrix, false, camera.projectionMatrix.elements );
    // We have to keep the old matrix state until this pass has been rendered,
    // because the matrices are updated on every render of the scene.
    obj._pass_complete = true;
}
_pass_complete is needed because we re-render the scene several times, and the matrices are recalculated each time. This trick helps us keep the previous matrix until we use it.
_gl.uniformMatrix4fv is needed because three.js sets uniforms once before rendering. No matter how many objects we have, the other method would pass the modelViewMatrix of the last one to the shader. This happens because I want to draw the whole scene with the VelocityShader, and there is no other way to tell the renderer to use an alternative material for objects.
And as a final point of this explanation, here is a trick to manage the previous matrix of an object:
THREE.Mesh.prototype._updateMatrixWorld = THREE.Mesh.prototype.updateMatrixWorld;
THREE.Mesh.prototype._pass_complete = true;
Object.defineProperty( THREE.Mesh.prototype, "updateMatrixWorld", { get: function () {
    if ( this._pass_complete ) {
        this._oldMatrix = this.matrixWorld.clone();
        this._pass_complete = false;
    }
    this._updateMatrixWorld();
    return ( function () {} );
} } );
I believe there could be a nicer solution, but sometimes I have to act in a rush, and this kind of monkey patching happens.

Cannot pass current result to the next rendering

I am trying to do progressive rendering, using the previous rendering as a texture for the next one.
EDIT 1: As suggested in the comments, I updated my version of THREE.js to the latest available and kept my old code; the result is the same (even though the vertical positions of objects flipped), and my problem still remains. Please consider my update and my plea for help.
Original message:
My fragment shader should only increment the color on the green channel by 0.1, like this:
#ifdef GL_ES
precision highp float;
#endif

uniform sampler2D sampa;
varying vec2 tc;

void main( void )
{
    vec4 c = texture2D( sampa, tc );
    vec4 t = vec4( c.x, c.y + 0.1, c.z, 1.0 );
    gl_FragColor = t;
}
My composer is like this:
composer.addPass(renderModel);
composer.addPass(screenPass);
composer.addPass(feedPass);
Where renderModel is a RenderPass rendering my scene, in which I have a plane and a cube,
and screenPass and feedPass are identical, with the only difference being that one renders to the screen while the other renders into the writeBuffer (composer.renderTarget1).
var renderModel = new THREE.RenderPass(scene, camera);
renderModel.clear = false;
screenPass = new THREE.ShaderPass(shader2, 'sampa');
screenPass.renderToScreen = true;
screenPass.clear = false;
screenPass.needsSwap = false;
feedPass = new THREE.ShaderPass(shader2, 'sampa');
feedPass.renderToScreen = false;
feedPass.clear = false;
feedPass.needsSwap = false;
And in the animation loop, I have something like this:
composer.render();
if ( step % 250 == 0 )
{
    newmat = new THREE.MeshBasicMaterial( {
        map: composer.renderTarget1
    } );
    plane.material = newmat;
}
step++;
requestAnimationFrame( animate );
The part with step % 250 is to delay the change of material.
Anyway, the problem is that the plane disappears when that happens, even though it was correctly rendered during the first 250 steps. I guess it is still there but with no texture data, so it is not actually rendered.
I know that EffectComposer is not part of the library itself and is only found in the examples, so it might not be supported, but I could really use some advice on this situation, and any answer will be greatly appreciated.
I am very willing to share any other info about the problem, or any other code that might help.
Could you point out what I am doing wrong?
I thank you for your kindness.
It seems the solution to this problem is to use two render targets and switch between them on every step. My limited knowledge stops me from understanding exactly why, but this is exactly how EffectComposer works.
For those who have this problem and need a solution: try setting needsSwap to true for your shader passes.
And if you do not use EffectComposer, remember to use two render targets.
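A minimal sketch of that ping-pong setup without EffectComposer (using the current renderer API; feedbackScene and feedbackMaterial are illustrative names, not from the question):
// Two render targets; each frame reads the previous result, writes into
// the other target, then the two swap roles.
const rtA = new THREE.WebGLRenderTarget( width, height );
const rtB = new THREE.WebGLRenderTarget( width, height );
let read = rtA, write = rtB;

function animate() {
    requestAnimationFrame( animate );
    feedbackMaterial.uniforms.sampa.value = read.texture; // last frame's result
    renderer.setRenderTarget( write );
    renderer.render( feedbackScene, camera );             // accumulate into `write`
    renderer.setRenderTarget( null );
    renderer.render( scene, camera );                     // normal on-screen render
    [ read, write ] = [ write, read ];                    // swap for the next frame
}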
