In my scene I have two overlapping / crossing plane meshes. The first one uses a MeshBasicMaterial and the second one a custom ShaderMaterial (simple cut-out fragment shader). It seems as if the mesh with the ShaderMaterial doesn't have any depth information as the other plane is always rendered on top of it.
How can I add the plane mesh with the ShaderMaterial to the scene so that intersections and overlaps with other meshes are shown correctly? Do I have to do this in the fragment shader, or is it something I have to set up in the material?
Edit: I've made two different variants, A and B. Variant A works as it should: both plane meshes have depth information and use MeshBasicMaterial:
var movieMaterial = new THREE.MeshBasicMaterial( { map: videoTexture, overdraw: true, side:THREE.DoubleSide } );
Screenshot of Variant A, two crossing Plane Meshes using MeshBasicMaterial
Variant B uses a custom ShaderMaterial on one Plane Mesh:
var movieMaterial = new THREE.ShaderMaterial({
uniforms: {
texture: { type: "t", value: videoTexture }
},
vertexShader: [
"varying vec2 vUv;",
"void main() {",
"vUv = uv;",
"gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);",
"}",
].join("\n"),
fragmentShader: [
"varying vec2 vUv;",
"uniform sampler2D texture;",
"void main() {",
"gl_FragColor = texture2D(texture, vUv);",
"if (gl_FragColor.r + gl_FragColor.b + gl_FragColor.g > 1.5) discard;",
"}",
].join("\n"),
side:THREE.DoubleSide
});
Screenshot of Variant B, now one Plane Mesh is using a custom ShaderMaterial
And now the depth information is lost. The code I posted is the only difference here.
Thanks in advance
This is an old question, but I've recently come across this problem. The issue was with the usage of the logarithmic depth buffer: the shaders passed to the ShaderMaterial do not work with it out of the box. Option one is to disable the logarithmic depth buffer (a sketch of that follows the example below). Option two is to append a few pieces of GLSL code provided by three to your shaders, something like this:
import { ShaderChunk } from 'three';
const VertexShader = ShaderChunk.common + '\n' + ShaderChunk.logdepthbuf_pars_vertex + `
...
void main() {
...
` + ShaderChunk.logdepthbuf_vertex + `
}
`;
const FragmentShader = ShaderChunk.logdepthbuf_pars_fragment + `
...
void main() {
...
` + ShaderChunk.logdepthbuf_fragment + `
}
`;
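For option one, the logarithmic depth buffer is just a flag passed to the renderer's constructor, so it is enough not to enable it there. A minimal sketch (the flag already defaults to false):
// Create the renderer without the logarithmic depth buffer;
// the unmodified ShaderMaterial shaders then write depth as usual.
const renderer = new THREE.WebGLRenderer({
    antialias: true,
    logarithmicDepthBuffer: false // the default
});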
I am working with THREE.JS and want to be able to have a mesh that changes the occluded part of itself into a different color.
Simple Example
The image above is a simple example: the wall is in front of the mesh and obstructs part of it, but not all of it. The visible part of the mesh should be colored green, while the occluded part should be colored red. Note that the wall is not transparent; the occluded part of the mesh should still be rendered, using depthTest = false.
I've tried messing around with some basic shaders, but I don't really know how to get started. Currently, the core parts of my code look like this:
// My cube's material
const overlayMat = new THREE.ShaderMaterial({
uniforms: {
"maskTexture": { value: null },
},
vertexShader:
`varying vec2 vUv;
void main() {
vUv = uv;
gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}`,
fragmentShader:
`varying vec2 vUv;
uniform sampler2D maskTexture;
void main() {
vec4 maskColor = texture2D(maskTexture, vUv);
float visibilityFactor = 1.0 - maskColor.g > 0.0 ? 1.0 : 0.5;
// Attempt to set the green value to the visibility factor
gl_FragColor = vec4(1.0, visibilityFactor, 0.0, 1.0);
}`,
depthTest: false,
depthWrite: false,
transparent: true
});
// My mask
let renderTargetMaskBuffer = new THREE.WebGLRenderTarget(innerWidth, innerHeight, {
minFilter: THREE.LinearFilter,
magFilter: THREE.LinearFilter,
format: THREE.RGBAFormat
});
// Inside my animate function:
function animate() {
// ...
overlayMat.uniforms["maskTexture"].value = renderTargetMaskBuffer.depthTexture;
renderer.render(scene, camera);
}
This does not work; the cube remains one constant color.
Full code (with the "wall" that occludes part of the mesh) (JSFiddle)
I tried a very simple test using Three.js ShaderMaterial.
I load a 2048x2048 jpg image as a texture for my height map and apply it to deform a PlaneBufferGeometry in the vertex shader.
I also apply the same texture for the diffuse color in the fragment shader.
Overall it works, but I see some big artifacts, as shown in this screenshot.
The artifact always appears along a line parallel to the X axis and passing through the camera.
I have the problem on all three.js versions I tried (r105, r114).
The code is quite simple; does anyone know what I'm doing wrong?
Javascript
var textureLoader = new THREE.TextureLoader();
var testTextureBump = textureLoader.load( './front_b.jpg' );
var testGeometry = new THREE.PlaneBufferGeometry(3000, 3000, 500, 500);
var testUniforms = {
uTextureBump: { value: testTextureBump }
};
var testMaterial = new THREE.ShaderMaterial({
uniforms: testUniforms,
vertexShader: document.getElementById( 'vertexShader' ).textContent,
fragmentShader: document.getElementById( 'fragmentShader' ).textContent,
side: THREE.FrontSide,
blending: THREE.NormalBlending,
depthWrite: false,
wireframe: false,
transparent: true
});
var testMesh = new THREE.Mesh( testGeometry, testMaterial );
scene.add( testMesh );
Vertex shader
uniform sampler2D uTextureBump;
varying vec2 vUv;
void main() {
vUv = uv;
vec4 diffuseTexture = texture2D(uTextureBump, uv);
vec3 positionHeight = position.xyz;
positionHeight.z += diffuseTexture.r * 20.0;
gl_Position = projectionMatrix * modelViewMatrix * vec4(positionHeight, 1.0);
}
Fragment shader
precision highp float;
precision highp int;
uniform sampler2D uTextureBump;
varying vec2 vUv;
void main (void) {
vec4 texture = texture2D(uTextureBump, vUv);
gl_FragColor = vec4( texture.rgb, 1.0 );
}
You can see the problem in this demo
Move your mouse on the left or right and you'll see the artifacts.
You can fly around, as I use the standard THREE.FlyControls.
The corresponding project file can be downloaded here.
I would like to know how I could move shaders that are currently included in my HTML to external files, so that I can include them in my gulp tasks. I took a look at how JavaScript shader files are written, but I don't understand it very well.
For example, with the glow shader code below, how could I move it to an external file?
<script type="x-shader/x-vertex" id="vertexShaderGlow">
uniform vec3 viewVector;
uniform float c;
uniform float p;
varying float intensity;
void main()
{
vec3 vNormal = normalize( normalMatrix * normal );
vec3 vNormel = normalize( normalMatrix * viewVector );
intensity = pow( c - dot(vNormal, vNormel), p );
gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
</script>
<script type="x-shader/x-fragment" id="fragmentShaderGlow">
uniform vec3 glowColor;
varying float intensity;
void main()
{
vec3 glow = glowColor * intensity;
gl_FragColor = vec4( glow, 1.0 );
}
</script>
The other answer provided is simply taking GLSL code and turning each line into a string. Each string is a value in an array, and the join call concatenates all of the strings with a \n character to make the code easier to read when debugging. I've done it this way many times before, and it is a legitimate solution to what you're trying to do.
But if you'd rather have external files with raw GLSL code, you can do that, too. Consider the two files:
glow_vertex.glsl
glow_fragment.glsl
These files contain the shader code which you would normally have in the script tags. You can use an XMLHttpRequest to fetch the files and use the returned text as your shader code.
var vertexShader = null;
var fragmentShader = null;
function shadersDone(){
var material = new THREE.ShaderMaterial({
uniforms: { /* define your uniforms */},
vertexShader: vertexShader,
fragmentShader: fragmentShader
});
}
function vertexDone(code){
vertexShader = code;
if(fragmentShader !== null){
shadersDone();
}
}
function fragmentDone(code){
fragmentShader = code;
if(vertexShader !== null){
shadersDone();
}
}
var xhr1 = new XMLHttpRequest();
var xhr2 = new XMLHttpRequest();
xhr1.open("GET", "/server/glow_vertex.glsl", true);
xhr2.open("GET", "/server/glow_fragment.glsl", true);
xhr1.responseType = "text";
xhr2.responseType = "text";
xhr1.onload = function(){
if(xhr1.readyState === xhr1.DONE && xhr1.status === 200){
vertexDone(xhr1.responseText);
}
};
xhr2.onload = function(){
if(xhr2.readyState === xhr2.DONE && xhr2.status === 200){
fragmentDone(xhr2.responseText);
}
};
xhr1.send(null);
xhr2.send(null);
Note that that's all asynchronous. Also, your server is going to need to be configured to send GLSL files as plain text.
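On modern browsers you can do the same thing a bit more compactly with fetch and Promise.all. A sketch, assuming the same file paths as above:
// Fetch both shader files in parallel and build the material once both have resolved.
Promise.all([
    fetch("/server/glow_vertex.glsl").then(function (res) { return res.text(); }),
    fetch("/server/glow_fragment.glsl").then(function (res) { return res.text(); })
]).then(function (sources) {
    var material = new THREE.ShaderMaterial({
        uniforms: { /* define your uniforms */ },
        vertexShader: sources[0],
        fragmentShader: sources[1]
    });
});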
As long as we're talking about the modern web...
There is also the option to import your shader code. VERY BIG BUT: it's currently only supported in Chrome and Opera (although polyfills do exist). Microsoft Edge lists the functionality as "under consideration," and Firefox does not intend to implement it in its current state. So take what follows with a large grain of salt, and keep an eye on: http://caniuse.com/#search=import
In your HTML, and before the JavaScript which would use it...
<link id="vertexImport" rel="import" href="/server/glow_vertex.glsl" />
<link id="fragmentImport" rel="import" href="/server/glow_fragment.glsl" />
Then later in your JavaScript:
var material = new THREE.ShaderMaterial({
vertexShader: document.getElementById("vertexImport").import.body.childNodes[0].data,
fragmentShader: document.getElementById("fragmentImport").import.body.childNodes[0].data,
});
Again, this is asynchronous. You may need to add an onload handler for each link, so you don't attempt to access the code before it's loaded.
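For example, a sketch of such a handler, using the same access pattern as above (the link element fires a load event once its import has finished loading):
var vertexImport = document.getElementById("vertexImport");
vertexImport.addEventListener("load", function () {
    // The import is available now, so it's safe to read the shader source.
    var vertexShader = vertexImport.import.body.childNodes[0].data;
});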
The answer suggesting joining an array of GLSL lines is a pattern you will encounter in three.js code, but it should probably be avoided in this use case.
It may be useful in some kind of a module, where it's sort of a "compiled" snapshot of a shader, not intended to be modified.
Otherwise, the main downsides of this approach are the lack of syntax highlighting and the verbosity.
Nowadays most JS code is transformed by a build step one way or another, so shader code can be inlined at build time like this:
const myShader = new THREE.ShaderMaterial({
vertexShader: require('./myVert.vs'),
fragmentShader: require('./myFrag.fs'),
})
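This assumes your bundler is configured to return GLSL files as plain strings. With webpack, for example, a rule like the following would do it (a sketch; it assumes raw-loader is installed, and loaders such as glslify work similarly):
// webpack.config.js (excerpt): make require() of .vs/.fs/.glsl files return the raw source
module.exports = {
    // ...
    module: {
        rules: [
            { test: /\.(vs|fs|glsl)$/, use: 'raw-loader' }
        ]
    }
};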
edit
myVert.vs:
//this is all nicely highlighted in sublime text for example
void main (){
gl_Position = vec4( position.xy, 0., 1.);
}
myFrag.fs:
void main (){
gl_FragColor = vec4(1.,0.,0.,1.);
}
myClass.js:
class myMaterial extends THREE.ShaderMaterial{
constructor(){
super({
vertexShader: require('./myVert.vs'),
//^ becomes vertexShader: 'void main() {...}'
...
You can move the shader code into a separate JS file and include that file after three.js.
Here is one example from https://github.com/timoxley/threejs/blob/master/examples/js/shaders/ColorifyShader.js
/**
* @author alteredq / http://alteredqualia.com/
*
* Colorify shader
*/
THREE.ColorifyShader = {
uniforms: {
"tDiffuse": { type: "t", value: null },
"color": { type: "c", value: new THREE.Color( 0xffffff ) }
},
vertexShader: [
"varying vec2 vUv;",
"void main() {",
"vUv = uv;",
"gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );",
"}"
].join("\n"),
fragmentShader: [
"uniform vec3 color;",
"uniform sampler2D tDiffuse;",
"varying vec2 vUv;",
"void main() {",
"vec4 texel = texture2D( tDiffuse, vUv );",
"vec3 luma = vec3( 0.299, 0.587, 0.114 );",
"float v = dot( texel.xyz, luma );",
"gl_FragColor = vec4( v * color, texel.w );",
"}"
].join("\n")
};
With the above you would create your material like this:
material = new THREE.ShaderMaterial({
uniforms : THREE.ColorifyShader.uniforms,
vertexShader : THREE.ColorifyShader.vertexShader,
fragmentShader : THREE.ColorifyShader.fragmentShader
});
Of course, you don't need to call the object THREE.ColorifyShader; you can call it whatever you want.
I am trying to run the following simple shader with three.js
mat = new THREE.ShaderMaterial({
uniforms: {
color: { type: 'v3', value: new THREE.Color(0xcccccc) }
},
vertexShader: 'attribute vec3 vert;\n'
+ 'void main() {\n'
+ ' gl_Position = vec4(vert, 1.0);\n'
+ '}',
fragmentShader: 'uniform vec3 color;\n'
+ 'void main() {\n'
+ ' gl_FragColor = vec4(color,1);\n'
+ '}'
})
The shaders compile but the object that has this material is invisible.
This should display the object in a constant light grey color.
When I run this with the kick.js shader editor, it works as expected.
Predefined materials all work great.
Am I missing something?
Your vertex shader should be:
// Multiply each vertex by the model-view matrix and the projection matrix
// (both provided by Three.js) to get a final vertex position.
gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
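Applied to the material in the question, a minimal corrected sketch might look like this (it relies on the built-in position attribute and the matrices three.js injects into every ShaderMaterial, and assumes a reasonably recent three.js where the uniform type field is no longer needed):
mat = new THREE.ShaderMaterial({
    uniforms: {
        color: { value: new THREE.Color(0xcccccc) }
    },
    // Use the built-in `position` attribute instead of a custom `vert` attribute.
    vertexShader: 'void main() {\n'
        + '  gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );\n'
        + '}',
    fragmentShader: 'uniform vec3 color;\n'
        + 'void main() {\n'
        + '  gl_FragColor = vec4( color, 1.0 );\n'
        + '}'
})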
I'm using a custom shader to curve a plane. My custom shader extends the Lambert shader so it supports lights and shadows. It all works as expected, but when the vertexShader changes the geometry of the plane, the shadow doesn't update. Is there anything I'm missing to flag that the geometry has updated in my vertexShader and the shadow needs to change?
Here is a screenshot of the problem (the plane is curved with a vertexShader, but the shadow doesn't update): http://i.stack.imgur.com/6kfCF.png
Here is the demo/code: http://dev.cartelle.nl/curve/
If you drag the "bendAngle" slider you can see that the shadow doesn't update.
One work-around I considered was to get the bounding box of my curved plane, then use those points to create a new mesh/box and have that object cast the shadow. But then I wasn't sure how to get the coordinates of the new curved geometry: when I checked geometry.boundingBox after the shader was applied, it just gave me the original coordinates every time.
Thanks
Johnny
If you are modifying the geometry positions in the vertex shader, and you are casting shadows, you need to specify a custom depth material so the shadows will respond to the modified positions.
In your custom depth material's vertex shader, you modify the vertex positions in the same way you modified them in the material's vertex shader.
An example of a custom depth material can be seen in this three.js example (although vertices are not modified in the vertex shader in that example; they are modified on the CPU).
In your case, you would create a vertex shader for the custom depth material using a pattern like so:
<script type="x-shader/x-vertex" id="vertexShaderDepth">
uniform float bendAngle;
uniform vec2 bounds;
uniform float bendOffset;
uniform float bendAxisAngle;
vec3 bendIt( vec3 ip, float ba, vec2 b, float o, float a ) {
// your code here
return ip;
}
void main() {
vec3 p = bendIt( position, bendAngle, bounds, bendOffset, bendAxisAngle );
vec4 mvPosition = modelViewMatrix * vec4( p, 1.0 );
gl_Position = projectionMatrix * mvPosition;
}
</script>
And fragment shader like this:
<script type="x-shader/x-fragment" id="fragmentShaderDepth">
vec4 pack_depth( const in float depth ) {
const vec4 bit_shift = vec4( 256.0 * 256.0 * 256.0, 256.0 * 256.0, 256.0, 1.0 );
const vec4 bit_mask = vec4( 0.0, 1.0 / 256.0, 1.0 / 256.0, 1.0 / 256.0 );
vec4 res = fract( depth * bit_shift );
res -= res.xxyz * bit_mask;
return res;
}
void main() {
gl_FragData[ 0 ] = pack_depth( gl_FragCoord.z );
}
</script>
Then in your javascript, you specify the custom depth material:
uniforms = {};
uniforms.bendAngle = { type: "f", value: properties.bendAngle };
uniforms.bendOffset = { type: "f", value: properties.offset };
uniforms.bendAxisAngle = { type: "f", value: properties.bendAxisAngle };
uniforms.bounds = { type: "v2", value: new THREE.Vector2( - 8, 16 ) };
var vertexShader = document.getElementById( 'vertexShaderDepth' ).textContent;
var fragmentShader = document.getElementById( 'fragmentShaderDepth' ).textContent;
myObject.customDepthMaterial = new THREE.ShaderMaterial( {
uniforms: uniforms,
vertexShader: vertexShader,
fragmentShader: fragmentShader
} );
three.js r.74