Three.js - DoubleSide Material same lighting - three.js

This is somewhat of a follow-up to this question: ThreeJS material with shadows but no lights.
The question is really simple. I have foliage in my scene that is made up of a couple of planes in a cross, and I need both sides of each plane lit equally, while still being affected by shadows.
With an unmodified shader, the scene looks like this:
As you can see, the front and back side of the planes are not equally lit, causing the "hard edges" to show.
Based on the answer from @WestLangley in the SO question mentioned above, I was able to create the desired effect like so:
// Patch the built-in Lambert shader: swap the line that composes the
// outgoing light so that DOUBLE_SIDED materials are lit uniformly and
// only darkened by the shadow mask.
THREE.ShaderLib[ 'lambert' ].fragmentShader = THREE.ShaderLib[ 'lambert' ].fragmentShader.replace(
    `vec3 outgoingLight = reflectedLight.directDiffuse + reflectedLight.indirectDiffuse + totalEmissiveRadiance;`,
    `
    #ifdef DOUBLE_SIDED
        reflectedLight.indirectDiffuse = getAmbientLightIrradiance( ambientLightColor );
        reflectedLight.indirectDiffuse *= BRDF_Diffuse_Lambert( diffuseColor.rgb );
        reflectedLight.directDiffuse = diffuseColor.rgb;
        reflectedLight.directDiffuse *= BRDF_Diffuse_Lambert( diffuseColor.rgb ) * getShadowMask();
        vec3 outgoingLight = ( reflectedLight.directDiffuse + reflectedLight.indirectDiffuse + totalEmissiveRadiance ).rgb * ( 1.0 - 0.45 * ( 1.0 - getShadowMask() ) );
    #else
        vec3 outgoingLight = reflectedLight.directDiffuse + reflectedLight.indirectDiffuse + totalEmissiveRadiance;
    #endif
    `
);
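The patch only takes effect for materials compiled with DOUBLE_SIDED defined; for context, a minimal sketch of such a material (foliageTexture is a hypothetical texture, not from the original post):
var foliageMaterial = new THREE.MeshLambertMaterial( {
    map: foliageTexture,      // hypothetical foliage texture with alpha
    side: THREE.DoubleSide,   // this is what defines DOUBLE_SIDED in the shader
    transparent: true
} );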
This patch produced the following result:
As you can see, the "hard edges" are gone, and the foliage looks a lot better now.
However, something is causing them not to be properly affected by lights anymore. Making the environment dark and adding a simple PointLight gives the following result:
The dark foliage meshes shown above should be properly lit as a whole.
I'm sure there is a better way to go about this, but my knowledge of GLSL only goes so far, especially when it comes to modifying the built-in Three.js shaders.
TL;DR: How can I make a material be lit equally on both sides, provided THREE.DoubleSide is set as the material's side, while the entire mesh is still affected by shadows?
Thanks in advance!

Related

Texture lookup inside FBO simulation shader

I'm trying to make an FBO particle system by calculating positions in a separate pass, using the code from this post: http://barradeau.com/blog/?p=621.
I render a sphere of particles, without any movement.
The only thing I'm adding so far is a texture lookup in the simulation fragment shader:
void main() {
    vec3 pos = texture2D( texture, vUv ).xyz;
    // THIS LINE, pos is approx. in the -200..200 range
    float map = texture2D( texture1, abs( pos.xy / 200. ) ).r;
    ...
    // save the map value in the ping-pong texture as alpha
    gl_FragColor = vec4( pos, map );
}
texture1 is half black, half white.
Then in the render vertex shader I read this map parameter:
map = texture2D( positions, position.xy ).a;
and use it in the render fragment shader to pick the color:
vec3 finalColor = mix(vec3(1.,0.,0.),vec3(0.,1.,0.),map);
gl_FragColor = vec4( finalColor, .2 );
So what I hope to see (made by setting the same texture in the render shaders) is this:
But what I really see (by setting the texture in the simulation shaders) is this:
The colors are mixed up: you can mostly see more red ones where they should be, but there are a lot of green particles in between.
I also tried to make my own demo with a simplified texture and the same idea, and I got this:
Also mixed up, but you can still make out the image.
Same error.
I think I'm missing something obvious, but I've been struggling with this for a couple of days now and haven't been able to find the mistake myself.
I would be very grateful if someone could point me in the right direction. Thank you in advance!
Demo with the error: http://cssing.org.ua/examples/fbo-error/
The full code I'm referring to: https://github.com/akella/fbo-test
You should disable texture filtering by using the GL_NEAREST min/mag filters.
My guess is that THREE.TextureLoader() loads the texture with mipmaps, and the texture2D call in the vertex shader uses the lowest-resolution mipmap. In vertex shaders you should use texture2DLod( texture, texCoord, 0.0 ) - note the third parameter, lod, which selects mipmap level 0.
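In three.js that means configuring the texture object itself; a minimal sketch (the URL and variable name are illustrative assumptions):
// Disable mipmapping and filtering so vertex-shader lookups return exact
// texel values instead of filtered or mipmapped ones.
var positionTexture = new THREE.TextureLoader().load( 'textures/positions.png' );
positionTexture.minFilter = THREE.NearestFilter;  // GL_NEAREST for minification
positionTexture.magFilter = THREE.NearestFilter;  // GL_NEAREST for magnification
positionTexture.generateMipmaps = false;          // no mipmap chain at all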

three.js / envmap / renderToTexture / single texture

I'm creating an animation using an envMap.
Everything works as expected, but instead of using 6 images inside a CubeTexture, I would like to use 6 animated textures.
I thought it would be obvious, but I've been on it for hours now; I've searched everywhere I could, and it's like nobody on earth has ever used an animated texture as an envMap...
I found some interesting posts concerning SphericalReflectionMapping based on a single texture - exactly what I am looking for, I think - but it was impossible to find any example of its use...
I also checked the Three.js shader source code to understand how envMap is done, and I found these lines:
#ifdef DOUBLE_SIDED
    float flipNormal = ( float( gl_FrontFacing ) * 2.0 - 1.0 );
#else
    float flipNormal = 1.0;
#endif
#ifdef ENVMAP_TYPE_CUBE
    vec4 envColor = textureCube( envMap, flipNormal * vec3( flipEnvMap * reflectVec.x, reflectVec.yz ) );
#elif defined( ENVMAP_TYPE_EQUIREC )
    vec2 sampleUV;
    sampleUV.y = saturate( flipNormal * reflectVec.y * 0.5 + 0.5 );
    sampleUV.x = atan( flipNormal * reflectVec.z, flipNormal * reflectVec.x ) * RECIPROCAL_PI2 + 0.5;
    vec4 envColor = texture2D( envMap, sampleUV );
#elif defined( ENVMAP_TYPE_SPHERE )
    vec3 reflectView = flipNormal * normalize( ( viewMatrix * vec4( reflectVec, 0.0 ) ).xyz + vec3( 0.0, 0.0, 1.0 ) );
    vec4 envColor = texture2D( envMap, reflectView.xy * 0.5 + 0.5 );
#endif
So it looks possible to create an envMap using texture2D instead of textureCube, but I didn't find any hint on how to define ENVMAP_TYPE_EQUIREC or ENVMAP_TYPE_SPHERE.
Can you help me, please?
I'm feeling a bit desperate about this problem and really don't know what to do.
Thank you!
(Please excuse me if my English is not perfect; it's not my native language.)
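Regarding where those defines come from: three.js sets the ENVMAP_TYPE_* define itself when it compiles the material, based on the envMap texture's mapping property, so it is not meant to be defined by hand. A sketch (r7x-era API; the file name and material variable are assumptions):
// Assigning an equirectangular mapping makes three.js compile the shader
// with ENVMAP_TYPE_EQUIREC, which selects the texture2D branch above.
var envTexture = new THREE.TextureLoader().load( 'textures/env.jpg' );
envTexture.mapping = THREE.EquirectangularReflectionMapping;
// THREE.SphericalReflectionMapping would select ENVMAP_TYPE_SPHERE instead.
material.envMap = envTexture;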

Three.js, custom shader and png texture with transparency

I have an extremely simple PNG texture: a grey circle with a transparent background.
I use it as a uniform map for a THREE.ShaderMaterial:
var uniforms = THREE.UniformsUtils.merge( [basicShader.uniforms] );
uniforms['map'].value = THREE.ImageUtils.loadTexture( "img/particle.png" );
uniforms['size'].value = 100;
uniforms['opacity'].value = 0.5;
uniforms['psColor'].value = new THREE.Color( 0xffffff );
Here is my fragment shader (just part of it):
gl_FragColor = vec4( psColor, vOpacity );
gl_FragColor = gl_FragColor * texture2D( map,vec2( gl_PointCoord.x, 1.0 - gl_PointCoord.y ) );
gl_FragColor = gl_FragColor * vec4( vColor, 1.0 );
I applied the material to some particles (THREE.PointCloud mesh) and it works quite well:
But if I turn the camera more than 180 degrees, I see this:
I understand that the fragment shader is not correctly taking into account the alpha value of the PNG texture.
What is the best approach in this case, to get the right color and opacity (from custom attributes) and still get the alpha right from the PNG?
And why is it behaving correctly on one side?
Transparent objects must be rendered from back to front -- from furthest to closest. This is because of the depth buffer.
But PointCloud particles are not sorted based on distance from the camera. That would be too inefficient. The particles are always rendered in the same order, regardless of the camera position.
You have several work-arounds.
The first is to discard fragments for which the alpha is low. You can use a pattern like so:
if ( textureColor.a < 0.5 ) discard;
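Applied to the fragment shader from the question, that pattern might look like this (a sketch; the 0.5 threshold is an assumption to tune per texture):
vec4 texColor = texture2D( map, vec2( gl_PointCoord.x, 1.0 - gl_PointCoord.y ) );
if ( texColor.a < 0.5 ) discard;   // drop mostly-transparent fragments before blending
gl_FragColor = vec4( psColor, vOpacity ) * texColor * vec4( vColor, 1.0 );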
Another option is to set material.depthTest = false or material.depthWrite = false. You might not like the side effects, however, if you have other objects in the scene.
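A sketch of that second workaround, applied to the question's ShaderMaterial:
material.depthWrite = false;  // particles stop occluding each other via the depth buffer
material.transparent = true;  // alpha blending stays enabled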
three.js r.71

How to implement a ShaderToy shader in three.js?

Looking for info on how to recreate the ShaderToy parameters iGlobalTime, iChannel, etc. within three.js. I know that iGlobalTime is the time elapsed since the shader started, and I think the iChannel stuff is for pulling RGB out of textures, but I'd appreciate info on how to set these.
Edit: I have been going through all the shaders that come with the three.js examples and think the answers are all in there somewhere - I just have to find the equivalent of, e.g., iChannel1 = a texture input.
I am not sure if you have answered your own question, but it might be good for others to know the steps for integrating ShaderToys into three.js.
First, you need to know that a ShaderToy is a fragment shader. That being said, you have to set up a "general purpose" vertex shader that should work with all ShaderToys (fragment shaders).
Step 1
Create a "general purpose" vertex shader
varying vec2 vUv;

void main() {
    vUv = uv;
    vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
    gl_Position = projectionMatrix * mvPosition;
}
This vertex shader is pretty basic. Notice that we defined a varying variable vUv to tell the fragment shader where the texture mapping is. This is important because we are not going to use the screen resolution (iResolution) for our base rendering; we will use the texture coordinates instead. We do this in order to integrate multiple ShaderToys on different objects in the same three.js scene.
Step 2
Pick the ShaderToy we want and create the fragment shader. (I have chosen a simple toy that performs well: Simple tunnel 2D by niklashuss.)
Here is the given code for this toy:
void main( void ) {
    vec2 p = gl_FragCoord.xy / iResolution.xy;
    vec2 q = p - vec2( 0.5, 0.5 );
    q.x += sin( iGlobalTime * 0.6 ) * 0.2;
    q.y += cos( iGlobalTime * 0.4 ) * 0.3;
    float len = length( q );
    float a = atan( q.y, q.x ) + iGlobalTime * 0.3;
    float b = atan( q.y, q.x ) + iGlobalTime * 0.3;
    float r1 = 0.3 / len + iGlobalTime * 0.5;
    float r2 = 0.2 / len + iGlobalTime * 0.5;
    float m = ( 1.0 + sin( iGlobalTime * 0.5 ) ) / 2.0;
    vec4 tex1 = texture2D( iChannel0, vec2( a + 0.1 / len, r1 ) );
    vec4 tex2 = texture2D( iChannel1, vec2( b + 0.1 / len, r2 ) );
    vec3 col = vec3( mix( tex1, tex2, m ) );
    gl_FragColor = vec4( col * len * 1.5, 1.0 );
}
Step 3
Customize the raw ShaderToy code into a complete GLSL fragment shader.
The first things missing from the code are the uniform and varying declarations. Add them at the top of your fragment shader file (just copy and paste the following):
uniform float iGlobalTime;
uniform sampler2D iChannel0;
uniform sampler2D iChannel1;
varying vec2 vUv;
Note that only the ShaderToy variables used by this sample are declared, plus the varying vUv previously declared in our vertex shader.
The last thing we have to tweak is the UV mapping, now that we have decided not to use the screen resolution. To do so, just replace the line that uses the iResolution uniform, i.e.:
vec2 p = gl_FragCoord.xy / iResolution.xy;
with:
vec2 p = -1.0 + 2.0 * vUv;
That's it, your shaders are now ready for use in your three.js scenes.
Step 4
Your three.js code:
Set up the uniforms:
var tuniform = {
    iGlobalTime: { type: 'f', value: 0.1 },
    iChannel0: { type: 't', value: THREE.ImageUtils.loadTexture( 'textures/tex07.jpg' ) },
    iChannel1: { type: 't', value: THREE.ImageUtils.loadTexture( 'textures/infi.jpg' ) }
};
Make sure the textures are wrapping:
tuniform.iChannel0.value.wrapS = tuniform.iChannel0.value.wrapT = THREE.RepeatWrapping;
tuniform.iChannel1.value.wrapS = tuniform.iChannel1.value.wrapT = THREE.RepeatWrapping;
Create the material with your shaders and apply it to a plane geometry. The PlaneGeometry(700, 394) will simulate ShaderToy's 700x394 screen resolution; in other words, it will best preserve the work the artist intended to share.
var mat = new THREE.ShaderMaterial( {
    uniforms: tuniform,
    vertexShader: vshader,
    fragmentShader: fshader,
    side: THREE.DoubleSide
} );
var tobject = new THREE.Mesh( new THREE.PlaneGeometry( 700, 394, 1, 1 ), mat );
Finally, in your update function, add the delta of THREE.Clock() to the iGlobalTime value, not the total elapsed time:
tuniform.iGlobalTime.value += clock.getDelta();
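For example, a minimal update loop might look like this (renderer, scene, and camera are assumed to be set up elsewhere):
// Accumulate per-frame deltas into iGlobalTime and re-render each frame.
var clock = new THREE.Clock();
function update() {
    requestAnimationFrame( update );
    tuniform.iGlobalTime.value += clock.getDelta();
    renderer.render( scene, camera );
}
update();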
That is it, you are now able to run most ShaderToys with this setup...
2022 edit: The version of ShaderFrog described below is no longer being actively developed. There are bugs in the compiler it uses, so it cannot parse all shaders correctly for import, and it doesn't support many of ShaderToy's features, like multiple image buffers. I'm working on a new tool if you want to follow along; otherwise you can try the following method, but it likely won't work most of the time.
Original answer follows:
This is an old thread, but there's now an automated way to do this. Simply go to http://shaderfrog.com/app/editor/new, click "Import > ShaderToy" at the top right, and paste in the URL. If it's not public, you can paste in the raw source code. Then you can save the shader (requires sign-up, no email confirmation) and click "Export > Three.js".
You might need to tweak the parameters a little after import, but I hope to improve this over time. For example, ShaderFrog doesn't support audio or video inputs yet, but you can preview them with images instead.
Proof of concept:
ShaderToy https://www.shadertoy.com/view/MslGWN
ShaderFrog http://shaderfrog.com/app/view/247
Full disclosure: I am the author of this tool, which I launched last week. I think this is a useful feature.
This is based on various sources, including the answer from @INF1.
Basically, you insert the missing uniform variables from ShaderToy (iGlobalTime etc.; see this list: https://www.shadertoy.com/howto) into the fragment shader, then you rename mainImage(out vec4 z, in vec2 w) to main(), and then you change z in the source code to gl_FragColor. In most ShaderToys 'z' is 'fragColor'.
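A sketch of that conversion (the shader body is a placeholder, not a real toy):
uniform vec2 iResolution;   // supplied from three.js as a uniform
uniform float iGlobalTime;

// ShaderToy's entry point was: void mainImage( out vec4 fragColor, in vec2 fragCoord )
void main() {
    vec2 fragCoord = gl_FragCoord.xy;   // fragCoord comes from gl_FragCoord
    vec2 uv = fragCoord / iResolution.xy;
    gl_FragColor = vec4( uv, 0.5 + 0.5 * sin( iGlobalTime ), 1.0 );   // was: fragColor = ...
}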
I did this for two cool shaders from this user (https://www.shadertoy.com/user/guil), but unfortunately I didn't get the marble example to work (https://www.shadertoy.com/view/MtX3Ws).
A working jsFiddle is here: https://jsfiddle.net/dirkk0/zt9dhvqx/
Change the shader from frag1 to frag2 in line 56 to see both examples.
And don't click 'Tidy' in jsFiddle - it breaks the shaders.
EDIT:
https://medium.com/@dirkk/converting-shaders-from-shadertoy-to-threejs-fe17480ed5c6

Fragment shader mixing two cube textures based on alpha not working

I have a problem that's completely eluding me. I am trying to mix two cube textures together, but for some reason I cannot seem to mix one texture into the other based on alpha. The function looks like this:
vec4 baseColor = textureCube( tCube, vec3( tFlip * vWorldPosition.x, vWorldPosition.yz ) );
vec4 atmosphereColor = textureCube( tAtmosphere, vec3( tFlip * vWorldPosition.x, vWorldPosition.yz ) );
vec4 texelColor = mix( atmosphereColor, baseColor, atmosphereColor.a );
gl_FragColor = texelColor;
All the materials concerned have depthWrite: true, depthTest: true, and transparent: true. One is a texture with 6 images and the other is a WebGLRenderTarget. The WebGLRenderTarget is created with an alpha channel:
format: THREE.RGBAFormat
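For reference, such a cube render target might be created like this (a sketch; the size and filters are assumptions, r7x-era API):
// RGBAFormat keeps the alpha channel in the render target that is
// later sampled as tAtmosphere.
var cubeTarget = new THREE.WebGLRenderTargetCube( 512, 512, {
    format: THREE.RGBAFormat,
    minFilter: THREE.LinearFilter,
    magFilter: THREE.LinearFilter
} );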
Individually they both work as expected. For example, if I use gl_FragColor = atmosphereColor;, the alpha is visible. But as soon as I try to mix one with the other, no alpha is applied. So, for example, when I do this:
gl_FragColor = vec4( baseColor.r, baseColor.g, baseColor.b, atmosphereColor.a );
it still will not use the alpha value provided (it just sets it to 1.0) - even though I know the value is present and varying, because when I set gl_FragColor to atmosphereColor I could see it varying.
Any ideas? :(
I was able to find the source of the problem. It was related to the clearAlpha parameter of the renderer, which needed to be set to 0 instead of 1 in my case.
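In code, that might look like this (a sketch; the clear color value is an assumption):
// A clear alpha of 0 keeps the renderer from filling the alpha channel
// with 1.0 on clear, so the render target's alpha survives.
renderer.setClearColor( 0x000000, 0 );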
