I have a problem repeating textures that I pass to a shader as uniforms, because with:
tex.wrapS = THREE.RepeatWrapping;
tex.wrapT = THREE.RepeatWrapping;
tex.repeat.x=100;
tex.repeat.y=100;
it doesn't work. So I searched the web for a solution and found the following thread on GitHub:
https://github.com/mrdoob/three.js/issues/787
The thread starter has the same problem as I do, but unfortunately the links with the answers don't work anymore.
What do I have to do to get my uniform textures repeated?
Are you writing your own fragment shader? If so, you'll need to multiply the UV coordinates by the repeat values, e.g.
uniform sampler2D baseTexture;
varying vec2 vUv;
void main()
{
    gl_FragColor = texture2D( baseTexture, vUv * 100.0 );
}
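With a ShaderMaterial you can also pass the repeat factor in as a uniform instead of hard-coding it, so it can be changed from JavaScript. A minimal sketch (the names tex, vertexShader and repeat are just examples, and the vertex shader is assumed to pass vUv through as in the snippets further down):
var material = new THREE.ShaderMaterial({
    uniforms: {
        baseTexture: { type: 't', value: tex }, // tex still needs RepeatWrapping set, as above
        repeat: { type: 'v2', value: new THREE.Vector2( 100, 100 ) }
    },
    vertexShader: vertexShader,
    fragmentShader: [
        "uniform sampler2D baseTexture;",
        "uniform vec2 repeat;",
        "varying vec2 vUv;",
        "void main() {",
        "    gl_FragColor = texture2D( baseTexture, vUv * repeat );",
        "}"
    ].join("\n")
});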
If not, http://stemkoski.github.io/Three.js/Texture-Repeat.html contains an example of texture repeating, e.g.
// texture repeated twice in each direction
var lavaTexture = THREE.ImageUtils.loadTexture( 'images/lava.jpg' );
lavaTexture.wrapS = lavaTexture.wrapT = THREE.RepeatWrapping;
lavaTexture.repeat.set( 2, 2 );
var lavaMaterial = new THREE.MeshBasicMaterial( { map: lavaTexture } );
Related
I have a Three.js shader material that chromakeys a video texture so that the green becomes transparent. That part works fine.
Now I am trying to modify it so that it is affected by the intensity of the ambient light; basically, what I want is for the playing video to become darker when the ambient light's intensity is lower.
On images I can do that fine by simply using a standard material, so I've tried adding two separate materials to the video (the chromakey shader material and a standard one), but that didn't help.
So I started doing some research and digging into the code of the chromakey shader (which was not written by me), and I made the following changes:
I've merged the original uniforms with the ones from THREE.UniformsLib["lights"]
I've enabled the lights in the shader material's parameters
Now the question is: how do I access the ambient light's intensity value (which is constantly updating, by the way) inside the fragment shader, and how do I make the pixels darker depending on that intensity value (which is between 0 and 1)?
Shader Material
var uniforms = THREE.UniformsUtils.clone(THREE.UniformsLib["lights"]);
uniforms['color'] = { type: 'c', value: data.color };
uniforms['texture'] = { type: 't', value: videoTexture };
this.material = new THREE.ShaderMaterial({
    uniforms: uniforms,
    vertexShader: this.vertexShader,
    fragmentShader: this.fragmentShader,
    lights: true
});
Vertex Shader
varying vec2 vUv;
void main(void)
{
    vUv = uv;
    vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
    gl_Position = projectionMatrix * mvPosition;
}
Fragment Shader
uniform sampler2D texture;
uniform vec3 color;
varying vec2 vUv;
void main(void)
{
    vec3 tColor = texture2D( texture, vUv ).rgb;
    float a = (length(tColor - color) - 0.5) * 7.0;
    gl_FragColor = vec4(tColor, a);
}
I basically need to modify tColor according to the light's intensity but like I said, I have no idea how to access that value and how to darken/brighten the color according to it.
Adding brightness control to your fragment shader
If it's just a simple brightness control, you can multiply the frag color by a scalar.
Example frag shader
uniform sampler2D texture;
uniform vec3 color;
uniform float brightness; // added uniform to control brightness
varying vec2 vUv;
void main(void) {
    vec3 tColor = texture2D( texture, vUv ).rgb;
    float a = (length(tColor - color) - 0.5) * 7.0;
    gl_FragColor = vec4(tColor * brightness, a); // scale brightness of rgb channels
}
Then, in JavaScript, add code to support the new uniform:
const BRIGHTNESS_MAX = 1;   // Default value and maximum brightness. Do NOT change this
                            // value to set the upper limit; use MAX_LUX (see below) instead.
const BRIGHTNESS_MIN = 0.7; // The darkest you want. A value of 0 is black.

// Add brightness to the same uniforms object you got with
// uniforms = THREE.UniformsUtils.clone(THREE.UniformsLib["lights"]);
uniforms.brightness = { value: BRIGHTNESS_MAX }; // default to max
To change the value of the uniform
uniforms.brightness.value = brightness; // set the new value
From the three.js docs:
"All uniforms values can be changed freely (e.g. colors, textures, opacity, etc), values are sent to the shader every frame."
So that is all that is needed to add the brightness control.
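If the value you want to follow is the intensity of a THREE.AmbientLight in the scene, one way is to copy it into the uniform once per frame. A sketch, assuming the light is called ambientLight, its intensity stays between 0 and 1, and uniforms is the same object passed to the ShaderMaterial:
function animate() {
    requestAnimationFrame( animate );
    // Clamp to the allowed range so the video never goes fully black.
    uniforms.brightness.value = Math.min( BRIGHTNESS_MAX,
        Math.max( BRIGHTNESS_MIN, ambientLight.intensity ) );
    renderer.render( scene, camera );
}
animate();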
Using the ambient light sensor
I will assume you have access to the sensor value. The sensor reports the light level as an absolute value in lux: 10 is dark, ~707 is normal, and 10000 plus is bright.
You will have to calibrate the sensor reading by choosing which lux value corresponds to BRIGHTNESS_MAX and setting BRIGHTNESS_MIN to the darkest you want the image to become.
As the scaling and dynamic range of the light sensor and the display device are very different, the following function assumes that MAX_LUX and white on the rendered image have the same brightness.
The following function converts a lux value to a brightness value:
const MAX_LUX = 5000; // This lux value and above will set brightness to max
function LUX2Brightness(lux) {
    if (lux >= MAX_LUX) { return BRIGHTNESS_MAX; }
    // Do not set MIN manually; set BRIGHTNESS_MIN to control the low light level.
    const MIN = (BRIGHTNESS_MIN ** 2.2) * MAX_LUX;
    if (lux <= MIN) { return BRIGHTNESS_MIN; }
    return (lux ** (1 / 2.2)) / (MAX_LUX ** (1 / 2.2));
}
To use the above function with the shader
// luminosity is value from ambient light sensor event
uniforms.brightness.value = LUX2Brightness(luminosity);
The assumption is that you set MAX_LUX to the actual lux output of an all-white rendered image (best of luck with that).
IMPORTANT!!
There is no absolute solution to levels.
Human vision is adaptive. How you calibrate the min and max will depend on how your eyes have adapted to the current light levels; on the current brightness, color (and other) settings of the device displaying the rendered content; on the current camera settings such as exposure and white balance; and on your personal artistic preferences.
All of these things are usually set automatically, so a setting that looks good now may not be what is desired in the morning, or when you come back from a coffee break.
All the code
Fragment shader
uniform sampler2D texture;
uniform vec3 color;
uniform float brightness;
varying vec2 vUv;
void main(void) {
    vec3 tColor = texture2D( texture, vUv ).rgb;
    float a = (length(tColor - color) - 0.5) * 7.0;
    gl_FragColor = vec4(tColor * brightness, a);
}
JavaScript setup code
const BRIGHTNESS_MAX = 1; // Don't change this value
const BRIGHTNESS_MIN = 0.7;
const MAX_LUX = 2000;
uniforms.brightness = {value: BRIGHTNESS_MAX};
function LUX2Brightness(lux) {
    if (lux >= MAX_LUX) { return BRIGHTNESS_MAX; }
    const MIN = (BRIGHTNESS_MIN ** 2.2) * MAX_LUX;
    if (lux <= MIN) { return BRIGHTNESS_MIN; }
    return (lux ** (1 / 2.2)) / (MAX_LUX ** (1 / 2.2));
}
Sensor reading
Put the following line in the sensor event handler, e.g. the "devicelight" event listener.
uniforms.brightness.value = LUX2Brightness(event.value);
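For completeness, a sketch of the listener wiring (the "devicelight" event has limited browser support, so treat it as illustrative):
// event.value is the ambient light level in lux on supporting browsers.
window.addEventListener( "devicelight", function ( event ) {
    uniforms.brightness.value = LUX2Brightness( event.value );
} );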
I'm looking for info on how to recreate the ShaderToy parameters iGlobalTime, iChannel etc. within three.js. I know that iGlobalTime is the time elapsed since the shader started, and I think the iChannel stuff is for pulling RGB out of textures, but I would appreciate info on how to set these.
Edit: I have been going through all the shaders that come with the three.js examples and think the answers are all in there somewhere; I just have to find the equivalent of, e.g., iChannel1 = a texture input, etc.
I am not sure if you have already answered your question, but it might be good for others to know the steps for integrating Shadertoy shaders into three.js.
First, you need to know that a Shadertoy is a fragment shader. That being said, you have to set up a "general purpose" vertex shader that should work with all Shadertoys (fragment shaders).
Step 1
Create a "general purpose" vertex shader
varying vec2 vUv;
void main()
{
    vUv = uv;
    vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
    gl_Position = projectionMatrix * mvPosition;
}
This vertex shader is pretty basic. Notice that we defined a varying variable vUv to pass the texture coordinates to the fragment shader. This is important because we are not going to use the screen resolution (iResolution) for our base rendering; we will use the texture coordinates instead. We have done that in order to integrate multiple Shadertoys on different objects in the same three.js scene.
Step 2
Pick the Shadertoy that we want and create the fragment shader. (I have chosen a simple toy that performs well: Simple tunnel 2D by niklashuss.)
Here is the given code for this toy:
void main(void)
{
    vec2 p = gl_FragCoord.xy / iResolution.xy;
    vec2 q = p - vec2(0.5, 0.5);
    q.x += sin(iGlobalTime * 0.6) * 0.2;
    q.y += cos(iGlobalTime * 0.4) * 0.3;
    float len = length(q);
    float a = atan(q.y, q.x) + iGlobalTime * 0.3;
    float b = atan(q.y, q.x) + iGlobalTime * 0.3;
    float r1 = 0.3 / len + iGlobalTime * 0.5;
    float r2 = 0.2 / len + iGlobalTime * 0.5;
    float m = (1.0 + sin(iGlobalTime * 0.5)) / 2.0;
    vec4 tex1 = texture2D(iChannel0, vec2(a + 0.1 / len, r1));
    vec4 tex2 = texture2D(iChannel1, vec2(b + 0.1 / len, r2));
    vec3 col = vec3(mix(tex1, tex2, m));
    gl_FragColor = vec4(col * len * 1.5, 1.0);
}
Step 3
Customize the shadertoy raw code to have a complete GLSL fragment shader.
The first things missing from the code are the uniform and varying declarations. Add them at the top of your frag shader file (just copy and paste the following):
uniform float iGlobalTime;
uniform sampler2D iChannel0;
uniform sampler2D iChannel1;
varying vec2 vUv;
Note that only the Shadertoy variables used by this sample are declared, plus the varying vUv previously declared in our vertex shader.
The last thing we have to tweak is the UV mapping, now that we have decided not to use the screen resolution. To do so, just replace the line that uses the iResolution uniform, i.e.:
vec2 p = gl_FragCoord.xy / iResolution.xy;
with:
vec2 p = -1.0 + 2.0 * vUv;
That's it, your shaders are now ready for use in your three.js scenes.
Step 4
Your three.js code:
Set up the uniforms:
var tuniform = {
    iGlobalTime: { type: 'f', value: 0.1 },
    iChannel0: { type: 't', value: THREE.ImageUtils.loadTexture( 'textures/tex07.jpg' ) },
    iChannel1: { type: 't', value: THREE.ImageUtils.loadTexture( 'textures/infi.jpg' ) }
};
Make sure the textures are wrapping:
tuniform.iChannel0.value.wrapS = tuniform.iChannel0.value.wrapT = THREE.RepeatWrapping;
tuniform.iChannel1.value.wrapS = tuniform.iChannel1.value.wrapT = THREE.RepeatWrapping;
Create the material with your shaders and add it to a plane geometry. A 700x394 plane will simulate the Shadertoy screen resolution; in other words, it will best transfer the work the artist intended to share.
var mat = new THREE.ShaderMaterial( {
    uniforms: tuniform,
    vertexShader: vshader,
    fragmentShader: fshader,
    side: THREE.DoubleSide
} );
var tobject = new THREE.Mesh( new THREE.PlaneGeometry( 700, 394, 1, 1 ), mat );
Finally, in your update function, add the delta of the THREE.Clock() to the iGlobalTime value, not the total elapsed time.
tuniform.iGlobalTime.value += clock.getDelta();
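For example, an update loop along these lines (renderer, scene and camera are assumed to exist already):
var clock = new THREE.Clock();
function animate() {
    requestAnimationFrame( animate );
    tuniform.iGlobalTime.value += clock.getDelta(); // accumulate the per-frame delta
    renderer.render( scene, camera );
}
animate();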
That is it, you are now able to run most Shadertoys with this setup.
2022 edit: The version of Shaderfrog described below is no longer being actively developed. There are bugs in the compiler used making it not able to parse all shaders correctly for import, and it doesn't support many of Shadertoy's features, like multiple image buffers. I'm working on a new tool if you want to follow along, otherwise you can try the following method, but it likely won't work most of the time.
Original answer follows:
This is an old thread, but there's now an automated way to do this. Simply go to http://shaderfrog.com/app/editor/new and on the top right click "Import > ShaderToy" and paste in the URL. If it's not public you can paste in the raw source code. Then you can save the shader (requires sign up, no email confirm), and click "Export > Three.js".
You might need to tweak the parameters a little after import, but I hope to have this improved over time. For example, ShaderFrog doesn't support audio or video inputs yet, but you can preview them with images instead.
Proof of concept:
ShaderToy https://www.shadertoy.com/view/MslGWN
ShaderFrog http://shaderfrog.com/app/view/247
Full disclosure: I am the author of this tool which I launched last week. I think this is a useful feature.
This is based on various sources, including the answer from @INF1.
Basically, you insert the missing uniform variables from Shadertoy (iGlobalTime etc., see this list: https://www.shadertoy.com/howto) into the fragment shader, then you rename mainImage(out vec4 z, in vec2 w) to main(), and then you change z in the source code to 'gl_FragColor'. In most Shadertoys 'z' is named 'fragColor'.
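As a tiny illustration of that renaming, using a made-up one-line Shadertoy:
// Shadertoy source:
//   void mainImage( out vec4 fragColor, in vec2 fragCoord ) { fragColor = vec4( 1.0 ); }
// After conversion, usable as a three.js fragment shader string:
var fshader = [
    "void main() {",
    "    gl_FragColor = vec4( 1.0 );", // the 'out' parameter (fragColor) becomes gl_FragColor
    "}"
].join("\n");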
I did this for two cool shaders from this guy (https://www.shadertoy.com/user/guil) but unfortunately I didn't get the marble example to work (https://www.shadertoy.com/view/MtX3Ws).
A working jsFiddle is here: https://jsfiddle.net/dirkk0/zt9dhvqx/
Change the shader from frag1 to frag2 in line 56 to see both examples.
And don't 'Tidy' in jsFiddle - it breaks the shaders.
EDIT:
https://medium.com/#dirkk/converting-shaders-from-shadertoy-to-threejs-fe17480ed5c6
I'm trying to create a skybox using the cubemap shader (like in the examples) and noticed a distortion when you translate the mesh.
Say you create a cube of 1 unit in width, height, and depth, set the side to THREE.BackSide and depthWrite to false, and then scale the mesh to, say, 1000 units in the x, y, and z fields.
When the mesh is positioned in the center of the world everything is fine. But as soon as you translate the mesh the cube map starts to distort badly.
You would want to move the mesh to the same position as the camera, so that the skybox never reaches its limits if the user walks around.
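For reference, a simplified sketch of the setup described (the texture URLs are placeholders, and cubeShader stands for the 'cube' shader entry shown below):
var uniforms = THREE.UniformsUtils.clone( cubeShader.uniforms );
uniforms['tCube'].value = THREE.ImageUtils.loadTextureCube( urls ); // urls: six image paths
var skyMaterial = new THREE.ShaderMaterial({
    uniforms: uniforms,
    vertexShader: cubeShader.vertexShader,
    fragmentShader: cubeShader.fragmentShader,
    side: THREE.BackSide,
    depthWrite: false
});
var skybox = new THREE.Mesh( new THREE.CubeGeometry( 1, 1, 1 ), skyMaterial );
skybox.scale.set( 1000, 1000, 1000 );
scene.add( skybox );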
The shader code I'm using is this:
'cube': {
    uniforms: {
        "tCube": { type: "t", value: null },
        "tFlip": { type: "f", value: -1 }
    },
    vertexShader: [
        "varying vec3 vWorldPosition;",
        "void main() {",
        "    vec4 worldPosition = modelMatrix * vec4( position, 1.0 );",
        "    vWorldPosition = worldPosition.xyz;",
        "    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );",
        "}"
    ].join("\n"),
    fragmentShader: [
        "uniform samplerCube tCube;",
        "uniform float tFlip;",
        "varying vec3 vWorldPosition;",
        "void main() {",
        "    gl_FragColor = textureCube( tCube, vec3( tFlip * vWorldPosition.x, vWorldPosition.yz ) );",
        "}"
    ].join("\n")
}
Does anyone know if the shader can be modified to prevent this distortion?
Many thanks!
After doing some research, I found that the cube map shaders for skyboxes rely on the camera being at the center of the world. So to get this working in the scenario I described above, instead of setting the position of the skybox to the camera, I simply set the camera's world position to 0.
Just before rendering the skybox you need to do this:
// Get the current position
this._prevCamPos.getPositionFromMatrix( camera.matrixWorldInverse );
// Now set the position of the camera to be 0,0,0
camera.matrixWorldInverse.elements[12] = 0;
camera.matrixWorldInverse.elements[13] = 0;
camera.matrixWorldInverse.elements[14] = 0;
Then, just after it's rendered, it needs to go back:
// Now revert the camera back
camera.matrixWorldInverse.elements[12] = this._prevCamPos.x;
camera.matrixWorldInverse.elements[13] = this._prevCamPos.y;
camera.matrixWorldInverse.elements[14] = this._prevCamPos.z;
I do two rendering passes in a WebGL application using three.js (contrived example here):
renderer.render(depthScene, camera, depthTarget);
renderer.render(scene, camera);
The first rendering pass is to the render target depthTarget which I want to access in the second rendering pass as a texture uniform:
uniform sampler2D tDepth;
float unpack_depth( const in vec4 rgba_depth ) { ... }
void main() {
    vec2 screenTexCoord = vec2( 1.0, 1.0 );
    float depth = 1.0 - unpack_depth( texture2D( tDepth, screenTexCoord ) );
    gl_FragColor = vec4( vec3( depth ), 1.0 );
}
My question is how do I get the value for screenTexCoord? It is not gl_FragCoord.xy.
To avoid a possible misunderstanding: I don't want to render the texture from the first pass to a quad. I want to use the texture from the first pass while rendering the geometry in the second pass.
EDIT:
According to the WebGL specification, gl_FragCoord contains window coordinates, which are normalized device coordinates (NDC) scaled by the viewport. The NDC are within [-1, 1], so the following should yield coordinates within [0, 1] for texture lookup:
vec2 ndcXY = gl_FragCoord.xy / vec2( viewWidth, viewHeight );
vec2 screenTexCoord = (ndcXY+1.0)/2.0;
But somewhere I must be wrong, because the updated example still does not show the (packed) depth?!
I finally figured it out myself. The correct way to calculate the texture coordinates is just:
vec2 screenTexCoord = gl_FragCoord.xy / vec2( viewWidth, viewHeight );
See a working example here.
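The viewWidth and viewHeight uniforms have to be supplied from JavaScript and kept in sync with the render size; a minimal sketch (uniform names as used in the shader above, renderer size assumed to be the window size):
uniforms.viewWidth  = { type: 'f', value: window.innerWidth };
uniforms.viewHeight = { type: 'f', value: window.innerHeight };
window.addEventListener( 'resize', function () {
    uniforms.viewWidth.value  = window.innerWidth;
    uniforms.viewHeight.value = window.innerHeight;
} );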
I'm trying to write a simple shader where half of my scene will be displayed and half of the scene will be transparent. I can't seem to figure out why the transparency isn't working:
uniform sampler2D tDiffuse;
varying vec2 vUv;
void main() {
    vec2 p = vUv;
    vec4 color;
    if (p.x < 0.5) {
        color = vec4(1.0, 0.0, 0.0, 0.0);
    } else {
        color = texture2D(tDiffuse, p);
    }
    gl_FragColor = color;
}
The shader is definitely running without errors - the right half of the screen is my threejs scene and the left half of the screen is red (when it should really be transparent). I've read that maybe I need to call glBlendFunc(GL_SRC_ALPHA); - but I am getting errors when I try this. To do this I did renderer.context.blendFuncSeparate(GL_SRC_ALPHA); in my main js file (not the shader). Am I supposed to place this somewhere else to make it work?
Any help would be appreciated. For reference I'm applying my shader with the standard effectsComposer, shaderPass, etc - which most threejs shader examples use.
Thanks in advance for your help!!!
It is difficult to help you with only partial information and code snippets, so I can only make educated guesses.
By default, EffectComposer uses a render target with RGB format. Did you specify RGBA?
Did you specify material.transparent = true?
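Both can be set explicitly; a sketch (the sizes are placeholders and shaderPass stands for your ShaderPass instance):
var renderTarget = new THREE.WebGLRenderTarget( window.innerWidth, window.innerHeight, {
    minFilter: THREE.LinearFilter,
    magFilter: THREE.LinearFilter,
    format: THREE.RGBAFormat // keep the alpha channel written by the shader
} );
var composer = new THREE.EffectComposer( renderer, renderTarget );
shaderPass.material.transparent = true;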
three.js r.56
I had this problem and for me it was that the material didn't have transparency enabled.
let myMaterial = new THREE.ShaderMaterial({
    uniforms: myUniforms,
    fragmentShader: myFragmentShader(),
    vertexShader: myVertexShader(),
});
myMaterial.transparent = true;