I am attempting to make a first-person experiment incorporating FBO particles (similar to Mr.doob's Sporel).
I create the position of my particles in a data texture, and then in my vertex shader I do a lookup on the texture to get the position of that particle.
uniform sampler2D map;
varying vec2 vUv;
varying vec3 vPosition;
void main(void) {
vec2 uv = position.xy;
vec4 data = texture2D(map, uv);
vPosition = data.xyz;
gl_PointSize = 1.0;
gl_Position = projectionMatrix * modelViewMatrix * vec4(vPosition, 1.0);
}
This is working great, and initially I see a field of particles as shown here...
The problem is that when I walk a bit and rotate, all of the particles suddenly disappear. I did a bunch of searching for a solution and found this thread, which suggested setting material.depthWrite to false. I did this, to no avail; here is my ShaderMaterial definition:
var particleMaterial = new THREE.ShaderMaterial({
uniforms: uniforms,
vertexShader: shaders.vertexShaders.floor,
fragmentShader: shaders.fragmentShaders.floor,
blending: THREE.AdditiveBlending,
depthWrite: false,
})
I am at a loss right now and any help would be greatly appreciated!
Thanks! Eric
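For what it's worth, one common cause of exactly this symptom with FBO particles is frustum culling: three.js computes the bounding sphere from the geometry's position attribute, which in this setup only holds texture-lookup coordinates, so the whole Points object gets culled once that tiny (and wrong) bounding volume leaves the view frustum. This is an assumption rather than a confirmed diagnosis, and the object names below are hypothetical; a minimal sketch of the usual workaround:

var particlePoints = new THREE.Points(particleGeometry, particleMaterial); // hypothetical names
particlePoints.frustumCulled = false; // skip culling, since the attribute values are not the rendered positions
// Alternatively, keep culling but assign a bounding sphere that encloses the real particle field:
// particleGeometry.boundingSphere = new THREE.Sphere(new THREE.Vector3(0, 0, 0), 1000);
scene.add(particlePoints);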
I'm trying to write my own reflection shader; however, the reflection does not work correctly when I try to rotate the sphere. It works only while the sphere is static. Here is my GLSL code:
sphereMaterial = new THREE.ShaderMaterial({
uniforms: {
uCubeMap: { value: cubeCamera.renderTarget.texture },
},
vertexShader: `
varying vec2 vUv;
varying vec3 vViewVector;
varying vec3 vNormal;
void main() {
vec4 modelPosition = modelMatrix * vec4(position, 1.0);
vec4 viewPosition = viewMatrix * modelPosition;
vec4 projectedPosition = projectionMatrix * viewPosition;
vViewVector = modelPosition.xyz - cameraPosition;
vNormal = normal;
gl_Position = projectedPosition;
}
`,
fragmentShader: `
uniform samplerCube uCubeMap;
varying vec3 vViewVector;
varying vec3 vNormal;
void main() {
vec3 reflectedDirection = normalize(reflect(vViewVector, vNormal));
reflectedDirection.x = -reflectedDirection.x;
vec3 textureColor = textureCube(uCubeMap, reflectedDirection).rgb;
gl_FragColor = vec4(textureColor, 1.0);
}
`,
});
Complete example: https://codepen.io/marianban/pen/VwPeYer?editors=0010. The sphere rotation is on line 143: sphereGroup.rotation.x += 0.01;. It works fine if you comment that line out. It looks like the reflection is not taking the sphere rotation into account.
I found several similar questions: GLSL cubemap reflection shader, Texturing Spheres with Cubemaps (not reflection maps), but unfortunately they didn't help me.
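A likely culprit here (an assumption on my part, not verified against the CodePen) is that vNormal is passed through in object space while vViewVector is computed in world space, so rotating the sphere never changes the normal the reflection uses. A minimal sketch of the usual fix, rotating the normal into world space in the vertex shader (only valid for uniform scaling):

// in the vertex shader, instead of vNormal = normal;
vNormal = normalize( mat3( modelMatrix ) * normal ); // non-uniform scaling would need the inverse-transpose of modelMatrix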
I tried a very simple test using Three.js ShaderMaterial.
I load a 2048x2048 jpg image as a texture for my height map and apply it to deform a PlaneBufferGeometry in the vertex shader.
I also apply the same texture for the diffuse color in the fragment shader.
Overall it works, but I see some big artifacts, as shown in this screenshot.
The artifact always appears along a line parallel to the X axis and passing through the camera.
I have the problem on every three.js version I tried (r105, r114).
The code is quite simple; does anyone know what I am doing wrong?
Javascript
var textureLoader = new THREE.TextureLoader();
var testTextureBump = textureLoader.load( './front_b.jpg' );
var testGeometry = new THREE.PlaneBufferGeometry(3000, 3000, 500, 500);
var testUniforms = {
uTextureBump: { value: testTextureBump }
};
var testMaterial = new THREE.ShaderMaterial({
uniforms: testUniforms,
vertexShader: document.getElementById( 'vertexShader' ).textContent,
fragmentShader: document.getElementById( 'fragmentShader' ).textContent,
side: THREE.FrontSide,
blending: THREE.NormalBlending,
depthWrite: false,
wireframe: false,
transparent: true
});
var testMesh = new THREE.Mesh( testGeometry, testMaterial );
scene.add( testMesh );
Vertex shader
uniform sampler2D uTextureBump;
varying vec2 vUv;
void main() {
vUv = uv;
vec4 diffuseTexture = texture2D(uTextureBump, uv);
vec3 positionHeight = position.xyz;
positionHeight.z += diffuseTexture.r * 20.0;
gl_Position = projectionMatrix * modelViewMatrix * vec4(positionHeight, 1.0);
}
Fragment shader
precision highp float;
precision highp int;
uniform sampler2D uTextureBump;
varying vec2 vUv;
void main (void) {
vec4 texture = texture2D(uTextureBump, vUv);
gl_FragColor = vec4( texture.rgb, 1.0 );
}
You can see the problem in this demo
Move your mouse on the left or right and you'll see the artifacts.
You can fly around, as I use the standard THREE.FlyControls.
The corresponding project file can be downloaded here.
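One thing worth double-checking (an assumption, not a confirmed diagnosis): the material disables depth writes and is flagged as transparent even though the displaced terrain is opaque, so triangles of the plane can overdraw each other in draw order rather than depth order, which tends to produce view-dependent artifacts like these. A minimal variation of the same material to test:

var testMaterial = new THREE.ShaderMaterial({
    uniforms: testUniforms,
    vertexShader: document.getElementById( 'vertexShader' ).textContent,
    fragmentShader: document.getElementById( 'fragmentShader' ).textContent,
    side: THREE.FrontSide,
    depthWrite: true,   // let the displaced plane depth-test against itself
    transparent: false  // an opaque terrain does not need alpha blending
});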
I have this geometry: Picture
I want to add the same kind of effect, where mountains get a snow texture at higher elevations and so on:
Texture splatting with Three.js
A little background on what info I pass to the shaders from Three.js:
//importing grass and snow textures:
var grassTexture = new THREE.ImageUtils.loadTexture( 'images/grass-512.jpg' );
grassTexture.wrapS = grassTexture.wrapT = THREE.RepeatWrapping;
var snowTexture = new THREE.ImageUtils.loadTexture( 'images/snow-512.jpg' );
snowTexture.wrapS = snowTexture.wrapT = THREE.RepeatWrapping;
this.customUniforms = {
grassTexture: { value: grassTexture },
snowTexture: { value: snowTexture },
};
var customMaterial = new THREE.ShaderMaterial({
uniforms: this.customUniforms,
side: THREE.DoubleSide,
vertexShader: document.getElementById( 'vertexShader' ).textContent,
fragmentShader: document.getElementById( 'fragmentShader' ).textContent,
});
// creating the mesh; geometry is the model in the picture.
mesh = new THREE.Mesh(geometry, customMaterial);
Vertex and fragment shaders:
//vertexShader:
varying vec2 vUV;
void main(){
vUV = uv;
gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
I get a fully red model with this:
//fragmentShader:
void main(){
gl_FragColor = vec4(1, 0.0, 0.0, 1.0) ;
}
I want the higher parts to use snowTexture and the lower parts to use grassTexture.
uniform sampler2D grassTexture;
uniform sampler2D snowTexture;
varying vec2 vUV;
//Something like this?:
void main() {
    vec4 grass = texture2D( grassTexture, vUV );
    vec4 snow = texture2D( snowTexture, vUV );
    gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0) + grass + snow;
}
This is really not that hard to understand; let me walk you through the logic.
In your case, you don't want to use a displacement map, so you need to set up a varying height in your vertex shader that maps each vertex's up-coordinate into the [0, 1] range and passes it to your fragment shader.
//vertexShader:
varying vec2 vUV;
varying float height;
void main() {
vUV = uv;
float maxPosition = 30.0; // this is an example value.
height = max( 0.0, min(1.0, position.y/maxPosition ) ); // assuming +y is up
gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
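As an aside, maxPosition is hardcoded above as an example value; a hypothetical variant could drive it from JavaScript instead via a uniform (e.g. adding uMaxHeight: { value: 30.0 } to customUniforms):

uniform float uMaxHeight; // hypothetical uniform supplied from JavaScript
varying vec2 vUV;
varying float height;
void main() {
    vUV = uv;
    height = clamp( position.y / uMaxHeight, 0.0, 1.0 ); // same as the max/min above, assuming +y is up
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}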
Now you can access height from your fragmentShader and use that information to select where you want your transitions to occur.
uniform sampler2D grassTexture;
uniform sampler2D snowTexture;
varying vec2 vUV;
varying float height;
void main(){
vec4 grass = (1.0 - smoothstep( 0.48, 0.52, height)) * texture2D( grassTexture, vUV);
vec4 snow = (smoothstep(0.48, 0.52, height) - 0.0) * texture2D( snowTexture, vUV);
gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0) + grass + snow;
}
The link you provided uses the smoothstep function to make a gradual transition between the textures. We can create transitions using the following pattern: ( a - b ) * textureColor.
In this case, a controls when the texture starts to contribute to the fragment color.
b controls when the texture stops contributing.
In other words, your grass texture contributes from the very bottom, so we set a to 1.0. It should stop contributing around 0.5, so we make b a smoothstep that ramps up around 0.5, fading the grass out.
Your snow texture, on the other hand, only starts contributing around 0.5, so we make a a smoothstep that fades in around 0.5. It never stops contributing, so we set b to 0.0.
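The same ( a - b ) pattern generalizes to more bands. For example, a hypothetical third rockTexture between grass and snow would fade in where grass fades out and fade out where snow fades in (with the grass and snow thresholds moved apart to make room):

// hypothetical rockTexture uniform; grass would then fade out over 0.28-0.32 and snow fade in over 0.68-0.72
vec4 rock = ( smoothstep( 0.28, 0.32, height ) - smoothstep( 0.68, 0.72, height ) ) * texture2D( rockTexture, vUV );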
Hope this clears things up for you.
I'm trying to access the scene's lights from a shader in three.js.
This question is nearly a duplicate of Three.js ShaderMaterial issue with lights
but the comments on that question aren't helping me resolve the issue.
Here is the vertex shader:
#if NUM_DIR_LIGHTS > 0
struct DirectionalLight {
vec3 direction;
vec3 color;
int shadow;
float shadowBias;
float shadowRadius;
vec2 shadowMapSize;
};
uniform DirectionalLight directionalLights[ NUM_DIR_LIGHTS ];
#endif
varying vec3 color;
void main() {
float r = directionalLights[0].color.r;
color = vec3(r,1.0,0.0);
gl_Position = projectionMatrix * modelViewMatrix * vec4(position , 1.0);
}
and the relevant ShaderMaterial:
var material = new THREE.ShaderMaterial({
uniforms: THREE.UniformsLib['lights'],
vertexShader: document.getElementById('vertexShader').innerHTML,
fragmentShader: document.getElementById('fragmentShader').innerHTML,
lights : true
});
I've posted the entire example code here: https://jsfiddle.net/zhkvcajs/
Removing lights: true renders a green knot, but then the shader isn't getting the directionalLights information that should change the knot's color. Apparently lights: true is required for that, yet it causes an error.
If you want to use scene lights with ShaderMaterial, you need to set lights: true.
var material = new THREE.ShaderMaterial( {
uniforms: uniforms,
vertexShader: document.getElementById('vertexShader').innerHTML,
fragmentShader: document.getElementById('fragmentShader').innerHTML,
lights: true
} );
three.js r.126
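If you also need your own uniforms alongside the light uniforms (the question passes THREE.UniformsLib['lights'] directly), one common approach is to combine them with THREE.UniformsUtils.merge so that the structures lights: true relies on stay intact. A sketch with a hypothetical custom uniform named uTint:

var material = new THREE.ShaderMaterial({
    uniforms: THREE.UniformsUtils.merge([
        THREE.UniformsLib['lights'],
        { uTint: { value: new THREE.Color(0x00ff00) } } // hypothetical custom uniform
    ]),
    vertexShader: document.getElementById('vertexShader').innerHTML,
    fragmentShader: document.getElementById('fragmentShader').innerHTML,
    lights: true
});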
I'm trying to draw a texture onto a sphere like this:
script(type='x-shader/x-vertex')#Vertex
varying vec2 vUv;
void main() {
vUv = uv;
gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
script(type='x-shader/x-fragment')#Fragment
uniform sampler2D baseTexture;
varying vec2 vUv;
void main() {
vec4 baseColor = texture2D( baseTexture, vUv );
gl_FragColor = baseColor;
}
this.materials = new THREE.ShaderMaterial( {
uniforms: this.uniforms,
vertexShader: document.getElementById( 'Vertex' ).textContent,
fragmentShader: document.getElementById( 'Fragment' ).textContent,
transparent: true,
blending: THREE.AdditiveBlending
});
This does work fine, but the texture is not transparent, even if I change the alpha value. Transparent pixels from my texture are just black.
But if I write baseColor.a = 0.0;, I can no longer see the texture, and I also don't see what lies behind it in the scene. I think I'm missing a step that blends the texture with the background?
How can I achieve this with GLSL in three.js?
Thanks
I have no idea how THREE.js works under the hood, but I see you set the blending to additive. That's not what you want for alpha blending. Alpha blending uses this function:
glBlendFunc (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
while additive uses:
glBlendFunc(GL_ONE, GL_ONE);
So make sure you use the first one, and that your texture does in fact have an alpha channel as the A component of RGBA.
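In three.js terms, that roughly maps to the following material setup; a sketch assuming the same shaders and uniforms as above:

this.materials = new THREE.ShaderMaterial({
    uniforms: this.uniforms,
    vertexShader: document.getElementById( 'Vertex' ).textContent,
    fragmentShader: document.getElementById( 'Fragment' ).textContent,
    transparent: true,
    blending: THREE.NormalBlending // GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA
});
// Or, spelled out with explicit blend factors:
// blending: THREE.CustomBlending,
// blendSrc: THREE.SrcAlphaFactor,
// blendDst: THREE.OneMinusSrcAlphaFactor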