Any way to render fog on top of SSAO in three.js?

I'm using Fog and SSAO in my project, and the SSAO is emphasizing things that should be faded out, like the horizon line and buildings.
Is there any way to render the fog on top of the SSAO effect?
Thanks.
I tried to write a shader, but it's not working…
( function () {

	var FogShader = {

		uniforms: {
			'tDiffuse': { value: null },
			'fogColor': { value: new THREE.Vector3( 1.0, 0, 0 ) },
			'fogNear': { value: 1.0 },
			'fogFar': { value: 10.0 }
		},

		vertexShader: `
			varying vec2 vUv;
			varying float fogDepth;
			void main() {
				vUv = uv;
				vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
				gl_Position = projectionMatrix * mvPosition;
				fogDepth = - mvPosition.z;
			}`,

		fragmentShader: `
			uniform vec3 fogColor;
			uniform float fogNear;
			uniform float fogFar;
			varying float fogDepth;
			uniform sampler2D tDiffuse;
			varying vec2 vUv;
			void main() {
				vec4 texel = texture2D( tDiffuse, vUv );
				gl_FragColor = texel;
				float fogFactor = smoothstep( fogNear, fogFar, fogDepth );
				gl_FragColor.rgb = mix( gl_FragColor.rgb, fogColor, fogFactor );
			}`

	};

	THREE.FogShader = FogShader;

} )();
I’m using it like this:
var fogpass = new THREE.ShaderPass( THREE.FogShader );
composer.addPass( fogpass );
If I manually set fogFactor to 1, the whole output is red, so I think something is wrong with the fogDepth…

You can replicate your fog formula in the SSAO shader, then mix the AO with the fog:

float final = fog * multiplier * AO;
vec3 result = mix( scene, fognearcol * ( 1.0 - fog ), final );
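
The likely reason the pass shows no fog at all: a ShaderPass draws a full-screen quad, so mvPosition.z in the vertex shader is the depth of that quad, not of your scene, and fogDepth comes out constant. If you prefer to keep fog as its own pass instead of folding it into the SSAO shader, the pass has to read scene depth from a depth texture. A minimal sketch, assuming the scene was rendered into a target whose depthTexture (a THREE.DepthTexture) is bound to tDepth, and that cameraNear/cameraFar match your camera:

var FogDepthShader = {

	uniforms: {
		'tDiffuse': { value: null },
		'tDepth': { value: null },      // depth texture of the scene render
		'fogColor': { value: new THREE.Color( 1, 0, 0 ) },
		'fogNear': { value: 1.0 },
		'fogFar': { value: 10.0 },
		'cameraNear': { value: 0.1 },
		'cameraFar': { value: 100.0 }
	},

	vertexShader: `
		varying vec2 vUv;
		void main() {
			vUv = uv;
			gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
		}`,

	fragmentShader: `
		#include <packing>
		uniform sampler2D tDiffuse;
		uniform sampler2D tDepth;
		uniform vec3 fogColor;
		uniform float fogNear;
		uniform float fogFar;
		uniform float cameraNear;
		uniform float cameraFar;
		varying vec2 vUv;
		void main() {
			// Scene depth, not quad depth: sample the depth texture and
			// convert the value back to a positive view-space distance.
			float fragCoordZ = texture2D( tDepth, vUv ).x;
			float viewZ = perspectiveDepthToViewZ( fragCoordZ, cameraNear, cameraFar );
			float fogDepth = - viewZ;
			float fogFactor = smoothstep( fogNear, fogFar, fogDepth );
			vec4 texel = texture2D( tDiffuse, vUv );
			gl_FragColor = vec4( mix( texel.rgb, fogColor, fogFactor ), texel.a );
		}`

};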

Related

Reconstruct worldPosition.xyz from depth

I want to restore the worldPosition.xyz of any pixel of a rendered image for post-processing. With the help of the example from three.js I reconstructed the depth value, so I think I am close to my goal. Does anyone know how I can reconstruct the world position from the vUv and the depth value?
depthShader = {

	uniforms: {
		'tDiffuse': { value: null },
		'tDepth': { value: null },
		'cameraNear': { value: 0 },
		'cameraFar': { value: 0 },
	},

	vertexShader: `
		varying vec2 vUv;
		void main() {
			vUv = uv;
			vec4 modelViewPosition = modelViewMatrix * vec4( position, 1.0 );
			gl_Position = projectionMatrix * modelViewPosition;
		}`,

	fragmentShader: `
		#include <packing>
		uniform sampler2D tDiffuse;
		uniform sampler2D tDepth;
		uniform float cameraNear;
		uniform float cameraFar;
		varying vec2 vUv;
		float readDepth( sampler2D depthSampler, vec2 coord ) {
			float fragCoordZ = texture2D( depthSampler, coord ).x;
			float viewZ = perspectiveDepthToViewZ( fragCoordZ, cameraNear, cameraFar );
			return viewZToOrthographicDepth( viewZ, cameraNear, cameraFar );
		}
		void main() {
			float depth = readDepth( tDepth, vUv );
			vec4 color = texture2D( tDiffuse, vUv );
			gl_FragColor.rgb = 1.0 - vec3( depth );
		}`
};
The missing piece is to undo the projection: rebuild the clip-space position from the fragment's NDC coordinates and depth, unproject it into view space, and then transform by the camera's world matrix:

float clipW = cameraProjection[2][3] * viewZ + cameraProjection[3][3];
vec4 clipPosition = vec4( ( vec3( gl_FragCoord.xy / viewport.zw, depth ) - 0.5 ) * 2.0, 1.0 );
clipPosition *= clipW;
vec4 viewPosition = inverseProjection * clipPosition;
vec4 worldPosition = cameraMatrixWorld * vec4( viewPosition.xyz, 1.0 );
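
Put together as a complete fragment shader for a full-screen pass, a sketch (the uniform names cameraProjection, inverseProjection and cameraMatrixWorld are assumed here, fed with camera.projectionMatrix, camera.projectionMatrixInverse and camera.matrixWorld; on a full-screen quad vUv can stand in for gl_FragCoord.xy / viewport.zw):

#include <packing>
uniform sampler2D tDepth;
uniform float cameraNear;
uniform float cameraFar;
uniform mat4 cameraProjection;   // camera.projectionMatrix
uniform mat4 inverseProjection;  // camera.projectionMatrixInverse
uniform mat4 cameraMatrixWorld;  // camera.matrixWorld
varying vec2 vUv;

vec3 worldPositionFromDepth( vec2 uv ) {
	// Raw depth-buffer value in [0, 1] and the matching view-space z.
	float fragCoordZ = texture2D( tDepth, uv ).x;
	float viewZ = perspectiveDepthToViewZ( fragCoordZ, cameraNear, cameraFar );
	// The w the projection produced for this fragment.
	float clipW = cameraProjection[2][3] * viewZ + cameraProjection[3][3];
	// NDC in [-1, 1], scaled back to clip space, then unprojected.
	vec4 clipPosition = vec4( ( vec3( uv, fragCoordZ ) - 0.5 ) * 2.0, 1.0 );
	clipPosition *= clipW;
	vec4 viewPosition = inverseProjection * clipPosition;
	return ( cameraMatrixWorld * vec4( viewPosition.xyz, 1.0 ) ).xyz;
}

void main() {
	// Visualize the reconstruction to sanity-check it.
	gl_FragColor = vec4( worldPositionFromDepth( vUv ), 1.0 );
}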

Upgrading to three.js 0.130.1: points rendered with ShaderMaterial and BufferGeometry not rendering

We were using three.js 0.115 and everything was working. Since we got vulnerability reports for versions < 0.125, we decided to upgrade to the latest version, and now we are getting issues with our ShaderMaterial.
We have an application that renders a point cloud with a BufferGeometry (position, size and color BufferAttributes) and a ShaderMaterial.
function vertexShader() {
	return `attribute float size;
		attribute vec3 customColor;
		varying vec3 vColor;
		attribute float visibility;
		varying float vVisible;
		void main() {
			vColor = customColor;
			vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
			gl_PointSize = size * ( 300.0 / -mvPosition.z );
			gl_Position = projectionMatrix * mvPosition;
			vVisible = visibility;
		}`;
}

function fragmentShader() {
	return `uniform vec3 color;
		uniform sampler2D pointTexture;
		varying vec3 vColor;
		varying float vVisible;
		void main() {
			gl_FragColor = vec4( color * vColor, 1.0 );
			gl_FragColor = gl_FragColor * texture2D( pointTexture, gl_PointCoord );
			if ( gl_FragColor.a < ALPHATEST ) discard;
			if ( vVisible < 0.5 ) discard;
		}`;
}
And in our JavaScript init code:
const material = new THREE.ShaderMaterial({
	uniforms: {
		color: { value: new THREE.Color( 0xffffff ) },
		texture: { value: new THREE.TextureLoader().load( circle ) },
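		// Note: the fragment shader above samples `pointTexture`, but the
		// uniform here is named `texture`, so that sampler is never bound.
		// This mismatch alone could keep the points from rendering and is
		// worth ruling out before suspecting the upgrade itself.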
		resolution: { value: new THREE.Vector2() },
	},
	vertexShader: vertexShader(),
	fragmentShader: fragmentShader(),
	alphaTest: 0.9,
	blending: THREE.AdditiveBlending
});
There is no error in the console, but the points are not rendered.
We use raycasting to pick points, and that works without any issue.
Any idea why rendering of the points fails after upgrading to the latest version of three.js?
Is this something to do with ShaderMaterial?
Thanks for the help :)

Per-instance UV texture mapping in Three.js InstancedBufferGeometry

I have an InstancedBufferGeometry made up of a single plane:
const plane = new THREE.PlaneBufferGeometry(100, 100, 1, 1);
const geometry = new THREE.InstancedBufferGeometry();
geometry.maxInstancedCount = 100;
geometry.attributes.position = plane.attributes.position;
geometry.index = plane.index;
geometry.attributes.uv = plane.attributes.uv;
geometry.addAttribute( 'offset', new THREE.InstancedBufferAttribute( new Float32Array( offsets ), 3 ) ); // an offset position
I am applying a texture to each plane, which is working as expected; however, I wish to apply a different region of the texture to each instance, and I'm not sure of the correct approach.
At the moment I have tried to build up UVs per instance, based on the structure of the UVs for a single plane:
let uvs = [];
for (let i = 0; i < 100; i++) {
	const tl = [0, 1];
	const tr = [1, 1];
	const bl = [0, 0];
	const br = [1, 0];
	uvs = uvs.concat(tl, tr, bl, br);
}
...
geometry.addAttribute( 'uv', new THREE.InstancedBufferAttribute( new Float32Array( uvs ), 2 ) );
When I do this, I don't have any errors, but every instance is just a single colour (all instances are the same colour). I have tried changing the instance size, and also the meshPerAttribute argument (which I don't fully understand; I'm struggling to find a good explanation in the docs).
I feel like I'm close, but I'm missing something, so a point in the right direction would be fantastic!
(For reference, here are my shaders):
const vertexShader = `
	precision mediump float;
	uniform vec3 color;
	uniform sampler2D tPositions;
	uniform mat4 modelViewMatrix;
	uniform mat4 projectionMatrix;
	attribute vec2 uv;
	attribute vec2 dataUv;
	attribute vec3 position;
	attribute vec3 offset;
	attribute vec3 particlePosition;
	attribute vec4 orientationStart;
	attribute vec4 orientationEnd;
	varying vec3 vPosition;
	varying vec3 vColor;
	varying vec2 vUv;
	void main() {
		vPosition = position;
		vec4 orientation = normalize( orientationStart );
		vec3 vcV = cross( orientation.xyz, vPosition );
		vPosition = vcV * ( 2.0 * orientation.w ) + ( cross( orientation.xyz, vcV ) * 2.0 + vPosition );
		vec4 data = texture2D( tPositions, vec2( dataUv.x, 0.0 ) );
		vec3 particlePosition = ( data.xyz - 0.5 ) * 1000.0;
		vUv = uv;
		vColor = data.xyz;
		gl_Position = projectionMatrix * modelViewMatrix * vec4( position + particlePosition + offset, 1.0 );
	}
`;

const fragmentShader = `
	precision mediump float;
	uniform sampler2D map;
	varying vec3 vPosition;
	varying vec3 vColor;
	varying vec2 vUv;
	void main() {
		vec3 color = texture2D( map, vUv ).xyz;
		gl_FragColor = vec4( color, 1.0 );
	}
`;
As all my instances need to sample the same-sized rectangular area, just offset (like a sprite sheet), I have added a per-instance UV offset attribute and a UV scale uniform, and use them to define which area of the map to use:
const uvOffsets = [];
for (let i = 0; i < INSTANCES; i++) {
	const u = i % textureWidthHeight;
	const v = ~~ ( i / textureWidthHeight );
	uvOffsets.push( u, v );
}
...
geometry.attributes.uv = plane.attributes.uv;
geometry.addAttribute( 'uvOffsets', new THREE.InstancedBufferAttribute( new Float32Array( uvOffsets ), 2 ) );

uniforms: {
	...
	uUvScale: { value: 1 / textureWidthHeight }
}
And in the fragment shader:
void main() {
	vec4 color = texture2D( map, ( vUv * uUvScale ) + ( vUvOffsets * uUvScale ) );
	gl_FragColor = vec4( 1.0, 1.0, 1.0, color.a );
}
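
For completeness, the per-instance attribute has to reach the fragment shader through a varying; a minimal sketch of the matching vertex shader (assuming the same raw-shader setup and attribute names as in the snippets above):

precision mediump float;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
attribute vec3 position;
attribute vec3 offset;    // per-instance position offset
attribute vec2 uv;
attribute vec2 uvOffsets; // per-instance sprite-sheet cell
varying vec2 vUv;
varying vec2 vUvOffsets;
void main() {
	vUv = uv;
	vUvOffsets = uvOffsets; // forwarded unchanged to the fragment shader
	gl_Position = projectionMatrix * modelViewMatrix * vec4( position + offset, 1.0 );
}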
\o/

How to add fog to texture in shader (THREE.JS R76)

So firstly, I am aware of this post: ShaderMaterial fog parameter does not work. My question is a bit different…
I am trying to apply the fog in my three.js scene to a shader that uses a texture, and I can't figure it out. My best guess as to what is supposed to go into the fragment shader was:

resultingColor = mix(texture2D(glowTexture, vUv), fogColor, fogFactor);

This works when the texture2D part is just a plain color, but with a texture it doesn't render.
THREE.glowShader = {
	vertexShader: `
		varying vec2 vUv;
		void main() {
			vUv = uv;
			gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
		}
	`,
	fragmentShader: `
		uniform sampler2D glowTexture;
		varying vec2 vUv;
		uniform vec3 fogColor;
		uniform float fogNear;
		uniform float fogFar;
		void main() {
			vec4 resultingColor = texture2D( glowTexture, vUv );
			#ifdef USE_FOG
				#ifdef USE_LOGDEPTHBUF_EXT
					float depth = gl_FragDepthEXT / gl_FragCoord.w;
				#else
					float depth = gl_FragCoord.z / gl_FragCoord.w;
				#endif
				#ifdef FOG_EXP2
					float fogFactor = whiteCompliment( exp2( - fogDensity * fogDensity * depth * depth * LOG2 ) );
				#else
					float fogFactor = smoothstep( fogNear, fogFar, depth );
				#endif
				// NB: this line won't compile as written — mix() can't blend a
				// vec4 with a vec3. Mixing .rgb with fogColor (as the fiddle
				// below does) keeps the types consistent.
				// resultingColor = mix(texture2D(glowTexture, vUv), fogColor, fogFactor);
			#endif
			gl_FragColor = resultingColor;
		}
	`
}
Here is a fiddle that shows a ShaderMaterial with a texture and red fog
<script id="vertexShader" type="x-shader/x-vertex">
	varying vec2 vUv;
	varying vec3 vPosition;
	void main( void ) {
		vUv = uv;
		vPosition = position;
		gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
	}
</script>

<script id="fragmentShader" type="x-shader/x-fragment">
	varying vec2 vUv;
	uniform sampler2D texture;
	uniform vec3 fogColor;
	uniform float fogNear;
	uniform float fogFar;
	void main() {
		gl_FragColor = texture2D( texture, vUv );
		#ifdef USE_FOG
			#ifdef USE_LOGDEPTHBUF_EXT
				float depth = gl_FragDepthEXT / gl_FragCoord.w;
			#else
				float depth = gl_FragCoord.z / gl_FragCoord.w;
			#endif
			float fogFactor = smoothstep( fogNear, fogFar, depth );
			gl_FragColor.rgb = mix( gl_FragColor.rgb, fogColor, fogFactor );
		#endif
	}
</script>
And here is how the material is created:
var uniforms = {
	texture: { type: "t", value: texture },
	fogColor: { type: "c", value: scene.fog.color },
	fogNear: { type: "f", value: scene.fog.near },
	fogFar: { type: "f", value: scene.fog.far }
};

var vertexShader = document.getElementById( 'vertexShader' ).text;
var fragmentShader = document.getElementById( 'fragmentShader' ).text;

material = new THREE.ShaderMaterial( {
	uniforms: uniforms,
	vertexShader: vertexShader,
	fragmentShader: fragmentShader,
	fog: true
} );
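
Note that fog: true is what makes the renderer define USE_FOG when it compiles this shader, and the uniforms above read from scene.fog, so the scene fog has to exist before the material is created. A minimal sketch (the near/far values are placeholders):

scene.fog = new THREE.Fog( 0xff0000, 1, 100 ); // red fog, as in the fiddle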

Same texture offsetting different positions in the fragment shader using the three.js rendering engine

My vertex shader:

varying vec2 texCoord;
void main() {
	texCoord = uv;
	gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}

My fragment shader:

varying vec2 texCoord;
uniform sampler2D texture1;
uniform sampler2D texture2;
uniform float multiplier;
void main( void ) {
	vec3 tex1 = texture2D( texture1, texCoord ).xyz;
	vec3 tex2 = texture2D( texture2, texCoord ).xyz;
	vec3 finaltex = mix( tex1, tex2, multiplier );
	gl_FragColor = vec4( finaltex, 1.0 );
}
Now this works very well with the two textures; check http://shaderfrog.com/app/view/68 for the multiplier in action.
But now I have a single texture (a vertical strip of three colors), and with it I want to offset my texCoord so that I only need to sample one texture and can get the three representations from it, like:
var shaderMaterial = new THREE.ShaderMaterial({
	uniforms: {
		texture1: { type: "t", value: texture1 }
	},
	vertexShader: document.getElementById( 'vertexShader' ).textContent,
	fragmentShader: document.getElementById( 'fragmentShader' ).textContent
});
Can I offset my tri-color texture in the fragment shader? Or can someone help me modify the fragment shader so that I can pass a uniform to index into the individual yellow, pink and red regions?
Help from either the shader side or the three.js side would work for me.
I used two textures in the reference above because I want to cross-fade between them, and I want the same cross-fade using the fragment shader. (Independently of this, I have already achieved it on the texture itself with texture.offset.x = currentColoum / horizontal and texture.offset.y = currentRow / Vertical.)
I found the answer to this question and even implemented it in my application :D
vertexShader:

varying vec2 texCoord;
void main() {
	texCoord = uv;
	gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}

fragmentShader:

varying vec2 texCoord;
uniform sampler2D texture1;
uniform vec2 offset_current;
uniform vec2 offset_next;
uniform float multiplier;
void main( void ) {
	vec2 offset1 = offset_current + vec2( texCoord.x * 1.0, texCoord.y * 0.333333 );
	vec2 offset2 = offset_next + vec2( texCoord.x * 1.0, texCoord.y * 0.333333 );
	vec3 tex1 = texture2D( texture1, offset1 ).xyz;
	vec3 tex2 = texture2D( texture1, offset2 ).xyz;
	vec3 mixCol = mix( tex1, tex2, multiplier );
	vec4 fragColor = vec4( mixCol, 1.0 );
	if ( fragColor.a == 0.0 )
		discard;
	gl_FragColor = fragColor;
}
Explanation:
Since the texture is a vertical strip of three different cells, I make my offset 0.333333 in the y direction, because texture coordinates run from [0, 1]. I have extended this code the same way for the horizontal direction.
If someone wants to make this dynamic, then instead of hard-coding 0.333333 you can pass it in as a calculated value (1 / number of rows).
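
To tie it together on the JavaScript side, a minimal sketch of the uniforms (stripTexture, currentRow and nextRow are assumed names; one cell of the 1×3 vertical strip is 1/3 of the texture in v):

var cellHeight = 1 / 3; // one color band of the strip
var uniforms = {
	texture1: { type: "t", value: stripTexture },
	offset_current: { type: "v2", value: new THREE.Vector2( 0, currentRow * cellHeight ) },
	offset_next: { type: "v2", value: new THREE.Vector2( 0, nextRow * cellHeight ) },
	multiplier: { type: "f", value: 0 } // animate 0 → 1 to cross-fade between the two cells
};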
