THREE.js blur the frame buffer

I need to blur the frame buffer and I don't know how to get the frame buffer using THREE.js.
I want to blur the whole frame buffer rather than blurring each texture in the scene. So I guess I should read the frame buffer back and then blur it, rather than doing this in shaders.
Here's what I have tried:
Called at init:
var renderTarget = new THREE.WebGLRenderTarget(512, 512, {
    wrapS: THREE.RepeatWrapping,
    wrapT: THREE.RepeatWrapping,
    minFilter: THREE.NearestFilter,
    magFilter: THREE.NearestFilter,
    format: THREE.RGBAFormat,
    type: THREE.FloatType,
    stencilBuffer: false,
    depthBuffer: true
});
renderTarget.generateMipmaps = false;
Called each frame:
var gl = renderer.getContext();
// render to target
renderer.render(scene, camera, renderTarget, false);
framebuffer = renderTarget.__webglFramebuffer;
console.log(framebuffer);
gl.flush();
if (framebuffer != null)
    gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);
var width = height = 512;
var rdData = new Uint8Array(width * height * 4);
gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, rdData);
console.log(rdData);
// render to screen
renderer.render(scene, camera);
But framebuffer logs as WebGLFramebuffer {} and rdData is full of zeros. Am I doing this the right way?

Any blur should use shaders to be efficient, but in this case not as materials on the scene objects.
If you want to blur the entire frame buffer and render that to the screen, use the effect composer. It's located in three.js/examples/js/postprocessing/EffectComposer.js
Set up the scene, camera, and renderer as normal, but in addition add an instance of the effect composer, with the scene as a render pass.
composer = new THREE.EffectComposer( renderer );
composer.addPass( new THREE.RenderPass( scene, camera ) );
Then blur the whole buffer with two passes, using the included blur shaders located in three.js/examples/js/shaders/:
hblur = new THREE.ShaderPass( THREE.HorizontalBlurShader );
composer.addPass( hblur );
vblur = new THREE.ShaderPass( THREE.VerticalBlurShader );
// set this shader pass to render to screen so we can see the effects
vblur.renderToScreen = true;
composer.addPass( vblur );
Finally, in your method called each frame, render using the composer instead of the renderer:
composer.render();
Here is a link to a working example of full-screen blur.
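The blur amount is controlled by the shader uniforms: in the bundled HorizontalBlurShader and VerticalBlurShader these are h and v, normally set relative to the render size. A minimal sketch, assuming the composer renders at window size (blurAmount is just an illustrative tuning variable):
var blurAmount = 2; // roughly a 2-pixel sample offset
hblur.uniforms[ "h" ].value = blurAmount / window.innerWidth;
vblur.uniforms[ "v" ].value = blurAmount / window.innerHeight;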

Try using MeshDepthMaterial and rendering that into your shader.
I suggest rendering the blur pass with a dedicated camera using the same settings as the scene's diffuse camera. Then, by adjusting that camera's frustum, you can do both full-screen and depth-of-field blur effects. For a full-screen blur, move the near plane toward the camera and move the far plane away from it in increments.
http://threejs.org/docs/#Reference/Materials/MeshDepthMaterial
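For reference, a minimal sketch of capturing the scene's depth with MeshDepthMaterial via scene.overrideMaterial, using the same render-to-target call style as the rest of this page (depthMaterial and depthTarget are illustrative names):
var depthMaterial = new THREE.MeshDepthMaterial();
var depthTarget = new THREE.WebGLRenderTarget( window.innerWidth, window.innerHeight );
// render every mesh with the depth material into depthTarget
scene.overrideMaterial = depthMaterial;
renderer.render( scene, camera, depthTarget, true );
scene.overrideMaterial = null;
// depthTarget can now be bound as a texture uniform in a blur / depth-of-field shader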

Related

THREEJS: Clear RT and default FBO with different colors

I am rendering a scene once into a WebGLRenderTarget and then into the default framebuffer, like this:
//want the BG of RT to be green
renderer.setClearColor(0x00ff00,1);
renderer.clearTarget(renderTargetTex, true,true);
renderer.render(this._scene, cam,renderTargetTex, true);
//now render same scene into default FBO:
//set red BG
renderer.setClearColor(0xff0000,1);
renderer.clear();
renderer.render(this._scene,cam);
The result is always a black background. But if I don't call
renderer.clearTarget(renderTargetTex, true,true);
renderer.render(this._scene, cam,renderTargetTex, true);
then I am getting the back buffer cleared to red. (The renderer's autoClear is set to false.) How do I clear each render target to its own color? I am using THREE.js version 93dev.
renderer.clear() will clear the current render target.
If you want to clear a target different from the current one, you have to set the desired target first.
You can use this pattern, instead:
renderer.setRenderTarget( renderTarget );
renderer.setClearColor( color1, alpha1 );
renderer.clear();
renderer.render( scene, camera, renderTarget, true );
renderer.setRenderTarget( null );
renderer.setClearColor( color2, alpha2 );
renderer.clear();
renderer.render( scene, camera );
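Note that in newer three.js revisions the render target argument of renderer.render() has been removed, so you would drop renderTarget from the render() call above and rely solely on setRenderTarget(); the clear pattern itself stays the same.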
three.js r.97

Three.js: Apply SSAO (Screen Space Ambient Occlusion) to Displacement map

I've implemented Screen Space Ambient Occlusion in my Three.js project, and it runs perfectly, like this:
//Setup SSAO pass
depthMaterial = new THREE.MeshDepthMaterial();
depthMaterial.depthPacking = THREE.RGBADepthPacking;
depthMaterial.blending = THREE.NoBlending;
var pars = { minFilter: THREE.LinearFilter, magFilter: THREE.LinearFilter, format: THREE.RGBAFormat, stencilBuffer: true }; // stencilBuffer: true so transparent objects are not affected
depthRenderTarget = new THREE.WebGLRenderTarget(window.innerWidth, window.innerHeight, pars);
depthRenderTarget.texture.name = "SSAOShader.rt";
ssaoPass = new THREE.ShaderPass(THREE.SSAOShader);
///////ssaoPass.uniforms[ "tDiffuse" ].value will be set by ShaderPass
ssaoPass.uniforms["tDepth"].value = depthRenderTarget.texture;
ssaoPass.uniforms['size'].value.set(window.innerWidth, window.innerHeight);
ssaoPass.uniforms['cameraNear'].value = camera.near;
ssaoPass.uniforms['cameraFar'].value = camera.far;
ssaoPass.uniforms['radius'].value = radius;
ssaoPass.uniforms['aoClamp'].value = aoClamp;
ssaoPass.uniforms['lumInfluence'].value = lumInfluence;
But when I set a material with a displacementMap (which runs correctly without SSAO enabled), this is the result. Notice that the SSAO is applied "correctly" to the original sphere (with a strange transparent artifact), but I need it applied to the displaced vertices of the sphere.
These are my composer passes:
//Main render scene pass
postprocessingComposer.addPass(renderScene);
//Post processing pass
if (ssaoPass) {
    postprocessingComposer.addPass(ssaoPass);
}
And this is the rendering loop with the composer:
if (postprocessingComposer) {
    if (ssaoPass) {
        // Render depth into depthRenderTarget
        scene.overrideMaterial = depthMaterial;
        renderer.render(scene, camera, depthRenderTarget, true);
        // Render composer
        scene.overrideMaterial = null;
        postprocessingComposer.render();
        renderer.clearDepth();
        renderer.render(sceneOrtho, cameraOrtho);
    }
    else {
        // Render loop with post-processing but no SSAO (needs more checks, see above)
        renderer.clear();
        postprocessingComposer.render();
        renderer.clearDepth();
        renderer.render(sceneOrtho, cameraOrtho);
    }
}
else {
    // Simple render loop (no post-processing)
    renderer.clear();
    renderer.render(scene, camera);
    renderer.clearDepth();
    renderer.render(sceneOrtho, cameraOrtho);
}
How can I achieve correct Screen Space Ambient Occlusion applied to a mesh with a displacement map? Thanks.
[UPDATE]:
After some work, I tried this procedure for every child in the scene with a displacement map, defining a new overrideMaterial for the scene equal to a depth material with the displacement map parameters of that child's material.
var myDepthMaterial = new THREE.MeshDepthMaterial({
    depthPacking: THREE.RGBADepthPacking,
    displacementMap: child.material.displacementMap,
    displacementScale: child.material.displacementScale,
    displacementBias: child.material.displacementBias
});
child.onBeforeRender = function (renderer, scene, camera, geometry, material, group) {
    scene.overrideMaterial = myDepthMaterial;
};
This solution sounds good, but it doesn't work.
You are using SSAO with a displacement map. You need to specify the displacement map when you instantiate the depth material.
depthMaterial = new THREE.MeshDepthMaterial( {
    depthPacking: THREE.RGBADepthPacking,
    displacementMap: displacementMap,
    displacementScale: displacementScale,
    displacementBias: displacementBias
} );
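With the displacement parameters set on the depth material (the same displacementMap, displacementScale, and displacementBias used by the mesh's visible material), the existing depth prepass with scene.overrideMaterial = depthMaterial writes the depth of the displaced surface, so the SSAO pass lines up with the displaced geometry.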
three.js r.87

How to blur an object in webVR?

In my scene I have a glowing cube. First, I render the cube to a texture and then render that texture, applying a Gaussian blur post-process. This way I get the right result when I am not in VR mode. Here it is:
But when I go to VR mode, it gives me a distorted result. Please check the following image:
Can anyone please tell me why this is happening? Do I have to make any adjustments to render to a texture for VR mode?
Update:
My code is like this:
function render() {
    generate_blur_texture();
    effect.render(scene, camera);
}
var rtTexture = new THREE.WebGLRenderTarget( window.innerWidth, window.innerHeight, { minFilter: THREE.LinearFilter, magFilter: THREE.NearestFilter, format: THREE.RGBAFormat});
var rtTextureFinal = new THREE.WebGLRenderTarget( window.innerWidth, window.innerHeight, { minFilter: THREE.LinearFilter, magFilter: THREE.NearestFilter, format: THREE.RGBAFormat});
function generate_blur_texture() {
    // I think the key point is that I am using `renderer` here instead of `effect`
    renderer.render(scene, camera, rtTexture, true);
    // then further rendering on rtTexture and rtTextureFinal (ping-pong) to generate the blur; this time I only draw a quad to sample from the texture
    // ultimately, rtTexture contains the blurred texture
}
Three.js Revision: 77

Texture applied only on canvas click with Three.js

I am using the WebGLRenderer from Three.js to render an object reconstructed from an IndexedFaceStructure that has a texture. My problem is that when the page loads, the object shows up with no texture; only a black mesh displays. However, when I click on the canvas where I render the object, the texture shows up.
I have been looking around and tried the texture.needsUpdate = true; trick, but that also removes the black mesh on page load, so I am at a loss here.
These are the main bits of my code:
function webGLStart() {
    container = document.getElementById("webgl-canvas");
    renderer = new THREE.WebGLRenderer({canvas: container, alpha: true, antialias: true});
    renderer.setClearColor(0x696969, 1);
    renderer.setSize(container.width, container.height);
    scene = new THREE.Scene();
    camera = new THREE.PerspectiveCamera(45, container.width / container.height, 1, 100000);
    camera.position.set(60, 120, 2000);
    //computing the geometry2
    controls = new THREE.OrbitControls( camera );
    controls.addEventListener( 'change', render );
    texture = new THREE.ImageUtils.loadTexture(texFile);
    //texture.needsUpdate = true;
    material = new THREE.MeshBasicMaterial( {wireframe: false, map: texture, vertexColors: THREE.VertexColors} );
    mesh = new THREE.Mesh(geometry2, material);
    scene.add(mesh);
    render();
    animate();
}
function render()
{
    renderer.render(scene, camera);
}
function animate()
{
    controls.update();
}
And the HTML part: <canvas id="webgl-canvas" style="border: none;" width="900" height="900"> (I could not add it here properly).
Do you happen to have a clue why this is happening?
If you have a static scene, you do not need an animation loop, and you only need to render the scene when OrbitControls modifies the camera position/orientation.
Consequently, you can use this pattern -- without an animation loop:
controls.addEventListener( 'change', render );
However, you also need to force a render when the texture loads. You do that by specifying a callback to render in the ImageUtils.loadTexture() method:
var texture = THREE.ImageUtils.loadTexture( "textureFile", undefined, render );
Alternatively, you could add the mesh to the scene and call render in the callback.
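A minimal sketch of that alternative, reusing the question's names (texFile, geometry2, material, mesh, render); onTextureLoaded is just an illustrative name, and nothing is added to the scene until the image is available:
var texture = THREE.ImageUtils.loadTexture( texFile, undefined, onTextureLoaded );
var material = new THREE.MeshBasicMaterial( { map: texture, vertexColors: THREE.VertexColors } );
function onTextureLoaded() {
    // runs once the image has loaded
    mesh = new THREE.Mesh( geometry2, material );
    scene.add( mesh );
    render();
}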
three.js r.70

Three.js use framebuffer as texture

I'm using an image in a canvas element as a texture in Three.js, performing image manipulations on the canvas using JavaScript, and then calling needsUpdate() on the texture. This works, but it's quite slow.
I'd like to perform the image calculations in a fragment shader instead. I've found many examples which almost do this:
Shader materials: http://mrdoob.github.io/three.js/examples/webgl_shader2.html This example shows image manipulations performed in a fragment shader, but that shader is functioning as the fragment shader of an entire material. I only want to use the shader on a texture, and then use the texture as a component of a second material.
Render to texture: https://threejsdoc.appspot.com/doc/three.js/examples/webgl_rtt.html This shows rendering the entire scene to a WebGLRenderTarget and using that as the texture in a material. I only want to pre-process an image, not render an entire scene.
Effects composer: http://www.airtightinteractive.com/demos/js/shaders/preview/ This shows applying shaders as a post-process to the entire scene.
Edit: Here's another one:
Render to another scene: http://relicweb.com/webgl/rt.html This example, referenced in Three.js Retrieve data from WebGLRenderTarget (water sim), uses a second scene with its own orthographic camera to render a dynamic texture to a WebGLRenderTarget, which is then used as a texture in the primary scene. I guess this is a special case of the first "render to texture" example listed above, and would probably work for me, but seems over-complicated.
As I understand it, ideally I'd be able to make a new framebuffer object with its own fragment shader, render it on its own, and use its output as a texture uniform for another material's fragment shader. Is this possible?
Edit 2: It looks like I might be asking something similar to this: Shader Materials and GL Framebuffers in THREE.js ...though the question doesn't appear to have been resolved.
"Render to texture" and "Render to another scene" as listed above are the same thing, and are the technique you want. To explain:
In vanilla WebGL the way you do this kind of thing is by creating a framebuffer object (FBO) from scratch, binding a texture to it, and rendering it with the shader of your choice. Concepts like "scene" and "camera" aren't involved, and it's kind of a complicated process. Here's an example:
http://learningwebgl.com/blog/?p=1786
But this also happens to be essentially what Three.js does when you use it to render a scene with a camera: the renderer outputs to a framebuffer, which in its basic usage goes straight to the screen. So if you instruct it to render to a new WebGLRenderTarget instead, you can use whatever the camera sees as the input texture of a second material. All the complicated stuff is still happening, but behind the scenes, which is the beauty of Three.js. :)
So: To replicate a WebGL setup of an FBO containing a single rendered texture, as mentioned in the comments, just make a new scene containing an orthographic camera and a single plane with a material using the desired texture, then render to a new WebGLRenderTarget using your custom shader:
// new render-to-texture scene
myScene = new THREE.Scene();
// you may need to modify these parameters
var renderTargetParams = {
minFilter:THREE.LinearFilter,
stencilBuffer:false,
depthBuffer:false
};
myImage = THREE.ImageUtils.loadTexture( 'path/to/texture.png',
new THREE.UVMapping(), function() { myCallbackFunction(); } );
imageWidth = myImage.image.width;
imageHeight = myImage.image.height;
// create buffer
myTexture = new THREE.WebGLRenderTarget( imageWidth, imageHeight, renderTargetParams );
// custom RTT materials
myUniforms = {
colorMap: { type: "t", value: myImage },
};
myTextureMat = new THREE.ShaderMaterial({
uniforms: myUniforms,
vertexShader: document.getElementById( 'my_custom_vs' ).textContent,
fragmentShader: document.getElementById( 'my_custom_fs' ).textContent
});
// Setup render-to-texture scene
myCamera = new THREE.OrthographicCamera( imageWidth / - 2,
imageWidth / 2,
imageHeight / 2,
imageHeight / - 2, -10000, 10000 );
var myTextureGeo = new THREE.PlaneGeometry( imageWidth, imageHeight );
myTextureMesh = new THREE.Mesh( myTextureGeo, myTextureMat );
myTextureMesh.position.z = -100;
myScene.add( myTextureMesh );
renderer.render( myScene, myCamera, myTexture, true );
Once you've rendered the new scene, myTexture will be available for use as a texture in another material in your main scene. Note that you may want to trigger the first render with the callback function in the loadTexture() call, so that it won't try to render until the source image has loaded.
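For example, using the result as a map in the main scene might look like this (screenMaterial, screenMesh, and the main scene variable scene are illustrative names; depending on the three.js revision you pass the render target itself or its .texture property):
var screenMaterial = new THREE.MeshBasicMaterial( { map: myTexture } ); // or myTexture.texture in newer revisions
var screenMesh = new THREE.Mesh( new THREE.PlaneGeometry( imageWidth, imageHeight ), screenMaterial );
scene.add( screenMesh );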
