I have two scenes: one with the plane and another with the box. I want to blur the box.
Rendering loop:
// render the plane scene directly
renderer.clear();
renderer.render( scene, camera );
// reset depth so the composer output is not occluded
renderer.clearDepth();
// render the box scene, with the blur pass, through the composer
composer.render();
In this case, only the blurred box appears; the plane is missing.
Here is a fiddle: http://jsfiddle.net/bq5m1u6v/36/
I'm doing a camera animation in Three.js with GSAP, and I'm trying to animate the camera horizontally, like a panning shot in a film.
Does anyone know if it is possible to animate the pan in OrbitControls, or something similar?
If you are using OrbitControls and you want to manually animate a pan, you have to animate both the camera position and OrbitControls.target, which represents the focus point of the controls. The relevant code for the controls is:
gsap.to( controls.target, {
    duration: 2,
    x: 10,
    onUpdate: function() {
        controls.update();
    }
} );
Full demo: https://jsfiddle.net/kerpm61q/
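For a true pan, where the framing stays fixed while the view translates, you would typically tween the camera position by the same offset as the target. A minimal sketch, assuming the same camera and controls setup as above:
gsap.to( camera.position, { duration: 2, x: camera.position.x + 10 } );
gsap.to( controls.target, {
    duration: 2,
    x: controls.target.x + 10,
    onUpdate: function() {
        controls.update();
    }
} );
Both tweens share the same duration and default easing, so the camera-to-target offset stays constant for the whole move.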
I am rendering a scene once into a WebGLRenderTarget and then into the default framebuffer, like this:
// want the background of the render target to be green
renderer.setClearColor( 0x00ff00, 1 );
renderer.clearTarget( renderTargetTex, true, true );
renderer.render( this._scene, cam, renderTargetTex, true );
// now render the same scene into the default framebuffer,
// this time with a red background
renderer.setClearColor( 0xff0000, 1 );
renderer.clear();
renderer.render( this._scene, cam );
The result is always a black background. But if I don't call
renderer.clearTarget( renderTargetTex, true, true );
renderer.render( this._scene, cam, renderTargetTex, true );
then the back buffer is cleared to red. (The renderer's autoClear is set to false.) How do I clear each render target to its own color? I am using THREE.js version 93dev.
renderer.clear() will clear the current render target.
If you want to clear a target different from the current one, you have to set the desired target first.
You can use this pattern, instead:
// clear and render to the render target first
renderer.setRenderTarget( renderTarget );
renderer.setClearColor( color1, alpha1 );
renderer.clear();
renderer.render( scene, camera, renderTarget, true );
// then clear and render to the default framebuffer
renderer.setRenderTarget( null );
renderer.setClearColor( color2, alpha2 );
renderer.clear();
renderer.render( scene, camera );
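Note that this pattern assumes renderer.autoClear is false, as in your code. With autoClear left at its default of true, render() clears the current target automatically using the current clear color, which would make the explicit clear() calls redundant.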
three.js r.97
I'm having an issue when rendering a white material in ThreeJS version 87.
Here are the steps to replicate:
A white PNG image is loaded as a texture.
That texture is used to create a MeshBasicMaterial (passed as the map parameter).
The MeshBasicMaterial is used along with a plane geometry to create a Mesh.
The Mesh is added to an empty Scene and rendered with a WebGLRenderer that has alpha: true and a white clearColor.
The problem is that the rendered texture now has grey edges on parts that should be fully white.
This happens with any image with white edges. I've also tried many different configurations for the renderer and the material but to no avail.
I've made a very simple CodePen that replicates the behavior as simply as possible. Does anyone know how this problem can be solved?
CodePen:
https://codepen.io/ivan-i1/pen/pZxwZX
var renderer, width, height, scene, camera, dataUrl, threeTexture, geometry, material, mesh;
width = window.innerWidth;
height = window.innerHeight;
dataUrl = '//data url from image';
// loadTexture() is a plain function, so no `new` is needed here
threeTexture = THREE.ImageUtils.loadTexture( dataUrl );
material = new THREE.MeshBasicMaterial({
    map: threeTexture,
    transparent: true,
    alphaTest: 0.1
});
material.needsUpdate = true;
geometry = new THREE.PlaneGeometry( 5, 5 );
mesh = new THREE.Mesh( geometry, material );
mesh.position.z = -5;
scene = new THREE.Scene();
scene.add( mesh );
camera = new THREE.PerspectiveCamera( 70, window.innerWidth / window.innerHeight, 1, 1000 );
renderer = new THREE.WebGLRenderer({
    alpha: true
});
document.body.appendChild( renderer.domElement );
renderer.setSize( width, height );
renderer.setClearColor( 0xffffff, 1 );
//renderer.render(scene, camera);
function render() {
    // finally, draw to the screen
    requestAnimationFrame( render );
    renderer.render( scene, camera );
}
render();
Any help is truly appreciated.
ThreeJS/87
Edit:
I think my original post lacked precision.
This is the original image, with full alpha:
(It might not show because it's all white.)
And this is the same image with different transparencies in its four quadrants:
(This one might not show either, because it's all white.)
I got a helpful answer suggesting I raise the alphaTest, but the problem is that doing so wipes the semi-transparent parts out of the images, and I need to preserve those parts.
Here is a copy of the CodePen with the updated images, showing the same (but slight) grey edges:
codepen
Sorry for not being as precise the first time; any further help is even more appreciated.
Set alphaTest to 0.9 or higher and observe the improvement.
Your star texture has gray or black in the area outside the star, which is why you're seeing a gray halo. You can fix it by filling the image with white (but not changing the alpha channel) in your image editing tool.
Also, you should upgrade to the latest three.js (r95).
edit:
I'm not sure what your exact expectation is, but there are many different settings that control alpha blending in THREE. There is renderer.premultipliedAlpha (true/false, defaults to true) and material.transparent; material.alphaTest is a threshold that controls the alpha level below which a fragment is discarded completely. There are also material.blending, .blendEquation, .blendEquationAlpha, .blendSrc, and .blendDst, among others. You probably need to read up on those.
https://threejs.org/docs/#api/materials/Material
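For reference, here is a minimal sketch of where those settings live; the values are illustrative assumptions, not a recommended fix:
// illustrative values only, not a fix
var mat = new THREE.MeshBasicMaterial( {
    map: starTexture,              // stand-in for your loaded texture
    transparent: true,             // enable alpha blending for this material
    alphaTest: 0.5,                // discard fragments with alpha below 0.5
    blending: THREE.NormalBlending // the default blending mode
} );
// premultipliedAlpha is set at renderer construction time
var renderer = new THREE.WebGLRenderer( {
    alpha: true,
    premultipliedAlpha: false      // defaults to true
} );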
For instance, here is your texture with:
renderer.premultipliedAlpha = false;
Notice the black border on one quadrant of your texture.
https://codepen.io/manthrax/pen/KBraNB
I'm using an image in a canvas element as a texture in Three.js, performing image manipulations on the canvas using JavaScript, and then calling needsUpdate() on the texture. This works, but it's quite slow.
I'd like to perform the image calculations in a fragment shader instead. I've found many examples which almost do this:
Shader materials: http://mrdoob.github.io/three.js/examples/webgl_shader2.html This example shows image manipulations performed in a fragment shader, but that shader is functioning as the fragment shader of an entire material. I only want to use the shader on a texture, and then use the texture as a component of a second material.
Render to texture: https://threejsdoc.appspot.com/doc/three.js/examples/webgl_rtt.html This shows rendering the entire scene to a WebGLRenderTarget and using that as the texture in a material. I only want to pre-process an image, not render an entire scene.
Effects composer: http://www.airtightinteractive.com/demos/js/shaders/preview/ This shows applying shaders as a post-process to the entire scene.
Edit: Here's another one:
Render to another scene: http://relicweb.com/webgl/rt.html This example, referenced in Three.js Retrieve data from WebGLRenderTarget (water sim), uses a second scene with its own orthographic camera to render a dynamic texture to a WebGLRenderTarget, which is then used as a texture in the primary scene. I guess this is a special case of the first "render to texture" example listed above, and would probably work for me, but seems over-complicated.
As I understand it, ideally I'd be able to make a new framebuffer object with its own fragment shader, render it on its own, and use its output as a texture uniform for another material's fragment shader. Is this possible?
Edit 2: It looks like I might be asking something similar to this: Shader Materials and GL Framebuffers in THREE.js ...though the question doesn't appear to have been resolved.
"Render to texture" and "Render to another scene" as listed above are the same thing, and are the technique you want. To explain:
In vanilla WebGL the way you do this kind of thing is by creating a framebuffer object (FBO) from scratch, binding a texture to it, and rendering it with the shader of your choice. Concepts like "scene" and "camera" aren't involved, and it's kind of a complicated process. Here's an example:
http://learningwebgl.com/blog/?p=1786
But this also happens to be essentially what Three.js does when you use it to render a scene with a camera: the renderer outputs to a framebuffer, which in its basic usage goes straight to the screen. So if you instruct it to render to a new WebGLRenderTarget instead, you can use whatever the camera sees as the input texture of a second material. All the complicated stuff is still happening, but behind the scenes, which is the beauty of Three.js. :)
So: To replicate a WebGL setup of an FBO containing a single rendered texture, as mentioned in the comments, just make a new scene containing an orthographic camera and a single plane with a material using the desired texture, then render to a new WebGLRenderTarget using your custom shader:
// new render-to-texture scene
myScene = new THREE.Scene();
// you may need to modify these parameters
var renderTargetParams = {
    minFilter: THREE.LinearFilter,
    stencilBuffer: false,
    depthBuffer: false
};
myImage = THREE.ImageUtils.loadTexture( 'path/to/texture.png',
    new THREE.UVMapping(), function() { myCallbackFunction(); } );
// note: the image dimensions are only valid once the image has loaded
imageWidth = myImage.image.width;
imageHeight = myImage.image.height;
// create buffer
myTexture = new THREE.WebGLRenderTarget( imageWidth, imageHeight, renderTargetParams );
// custom RTT materials
myUniforms = {
    colorMap: { type: "t", value: myImage }
};
myTextureMat = new THREE.ShaderMaterial({
    uniforms: myUniforms,
    vertexShader: document.getElementById( 'my_custom_vs' ).textContent,
    fragmentShader: document.getElementById( 'my_custom_fs' ).textContent
});
// setup render-to-texture scene
myCamera = new THREE.OrthographicCamera( imageWidth / - 2,
    imageWidth / 2,
    imageHeight / 2,
    imageHeight / - 2, -10000, 10000 );
var myTextureGeo = new THREE.PlaneGeometry( imageWidth, imageHeight );
myTextureMesh = new THREE.Mesh( myTextureGeo, myTextureMat );
myTextureMesh.position.z = -100;
myScene.add( myTextureMesh );
renderer.render( myScene, myCamera, myTexture, true );
Once you've rendered the new scene, myTexture will be available for use as a texture in another material in your main scene. Note that you may want to trigger the first render with the callback function in the loadTexture() call, so that it won't try to render until the source image has loaded.
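For example, a minimal sketch of that last step (in this older API the render target itself is passed as the map; in current three.js you would pass myTexture.texture instead), where mainScene stands in for your primary scene:
// use the processed result as a texture on a material in the main scene
var mainMaterial = new THREE.MeshBasicMaterial( { map: myTexture } );
mainScene.add( new THREE.Mesh( new THREE.PlaneGeometry( 5, 5 ), mainMaterial ) );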
This is my code:
var sprite = new THREE.Sprite(material);
sprite.renderDepth = 10;
The renderDepth setting above has no effect; it does not work for sprites.
How to solve this problem?
You want one sprite to always be on top.
Since SpriteMaterial does not support a user-specified renderDepth, you have to implement a work-around.
Sprites are rendered last when using WebGLRenderer.
The easiest way to do what you want is to have two scenes and two render passes, with one sprite in the second scene, like so:
renderer.autoClear = false;
scene2.add( sprite2 );
then in the render loop
// render the main scene first
renderer.render( scene, camera );
// clear only the depth buffer, so scene2 draws on top of scene
renderer.clearDepth();
renderer.render( scene2, camera );
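Since autoClear is disabled, nothing is cleared automatically; if you also need the color buffer cleared each frame, call renderer.clear() at the top of the loop, before the first render() call.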
three.js r.64