I'm using the stencil buffer to composite two THREE.js Scenes over each other. This works, but when I try to use PointerLockControls, moving the camera (using mouse or keyboard) somehow causes the stenciled part of the view to be moved/shifted. It seems to lag behind the camera position.
I've tried other controls (Trackball, Orbit), and this issue does not occur when using those.
Something seems to be clashing, but I have no idea where to look.
The PointerLock code is copied straight from the examples.
This is the render code:
// 1: clear scene (autoClear is disabled)
renderer.clear(true, true, true);
// 2: draw portal mesh into stencil buffer
gl.colorMask(false, false, false, false);
gl.depthMask(false);
gl.enable(gl.STENCIL_TEST);
gl.stencilMask(0xFF);
gl.stencilFunc(gl.NEVER, 0, 0xFF);
gl.stencilOp(gl.INCR, gl.KEEP, gl.KEEP);
renderer.render(stencilScene, camera);
gl.colorMask(true, true, true, true);
gl.depthMask(true);
gl.stencilOp(gl.KEEP, gl.KEEP, gl.KEEP);
// 3: draw toScene where the stencil passes
renderer.clear(false, true, false);
gl.stencilFunc(gl.LESS, 0, 0xff);
renderer.render(toScene, camera);
gl.disable(gl.STENCIL_TEST);
renderer.clear(false, false, true);
// clear the depth buffer and draw the fromPortal mesh into it
renderer.clear(false, true, false);
gl.colorMask(false, false, false, false);
gl.depthMask(true);
renderer.render(stencilScene, camera);
// draw the actual scene
gl.colorMask(true, true, true, true);
gl.depthMask(false); // gl.enable(gl.DEPTH_TEST) ?
renderer.render(fromScene, camera);
Here's a fiddle: http://jsfiddle.net/g8EWG/
If you move from left to right, or just look around, the shift is quite noticeable.
Any help would be greatly appreciated.
Related
I am using multiple scenes as a workaround for selective lighting. Now I have run into a difficulty with transparent objects.
For simplicity, I created a jsfiddle illustration:
[1]: https://jsfiddle.net/curisiro/w9ke75ma/2/
I have two transparent squares which are in different scenes. The problem is I can see the blue square behind the red square (figure 1) but I can NOT see the red square behind the blue square (figure 2).
Because of other effects I use, the materials' depthTest and depthWrite must stay at their default value of true.
Do you have any solution to solve this problem?
Edit: If you insist on using two scenes, you can fix this problem by clearing the depth between the renders:
function render() {
  requestAnimationFrame(render);
  renderer.clear();
  renderer.render(scene, camera);
  renderer.clearDepth(); // <--- Like this
  renderer.render(scene1, camera);
}
However, this is limiting if you plan to add more complexity to the scene and need depth testing to take place between them. Alternatively, just render to the same scene:
let geometry = new THREE.BoxGeometry(1, 1, 1);
let material = new THREE.MeshStandardMaterial({color: 0x0000ff, transparent: true, opacity: 0.4});
let mesh = new THREE.Mesh(geometry, material);
scene.add(mesh);
let geometry1 = new THREE.BoxGeometry(1, 1, 1);
let material1 = new THREE.MeshStandardMaterial({color: 0xff0000, transparent: true, opacity: 0.4});
let mesh1 = new THREE.Mesh(geometry1, material1);
mesh1.position.z = 2;
scene.add(mesh1);
(see forked fiddle). In this case, you would handle selective lighting some other way (layers, or custom materials perhaps, depending on what you need).
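For example, a rough sketch of the layers idea (the layer numbers and the light are illustrative, and exactly how lights interact with layers depends on your three.js version, so treat this as a starting point rather than a drop-in solution):
// Keep both squares in the one scene, but on different layers.
mesh.layers.set(1);   // blue square
mesh1.layers.set(2);  // red square
// A light is also an Object3D and can be assigned a layer; in recent
// three.js versions it is only taken into account when its layers
// intersect the camera's layers.
const light = new THREE.DirectionalLight(0xffffff, 1);
light.layers.set(1);
scene.add(light);
// The camera controls which layers are rendered in a given pass.
camera.layers.enable(1);
camera.layers.enable(2);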
I have some text rendering over a background quad. Let's call this a 'label'. Both are positioned at the same point which causes z-fighting.
I'd like to promote the text to avoid z-fighting using polygon offset.
This is how I add the polygon offset to the text material:
const material = new THREE.RawShaderMaterial(
  CreateMSDFShader({
    map: this.glyphs,
    opacity: opt.opacity ?? 1,
    alphaTest: (opt.opacity ?? 1) < 1 ? 0.001 : 1,
    color: opt.colour ?? '#ffffff',
    transparent: opt.transparent ?? true,
    glslVersion: opt.renderMode === 'webgl' ? THREE.GLSL1 : THREE.GLSL3,
    side: opt.side ?? THREE.DoubleSide,
    depthFunc: opt.depthFunc ?? THREE.LessEqualDepth,
    depthTest: true,
    depthWrite: false,
    polygonOffset: true,
    polygonOffsetUnits: -1.0,
    polygonOffsetFactor: -4.0,
  })
);
const mesh = new THREE.Mesh(geom, material);
and this is the background material:
if (tableOptions.background) {
  const geometry = new THREE.PlaneGeometry(1, 1, 1, 1);
  const backgroundMaterial = new ActivatableMaterial(
    {
      color: new THREE.Color(tableOptions.backgroundColour),
      toneMapped: false,
      opacity: 1,
      alphaTest: 0.001,
      transparent: true,
    },
    {
      activationColour: new THREE.Color(tableOptions.activationColour),
    }
  );
  this.background = buildUIObject(
    Actions.ExpandLabel as Action,
    geometry,
    backgroundMaterial
  );
  setName(this.background, 'Background');
  this.tableGroup.add(this.background);
}
The polygon offset just isn't working (using Chrome). The text disappears behind the background quad as I orbit the camera around, and reappears at random. The labels are always facing the camera (using lookAt).
What could stop the polygon offset from working?
The labels render into a render pass whose render target is set up as follows:
const pars = {
  minFilter: THREE.LinearFilter,
  magFilter: THREE.LinearFilter,
  format: THREE.RGBAFormat,
  stencilBuffer: false,
  depthBuffer: false,
};
this.renderTargetBuffer = new THREE.WebGLRenderTarget(
  resolution.x,
  resolution.y,
  pars
);
this.renderTargetBuffer.texture.name = 'RenderTargetBuffer';
this.renderTargetBuffer.texture.generateMipmaps = false;
I'm assuming that because the polygonOffset is a state thing it doesn't matter that this is a RawShaderMaterial. Is that a safe assumption?
Edit: I have added the opposite polygonOffset to the background mesh separately and again it doesn't work.
Polygon offset wasn't the best solution in this case. I switched to maintaining a custom render order for the labels based on their distance to the camera. That way I could force the text's render order to be greater than the background's.
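A rough sketch of that approach (the label structure and the function name here are hypothetical, not taken from my actual code):
// Hypothetical per-frame update: sort labels far-to-near, then make sure
// each label's text draws after its background.
function updateLabelOrder(labels, camera) {
  labels
    .map((label) => ({
      label,
      distance: label.group.position.distanceTo(camera.position),
    }))
    .sort((a, b) => b.distance - a.distance) // farthest first
    .forEach((entry, index) => {
      entry.label.backgroundMesh.renderOrder = index * 2;    // background first
      entry.label.textMesh.renderOrder = index * 2 + 1;      // text on top
    });
}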
I have THREE.Points with Material:
var material = new THREE.PointsMaterial({
  sizeAttenuation: false,
  visible: true,
  size: 128,
  color: 0xffffff,
  depthTest: false,
  depthWrite: false,
  opacity: 0.5,
  blending: THREE.AdditiveBlending,
  transparent: true,
  map: texture
});
This material works fine.
When I add another THREE.Points with a similar material, the first one becomes invisible as soon as both are in the camera's view.
When I rotate the camera to a position where only one of the THREE.Points is visible, the material becomes visible again.
When some other object or mesh is loaded and present in the camera's field of view, the point clouds (Points in r73) become invisible again...
The problem started with the update to r73.
EDIT: It happens when the planet object is visible; if I set its visible to false, everything works great...
EDIT 2: It happens only on some GPUs: on an Intel HD 4000 (Dell notebook), while the error does not occur on an ATI Mobility Radeon.
I need to blur the frame buffer and I don't know how to get the frame buffer using THREE.js.
I want to blur the whole frame buffer rather than blur each texture in the scene, so I guess I should read the frame buffer and then blur it, rather than doing this in shaders.
Here's what I have tried:
Called at init:
var renderTarget = new THREE.WebGLRenderTarget(512, 512, {
  wrapS: THREE.RepeatWrapping,
  wrapT: THREE.RepeatWrapping,
  minFilter: THREE.NearestFilter,
  magFilter: THREE.NearestFilter,
  format: THREE.RGBAFormat,
  type: THREE.FloatType,
  stencilBuffer: false,
  depthBuffer: true
});
renderTarget.generateMipmaps = false;
Called each frame:
var gl = renderer.getContext();
// render to target
renderer.render(scene, camera, renderTarget, false);
framebuffer = renderTarget.__webglFramebuffer;
console.log(framebuffer);
gl.flush();
if (framebuffer != null)
    gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);
var width = 512, height = 512;
var rdData = new Uint8Array(width * height * 4);
gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, rdData);
console.log(rdData);
// render to screen
renderer.render(scene, camera);
But framebuffer is WebGLFramebuffer {} and rdData is full of zeros. Am I doing this the right way?
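(For reference, later three.js releases expose a public read-back API instead of the private __webglFramebuffer. A minimal sketch, assuming a version that provides renderer.setRenderTarget and renderer.readRenderTargetPixels:)
// Render into the target, then read it back without touching internals.
renderer.setRenderTarget(renderTarget);
renderer.render(scene, camera);
var width = 512, height = 512;
// The typed array must match the target's texture type
// (Float32Array here because the target above uses THREE.FloatType).
var rdData = new Float32Array(width * height * 4);
renderer.readRenderTargetPixels(renderTarget, 0, 0, width, height, rdData);
renderer.setRenderTarget(null); // back to the default framebuffer
renderer.render(scene, camera);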
Any blur should be done with shaders to be efficient, but in this case not as materials.
If you want to blur the entire frame buffer and render that to the screen, use the effect composer. It's located in three.js/examples/js/postprocessing/EffectComposer.js.
Set up the scene, camera and renderer as normal, but in addition add an instance of the effect composer, with the scene as a render pass.
composer = new THREE.EffectComposer( renderer );
composer.addPass( new THREE.RenderPass( scene, camera ) );
Then blur the whole buffer with two passes using the included blur shaders located in three.js/examples/js/shaders/:
hblur = new THREE.ShaderPass( THREE.HorizontalBlurShader );
composer.addPass( hblur );
vblur = new THREE.ShaderPass( THREE.VerticalBlurShader );
// set this shader pass to render to screen so we can see the effects
vblur.renderToScreen = true;
composer.addPass( vblur );
Finally, in the method called each frame, render using the composer instead of the renderer:
composer.render();
Here is a link to a working example of full-screen blur.
Try using MeshDepthMaterial and feeding that render into your shader.
I suggest rendering the blur pass with a dedicated camera using the same settings as the scene's diffuse camera. Then, by adjusting that camera's frustum, you can do both full-screen and depth-of-field blur effects. For a full-screen setup, move the near plane towards the camera and move the far plane in increments away from the camera.
http://threejs.org/docs/#Reference/Materials/MeshDepthMaterial
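A minimal sketch of such a depth pass, assuming a second camera cloned from the scene camera and scene.overrideMaterial (the render-target size and variable names are illustrative):
// Dedicated depth camera with the same settings as the diffuse camera;
// adjust near/far to tune which depth range the blur responds to.
var depthCamera = camera.clone();
depthCamera.near = 0.1;
depthCamera.far = 50;
depthCamera.updateProjectionMatrix();
// Keep it in sync with the main camera each frame.
depthCamera.position.copy(camera.position);
depthCamera.quaternion.copy(camera.quaternion);
// Render the whole scene with a depth material into a target that the
// blur shader can sample alongside the diffuse pass.
var depthTarget = new THREE.WebGLRenderTarget(512, 512);
var depthMaterial = new THREE.MeshDepthMaterial();
scene.overrideMaterial = depthMaterial;
renderer.render(scene, depthCamera, depthTarget);
scene.overrideMaterial = null;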
I am trying to figure out a way to cut out a certain region of a background texture such that a certain custom pattern is not rendered on the screen for that background. For example:
This square can be any pattern.
I am using Frame Buffer Object and Stencil Buffer to achieve this kind of effect. Here is the code:
fbo.begin();
//Disables ColorMask and DepthMask so that all the rendering is done on the Stencil Buffer
Gdx.gl20.glColorMask(false, false, false, false);
Gdx.gl20.glDepthMask(false);
Gdx.gl20.glEnable(GL20.GL_STENCIL_TEST);
Gdx.gl20.glStencilFunc(GL20.GL_ALWAYS, 1, 0xFFFFFFFF);
Gdx.gl20.glStencilOp(GL20.GL_REPLACE, GL20.GL_REPLACE, GL20.GL_REPLACE);
stage.getSpriteBatch().begin();
rHeart.draw(stage.getSpriteBatch(), 1); //Draws the required pattern on the stencil buffer
//Enables the ColorMask and DepthMask to resume normal rendering
Gdx.gl20.glColorMask(true, true, true, true);
Gdx.gl20.glDepthMask(true);
Gdx.gl20.glStencilFunc(GL20.GL_EQUAL, 1, 0xFFFFFFFF);
Gdx.gl20.glStencilOp(GL20.GL_KEEP, GL20.GL_KEEP, GL20.GL_KEEP);
background.draw(stage.getSpriteBatch(), 1); //Draws the background such that the background is not rendered on the required pattern, leaving that area black.
stage.getSpriteBatch().end();
Gdx.gl20.glDisable(GL20.GL_STENCIL_TEST);
fbo.end();
However, this is not working at all. How am I supposed to do this using stencil buffers? I am also having some difficulty understanding glStencilFunc and glStencilOp; it would be very helpful if anyone could shed some light on these two.
UPDATE: I have also tried producing something of the same kind using glColorMask. Here is the code:
Gdx.gl20.glClearColor(0, 0, 0, 0);
stage.draw();
FrameBuffer.clearAllFrameBuffers(Gdx.app);
fbo1.begin();
Gdx.gl20.glClearColor(0, 0, 0, 0);
batch.begin();
rubber.draw(batch, 1);
Gdx.gl20.glColorMask(false, false, false, true);
coverHeart.draw(batch, 1);
Gdx.gl20.glColorMask(true, true, true, false);
batch.end();
fbo1.end();
toDrawHeart = new Image(new TextureRegion(fbo1.getColorBufferTexture()));
batch.begin();
toDrawHeart.draw(batch, 1);
batch.end();
This code is producing this:
Instead of something like this (ignore the window sizes and colour tones):
Note: I am using the libgdx library.
While drawing to a SpriteBatch, GL state changes are ignored until end() is called. If you want to use stenciling with SpriteBatch, you'll need to break up the batch drawing. Note that I've left out the FBOs, but that shouldn't make a difference.
@Override
public void create() {
    camera = new OrthographicCamera(1, 1);
    batch = new SpriteBatch();
    texture = new Texture(Gdx.files.internal("data/badlogic.jpg"));
    texture.setFilter(TextureFilter.Linear, TextureFilter.Linear);
    TextureRegion region = new TextureRegion(texture, 0, 0, 256, 256);
    sprite = new Sprite(region);
    sprite.setSize(1f, 1f);
    sprite.setPosition(-0.5f, -0.5f);
    spriteUpsideDown = new Sprite(new TextureRegion(texture, 1f, 1f, 0f, 0f));
    spriteUpsideDown.setSize(1f, 1f);
    spriteUpsideDown.setPosition(-0.5f, -0.5f);
    pattern = new Sprite(region);
    pattern.setSize(0.5f, 0.5f);
    pattern.setPosition(-0.25f, -0.25f);
    << Set Input Processor >>
}
The input processor allows you to set two boolean flags, breakBatch1 and breakBatch2, via the keyboard (libgdx on desktop), which are used to break up the SpriteBatch drawing.
@Override
public void render() {
    Gdx.gl.glClearColor(1, 1, 1, 1);
    Gdx.gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_STENCIL_BUFFER_BIT);
    batch.setProjectionMatrix(camera.combined);

    // setup drawing to stencil buffer
    Gdx.gl20.glEnable(GL20.GL_STENCIL_TEST);
    Gdx.gl20.glStencilFunc(GL20.GL_ALWAYS, 0x1, 0xffffffff);
    Gdx.gl20.glStencilOp(GL20.GL_REPLACE, GL20.GL_REPLACE, GL20.GL_REPLACE);
    Gdx.gl20.glColorMask(false, false, false, false);

    // draw base pattern
    batch.begin();
    pattern.draw(batch);
    if (breakBatch1) { batch.end(); batch.begin(); }

    // fix stencil buffer, enable color buffer
    Gdx.gl20.glColorMask(true, true, true, true);
    Gdx.gl20.glStencilOp(GL20.GL_KEEP, GL20.GL_KEEP, GL20.GL_KEEP);

    // draw where pattern has NOT been drawn
    Gdx.gl20.glStencilFunc(GL20.GL_NOTEQUAL, 0x1, 0xff);
    sprite.draw(batch);
    if (breakBatch2) { batch.end(); batch.begin(); }

    // draw where pattern HAS been drawn
    Gdx.gl20.glStencilFunc(GL20.GL_EQUAL, 0x1, 0xff);
    spriteUpsideDown.draw(batch);
    batch.end();
}
Gdx.gl20.glStencilFunc(GL20.GL_REPLACE, GL20.GL_REPLACE, GL20.GL_REPLACE);
These are not the right arguments to glStencilFunc. I think you mean glStencilOp here.
You need to use glGetError in your code, it will alert you to these kinds of errors.
I believe your problem is that your initial GL_REPLACE stencil operation is applied to all the pixels drawn by your rHeart.draw call, regardless of the shape of any texture applied to the quad.
Thus, the stencil value is written for every pixel of the quad, which gives your problem.
If the texture applied to your quad has an alpha channel, then since GL_ALPHA_TEST is not supported, you could set up your shader to discard the fully transparent pixels, preventing them from being written to the stencil buffer.