Three.js WebVR with two separate renderers

I have two separate renderers with separate scenes. In one scene, I have a very expensive dynamic shader which I am rendering at a low resolution to boost fps (using .setPixelRatio(0.3)). The other renderer takes care of everything else. I can render both at the same time, one on top of the other, without issue.
The problem is I cannot figure out how to render both at the same time with WebVR enabled.
Here is the relevant code:
self.shaderRenderer.vr.enabled = true;
self.renderer.vr.enabled = true;
WEBVR.getVRDisplay( function ( display ) {
self.renderer.vr.setDevice( display );
self.shaderRenderer.vr.setDevice( display );
document.body.appendChild( WEBVR.getButton( display, self.renderer.domElement ) );
});
self.shaderRenderer.animate(self.renderShader);
self.renderer.animate(self.render);
I can switch between the two in VR mode by swapping which renderer's domElement is passed to WEBVR.getButton in the last line of that block. How do I render both, one on top of the other, with WebVR enabled?
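For reference, here is a minimal sketch of the non-VR stacking described above. The transparent clear color and absolute positioning are my assumptions, not code from the question:
var shaderRenderer = new THREE.WebGLRenderer( { alpha: true } );
shaderRenderer.setPixelRatio( 0.3 ); // cheap, low-resolution shader pass
shaderRenderer.setClearColor( 0x000000, 0 ); // transparent background
shaderRenderer.domElement.style.position = 'absolute';
shaderRenderer.domElement.style.top = '0';
document.body.appendChild( renderer.domElement ); // full-resolution scene
document.body.appendChild( shaderRenderer.domElement ); // shader overlay on top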

Related

Why is THREE.WebGLRenderer.setPixelRatio() slower on computer than on mobile?

In the main JS script I have the following properties for the renderer:
const renderer = new THREE.WebGLRenderer({
canvas: canvas
})
renderer.setSize(sizes.width, sizes.height)
renderer.setPixelRatio(10) // 2 is good enough for pixel ratio but I just want to play around and set to 10.
My render became really smooth on both Google Chrome desktop and mobile. However, on my desktop, which has an RTX 2060 Super, the fps is very low, around 10 fps, while on my mobile there is no drop in fps. I could not debug the issue.
renderer.setPixelRatio(10)
Please don't do that. This line will produce a final drawing buffer size which will be problematic for most devices. In almost all use cases, setPixelRatio() should be used like so in order to render at the native screen resolution:
renderer.setPixelRatio( window.devicePixelRatio );
It's not recommended to experiment with different values other than window.devicePixelRatio.
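If you do want to adjust it, a common pattern (a sketch on my part, not from the original answer) is to clamp the device pixel ratio to a small maximum:
const renderer = new THREE.WebGLRenderer({
canvas: canvas
})
renderer.setSize(sizes.width, sizes.height)
renderer.setPixelRatio(Math.min(window.devicePixelRatio, 2)) // cap the ratio; 2 is an arbitrary but common limit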

How to make the renderer transparent in three.js with TrackballControls (r87)?

I have two renderers in one canvas. I want one of the renderers to be transparent, so I applied the code below, but it didn't work out. Could anybody help me out?
var renderer = new THREE.WebGLRenderer( { alpha: true } );
renderer.setClearColor( 0x000000, 0 );
Don't use setClearColor if you want a transparent background. It overrides alpha, which is what makes the background transparent. EDIT: Using setClearColor with the WebGLRenderer.alpha property is fine. Just remember to include an alpha parameter if you want the background to be transparent: renderer.setClearColor( hexColor, alphaValue );
I don't understand what you mean by "two renderer in one canvas" though. That sounds dangerous, at best. If you're trying to draw two things to the same canvas, consider combining them into one scene, or render two scenes using the same renderer.
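To illustrate the "two scenes, one renderer" suggestion, here is a minimal sketch; the scene names and the manual clearing are my assumptions:
var renderer = new THREE.WebGLRenderer( { alpha: true } );
renderer.autoClear = false; // we clear manually below
function render() {
renderer.clear(); // clear color, depth, and stencil once per frame
renderer.render( backgroundScene, camera ); // first scene
renderer.clearDepth(); // so the second scene always draws on top
renderer.render( overlayScene, camera ); // second scene as an overlay
}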
(This also has nothing to do with TrackballControl.)

Are all renderers good for textures?

So, the scene includes an Earth spinning on its axis, a moon rotating around the Earth, and a light source to the right that helps simulate the effect of an eclipse. I thought it would be easy because we've done shadows and transformations before, but I ran into a problem.
In our template we have the following at the top:
// For the assignment where a texture is required you should
// deactivate the Detector and use ONLY the CanvasRenderer. There are some
// issues in using what are called Cross Domain images for textures. You
// can get more details by looking up WebGL and CORS using Google search.
// if ( Detector.webgl )
// var renderer = new THREE.WebGLRenderer();
// else
var renderer = new THREE.CanvasRenderer();
My problem is, when I leave it like that, the spotlight doesn't appear on the scene. However, as was warned, if I activate the Detector, the textures won't work.
But I need both textures and the spotlight. How do I work around this?
You are confusing yourself. Detector.webgl only checks whether the browser supports WebGL. The code below uses the WebGLRenderer if the current browser supports WebGL, and the CanvasRenderer if there is no WebGL support.
if ( Detector.webgl )
var renderer = new THREE.WebGLRenderer();
else
var renderer = new THREE.CanvasRenderer();
With WebGL, loading textures will run into a cross-domain (CORS) issue. It is best to run the code on a web server, or on a local server like http://www.wampserver.com/en/ for Windows or https://www.mamp.info/en/ for Mac, or via an npm package like https://github.com/tapio/live-server.
As far as I know, shadows are not supported by the CanvasRenderer. I would ask your assignment head to clarify.

TrackballControls change events

I have a static scene with no animation loop, and am trying to use the change event of TrackballControls to trigger the render function following the pattern in this thread, i.e.:
var controls = new THREE.TrackballControls( camera, renderer.domElement );
controls.addEventListener( 'change', render );
function render() {
renderer.render( scene, camera );
}
This works well with OrbitControls, but the change events don't fire when I substitute TrackballControls. However, if I add the line:
_this.update();
at the end of the mousewheel(), mousemove(), and touchmove() functions in TrackballControls.js, I can get the change events to fire correctly (in my case, anyway). I am not sure this is the best way to get the change events firing. Is forking a local copy of TrackballControls the best solution for this case, have I overlooked something, or does it make sense to change TrackballControls.js itself?
TrackballControls was written to require an animation loop in which controls.update() is called.
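For reference, a minimal sketch of such a loop (the names are assumed, not from the original question):
var controls = new THREE.TrackballControls( camera, renderer.domElement );
controls.addEventListener( 'change', render );
function animate() {
requestAnimationFrame( animate ); // keep the loop running
controls.update(); // required for TrackballControls; fires 'change' when the camera moves
}
function render() {
renderer.render( scene, camera );
}
animate();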
OrbitControls, on the other hand, can be used in static scenes in which the scene is rendered only when the mouse is moved, like so:
controls.addEventListener( 'change', render );
In either case, the controls are part of the examples -- not the library -- so you are free to hack them to your liking.
What you are proposing is fine if your scene is static and there is no damping required.
EDIT: corrected for three.js r.73

Reset depth buffers destroyed by EffectComposer

I'm trying to render two different scenes and cameras on top of each other, like a HUD. Both render correctly when alone. Also this works as intended, so you can see mainscene under the helpscene:
renderer.render(mainscene, maincamera);
renderer.render(helpscene, helpcamera);
When I'm using EffectComposer to render the main scene, I can not see helpscene at all, I only see the results of composer rendering:
renderTargetParameters = { minFilter: THREE.LinearFilter, magFilter: THREE.LinearFilter, format: THREE.RGBAFormat, stencilBuffer: false };
renderTarget = new THREE.WebGLRenderTarget( width, height, renderTargetParameters );
composer = new THREE.EffectComposer(renderer, renderTarget);
---- cut out for brevity ---
composer.render(delta);
renderer.render(helpscene, helpcamera); // has no effect whatsoever on the screen, why?
What is happening here? Again, if I comment either render call out, they work correctly, but with both enabled I only see the composer rendering. I would expect the helpscene to overlay (or at least overwrite) whatever was rendered before it.
I have quite complex code before renderer.render(helpscene, helpcamera);, it might take various different render paths and use effectcomposer or not based on different settings. But I want the helpscene to always take the simple route with no effects or anything, that's why I'm using a separate render call and not incorporating it as an effectcomposer pass.
EDIT: It turns out this is caused by some funny business with the depth buffers (?). If I set material.depthTest = false on everything in the helper scene, it shows up roughly correctly. It looks like the depth is set to zero or very low by some composer pass or by the composer itself, and, rather unexpectedly, this hides anything rendered with subsequent render calls.
Because I'm only using LineMaterial in the helper scene it will do for now, but I expect some problems further down the road with the depthTest = false workaround (I might have some real shaded objects there later, which would need to be depth-tested against other objects inside the same helper scene).
So I guess the REAL QUESTION IS: how do I reset the depth buffers (or something) after EffectComposer, so that further render calls are not affected by it? I can also do the helper scene rendering as the last composer pass; it does not make much difference.
I should maybe mention that in one of my composer setups, the main RenderPass renders as a texture to a distorted plane geometry near a perspective camera created for that purpose (like the orthographic camera & quad setup found in many postprocessing examples, but with distortion). The other setup has a "normal" RenderPass with the actual scene camera, where I would expect the depth information to be such that I should see the helper scene anyway. I am having the same problem with both alternatives.
...and answering myself: after finding the real cause, it's quite simple.
renderer.clear(false, true, false); will clear the depth buffers so the overlay render works as expected :)
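In context, the end of the render loop then looks something like this (a sketch using the names from the question):
composer.render(delta);
renderer.clear(false, true, false); // keep color, clear depth, keep stencil
renderer.render(helpscene, helpcamera); // the overlay now renders on top as expected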
