Preloading Mesh Data in Three.js

It appears that three.js sends mesh (geometry and material) data to the graphics card only when an object is first rendered. Unfortunately, this can cause noticeable frame-rate hiccups when a new object comes on the scene.
Is there a way to use the three.js framework (or a parameter I'm missing) to send the mesh data down to the card immediately after the associated resources are loaded, rather than on first render? I've considered creating a temporary, off-screen scene that I could put each object into on load, render once, and then discard. I've also tried calling the low-level renderer functions directly with mock data to force the write. That works, but both approaches are hacks.
Any suggestions?
Three.js r67.
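For illustration, the temporary off-screen scene hack described above might look roughly like this. This is a minimal sketch, not three.js API: renderer is assumed to be your existing THREE.WebGLRenderer, and preUpload is a made-up helper name.

// Warm-up hack sketch: render each object once off-screen so its
// buffers and textures are uploaded before it appears in the real scene.
var warmUpScene = new THREE.Scene();
var warmUpCamera = new THREE.PerspectiveCamera();

function preUpload( object ) {
    object.frustumCulled = false; // make sure the draw call actually happens
    warmUpScene.add( object );
    renderer.render( warmUpScene, warmUpCamera ); // first render pushes data to the GPU
    warmUpScene.remove( object );
    object.frustumCulled = true;
}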

Related

Preserving textures not visible in a Rajawali scene

I have a working 360° video viewer built using Rajawali/Google VR. I'm trying to modify it to display a texture that is the output of a non-Rajawali GL program, rather than displaying the StreamingTexture from the video directly (i.e. within onDrawEye I use a separate GL program and pass it a texture ID; it renders to a framebuffer whose attachment will be used as the texture on the sphere).
The problem I'm having is that however I manage the textures (either directly with OpenGL, or using Rajawali's texture classes), they are empty within my inner GL program (i.e. black/no output when bound to a sampler). The only thing that works is adding the texture to a material on some dummy object visible within the scene; then the texture is available within my separate program, but that's not what I want. I've tried just adding the texture to the TextureManager, but that isn't enough to keep it around. What I'm trying to do is extremely simple and works fine without Rajawali or the GVR machinery.
What could be causing even textures I generate and manage myself to end up empty? I don't have a minimal failing example, but I could put one together with some effort.

Occlusion of real-world objects using three.js

I’m using three.js inside an experimental augmented-reality web browser. (The browser is called Argon. Essentially, Argon uses Qualcomm’s Vuforia AR SDK to track images and objects in the phone camera. Argon sends the tracking information into Javascript, where it uses transparent web pages with three.js to create 3D graphics on top of the phone video feed.) My question is about three.js, however.
The data Argon sends into the web page allows me to align the 3D camera with the physical phone camera and draw 3D graphics such that they appear to align with the real world as expected. I would also like to have some of the things in the physical world occlude the 3D graphics (I have 3D models of the physical objects, because I’ve set the scene up or because they are prepared objects like boxes that are being tracked by Vuforia).
I’m wondering if folks have suggestions on the best way to accomplish this occlusion with three.js. Thanks.
EDIT: It appears that the next version of three.js (R71) will have a simpler way to do this, so if you can use the dev branch (or just wait), you can do this much more easily. See this post: three.js transparent object occlusion
MY ORIGINAL ANSWER (without using the new features in R71):
I think the best way to do this (avoiding the extra work of creating new rendering passes, for example) is to modify the WebGL renderer (src/renderers/WebGLRenderer.js) and add support for a new kind of object; perhaps call them "occlusionObjects".
If you look in the renderer, you will see two existing object lists, opaqueObjects and transparentObjects. The renderer sorts the renderable objects into these two lists so that it can render the opaque objects first and the transparent objects after them. What you need to do is store your new objects in the occlusionObjects list instead. You will see that the opaque and transparent objects are sorted based on their material properties; here, you may want to add a property to any object that should be an occluder ("myObject.occluder = true", perhaps) and pull those objects out into the new list.
Once you have the three lists, look at what the render() function does with them. You'll see a couple of places with rendering calls like this:
renderObjects( opaqueObjects, camera, lights, fog, true, material );
Add something like this before that line to turn off writes to the color buffers, render the occlusion objects into the depth buffer only, and then turn color-buffer writes back on before rendering the remaining objects:
context.colorMask( false, false, false, false );
renderObjects( occlusionObjects, camera, lights, fog, true, material );
context.colorMask( true, true, true, true );
You’ll need to do this in a couple of places, but it should work.
Now you can just mark any objects in your scene as “occluder = true” and they will only render into the depth buffer, allowing the video to show through and occluding any opaque or transparent objects rendered behind them.
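For reference, with the R71-style features mentioned in the EDIT above, the renderer modification becomes unnecessary. A minimal sketch, assuming your build exposes Material.colorWrite and Object3D.renderOrder, and where trackedObjectGeometry is a placeholder for your model of the real-world object:

// Depth-only occluder: writes depth but no color, so the video shows
// through while virtual objects behind it are hidden.
var occluderMaterial = new THREE.MeshBasicMaterial( { colorWrite: false } );
var occluder = new THREE.Mesh( trackedObjectGeometry, occluderMaterial );
occluder.renderOrder = -1; // draw before visible objects so its depth is in place first
scene.add( occluder );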

Render scene onto custom mesh with three.js

After messing around with this demo of three.js rendering a scene to a texture, I successfully replicated the essence of it in my project: amidst my main scene there's now a sphere, and a secondary scene is drawn onto it via a THREE.WebGLRenderTarget buffer.
I don't really need a sphere, though, and that's where I've hit a huge brick wall. When trying to map the buffer onto my simple custom mesh, I get an infinite stream of the following errors:
three.js:23444 WebGL: INVALID_VALUE: pixelStorei: invalid parameter for alignment
three.js:23557 Uncaught TypeError: Cannot read property 'width' of undefined
My geometry, approximating an annular shape, is created using this code. I've successfully UV-mapped a canvas onto it by passing {map: new THREE.Texture(canvas)} into the material options, but if I use {map: myWebGLRenderTarget} I get the errors above.
A cursory look through the call stack makes it look like three.js is assuming the presence of the texture.image attribute on myWebGLRenderTarget and attempting to call clampToMaxSize on it.
Is this a bug in three.js, or am I simply doing something wrong? Since I only need flat rendering (with MeshBasicMaterial), one of the first things I did when adapting the render-to-texture demo above was to remove all traces of the shaders, and it worked great with just the sphere. Do I need those shaders back in order to use UV mapping and a custom mesh?
For what it's worth, I was needlessly setting needsUpdate = true on my texture. (The handling of needsUpdate apparently assumes the presence of a <canvas> that the texture is based on.)
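For comparison, here is a minimal render-to-texture sketch for a custom mesh, using the newer API in which the target's color attachment is exposed as .texture (on r67-era builds the render target itself was passed as the map and as an extra argument to render()). annularGeometry, innerScene, innerCamera, mainScene, and mainCamera are placeholder names:

var rt = new THREE.WebGLRenderTarget( 512, 512 );
var material = new THREE.MeshBasicMaterial( { map: rt.texture } ); // no needsUpdate needed
var mesh = new THREE.Mesh( annularGeometry, material );
mainScene.add( mesh );

function render() {
    renderer.setRenderTarget( rt );
    renderer.render( innerScene, innerCamera ); // draw the secondary scene into the target
    renderer.setRenderTarget( null );
    renderer.render( mainScene, mainCamera ); // the mesh samples the target's texture
    requestAnimationFrame( render );
}
render();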

Three.js restoring webglcontext

What components of a three.js scene need to be recreated when restoring the webglcontext?
For example, can I do:
scene.add( myoldcamera );
Or do I need to do:
scene.add( new Camera() );
And is it different depending on the object type, e.g. materials, lights, meshes, etc.?
When a context is lost in a WebGL program, it never comes back automatically. We need to re-create textures, buffers, framebuffers, renderbuffers, shaders, and programs, and set up state (clearColor, blendFunc, depthFunc, etc.) in the webglcontextrestored event handler.
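A minimal sketch of the standard handlers, where initScene, animate, and animationHandle are placeholder names for your own setup function, render loop, and requestAnimationFrame id:

var canvas = renderer.domElement;

canvas.addEventListener( 'webglcontextlost', function ( event ) {
    event.preventDefault(); // without this the browser will never restore the context
    cancelAnimationFrame( animationHandle ); // stop the render loop
}, false );

canvas.addEventListener( 'webglcontextrestored', function () {
    initScene(); // re-create textures, buffers, framebuffers, renderbuffers,
                 // shaders and programs, and re-apply state
    animate();   // resume the render loop
}, false );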

Threejs objmtlloader black model

I'm new to ThreeJS and I made this example, which shows one of our models.
http://petrie3dtesting.museums.ucl.ac.uk/3DFootCover/index.html
I created a Petrie3Dviewer and, in the HTML page, created a viewer object which takes an .obj and an .mtl file as input. Strangely, though, the object shows up BLACK, and the texture only appears once I start interacting. I've tried everything I can think of: different browsers, making the texture smaller, different computers; nothing helps, and I get seemingly random behaviour every time.
I tried mainly on Firefox and Chrome.
It seems that I need to force a render once the .obj file is loaded, but OBJMTLLoader.js does not provide any event for it.
Many thanks for the help.
Best,
GC
You should call this.DoRender in your Animate function to render the frame.
this.Animate = function() {
    this.orbitControls.update();
    this.DoRender(); // render every frame, not only on user interaction
    requestAnimationFrame(this.Animate.bind(this));
}
At the moment you call your render function only when the user changes the perspective with the OrbitControls. Because your texture is loaded asynchronously, it is not yet ready the first time you render the frame.
