Three.js WebGL renderer for high-DPI displays bug?

I have the following logic to create a Three.js R69 WebGL renderer that is supposed to handle high-DPI displays. It did for quite a while, but about a week ago one (and only one) of my three.js pages started rendering as if the high-DPI setting were still in effect, yet my 3D coordinate origin became the upper-left corner of the rendering canvas rather than the expected center of the canvas. (No changes to my environment that I can tell; maybe the browsers auto-updated. I'm testing with Chrome, Firefox, and Safari on OS X 10.10.1.)
// create our renderer:
gCex3.renderer = new THREE.WebGLRenderer({
    antialias: true,
    alpha: true,
    devicePixelRatio: window.devicePixelRatio || 1
});
// Three.js R69: I started needing to explicitly set this so clear alpha is 1:
gCex3.renderer.setClearColor( new THREE.Color( 0x000000 ), 1 );
gCex3.rendererDOM = $('#A3DH_Three_wrapper');
gCex3.rendererDOM.append( gCex3.renderer.domElement );
// fbWidth & fbHeight are w,h of a div located within the page:
gCex3.renderer.setSize( gs.fbWidth, gs.fbHeight, true ); // 'true' means update the canvas style
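(As I understand it, devicePixelRatio-aware sizing boils down to something like the following sketch; this is my mental model, not three.js's actual code:)
// sketch: the drawing buffer is scaled by the pixel ratio...
var dpr = window.devicePixelRatio || 1;
canvas.width = gs.fbWidth * dpr;
canvas.height = gs.fbHeight * dpr;
// ...while CSS keeps the element at its layout size:
canvas.style.width = gs.fbWidth + 'px';
canvas.style.height = gs.fbHeight + 'px';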
Checking the latest R69 examples, they don't seem to do anything special for high-DPI displays. Looking at the WebGLRenderer source, the devicePixelRatio logic now appears to be embedded within the WebGLRenderer() constructor itself.
I've tried that minimal logic in the examples, specifically this:
renderer = new THREE.WebGLRenderer( { antialias: false } );
renderer.setClearColor( new THREE.Color( 0x000000 ), 1 );
renderer.setSize( gs.fbWidth, gs.fbHeight, true );
And I see the same behavior: my coordinate system origin is the upper left of the rendering canvas.
Note that in Chrome, when running the JavaScript debugger, I see a "webgl context lost" event while the previous page is being torn down, but before this logic executes. Could the WebGLRenderer be getting created during a period when there is no WebGL context?
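To test that theory, I can listen for the standard context events on the renderer's canvas (a diagnostic sketch using the webglcontextlost / webglcontextrestored canvas events):
// sketch: log context loss/restore around renderer creation
var canvas = gCex3.renderer.domElement;
canvas.addEventListener('webglcontextlost', function (event) {
    event.preventDefault();
    console.log('WebGL context lost');
}, false);
canvas.addEventListener('webglcontextrestored', function () {
    console.log('WebGL context restored');
}, false);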

Related

Three.js texture offset not updating

I am currently trying to create a mesh that is colored using a DataTexture. My initial coloring shows up just fine, but my next goal is to offset the texture along the y axis, very similar to this example:
http://math.hws.edu/graphicsbook/demos/c5/textures.html
How I create my texture / mesh:
this.colorTexture = new DataTexture(colors, this.frameWidth, frameCount, RGBFormat, FloatType, UVMapping, RepeatWrapping, RepeatWrapping);
const material = new MeshBasicMaterial({
    side: FrontSide,
    vertexColors: true,
    wireframe: false,
    map: this.colorTexture
});
this.mesh = new Mesh(geometry, material);
this.mesh = new Mesh(geometry, material);
How I attempt to animate the texture using offset:
this.mesh.material.map.offset.y -= 0.001;
this.mesh.material.map.needsUpdate = true;
this.mesh.material.needsUpdate = true;
this.mesh.needsUpdate = true;
I have confirmed that the function I'm using to try to offset is being called during each animation frame, however the visualization itself is not animating or showing changes apart from the initial positioning of the colors I wrote to the texture.
Any help is greatly appreciated :)
The uv transformation matrix of a texture is updated automatically as long as Texture.matrixAutoUpdate is set to true (which is also the default value). You can simply modulate Texture.offset. There is no need to set any needsUpdate flags (Mesh.needsUpdate does not exist anyway).
It's best if you strictly stick to the code from the webgl_materials_texture_rotation example. If this code does not work, please demonstrate the issue with a live example.
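For example, a minimal render loop along those lines (a sketch, assuming texture is the DataTexture assigned to the material's map) would be:
function animate() {
    requestAnimationFrame(animate);
    // modulating the offset is enough; the uv matrix updates automatically
    texture.offset.y -= 0.001;
    renderer.render(scene, camera);
}
animate();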

Disabling WebGL gl.BLEND Doesn't Work

I use THREE.js with an alpha-enabled canvas (because I need to composite my WebGL on top of something else):
this.renderer = new THREE.WebGLRenderer({ canvas: this.canvas, antialias: false, alpha: true });
I set the clear color like this:
this.renderer.setClearColor(0xffffff, 0.0);
In each frame:
_render () {
    renderer.clear();
    gl.disable(gl.BLEND);
    // ... draw something else that doesn't need to be blended, whose alpha value is not 1.0
}
I'm curious why something else still gets blended with the white background even if I disable gl.BLEND.
three.js controls the blending. When you call renderer.render, it sets the blending state itself, calling gl.enable(gl.BLEND) or gl.disable(gl.BLEND) for each material depending on whether that material needs blending, so your manual gl.disable(gl.BLEND) gets overridden.
On top of that, even with blending off you can draw with a non-1.0 alpha, which will end up giving you a canvas that shows through to the background.
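If you want three.js to skip blending for a particular object, say so on the material instead of toggling raw GL state, for example:
// tell three.js not to enable blending for this material
material.blending = THREE.NoBlending;
// and don't flag the material as transparent
material.transparent = false;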

threejs raycaster cannot intersect in stereoscopic mode

I am trying to make use of Raycaster in a ThreeJS scene to create a sort of VR interaction.
Everything works fine in normal mode, but not when I enable stereo effect.
I am using the following snippet of code.
// "camera" is a ThreeJS camera, "objectContainer" contains objects (Object3D) that I want to interact with
var raycaster = new THREE.Raycaster(),
origin = new THREE.Vector2();
origin.x = 0; origin.y = 0;
raycaster.setFromCamera(origin, camera);
var intersects = raycaster.intersectObjects(objectContainer.children, true);
if (intersects.length > 0 && intersects[0].object.visible === true) {
// trigger some function myFunc()
}
So basically when I try the above snippet of code in normal mode, myFunc gets triggered whenever I am looking at any of the concerned 3d objects.
However as soon as I switch to stereo mode, it stops working; i.e., myFunc never gets triggered.
I tried updating the value of origin.x to -0.5. I did that because in VR mode, the screen gets split into two halves. However that didn't work either.
What should I do to make the raycaster intersect the 3D objects in VR mode (when stereo effect is turned on)?
Could you please provide a jsfiddle with the code?
Basically, if you are using stereo in your app, it means you are using two cameras, so you need to check for intersections against both cameras' views, which can become an expensive process.
var cameras = {
    camera1: new THREE.PerspectiveCamera(50, window.innerWidth / window.innerHeight, 1, 10000),
    camera2: new THREE.PerspectiveCamera(50, window.innerWidth / window.innerHeight, 1, 10000)
};
for (var cam in cameras) {
    raycaster.setFromCamera(origin, cameras[cam]);
    // continue your logic
}
You could use a vector object that simulates the camera intersection to avoid checking twice, but this depends on what you are trying to achieve.
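For instance, since both eye cameras in a stereo effect are typically derived from one central camera, a cheaper sketch (assuming mainCamera is that central camera, an assumption about your setup) is to raycast from it once:
// sketch: one raycast from the central camera instead of one per eye
raycaster.setFromCamera(origin, mainCamera);
var intersects = raycaster.intersectObjects(objectContainer.children, true);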
I encountered a similar problem and eventually found the reason. In StereoEffect, THREE.js displays the meshes to the two eyes, but in reality it adds only one mesh to the scene, exactly in the middle of the line left-eye-mesh <-> right-eye-mesh, hidden from the viewer.
So when you use the raycaster, you need to use it on the real mesh in the middle, not on the illusion displayed to each eye!
I detailed how to do it here:
Three.js StereoEffect displays meshes across 2 eyes
Hope it solves your problem!
You can use my StereoEffect.js file in your project to resolve the problem. See the example of its use, and also see my Raycaster stereo pull request.

WebGL & Three.js - two renderers, no shadows in FF in second one

We have a setup with two WebGLRenderers (using a clone of the same scene, to avoid issues): same scene, same lights, same camera. The second renderer is used for snapshotting on demand (to avoid problems with aliasing of render-target rendering, etc.).
All this works like a charm in Chrome, but in Firefox (35.0.1) we are missing shadows completely (only one shadow caster in the scene, a SpotLight)... Is this a known issue/limitation of Firefox (Windows 7/8/8.1)?
Any insight greatly appreciated.
var renderer = new THREE.WebGLRenderer({
    alpha: false,
    antialias: true,
    preserveDrawingBuffer: true // required to support .toDataURL()
});
// shadows
renderer.shadowMapSoft = true;
renderer.physicallyBasedShading = true;
renderer.shadowMapEnabled = true;

renderer.render(snapshot.scene, snapshot.camera);
var data = renderer.domElement.toDataURL("image/jpeg");
I forgot to mention directly in the post that shadows are missing only in the second WebGLRenderer instance (the snapshot one).
What should I debug in Firefox (some WebGL implementation structs?)? Comparing the three.js scene/renderer/camera/lights state between Chrome and Firefox, everything seems to be fine and identical across browsers.
This is a problem with blending on floating-point textures. See http://3dwayfinder.com/webgl-broken-in-firefox-35-0-1-for-windows/
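To verify this in a given browser, you can query the float-texture extensions directly (a diagnostic sketch; renderer.getContext() returns the underlying WebGL context):
var gl = renderer.getContext();
// float textures available at all:
console.log('OES_texture_float:', !!gl.getExtension('OES_texture_float'));
// linear filtering of float textures:
console.log('OES_texture_float_linear:', !!gl.getExtension('OES_texture_float_linear'));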

Threejs - Applying simple texture on a shader material

Using Three.js (r67) with a WebGL renderer, I can't seem to get a plane with a shader material to display its texture: no matter what I do, the material just stays black.
My code at the moment looks quite basic:
var grassT = new Three.Texture(grass); // grass is an already loaded image
grassT.wrapS = grassT.wrapT = Three.ClampToEdgeWrapping;
grassT.flipY = false;
grassT.minFilter = Three.NearestFilter;
grassT.magFilter = Three.NearestFilter;
grassT.needsUpdate = true;

var terrainUniforms = {
    grassTexture: { type: "t", value: grassT }
};
Then I just have this relevant part in the vertex shader:
vUv = uv;
And on the fragment shader side:
gl_FragColor = texture2D(grassTexture, vUv);
This results in:
Black material.
No error in console.
gl_FragColor value is always (0.0, 0.0, 0.0, 1.0).
What I tried / checked:
Everything works fine if I just apply custom plain colors.
All is OK if I use vertexColors with plain colors too.
My texture width / height is indeed a power of 2.
The image is on the same server as the code.
Tested other images, with the same result.
The image is actually loading in the browser debugger.
UVs for the mesh are correct.
Played around with wrapT, wrapS, minFilter, magFilter.
Adapted the mesh size so the texture has a 1:1 ratio.
Preloaded the image with the requirejs image plugin and created the texture from THREE.Texture() instead of using THREE.ImageUtils().
Played around with needsUpdate: true.
Tried to add defines['USE_MAP'] during material instantiation.
Tried to add material.dynamic = true.
I have a correct rendering loop (interaction with the terrain is working).
What I still wonder:
It's a multiplayer game using a custom port with express + socket.io. Am I hit by some WebGL security policy?
I have no lighting logic at the moment; is that a problem?
Maybe the shader material needs other "defines" at instantiation?
I guess I'm overlooking something simpler, which is why I'm asking...
Thanks.
I am applying various effects on the same shader. I have a custom API that merges all the different effects' uniforms, simply by using Three.UniformsUtils.merge(). However, this function calls the clone() method on the texture, and this resets needsUpdate to false before the texture reaches the renderer.
It appears that you should set your texture's needsUpdate property to true when you reach the material level. If you set it at the texture level and the uniform then gets merged, and therefore cloned, later in the process, it loses its needsUpdate flag.
The issue is also detailed here: https://github.com/mrdoob/three.js/issues/3393
In my case the following wasn't working (grassT is my texture):
grassT.needsUpdate = true;
while the following works perfectly later on in the code:
material.uniforms.grassTexture.value.needsUpdate = true;
Image loading is asynchronous. Most likely, you are rendering your scene before the texture image loads.
You must set the texture.needsUpdate flag to true after the image loads. three.js has a utility that will do that for you:
var texture = THREE.ImageUtils.loadTexture( "texture.jpg" );
Once rendered, the renderer sets the texture.needsUpdate flag back to false.
three.js r.68
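Putting the two answers together, a minimal working setup for that era of three.js might look like this (a sketch; 'grass.jpg' is a placeholder, and note that the uniform must be declared as a sampler2D in the fragment shader):
// loadTexture sets needsUpdate for you once the image arrives
var texture = THREE.ImageUtils.loadTexture('grass.jpg');
var material = new THREE.ShaderMaterial({
    uniforms: { grassTexture: { type: "t", value: texture } },
    vertexShader: [
        'varying vec2 vUv;',
        'void main() {',
        '    vUv = uv;',
        '    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);',
        '}'
    ].join('\n'),
    fragmentShader: [
        'uniform sampler2D grassTexture;',
        'varying vec2 vUv;',
        'void main() {',
        '    gl_FragColor = texture2D(grassTexture, vUv);',
        '}'
    ].join('\n')
});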
