THREE.js flickering with pointlight and MeshFaceMaterial - three.js

For some reason I am getting flicker on any objects using MeshFaceMaterial while I have a PointLight in the scene. Ambient and directional lights are fine.
This is unfortunate, since the PointLight adds an extra level of realism to the scene. If I remove the PointLight all is well. Pseudo code:
light = new THREE.PointLight( 0xffffff, 0.5 );
scene.add( light );
loadTerrain();
mesh = new THREE.Mesh( combinedGeometry, new THREE.MeshFaceMaterial( materialArray ) );
scene.add( mesh );
I've tried loading the light both before and after all other objects have loaded; the flickering does not change.
Why is there flickering under these conditions? Is there any remedy for this?
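For reference, a self-contained reproduction of the setup described above might look like the following sketch (r.6x-era API; the box geometry and two Lambert materials below are stand-ins for the terrain's combinedGeometry and materialArray):
var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera( 45, window.innerWidth / window.innerHeight, 1, 10000 );
camera.position.set( 0, 150, 400 );
var renderer = new THREE.WebGLRenderer();
renderer.setSize( window.innerWidth, window.innerHeight );
document.body.appendChild( renderer.domElement );
// ambient or directional light alone renders cleanly
scene.add( new THREE.AmbientLight( 0x404040 ) );
// adding the point light is what triggers the flicker described in the question
var light = new THREE.PointLight( 0xffffff, 0.5 );
light.position.set( 0, 300, 0 );
scene.add( light );
// stand-in for the merged terrain: faces spread across two material indices
var geometry = new THREE.BoxGeometry( 100, 100, 100 );
var materialArray = [
    new THREE.MeshLambertMaterial( { color: 0x88aa66 } ),
    new THREE.MeshLambertMaterial( { color: 0x665544 } )
];
for ( var i = 0; i < geometry.faces.length; i++ ) {
    geometry.faces[ i ].materialIndex = i % materialArray.length;
}
var mesh = new THREE.Mesh( geometry, new THREE.MeshFaceMaterial( materialArray ) );
scene.add( mesh );
renderer.render( scene, camera );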

Related

fill bound issue, low fps in 360 scene with stacked transparent textures

I have a project with a 360 scene - the camera is inside a sphere and a 360 photo is wrapped as a texture around the sphere:
http://kitchen-360.herokuapp.com/
I added several smaller spheres with transparent textures and I'm seeing a sudden drop in performance. It is a 'fill bound' issue as described in this link:
Debugging low FPS in Three.js
I'm trying to solve this performance issue. I'm thinking of having only one sphere with multiple textures on it. Is this going to be faster than stacked spheres with one texture each?
I tried to create a sphere mesh with an array of MeshBasicMaterial, but it's not working. Only the first texture in the array is visible:
// when a texture is loaded, push its material to the array
var sphereMaterial = new THREE.MeshBasicMaterial( {
    map: texture,
    side: THREE.DoubleSide
} );
sphereMaterial.transparent = true;
matArr.push( sphereMaterial );
// ... then later, when all textures have loaded
roomMesh = new THREE.Mesh( sphereGeometryR, matArr );
roomMesh.name = 'great room';
scene.add( roomMesh );
I saw this example for a custom shader but don't know how to add and change textures dynamically at a later time:
Multiple transparent textures on the same mesh face in Three.js
Is there any other way to optimize this? Would merging geometry help here?
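One note on the material-array attempt: a mesh with an array of materials draws each face with exactly one material, picked by the face's material index (or the geometry's groups, for BufferGeometry), so the textures get split across the surface rather than layered; with no indices set, everything falls back to the first material, which matches what you are seeing. A sketch of per-group assignment, assuming a reasonably recent three.js, a BufferGeometry sphere, and the matArr built above (the even split is just an illustration):
// each group of the sphere's index buffer is rendered with exactly one material,
// so an array of materials never blends textures on the same face
var sphereGeometryR = new THREE.SphereBufferGeometry( 500, 60, 40 );
sphereGeometryR.clearGroups();
var indexCount = sphereGeometryR.index.count;
var perGroup = Math.floor( indexCount / matArr.length );
for ( var i = 0; i < matArr.length; i++ ) {
    var start = i * perGroup;
    var count = ( i === matArr.length - 1 ) ? indexCount - start : perGroup;
    sphereGeometryR.addGroup( start, count, i );
}
roomMesh = new THREE.Mesh( sphereGeometryR, matArr );
scene.add( roomMesh );
Even with groups set up, this only distributes the textures over different regions of the sphere; to blend several transparent textures on the same faces you still need either the stacked spheres or a multi-texture shader like the one in the linked example.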

Three.js - Is Extra Mesh Material with Same Material Beneficial?

I am new to three.js. I know that you can combine multiple materials into one mesh. What if the second mesh uses the same material? Would it add more detail? In other words, does the extra mesh = new THREE.Mesh( geometry, material ); scene.add( mesh ); do anything beneficial? To the naked eye it's hard for me to tell.
mesh = new THREE.Mesh( geometry, material );
scene.add( mesh );
mesh = new THREE.Mesh( geometry, material );
mesh.position.z = - 1500;
scene.add( mesh );
It seems you have a bit of a mix-up in the terminology. A mesh is an actual object in a scene, whereas a material describes a mesh's surface properties, which influence its shading. So creating a second mesh and adding it to a scene results in a second object with the provided geometry and material (i.e. appearance). The probable reason you don't see it is that it's too far along the Z-axis and gets culled by the camera's far clipping plane (a.k.a. Z-far).
As for the "is it beneficial" part: it's implementation dependent, but it may be beneficial for rendering performance, because draw calls for meshes sharing a material (and hence a shader program and its parameters) can be coalesced without redundant state changes, which is always good in WebGL (and OpenGL, for that matter).
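To rule out far-plane culling in the snippet above, either extend the camera's far plane past the second mesh or move the mesh closer; a minimal sketch (the 5000 far value is just an assumed example):
// the far plane (4th argument) must exceed the mesh's distance from the camera,
// otherwise the object at z = -1500 is clipped away; 5000 is an arbitrary example value
var camera = new THREE.PerspectiveCamera( 45, window.innerWidth / window.innerHeight, 1, 5000 );
var mesh2 = new THREE.Mesh( geometry, material ); // same geometry and material as the first mesh
mesh2.position.z = -1500;                         // now inside the [near, far] range
scene.add( mesh2 );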

Three.js: add light to camera

I want to move and rotate the camera but keep a PointLight at the same position relative to the camera. I've read a bunch of threads saying that you can add the light object to the camera instead of the scene. Like so:
pointLight = new THREE.PointLight( 0xffffff );
pointLight.position.set(1,1,2);
camera.add(pointLight);
However this does not seem to work for me. Instead, whenever the camera changes, I now set the light's position by applying the camera's matrixWorld to my desired relative light position. This works, but adding the light to the camera seems like a cleaner solution.
Am I doing something wrong, or is adding a light object to the camera deprecated?
Thanks!
You need to add the camera to the scene if the camera has a child object, such as a PointLight.
scene.add( camera );
three.js r.68
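Putting the two pieces together, a minimal sketch of the parent-child setup (variable names assumed):
// the light is parented to the camera, so it follows every camera move and rotation
pointLight = new THREE.PointLight( 0xffffff );
pointLight.position.set( 1, 1, 2 ); // offset relative to the camera
camera.add( pointLight );
// because the camera now has a child, the camera itself must be part of the scene graph
scene.add( camera );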

three.js sizeAttenuation to Sprite material

I want the sprites in the scene to keep the same size when the camera zooms in or out, and the sprites use different canvas textures as their materials.
I found that sizeAttenuation in ParticleBasicMaterial can work for me. But if I use the WebGLRenderer, I must use ParticleSystem instead of the Particle used with the CanvasRenderer.
I currently use a ParticleSystem containing only one vertex, and every vertex corresponds to one ParticleSystem, so there are about 800+ ParticleSystems in my scene. This works, but consumes a lot of resources.
Obviously, I can't use a "HUD" as in the example in the three.js source, because the sprites are all in the 3D scene.
Can someone help me, or add sizeAttenuation to the Sprite material? Thanks!
If you want to prevent size attenuation with sprites, and you are using a perspective camera, you can set the sizeAttenuation property of your material to false:
var material = new THREE.SpriteMaterial( {
    color: 0xffffff,
    map: texture,
    sizeAttenuation: false
} );
This feature was added in three.js r.96.
EDIT: Updated to three.js r.96
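For completeness, a short usage sketch with the material above (position and scale values are assumed):
// with sizeAttenuation set to false the sprite keeps the same on-screen size at any camera distance
var sprite = new THREE.Sprite( material );
sprite.position.set( 10, 5, 0 );  // assumed world position
sprite.scale.set( 0.1, 0.1, 1 );  // tune the scale for your scene
scene.add( sprite );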

THREE.ShaderMaterial is seen inverted by the shadow Camera

I have a mesh that I am loading from 3ds Max into three.js. I modified three.js to hold another typed array for the binormal data. It all seems to be working fine and dandy until shadows are involved. For some reason, the shadow map is wrong, and it seems as if it's rendering the mesh with its faces flipped.
In this example, the shadows are showing up correctly on the floor, because the renderer has
.shadowMapCullFace = THREE.CullFaceBack
http://dusanbosnjak.com/test/webGL/new/StojadinCeo/stojadinCeo.html
I can get other shadows to show up on my shader, but self-shadowing leads to horrible artifacts, and the shadow that my mesh casts on other meshes is always inverted.
I've tried reversing the order in which the face indices come in (acb instead of abc), which flips the faces. This creates the proper shadow cast, but the mesh shows up flipped.
What I'm thinking of doing at the moment is exporting a flipped mesh and reversing the cull order in the ShaderMaterial, but it would be wonderful to find out why this is happening.
I basically connected the Phong and shadow-mapping shader chunks with what I've had.
edit
Here is an updated scene with some shadow casters and receive shadows on imported meshes
http://dusanbosnjak.com/test/webGL/new/StojadinCeo/stojadinCeo2.html
light = new THREE.SpotLight(0xaaaaaa);
light.position.set(10,10,10);
light.shadowCameraVisible = true;
light.shadowDarkness = 0.5;
light.castShadow = true;
light.shadowCameraNear = 1;
light.shadowCameraFar = 250;
light.shadowCameraFov = 57;
light.shadowMapWidth = 2048;
light.shadowMapHeight = 2048;
scene.add(light);
The rest of the meshes just have receiveShadow and castShadow set to true.
The shadow shows up on the ShaderMaterial (I copied the shadow fragment chunk).
A THREE.Mesh() with THREE.CubeGeometry() both casts and receives shadows properly, but the shadow cast by the ShaderMaterial mesh is inverted.
I can't really isolate this to 50 lines of code, as it's a whole import/export process from Max.
I don't understand why the shadow camera would render this one particular mesh inverted, while the normal camera renders it correctly, if that is what is happening.
You can zoom out and move the car using WASD.
Unless you changed the default settings in three.js, only back-faces cast shadows. A work-around is to set:
renderer.shadowMapCullFace = THREE.CullFaceBack;
or
renderer.shadowMapCullFace = THREE.CullFaceNone;
But these options can lead to other issues.
The best approach is to make sure every mesh has depth. Avoid planes, like the car roof.
For example, you can add an interior liner to the car roof to give it depth.
Shadow mapping in WebGL can be tricky, so read all you can about it so you will be familiar with the issues involved.
three.js r.66
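For reference, the r.66-era renderer settings mentioned above can be wired up like this (a sketch; shadowMapEnabled is presumably already on in the linked scene):
// enable shadow maps and relax the shadow-side culling (r.66-era API);
// CullFaceNone lets both front and back faces cast shadows, which avoids the inverted
// shadow from single-sided geometry but can introduce self-shadowing artifacts
renderer.shadowMapEnabled = true;
renderer.shadowMapType = THREE.PCFSoftShadowMap;
renderer.shadowMapCullFace = THREE.CullFaceNone;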
