zbuffer problems using 2D planes in THREE.JS - three.js

I am composing 2D planes with textures. I have 3 levels:
a background plane at z=0,
black shapes for connections at z=0.1, and
small planes with textures at z=0.2.
The problem is that when I move the camera, the planes seem to change their z position.
Planes are drawn at an incorrect Z that depends on the position of the camera. Moving the camera changes it again and it looks very ugly.
Maybe I need to activate some Z-buffer property for correct drawing?
The WebGL init is like this, and the planes are all created the same way (only the Z coordinate changes):
renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
renderer._microCache = new MicroCache(); // image cache
renderer.setClearColor(0xeeeeee, 1);
document.body.appendChild(renderer.domElement);
// add directional light source
var directionalLight = new THREE.DirectionalLight(0xffffff, 1);
directionalLight.position.set(1, 1, 1300).normalize();
scene.add(directionalLight);
//background plane
plane = new THREE.Mesh(new THREE.PlaneGeometry(200000, 200000, 1, 1), new THREE.MeshLambertMaterial({ color: 0xffffff, opacity: planeOpacity, transparent: true }));
plane.position.z = 0;
scene.add(plane);
The other planes are exactly the same, only with a greater Z position.
Help please!
Thanks!
Palomo

The thing you're seeing is probably z-fighting. Internally, depth is represented by an integer on the GPU, so there is only a fixed number of distinct z values between the camera's near and far planes. The solution is to either move your planes further apart or narrow the camera's near-far range.
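As a rough sketch of both suggestions (the camera values and mesh names below are placeholders, not from the original code):
// Tighten the camera's near/far range so depth precision isn't spread over a huge distance...
camera = new THREE.PerspectiveCamera(45, window.innerWidth / window.innerHeight, 1, 5000);
// ...and/or give the layers more separation in Z.
backgroundPlane.position.z = 0;
connectionPlane.position.z = 10;
texturedPlane.position.z = 20;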

Related

three js weird artifacts with objects intersection

I'm trying to get into three.js and following some tutorials here and there, but I'm stuck on what I feel is some very basic stuff that I can't find the solution to.
The problem is shown best in the GIF below: those weird artifacts in the smaller spheres, which also appear when the yellow spheres go behind the red one, only this time the red one is the one glitching out. I tried different base materials (Phong, Standard, Basic) and also played with the opacity and roughness to make sure I don't have a transparent or reflective material, but nothing changed. Some context for the scene:
It's just a big sphere (S) that's standing still, and 2 smaller spheres (s1, the smaller of the two, and s2, the bigger of the two) that are orbiting it. s1 orbits tangentially to S's surface, and s2's center is on the surface of S (so it's half inside S and half outside of it).
Any ideas?
Here's my code for the renderer:
var renderer = new THREE.WebGLRenderer({ antialias: true, alpha: true }); // both options belong in a single parameter object
renderer.setClearColor(0x000000, 0);
renderer.setPixelRatio(window.devicePixelRatio);
renderer.setSize(window.innerWidth, window.innerHeight);
And here's my code for the spheres:
var geometry = new THREE.SphereGeometry(0.5, 32, 32);
var material = new THREE.MeshStandardMaterial({ color: "red" });
var point = new THREE.Mesh(geometry, material);
And one ambient light:
var light = new THREE.AmbientLight(0xffffff, 1);
scene.add(light);
The solution was found by Marquizzo in his comment:
The camera.near property was too low and I had to adjust it.
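For illustration, adjusting it might look like this (the values are placeholders, not from the original scene):
camera.near = 1;     // was a very small value such as 0.001
camera.far = 1000;   // keep far as small as the scene allows
camera.updateProjectionMatrix(); // must be called after changing near/far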

Map Texture to Plane without distortion

In Three.js, how can I change the way in which a texture gets mapped onto a plane?
Let's assume we have a 1x1 plane and a 16:9 image. How can I control the way in which that image gets mapped onto the plane?
By default, the image gets "squished". I would like it to maintain its aspect ratio and have any overlap get "cut off". Is there a way to configure the material or texture to do this, or would I use a shader? If so, what would it need to look like?
const planeMesh = new THREE.Mesh(
  new THREE.PlaneBufferGeometry(1, 1),
  new THREE.MeshBasicMaterial({
    map: texture,
  })
);
PS: In the future, I would also like to be able to zoom into and out of the image on mouse hover without affecting the size of the plane, so I would think a shader might be better?
A Texture already has several properties built-in that can do what you're looking for.
const texture = textureLoader.load("whatever.png");
const planeMesh = new THREE.Mesh(
  new THREE.PlaneBufferGeometry(1, 1),
  new THREE.MeshBasicMaterial({
    map: texture,
  })
);
// Sets the pivot point to the center of the texture
texture.center.set(0.5, 0.5);
// Make the texture repeat 0.5625 times in the x-axis to match 16:9 ratio
let ratio = 9 / 16;
texture.repeat.set(ratio, 1);
// Scale texture up to "zoom" into it
let zoom = 0.5;
texture.repeat.set(ratio * zoom, 1 * zoom);
You can read more about the .repeat, .center, and even .rotation properties in the Texture docs. Just keep in mind that repeating a texture is a bit counter-intuitive, because you're doing the inverse of scaling it: to scale a texture by 2, you have to tell it to repeat 1/2 times.
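For example, a small sketch of that inverse relationship (the scale value here is illustrative, not from the answer):
// To make the texture appear twice as large, repeat it half as many times on each axis.
const scale = 2;
texture.repeat.set(1 / scale, 1 / scale);
texture.center.set(0.5, 0.5); // keep the zoom centered on the image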

Three.js terrain shadowing

I've made a terrain using a PlaneGeometry object. I set the vertices' y coordinates to bump up my terrain. Then I added a directional light to my scene and saw that there is no shadowing of the "hills", etc.
I also added a sphere and noticed that there is no shadow of it on the terrain either.
var light = new THREE.DirectionalLight(0xffffff, 1);
light.castShadow = true;
light.shadowCameraVisible = true;
light.position.set(-300, 120, -200); // CHANGED
scene.add(light);
scene.add( new THREE.DirectionalLightHelper(light, 0.2) );
Directional light with sphere above the terrain
But when I replaced the directional light with a spotlight, I got the shadowing on the terrain that I want to have.
var slight = new THREE.SpotLight(0xffffff,1);
slight.position.set(-100,60,100);
slight.shadowCameraVisible = true;
scene.add(slight);
scene.add(new THREE.SpotLightHelper(slight, 0.5));
Spotlight with terrain
So the questions are:
How can I make a light that looks like sunshine, so the terrain is not a flat color but shaded depending on the light? (In the future it will be part of a real city.)
What should I do to see objects' shadows on the terrain? (From the fiddle example: the sphere's shadow.)
Thanks
You are modifying the vertices of your terrain. When you do so, you also have to update the vertex normals. One way to do that is like so:
geometry.computeVertexNormals();
To create shadows you must enable them.
renderer.shadowMap.enabled = true;
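A minimal sketch of the full setup, assuming a sphere mesh and a displaced terrain mesh (the names are illustrative, not from the original fiddle):
renderer.shadowMap.enabled = true;        // turn shadow mapping on globally

light.castShadow = true;                  // the DirectionalLight acting as the sun

sphere.castShadow = true;                 // objects that should cast shadows
terrain.receiveShadow = true;             // the bumped PlaneGeometry mesh

terrain.geometry.computeVertexNormals();  // recompute normals after displacing vertices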
three.js r.130

FlatShading on colored custom geometry

As soon as I push my own vertices to a geometry, the geometry is a solid color, instead of having colored shadows.
Applying this code to a THREE.PlaneGeometry called mesh gives the following shading:
var light = new THREE.DirectionalLight(0xffffff, 1);
light.castShadow = true;
light.shadowDarkness = 0.5;
// ..
new THREE.MeshLambertMaterial({ color: 0x66e6b0, shading: THREE.FlatShading });
// ...
mesh.receiveShadow = true;
mesh.castShadow = true;
However, when I apply the same code to a THREE.Geometry() with custom vertices and faces, the geometry is solid black. How can I give the custom geometry the same shading as the plane geometry?
I can use THREE.MeshBasicMaterial but then there are no longer shadows on the faces.
Using vertexColors: THREE.FaceColors and coloring each face still gives all black.
A THREE.AmbientLight gives color but then there are no shadows on the faces.
Here is a fiddle of randomly generated faces that are all the same color. Instead, I would like them to have different shadows because they are at different angles (as in the above image).
It is not shadows that would produce an effect like that, but the z coordinate. In your jsfiddle all your triangles lie on the xy plane, so they all have the same normal and therefore the same lighting. If you make the call like this instead:
geometry.vertices.push(new THREE.Vector3(Math.random() * 100, Math.random() * 100, Math.random() * 100));
and also set light.castShadow = false; (it does not contribute anything here), then you will get the variation that you want.
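Putting that together with the legacy THREE.Geometry / Face3 API used in the fiddle, a sketch might look like this (values are illustrative):
var geometry = new THREE.Geometry();
for (var i = 0; i < 3; i++) {
  geometry.vertices.push(new THREE.Vector3(
    Math.random() * 100, Math.random() * 100, Math.random() * 100));
}
geometry.faces.push(new THREE.Face3(0, 1, 2));
geometry.computeFaceNormals(); // flat shading uses the per-face normal

var material = new THREE.MeshLambertMaterial({ color: 0x66e6b0, shading: THREE.FlatShading });
scene.add(new THREE.Mesh(geometry, material));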
This problem of a geometry appearing all black will occur when any initial coordinate of a vertex appended to the geometry has an undefined value.
Even if the geometry is displayed, animates, and no errors are thrown, if an initial coordinate of a THREE.Vector3 is undefined, THREE.MeshLambertMaterial shading will not work.
This problem is demonstrated in the fiddle, where the use of undefined_variable prevents colored shading, and just yields a solid color.
To handle this, initialize all vertices to defined values, e.g. new THREE.Vector3(0, 0, 0), and then assign the variables' values in animate(), as sketched below.
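A sketch of that workaround (the variable names and per-frame values are illustrative):
// Create the vertex with defined placeholder coordinates first...
var v = new THREE.Vector3(0, 0, 0);
geometry.vertices.push(v);

function animate() {
  // ...then assign the real values once they are known.
  v.set(Math.random() * 100, Math.random() * 100, Math.random() * 100);
  geometry.verticesNeedUpdate = true; // tell three.js the vertex positions changed
  requestAnimationFrame(animate);
  renderer.render(scene, camera);
}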

Inconsistent alpha channel in three.js

I'm building an educational tool using planes, extruded splines, and cylinder geometries. The planes have a texture map and the rest use either basic or Lambert materials.
The texture map on the planes is the only map, and it does have an alpha channel.
The texture map has been tested as both .GIF and .PNG.
All objects have "transparent: true"
renderer = new THREE.WebGLRenderer( {antialias:true} );
NOTE: this is the exact same problem listed at the following link. It has not been solved and my Rep isn't high enough to comment.
in three.js, alpha channel works inconsistently
As mmaclaurin noted, it could be an issue with draw order that changes based on camera location. I am using THREE.TrackballControls and limiting camera movement to two axes.
Adding or removing the BasicMaterial for wireframe does not change the issue.
Thank you for your time reading this and any help you can offer!
Example of plane object:
var T4map = THREE.ImageUtils.loadTexture( 'medium_T4.png' );
var T4Material = new THREE.MeshBasicMaterial( { opacity: .96, transparent:true, map: T4map } );
var T4Geometry = new THREE.PlaneGeometry(810, 699, 10, 10);
var T4 = new THREE.Mesh(T4Geometry, T4Material);
T4.position.y = -CNspacing;
T4.doubleSided = true;
scene.add(T4);
Example of extruded spline geometry where problem is most noticeable:
var mesh = THREE.SceneUtils.createMultiMaterialObject( geometry, [
  new THREE.MeshLambertMaterial( { color: 0xaaff00, opacity: 0.5, transparent: true } )
] );
mesh.position.set( x, y, z );
mesh.scale.set( s, s, s );
parent.add( mesh );
Try playing around with depthTest. Usually this helps:
new THREE.MeshBasicMaterial( { side: THREE.BackSide, map: texture, transparent: true, opacity: 0.9, depthWrite: false, depthTest: false } );
There are many other questions related to your subject, for example: transparent bug
I was just going to comment, but it's too long:
Objects that are marked with transparent = true are painter-sorted based on their centroid and drawn back to front, so that the transparency mostly layers correctly. Make sure your mesh geometries have proper computeBoundingBox() and computeBoundingSphere() applied to them before adding them. If that doesn't fix your problem, then try using material.alphaTest = 0.5 on your materials; this works well for things that are mostly on/off alpha, like masks. But if you have smooth gradations of transparency from 0 to 1 in your textures, you may see fringes where the alpha-test thresholding happens.
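A short sketch of those two suggestions (the mesh and material names are assumptions):
// Recompute bounds so the back-to-front sort of transparent objects uses a correct centroid.
mesh.geometry.computeBoundingBox();
mesh.geometry.computeBoundingSphere();

// For mask-like textures (alpha mostly 0 or 1), discard fragments below the threshold
// instead of relying on blending order.
material.transparent = true;
material.alphaTest = 0.5;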
