set transparent (opacity: 0) to some faces of a sphere geometry - three.js

I want to make half the faces of a sphere transparent and the other half colored.
How can I do that?
I've tried to set a transparent color, but it seems it doesn't work that way.
geometry = new THREE.SphereGeometry(1.0, 17, 17);
for (var i = 0; i < geometry.faces.length; i++) {
    let x = Math.random();
    // Here I'm trying to set a transparent color on half the faces of the sphere.
    let color = 0;
    if (x < 0.5) {
        color = 0x000000; // setHex() expects a number, not a string like '0x000000'
    } else {
        color = 0xffffff;
    }
    geometry.faces[i].color.setHex(color);
}
var material = new THREE.MeshPhongMaterial({ vertexColors: THREE.VertexColors });
sphere = new THREE.Mesh(geometry, material);
All the faces of the sphere end up colored when I do it this way.
I want half the faces, randomly selected, to be transparent, so that a light inside the sphere scatters its rays like a god-rays effect, similar to the one in the video below.
https://www.youtube.com/watch?v=suqFV7VGsL4

Looking at the GLSL shaders in three.js, three.js does not support alpha on vertex colors. It only uses the red, green, and blue channels, not alpha.
To use vertex colors to make something transparent, you'd need to write a custom shader or modify three.js's shaders.
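A minimal sketch of that custom-shader route, assuming a modern BufferGeometry where each face owns three vertices (all names and the GLSL are illustrative, not a verified three.js recipe). The helper only builds the per-vertex alpha data; the three.js-specific wiring is left as comments:

```javascript
// Build a per-vertex alpha attribute: each face (3 vertices) is randomly
// either fully transparent (0.0) or fully opaque (1.0).
function buildFaceAlphas(faceCount) {
  const alphas = new Float32Array(faceCount * 3); // one alpha per vertex
  for (let f = 0; f < faceCount; f++) {
    const a = Math.random() < 0.5 ? 0.0 : 1.0;   // whole face shares one value
    alphas[f * 3] = alphas[f * 3 + 1] = alphas[f * 3 + 2] = a;
  }
  return alphas;
}

// Illustrative three.js wiring (sketch, not tied to a specific release):
// geometry.setAttribute('alpha', new THREE.BufferAttribute(buildFaceAlphas(n), 1));
// const material = new THREE.ShaderMaterial({
//   transparent: true,
//   vertexShader: `
//     attribute float alpha;
//     varying float vAlpha;
//     void main() {
//       vAlpha = alpha;
//       gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
//     }`,
//   fragmentShader: `
//     varying float vAlpha;
//     void main() { gl_FragColor = vec4(1.0, 1.0, 1.0, vAlpha); }`
// });
```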

Related

No cubemap ("material.envMap") reflection on polygon faces with orthographic cam

Why won't a MeshPhongMaterial's envMap property work on polygonal faces when viewed through an orthographic camera?
It works on spheres but not an IcosahedronGeometry, for example. If I set the detail parameter of the IcosahedronGeometry to 2+ (more faces), the envMap begins to show. But if I switch to perspective cam, the envMap is fully visible even with detail of 0.
This is what it looks like with perspective cam, note the cubemap reflection of some clouds:
This is what it looks like with the orthographic cam and detail at 0; note the lack of cubemap reflection (please ignore the warping of the image):
Orthographic cam, detail at 1; the cubemap reflection is back:
The only difference between these two versions of the script is the camera.
Here's the code I'm using to create this object:
import uvGridImg from './img/grid.png';
import nxImg from './img/nx_50.png';
import pxImg from './img/px_50.png';
import nyImg from './img/ny_50.png';
import pyImg from './img/py_50.png';
import nzImg from './img/nz_50.png';
import pzImg from './img/pz_50.png';
const envTexture = new THREE.CubeTextureLoader().load([
pxImg, //right
nxImg, //left
pyImg, //top
nyImg, //bottom
pzImg, //back
nzImg, //front
])
envTexture.mapping = THREE.CubeReflectionMapping
const texture = new THREE.TextureLoader().load(uvGridImg)
const icosahedronGeometry = new THREE.IcosahedronGeometry(1, 0)
const material = new THREE.MeshPhongMaterial()
material.map = texture;
material.envMap = envTexture;
// An attempt to explicitly set every potentially relevant property...
material.envMapIntensity = 0.0;
material.transparent = false;
material.opacity = 1.0;
material.depthTest = true;
material.depthWrite = true;
material.alphaTest = 0.0;
material.visible = true;
material.side = THREE.FrontSide;
material.flatShading=true;
material.roughness = 0.0;
material.color.setHex(0xffffff);
material.emissive.setHex(0x0);
material.specular.setHex(0xffffff);
material.shininess = 30.0;
material.wireframe = false;
material.flatShading = false;
material.combine = THREE.MultiplyOperation;
material.reflectivity = 1.0;
material.refractionRatio = 1.0;
const icosahedron = new THREE.Mesh(icosahedronGeometry, material)
icosahedron.position.x = 0
scene.add(icosahedron);
For an MVCE, please see the example from this tutorial (you will have to add your own orthographic cam to compare with the given perspective cam). Here are image files for the textures.
UPDATE: It seems all non-spherical geometries fail to render a cubemap reflection correctly through an orthographic cam. The plane, cylinder, and box geometries all fail to render an environment-map reflection beyond painting each face one uniform reflective color. The sphere, lathe, and *hedron geometries (at high levels of detail) will render cubemap reflections.
Is there any way around this? This seems like a huge limitation while working with orthographic cameras.
This is the expected behavior.
With perspective cameras, the reflective "rays" separate as they get further away from the camera, reflecting a wider angle of the envMap.
With an ortho camera these reflective "rays" do not separate because they're parallel. So the reflection on a flat face is a very narrow angle of the envMap.
See this demo I quickly put together to demonstrate what you're seeing:
It seems to work on spheres because when the parallel orthographic "rays" bounce off a rounded surface, they grow wider apart; they are no longer parallel (just as with a perspective camera).
You can see the reflections still work in your demo, because the faces alternate between light and dark as you rotate them. You're just looking at a much narrower segment of the envMap:
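The geometry of that argument can be made concrete with the standard reflection formula r = d − 2(d·n)n, sketched in plain JS with vectors as arrays: parallel incident rays hitting one flat face all reflect in the same direction, so they sample a single direction of the envMap, while perspective-style diverging rays fan out across it:

```javascript
// Reflect incident direction d about unit normal n: r = d - 2 (d . n) n
function reflect(d, n) {
  const k = 2 * (d[0] * n[0] + d[1] * n[1] + d[2] * n[2]);
  return [d[0] - k * n[0], d[1] - k * n[1], d[2] - k * n[2]];
}

const n = [0, 0, 1];                    // a flat face looking at the camera
const ortho = reflect([0, 0, -1], n);   // every ortho ray is [0,0,-1]: same result
const persp = reflect([0.1, 0, -1], n); // a slightly diverging perspective ray
// All ortho rays map to one envMap direction; perspective rays spread out.
```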

Three.js Mesh to Inherit Color from in-front objects

Scenario
I'm trying to get meshes to inherit color from other meshes that are physically in front of them.
example:
Mesh A Mesh B Camera
(white) (red)
+---+ |\
[] | | << | ]
+---+ |/
I don't want Mesh B to be visible at all BUT I want Mesh A to be red where Mesh B would be blocking the view of Mesh A. All I can think of is that this is a kind of color overlay.
Prototype
I made a three.js prototype where two planes are partially blocking the view of a white sphere. I want to be able to turn the planes into a color overlay so that the sphere shows red, white, and blue (accidental USA flag colors).
Question
Is there a way to do this with THREE? I'm not fixed to this way of doing things but it seems possible, I just haven't come across the right documentation yet.
You could set the blending mode of each plane's material to blending: THREE.MultiplyBlending, so the plane's color is multiplied with items behind it.
See it in action here.
However, if you want it to apply to only the parts where the plane and the sphere overlap, you'd need to write a custom post-processing pass that knocks out the background. Or you could set the background color to #000, since multiplying any color by black yields black.
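The arithmetic behind MultiplyBlending is simple enough to sketch in plain JS (illustrative helper, channels in 0..1): the color behind the plane is multiplied channel-by-channel by the plane's color, which is why a black background conveniently stays black:

```javascript
// MultiplyBlending: out = src * dst, per channel, channels in [0, 1]
function multiplyBlend(planeColor, behindColor) {
  return planeColor.map((c, i) => c * behindColor[i]);
}

const red = [1, 0, 0], white = [1, 1, 1], black = [0, 0, 0];
multiplyBlend(red, white); // white sphere behind a red plane shows red
multiplyBlend(red, black); // black background stays black
```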
Check the fiddle solutionLink: the two red and blue planes are invisible, and a raycaster shoots from the camera into the scene. When the red or blue plane is hit and the sphere is hit too, I change the color of the ball:
const raycaster = new THREE.Raycaster();
const redMat = new THREE.MeshBasicMaterial({ color: 'red' });
const blueMat = new THREE.MeshBasicMaterial({ color: 'blue' });

function raycastFromCameraAndUpdateSphereColor() { // place this in the render loop
    raycaster.setFromCamera(new THREE.Vector2(), camera);
    const intersects = raycaster.intersectObjects(scene.children);
    if (intersects.length > 1) {
        // name the objects to identify them
        if (intersects[0].object.name === 'redPlane') {
            intersects[1].object.material = redMat;
        }
        if (intersects[0].object.name === 'bluePlane') {
            intersects[1].object.material = blueMat;
        }
    }
}

FlatShading on colored custom geometry

As soon as I push my own vertices to a geometry, the geometry is a solid color, instead of having colored shadows.
Applying this code to a THREE.PlaneGeometry called mesh gives the following shading:
var light = new THREE.DirectionalLight(0xffffff, 1);
light.castShadow = true;
light.shadowDarkness = 0.5;
// ..
var material = new THREE.MeshLambertMaterial({ color: 0x66e6b0, shading: THREE.FlatShading });
// ...
mesh.receiveShadow = true;
mesh.castShadow = true;
However, when I apply the same code to a THREE.Geometry() with custom vertices and faces, the geometry is solid black. How can I give the custom geometry the same shading as the plane geometry?
I can use THREE.MeshBasicMaterial but then there are no longer shadows on the faces.
Using vertexColors: THREE.FaceColors and coloring each face still gives all black.
A THREE.AmbientLight gives color but then there are no shadows on the faces.
Here is a fiddle of randomly generated faces that are all the same color. Instead, I would like them to have different shadows because they are different angles (as in the above image).
It is not shadows that produce an effect like that, but the z-coordinates. In your jsfiddle all of your triangles lie in the xy-plane, so they all have the same normal and therefore receive the same lighting. If you make the call like this:
geometry.vertices.push(new THREE.Vector3(Math.random() * 100, Math.random() * 100, Math.random() * 100));
and also set light.castShadow = false; (it does not contribute anything here), you will get the variation that you want.
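The "same normal means same lighting" point can be checked in a few lines of plain JS (illustrative, Lambert diffuse term only): the face normal comes from the cross product of two edges, and the diffuse intensity is max(0, n·l), so coplanar triangles share an intensity no matter where they sit:

```javascript
// Unit face normal of triangle (a, b, c) via the cross product of two edges.
function faceNormal(a, b, c) {
  const e1 = [b[0] - a[0], b[1] - a[1], b[2] - a[2]];
  const e2 = [c[0] - a[0], c[1] - a[1], c[2] - a[2]];
  const n = [
    e1[1] * e2[2] - e1[2] * e2[1],
    e1[2] * e2[0] - e1[0] * e2[2],
    e1[0] * e2[1] - e1[1] * e2[0],
  ];
  const len = Math.hypot(n[0], n[1], n[2]);
  return n.map(v => v / len);
}

// Lambert diffuse term: max(0, n . l) for a unit light direction l.
function lambert(n, l) {
  return Math.max(0, n[0] * l[0] + n[1] * l[1] + n[2] * l[2]);
}

const light = [0, 0, 1];
const flatA = faceNormal([0, 0, 0], [1, 0, 0], [0, 1, 0]);  // in the xy-plane
const flatB = faceNormal([5, 5, 0], [7, 5, 0], [5, 9, 0]);  // also in the xy-plane
const tilted = faceNormal([0, 0, 0], [1, 0, 0], [0, 1, 1]); // has a z-component
// lambert(flatA, light) === lambert(flatB, light), but the tilted face differs.
```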
This problem of a geometry appearing all black will occur when any initial coordinate of a vertex appended to the geometry has an undefined value.
Even if the geometry is displayed, animates, and no errors are thrown, if an initial coordinate of a THREE.Vector3 is undefined, THREE.MeshLambertMaterial shading will not work.
This problem is demonstrated in the fiddle, where the use of undefined_variable prevents colored shading, and just yields a solid color.
To handle this, initialize all vertices to arbitrary concrete values, e.g. new THREE.Vector3(0, 0, 0), and then assign the variables' values in animate().
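A small defensive check (an illustrative helper, not a three.js API) catches this class of bug before the vertex ever reaches the geometry:

```javascript
// Reject vertices whose coordinates are undefined, NaN, or otherwise non-finite.
function isValidVertex(v) {
  return [v.x, v.y, v.z].every(c => typeof c === 'number' && Number.isFinite(c));
}

// isValidVertex({ x: 0, y: 0, z: 0 })         -> true
// isValidVertex({ x: undefined, y: 0, z: 0 }) -> false: shading would silently break
// isValidVertex({ x: NaN, y: 0, z: 0 })       -> false
```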

zbuffer problems using 2D planes in THREE.JS

I am composing 2D planes with textures. I have 3 levels:
a background plane at z=0,
black shapes for connections at z=0.1, and
small planes with textures at z=0.2.
The problem is that when I move the camera, the planes seem to change z position.
Planes are drawn at the wrong Z depending on the position of the camera; moving the camera changes it again, and it looks very ugly.
Maybe I need to activate some z-buffer property to get correct drawing?
The WebGL init is like this, and the planes are exactly the same (only the Z coordinate changes):
renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
renderer._microCache = new MicroCache(); //cache de imagenes
renderer.setClearColor(0xeeeeee, 1);
document.body.appendChild(renderer.domElement);
// add directional light source
var directionalLight = new THREE.DirectionalLight(0xffffff, 1);
directionalLight.position.set(1, 1, 1300).normalize();
scene.add(directionalLight);
//background plane
plane = new THREE.Mesh(new THREE.PlaneGeometry(200000, 200000, 1, 1), new THREE.MeshLambertMaterial({ color: 0xffffff, opacity: planeOpacity, transparent: true }));
plane.position.z = 0;
scene.add(plane);
The other planes are exactly the same but with a greater Z position.
Help please!
Thanks!
Palomo
What you're seeing is probably z-fighting. Internally, depth is represented by an integer on the GPU, so there is only a fixed number of distinct z values between the camera's near and far planes. The solution is to either move your planes further apart or narrow the camera's near-far range.
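You can see the quantization at work with a little plain-JS arithmetic (hypothetical numbers: planes roughly 1300 units from the camera and 0.1 apart, a 24-bit depth buffer): with a huge near-far range the two planes fall into the same depth bucket, and narrowing the range separates them:

```javascript
// Perspective depth of an eye-space distance d, mapped to [0, 1]
// the way the depth buffer stores it (near plane -> 0, far plane -> 1).
function depth01(d, near, far) {
  const zNdc = (far + near) / (far - near) - (2 * far * near) / ((far - near) * d);
  return (zNdc + 1) / 2;
}

const step = 1 / 2 ** 24; // one quantum of a 24-bit depth buffer

// Planes 0.1 apart, ~1300 units away (hypothetical values):
const wide  = depth01(1300.1, 0.1, 200000) - depth01(1300, 0.1, 200000);
const tight = depth01(1300.1, 10, 2000) - depth01(1300, 10, 2000);
// wide  < step: both planes quantize to the same depth value -> z-fighting
// tight > step: with a narrower near-far range they are distinguishable
```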

Rendering a large number of colored particles using three.js and the canvas renderer

I am trying to use the three.js library to display a large number of colored points on the screen (about half a million to a million, for example). I am trying to use the Canvas renderer rather than the WebGL renderer if possible. (The web pages would also be displayed in Google Earth client bubbles, which seem to work with the Canvas renderer but not the WebGL renderer.)
While I have the problem solved for a small number of points (tens of thousands) by modifying the code from here, I am having trouble scaling beyond that.
But with the following code, using WebGL and the particle system, I can render half a million random points, but without colors.
...
var particles = new THREE.Geometry();
var pMaterial = new THREE.ParticleBasicMaterial({
    color: 0xFFFFFF,
    size: 1,
    sizeAttenuation: false
});
// now create the individual particles
for (var p = 0; p < particleCount; p++) {
    // create a particle with random position values,
    // -250 -> 250
    var pX = Math.random() * POSITION_RANGE - (POSITION_RANGE / 2),
        pY = Math.random() * POSITION_RANGE - (POSITION_RANGE / 2),
        pZ = Math.random() * POSITION_RANGE - (POSITION_RANGE / 2),
        particle = new THREE.Vertex(
            new THREE.Vector3(pX, pY, pZ)
        );
    // add it to the geometry
    particles.vertices.push(particle);
}
var particleSystem = new THREE.ParticleSystem(
    particles, pMaterial);
scene.add(particleSystem);
...
Is the better performance of the above code due to the particle system? From what I have read in the documentation, it seems the particle system can only be used with the WebGL renderer.
So my question(s) are
a) Can I render such a large number of particles using the Canvas renderer, or is it always going to be slower than the WebGL/ParticleSystem version? If so, how do I go about it? What objects and/or tricks can I use to improve performance?
b) Is there a compromise I can reach if I give up some features? In other words, can I still use the Canvas renderer for the large dataset if I give up the need to color the individual points?
c) If I have to give up the Canvas and use the WebGL version, is it possible to change the colors of the individual points? It seems the color is set by the material passed to the ParticleSystem and that sets the color for all the points.
EDIT: ParticleSystem and PointCloud have been renamed to Points. In addition, ParticleBasicMaterial and PointCloudMaterial have been renamed to PointsMaterial.
This answer only applies to versions of three.js prior to r.125.
To have a different color for each particle, you need to have a color array as a property of the geometry, and then set vertexColors to THREE.VertexColors in the material, like so:
// vertex colors
var colors = [];
for (var i = 0; i < geometry.vertices.length; i++) {
    // random color
    colors[i] = new THREE.Color();
    colors[i].setHSL(Math.random(), 1.0, 0.5);
}
geometry.colors = colors;

// material
material = new THREE.PointsMaterial({
    size: 10,
    transparent: true,
    opacity: 0.7,
    vertexColors: THREE.VertexColors
});

// point cloud
pointCloud = new THREE.Points(geometry, material);
Your other questions are a little too general for me to answer, and besides, it depends on exactly what you are trying to do and what your requirements are. Yes, you can expect Canvas to be slower.
EDIT: Updated for three.js r.124
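For current three.js (r125 and later, where Geometry was removed), the equivalent setup packs positions and colors into flat typed arrays on a BufferGeometry. A sketch: the plain-JS array building below is testable as-is, while the three.js wiring in the comments is illustrative:

```javascript
// Build flat position/color arrays for `count` points in a cube of side `range`.
function buildPointArrays(count, range) {
  const positions = new Float32Array(count * 3);
  const colors = new Float32Array(count * 3);
  for (let i = 0; i < count * 3; i++) {
    positions[i] = Math.random() * range - range / 2;
    colors[i] = Math.random(); // r, g, b components, each in [0, 1]
  }
  return { positions, colors };
}

// Illustrative three.js wiring (sketch):
// const { positions, colors } = buildPointArrays(500000, 500);
// const geometry = new THREE.BufferGeometry();
// geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
// geometry.setAttribute('color', new THREE.BufferAttribute(colors, 3));
// const material = new THREE.PointsMaterial({ size: 10, vertexColors: true });
// const points = new THREE.Points(geometry, material);
```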
