Three.js: Extruding standard complex geometries

I created a ring geometry that I use to represent a plane inside a sphere. The problem with this geometry is that if I place the camera perpendicular to it, it disappears because it has no width.
To solve that, I want to extrude this geometry instead of creating the mesh directly.
There are a lot of posts on how to create the shapes needed for extrusion by pushing points and holes, but not on how to obtain these vertices correctly from an existing geometry.
First I tried to create the shape by passing the vertices of the ring geometry directly. It fails with an error about "vertices" being undefined:
var orb_plane_shape = new THREE.Shape(ring_geom.vertices.clone());
Then I tried to copy the vertices array, element by element, and pass it to the Shape constructor. It works, but with the following problems:
- A warning: unable to triangulate polygon!
- There is no hole in the ring, and it looks like the vertex connection order has changed.
var vertices = [];
for (var i = 0; i < ring_geom.vertices.length; i++) {
    vertices.push(ring_geom.vertices[i].clone());
}
var orb_plane_shape = new THREE.Shape(vertices);
// extrude options
var options = {
    amount: 1,           // default 100, only used when path is null
    bevelEnabled: false,
    bevelSegments: 2,
    steps: 1,            // default 1, try 3 if path defined
    extrudePath: null    // or path
};
// geometry
var geometry = new THREE.ExtrudeGeometry( orb_plane_shape, options );
plane_orb = new THREE.Mesh(geometry, material_plane_orb);
I would like to establish a method to convert any of the standard 2D geometries (circle, ring, ...) into a shape, so that it can be extruded.
Thanks
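For reference, one approach that sidesteps copying vertices altogether is to describe the ring directly as a THREE.Shape, with an outer arc as the contour and an inner arc as a hole, and extrude that. This is only a hedged sketch: the radii below are placeholders, and depending on the three.js release the thickness option is named amount (older) or depth (newer):
// Build the ring as a 2D shape: an outer circle plus an inner circular hole
var outerRadius = 5;   // placeholder value
var innerRadius = 4;   // placeholder value
var ringShape = new THREE.Shape();
ringShape.absarc(0, 0, outerRadius, 0, Math.PI * 2, false);
var holePath = new THREE.Path();
holePath.absarc(0, 0, innerRadius, 0, Math.PI * 2, true);
ringShape.holes.push(holePath);
// Extrude the shape to give it a small thickness (both option names set for version compatibility)
var ringOptions = { amount: 1, depth: 1, bevelEnabled: false, steps: 1 };
var ringGeometry = new THREE.ExtrudeGeometry(ringShape, ringOptions);
var plane_orb = new THREE.Mesh(ringGeometry, material_plane_orb);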

Related

How to get vertices of obj model object in Three.JS?

After loading a .obj model in Three.js, I am unable to find the vertex data. The vertex data is needed to apply collision detection, as suggested by this answer:
var loader = new THREE.OBJLoader();
loader.load('models/wall.obj', function ( object ) {
    object.traverse( function ( node ) {
        if ( node.isMesh ) {
            console.log(node);
        }
    });
    scene.add( object );
});
Inside the mesh there is geometry.attributes.position.array, but I am unable to find "vertices" anywhere in the object.
Right now I am trying to convert the position.array data to vertices, but the code below is not working. This answer points out the problem correctly, but I am unable to use it to solve the issue:
var tempVertex = new THREE.Vector3();
// set tempVertex based on information from mesh.geometry.attributes.position
mesh.localToWorld(tempVertex);
// tempVertex is converted from local coordinates into world coordinates,
// which is its "after mesh transformation" position
geometry.attributes.position.array IS the vertices. Every three values makes up one vertex. You will also want to look at the index property (geometry.index), because that is a list of indices into the position array, defining the vertices that make up a shape. In the case of a Mesh defined as individual triangles, every three indices makes up one triangle. (Tri-strip data is slightly different, but the concept of referencing vertex values by the index is the same.)
You could alternately use the attribute convenience functions:
BufferAttribute.getX
BufferAttribute.getY
BufferAttribute.getZ
These functions take a vertex index (not a raw array offset) and handle itemSize for you, though they do not consult geometry.index. To read the vertex stored at attribute index 0:
let pos = geometry.attributes.position;
let vertex = new THREE.Vector3( pos.getX(0), pos.getY(0), pos.getZ(0) );
For an indexed geometry, reading the first vertex of the first triangle through the index looks like this instead:
let pos = geometry.attributes.position.array;
let idx = geometry.index.array;
let size = geometry.attributes.position.itemSize;
let vertex = new THREE.Vector3( pos[(idx[0] * size) + 0], pos[(idx[0] * size) + 1], pos[(idx[0] * size) + 2] );
Once you have your vertex, then you can use mesh.localToWorld to convert the point to world-space.
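Putting it together, a hedged sketch that collects every vertex of the loaded mesh in world space (node is assumed to be the mesh found in the traverse() callback above):
// Collect all vertices of the BufferGeometry in world coordinates
node.updateMatrixWorld(true);              // make sure the world matrix is current
var position = node.geometry.attributes.position;
var worldVertices = [];
for (var i = 0; i < position.count; i++) {
    var vertex = new THREE.Vector3(position.getX(i), position.getY(i), position.getZ(i));
    node.localToWorld(vertex);             // convert local coordinates to world coordinates
    worldVertices.push(vertex);
}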

Three.js: Draw a vector on plane

I'm a newbie to three.js (and to Stack Overflow).
I tried to find an answer, but I'm not able to do this simple thing.
I'm playing with Helpers and Plane.
I want to create a Plane (and its PlaneHelper), and draw an arbitrary vector on this plane.
Everything is fine if the plane's distance from the origin is set to 0.
If I give the plane a distance, the vector is not on the plane.
Here is the commented code I use for this little experiment.
By projecting both the origin and the point onto the plane, I was convinced that arrowHelper_Point would stay on the plane, but it does not.
Where is my mistake? I cannot figure it out.
// Define ARROW_LENGTH to display ArrowHelper
const ARROW_LENGTH = 5;
// Point (0,0,0)
var origin = new THREE.Vector3(0, 0, 0);
// Axes helper in (0,0,0)
var axesHelperOrigin = new THREE.AxesHelper(100);
scene.add(axesHelperOrigin);
// Define a plane by the normal, color and distance from (0,0,0)
var vectorNormal = {
    normal: new THREE.Vector3(1, 1, 0).normalize(),
    color: "rgb(255, 255, 0)",
    colorNormal: "rgb(255, 100, 0)",
    colorVector: "rgb(194, 27, 255)",
    distance: -3,
};
// Create Plane from the normal and distance
var plane = new THREE.Plane(vectorNormal.normal, vectorNormal.distance);
// Add PlaneHelper to scene
var planeHelper = new THREE.PlaneHelper(plane, 100, vectorNormal.color);
scene.add(planeHelper);
// Add ArrowHelper to display normal
// Find the projection of origin on plane
var originOnPlane = plane.projectPoint(origin, new THREE.Vector3());
var arrowHelper_Normal = new THREE.ArrowHelper(vectorNormal.normal, originOnPlane, ARROW_LENGTH, vectorNormal.colorNormal);
scene.add(arrowHelper_Normal);
// Define a point "random"
var point = new THREE.Vector3(5, -2, 6);
// Project the point on plane
var pointOnPlane = plane.projectPoint(point, new THREE.Vector3());
// Draw ArrowHelper to display the pointOnPlane, from originOnPlane
var arrowHelper_Point = new THREE.ArrowHelper(pointOnPlane.normalize(), originOnPlane, ARROW_LENGTH, vectorNormal.colorVector);
scene.add(arrowHelper_Point);
EDIT: OK, I think I found the error.
Looking at this Get direction between two 3d vectors using Three.js?
I need the vector between the two points:
var dir=new THREE.Vector3();
dir.subVectors(pointOnPlane,originOnPlane).normalize();
And use dir as the arrow direction.
Sorry for asking such an obvious thing.
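For completeness, a hedged sketch of the corrected arrow creation (using the distance between the two projected points as the arrow length, which is an arbitrary choice here):
var dir = new THREE.Vector3();
dir.subVectors(pointOnPlane, originOnPlane).normalize();
// Length of the arrow: distance between the two projected points (arbitrary choice)
var length = originOnPlane.distanceTo(pointOnPlane);
var arrowHelper_Point = new THREE.ArrowHelper(dir, originOnPlane, length, vectorNormal.colorVector);
scene.add(arrowHelper_Point);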

Inconsistent surface behaviour in Three.js

When I set a point light on a THREE.BoxGeometry object, it looks like this:
THREE.BoxGeometry with point light
var light = new THREE.PointLight (0xffffff, 1, 100);
light.position.set (10, 10, 10);
scene.add (light);
var geometry = new THREE.BoxGeometry (1, 1, 1);
var material = new THREE.MeshPhongMaterial ();
var cube = new THREE.Mesh (geometry, material);
scene.add (cube);
When I then set a point light on a THREE.PolyhedronGeometry object, it looks like this:
THREE.PolyhedronGeometry with point light
var light = new THREE.PointLight (0xffffff, 1, 100);
light.position.set (10, 10, 10);
scene.add (light);
var vertices = [-1,-1,-1, 1,-1,-1, 1,1,-1, -1,1,-1, -1,-1,1, 1,-1,1, 1,1,1, -1,1,1];
var faces = [2,1,0, 0,3,2, 0,4,7, 7,3,0, 0,1,5, 5,4,0, 1,2,6, 6,5,1, 2,3,7, 7,6,2, 4,5,6, 6,7,4];
// vertices and faces must be defined before they are passed to the constructor
var geometry = new THREE.PolyhedronGeometry (vertices, faces, 1, 0);
var material = new THREE.MeshPhongMaterial ();
var cube = new THREE.Mesh (geometry, material);
scene.add (cube);
I want to know where this behaviour comes from, and how I can make the polyhedron's faces behave as nicely as the box's.
I read that it might be related to geometry.computeFaceNormals().
So I tried it out, but it doesn't make any difference.
When something is different about how light behaves on a surface, the first candidates to look at are the normals.
This is true for the box faces:
boxGeometry.faces[i].normal.equals(boxGeometry.faces[i].vertexNormals[j]);//true
So the box has only a single, shared normal for each face.
The polyhedron, on the other hand, has face normals that differ from its vertex normals:
polyhedronGeo.faces[i].normal.equals(polyhedronGeo.faces[i].vertexNormals[j]);//not true
And some of the vertex normals are not even equal to each other:
polyhedronGeo.faces[i].vertexNormals[j].equals(polyhedronGeo.faces[i].vertexNormals[k]);
//not true for some j,k
That is why the lighting looks ~shadowy: the normal the shader uses is interpolated from the vertex normals.
To make the polyhedron look like the box, just change the vertex normals to match the face normal (see the sketch below).
As for
geometry.computeFaceNormals();
it will only compute the face normals, not the vertexNormals.
There is another function,
geometry.computeVertexNormals();
but that would create averaged vertex normals like the ones the polyhedron already has.
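A hedged sketch of "make the vertex normals match the face normal", assuming the legacy THREE.Geometry API where each face carries its own vertexNormals:
// Force each face's vertex normals to equal the face normal,
// so the interpolated normal is constant across the face (flat look)
geometry.computeFaceNormals();
for (var i = 0; i < geometry.faces.length; i++) {
    var face = geometry.faces[i];
    for (var j = 0; j < face.vertexNormals.length; j++) {
        face.vertexNormals[j].copy(face.normal);
    }
}
geometry.normalsNeedUpdate = true;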
Thanks Derte. Your answer got me closer to the point. With better keywords I then found this: https://github.com/mrdoob/three.js/issues/1982
The answer to my question is this line, which flattens the shading for "free forms":
material.shading = THREE.FlatShading;
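Note that in more recent three.js releases the shading property was removed; the equivalent there (to the best of my knowledge) is a boolean flag on the material:
material.flatShading = true; // replaces material.shading = THREE.FlatShading
material.needsUpdate = true;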

threejs selecting different parts of a mesh

I'm using THREE.js. I have a model of a human that I want to be able to select different portions of. For example, if you click on one of the legs, a particular action will be executed. My original idea was to split the model up into separate meshes and then use raycasting to determine which object was selected. But now when I render the scene, the shading along the edges of each mesh doesn't blend with the adjoining meshes. This leaves ragged-looking lines across the model between the selectable portions. Is there a way to blend the shading between the mesh pieces I've created? Or is there a better way to select part of a mesh other than creating separate meshes? I have some programming experience, but this is the first time I've tried to use three.js. Any insight would be greatly appreciated.
You may create an additional attribute for each triangle that holds the color of the body part it belongs to. So all triangles of the left leg would be red, all triangles of the right leg would be blue, etc.
Render your model normally, and add a second pass where you render the triangles colored as described above, with no shading at all. Then take the mouse position where the user clicked, look it up in that body-part-colored framebuffer, and check the pixel color at the place where the user clicked.
This technique of picking 3D objects by assigning them different colors, rendering those colors to another texture, and then checking the color of the clicked pixel is quite common, although it has some flaws. On the other hand, ray testing is not absolutely accurate either.
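For reference, a hedged sketch of that color-based picking pass: renderer and camera are assumed to exist already, the helper names are made up for illustration, IDs should start at 1 so the black background (0) is never a valid ID, and in older three.js releases the render target is passed directly to renderer.render instead of setRenderTarget.
// Sketch of color-based ("GPU") picking: every pickable part gets a copy whose
// flat material color encodes an integer ID; the picking scene is rendered to an
// offscreen target and the pixel under the mouse is read back and decoded.
var pickingScene = new THREE.Scene();
var pickingTarget = new THREE.WebGLRenderTarget(renderer.domElement.width, renderer.domElement.height);
var idToPart = {};
function addPickablePart(partMesh, id) {
    idToPart[id] = partMesh;
    var pickingMaterial = new THREE.MeshBasicMaterial({ color: new THREE.Color(id) });
    var pickingMesh = new THREE.Mesh(partMesh.geometry, pickingMaterial);
    pickingMesh.matrixAutoUpdate = false;
    pickingMesh.matrix.copy(partMesh.matrixWorld); // keep the picking copy aligned with the visible part
    pickingScene.add(pickingMesh);
}
function pick(mouseX, mouseY) {
    renderer.setRenderTarget(pickingTarget);
    renderer.render(pickingScene, camera);
    renderer.setRenderTarget(null);
    var pixel = new Uint8Array(4);
    renderer.readRenderTargetPixels(pickingTarget, mouseX, pickingTarget.height - mouseY, 1, 1, pixel);
    var id = (pixel[0] << 16) | (pixel[1] << 8) | pixel[2]; // decode the ID from the RGB values
    return idToPart[id]; // undefined when the background was clicked
}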
I believe that this demo actually runs on that concept: demo.
var aiGeojj = new t.CubeGeometry(30, 30, 30);
var uprighters = Math.floor((Math.random() * 11));
var aiMaterialjj = new t.MeshBasicMaterial({ map: t.ImageUtils.loadTexture('images/images_bots/greenbot/upright/' + uprighters + '.gif'), opacity: 0, transparent: true });
var ojj= new t.Mesh(aiGeojj, aiMaterialjj);
ojj.limbs = [];
ojj.trunk = [];
var aiGeojjkey2c = new t.CubeGeometry(50, 50, 50);
var uprightersc = Math.floor((Math.random() * 11));
var aiMaterialjjc = new t.MeshBasicMaterial({ map: t.ImageUtils.loadTexture('images/images_bots/greenbot/upright/' + uprightersc + '.gif'), opacity: 1, transparent: true });
var ojjkey2c = new t.Mesh(aiGeojjkey2c, aiMaterialjjc);
ojjkey2c.id = "hiworld";
ojj.add(ojjkey2c);
ojj.trunk.push(ojjkey2c);
for (var you = 0; you < ojj.length; you++) {
    for (var youb = 0; youb < ojj[you].trunk.length; youb++) {
        window.alert(ojj[you].trunk[youb].id);
    }
}

Changing material color on a merged mesh with three js

Is it possible to interact with the buffer used when merging multiple meshes, in order to change the color of a selected individual mesh?
It's easy to do such a thing with a collection of meshes, but what about a merged mesh with multiple different materials?
@hgates, your last comment was very helpful to me, I was looking for the same thing for days!
"Ok, I set a color on each face, and set vertexColors to true on the material; that solves the problem! :)"
I'll write out the whole approach I used here, as a proper answer for those who are in the same situation:
// Define a main Geometry used for the final mesh
var mainGeometry = new THREE.Geometry();
// Create a Geometry, a Material and a Mesh shared by all the shapes you want to merge together (here I did 1000 cubes)
var cubeGeometry = new THREE.CubeGeometry( 1, 1, 1 );
var cubeMaterial = new THREE.MeshBasicMaterial({ vertexColors: true });
var cubeMesh = new THREE.Mesh( cubeGeometry );
var i = 0;
for ( i; i < 1000; i++ ) {
    // I set the color to the material for each of my cubes individually, which is just random here
    cubeMaterial.color.setHex( Math.random() * 0xffffff );
    // For each face of the cube, I assign the color
    for ( var j = 0; j < cubeGeometry.faces.length; j++ ) {
        cubeGeometry.faces[ j ].color = cubeMaterial.color;
    }
    // Each cube is merged to the mainGeometry
    THREE.GeometryUtils.merge( mainGeometry, cubeMesh );
}
// Then I create my final mesh, composed of the mainGeometry and the cubeMaterial
var finalMesh = new THREE.Mesh( mainGeometry, cubeMaterial );
scene.add( finalMesh );
Hope it helps as it helped me! :)
It depends on what you mean by "changing colors". Note that after merging, the mesh is like any other non-merged mesh.
If you mean vertex colors, it would be possible to iterate over the faces and determine, based on the material index, which vertices' colors to change.
If you mean setting a color on the material itself, sure, it's possible. Merged meshes can still have multiple materials the same way ordinary meshes do (via MeshFaceMaterial), though if you are merging yourself, you need to pass a material index offset parameter for each geometry.
this.meshMaterials.push(new THREE.MeshBasicMaterial(
    { color: 0x00ff00 * Math.random(), side: THREE.DoubleSide }
));
for ( var face in geometry.faces ) {
    geometry.faces[face].materialIndex = this.meshMaterials.length - 1;
}
var mesh = new THREE.Mesh(geometry);
THREE.GeometryUtils.merge(this.globalMesh, mesh);
var mesh = new THREE.Mesh(this.globalMesh, new THREE.MeshFaceMaterial(this.meshMaterials));
Works like a charm, for those who need an example, but: this creates multiple additional buffers (index and vertex data) and multiple drawElements calls too :(. I inspected the draw calls with the WebGL inspector: before adding the MeshFaceMaterial, 75 OpenGL API calls, running at 60 fps easily; after, 3490 OpenGL API calls and an fps drop of about 20%, to 45-50 fps. This means that drawElements is called for every mesh, so we lose the point of merging the meshes. Did I miss something here? I want to share different materials on the same buffer.
