I'm currently trying to calculate/draw the bounding box of a force graph using 3d-force-graph and three.js, but I'm not having any luck creating the box. force-graph has a getGraphBbox() function, but there isn't much documentation on it. Any help would be appreciated! Here is some non-working code I'm working with.

Edit: my data is dynamic. I can add a mostly transparent cube to the scene around the points of a static graph, but I want the cube to always render at the appropriate size around the nodes, since some graphs in my data sets have 1000 nodes and others 50. Hence the bounding box.
const Graph = ForceGraph3D()
(document.getElementById('3d-graph'))
.graphData(data)
.nodeLabel('id')
.nodeAutoColorBy('group');
//.getGraphBbox();//Not sure how to use this
// Box3Helper expects a THREE.Box3 instance, not the graph object itself
let helper = new THREE.Box3Helper(Graph, 0xff0000);
helper.update();
Graph.scene().add(helper)
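For what it's worth, here is a minimal sketch of how getGraphBbox() might be wired up to a Box3Helper, assuming it returns the { x: [min, max], y: [min, max], z: [min, max] } shape described in the 3d-force-graph readme, and that it is called once the layout engine has settled (e.g. in onEngineStop):

Graph.onEngineStop(() => {
  // assumption: getGraphBbox() returns { x: [min, max], y: [min, max], z: [min, max] }
  const bbox = Graph.getGraphBbox();
  const box = new THREE.Box3(
    new THREE.Vector3(bbox.x[0], bbox.y[0], bbox.z[0]),
    new THREE.Vector3(bbox.x[1], bbox.y[1], bbox.z[1])
  );
  const helper = new THREE.Box3Helper(box, 0xff0000);
  Graph.scene().add(helper);
});

Because this runs every time the engine stops, the helper would track whichever data set is loaded; for truly dynamic data you would remove the old helper before adding a new one.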
Coded using:
Three.js v0.130.1
Framework: Angular 12, but that's not relevant to the issue.
Testing on the Chrome browser.
I am building an application that handles more than 100K points. I use these points to render a THREE.Points object on the screen.
I found that the default THREE.PointsMaterial does not support lighting (the points look the same with or without lights in the scene).
So I tried to implement a custom ShaderMaterial, but I could not find a way to add lighting to the rendered object.
Here is a sample of what my code is doing:
Sample App on StackBlitz showing my current attempt
In this code, I am using sample values for the point cloud data, normals, and color, but everything else is similar to my actual application. I can see the 3D object, but I need proper lighting based on the normals.
I need help or guidance to implement the following:
Add lighting to the custom shader material. I have Googled and tried many things, with no success so far.
Use the normals to show lighting effects. (In this sample code the normals are fixed to the Y-axis direction, but in my actual application I calculate them with some vector logic.) So calculating the normals is already done; I want to use them to produce a shine/shading effect in the custom shader material.
In this sample the color attribute is set to a fixed red color, but in my actual application I apply colors to the color attribute using a UV range from a texture.
Please advise how/if I can get lighting based on normals for a point cloud. Thanks.
Note: I looked at this Stack Overflow question, but it only deals with changing the alpha/transparency of points, not with lighting.
Adding lighting to a custom material is a very complex process, especially since you could use Phong, Lambert, or Physical lighting methods, and there are a lot of calculations that need to be passed from the vertex to the fragment shader. For instance, this segment of shader code is just a small part of what you'd need.
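As a rough illustration of what is involved, here is a bare-bones Lambert-style diffuse term in a ShaderMaterial; this is a sketch only, the uniform names are made up, and a real Phong/Lambert/Physical implementation needs far more than this (multiple lights, attenuation, specular terms, shadows, etc.):

// Sketch only: single directional light, diffuse term only
const material = new THREE.ShaderMaterial({
  uniforms: {
    lightDirection: { value: new THREE.Vector3(0, 1, 0) }, // assumed to be in view space
    diffuseColor: { value: new THREE.Color(0xff0000) },
  },
  vertexShader: `
    varying vec3 vNormal;
    void main() {
      vNormal = normalMatrix * normal; // forward the normal to the fragment stage
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
      gl_PointSize = 3.0;
    }
  `,
  fragmentShader: `
    varying vec3 vNormal;
    uniform vec3 lightDirection;
    uniform vec3 diffuseColor;
    void main() {
      float NdotL = max(dot(normalize(vNormal), normalize(lightDirection)), 0.0);
      gl_FragColor = vec4(diffuseColor * NdotL, 1.0);
    }
  `,
});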
Instead of trying to re-create lighting from scratch, I recommend you create a PlaneGeometry with the material you'd like (Phong, Lambert, Physical, etc...) and use an InstancedMesh to create thousands of instances, just like in this example.
Based on that example, the pseudo-code of how you could achieve a similar effect is something like this:
const count = 100000;
const geometry = new THREE.PlaneGeometry();
const material = new THREE.MeshPhongMaterial();
const mesh = new THREE.InstancedMesh( geometry, material, count );
mesh.instanceMatrix.setUsage( THREE.DynamicDrawUsage ); // will be updated every frame
scene.add( mesh );

const dummy = new THREE.Object3D();

function update() {
  // Sets the rotation so each plane is always facing the camera
  dummy.lookAt( camera.position );

  // Updates the position of each plane
  for ( let i = 0; i < count; i++ ) {
    dummy.position.set( x, y, z ); // x, y, z = this point's coordinates
    dummy.updateMatrix();
    mesh.setMatrixAt( i, dummy.matrix );
  }

  // Tell the renderer the instance matrices have changed
  mesh.instanceMatrix.needsUpdate = true;
}
The for() loop would be the most expensive part of each frame, so if you need to update it on each frame, you might want to calculate this in the vertex shader, but that's another question altogether.
I am using THREE.js and I am trying to intersect a box mesh with a custom geometry I am creating, converting it to a Geometry using:
const g = new THREE.Geometry().fromBufferGeometry(shape3d)
I aim to add faces to the custom geometry; that is why I do the conversion. So from the intersection I expect to get back my custom geometry plus the polygons of the box.
I do get that, but I also get some holes, as you can see in the image below:
I have tried many of the CSG libraries that are out there (the manthrax one, ThreeCSG, etc.) but no luck!
Thank you
I suggest you set bevelEnabled: false on your mesh extrusion, because I am psychic and I can see your code in my head. :D
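As a sketch of what that might look like (the shape2d variable and the depth value are illustrative, standing in for whatever shape and settings your code actually extrudes):

// Illustrative only: disable the bevel when extruding the shape used for CSG
const extrudeSettings = {
  depth: 10,           // extrusion depth; use whatever your model needs
  bevelEnabled: false, // the beveled rim is a common source of holes/artifacts in CSG results
};
const shape3d = new THREE.ExtrudeGeometry(shape2d, extrudeSettings);

The bevel adds extra geometry along the edges of the extrusion, which CSG libraries often struggle with; turning it off tends to give cleaner intersections.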
I am creating a viewer using three.js and found that setting the camera near and far planes to fixed values causes flickering for some 3D models.
I see that this is because the GPU runs out of depth precision for models with a bounding box length around 4000-5000.
The near plane is currently set to 0.1 and the far plane to 20000.
You can move your near plane up to get more depth resolution, maybe to 1.0.
Another option to be aware of is logarithmic depth buffer:
https://threejs.org/examples/webgl_camera_logarithmicdepthbuffer.html
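A sketch of both options (the 45, width, and height values are placeholders for whatever your viewer already uses):

// Option 1: raise the near plane so depth precision is spread over a smaller range
const camera = new THREE.PerspectiveCamera(45, width / height, 1.0, 20000);

// Option 2: ask the renderer for a logarithmic depth buffer instead
const renderer = new THREE.WebGLRenderer({ logarithmicDepthBuffer: true });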
You can get the bounding box of the mesh via its geometry: geometry.boundingBox and geometry.boundingSphere. Sometimes you need to recalculate them first using mesh.geometry.computeBoundingBox() and computeBoundingSphere().
Getting the bounding box in camera space is a bit trickier. I don't know of a super optimal one-liner to do it, but someone else may weigh in.
A brute-force way would be to transform the mesh vertices to screen space.
Maybe something like:
var gclone = mesh.geometry.clone();
for (var i = 0; i < gclone.vertices.length; i++) {
  // to world space, then to normalized device coordinates
  gclone.vertices[i].applyMatrix4(mesh.matrixWorld).project(camera);
}
gclone.computeBoundingBox();
var zExtent = gclone.boundingBox.max.z - gclone.boundingBox.min.z;
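On current three.js versions, where Geometry and its .vertices array no longer exist, the same brute-force idea could be written against the position attribute of a BufferGeometry; a sketch:

// Sketch for BufferGeometry-based meshes
const pos = mesh.geometry.attributes.position;
const box = new THREE.Box3();
const v = new THREE.Vector3();
for (let i = 0; i < pos.count; i++) {
  v.fromBufferAttribute(pos, i)
   .applyMatrix4(mesh.matrixWorld) // to world space
   .project(camera);               // to normalized device coordinates
  box.expandByPoint(v);
}
const zExtent = box.max.z - box.min.z;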
In my program, I add points to a particle system and then calculate bounding box for it as:
var object = new THREE.Mesh(geometry, new THREE.MeshBasicMaterial({ color: 0xff0000 }));
var box = new THREE.BoxHelper( object, 0xffff00 );
scene.add(box);
geometry is an instance of BufferGeometry and contains all the points constituting the particle system.
What I see is that the bounding box is wrongly aligned: it is oriented perpendicular to the expected direction.
I expect the wireframe structure to envelop the point cloud.
Do I need to do something extra here?
Edit:
The code I am working on is in a GitHub repo:
github file
In the function ParticleSystemBatcher.prototype.push, points read from a file are pushed into the particle system. I have added the code above at the end of this function. The bounding box does appear, but it is aligned wrongly.
You have a THREE.ShaderMaterial that applies some logic to the positioning of these vertices. Hence, the rendered result is different from the positions stored in main memory.
You can debug this by making a Mesh or sprite for each particle and positioning it where you expect that particle to be, using just the scene graph (object.position.set()). The result will be a bunch of dots that are not in the same space as your particle system, and those dots will fit the bounding box.
The solution is to apply the same transformation that is being applied by the shader.
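As a sketch of that idea: copy the point positions, apply the same per-point transform that your shader applies, and build the box from the transformed points. The offset attribute below is hypothetical; substitute whatever displacement your vertex shader actually performs:

// Hypothetical: assumes the vertex shader displaces each point by an 'offset' attribute
const positions = geometry.getAttribute('position');
const offsets = geometry.getAttribute('offset'); // hypothetical attribute name
const box = new THREE.Box3();
const p = new THREE.Vector3();
for (let i = 0; i < positions.count; i++) {
  p.fromBufferAttribute(positions, i);
  p.x += offsets.getX(i);
  p.y += offsets.getY(i);
  p.z += offsets.getZ(i);
  box.expandByPoint(p);
}
scene.add(new THREE.Box3Helper(box, 0xffff00)); // this box matches what is actually rendered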
I have looked and looked but have not come across anything close to what I'm trying to accomplish.
I am using THREE.js and I have a 3D object, say a human skull. I want to wrap a square grid around its surface while maintaining the unit spacing of the grid lines. Like a wireframe, but with all the line spacing uniform and straight.
The goal is to let the user easily see the surface length of an object. Real-world example: draw a line from the front of a skull over the top to the back and measure the length of that line.
Any code example, or even a starting point for the process I could take to solve this problem using THREE.js, would be appreciated.
Thanks
You can compute the bounding box of the object (the skull) and draw a cube using the bounding box's dimensions.
// compute the bounding box on the skull's geometry (assuming `object` is a Mesh)
object.geometry.computeBoundingBox();
var bbox = object.geometry.boundingBox;

// adding a cube sized to the bounding box
var sizeX = bbox.max.x - bbox.min.x;
var sizeY = bbox.max.y - bbox.min.y;
var sizeZ = bbox.max.z - bbox.min.z;
var boundingBoxGeometry = new THREE.CubeGeometry(sizeX, sizeY, sizeZ); // BoxGeometry in newer three.js
Here is a sample fiddle for that: Bounding box fiddle
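To actually wrap the cube around the skull rather than leaving it at the origin, it also needs to be centered on the bounding box; a small sketch continuing from the snippet above (the wireframe material is just so the object stays visible):

// position a wireframe cube at the bounding box center so it encloses the skull
var center = new THREE.Vector3();
bbox.getCenter(center);
var cube = new THREE.Mesh(
  boundingBoxGeometry,
  new THREE.MeshBasicMaterial({ color: 0x00ff00, wireframe: true })
);
cube.position.copy(center);
scene.add(cube);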