How do I give a position to each block in an InstancedMesh (three.js)? - three.js

I am making a 3D voxel game using three.js. I have a geometry and a material, and I am trying to make an instanced mesh of about 1000 blocks (to improve game performance), giving each of those blocks a different position. How do I do this? This is what I have so far so you can refer to it, thanks!
var geometry = new THREE.BoxBufferGeometry(1, 1, 1);
var blockMaterial = new THREE.MeshBasicMaterial({color : 0x00ff00});
Now that I have this, what should I write next to get about 1000 blocks in an InstancedMesh at the positions I want? Thanks again :)
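A minimal sketch of the usual approach (the setMatrixAt pattern, also shown in the InstancedMesh answer further down), reusing the geometry and blockMaterial from above; the grid positions here are just an example:
var count = 1000;
var mesh = new THREE.InstancedMesh( geometry, blockMaterial, count );
var dummy = new THREE.Object3D();
for ( var i = 0; i < count; i ++ ) {
// replace this with the position of your i-th voxel
dummy.position.set( i % 10, Math.floor( i / 100 ), Math.floor( i / 10 ) % 10 );
dummy.updateMatrix();
mesh.setMatrixAt( i, dummy.matrix ); // store the i-th instance transform
}
mesh.instanceMatrix.needsUpdate = true;
scene.add( mesh );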

Related

ThreeJS Points (Point Cloud) with Lighting using custom Shader Material

Coded using:
ThreeJS v0.130.1
Framework: Angular 12, but that's not relevant to the issue.
Testing on the Chrome browser.
I am building an application that gets more than 100K points. I use these points to render a THREE.Points object on the screen.
I found that the default THREE.PointsMaterial does not support lighting (the points are visible with or without adding lights to the scene).
So I tried to implement a custom ShaderMaterial. But I could not find a way to add lighting to the rendered object.
Here is a sample of what my code is doing:
Sample App on StackBlitz showing my current attempt
In this code I am using sample values for the point cloud data, normals, and color, but everything else is similar to my actual application. I can see the 3D object, but it needs proper lighting based on the normals.
I need help or guidance to implement the following:
Add lighting to the custom shader material. I have Googled and tried many things, with no success so far.
Use the normals to show lighting effects. (In this sample code the normals are fixed to the Y-axis direction, but in the actual application I calculate them with some vector logic, so computing the normals is already done.) I want to use them to produce a light shine/shading effect in the custom shader material.
And in this sample the color attribute is set to a fixed red, but in the actual application I apply colors to the color attribute using a UV range from a texture.
Please advise how/if I can get lighting based on normals for a point cloud. Thanks.
Note: I looked at this Stack Overflow question, but it only deals with changing the alpha/transparency of the points, not lighting.
Adding lighting to a custom material is a very complex process, especially since you could use Phong, Lambert, or Physical lighting methods, and there are a lot of calculations that need to pass from the vertex to the fragment shader. For instance, this segment of shader code is just a small part of what you'd need.
Instead of trying to re-create lighting from scratch, I recommend you create a PlaneGeometry with the material you'd like (Phong, Lambert, Physical, etc.) and use an InstancedMesh to create thousands of instances, just like in this example.
Based on that example, the pseudo-code for achieving a similar effect is something like this:
const count = 100000;
const geometry = new THREE.PlaneGeometry();
const material = new THREE.MeshPhongMaterial();
const mesh = new THREE.InstancedMesh( geometry, material, count );
mesh.instanceMatrix.setUsage( THREE.DynamicDrawUsage ); // will be updated every frame
scene.add( mesh );
const dummy = new THREE.Object3D();
function update() {
// Sets the rotation so the planes always face the camera
dummy.lookAt( camera.position );
// Updates the position of each plane
for ( let i = 0; i < count; i ++ ) {
dummy.position.set( x, y, z ); // per-instance position, from your own data
dummy.updateMatrix();
mesh.setMatrixAt( i, dummy.matrix );
}
mesh.instanceMatrix.needsUpdate = true; // tell the renderer to re-upload the matrices
}
The for() loop would be the most expensive part of each frame, so if you need to update it every frame, you might want to do this work in the vertex shader instead, but that's another question altogether.

Intersecting meshes results in a mesh with holes

I am using three.js and I am trying to intersect a box mesh with a custom geometry I am creating, converting it to a Geometry using:
const g = new THREE.Geometry().fromBufferGeometry(shape3d)
I aim to add faces to the custom geometry; that is why I do that. So I expect to get back from the intersection my custom geometry plus the polygons from the box.
I do get that, but I also get some holes, as you can see in the image below:
I have tried many of the CSG ports that are out there (the manthrax one, ThreeCSG, etc.), but no luck!
thank you
I suggest you set bevelEnabled: false on your mesh extrusion, because I am psychic and I can see your code in my head. :D
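For context, a minimal sketch of what that option looks like on an extrusion (the shape and depth values here are made up for illustration; in older three.js releases the depth option is called amount):
var shape = new THREE.Shape();
shape.moveTo( 0, 0 );
shape.lineTo( 0, 2 );
shape.lineTo( 2, 2 );
shape.lineTo( 2, 0 );
var shape3d = new THREE.ExtrudeGeometry( shape, {
depth: 1,
bevelEnabled: false // beveled edges are a common source of unexpected CSG holes
} );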

Three.js - How to update UV mapping when using morph targets?

I've been struggling with this one for hours and have found nothing, either in the docs or here on SO, that would point me in the right direction to achieve what I'm aiming at.
I'm loading a scene containing several meshes. The first one is used as an actual mesh, rendered in the scene; the others are just used as morph targets (their geometries, properly speaking).
loader.load("scene.json", function (loadedScene) {
camera.lookAt( scene.position );
var basis = loadedScene.getObjectByName( "Main" ).geometry;
var firstTarget = loadedScene.getObjectByName( "Target1" ).geometry;
// and so on for the rest of the "target" meshes
basis.morphTargets[0] = {name: 'fstTarget', vertices: firstTarget.vertices};
var MAIN = new THREE.Mesh(basis);
This works very well, and I can morph the MAIN mesh with no hassle by playing with the influence values. The differences between the basis mesh and the targets are not huge; basically they're just XY adjustments (2D shape variations).
Now, I'm using a textured material: the UVs are properly projected (Blender export) and the result is nice with the MAIN mesh as is.
The problem comes when the basis shape is morphed towards a target geometry: as expected, the texture (UVs) adapts automatically, but this is not what I need to achieve => I'd need the UVs to "morph" towards the morph target's UVs for the texture to look the same.
Here is an example of what I have now (left: basis mesh; right: morphTargetInfluences = 1 for the first morph target):
[image: morph target and texture]
What I'd like to have is the exact same texture projection on the final, morphed mesh...
I can't figure out how to do that the right way. Should I reassign the target UVs to the MAIN mesh (and how would I do that)?
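For reference, with the old Geometry class a straight swap of the UV set would be something like this (an untested sketch; it assumes the basis and target meshes share the same face topology):
// copy the target's per-face UVs onto the rendered mesh and flag them for re-upload
MAIN.geometry.faceVertexUvs[0] = firstTarget.faceVertexUvs[0];
MAIN.geometry.uvsNeedUpdate = true;
Note this only snaps the UVs to the target; three.js morph targets don't interpolate UVs, so a gradual UV morph would need a custom shader attribute.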
The result would be like having a cloth draped over a shape that morphs underneath it, with the cloth "shrink-wrapped" against that underlying shape the whole time => you can actually see the shape change, but the cloth itself is not deformed, just wrapping itself properly and consistently around the shape...
Any help would be much appreciated! Thanks in advance :)

How to Set Plane Mesh to always lookAt camera without tilting

I'm trying to make a plane always face the camera, or another moving object, but I want the plane to rotate on only one axis. How can I use the lookAt function to make it rotate only sideways, without tilting up or down to look at the moving object?
Thanks, I managed to solve it easily by just keeping the y component of the lookAt target fixed to the plane's own height:
if (planex) {
var yaw_control = controls.getYawObject();
// use the plane's own y for the target so lookAt never tilts it up or down
var pos = new THREE.Vector3( yaw_control.position.x, planex.position.y, yaw_control.position.z );
planex.lookAt(pos);
}
http://www.lighthouse3d.com/opengl/billboarding/index.php?billCyl
Maybe this article is of some help to you. You are looking for cylindrical billboards, I think, but read up from the first page ;) You can modify the specific mesh's matrix yourself, although I am not sure this is the most efficient way. I also did this myself once.
Get the camera look vector:
three.js set and read camera look vector
Then get the camera up vector, and take the cross product of the two = the right vector, according to the article above.
Using those vectors, you can fill in a new THREE.Matrix4() as explained in the article, and then replace the mesh's matrix with the newly created one. As I said, I am not that deep into the matrix stuff in three.js, but this works; it is probably just not that efficient.
For this to work, you will have to deactivate the mesh's automatic matrix updates with
mesh.matrixAutoUpdate = false;
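A rough sketch of that matrix-based approach, assuming a plane mesh called mesh and the scene camera (the fixed world-up vector is what prevents the tilt):
var up = new THREE.Vector3( 0, 1, 0 ); // fixed world up, so the plane never tilts
var look = new THREE.Vector3();
function billboardUpdate() {
// direction from the plane to the camera, flattened onto the XZ plane
look.subVectors( camera.position, mesh.position );
look.y = 0;
look.normalize();
var right = new THREE.Vector3().crossVectors( up, look );
mesh.matrix.makeBasis( right, up, look ); // build the rotation from the three axes
mesh.matrix.setPosition( mesh.position ); // keep the mesh where it is
}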

Three.js instancing equivalent

I've recently started playing with three.js and noticed that even with a few thousand simple cubes, the performance starts to drop.
So this brings me to my main question: is there any way to do instancing in three.js? I'm pretty sure the drop in performance is related to the draw calls, so if instancing is possible with three.js somehow, it should help performance.
I'm aware of buffers, but at this point it's impossible for me to create a geometry buffer that gives me the power to modify individual objects at runtime. If there is a library that handles all this, that also counts as a solution.
In short, I'm looking for an equivalent of object instancing in three.js. Any suggestions are appreciated.
I've understood that instancing can be emulated / reimplemented with a shader. I am not sure and have not tried it, though.
This old demo has 150k individually animated cubes on the GPU, by the way, but the source is minified so it was hard to see what's going on. Perhaps not a proper instancing solution that would work for any mesh; I am not sure, it might even be: http://alteredqualia.com/three/examples/webgl_cubes.html
Will keep an eye open for this as we'd need it too, I think (we have added trees to ViziCities in a demo now).
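For what it's worth, the shader-based emulation mentioned above can be done with InstancedBufferGeometry and a per-instance attribute, along the lines of the official webgl_buffergeometry_instancing example. A minimal sketch (the counts and offsets here are illustrative):
var instances = 10000;
var geometry = new THREE.InstancedBufferGeometry();
geometry.copy( new THREE.BoxBufferGeometry( 1, 1, 1 ) ); // reuse the cube's vertex data
geometry.instanceCount = instances;
var offsets = new Float32Array( instances * 3 );
for ( var i = 0; i < instances; i ++ ) {
offsets[ i * 3 + 0 ] = ( Math.random() - 0.5 ) * 100;
offsets[ i * 3 + 1 ] = ( Math.random() - 0.5 ) * 100;
offsets[ i * 3 + 2 ] = ( Math.random() - 0.5 ) * 100;
}
geometry.setAttribute( 'offset', new THREE.InstancedBufferAttribute( offsets, 3 ) );
var material = new THREE.ShaderMaterial( {
vertexShader: `
attribute vec3 offset; // per-instance position
void main() {
gl_Position = projectionMatrix * modelViewMatrix * vec4( position + offset, 1.0 );
}
`,
fragmentShader: `
void main() { gl_FragColor = vec4( 0.0, 1.0, 0.0, 1.0 ); }
`
} );
scene.add( new THREE.Mesh( geometry, material ) );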
I have only thought up a solution and have not tried it yet. I want to instance very complex meshes and have them animated using a skeleton. It seems the JSON loader only loads a THREE.Geometry. I want to convert it to a BufferGeometry, BG1, then make another BufferGeometry, BG2, and assign references to the vertex attributes, etc., from BG1 to BG2:
//load mesh
...
var mesh = loadMesh();
//convert to buffer geometry
var BG1 = new THREE.BufferGeometry();
BG1.fromGeometry(mesh.geometry); // fromGeometry expects a THREE.Geometry
var BG2 = new THREE.BufferGeometry();
// assign references so both geometries share the same attribute buffers
BG2.addAttribute('position', BG1.attributes['position']);
BG2.addAttribute('normal', BG1.attributes['normal']);
BG2.addAttribute('uv', BG1.attributes['uv']);
BG2.addAttribute('color', BG1.attributes['color']);
BG2.drawcalls = BG1.drawcalls;
BG2.boundingBox = BG1.boundingBox;
BG2.boundingSphere = BG1.boundingSphere;
It's my understanding that WebGL will share these buffers and not duplicate the memory used in VRAM. Any comments are welcome.
I had the same experience when trying to draw a couple of thousand spheres.
After some research I achieved better performance (up to a million items) using the PointCloud object. Basically, you create the PointCloud object from a geometry (it can be built from raw vertices, as in this example, or from one of the existing geometries in three.js) and a PointCloudMaterial, where you can modify the properties of the items.
An example could be as follows (adding 10 points):
var geo = new THREE.Geometry();
var mat = new THREE.PointCloudMaterial({size: 1, color: 0xff0000});
// assign different positions to the points
for (var i = 0; i < 10; i++) {
var point = new THREE.Vector3(3 * i, 0, 0);
geo.vertices.push(point);
}
var system = new THREE.PointCloud(geo, mat);
scene.add(system);
To modify the appearance, you can play with the PointCloudMaterial properties, or load a texture so that each point gets a desired shape (cube-like, in your case).
If you share more details (why you need cubes, for example) or some code, maybe I can be more helpful.
Most of the answers here are extremely outdated.
THREE.js now supports instancing via InstancedMesh
https://threejs.org/docs/?q=instancedmesh#api/en/objects/InstancedMesh
See example here:
https://threejs.org/examples/webgl_buffergeometry_instancing.html
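As a small illustration of that API (the box geometry, material, and random layout here are just placeholders): each instance gets its own transform via setMatrixAt, and recent releases can also give each instance its own color via setColorAt.
var count = 1000;
var mesh = new THREE.InstancedMesh( new THREE.BoxGeometry(), new THREE.MeshStandardMaterial(), count );
var dummy = new THREE.Object3D();
var color = new THREE.Color();
for ( var i = 0; i < count; i ++ ) {
// scatter the cubes; in a real app these positions come from your own data
dummy.position.set( Math.random() * 40 - 20, Math.random() * 40 - 20, Math.random() * 40 - 20 );
dummy.updateMatrix();
mesh.setMatrixAt( i, dummy.matrix );
mesh.setColorAt( i, color.setHex( Math.random() * 0xffffff ) ); // per-instance tint
}
scene.add( mesh );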
