Three.js SimplifyModifier modify all buffer geometry attributes? - three.js

In three.js, I want to use the SimplifyModifier to simplify a model to several different degrees. The problem is that the geometry for the model contains attributes for normals, skinIndex, skinWeight, and position. The modifier removes all attributes other than position, and in the end I am left with a black, static geometry. Is there any way I can programmatically create low-polygon versions of my buffer geometry that keep these attributes as well?

Related

What is more efficient: new mesh with cloned material or cloned mesh with cloned material?

For a large number of objects (graph nodes in my case) which all have the same geometry but different colors, what is more efficient:
1) creating a mesh and then cloning the mesh and material of the mesh (with mesh.traverse) for each node
2) create new mesh with cloned material for each node
The only difference between creating an object via new and via .clone() is that the latter also copies all properties from the source object. If you don't need this, it's better to just create your meshes via the new operator.
Since you have a large number of objects, you might want to consider using instanced rendering in order to reduce the number of draw calls in your application. Right now, each node is drawn separately; with instanced rendering, all the nodes are drawn at once. There are a couple of examples that demonstrate instanced rendering with three.js:
https://threejs.org/examples/?q=instancing
Using instanced rendering is not a must, but it can be helpful if you run into performance issues. Another option, easier to implement but less flexible, is to merge all meshes into a single large geometry and use vertex colors. You will have a single geometry, material, and mesh object, but an additional vertex color attribute will ensure that your nodes are colored differently.
three.js R103

Difference between buffer geometry and geometry

I'm new to three.js, and I have researched the main topics such as cameras, renderers, scenes, and geometry. Within geometry there are both plain geometry and buffer geometry classes (say, ConeGeometry and ConeBufferGeometry), and the features seem to be the same in both. So what's the difference between geometry and buffer geometry? Does it influence performance in any way?
The difference is essentially in the underlying data structures (how the geometry stores and handles vertices, faces, etc. in memory).
For learning purposes you should not care about it; just use ConeGeometry until you come across performance issues. Then revisit the topic, and next time you will be better prepared to understand the difference between the two.
Please check BufferGeometry
An efficient representation of mesh, line, or point geometry. Includes
vertex positions, face indices, normals, colors, UVs, and custom
attributes within buffers, reducing the cost of passing all this data
to the GPU.
To read and edit data in BufferGeometry attributes, see
BufferAttribute documentation.
For a less efficient but easier-to-use representation of geometry, see
Geometry.
Geometry, on the other hand:
Geometry is a user-friendly alternative to BufferGeometry. Geometries
store attributes (vertex positions, faces, colors, etc.) using objects
like Vector3 or Color that are easier to read and edit, but less
efficient than typed arrays.
Prefer BufferGeometry for large or serious projects.
BufferGeometry performance explained here: why-is-the-geometry-faster-than-buffergeometry
From 2021
This is now a moot point, geometry was removed from threejs in r125.
Geometry is now just an alias for BufferGeometry, source here.
export { BoxGeometry, BoxGeometry as BoxBufferGeometry };
Geometry is converted to BufferGeometry in the end, so if you don't have any performance issues, stick to Geometry if it's convenient for you.
Here you can see that ConeGeometry calls CylinderGeometry constructor.
CylinderGeometry.call( this, 0, radius, height, radialSegments, heightSegments, openEnded, thetaStart, thetaLength );
https://github.com/mrdoob/three.js/blob/dev/src/geometries/ConeGeometry.js
Then CylinderGeometry is created using CylinderBufferGeometry.
this.fromBufferGeometry( new CylinderBufferGeometry( radiusTop, radiusBottom, height, radialSegments, heightSegments, openEnded, thetaStart, thetaLength ) );
https://github.com/mrdoob/three.js/blob/dev/src/geometries/CylinderGeometry.js

When to updateMatrixWorld to have localToWorld and worldToLocal properly working?

I'll explain the problem I'm facing, because I'm probably doing something wrong.
I have some models in Blender; these are exported to JSON and used in three.js. In these models there are some planes, which are then replaced on the fly in JS with another mesh to enable a cloth simulation.
The models can rotate once in the scene, and these planes being children of the models will also rotate. Moreover, the original planes from blender could have some rotation applied.
However we want the wind to be global, so for each plane and each frame, a global (world) wind direction vector is cloned and then transformed into the local coordinates of each plane, so that cloth particles can be moved correctly.
This is accomplished simply with:
var globalDir = new THREE.Vector3(0, 0, 1); // wind from north
// ...
var localDir = plane.worldToLocal(globalDir.clone());
// use localDir to move cloth vertices around based on the wind
This "works", meaning that all cloth meshes that are children of a single model are aligned to the same global wind, but:
1) with nothing else changing, just refreshing the page, the wind direction comes out different each time, even given the same globalDir vector
2) from model to model, the direction is different
It seems to be all about how the world matrix gets updated relatively to the object hierarchy, on the order the models are loaded and added, and so on.
I've been trying to add and remove calls to updateMatrix and updateMatrixWorld everywhere, so I'm asking for guidelines on how the methods updateMatrix, updateMatrixWorld, localToWorld, and worldToLocal are supposed to be used.

Three.js Merge objects and textures

My question is related to this article:
http://blog.wolfire.com/2009/06/how-to-project-decals/
If my understanding is correct, a mesh made from the intersection of the original mesh and a cube is added to the scene to make a decal appear.
I need to save the final texture. So I was wondering if there is a way to 'merge' the texture of the original mesh and the added decal mesh?
You'd need to do some tricky stuff to convert from the model geometry space into UV coordinate space so you could draw the new pixels into the texture map. If you want to be able to use more than one material that way, you'd probably also need to implement some kind of "material map", similar to how some deferred rendering systems work. Otherwise you're limited to at most one material per face, which wouldn't work for detailed decals with alpha.
I guess you could copy the UV coordinates from the original mesh into the decal mesh, and then use that information to reproject the decal texture into the original texture.
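The geometry-space-to-UV-space conversion mentioned above can be sketched with plain barycentric interpolation (hypothetical helpers, not three.js API): a point's barycentric weights on a triangle in model space are the same weights used to blend the triangle's vertex UVs.

```javascript
// Barycentric weights of point p on triangle (a, b, c); arguments are
// plain {x, y, z} objects and p is assumed to lie in the triangle's plane.
function barycentric(p, a, b, c) {
  const v0 = { x: b.x - a.x, y: b.y - a.y, z: b.z - a.z };
  const v1 = { x: c.x - a.x, y: c.y - a.y, z: c.z - a.z };
  const v2 = { x: p.x - a.x, y: p.y - a.y, z: p.z - a.z };
  const dot = (u, v) => u.x * v.x + u.y * v.y + u.z * v.z;
  const d00 = dot(v0, v0), d01 = dot(v0, v1), d11 = dot(v1, v1);
  const d20 = dot(v2, v0), d21 = dot(v2, v1);
  const denom = d00 * d11 - d01 * d01;
  const v = (d11 * d20 - d01 * d21) / denom;
  const w = (d00 * d21 - d01 * d20) / denom;
  return [1 - v - w, v, w];
}

// Reuse the weights to interpolate the UVs of the three vertices.
function pointToUV(p, a, b, c, uvA, uvB, uvC) {
  const [u, v, w] = barycentric(p, a, b, c);
  return {
    x: u * uvA.x + v * uvB.x + w * uvC.x,
    y: u * uvA.y + v * uvB.y + w * uvC.y,
  };
}
```

The resulting UV tells you which texel of the original texture a given decal point lands on, which is exactly the information needed to draw the decal pixels into the texture map.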

How to apply texture to mesh without specifying UV's in geometry using three.js?

Is it possible to apply texture to mesh without specifying UV's in geometry in three.js ?
There are classes such as THREE.CubeGeometry, THREE.SphereGeometry, etc. that automatically generate the UV coordinates for you. However, if you are creating your own geometry from scratch (i.e., specifying vertex locations, creating faces, etc.) then the answer is no. Either you need to set the UV coordinates manually when creating the geometry, or you need to write a custom shader which determines the UV coordinates for any given point. Think about it this way: if you don't specify UV coordinates, the points on your geometry have no idea which point on your texture they should display.
