I use ThreeJS r68.
I have always used THREE.Geometry for my project and it works fine.
Now I want to switch from THREE.Geometry to THREE.BufferGeometry because I read that it is the better choice.
But I couldn't get SmoothShading to work with my THREE.BufferGeometry.
I load my object into a BufferGeometry and call bufferGeometry.computeVertexNormals(). The result is FlatShading.
I read in the computeVertexNormals() method that BufferGeometry calculates normals differently if an "index" attribute is present. I tried to create an indexed BufferGeometry, but that just made everything worse. I'm not sure I created it correctly: I just added the indices the way I would add them to the faces of a normal Geometry. The BufferGeometry.fromGeometry() method does not create an indexed BufferGeometry, so I don't know where else to look.
Do I need an indexed BufferGeometry for SmoothShading?
UPDATE
[... some time later....]
I think I can create an indexed THREE.BufferGeometry now. It's more like Geometry, and SmoothShading looks fine with an indexed BufferGeometry. So now I have SmoothShading but an invalid uv-map. But why is the uv-map different in an indexed BufferGeometry compared to a non-indexed one? BufferGeometry is really not easy to load.
OK.
Here is what I got:
1.) SmoothShading only works for an indexed THREE.BufferGeometry (as far as I know), not for a non-indexed BufferGeometry.
2.) An indexed THREE.BufferGeometry has only 1 uv point per vertex, not 1 uv point per face-vertex.
That means if you have a square with 4 points, you only have 4 uv points and not 6 like in THREE.Geometry or a non-indexed BufferGeometry. (That is confusing and will not allow complicated uv-maps.)
UPDATE
[... a few hours of sleep later ...]
I looked into THREE.BufferGeometry.computeVertexNormals() again.
And I have to correct myself.
indexed THREE.BufferGeometry:
1) only 1 uv per vertex
2) only 1 normal per vertex
result :
- only SmoothShading possible.
- only simple uv-maps.
- limit of 65,535 vertices.
non indexed THREE.BufferGeometry:
1) 1 uv per face vertex
2) 1 normal per face vertex
result:
- calculating normals in ThreeJS (r68): only FlatShading
- calculating normals outside of ThreeJS and importing them: FlatShading and SmoothShading
- complicated uv maps possible
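To make the difference concrete, here is a minimal sketch of the same square built both ways, using the r68-era addAttribute API (the vertex data is illustrative):
// Indexed: 4 shared vertices, 6 indices; one uv and one normal per vertex.
var indexed = new THREE.BufferGeometry();
indexed.addAttribute( 'position', new THREE.BufferAttribute( new Float32Array( [
    0, 0, 0,   1, 0, 0,   1, 1, 0,   0, 1, 0
] ), 3 ) );
indexed.addAttribute( 'uv', new THREE.BufferAttribute( new Float32Array( [
    0, 0,   1, 0,   1, 1,   0, 1
] ), 2 ) );
indexed.addAttribute( 'index', new THREE.BufferAttribute( new Uint16Array( [
    0, 1, 2,   0, 2, 3
] ), 1 ) );
indexed.computeVertexNormals(); // normals averaged over shared vertices -> SmoothShading

// Non-indexed: every face vertex is duplicated (6 in total);
// one uv and one normal per face vertex.
var nonIndexed = new THREE.BufferGeometry();
nonIndexed.addAttribute( 'position', new THREE.BufferAttribute( new Float32Array( [
    0, 0, 0,   1, 0, 0,   1, 1, 0,
    0, 0, 0,   1, 1, 0,   0, 1, 0
] ), 3 ) );
nonIndexed.computeVertexNormals(); // normals computed per face -> FlatShading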
You can apply THREE.FlatShading to your material to get a flat shaded indexed THREE.BufferGeometry. In that case you don't need to define any normals at all.
This saves you a lot of headaches and overhead:
geometry = new THREE.BufferGeometry(); // no normal attribute defined
material = new THREE.MeshPhongMaterial({
    color: 0xff0000,
    shading: THREE.FlatShading
});
mesh = new THREE.Mesh( geometry, material );
Your mesh will render flat shaded.
This doesn't work for THREE.MeshLambertMaterial yet, but they are working on it. Check the related issue on GitHub.
Related
For a large number of objects (graph nodes in my case) which all have the same geometry but different colors, what is more efficient:
1) creating a mesh and then cloning the mesh and its material (with mesh.traverse) for each node
2) creating a new mesh with a cloned material for each node
The only difference between object creation via new and .clone() is that the latter also copies all properties from the source object. If you don't need this, it's better to just create your meshes via the new operator.
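As an illustrative sketch (sharedGeometry and baseMaterial are hypothetical names), note that mesh.clone() shares the material reference, so you end up cloning the material either way:
var nodeA = new THREE.Mesh( sharedGeometry, baseMaterial.clone() ); // via new
var nodeB = nodeA.clone();                 // also copies position, rotation, scale, ...
nodeB.material = baseMaterial.clone();     // mesh.clone() shares the material by default
nodeB.material.color.set( 0x00ff00 );      // per-node color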
Since you have a large number of objects, you might want to consider using instanced rendering in order to reduce the number of draw calls in your application. Right now, you have to draw each node separately. With instanced rendering, the nodes are all drawn at once. There are a couple of examples that demonstrate instanced rendering with three.js:
https://threejs.org/examples/?q=instancing
Using instanced rendering is not a must, but it can be helpful if you run into performance issues. Another option, which is easier to implement but less flexible, is to merge all meshes into a single large geometry and use vertex colors. You will have a single geometry, material and mesh object, but an additional vertex color attribute will ensure that your nodes are colored differently. A sketch of this merging approach follows.
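A minimal sketch of that merging approach (r103-era API; it assumes the BufferGeometryUtils helper from the three.js examples is loaded, and the node colors and positions are made up):
var nodeColors = [ 0xff0000, 0x00ff00, 0x0000ff ]; // one entry per node
var geometries = nodeColors.map( function ( hex, i ) {
    var g = new THREE.BoxBufferGeometry( 1, 1, 1 ); // shared node shape
    g.translate( i * 2, 0, 0 );                     // hypothetical node position
    var color = new THREE.Color( hex );
    var colors = new Float32Array( g.attributes.position.count * 3 );
    for ( var j = 0; j < colors.length; j += 3 ) color.toArray( colors, j );
    g.addAttribute( 'color', new THREE.BufferAttribute( colors, 3 ) );
    return g;
} );
var merged = THREE.BufferGeometryUtils.mergeBufferGeometries( geometries );
var material = new THREE.MeshPhongMaterial( { vertexColors: THREE.VertexColors } );
scene.add( new THREE.Mesh( merged, material ) ); // all nodes in a single draw call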
three.js R103
I'm new to three.js and have researched the main topics such as camera, renderer, scene and geometry. Coming to geometry, there are both plain and buffer variants (say, ConeGeometry and ConeBufferGeometry) whose features look the same. So what's the difference between geometry and buffer geometry? Does it influence performance in any way?
The difference is essentially in the underlying data structures (how the geometry stores and handles vertices, faces, etc. in memory).
For learning purposes you should not care about it; just use ConeGeometry until you come across performance issues. Then come back to the topic; next time you will be better prepared to grasp the difference between the two.
Please check BufferGeometry:
An efficient representation of mesh, line, or point geometry. Includes vertex positions, face indices, normals, colors, UVs, and custom attributes within buffers, reducing the cost of passing all this data to the GPU.
To read and edit data in BufferGeometry attributes, see BufferAttribute documentation.
For a less efficient but easier-to-use representation of geometry, see Geometry.
And on the other side, Geometry:
Geometry is a user-friendly alternative to BufferGeometry. Geometries store attributes (vertex positions, faces, colors, etc.) using objects like Vector3 or Color that are easier to read and edit, but less efficient than typed arrays.
Prefer BufferGeometry for large or serious projects.
BufferGeometry performance explained here: why-is-the-geometry-faster-than-buffergeometry
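To make the docs' distinction concrete, here is an illustrative sketch (pre-r125 API) of making the same edit in both representations:
// Plain Geometry: vertices are an array of Vector3 objects.
var geo = new THREE.BoxGeometry( 1, 1, 1 );
geo.vertices[ 0 ].x = 2;          // easy to read and edit
geo.verticesNeedUpdate = true;

// BufferGeometry: positions live in one flat typed array.
var bufGeo = new THREE.BoxBufferGeometry( 1, 1, 1 );
var pos = bufGeo.attributes.position;
pos.setX( 0, 2 );                 // same edit through a BufferAttribute
pos.needsUpdate = true;           // flag the attribute for re-upload to the GPU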
From 2021
This is now a moot point; Geometry was removed from three.js in r125.
The *BufferGeometry names are now just aliases for the *Geometry classes, which are all based on BufferGeometry; source here.
export { BoxGeometry, BoxGeometry as BoxBufferGeometry };
Geometry is converted to BufferGeometry in the end, so if you don't have any performance issues, stick to Geometry if it's convenient for you.
Here you can see that ConeGeometry calls CylinderGeometry constructor.
CylinderGeometry.call( this, 0, radius, height, radialSegments, heightSegments, openEnded, thetaStart, thetaLength );
https://github.com/mrdoob/three.js/blob/dev/src/geometries/ConeGeometry.js
Then CylinderGeometry is created using CylinderBufferGeometry.
this.fromBufferGeometry( new CylinderBufferGeometry( radiusTop, radiusBottom, height, radialSegments, heightSegments, openEnded, thetaStart, thetaLength ) );
https://github.com/mrdoob/three.js/blob/dev/src/geometries/CylinderGeometry.js
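In practice (pre-r125 API) that means the plain Geometry you get back was built from a BufferGeometry internally, and you can convert it back explicitly at any time, as in this small sketch:
var cone = new THREE.ConeGeometry( 1, 2, 16 );                  // plain THREE.Geometry
var asBuffer = new THREE.BufferGeometry().fromGeometry( cone ); // explicit conversion
console.log( cone.vertices.length, asBuffer.attributes.position.count );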
I've been struggling with this one for hours, and found nothing either in the docs or here on SO that would point me in the right direction to achieve what I aim at.
I'm loading a scene containing several meshes. The first one is used as an actual mesh, rendered on the scene, the other ones are just used as morph targets (their geometries, properly speaking).
loader.load("scene.json", function (loadedScene) {
    camera.lookAt( scene.position );
    var basis = loadedScene.getObjectByName( "Main" ).geometry;
    var firstTarget = loadedScene.getObjectByName( "Target1" ).geometry;
    // and so on for the rest of the "target" meshes
    basis.morphTargets[0] = { name: 'fstTarget', vertices: firstTarget.vertices };
    var MAIN = new THREE.Mesh( basis );
});
This works very well, and I can morph the MAIN mesh with no hassle by playing with the influence values. The differences between the basis mesh and the targets are not huge; basically they're just XY adjustments (2D shape variations).
Now, I'm using a textured material: the UVs are properly projected (Blender export) and the result is nice with the MAIN mesh as-is.
The problem comes when the basis shape is morphed towards a target geometry: as expected, the texture (UVs) adapts automatically, but this is not what I need to achieve. I'd need the UVs to "morph" towards the morph target's UVs for the texture to look the same.
Here is an example of what I have now (left: basis mesh, right: morphTargetInfluences = 1 for the first morph target)
morph target and texture
What I'd like to have is the exact same texture projection on the final, morphed mesh...
I can't figure out how to do that the right way. Should I reassign the target UVs to the MAIN mesh (and how would I do that)?
The result would be like having a cloth below which a shape is morphed, with the cloth being "shrink-wrapped" against that underlying shape all the time: you can actually see the shape change, but the cloth itself is not deformed, just wrapping itself properly and consistently around the shape.
Any help would be much appreciated! Thanks in advance :)
I want to see a chart with a color specified per vertex and get a little bit of shading too.
But if I use MeshBasicMaterial I only get the vertex colors with no dynamic shading.
On the other hand, if I use MeshPhongMaterial I just get shading, but without the emissiveness from my vertex colors.
As THREE.MeshPhongMaterial supports vertexColors, giving you a nice combination of dynamic lighting and vertex colors, I'm not quite sure I understand your question. Perhaps that is something you should investigate more?
However, as an alternative to writing a custom shader you could try rendering your model in multiple passes.
This will not give you as much control over the way the vertex colors and phong lighting are combined as a shader would, but often a simple add/multiply blend can give pretty decent results.
Algorithm:
- create two meshes for the BufferGeometry, one with the MeshBasicMaterial and one with the MeshPhongMaterial
- for the PhongMaterial, set
  depthFunc = THREE.EqualDepth
  transparent = true
  blending = THREE.AdditiveBlending (or THREE.MultiplyBlending)
- render the first mesh
- render the second mesh at the exact same spot
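A minimal sketch of those two passes (geometry and scene are assumed to exist; the blending choice is up to you):
var baseMesh = new THREE.Mesh( geometry, new THREE.MeshBasicMaterial( {
    vertexColors: THREE.VertexColors // first pass: unlit vertex colors
} ) );
var phongMaterial = new THREE.MeshPhongMaterial();
phongMaterial.depthFunc = THREE.EqualDepth;      // only shade pixels the first pass wrote
phongMaterial.transparent = true;
phongMaterial.blending = THREE.AdditiveBlending; // or THREE.MultiplyBlending
var phongMesh = new THREE.Mesh( geometry, phongMaterial );
phongMesh.renderOrder = 1;  // make sure the lighting pass is drawn second
scene.add( baseMesh );
scene.add( phongMesh );     // same geometry, same spot: the passes overlap exactly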
I am building quite a complex 3D environment in Three.js (an FPS-like game). For this purpose I wanted to structure the loading of textures and materials in an object-oriented way. For example, materials.wood.brownplank is a reusable material with a certain texture and other properties. Below is a simplified visualisation of the process, where models use materials and materials use textures.
loadTextures();
loadMaterials();
loadModels();
//start doing stuff in the scene
I want to use that material on differently sized objects. However, in Three.js you can't (AFAIK) set a certain texture scale; you have to set the repeat to scale it appropriately for your object. But I don't want to do that for every plane of every object I use.
Here is how it looks now
As you can see, the textures are not uniform in size.
Is there an easy way to achieve this? Cloning the texture and/or material every time and setting the repeat according to the geometry won't do :)
I hope someone can help me.
Conclusion:
There is no real easy way to do this. I ended up changing my loading methods: things like materials.wood.brownplank are now, for example, getMaterial('wood', 'brownplank'). In that function, new objects are instantiated.
You should be able to do this by modifying your geometry UV coordinates according to the "real" dimensions of each face.
In Three.js, UV coordinates are relative to the face and texture (as in, 0.0 = one edge, 1.0 = other edge), no matter what the actual size of texture or face is. But by modifying the UVs in geometry (multiply them by some factor based on face physical size), you can use the same material and texture in different sizes (and orientations) per face.
You just need to figure out the mapping between UVs, geometry scale and your desired working units (e.g. mm or m). Sorry, I don't have or know a ready algorithm to do it, but that's the approach you probably need to take; it should be quite doable with a bit of experimentation and google-fu. A rough sketch of the idea follows.
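Here is a rough sketch of that idea (it assumes a classic THREE.Geometry, a texture with wrapS/wrapT set to THREE.RepeatWrapping, and non-degenerate UVs per face; fitUVsToWorldSize and tileSize are made-up names):
function fitUVsToWorldSize( geometry, tileSize ) {
    // tileSize = how many world units one texture tile should cover
    geometry.faceVertexUvs[ 0 ].forEach( function ( uvs, i ) {
        var face = geometry.faces[ i ];
        var a = geometry.vertices[ face.a ];
        var b = geometry.vertices[ face.b ];
        // world-space length of one UV unit, measured along the edge a-b
        var worldPerUv = a.distanceTo( b ) / uvs[ 0 ].distanceTo( uvs[ 1 ] );
        uvs.forEach( function ( uv ) { uv.multiplyScalar( worldPerUv / tileSize ); } );
    } );
    geometry.uvsNeedUpdate = true;
}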