Custom UVGenerator - howto? - three.js
I have a use case in which users can specify a texture that is then mapped onto an object across multiple faces. So unfortunately I cannot use standard UV mapping as I would in Blender, and I want to write my own custom UVGenerator to create my own projection.
So I am looking for detailed info on how to do it.
I know that THREE.ExtrudeGeometry.WorldUVGenerator, located in src/extras/geometries/ExtrudeGeometry.js, can serve as an example.
In there I found two methods, and I am not sure how they work together. The first is generateTopUV, which takes three vertex indices as parameters and expects me to return three pairs of u/v values, so what it does is pretty straightforward:
generateTopUV: function ( geometry, indexA, indexB, indexC )
The second is odd to me since it takes four vertex indices, and I wonder why; the name doesn't really help me either. I hope somebody can shed some light on this.
Its code in WorldUVGenerator is this:
generateSideWallUV: function ( geometry, indexA, indexB, indexC, indexD ) {

    var vertices = geometry.vertices;

    var a = vertices[ indexA ];
    var b = vertices[ indexB ];
    var c = vertices[ indexC ];
    var d = vertices[ indexD ];

    if ( Math.abs( a.y - b.y ) < 0.01 ) {

        return [
            new THREE.Vector2( a.x, 1 - a.z ),
            new THREE.Vector2( b.x, 1 - b.z ),
            new THREE.Vector2( c.x, 1 - c.z ),
            new THREE.Vector2( d.x, 1 - d.z )
        ];

    } else {

        return [
            new THREE.Vector2( a.y, 1 - a.z ),
            new THREE.Vector2( b.y, 1 - b.z ),
            new THREE.Vector2( c.y, 1 - c.z ),
            new THREE.Vector2( d.y, 1 - d.z )
        ];

    }

}
Cheers Tom
In the end I solved this by setting geometry.faceVertexUvs to an array of arrays, where each inner array contains three Vector2s giving the uv values for one face.
Works nicely :-) and there is no need to handle 4 vertices.
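For reference, here is a minimal sketch of that approach, assuming the legacy THREE.Geometry API used in this thread; the box geometry and the simple planar x/y projection are placeholders for illustration only:

// Hypothetical example: one array of three THREE.Vector2 per face,
// here projected from the bounding box onto the x/y plane.
var geometry = new THREE.BoxGeometry( 1, 1, 1 ); // placeholder geometry

geometry.computeBoundingBox();
var min = geometry.boundingBox.min;
var size = new THREE.Vector3().subVectors( geometry.boundingBox.max, min );

geometry.faceVertexUvs[ 0 ] = geometry.faces.map( function ( face ) {

    return [ face.a, face.b, face.c ].map( function ( index ) {

        var v = geometry.vertices[ index ];
        return new THREE.Vector2( ( v.x - min.x ) / size.x, ( v.y - min.y ) / size.y );

    } );

} );

geometry.uvsNeedUpdate = true;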
The three vertices passed to the top and bottom UV generation together define one triangle.
The side wall is (separately) created as a string of rectangles, each of which is subsequently split into two coplanar triangles. Those 4 vertices define one rectangle.
If a bevel is requested, it is part of this rectangle process.
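Putting that together, a custom UVGenerator is just an object with those two methods. Here is a rough skeleton matching the legacy signatures quoted above (the planar projections and the shape variable are placeholders, not the actual mapping you would use):

// Sketch of a custom UVGenerator for THREE.ExtrudeGeometry (legacy Geometry API).
var myUVGenerator = {

    // three indices -> one triangle of a top or bottom cap
    generateTopUV: function ( geometry, indexA, indexB, indexC ) {

        var v = geometry.vertices;
        return [ indexA, indexB, indexC ].map( function ( i ) {
            return new THREE.Vector2( v[ i ].x, v[ i ].y ); // placeholder planar projection
        } );

    },

    // four indices -> one side-wall rectangle (split into two triangles internally)
    generateSideWallUV: function ( geometry, indexA, indexB, indexC, indexD ) {

        var v = geometry.vertices;
        return [ indexA, indexB, indexC, indexD ].map( function ( i ) {
            return new THREE.Vector2( v[ i ].x, v[ i ].z ); // placeholder planar projection
        } );

    }

};

// Usage: pass it in the extrude options.
var extrudeGeometry = new THREE.ExtrudeGeometry( shape, { UVGenerator: myUVGenerator } );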
Related
Three.js: how to combine several indices & vector arrays to one
I am trying to visualize a grand strategy (EU4, CK3, HOI) style map in Three.js. I started by creating a mesh for every cell, and the results are fine (screenshots 1 & 2).
Separate mesh approach - simple land / water differentiation
Separate mesh approach - random cell color
However, with a lot of cells, performance becomes an issue (I am getting 15 fps with 10k cells). In order to improve performance I would like to combine all these separate indices & vertex arrays into 2 big arrays, which will then be used to create a single mesh. I am looping through all my cells to push their indices, vertices & colors into the big arrays like so:

addCellGeometryToMapGeometry(cell) {
    let startIndex = this.mapVertices.length;
    let cellIndices = cell.indices.length;
    let cellVertices = cell.vertices.length;

    let color = new THREE.Color( Math.random(), Math.random(), Math.random() );

    for (let i = 0; i < cellIndices; i++) {
        this.mapIndices.push(startIndex + cell.indices[i]);
    }

    for (let i = 0; i < cellVertices; i++) {
        this.mapVertices.push(cell.vertices[i]);
        this.mapColors.push(color);
    }
}

I then generate the combined mesh:

generateMapMesh() {
    let geometry = new THREE.BufferGeometry();
    const material = new THREE.MeshPhongMaterial( { side: THREE.DoubleSide, flatShading: true, vertexColors: true, shininess: 0 } );

    geometry.setIndex( this.mapIndices );
    geometry.setAttribute( 'position', new THREE.Float32BufferAttribute( this.mapVertices, 3 ) );
    geometry.setAttribute( 'color', new THREE.Float32BufferAttribute( new Float32Array(this.mapColors.length), 3 ) );

    for ( let i = 0; i < this.mapColors.length; i ++ ) {
        geometry.attributes.color.setXYZ(i, this.mapColors[i].r, this.mapColors[i].g, this.mapColors[i].b);
    }

    return new THREE.Mesh( geometry, material );
}

Unfortunately the results are underwhelming: while the data in the combined arrays looks okay, only every third cell is rendered, and in some cases the indices seem to get mixed up too.
Combined approach - random cell colors
In other similar topics it is recommended to merge existing meshes. However, I figured that my approach should let me better understand what is actually happening and potentially save on performance as well. Does my code have obvious flaws that I cannot see? Or am I generally on the wrong path, and if so, how should it be done instead?
I actually found the issue in my code.
Wrong:

let startIndex = this.mapVertices.length;

The issue here is that the values in the indices array always reference a vertex, which consists of 3 consecutive array entries in the vertices array.
Correct:

let startIndex = this.mapVertices.length / 3;

Additionally, I should only push one color per vertex instead of one per vertex array entry (= 1 per coordinate), but make sure that the array length of the geometry color attribute stays as it is.
With these 2 changes, the result for the combined mesh looks exactly the same as when creating a separate mesh for every cell, and the performance improvement is impressive:
separate meshes: 60 - 65 ms needed to render a frame, 144 MB allocated memory
combined mesh: 0 - 1 ms needed to render a frame, 58 MB allocated memory
Here are the fixed snippets:

addCellGeometryToMapGeometry(cell) {
    let startIndex = this.mapVertices.length / 3;
    let cellIndices = cell.indices.length;
    let cellVertices = cell.vertices.length;

    console.log('Vertex -- maplength: ' + startIndex + ' celllength: ' + cellVertices);
    console.log('Indices -- maplength: ' + this.mapIndices.length + ' celllength: ' + cellIndices);
    console.log({cell});

    let color = new THREE.Color( Math.random(), Math.random(), Math.random() );

    for (let i = 0; i < cellIndices; i++) {
        this.mapIndices.push(startIndex + cell.indices[i]);
    }

    for (let i = 0; i < cellVertices; i++) {
        this.mapVertices.push(cell.vertices[i]);
        if (i % 3 === 0) {
            this.mapColors.push(color);
        }
    }
}

generateMapMesh() {
    let geometry = new THREE.BufferGeometry();
    const material = new THREE.MeshPhongMaterial( { side: THREE.DoubleSide, flatShading: true, vertexColors: true, shininess: 0 } );

    geometry.setIndex( this.mapIndices );
    geometry.setAttribute( 'position', new THREE.Float32BufferAttribute( this.mapVertices, 3 ) );
    geometry.setAttribute( 'color', new THREE.Float32BufferAttribute( new Float32Array(this.mapVertices.length), 3 ) );

    for ( let i = 0; i < this.mapColors.length; i ++ ) {
        geometry.attributes.color.setXYZ(i, this.mapColors[i].r, this.mapColors[i].g, this.mapColors[i].b);
    }

    return new THREE.Mesh( geometry, material );
}
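As an aside, the merging approach mentioned in the question also works and avoids the index bookkeeping entirely. A minimal sketch, assuming the BufferGeometryUtils helper from the three.js examples (the import path and function name have changed across releases, so check your version):

// Sketch: merge per-cell BufferGeometries into a single mesh / draw call.
// Assumes every cell geometry is indexed and has the same attributes (position, color).
import * as THREE from 'three';
import { mergeBufferGeometries } from 'three/examples/jsm/utils/BufferGeometryUtils.js';

function buildMapMesh( cellGeometries, material ) {

    // second argument false = no per-cell groups, one draw call
    const merged = mergeBufferGeometries( cellGeometries, false );

    return new THREE.Mesh( merged, material );
}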
How to morphTarget of an .obj file (BufferGeometry)
I'm trying to morph the vertices of a loaded .obj file like in this example: https://threejs.org/docs/#api/materials/MeshDepthMaterial (when 'wireframe' and 'morphTargets' are activated in THREE.MeshDepthMaterial), but I can't achieve the desired effect. In the above example the geometry can be morphed via

geometry.morphTargets.push( { name: 'target1', vertices: vertices } );

however, it seems that morphTargets is not available for my loaded 3D object, as it is a BufferGeometry. Instead I tried to change each vertex independently through myMesh.child.child.geometry.attributes.position.array[i]; it kind of works (the vertices of my mesh are moving), but not as well as in the above example. Here is a Codepen of what I could do. How can I achieve the desired effect on my loaded .obj file?
Adding morph targets to THREE.BufferGeometry is a bit different than it is with THREE.Geometry. Example:

// after loading the mesh:
var morphAttributes = mesh.geometry.morphAttributes;
morphAttributes.position = [];

mesh.material.morphTargets = true;

var position = mesh.geometry.attributes.position.clone();

for ( var j = 0, jl = position.count; j < jl; j ++ ) {

    position.setXYZ(
        j,
        position.getX( j ) * 2 * Math.random(),
        position.getY( j ) * 2 * Math.random(),
        position.getZ( j ) * 2 * Math.random()
    );

}

morphAttributes.position.push( position ); // I forgot this earlier.

mesh.updateMorphTargets();
mesh.morphTargetInfluences[ 0 ] = 0;

// later, in your render() loop:
mesh.morphTargetInfluences[ 0 ] += 0.001;

three.js r90
Programmatically generate simple UV Mapping for models
Coming from this question, I'm trying to generate UV mappings programmatically with Three.js for some models. I need this because my models are being generated programmatically too, and I need to apply a simple texture to them. I have read here and successfully generated a UV mapping for some simple 3D text, but when applying the same mapping to more complex models it just doesn't work.
The texture I'm trying to apply is something like this (the black background is just transparent in the PNG image).
I need to apply this to my models; it's just a glitter effect, so I don't care about the exact position on the model. Is there any way to create a simple UV map programmatically for these cases?
I'm using this code from the linked question, which works great for planar models but doesn't work for non-planar models:

assignUVs = function( geometry ) {

    geometry.computeBoundingBox();

    var max = geometry.boundingBox.max;
    var min = geometry.boundingBox.min;

    var offset = new THREE.Vector2( 0 - min.x, 0 - min.y );
    var range = new THREE.Vector2( max.x - min.x, max.y - min.y );

    geometry.faceVertexUvs[0] = [];
    var faces = geometry.faces;

    for ( i = 0; i < geometry.faces.length; i++ ) {

        var v1 = geometry.vertices[ faces[i].a ];
        var v2 = geometry.vertices[ faces[i].b ];
        var v3 = geometry.vertices[ faces[i].c ];

        geometry.faceVertexUvs[0].push([
            new THREE.Vector2( ( v1.x + offset.x ) / range.x, ( v1.y + offset.y ) / range.y ),
            new THREE.Vector2( ( v2.x + offset.x ) / range.x, ( v2.y + offset.y ) / range.y ),
            new THREE.Vector2( ( v3.x + offset.x ) / range.x, ( v3.y + offset.y ) / range.y )
        ]);

    }

    geometry.uvsNeedUpdate = true;

}
You need to be more specific. Here, I'll apply UV mapping programmatically:

for ( i = 0; i < geometry.faces.length; i++ ) {
    geometry.faceVertexUvs[0].push([
        new THREE.Vector2( 0, 0 ),
        new THREE.Vector2( 0, 0 ),
        new THREE.Vector2( 0, 0 )
    ]);
}

Happy? There are an infinite number of ways to apply UV coordinates. How about this:

for ( i = 0; i < geometry.faces.length; i++ ) {
    geometry.faceVertexUvs[0].push([
        new THREE.Vector2( Math.random(), Math.random() ),
        new THREE.Vector2( Math.random(), Math.random() ),
        new THREE.Vector2( Math.random(), Math.random() )
    ]);
}

There's no RIGHT answer; whatever you want to do is up to you. It's kind of like asking how to apply pencil to paper. Sorry to be so snarky, I'm just pointing out that, in one sense, the question is nonsensical. Anyway, there are a few common methods for applying a texture.
Spherical mapping
Imagine your model is translucent, there's a sphere made of film inside it, and inside the sphere is a point light, so that it projects (like a movie projector) from the sphere in all directions. You do the math to compute the correct UVs for that situation. To get a point on the sphere, multiply your points by the inverse of the sphere's world matrix, then normalize the result.
After that, though, there's still the problem of how the texture itself is mapped to the imaginary sphere, for which again there are an infinite number of ways. The simplest is, I guess, the Mercator projection, which is how most 2D maps of the world work. It has the problem that lots of space is wasted at the north and south poles. Assuming x, y, z are the normalized coordinates mentioned in the previous paragraph, then

U = Math.atan2(z, x) / Math.PI * 0.5 - 0.5;
V = 0.5 - Math.asin(y) / Math.PI;

Projection mapping
This is just like a movie: you have a 2D image being projected from a point. Imagine you pointed a movie projector (or a projection TV) at a chair. Computing these points is exactly like computing the 2D image from 3D data, which nearly all WebGL apps do. Usually they have a line in their vertex shader like this:

gl_Position = matrix * position;   // where matrix = worldViewProjection

You can then do

clipSpace = gl_Position.xy / gl_Position.w

You now have x, y values that go from -1 to +1. You then convert them to 0 to 1 for UV coords:

uv = clipSpace * 0.5 + 0.5;

Of course, normally you'd compute the UV coordinates at init time in JavaScript, but the concept is the same.
Planar mapping
This is almost the same as projection mapping, except the projector, instead of being a point, is the same size as you want the projection to be. In other words, with projection mapping the projected picture gets smaller as you move your model closer to the projector, but with planar mapping it doesn't. Following the projection mapping example, the only difference here is using an orthographic projection instead of a perspective projection.
Cube mapping?
This is effectively planar mapping from 6 directions. It's up to you to decide which UV coordinates get which of the 6 planes. I'd guess most of the time you'd take the normal of the triangle to see which plane it most faces, then do planar mapping from that plane. Actually, I might be getting my terms mixed up. You can also do real cube mapping where you have a cube texture, but that requires U, V, W instead of just U, V. For that it's the same as the sphere example, except you use the normalized coordinates directly as U, V, W.
Cylindrical mapping
This is like sphere mapping, except it assumes a tiny cylinder projecting onto your model. Unlike a sphere, a cylinder has an orientation, but basically you move the points of the model into the orientation of the cylinder. Then, assuming x, y, z are now relative to the cylinder (in other words, you multiplied them by the inverse of the matrix that represents the orientation of the cylinder),

U = Math.atan2(x, z) / Math.PI * 0.5 + 0.5
V = y

2 more solutions
Maybe you want environment mapping? Here's one example and here's another. Maybe you should also consider using a modeling package like Maya or Blender, which have UV editors and UV projectors built in.
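To make the spherical-mapping recipe concrete, here is a minimal sketch that assigns Mercator-style UVs per face vertex, assuming the legacy THREE.Geometry API used in the question; the bounding-box center stands in for an explicit projection sphere, and the U term is shifted into the 0..1 range:

// Sketch: spherical (Mercator-style) UV projection for a legacy THREE.Geometry.
// The projection center is assumed to be the bounding-box center.
function assignSphericalUVs( geometry ) {

    geometry.computeBoundingBox();
    var center = geometry.boundingBox.getCenter( new THREE.Vector3() );

    geometry.faceVertexUvs[ 0 ] = geometry.faces.map( function ( face ) {

        return [ face.a, face.b, face.c ].map( function ( index ) {

            // direction from the projection center to the vertex
            var dir = geometry.vertices[ index ].clone().sub( center ).normalize();

            var u = Math.atan2( dir.z, dir.x ) / Math.PI * 0.5 + 0.5;
            var v = 0.5 - Math.asin( dir.y ) / Math.PI;

            return new THREE.Vector2( u, v );

        } );

    } );

    geometry.uvsNeedUpdate = true;

}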
What is the meaning of skin indices and skin weights?
This is undocumented, so I'm asking here. I'm trying to animate a mesh in JavaScript. I'm using Blender->Three.js exporter because it is convenient. I can't use Three.js itself because I was not able to figure out how to solve certain problems on it (rendering the normals and depth informations of a scene with animated meshes to a buffer). So, how do you read the "skinIndices" and "skinWeights" properties that get exported from Blender to Three.js? What do they mean, and what are they roles when calculating the position of the vertices on the animations? "bones" : [ {"parent":-1,"name":"pelvis","pos":[-3.52132e-08,0.0410043,0.880063],"rotq":[0,0,0,1]}, {"parent":0,"name":"thigh.L","pos":[0.0878887,0.00522349,0.102822],"rotq":[0,0,0,1]}, {"parent":1,"name":"shin.L","pos":[0.103679,0.00638392,-0.445744],"rotq":[0,0,0,1]}, {"parent":2,"name":"foot.L","pos":[0.0655578,0.0194668,-0.418675],"rotq":[0,0,0,1]}, {"parent":3,"name":"toe.L","pos":[0.0280578,-0.107185,-0.0704246],"rotq":[0,0,0,1]}, {"parent":3,"name":"heel.L","pos":[3.58224e-05,0.036576,-0.0885088],"rotq":[0,0,0,1]}, {"parent":0,"name":"thigh.R","pos":[-0.0878888,0.00522352,0.102822],"rotq":[0,0,0,1]}, {"parent":6,"name":"shin.R","pos":[-0.103679,0.00638412,-0.445745],"rotq":[0,0,0,1]}, {"parent":7,"name":"foot.R","pos":[-0.0655576,0.0194677,-0.418675],"rotq":[0,0,0,1]}, {"parent":8,"name":"toe.R","pos":[-0.0280577,-0.107185,-0.0704248],"rotq":[0,0,0,1]}, {"parent":8,"name":"heel.R","pos":[-3.57926e-05,0.036576,-0.0885083],"rotq":[0,0,0,1]}, {"parent":0,"name":"stomach","pos":[5.37268e-09,-0.008465,0.121596],"rotq":[0,0,0,1]}, {"parent":11,"name":"chest","pos":[1.94616e-08,0.0538289,0.269019],"rotq":[0,0,0,1]}, {"parent":12,"name":"upper_arm.L","pos":[0.160045,-0.010388,0.159844],"rotq":[0,0,0,1]}, {"parent":13,"name":"forearm.L","pos":[0.165089,0.0102809,-0.232678],"rotq":[0,0,0,1]}, {"parent":14,"name":"hand.L","pos":[0.0980782,-0.0148839,-0.245313],"rotq":[0,0,0,1]}, {"parent":15,"name":"index.L.001","pos":[0.019191,-0.040475,-0.0743723],"rotq":[0,0,0,1]}, {"parent":16,"name":"index.L.002","pos":[-0.00562334,-0.00824448,-0.0310695],"rotq":[0,0,0,1]}, {"parent":17,"name":"index.L.003","pos":[-0.00953785,-0.00126594,-0.0192741],"rotq":[0,0,0,1]}, {"parent":15,"name":"middle.L.001","pos":[0.0191911,-0.0188201,-0.0769786],"rotq":[0,0,0,1]}, {"parent":19,"name":"middle.L.002","pos":[0.00288424,-0.00695575,-0.0326532],"rotq":[0,0,0,1]}, {"parent":20,"name":"middle.L.003","pos":[-0.0111618,-0.00550338,-0.0242877],"rotq":[0,0,0,1]}, {"parent":15,"name":"ring.L.001","pos":[0.0186397,0.00194495,-0.0777299],"rotq":[0,0,0,1]}, {"parent":22,"name":"ring.L.002","pos":[0.00393239,-0.00062982,-0.0309386],"rotq":[0,0,0,1]}, {"parent":23,"name":"ring.L.003","pos":[-0.00873661,-0.00165674,-0.024165],"rotq":[0,0,0,1]}, {"parent":15,"name":"pinky.L.001","pos":[0.0191911,0.02271,-0.0758559],"rotq":[0,0,0,1]}, {"parent":25,"name":"pinky.L.002","pos":[-0.0057596,0.0014303,-0.0236881],"rotq":[0,0,0,1]}, {"parent":26,"name":"pinky.L.003","pos":[-0.00877053,-0.0020119,-0.0195478],"rotq":[0,0,0,1]}, {"parent":15,"name":"thumb.L.001","pos":[-0.0073517,-0.0318671,-0.0156776],"rotq":[0,0,0,1]}, {"parent":28,"name":"thumb.L.002","pos":[-0.00941652,-0.0166059,-0.0179188],"rotq":[0,0,0,1]}, {"parent":29,"name":"thumb.L.003","pos":[-0.0081799,-0.0129757,-0.0276645],"rotq":[0,0,0,1]}, {"parent":12,"name":"upper_arm.R","pos":[-0.160044,-0.010388,0.159844],"rotq":[0,0,0,1]}, {"parent":31,"name":"forearm.R","pos":[-0.165089,0.0102809,-0.232679],"rotq":[0,0,0,1]}, 
{"parent":32,"name":"hand.R","pos":[-0.0980774,-0.0148839,-0.245313],"rotq":[0,0,0,1]}, {"parent":33,"name":"index.R.001","pos":[-0.0185038,-0.0404748,-0.0743726],"rotq":[0,0,0,1]}, {"parent":34,"name":"index.R.002","pos":[0.00562337,-0.00824449,-0.0310695],"rotq":[0,0,0,1]}, {"parent":35,"name":"index.R.003","pos":[0.00953785,-0.00126596,-0.0192741],"rotq":[0,0,0,1]}, {"parent":33,"name":"middle.R.001","pos":[-0.0185038,-0.0188199,-0.0769789],"rotq":[0,0,0,1]}, {"parent":37,"name":"middle.R.002","pos":[-0.00288421,-0.00695577,-0.0326532],"rotq":[0,0,0,1]}, {"parent":38,"name":"middle.R.003","pos":[0.0111619,-0.00550339,-0.0242877],"rotq":[0,0,0,1]}, {"parent":33,"name":"ring.R.001","pos":[-0.0179525,0.00194514,-0.0777302],"rotq":[0,0,0,1]}, {"parent":40,"name":"ring.R.002","pos":[-0.00393245,-0.000629827,-0.0309386],"rotq":[0,0,0,1]}, {"parent":41,"name":"ring.R.003","pos":[0.00873658,-0.00165676,-0.024165],"rotq":[0,0,0,1]}, {"parent":33,"name":"pinky.R.001","pos":[-0.0185039,0.0227101,-0.0758562],"rotq":[0,0,0,1]}, {"parent":43,"name":"pinky.R.002","pos":[0.0057596,0.00143027,-0.0236881],"rotq":[0,0,0,1]}, {"parent":44,"name":"pinky.R.003","pos":[0.00877053,-0.00201192,-0.0195478],"rotq":[0,0,0,1]}, {"parent":33,"name":"thumb.R.001","pos":[0.00803882,-0.0318669,-0.0156779],"rotq":[0,0,0,1]}, {"parent":46,"name":"thumb.R.002","pos":[0.00941664,-0.0166059,-0.0179188],"rotq":[0,0,0,1]}, {"parent":47,"name":"thumb.R.003","pos":[0.00817987,-0.0129757,-0.0276645],"rotq":[0,0,0,1]}, {"parent":12,"name":"neck","pos":[1.6885e-08,-0.0164749,0.225555],"rotq":[0,0,0,1]}, {"parent":49,"name":"head","pos":[0.000806741,-0.0273245,0.0637051],"rotq":[0,0,0,1]}], "skinIndices" : [ 11,0,11,0,1,11,11,0,0,11,0,11,1,11,1,11,0,11,11,0,11,0,11,0,1,11,11,0,11,0,0,11,1,11,1,11,0,11,0,11,0,11,12,0,0,11,0,11, 0,11,12,0,12,0,11,0,11,0,11,0,12,0,11,0,11,0,11,0,12,0,12,11,11,0,11,0,12,13,11,0,0,11,11,0,12,13,12,13,0,11,12,13,12,0, 12,0,13,12,13,12,12,13,12,0,12,13,13,12,12,13,12,13,13,12,13,0,12,13,12,0,12,13,13,0,12,0,13,0,0,13,13,0,13,0,0,13,0,13, 13,0,13,0,0,13,13,12,13,12,13,12,13,0,13,0,13,0,13,12,13,0,13,0,13,0,13,0,13,0,13,0,13,0,13,0,13,0,13,0,13,0,13,0,13,0,13, (... too big)] "skinWeights" : [ 0.454566,0.443267,0.456435,0.4405,0.568642,0.331477,0.452697,0.446034,0.600277,0.577654,0.603738,0.578153,0.557686,0.334716, 0.579597,0.328238,0.596817,0.577156,0.481496,0.447683,0.604872,0.59171,0.466162,0.448242,0.567426,0.35812,0.49683,0.447124, 0.618979,0.590887,0.592533,0.590764,0.578989,0.341559,0.555862,0.37468,0.477411,0.438341,0.617349,0.569542,0.454728,0.432345, 0.401061,0.337472,0.500093,0.444338,0.633534,0.572105,0.601164,0.56698,0.388198,0.308292,0.413925,0.366652,0.449179,0.424051, 0.618298,0.58735,0.458406,0.430254,0.473939,0,0.439952,0.417849,0.605333,0.579977,0.631263,0.594722,0.517687,0,0.430191,0.274572, (... 
too big)] "animations" : [ {"name":"ArmatureAction", "fps":24, "length":0.625, "hierarchy": [{"parent":-1,"keys":[ {"time":0,"pos":[-3.52132e-08,0.0410043,0.880063],"rot":[0,0,0,1],"scl":[1,1,1]}, {"time":0.291667,"pos":[-3.52132e-08,0.0410043,0.880063]}, {"time":0.625,"pos":[-3.52132e-08,0.0410043,0.880063],"rot":[0,0,0,1],"scl":[1,1,1]}] }, {"parent":0,"keys":[ {"time":0,"pos":[0.0878887,0.00522349,0.102822],"rot":[0,0,0,1],"scl":[1,1,1]}, {"time":0.291667,"pos":[0.0878887,0.00522349,0.102822],"rot":[-0.36166,-1.53668e-08,-7.05768e-10,0.93231]}, {"time":0.625,"pos":[0.0878887,0.00522349,0.102822],"rot":[0,0,0,1],"scl":[1,1,1]} ]}, {"parent":1,"keys":[ {"time":0,"pos":[0.103679,0.00638392,-0.445744],"rot":[0,0,0,1],"scl":[1,1,1]}, {"time":0.291667,"pos":[0.103679,0.00638392,-0.445744]}, {"time":0.625,"pos":[0.103679,0.00638392,-0.445744],"rot":[0,0,0,1],"scl":[1,1,1]} ]}, {"parent":2,"keys":[ {"time":0,"pos":[0.0655578,0.0194668,-0.418675],"rot":[0,0,0,1],"scl":[1,1,1]}, {"time":0.291667,"pos":[0.0655578,0.0194668,-0.418675]}, {"time":0.625,"pos":[0.0655578,0.0194668,-0.418675],"rot":[0,0,0,1],"scl":[1,1,1]} ]}, {"parent":3,"keys":[ {"time":0,"pos":[0.0280578,-0.107185,-0.0704246],"rot":[0,0,0,1],"scl":[1,1,1]}, {"time":0.291667,"pos":[0.0280578,-0.107185,-0.0704246]}, {"time":0.625,"pos":[0.0280578,-0.107185,-0.0704246],"rot":[0,0,0,1],"scl":[1,1,1]} ]}, {"parent":4,"keys":[ {"time":0,"pos":[3.58149e-05,0.036576,-0.0885088],"rot":[0,0,0,1],"scl":[1,1,1]}, {"time":0.291667,"pos":[3.58149e-05,0.036576,-0.0885088]}, {"time":0.625,"pos":[3.58149e-05,0.036576,-0.0885088],"rot":[0,0,0,1],"scl":[1,1,1]} ]},
Each vertex corresponds to a group of skin indices, and each skin index is paired with a skin weight. A skin index is the index of a bone that influences that particular vertex (a vertex can be influenced by more than one bone, which is why the values come in groups), and the skin weight is the amount of influence that bone has over the vertex.
The skinIndices and skinWeights properties are arrays of arrays (technically, the inner arrays are three.js Vector4 objects). Each item in the outer array of either property corresponds, one-to-one by indexed position, with each vertex in the mesh.
Here's the relevant JSONLoader code that creates the values for these properties from a three.js JSON model file:

if ( json.skinWeights ) {

    for ( var i = 0, l = json.skinWeights.length; i < l; i += influencesPerVertex ) {

        var x = json.skinWeights[ i ];
        var y = ( influencesPerVertex > 1 ) ? json.skinWeights[ i + 1 ] : 0;
        var z = ( influencesPerVertex > 2 ) ? json.skinWeights[ i + 2 ] : 0;
        var w = ( influencesPerVertex > 3 ) ? json.skinWeights[ i + 3 ] : 0;

        geometry.skinWeights.push( new Vector4( x, y, z, w ) );

    }

}

if ( json.skinIndices ) {

    for ( var i = 0, l = json.skinIndices.length; i < l; i += influencesPerVertex ) {

        var a = json.skinIndices[ i ];
        var b = ( influencesPerVertex > 1 ) ? json.skinIndices[ i + 1 ] : 0;
        var c = ( influencesPerVertex > 2 ) ? json.skinIndices[ i + 2 ] : 0;
        var d = ( influencesPerVertex > 3 ) ? json.skinIndices[ i + 3 ] : 0;

        geometry.skinIndices.push( new Vector4( a, b, c, d ) );

    }

}

The docs for the three.js Geometry object were updated to include info about this [some formatting and two inconsistency corrections mine]:
Just like the skinWeights property, the skinIndices' values correspond to the geometry's vertices. Each vertex can have up to 4 bones associated with it. So if you look at the first vertex, and the first skin index, this will tell you the bones associated with that vertex. For example, the first vertex could have a value of ( 10.05, 30.10, 12.12 ). Then the first skin index could have the value of ( 10, 2, 0, 0 ). The first skin weight could have the value of ( 0.8, 0.2, 0, 0 ). In effect this would take the first vertex, and then the bone mesh.bones[10], and apply it 80% of the way. Then it would take the bone mesh.bones[2] and apply it 20% of the way. The next two values have a weight of 0, so they would have no effect.
In code, another example could look like this:

// e.g.
geometry.skinIndices[ 15 ] = new THREE.Vector4( 0, 5, 9, 0 );
geometry.skinWeights[ 15 ] = new THREE.Vector4( 0.2, 0.5, 0.3, 0 );

// corresponds with the following vertex
geometry.vertices[ 15 ];

// these bones will be used like so:
skeleton.bones[ 0 ];  // weight of 0.2
skeleton.bones[ 5 ];  // weight of 0.5
skeleton.bones[ 9 ];  // weight of 0.3
skeleton.bones[ 10 ]; // weight of 0
generate bounding spheres for child objects
We are trying to generate bounding spheres for child objects. Sadly we do not know how to get the proper positions for them: even if the child objects are clearly NOT placed in the center, the position vector is still 0,0,0. We are using the UTF8Loader for importing objects and do not have access to the "standard" vertices array. Furthermore, it looks like the bounding radius is calculated by distance from the center of the parent. Any ideas?
Live demo: log in with this demo account: http://threever.org/login (user: Demo, password: demoacc) and go to threever.org/editor. A picture of the problem: picture
And the code snippet when traversing the object:

object.traverse( function( node ) {
    if ( node.geometry ) {
        if ( node.geometry.boundingSphere ) {
            [...]
            var sphere = new THREE.Mesh( spheregeometry, material );
            sphere.position.copy( node.position );
            [...]
            _this.scene.add( sphere );
        }
    }
});

EDIT: We tried the described workflow (thanks #WestLangley) for updating the matrix position but still had no luck. Since it looks like the bounding spheres are not generated correctly either (in general too large), we decided to try another approach:
We were generating bounding geometry for the sole purpose of having "selection meshes" for raycasting (which can be somewhat difficult if the mesh is THREE.BufferGeometry [not?]). So we looked in the THREE.UTF8Loader file and tried to reconstruct the way geometry is created when useBuffers is false. Then we set the dynamic flag for THREE.BufferGeometry to true, which gives access to the attribute arrays. When the mesh is generated, we use our createSelectionDummy function to generate a simplified THREE.Geometry version of the mesh (no uvs, merged vertices, normal vector (0,0,0)) using the attribute arrays.
What do you think of this technique? Does it have a performance gain vs useBuffers: false?

this.createSelectionDummy = function( indices, positions ) {

    var geom = new THREE.Geometry();

    for ( var i = 0; i < indices.length; i += 3 ) {

        geom.vertices.push( new THREE.Vector3( positions[ i * 3 ], positions[ i * 3 + 1 ], positions[ i * 3 + 2 ] ) );
        geom.vertices.push( new THREE.Vector3( positions[ i * 3 + 3 ], positions[ i * 3 + 4 ], positions[ i * 3 + 5 ] ) );
        geom.vertices.push( new THREE.Vector3( positions[ i * 3 + 6 ], positions[ i * 3 + 7 ], positions[ i * 3 + 8 ] ) );

        geom.faces.push( new THREE.Face3( indices[ i ], indices[ i + 1 ], indices[ i + 2 ], new THREE.Vector3( 0, 0, 0 ), null, 0 ) );

    }

    // reduce mesh size by 3
    geom.mergeVertices();
    geom.computeCentroids();
    geom.computeFaceNormals();

    return ( new THREE.Mesh( geom, new THREE.MeshBasicMaterial( { wireframe: true } ) ) );

};
You can get an object's world position like so:

var position = object.matrixWorld.getPosition();

It is important to make sure the object's world matrix is updated first. This normally happens in each render call, but if for some reason it is not, you can call the following first:

object.updateMatrixWorld( true );

You can compute the bounding sphere this way:

object.geometry.computeBoundingSphere();

The radius is then available as object.geometry.boundingSphere.radius.
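Putting the answer together with the traversal from the question, a minimal sketch using the same API vintage as this thread (spheregeometry, material and _this come from the question, spheregeometry is assumed to be a unit-radius sphere, and matrixWorld.getPosition() was later replaced by Vector3.setFromMatrixPosition() in newer three.js releases):

// Sketch: place a helper sphere at each child's world position,
// sized by its computed bounding sphere radius.
object.updateMatrixWorld( true ); // make sure the world matrices are current

object.traverse( function ( node ) {

    if ( node.geometry ) {

        node.geometry.computeBoundingSphere();

        var sphere = new THREE.Mesh( spheregeometry, material );  // placeholders from the question
        sphere.position.copy( node.matrixWorld.getPosition() );   // world position, not node.position
        sphere.scale.multiplyScalar( node.geometry.boundingSphere.radius );

        _this.scene.add( sphere );

    }

} );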