I want to convert my object to THIS particular format:
{
"metadata" :
{
"formatVersion" : 3.1,
"generatedBy" : "Blender 2.7 Exporter",
"vertices" : 8,
"faces" : 6,
"normals" : 8,
"colors" : 0,
"uvs" : [4],
"materials" : 1,
"morphTargets" : 0,
"bones" : 0
},
"scale" : 1.000000,
"materials" : [ {
"DbgColor" : 15658734,
"DbgIndex" : 0,
"DbgName" : "bake_mat.013",
"blending" : "NormalBlending",
"colorAmbient" : [1.0, 1.0, 1.0],
"colorDiffuse" : [1.0, 1.0, 1.0],
"colorEmissive" : [0.0, 0.0, 0.0],
"colorSpecular" : [0.5, 0.5, 0.5],
"depthTest" : true,
"depthWrite" : true,
"mapDiffuse" : "b_cb-blue-block60x96.png",
"mapDiffuseWrap" : ["repeat", "repeat"],
"shading" : "Lambert",
"specularCoef" : 50,
"transparency" : 1.0,
"transparent" : false,
"vertexColors" : false
}],
"vertices" : [-76.2,-0.249995,121.92,-76.2,-0.250005,-121.92,76.2,-0.250005,-121.92,76.2,-0.249995,121.92,-76.2,0.250005,121.92,-76.2,0.249995,-121.92,76.2,0.249995,-121.92,76.2,0.250005,121.92],
"morphTargets" : [],
"normals" : [-0.577349,-0.577349,-0.577349,-0.577349,-0.577349,0.577349,-0.577349,0.577349,0.577349,-0.577349,0.577349,-0.577349,0.577349,0.577349,-0.577349,0.577349,-0.577349,-0.577349,0.577349,0.577349,0.577349,0.577349,-0.577349,0.577349],
"colors" : [],
"uvs" : [[-2e-06,0.999998,2e-06,-2e-06,1,2e-06,0.999998,1]],
"faces" : [43,1,0,4,5,0,0,1,2,3,0,1,2,3,43,5,6,2,1,0,2,3,0,1,3,4,5,0,43,6,7,3,2,0,2,3,0,1,4,6,7,5,43,0,3,7,4,0,0,1,2,3,1,7,6,2,43,0,1,2,3,0,2,3,0,1,1,0,5,7,43,7,6,5,4,0,2,3,0,1,6,4,3,2],
"bones" : [],
"skinIndices" : [],
"skinWeights" : [],
"animations" : []
}
I tried using Blender, but it doesn't give me the particular keys and nodes that I want. Does anyone know how this can be achieved? When I try to export my models, I get a js file like cars.js, but I want it to look like the js above.
Please see the attached screenshot.
After exporting a 3D model from 3ds Max to three.js, texture tiling does not work: the settings for this feature simply aren't recorded in the JSON file.
"materials": [
{
"DbgIndex" : 0,
"DbgName" : "02 - Default",
"colorDiffuse" : [0.5882, 0.5882, 0.5882],
"colorAmbient" : [0.5882, 0.5882, 0.5882],
"colorSpecular" : [0.9000, 0.9000, 0.9000],
"transparency" : 1.0,
"specularCoef" : 10.0,
"mapDiffuse" : "rabica.png",
"vertexColors" : false
},
I have used the Max exporter, but I had trouble with smoothing groups, so I don't use it. I may end up patching it at some stage in the future.
It also seems not to set the 'wrap' setting that you want.
After looking at the JSONLoader and Loader code, it seems that the loader does handle wrap being set in the JSON file.
I think that if you edit your JSON file manually so it looks as below, it may work:
"materials": [
{
"DbgIndex" : 0,
"DbgName" : "02 - Default",
"colorDiffuse" : [0.5882, 0.5882, 0.5882],
"colorAmbient" : [0.5882, 0.5882, 0.5882],
"colorSpecular" : [0.9000, 0.9000, 0.9000],
"transparency" : 1.0,
"specularCoef" : 10.0,
"mapDiffuse" : "rabica.png",
"vertexColors" : false,
"mapDiffuseWrap" : ["repeat", "repeat"]
},
Though this is untested.
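If editing the file is not an option, a fallback (equally untested, and only a sketch: the loader variable, file name, and repeat counts below are placeholders) would be to set the wrap mode at runtime once the model has loaded:
loader.load("model.js", function (geometry, materials) {
    // Force the diffuse texture to tile in both directions.
    var map = materials[0].map;
    map.wrapS = THREE.RepeatWrapping;
    map.wrapT = THREE.RepeatWrapping;
    map.repeat.set(2, 2); // hypothetical tiling factor
    map.needsUpdate = true;
    scene.add(new THREE.Mesh(geometry, new THREE.MeshFaceMaterial(materials)));
});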
Is it possible to assign two materials to one mesh which has been loaded with JSONLoader?
I've made a simple character in Blender and exported it to three.js format; the export contains morph targets and UVs.
I was trying to assign a solid color material to the body and a picture to my character's head (http://touhou.ru/dev/webgl-test-stackoverflow/kourindouhime.jpg), but after loading the mesh and materials I get a gray-colored mesh.
Here's the production version of my project (use WASD to move; the gray player mesh you control is exactly the thing I'm talking about): http://touhou.ru/dev/webgl-test-stackoverflow/
And here's how I'm loading the mesh and materials with JSONLoader:
var player_loader = new THREE.JSONLoader();
player_loader.load( "running_babe.js", function(geo, material) {
    // 'material' is the array of materials parsed from the JSON file.
    // Enable morph targets on both so the mesh can animate.
    material[0].morphTargets = true;
    material[1].morphTargets = true;
    // MeshFaceMaterial picks a material per face via face.materialIndex.
    var materials = new THREE.MeshFaceMaterial(material);
    player = new THREE.Mesh( geo, materials );
    scene.add(player);
});
Am I doing something wrong?
UPDATE: the problem was in my export. Now the second material looks like this:
{
"DbgColor" : 15597568,
"DbgIndex" : 1,
"DbgName" : "Material.001",
"blending" : "NormalBlending",
"colorAmbient" : [0.6400000190734865, 0.6400000190734865, 0.6400000190734865],
"colorDiffuse" : [0.6400000190734865, 0.6400000190734865, 0.6400000190734865],
"colorSpecular" : [0.5, 0.5, 0.5],
"depthTest" : true,
"depthWrite" : true,
"mapDiffuse" : "kourindouhime.jpg",
"mapDiffuseWrap" : ["repeat", "repeat"],
"shading" : "Lambert",
"specularCoef" : 50,
"transparency" : 1.0,
"transparent" : false,
"vertexColors" : false
}
and it works very nicely. Thank you, guys.
If I read your code correctly, running_babe.js is the mesh you are talking about. Looking at its source, the materials are as follows:
"materials" : [ {
"DbgColor" : 15658734,
"DbgIndex" : 0,
"DbgName" : "Material",
"blending" : "NormalBlending",
"colorAmbient" : [0.6400000190734865, 0.6400000190734865, 0.6400000190734865],
"colorDiffuse" : [0.6400000190734865, 0.6400000190734865, 0.6400000190734865],
"colorSpecular" : [0.5, 0.5, 0.5],
"depthTest" : true,
"depthWrite" : true,
"shading" : "Lambert",
"specularCoef" : 50,
"transparency" : 1.0,
"transparent" : false,
"vertexColors" : false
},
{
"DbgColor" : 15658734,
"DbgIndex" : 0,
"DbgName" : "default",
"vertexColors" : false
}],
It can clearly be seen that there are no textures: the second material doesn't really contain anything, and the first one has all its colors set to shades of gray. It seems the materials weren't exported correctly. That is not a big surprise, since exporting materials is hard: there might not be a clear mapping between 3D modeler concepts and three.js material params. I'd just fix it by manually specifying the material params in that file.
You can have one material per mesh; that's the way OpenGL works. Are you sure you have only one mesh?
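For reference, the JSON-loaded path above does support several materials on one geometry: each face carries a materialIndex into the array wrapped by MeshFaceMaterial, and three.js splits the geometry into one render batch per material internally. A minimal sketch, assuming old-style Geometry with a faces array; the texture path and the isHeadFace predicate are hypothetical:
var materials = [
    new THREE.MeshLambertMaterial({ color: 0x808080 }), // body: plain gray
    new THREE.MeshLambertMaterial({ map: THREE.ImageUtils.loadTexture("kourindouhime.jpg") }) // head: textured
];
for (var i = 0; i < geometry.faces.length; i++) {
    // Each face selects its material by index into the array above;
    // isHeadFace is a hypothetical test for faces belonging to the head.
    geometry.faces[i].materialIndex = isHeadFace(i) ? 1 : 0;
}
var mesh = new THREE.Mesh(geometry, new THREE.MeshFaceMaterial(materials));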
I am relatively new to mongoDB.
I set up a sharded mongo cluster with 2 replica sets, each set in its own shard -> 4 mongod daemons.
The daemons are distributed across 2 Windows servers with 8 GB RAM each.
I have a test collection with 10 million documents (~600 bytes/doc), and I'm using the C# driver to connect to the mongos (primaryPreferred).
Now, if I run some thousands of single read queries on the shard key, I can see that mongo eats up more and more memory and stalls at around 7.2 GB. There are almost no more page faults, and the queries are extremely fast. Good!
The same holds for more complex queries on different document properties (a combined index for those queries exists).
BUT
If I execute just a couple of update queries, I see a huge drop in memory usage... mongo frees up 3 GB of RAM in no time, and the previously fast read queries become very slow.
It gets worse if I launch something like 500k upserts (Save) in a row.
A complex query that used to take about 2 seconds now takes 22 minutes.
I get the same behavior for count queries with the same query parameters.
Is that rather normal mongoDB behaviour, or is there something I missed in the setup?
--- UPDATE #hwatkins
MongoDB version: 2.2.2
1 document scanned for a single upsert
I have quite high disk activity during the bulk upsert.
explain() for a complex count query before the upsert:
Count Explain: { "clusteredType" : "ParallelSort", "shards" : { "set1/xxxx:1234,yyyy:1234" : [{ "cursor" : "BtreeCursor AC", "isMultiKey" : false, "n" : 20799, "nscannedObjects" : 292741, "nscanned" : 292741, "nscannedObjectsAllPlans" : 294290, "nscannedAllPlans" : 294290, "scanAndOrder" : false, "indexOnly" : false, "nYields" : 2, "nChunkSkips" : 0, "millis" : 2382, "indexBounds" : { "f.14.b" : [["A", "A"]], "f.500.b" : [[10, 50]] }, "allPlans" : [{ "cursor" : "BtreeCursor AC", "n" : 20795, "nscannedObjects" : 292741, "nscanned" : 292741, "indexBounds" : { "f.14.b" : [["A", "A"]], "f.500.b" : [[10, 50]] } }, { "cursor" : "BasicCursor", "n" : 4, "nscannedObjects" : 1549, "nscanned" : 1549, "indexBounds" : { } }], "oldPlan" : { "cursor" : "BtreeCursor AC", "indexBounds" : { "f.14.b" : [["A", "A"]], "f.500.b" : [[10, 50]] } }, "server" : "xxxx:1234" }], "set2/xxxx:56789,yyyy:56789" : [{ "cursor" : "BtreeCursor AC", "isMultiKey" : false, "n" : 7000, "nscannedObjects" : 97692, "nscanned" : 97692, "nscannedObjectsAllPlans" : 98941, "nscannedAllPlans" : 98941, "scanAndOrder" : false, "indexOnly" : false, "nYields" : 0, "nChunkSkips" : 0, "millis" : 729, "indexBounds" : { "f.14.b" : [["A", "A"]], "f.500.b" : [[10, 50]] }, "allPlans" : [{ "cursor" : "BtreeCursor AC", "n" : 6996, "nscannedObjects" : 97692, "nscanned" : 97692, "indexBounds" : { "f.14.b" : [["A", "A"]], "f.500.b" : [[10, 50]] } }, { "cursor" : "BasicCursor", "n" : 4, "nscannedObjects" : 1249, "nscanned" : 1249, "indexBounds" : { } }], "oldPlan" : { "cursor" : "BtreeCursor AC", "indexBounds" : { "f.14.b" : [["A", "A"]], "f.500.b" : [[10, 50]] } }, "server" : "yyyy:56789" }] }, "cursor" : "BtreeCursor AC", "n" : 27799, "nChunkSkips" : 0, "nYields" : 2, "nscanned" : 390433, "nscannedAllPlans" : 393231, "nscannedObjects" : 390433, "nscannedObjectsAllPlans" : 393231, "millisShardTotal" : 3111, "millisShardAvg" : 1555, "numQueries" : 2, "numShards" : 2, "millis" : 2384 }
explain() for the same query after the upsert:
{ "clusteredType" : "ParallelSort", "shards" : { "set1/xxxx:1234,yyyy:1234" : [{ "cursor" : "BtreeCursor AC", "isMultiKey" : false, "n" : 20799, "nscannedObjects" : 292741, "nscanned" : 292741, "nscannedObjectsAllPlans" : 294290, "nscannedAllPlans" : 294290, "scanAndOrder" : false, "indexOnly" : false, "nYields" : 379, "nChunkSkips" : 0, "millis" : 391470, "indexBounds" : { "f.14.b" : [["A", "A"]], "f.500.b" : [[10, 50]] }, "allPlans" : [{ "cursor" : "BtreeCursor AC", "n" : 20795, "nscannedObjects" : 292741, "nscanned" : 292741, "indexBounds" : { "f.14.b" : [["A", "A"]], "f.500.b" : [[10, 50]] } }, { "cursor" : "BasicCursor", "n" : 4, "nscannedObjects" : 1549, "nscanned" : 1549, "indexBounds" : { } }], "server" : "xxxx:1234" }], "set2/xxxx:56789,yyyy:56789" : [{ "cursor" : "BtreeCursor AC", "isMultiKey" : false, "n" : 7000, "nscannedObjects" : 97692, "nscanned" : 97692, "nscannedObjectsAllPlans" : 98941, "nscannedAllPlans" : 98941, "scanAndOrder" : false, "indexOnly" : false, "nYields" : 0, "nChunkSkips" : 0, "millis" : 910, "indexBounds" : { "f.14.b" : [["A", "A"]], "f.500.b" : [[10, 50]] }, "allPlans" : [{ "cursor" : "BtreeCursor AC", "n" : 6996, "nscannedObjects" : 97692, "nscanned" : 97692, "indexBounds" : { "f.14.b" : [["A", "A"]], "f.500.b" : [[10, 50]] } }, { "cursor" : "BasicCursor", "n" : 4, "nscannedObjects" : 1249, "nscanned" : 1249, "indexBounds" : { } }], "oldPlan" : { "cursor" : "BtreeCursor AC", "indexBounds" : { "f.14.b" : [["A", "A"]], "f.500.b" : [[10, 50]] } }, "server" : "yyyy:56789" }] }, "cursor" : "BtreeCursor AC", "n" : 27799, "nChunkSkips" : 0, "nYields" : 379, "nscanned" : 390433, "nscannedAllPlans" : 393231, "nscannedObjects" : 390433, "nscannedObjectsAllPlans" : 393231, "millisShardTotal" : 392380, "millisShardAvg" : 196190, "numQueries" : 2, "numShards" : 2, "millis" : 391486 }
Btw:
One single upsert (one affected doc) makes the memory usage drop by around 600 MB --> memory usage climbs back to ~4.5 GB only after some more queries.
If I take the query from above and use the mongo cursor to loop over the result set, it just takes ages... (the query is still running as I type) :(
UPDATE II #Daniel
Here is a sample doc stored in the mongoDB cluster.
The shard key is the b property of my doc (it corresponds to a telephone number).
Upsert:
I look up existing docs by the shard key and update some properties of the f object. Then I call Save on the mongoDB driver for all those docs, one by one (about 500k times); a rough shell equivalent of that loop is sketched after the sample doc below.
There is an index: { "f.14.b" : 1, "f.500.b" : 1 }
This index is used for the complex queries. As described above, those queries are fast before the bulk update and extremely slow after it.
{
"_id" : ObjectId("51248d6xxxxxxxxxxxxx"),
"b" : "33600000000",
"f" : {
"500" : {
"a" : ISODate("2013-02-20T08:45:38.075Z"),
"b" : 91
},
"14" : {
"a" : ISODate("2013-02-20T08:45:38.075Z"),
"b" : "A"
},
"1501" : {
"a" : ISODate("2013-02-20T08:45:38.141Z"),
"b" : ["X", "Y", "Z"]
},
"2000" : {
"a" : ISODate("2013-02-20T08:45:38.141Z"),
"b" : false
}
}
}
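The actual code uses the C# driver; this is only a rough mongo shell sketch of the per-document save loop described above (the collection name test and the numbersToUpdate list are placeholders):
var numbersToUpdate = [ "33600000000" /* , ... ~500k shard-key values */ ];
numbersToUpdate.forEach(function (number) {
    var doc = db.test.findOne({ "b" : number }); // look up by shard key
    doc.f["500"].a = new ISODate();              // touch some properties of f
    doc.f["500"].b += 1;
    db.test.save(doc);                           // one write round-trip per document
});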
Thanks a lot,
Blume
This is interesting. First, it looks like your data is not very evenly distributed: your explain shows nscanned: 292741 on the first set and nscanned: 97692 on the second set. Pretty big difference. It also shows nYields: 379 on the first set and nYields: 0 on the second. This implies that not only are you reading unevenly from your sets, you are probably also writing unevenly to them. You will get more out of your cluster if you choose a shard key with a more even distribution.
As to why specifically this is happening with your upserts: are you adding more data to your existing documents? If so, you are probably a victim of document movement. In your mongodb logs, do you see any queries with moved: 1? That flag means the slow query in the log also involved a document being moved on disk, which causes lots of havoc with indexes into arrays/subdocuments. I believe MongoDB essentially has to rebuild all the index entries for a document when it moves, and it has to do some heavy updating of all indexes into subdocuments/arrays. One way to spot movement is via the profiler, as sketched below.
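A hedged way to check for document movement (assuming the profiler is acceptable on your test cluster; the 100 ms threshold and the collection are placeholders, and this relies on the 2.2-era profiler recording a moved flag on update operations):
// Profile operations slower than 100 ms on the current database.
db.setProfilingLevel(1, 100);
// ... run the upsert workload, then inspect the newest profile entries:
db.system.profile.find({ moved: true }).sort({ ts: -1 }).limit(10);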
The workaround for document movement is to preallocate extra data on the document at creation time and then immediately remove it. Mongo allocates each document a fixed space plus a padding factor on disk. If a document outgrows its space, it must be moved on disk to a larger area. If you create your documents with extra data already in place and then remove it, you give yourself a lot of extra padding on disk to accommodate document growth. This can certainly waste space, but it is a big saver of performance. For example:
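A minimal sketch of the preallocate-then-remove pattern in the shell (the padding field name, its size, and the test collection are placeholders; size the filler to the growth you expect):
// Insert with throwaway filler so mongo reserves extra room on disk...
db.test.insert({
    "b" : "33600000000",
    "f" : { },
    "padding" : new Array(512).join("x")
});
// ...then remove the filler; the record keeps its allocated size as padding.
db.test.update({ "b" : "33600000000" }, { $unset : { "padding" : 1 } });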
What version of mongodb are you on?
When you do the upsert, can you do an .explain() on it to see how many documents it's scanning?
What does the disk I/O look like during the upserts?
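Since .explain() runs on query cursors rather than on writes in this MongoDB version, one way to approximate it (a sketch; the filter is a placeholder matching the shard key) is to explain the find() the upsert would use to locate its document:
db.test.find({ "b" : "33600000000" }).explain();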
I'm creating a little game using Three.js, and everything is going well apart from some shading issues with cubes. I'm basically building a game level by dropping textured cubes down to form a maze. The problem is that when the cubes are next to one another, each one is shaded in such a way that it looks like a separate entity and not part of a larger wall.
Here is an example; notice that the illusion of a single wall is lost:
Is there a different shading technique I should use, or is there a property to be set somewhere to change this shading behavior?
This is my cube model:
{
"metadata" :
{
"formatVersion" : 3,
"generatedBy" : "Blender 2.60 Exporter",
"vertices" : 8,
"faces" : 6,
"normals" : 8,
"colors" : 0,
"uvs" : 4,
"materials" : 1,
"morphTargets" : 0
},
"scale" : 1.000000,
"materials": [{
"DbgColor" : 15658734,
"DbgIndex" : 0,
"DbgName" : "WallCube",
"colorAmbient" : [1.0, 1.0, 1.0],
"colorDiffuse" : [1.0, 1.0, 1.0],
"colorSpecular" : [0.15, 0.15, 0.15],
"mapDiffuse" : "../../textures/walls/stone/stone.png",
"mapDiffuseWrap" : ["repeat", "repeat"],
"mapNormal" : "../../textures/walls/stone/stone_normal.png",
"mapNormalFactor" : 1.0,
"shading" : "Lambert",
"specularCoef" : 25,
"transparency" : 1.0,
"vertexColors" : false
}],
"vertices": [50.000000,-50.000000,-50.000000,50.000000,-50.000000,50.000000,-50.000000,-50.000000,50.000000,-50.000000,-50.000000,-50.000000,50.000000,50.000000,-50.000000,50.000000,50.000000,50.0000050,-50.000000,50.000000,50.000000,-50.000000,50.000000,-50.000000],
"morphTargets": [],
"normals": [1.000000,-1.000000,-1.000000,1.000000,-1.000000,1.000000,-1.000000,-1.000000,1.000000,-1.000000,-1.000000,-1.000000,1.000000,1.000000,-1.000000,-1.000000,1.000000,-1.000000,-1.000000,1.000000,1.000000,1.000000,1.000000,1.000000],
"colors": [],
"uvs": [[0.000000,1.000000,1.000000,1.000000,1.000000,0.000000,0.000000,0.000000]],
"faces": [43,0,1,2,3,0,0,1,2,3,0,1,2,3,43,4,7,6,5,0,0,1,2,3,4,5,6,7,43,0,4,5,1,0,1,2,3,0,0,4,7,1,43,1,5,6,2,0,1,2,3,0,1,7,6,2,43,2,6,7,3,0,1,2,3,0,2,6,5,3,43,4,0,3,7,0,3,0,1,2,4,0,3,5]
}
And this is how I load it:
JSONLoader = new THREE.JSONLoader();
Light = new THREE.PointLight(0xFFFFFF);
// Set the components rather than replacing the position object,
// which should stay a THREE.Vector3.
Light.position.set(0, 75, 350);
Meshes = [];
JSONLoader.load("../assets/models/cube.js", function(Geometry)
{
    // Lay six cubes side by side along the x axis.
    for (var MeshIndex = 0; MeshIndex <= 5; MeshIndex++)
    {
        // MeshFaceMaterial with no arguments uses the materials from the JSON file.
        Meshes[MeshIndex] = new THREE.Mesh(Geometry, new THREE.MeshFaceMaterial());
        Meshes[MeshIndex].position.x = MeshIndex * 100;
        Scene.add(Meshes[MeshIndex]);
    }
});
Scene.add(Light);
Any ideas how to make the cubes look like a continuous wall?
JSONLoader.load("../assets/models/cube.js", function(Geometry)
{
Geometry.materials[ 0 ].shading = THREE.FlatShading;
// ...
}
This was kindly answered by alteredq over at the three.js site.
https://github.com/mrdoob/three.js/issues/1258#issuecomment-3834489