I'm using the same technique explained in this example.
I defined a JSON file that contains all the objects, geometries and animations. However, I couldn't find a way to specify a change of an object's parent in the animation. Also, looking at the documentation of the Three.js library, I don't see any ObjectKeyframeTrack, which would be a nice way to set an object's parent.
My question is about the JSON file format, not about using the API, to do this.
I know there is a similar question (Change Object parent in Three.js?) but it is not what I am looking for.
In any case, other approaches are welcome as well.
This was solved in the forum, but I'll crosspost the answer here: the three.js animation system does not provide a way to serialize changes in the scene graph as part of a JSON file. JSON keyframe animation can affect properties of objects such as position, rotation, scale, and color. To change the parent of an object, you would need to run custom JS at the appropriate time during animation playback.
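For example, here is a minimal sketch of that approach, assuming playback is driven by an AnimationMixer; mixer, hand and sword are placeholder names from a hypothetical scene, and Object3D.attach() needs a reasonably recent three.js revision:

// Re-parent an object with custom JS once playback passes a chosen time.
const REPARENT_AT = 2.5; // seconds into the clip (hypothetical value)
let reparented = false;

function update(deltaSeconds) {
  mixer.update(deltaSeconds);
  if (!reparented && mixer.time >= REPARENT_AT) {
    hand.attach(sword); // attach() keeps the world transform while changing the parent
    reparented = true;
  }
}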
Related
PIXI.js has Container#cacheAsBitmap, which causes the container to "render" itself to an image, save it, and render the image instead of its children; when a child is added, removed, or updated, the cache is updated.
What's the alternative for Three.js (but instead of an image it would be a mesh)?
I may not be understanding your question properly, but your reply to Sabee's answer was helpful. It sounds like you're looking to either merge multiple geometries into a single mesh or implement a form of model instancing, with the goal of reducing draw calls.
There is more than one way to accomplish this, depending on your requirements. You can merge multiple geometries into a single geometry object, and provide either one material or an array of materials (where each index corresponds to one of the merged geometries). You can also use GPU-accelerated instancing to achieve a similar effect with only a single copy of the geometry.
I'll refer to Dusan Bosnjak's excellent Medium series on instancing, which starts here: https://medium.com/#pailhead011/instancing-with-three-js-36b4b62bc127
As well, here are the three.js examples regarding instancing: https://threejs.org/examples/?q=instanc#webgl_buffergeometry_instancing_dynamic
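A rough sketch of both approaches (the geometries, material and instance count are placeholders, and the merge helper is named mergeGeometries in newer three.js releases):

// 1) Merge several geometries into one mesh: one draw call, one material.
import { mergeBufferGeometries } from 'three/examples/jsm/utils/BufferGeometryUtils.js';
const merged = mergeBufferGeometries([geomA, geomB, geomC]);
scene.add(new THREE.Mesh(merged, sharedMaterial));

// 2) GPU instancing: one copy of the geometry, many transforms.
const count = 1000;
const instanced = new THREE.InstancedMesh(geometry, material, count);
const dummy = new THREE.Object3D();
for (let i = 0; i < count; i++) {
  dummy.position.set(Math.random() * 10, 0, Math.random() * 10);
  dummy.updateMatrix();
  instanced.setMatrixAt(i, dummy.matrix);
}
scene.add(instanced);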
Pixi.js is a 2D JavaScript library that uses WebGL to render images (frames) into an HTML5 canvas. Three.js allows the creation of Graphical Processing Unit (GPU)-accelerated 3D animations using WebGL.
The browser cannot store rendered 3D frames; that work is left to the GPU-accelerated render cache, which depends on the hardware it runs on. There is a helpful post explaining what's going on behind the scenes.
But you can cache assets in the browser, such as images, JSON objects of 3D models, and so on.
In Three.js, the Cache class is a global object used by asset loaders (TextureLoader, ImageLoader, AudioLoader, ...). It is disabled by default (false). To enable it you can set THREE.Cache.enabled = true;
I think the browser should cache the textures by default for performance reasons, but if you want to be sure, simply force the cache on in Three.js code. The creator of Three.js also answered this question.
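For instance, a minimal sketch of forcing the cache on before loading (the texture path and material are placeholders):

THREE.Cache.enabled = true; // must be set before the loaders run

const loader = new THREE.TextureLoader();
loader.load('textures/brick.jpg', function (texture) {
  // a second load() of the same URL is now served from the in-memory cache
  material.map = texture;
  material.needsUpdate = true;
});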
I am doing a bit of work with threejs, and I'm wondering if it is possible to name rotations or joints.
So it seems possible to write code like:
arm.rotateZ( 180 ).name="ARM_ANGLE";
But then how does one subsequently access and set the same rotation?
I know in x3d it is possible to do this, so I was thinking it would be possible in threejs as well. In x3d, one can define a reference as:
<Transform DEF="ArmAngle" rotation="0 0 1 3.19">
And then later define a route to reference it like:
<ROUTE fromNode='spinarm' fromField='value_changed' toNode='ArmAngle' toField='set_rotation'></ROUTE>
What you are describing sounds like animation keys or transform key frames.
You can define these in a modeller like Blender and export them, or generate them programmatically.
But generally, what you are describing from x3d would have to be a layer built on top of three.js if you really want that style of interface. Honestly, though, it's pretty straightforward to use the scene graph style of manipulation, i.e. finding an object and setting its position and rotation, OR defining an animation in a modeller and then calling that animation. The advantage of using animations is that you can then blend between them.
You CAN name Objects in three.js, so for instance you could name your arm and then find it using scene.getObjectByName("arm"). getObjectByName is a method of all Object3Ds.
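For example, a short sketch of playing and cross-fading named clips with an AnimationMixer (the clip names and the clips array are placeholders for whatever your exporter produces):

const mixer = new THREE.AnimationMixer(model);
const idle = mixer.clipAction(THREE.AnimationClip.findByName(clips, 'Idle'));
const wave = mixer.clipAction(THREE.AnimationClip.findByName(clips, 'Wave'));
idle.play();

// later: blend from idle into wave over half a second
wave.reset().play();
idle.crossFadeTo(wave, 0.5, false);

// and in the render loop:
// mixer.update(clockDelta);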
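A minimal sketch of that pattern (the name ARM_ANGLE and the angle are just illustrative; MathUtils is THREE.Math in older revisions):

arm.name = 'ARM_ANGLE'; // name the joint's Object3D once

// later, anywhere you have the scene:
const joint = scene.getObjectByName('ARM_ANGLE');
if (joint) {
  joint.rotation.z = THREE.MathUtils.degToRad(180); // rotateZ() expects radians too
}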
I am trying to understand how the WebAudio API works. I have two objects, one representing the listener and one the source, and have used the link below as an example. I am able to move the source, and the sound position changes.
https://mdn.github.io/webaudio-examples/panner-node/
The commands to change the orientation are provided, viz. this.panner.setOrientation or this.listener.setOrientation. The question I have is: if I have a source or listener object (in canvas mode using ThreeJS, i.e. we know its position and rotation), how do I change the orientation of either the panner or the listener (as the case may be) via JS?
An example would greatly help. Thanks.
Any reason not to use THREE's PositionalAudio object? https://threejs.org/docs/index.html#api/audio/PositionalAudio. That lets you add a sound to a mesh object, and THREE will take care of moving it. If you want to source a different sound than an AudioBuffer, just connect the audio source to the .gain of the PositionalAudio object.
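For example, a minimal sketch of attaching a PositionalAudio source to a mesh (the sound path and the mesh/camera variables are placeholders):

const listener = new THREE.AudioListener();
camera.add(listener); // the listener follows the camera

const sound = new THREE.PositionalAudio(listener);
new THREE.AudioLoader().load('sounds/engine.ogg', function (buffer) {
  sound.setBuffer(buffer);
  sound.setRefDistance(5); // distance at which attenuation starts
  sound.setLoop(true);
  sound.play();
});

mesh.add(sound); // moving the mesh now moves (and orients) the sound source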
I was thinking that you need a deformer to read the clusters etc. in order to get the original (usually T-pose) position of the skeleton.
Also, FBX supports poses etc., but I have never had a file that implemented them.
But to my surprise, when importing an FBX file into 3ds Max without any mesh inside, I get the T-pose if I uncheck "animation".
Any idea about it?
Thank you in advance
FbxCluster has GetTransformMatrix and GetTransformLinkMatrix. The former returns the original transform of the bone (that should be used to initialize the skinning), and the latter the corresponding orientation of the skinned node. Additionally, the skin node can also have "geometric rotation". I don't think there's anything more than that.
I'm trying to convert this model to the three.js model format:
http://tf3dm.com/3d-model/ninja-48864.html
Here's what I've tried so far:
I imported the ms3d file into Blender using the default addon. In Blender, the animations and mesh look correct; however, bones are only rendered as lines. Then I exported it to JS using the three.js exporter. This results in a correct mesh, but the animation is not exported correctly. Only bone positions are exported (which are only rarely used in this specific model), and NO rotations at all (except for a few identity quaternions).
It seems I have to modify the model in Blender somehow, but since I'm a complete novice in 3D modelling, I'm kind of lost. I've also looked at other questions regarding Blender + three.js, but none of the tips (apply location/rotation/scale, etc.) made a difference. It might also be a bug in the three.js exporter.
Can anybody help me do the conversion, one way or the other?
A nice Python utility is available for converting ms3d format to JSON format.
The link is: https://github.com/pyalot/parse-3d-files/blob/master/ms3d/convert.py
You can easily render this JSON model using THREE.JSONLoader() in three.js.
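For example, a minimal sketch with the legacy THREE.JSONLoader (it has since been removed from newer three.js releases; the model path is a placeholder):

const loader = new THREE.JSONLoader();
loader.load('models/ninja.json', function (geometry, materials) {
  // materials is an array; very old releases required wrapping it before use
  const mesh = new THREE.Mesh(geometry, materials);
  scene.add(mesh);
});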
Thanks.