ThreeJS: rotate the coordinate axes of an FBX model's arm

I would like to rotate the coordinate axes of specific segments of a .fbx model. For example, as illustrated in the image, I would like to rotate the coordinate axes of the right arm 90 degrees around the Z-axis.
I can see that THREE.Object3D.DefaultUp can change the default coordinate system of an object, but I cannot apply it to the FBX segment objects.
Just to be clear, I don't want to rotate the mesh/arm, only the coordinate axes that define the arm's rotation. Is that possible?
I load the .fbx model and define the arm like so:
// model
const loader = new THREE.FBXLoader();
loader.load( '{{ url_for('static', filename='ybot.fbx') }}', function ( object ) {

  myObj = object;
  myObj.traverse( function ( child ) {
    if ( child.isMesh ) {
      child.castShadow = true;
      child.receiveShadow = true;
    }
  } );

  rightArm = myObj.getObjectByName( 'mixamorigRightArm' );
  // Do the rotations here

  scene.add( myObj );
} );
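One thing worth trying (not from the original question) is to visualize the bone's local axes and experiment with its Euler rotation order; AxesHelper and the rotation.order property are standard three.js features, but whether reordering is enough depends on what the rig needs:

// Sketch, to be placed where "Do the rotations here" is in the snippet above.
// Draw the bone's current local axes so any change is visible (size 20 is arbitrary).
rightArm.add( new THREE.AxesHelper( 20 ) );

// Changing the Euler order alters which local axis is applied first, which is
// sometimes enough to get the desired rotation behaviour without re-orienting
// the bone's coordinate frame itself.
rightArm.rotation.order = 'ZXY';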

Related

Mangled rendering when transforming scene coordinates instead of camera coordinates

I've been learning how to integrate ThreeJS with Mapbox, using this example. It struck me as weird that the approach is to leave the loaded model in its own coordinate system, and transform the camera location on render. So I attempted to rewrite the code, so that the GLTF model is transformed when loaded, then the ThreeJS camera is just synchronised with the Mapbox camera, with no further modifications.
The code now looks like this:
function newScene() {
  const scene = new THREE.Scene();

  // create two three.js lights to illuminate the model
  const directionalLight = new THREE.DirectionalLight(0xffffff);
  directionalLight.position.set(0, -70, 100).normalize();
  scene.add(directionalLight);

  const directionalLight2 = new THREE.DirectionalLight(0xffffff);
  directionalLight2.position.set(0, 70, 100).normalize();
  scene.add(directionalLight2);

  return scene;
}

function newRenderer(map, gl) {
  // use the Mapbox GL JS map canvas for three.js
  const renderer = new THREE.WebGLRenderer({
    canvas: map.getCanvas(),
    context: gl,
    antialias: true
  });
  renderer.autoClear = false;
  return renderer;
}
// create a custom layer for a 3D model per the CustomLayerInterface
export function addModel(modelPath, origin, altitude = 0, orientation = [Math.PI / 2, 0, 0]) {
  const coords = mapboxgl.MercatorCoordinate.fromLngLat(origin, altitude);

  // transformation parameters to position, rotate and scale the 3D model onto the map
  const modelTransform = {
    translateX: coords.x,
    translateY: coords.y,
    translateZ: coords.z,
    rotateX: orientation[0],
    rotateY: orientation[1],
    rotateZ: orientation[2],
    /* Since our 3D model is in real world meters, a scale transform needs to be
     * applied since the CustomLayerInterface expects units in MercatorCoordinates.
     */
    scale: coords.meterInMercatorCoordinateUnits()
  };
  const scaleVector = new THREE.Vector3(modelTransform.scale, -modelTransform.scale, modelTransform.scale);

  return {
    id: "3d-model",
    type: "custom",
    renderingMode: "3d",

    onAdd: function(map, gl) {
      this.map = map;
      this.camera = new THREE.Camera();
      this.scene = newScene();
      this.renderer = newRenderer(map, gl);

      // use the three.js GLTF loader to add the 3D model to the three.js scene
      new THREE.GLTFLoader().load(modelPath, gltf => {
        gltf.scene.position.fromArray([coords.x, coords.y, coords.z]);
        gltf.scene.setRotationFromEuler(new THREE.Euler().fromArray(orientation));
        gltf.scene.scale.copy(scaleVector);
        this.scene.add(gltf.scene);

        const bbox = new THREE.Box3().setFromObject(gltf.scene);
        console.log(bbox);
        this.scene.add(new THREE.Box3Helper(bbox, 'blue'));
      });
    },

    render: function(gl, matrix) {
      this.camera.projectionMatrix = new THREE.Matrix4().fromArray(matrix);
      this.renderer.state.reset();
      this.renderer.render(this.scene, this.camera);
      // this.map.triggerRepaint();
    }
  };
}
It basically works, in that a model is loaded and drawn in the right location in the Mapbox world. However, instead of rendering cleanly as it did with the original approach, the model now comes out as a mangled mess that jitters around chaotically as the camera moves.
I'm not yet familiar enough with ThreeJS to have any idea what I did wrong.
Here's a side-by-side comparison of the old, functional code on the right, vs the new broken code on the left.
Further investigation
I suspect the cause may be related to shrinking all the coordinates down to the [0..1] range of the projected coordinate system and losing mathematical precision. When I scale the model up by 100 times, it renders as something messy and glitchy, but at least recognisable.
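For comparison (not part of the original question), the official Mapbox custom-layer example keeps the model near its own origin and instead folds the Mercator translation, rotation and scale into the camera's projection matrix on every frame, which sidesteps storing tiny Mercator-unit coordinates on the model itself. A sketch of that render function, reusing the modelTransform object defined above:

render: function (gl, matrix) {
  // Per-axis rotations taken from modelTransform, as in the Mapbox example.
  const rotationX = new THREE.Matrix4().makeRotationAxis(new THREE.Vector3(1, 0, 0), modelTransform.rotateX);
  const rotationY = new THREE.Matrix4().makeRotationAxis(new THREE.Vector3(0, 1, 0), modelTransform.rotateY);
  const rotationZ = new THREE.Matrix4().makeRotationAxis(new THREE.Vector3(0, 0, 1), modelTransform.rotateZ);

  // Map projection matrix supplied by Mapbox for this frame.
  const m = new THREE.Matrix4().fromArray(matrix);

  // Model matrix: translate, scale and rotate the model into Mercator space.
  const l = new THREE.Matrix4()
    .makeTranslation(modelTransform.translateX, modelTransform.translateY, modelTransform.translateZ)
    .scale(new THREE.Vector3(modelTransform.scale, -modelTransform.scale, modelTransform.scale))
    .multiply(rotationX)
    .multiply(rotationY)
    .multiply(rotationZ);

  this.camera.projectionMatrix = m.multiply(l);
  this.renderer.state.reset();
  this.renderer.render(this.scene, this.camera);
  this.map.triggerRepaint();
}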

Import .gLTF animation to THREE.js

I used Blender to create a 3D object with a simple animation, then exported it as a .glTF file. I was able to import the 3D object into THREE.js, but I can't load the animation. How can I load the animation into Three.js?
The most basic code for playing an animation looks like so:
loader.load( 'models.glb', function ( gltf ) {

  var model = gltf.scene;
  var animations = gltf.animations;

  scene.add( model );

  mixer = new THREE.AnimationMixer( model );
  var action = mixer.clipAction( animations[ 0 ] ); // access first animation clip
  action.play();

} );
You then have to make sure to update the instance of AnimationMixer in your animation loop, like so:
var delta = clock.getDelta(); // clock is an instance of THREE.Clock
if ( mixer ) mixer.update( delta );
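Put together, a minimal render loop might look like this (a sketch; renderer, scene, and camera are assumed to come from the usual setup code):

var clock = new THREE.Clock();

function animate() {
  requestAnimationFrame( animate );

  // advance the animation by the elapsed time since the last frame
  var delta = clock.getDelta();
  if ( mixer ) mixer.update( delta );

  renderer.render( scene, camera );
}

animate();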
Check out webgl_animation_skinning_blending to see this code in action.
three.js R109

GLTF Animation doesn't play when adding in material Three.js

I have a working glTF animation that automatically starts playing when the page loads. However, whenever I try to add a material to it, the animation no longer plays, although the material appears. How do I fix this? Or, if there is an easier way just to add a block colour to a glTF model, please let me know. Thanks.
var loader = new THREE.GLTFLoader();
loader.setDRACOLoader( new THREE.DRACOLoader() );

// Load a glTF resource
loader.load(
  // resource URL
  '../models/fox3.gltf',
  // called when the resource is loaded
  function ( gltf ) {

    gltf.animations; // Array<THREE.AnimationClip>
    gltf.scene;      // THREE.Scene
    gltf.scenes;     // Array<THREE.Scene>
    gltf.cameras;    // Array<THREE.Camera>
    gltf.asset;      // Object

    // Loading in and positioning model
    var object = gltf.scene;
    object.scale.set( 10, 10, 10 );
    object.position.set( -300, 20, -400 );
    object.rotation.y = 0.5;

    // Playing animation
    mixer = new THREE.AnimationMixer( gltf.scene );
    console.log( gltf.animations );
    mixer.clipAction( gltf.animations[ 0 ] ).play();

    // Adding texture/colour to model (causes animation to stop playing)
    // materialObj = new THREE.MeshBasicMaterial( { color: "#9E4300" } );
    // object.traverse( function ( child ) {
    //   if ( child instanceof THREE.Mesh ) {
    //     // child.material = materialObj;
    //   }
    // } );

    console.log( object );
    scene.add( object );
  }
);
"...or if there is an easier way just to add a block colour to a gltf model please let me know, thanks."
I'll address this last part of your question. The MeshBasicMaterial has the lighting calculations turned off, and in glTF this is supported with an extension called KHR_materials_unlit.
Here's a sample model called BoxUnlit.gltf that shows this extension in action. Two key places to note are extensionsUsed at the top, and the material near the bottom.
One major gotcha here is that the material's BaseColorFactor is specified in linear colorspace, while textures are provided in sRGB colorspace. So you have to take your chosen color and convert it to linear, typically by mathematically raising each component to the power of 2.2.
For example, your question contains color value #9E4300, which in 0..255 scale is equal to (158, 67, 0). Divide each number by 255, then raise by 2.2:
Red == (158 / 255.0) ** 2.2 == 0.348864834
Green == ( 67 / 255.0) ** 2.2 == 0.052841625
Blue == ( 0 / 255.0) ** 2.2 == 0.0
Use those values as the RGB values of the glTF model's BaseColorFactor, along with an alpha value of 1.0, like so:
"materials": [
{
"pbrMetallicRoughness": {
"baseColorFactor": [
0.348864834,
0.052841625,
0.0,
1.0
]
},
"extensions": {
"KHR_materials_unlit": {}
}
}
],
With that, ThreeJS should automatically select the MeshBasicMaterial for the model.
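For convenience, the same conversion can be scripted. This small helper is hypothetical (not from the original answer) and simply reproduces the 2.2-power arithmetic above:

// Hypothetical helper: convert an sRGB hex colour to a linear baseColorFactor
// using the simple 2.2 gamma approximation described above.
function hexToLinearFactor( hex, alpha ) {
  var r = parseInt( hex.slice( 1, 3 ), 16 ) / 255;
  var g = parseInt( hex.slice( 3, 5 ), 16 ) / 255;
  var b = parseInt( hex.slice( 5, 7 ), 16 ) / 255;
  return [ r, g, b ]
    .map( function ( c ) { return Math.pow( c, 2.2 ); } )
    .concat( alpha !== undefined ? alpha : 1.0 );
}

console.log( hexToLinearFactor( '#9E4300' ) );
// ≈ [ 0.348864834, 0.052841625, 0.0, 1.0 ]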
Try setting skinning to true on the material of the glTF and it should work. I had the same problem earlier.
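A sketch of that suggestion, combined with the commented-out traverse from the question. This assumes an older three.js release in which materials expose a skinning flag (newer releases enable skinning on skinned meshes automatically):

// Enable skinning on the replacement material so the SkinnedMesh keeps animating.
materialObj = new THREE.MeshBasicMaterial( { color: '#9E4300', skinning: true } );

object.traverse( function ( child ) {
  if ( child.isMesh ) {
    child.material = materialObj;
  }
} );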

Three.js: Add a texture to an object just on the outside

I'm very new to Three.js and I'm trying to make a ring:
http://www.websuvius.it/atma/myring/preview.html
I have a background texture (the silver one) and another one with text.
I want the text only on the ring's external face.
This is part of my code:
var loader = new THREE.OBJLoader( manager );
var textureLoader = new THREE.TextureLoader( manager );

loader.load( 'assets/3d/ring.obj', function ( event ) {

  var object = event;
  var geometry = object.children[ 0 ].geometry;
  var materials = [];

  var backgroundTexture = textureLoader.load( 'img/texture/silver.jpg' );
  backgroundTexture.flipY = false;
  var background = new THREE.MeshBasicMaterial( {
    map: backgroundTexture,
    color: 0xffffff
  } );
  materials.push( background );

  var customTexture = textureLoader.load( 'img/text.png' );
  customTexture.flipY = false;
  var custom = new THREE.MeshBasicMaterial( {
    map: customTexture,
    transparent: true,
    opacity: 1,
    color: 0xffffff
  } );
  materials.push( custom );

  mesh = THREE.SceneUtils.createMultiMaterialObject( geometry, materials );
  mesh.position.y = -50;
  scene.add( mesh );

}, onProgress, onError );
Is it possible?
Thanks
The reason behind your issue appears to be in your .obj file. Judging from a quick glance at the texture coordinates stored in the file, the inside of the ring uses the same part of the texture image as the outside of the ring.
Increasing the transparent parts of the image won't help, and neither will attempts to stop the texture from repeating. Those would only help if the texture coordinates were larger than 1, which is unfortunately not the case here.
However, there are several solutions:
1. Split the object into two objects in 3D modeling software (the outside and the inside of the ring) and apply the text texture only to the first one.
2. Adjust the UV coordinates of the object in 3D modeling software.
3. Adjust the UV coordinates of the vertices programmatically after loading the object into Three.js (see the sketch below).
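A rough sketch of the third option (hypothetical; how to decide which vertices belong to the inside of the ring depends entirely on the model, so that test is left as a placeholder):

// Remap UVs of the loaded ring geometry so that inside faces sample a
// transparent region of text.png. Assumes `geometry` is a BufferGeometry,
// as produced by recent versions of OBJLoader.
var uv = geometry.attributes.uv;

for ( var i = 0; i < uv.count; i ++ ) {
  var u = uv.getX( i );
  var v = uv.getY( i );

  // Decide here whether vertex i lies on the inside of the ring
  // (e.g. by inspecting its position or normal) and, if so, move
  // its (u, v) into an empty/transparent area of the texture.
  uv.setXY( i, u, v );
}

uv.needsUpdate = true;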

3d object that looks the same from every direction

I have a 3D maze with walls and a floor.
I have an image with a key (or some other object, it's not important, but all of them are images, not 3D models).
I want to display it on the floor, and as the camera moves around, the object needs to look the same without me rotating the object. How can I achieve this?
Update1:
I created a plane geometry, added the image (it's a transparent PNG), and I rotate it at render time. It works well, but when I turn the camera the plane sometimes loses its transparency for a few milliseconds and gets a solid black background (blinking).
Any idea why?
here is the code:
var texture = new THREE.ImageUtils.loadTexture('assets/images/sign.png');
var material = new THREE.MeshBasicMaterial( {map: texture, transparent: true} );
plane = new THREE.Mesh(new THREE.PlaneGeometry(115, 115,1,1), material );
plane.position.set(500, 0, 1500);
scene.add(plane);
// at render:
plane.rotation.copy( camera.rotation );
This can be achieved with:
function animate() {
  not3dObject.rotation.z = camera.rotation.z;
  not3dObject.rotation.x = camera.rotation.x;
  not3dObject.rotation.y = camera.rotation.y;
  ...
  render();
}
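An alternative sketch (not from the original answer): THREE.Sprite billboards automatically toward the camera, so there is no need to copy the camera rotation every frame. The file path and sizes are reused from the question's snippet:

// A sprite always faces the camera; no per-frame rotation copying required.
var texture = new THREE.TextureLoader().load( 'assets/images/sign.png' );
var material = new THREE.SpriteMaterial( { map: texture, transparent: true } );

var sprite = new THREE.Sprite( material );
sprite.scale.set( 115, 115, 1 );
sprite.position.set( 500, 0, 1500 );
scene.add( sprite );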
