Why do Camera and Object3D look opposite directions? - three.js

I have a basic question, but I could not find the answer.
I noticed that the following code:
const p = new THREE.Vector3();
const q = new THREE.Quaternion();
const s = new THREE.Vector3();
function setPositionAndRotation(o) {
o.position.set(1, 1, -1);
o.lookAt(0, 0, 0);
o.updateMatrix();
o.matrix.decompose(p, q, s);
console.log(JSON.stringify(q));
}
const camera = new THREE.PerspectiveCamera(60, window.innerWidth / window.innerHeight, .01, 1000);
var mesh = new THREE.Mesh(new THREE.Geometry(), new THREE.MeshBasicMaterial());
setPositionAndRotation(camera);
setPositionAndRotation(mesh);
<script src="https://cdnjs.cloudflare.com/ajax/libs/three.js/97/three.min.js"></script>
produces different quaternions for Camera and Object3D:
{"_x":-0.11591689595929515,"_y":0.8804762392171493,"_z":0.27984814233312133,"_w":0.36470519963100095}
{"_x":0.27984814233312133,"_y":-0.3647051996310009,"_z":0.11591689595929516,"_w":
(These are two quaternions pointing in opposite directions along the Z axis.)
The problem lies in the behavior of the lookAt function. I dug into the source code of Object3D and found this if statement:
https://github.com/mrdoob/three.js/blob/master/src/core/Object3D.js#L331
if ( this.isCamera ) {
m1.lookAt( position, target, this.up );
} else {
m1.lookAt( target, position, this.up );
}
As you can see, Object3D is handled differently from Camera: target and position are swapped.
Object3D's documentation says:
lookAt ( x : Float, y : Float, z : Float ) : null
Rotates the object to face a point in world space.
but the code does the opposite. It uses Matrix4's lookAt function
lookAt ( eye : Vector3, center : Vector3, up : Vector3 ) : this
Constructs a rotation matrix, looking from eye towards center oriented by the up vector.
putting target into eye, and position into center.
I can deal with that, but it seems odd. Can anybody explain why it is so?
r.97

In three.js, an unrotated object is considered to face its local positive-z axis.
The exception is a camera, which faces its local negative-z axis.
This design decision follows the OpenGL convention.
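The swap in the source quoted above is what reconciles the two conventions. Here is a minimal plain-JavaScript sketch (arrays instead of THREE.Vector3, so it runs without three.js; the function name is illustrative): the local z axis a lookAt matrix produces is the normalized eye − target vector, so swapping the two arguments negates it.

```javascript
// Sketch, no three.js needed: Matrix4.lookAt() builds its z basis column
// from normalize(eye - target). Swapping eye and target therefore negates
// the resulting local z axis -- the 180-degree difference seen in the
// two quaternions above.
function localZAxis(eye, target) {
  const z = [eye[0] - target[0], eye[1] - target[1], eye[2] - target[2]];
  const len = Math.hypot(z[0], z[1], z[2]);
  return z.map(c => c / len);
}

const position = [1, 1, -1];
const target = [0, 0, 0];

// Camera branch: m1.lookAt(position, target, up) -> looks down local -z
const cameraZ = localZAxis(position, target);
// Object3D branch: m1.lookAt(target, position, up) -> local +z faces the target
const objectZ = localZAxis(target, position);

console.log(cameraZ); // roughly [ 0.577, 0.577, -0.577 ]
console.log(objectZ); // each component negated
```

Both objects end up "aimed" at the origin; they just disagree about which local axis counts as forward.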

Related

How to move an object in particular direction with specific distance in three.js

I need to move an object along a directional vector through some distance. I found translateOnAxis(vector, distance) of the Object3D class, but I'm not able to understand how it works.
I have a sphere object. I'm scaling it to look like an ellipse, and setting its position and direction. Now I need this object to move in the same direction I've set, through some distance. When I apply it, I can't see the object. Can anybody suggest how this can be achieved?
var geometry = new THREE.SphereGeometry( radius, 64, 64, 0, -Math.PI );
geometry.applyMatrix( new THREE.Matrix4().makeScale( 1, 1, zScale ); //scaling it to look like ellipse
var direction = new THREE.Vector3( xDir, yDir, zDir);
var ellipse = new THREE.Mesh( geometry, material );
ellipse.lookAt(direction);
ellipse.position.set( xPos, yPos, zPos);
ellipse.translateOnAxis(direction, distance);
Your pasted code is buggy.
You're missing a ) on your applyMatrix line.
Are you using a debugger and observing console errors/warnings?
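As for how translateOnAxis works: it takes an axis in object space, rotates it by the object's current quaternion, and adds axis × distance to the position; three.js assumes the axis is already a unit vector. A plain-JavaScript sketch of the unrotated case (array vectors instead of THREE.Vector3, so it runs standalone; here the axis is normalized for safety):

```javascript
// Sketch of what Object3D.translateOnAxis(axis, distance) amounts to for an
// unrotated object: position += normalize(axis) * distance.
// (three.js itself assumes a unit axis and rotates it by the object's
// quaternion first; both details are elided here.)
function translateOnAxis(position, axis, distance) {
  const len = Math.hypot(axis[0], axis[1], axis[2]);
  return position.map((p, i) => p + (axis[i] / len) * distance);
}

console.log(translateOnAxis([1, 2, 3], [0, 0, 2], 5)); // [ 1, 2, 8 ]
```

Because the axis is interpreted in object space, after `ellipse.lookAt(direction)` the mesh's local +z faces `direction`, so `ellipse.translateOnAxis(new THREE.Vector3(0, 0, 1), distance)` would move it forward along that direction.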

Three.js: Draw a vector on plane

I'm a newbie in three.js (and on Stack Overflow).
I tried to find an answer, but I'm not able to do this simple thing.
I'm playing with Helpers and Plane.
I want to create a Plane (and its PlaneHelper), and draw an arbitrary vector on this Plane.
All is right if the plane's distance from origin is set to 0.
If I give a distance to the plane, the vector is not on the plane.
Here is the commented code I use for this little experiment.
By projecting both the origin and the point onto the plane, I was convinced that arrowHelper_Point would remain on the plane, but it does not.
Where is my mistake? I cannot understand it.
// Define ARROW_LENGTH to display ArrowHelper
const ARROW_LENGTH = 5;
// Point (0,0,0)
var origin = new THREE.Vector3(0, 0, 0);
// Axes helper in (0,0,0)
var axesHelperOrigin = new THREE.AxesHelper(100);
scene.add(axesHelperOrigin);
// Define a plane by the normal, color and distance from (0,0,0)
var vectorNormal = {
normal: new THREE.Vector3(1, 1, 0).normalize(),
color: "rgb(255, 255, 0)",
colorNormal: "rgb(255,100,0)",
colorVector: "rgb(194, 27, 255)",
distance: -3,
};
// Create Plane from the normal and distance
var plane = new THREE.Plane(vectorNormal.normal, vectorNormal.distance);
// Add PlaneHelper to scene
var planeHelper = new THREE.PlaneHelper(plane, 100, vectorNormal.color);
scene.add(planeHelper);
// Add ArrowHelper to display normal
// Find the projection of origin on plane
var originOnPlane = plane.projectPoint(origin);
var arrowHelper_Normal = new THREE.ArrowHelper(vectorNormal.normal, originOnPlane, ARROW_LENGTH, vectorNormal.colorNormal);
scene.add(arrowHelper_Normal);
// Define a point "random"
var point = new THREE.Vector3(5, -2, 6);
// Project the point on plane
var pointOnPlane = plane.projectPoint(point);
// Draw ArrowHelper to display the pointOnPlane, from originOnPlane
var arrowHelper_Point = new THREE.ArrowHelper(pointOnPlane.normalize(), originOnPlane, ARROW_LENGTH, vectorNormal.colorVector);
scene.add(arrowHelper_Point);
EDIT: OK, I think I found the error.
Looking at this Get direction between two 3d vectors using Three.js?
I need the vector between the two points:
var dir=new THREE.Vector3();
dir.subVectors(pointOnPlane,originOnPlane).normalize();
And use dir as the arrow direction.
Sorry for asking an obvious thing.
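The corrected reasoning can be verified without three.js at all: two points projected onto the same plane always differ by a vector perpendicular to the plane normal, so dir is guaranteed to lie in the plane. A plain-JavaScript check using the question's values (normal (1, 1, 0) normalized, distance −3, point (5, −2, 6); array vectors instead of THREE.Vector3):

```javascript
// Plain-JS check: projecting two points onto a plane (n . p + d = 0) and
// subtracting them yields a direction perpendicular to n, i.e. in the plane.
const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const s = Math.hypot(1, 1, 0);
const n = [1 / s, 1 / s, 0];   // plane normal, as in the question
const d = -3;                  // plane distance, as in the question

// Same math as THREE.Plane.projectPoint: p - (n . p + d) * n
const project = p => p.map((c, i) => c - (dot(n, p) + d) * n[i]);

const originOnPlane = project([0, 0, 0]);
const pointOnPlane = project([5, -2, 6]);
const dir = pointOnPlane.map((c, i) => c - originOnPlane[i]);

console.log(dot(dir, n)); // ~0: dir lies in the plane
```

Normalizing `pointOnPlane` itself, as the original code did, instead produces a direction from the world origin, which only stays in the plane when the plane passes through the origin.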

How to setup a camera that follows a circle path?

I'm trying to create a camera that follows an object that rotates on an orbit around a sphere. But every time the camera reaches the polar coordinates of the orbit, the direction changes. I just set the position of the camera according to the object it has to follow and call lookAt afterwards:
function render() {
rotation += 0.002;
// get the current point on the path
pt = path.getPoint( t );
// set the marker position
marker.position.set( pt.x, pt.y, pt.z );
marker.lookAt( new THREE.Vector3(0,0,0) );
// rotate the mesh that illustrates the orbit
mesh.rotation.y = rotation;
// set the camera position
var cameraPt = cameraPath.getPoint( t );
camera.position.set( cameraPt.x, cameraPt.y, cameraPt.z );
camera.lookAt( marker.position );
t = (t >= 1) ? 0 : t += 0.002;
renderer.render( scene, camera );
}
Here's a complete fiddle: http://jsfiddle.net/krw8nwLn/69/
I've created another fiddle with a second cube which represents the desired camera behaviour: http://jsfiddle.net/krw8nwLn/70/
What happens is that the camera's lookAt function will always try to align the camera with the horizontal plane (so that the "up" direction is always (0, 1, 0)). And when you reach the top and bottom of the ellipse path, the camera will instantaneously rotate 180° so that up is still up. You can also see this in your "desired behaviour" example, as the camera cube rotates so that the colors on the other side are shown.
A solution is to not use lookAt for this case, because it does not support cameras doing flips like this. Instead set the camera's rotation vector directly. (Which requires some math, but you look like a math guy.)
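One way to "set the rotation directly" is to build the camera basis yourself, carrying the up vector over from the previous frame instead of using a fixed (0, 1, 0); the frame then rolls smoothly through the poles instead of flipping. A plain-JavaScript sketch of that idea (array vectors, illustrative function name, input assumed non-degenerate):

```javascript
// Sketch: derive right/up from the desired forward direction and the
// PREVIOUS frame's up vector (parallel-transport style), so the frame
// never snaps 180 degrees at the top or bottom of the path.
const cross = (a, b) => [
  a[1] * b[2] - a[2] * b[1],
  a[2] * b[0] - a[0] * b[2],
  a[0] * b[1] - a[1] * b[0],
];
const normalize = v => {
  const len = Math.hypot(v[0], v[1], v[2]);
  return v.map(c => c / len);
};

function makeBasis(forward, prevUp) {
  const f = normalize(forward);
  const right = normalize(cross(f, prevUp));
  const up = cross(right, f); // re-orthogonalized up, fed into the next frame
  return { forward: f, right, up };
}

// Looking down -z with up (0, 1, 0) gives the usual right-handed frame:
console.log(makeBasis([0, 0, -1], [0, 1, 0]));
```

In the render loop you would call this each frame with the camera's current up, then turn the basis into a camera orientation (for example via Matrix4.makeBasis followed by Quaternion.setFromRotationMatrix).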

Rotation with negative scale

I'm creating a tool to rotate images in ThreeJs, but it doesn't work when dealing with negative scales.
The image is displayed in a Mesh created using a THREE.PlaneGeometry element and a material which maps the corresponding image.
The tool is an object that has an element called gizmo (it's a small mesh) which is selected and dragged by the user to rotate the object.
To do the rotation I define an angle and an axis. The angle is defined by two vectors created using the position of the gizmo (original and current) and the position of the Mesh.
var gizmoOriginalPosition = this.gizmoOriginalPosition.clone().applyMatrix4( this.matrixWorld );
var imagePosition = this.imageToTransformOriginalPosition.clone().applyMatrix4( this.imageToTransformParentOriginalMatrix );
var vector1 = gizmoOriginalPosition.sub( imagePosition ).normalize();
var vector2 = point.sub( imagePosition ).normalize();
var angle = Math.acos( vector1.dot( vector2 ) );
var axis = new THREE.Vector3( 0, 0, 1 );
var ortho = vector2.clone().cross( vector1 );
var _m = this.imageToTransformOriginalMatrix.clone();
this.tempMatrix.extractRotation( _m );
var q = new THREE.Quaternion().setFromRotationMatrix( this.tempMatrix );
var _axis = axis.clone().applyQuaternion( q );
var f = ortho.dot( _axis );
f = f > 0 ? 1 : -1;
angle *= -f;
var q = new THREE.Quaternion().setFromAxisAngle( axis, angle );
var Q = new THREE.Quaternion().multiplyQuaternions( this.imageToTransformOriginalQuaternion, q );
imageToTransform.quaternion.copy( Q );
The axis of rotation is always ( 0, 0, 1) because the Mesh is a plane in XY.
point is the new position of the gizmo using a plane of intersection.
The vectors to define the angle are in world coordinates. ortho is a vector to define the direction of the angle, so the Mesh rotates in the direction of the mouse pointer. I define the direction of the angle with the f value obtained using ortho and axis. The axis ( 0, 0, 1 ) is rotated so its direction is in world coordinates ( ortho is in world coordinates ).
This works as expected in almost every case, except when the Mesh has a negative scale in X and Y. Here the image rotates in the opposite direction to the mouse pointer.
Thanks.

Extruded spline (THREE.SceneUtils.createMultiMaterialObject) not responding to Three.Ray

I have a page using elements from the extruded spline example and the mouse tooltip example. I'm trying to debug the starting elements of this project before moving on. The mouse tooltip is working on a variety of objects, except for the extruded spline.
Using webGL renderer, if that matters.
Code for spline creation (not including Vector3 lines or circular extrude):
function addGeometry( geometry, color, x, y, z, rx, ry, rz, s, name ) {
var mesh = THREE.SceneUtils.createMultiMaterialObject( geometry, [
new THREE.MeshLambertMaterial( { color: color } )
] );
mesh.position.set( x, y, z );
mesh.scale.set( s, s, s );
mesh.name = name;
scene.add( mesh );
}
The intersect/Three.Ray code in update() is the same as the example linked above. I also tried adding the spline to a parent, but still no changes onMouseOver. Later this week I might transition over to THREEx DOM events and Tween :D
Mini issue which could be separate questions:
Witnessing some inaccuracy in Ray-linked OnMouseOver events on planes. It could also be the fact I'm using large distances? Planes are 1000x1000 and camera is 2000px away. I know that's ridiculous and I'm in the process of fixing that as well.
Thanks for listening!
Since you have only provided code snippets, here is a guess: Because a Multi-material object is hierarchical, you need to set the recursive flag in ray.intersectObjects() to true like so:
var intersects = ray.intersectObjects( scene.children, true );
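The reason the flag matters can be sketched without three.js: createMultiMaterialObject returns a parent object whose real meshes are children (one per material), so a non-recursive pass over scene.children never reaches them. A toy model of the traversal (plain JS, illustrative names):

```javascript
// Toy model of the recursive flag in ray.intersectObjects: only a
// recursive walk descends into the parent object that
// THREE.SceneUtils.createMultiMaterialObject returns.
function collectTestableMeshes(objects, recursive) {
  const hits = [];
  for (const obj of objects) {
    if (obj.isMesh) hits.push(obj);
    if (recursive) hits.push(...collectTestableMeshes(obj.children, true));
  }
  return hits;
}

// A multi-material object: an empty parent with one mesh per material.
const multiMaterialObject = {
  isMesh: false,
  children: [
    { isMesh: true, children: [] }, // the MeshLambertMaterial mesh
  ],
};

console.log(collectTestableMeshes([multiMaterialObject], false).length); // 0
console.log(collectTestableMeshes([multiMaterialObject], true).length);  // 1
```

With `recursive` left false, the parent itself is the only candidate, and since it has no geometry of its own the ray never registers a hit.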
