I'm trying to create a camera that follows an object rotating on an orbit around a sphere. But every time the camera reaches the polar coordinates of the orbit, the direction changes. I just set the position of the camera according to the object it has to follow and call lookAt afterwards:
function render() {
rotation += 0.002;
// get the current point on the path
pt = path.getPoint( t );
// set the marker position
marker.position.set( pt.x, pt.y, pt.z );
marker.lookAt( new THREE.Vector3(0,0,0) );
// rotate the mesh that illustrates the orbit
mesh.rotation.y = rotation;
// set the camera position
var cameraPt = cameraPath.getPoint( t );
camera.position.set( cameraPt.x, cameraPt.y, cameraPt.z );
camera.lookAt( marker.position );
t = (t >= 1) ? 0 : t += 0.002;
renderer.render( scene, camera );
}
Here's a complete fiddle: http://jsfiddle.net/krw8nwLn/69/
I've created another fiddle with a second cube which represents the desired camera behaviour: http://jsfiddle.net/krw8nwLn/70/
What happens is that the camera's lookAt function will always try to align the camera with the horizontal plane (so that the "up" direction is always (0, 1, 0)). And when you reach the top and bottom of the ellipse path, the camera will instantaneously rotate 180° so that up is still up. You can also see this in your "desired behaviour" example, as the camera cube rotates so that the colors on the other side are shown.
A solution is to not use lookAt in this case, because it does not support cameras doing flips like this. Instead, set the camera's rotation vector directly. (Which requires some math, but you look like a math guy.)
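Something like the following sketch can serve as a starting point (using the path tangent as the up vector is my own assumption, not necessarily the exact math intended above): build the orientation with Matrix4.lookAt, feeding it an up vector that travels along the orbit instead of the fixed (0, 1, 0), and write the result into the camera's quaternion.
// Reusable temporaries, created once outside render()
var camUp = new THREE.Vector3();
var camOrientation = new THREE.Matrix4();
// Inside render(), instead of camera.lookAt( marker.position ):
// (assumes cameraPath is a THREE.Curve, so getTangent(t) is available)
camUp.copy( cameraPath.getTangent( t ) ).normalize(); // up follows the direction of travel
camOrientation.lookAt( camera.position, marker.position, camUp );
camera.quaternion.setFromRotationMatrix( camOrientation ); // no 180° snap at the poles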
Related
I have a PlaneGeometry & mesh, extent is X,Y, normal is Z-axis
And a camera centered above that plane looking down from +Z axis.
(basically looking down at the plane, which is a topo/terrain map)
By default, OrbitControls will rotate the view around the X & Y axis.
(which is fairly useless in this case)
What [mostly] works is to rotate the scene around the X-axis with scene.rotateX(-Math.PI/2)
and then drive the camera/view to be above the Z-axis.
After that, OrbitControls do the right thing:
vertical mouse tilts the view down to (or up from) the plane
horizontal mouse spins the plane around the z-axis (so can see from the other direction)
Two 'problems':
Is there an API to set the OrbitControl to be above the Z-axis?
(after scene.rotateX, the view is at elevation 0, looking across the plane)
I'd like to rotate the camera/view to above the Z-axis at altitude.
Is there an alternative way to get OrbitControls to select which axis to rotate?
(so without the scene.rotateX, the camera is in the right place)
There's a related fiddle (ignore the SpotLight): https://jsfiddle.net/4azo5bvf/65/
Edit:
const camera = new THREE.PerspectiveCamera( 60, w/h, 0.1, 100 );
camera.position.set(0, 0, 50);
camera.up.set(0, 0, 1); // <=== spin around Z-axis
const ob_controls = new THREE.OrbitControls(camera, canvas);
So we can mark this as 'answered' (thanks to @WestLangley).
The easy solution is to use camera.up.set(0, 0, 1)
Apparently, OrbitControls uses that to determine the rotational/axial orientation.
const camera = new THREE.PerspectiveCamera( 60, w/h, 0.1, 100 );
camera.position.set(0, 0, 50);
camera.up.set(0, 0, 1); // <=== spin around Z-axis
const ob_controls = new THREE.OrbitControls(camera, canvas);
After further review (and TerekC's answer to "Three.JS rotate projection so that the y axis becomes the z-axis"), I changed to use:
THREE.Object3D.DefaultUp.set(0, 0, 1); // Z-axis up, and spinable
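One ordering detail (my note, based on how Object3D is constructed, not something stated above): .up is cloned from DefaultUp in the Object3D constructor, so the DefaultUp line has to run before the camera, the controls, and any meshes are created. A minimal sketch:
THREE.Object3D.DefaultUp.set(0, 0, 1); // must come first: every new Object3D clones DefaultUp into its .up
const camera = new THREE.PerspectiveCamera( 60, w/h, 0.1, 100 );
camera.position.set(0, 0, 50); // above the plane, at altitude on the +Z axis
const ob_controls = new THREE.OrbitControls(camera, canvas); // OrbitControls picks up camera.up here
ob_controls.update();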
I have a special control called SphericalControls. Its similar to OrbitControls, but it keeps camera at position 0,0,0 and instead rotates camera on x and y to look around a scene. It is placed in the middle of a SphereBufferGeometry which has a 360 equirectangular image projected upon it. The user can look around the 360 image, and as he does the camera x and y rotation values change.
When a user clicks a button, I need to take these x and y rotation values and rotate the sphere to the rotation of the camera. I then set camera back to x:0 and y:0.
The result is that the camera is reset and the 360 scene has now rotated to show the same rotation view that the camera was previously looking at. So to the user, the view stays basically static, just the values for camera.rotation and sphere rotation have swapped.
This works great if I offset the texture on the sphere:
sphereObj.material.map.wrapS = THREE.RepeatWrapping;
sphereObj.material.map.offset.x = ((camera.rotation.x) / (Math.PI * 2));
sphereObj.material.map.needsUpdate = true;
sphereObj.material.needsUpdate = true;
camera.rotation.set(0, 0, 0);
// Success!
But what I need to do is not offset the texture, but rotate the entire geometry. I have tried:
var axis = new THREE.Vector3(0, 1, 0).normalize();
var offsetRadian = ((camera.rotation.x) / (Math.PI * 2));
sphere.rotateOnAxis(axis, offsetRadian);
// Fail
But the result is that the sphere rotation is off by approx 30%. Any help is appreciated.
Every object's rotational data is stored in its respective .quaternion object. Both camera and sphereObj have a quaternion, so what you could do is copy the camera's rotational data into the sphere:
// Get the camera's rotation (clone so the camera's own quaternion is not modified)
var targetRotation = camera.quaternion.clone();
// Invert the rotation
targetRotation.inverse();
// Set sphere's rotation
sphereObj.quaternion.copy(targetRotation);
camera.rotation.set(0, 0, 0);
I'm not entirely sure if you need the .inverse() line... if you're noticing the sphere is rotating in the opposite direction, just get rid of it to get the desired result.
I have a three.js animation of a person running. I have embedded this in an iFrame on my site however the character runs off the screen.
I am very happy with the positioning and the camera angle, I just need to move it right so that the character is centred in the iFrame.
Below is the code I am using.
scene = new THREE.Scene();
camera = new THREE.PerspectiveCamera(30, window.innerWidth / window.innerHeight, 1, 4000);
camera.position.set(0, 150, 50);
camera.position.z = cz;
camera.zoom = 3.5;
camera.updateProjectionMatrix();
scene.add(camera);
You could use the camera.lookAt() method, which will point the camera towards the desired position.
// You could set a constant vector
var targetPos = new THREE.Vector3(50, 25, 0);
camera.lookAt(targetPos);
// You could also do it in the animation loop
// if the position will change on each frame
function update() {
person.position.x += 0.5;
camera.lookAt(person.position);
renderer.render(scene, camera);
}
I feel like the lookAt() method wouldn't work. It will just rotate the camera, and you specified you like the camera placement/angle.
If you want to move the camera to the right along with your model, set the camera's position.x equal to the model's position.x every frame (assuming left/right is still the X axis).
person.position.x += 0.5;
camera.position.x = person.position.x;
Alternatively, you could keep the object and camera static and move the ground plane. Or even have a rotating cylinder with a big enough radius flipped on its side.
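A rough sketch of that last alternative (the texture file, repeat counts, and runSpeed are mine, purely illustrative): keep the runner and camera static and scroll a repeating ground texture underneath them, which reads as forward motion.
const groundTexture = new THREE.TextureLoader().load('ground.jpg'); // hypothetical texture
groundTexture.wrapS = groundTexture.wrapT = THREE.RepeatWrapping;
groundTexture.repeat.set(20, 20);
const ground = new THREE.Mesh(
    new THREE.PlaneGeometry(2000, 2000),
    new THREE.MeshBasicMaterial({ map: groundTexture })
);
ground.rotation.x = -Math.PI / 2; // lay the plane flat under the character
scene.add(ground);
const runSpeed = 0.005; // hypothetical scroll speed per frame
// inside the animation loop, instead of moving person or camera:
groundTexture.offset.y -= runSpeed; // adjust sign/axis to match the run direction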
I am trying to create an animation of a sphere around which the camera rotates, with a circle drawn on it (using a THREE.TorusGeometry).
Then I project a plane onto the current point, defined by the direction from the camera position to the origin (0,0,0).
For a circle defined by y=0 and x²+z²=1 (i.e. a circle in the Oxz plane = the equatorial plane of the sphere), you can see the result at:
link 1 : circle defined by y=0 and x²+z²=1
As you can see, the coordinates of the plane are drawn correctly, but I can't understand why the yellow circle is not drawn in the Oxz plane (in this link, you can see that it is in the Oxy plane).
Before the matrix multiplication, I defined the torus vector above as:
var coordTorus = new THREE.Vector3(radius*Math.cos(timer), 0, radius*Math.sin(timer));
i.e. by x'²+z'²=1 and y'=0 (choice 2). In this case, I don't get a valid result for the yellow circle: it is drawn in the Oxy plane and not in the Oxz plane as expected.
To get a correct result, I have to define x'²+y'²=1 and z'=0 in the local plane, but I can't understand why.
Could someone explain this?
It was hard to extract from all the code where exactly your problem was. I cleaned things up and solved it differently, and I think this fiddle shows what you wanted.
Instead of rotating all the objects, I rotated only the camera, which seems much simpler than your solution:
/**
* Rotate camera
*/
function rotateCamera() {
// For camera rotation
stepSize += 0.002;
alpha = 2 * Math.PI * stepSize;
if (alpha > 2 * Math.PI) {
stepSize = 0;
}
// Rotate camera around a circle
camera.position.x = center.x + distance * Math.cos(alpha);
camera.position.z = center.y + distance * Math.sin(alpha);
// Camera should look at center
camera.lookAt(new THREE.Vector3(0, 0, 0));
}
And then I added your tangent plane to the camera instead of the scene, so it rotates with the camera:
camera.add(plane);
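One detail to keep in mind (my note, not part of the original answer): children of the camera only render if the camera itself has been added to the scene graph, and their transforms are then expressed in camera-local space.
scene.add(camera); // required so the camera's children get rendered
camera.add(plane);
plane.position.set(0, 0, -5); // hypothetical offset: 5 units in front of the camera, in camera space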
When I have the following code:
camera = new THREE.PerspectiveCamera( 45, width/height, 1, 10000 );
scene.add( camera );
camera.rotation.set(-0.09388335, 0.9945234, 0.0474389);
camera.position.z = 100;
camera.rotation.set(-0.09388335, 0.9945234, 0.0474389);
at render time the position component of camera.matrixWorldInverse changes. Does anyone know why?
My guess is that because you are rotating the camera locally, you are changing its position globally. If that's the case, then why would rotating around the world axis in the following work:
How to rotate a object on axis world three.js?
I was looking at camera.matrixWorldInverse, not camera.matrixWorld.
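For context, here is a minimal sketch of roughly what the renderer does each frame (the explicit calls are mine for illustration; recent three.js uses invert(), older releases use getInverse()):
const camera = new THREE.PerspectiveCamera( 45, width/height, 1, 10000 );
camera.rotation.set(-0.09388335, 0.9945234, 0.0474389);
camera.position.z = 100;
// Before rendering, matrixWorldInverse has not been recomputed yet.
console.log( camera.matrixWorldInverse.elements );
// Roughly what happens during renderer.render( scene, camera ):
camera.updateMatrixWorld(); // bake position/rotation into matrixWorld
camera.matrixWorldInverse.copy( camera.matrixWorld ).invert(); // recompute the inverse from it
// Now matrixWorldInverse reflects the rotation and translation set above,
// which is why its values appear to change at render time.
console.log( camera.matrixWorldInverse.elements );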