I am working with a THREE.js orthographic camera, and I am using 'OrbitControls.js' for the controls.
I want to zoom toward the mouse position in the 3D world, and I'm hoping for a solution in JS. I know this is not as simple as with a perspective camera (where we just move along the camera direction).
There is a .zoom parameter on the orthographic camera.
Try increasing/decreasing it.
This can be solved by figuring out the difference between your current zoom level and the next one, and then panning in the direction of the mouse.
Your OrthographicCamera has a width and height in world units: camera.right - camera.left and camera.top - camera.bottom. Say you're at zoom level 1 and you zoom to 2: divide those by two and you get the new width and height in world units.
You then need to figure out the difference in X and Y and pan the camera in that direction.
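A rough sketch of that idea (assumptions of mine, not from the answer: camera is a THREE.OrthographicCamera, controls is your OrbitControls instance, and newZoom comes from your own wheel handling, with controls.enableZoom turned off so the two don't fight):

// Zoom the ortho camera toward the mouse by panning by the change in view size.
function zoomToCursor(event, newZoom) {
  // mouse position in normalized device coordinates, [-1, 1]
  const ndcX = (event.clientX / window.innerWidth) * 2 - 1;
  const ndcY = -(event.clientY / window.innerHeight) * 2 + 1;

  // visible world size before the zoom change
  const widthBefore = (camera.right - camera.left) / camera.zoom;
  const heightBefore = (camera.top - camera.bottom) / camera.zoom;

  camera.zoom = newZoom;
  camera.updateProjectionMatrix();

  // visible world size after the zoom change
  const widthAfter = (camera.right - camera.left) / camera.zoom;
  const heightAfter = (camera.top - camera.bottom) / camera.zoom;

  // pan so the world point under the cursor stays under the cursor
  const pan = new THREE.Vector3(
    ndcX * (widthBefore - widthAfter) / 2,
    ndcY * (heightBefore - heightAfter) / 2,
    0
  ).applyQuaternion(camera.quaternion); // camera-local offset -> world space

  camera.position.add(pan);
  controls.target.add(pan); // keep OrbitControls' target in sync
}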
I'm using ThreeJS, but this is a general math question.
My end goal is to position an object in my scene using 2D screen space coordinates; however, I want a specific z position in the perspective projection.
As an example, I have a sphere that I want to place towards the bottom left of the screen while having the sphere be 5 units away from the camera. If the camera were to move, the sphere would maintain its perceived size and position.
I can't use an orthographic camera because the sphere needs to be able to move around in the perspective projection. At some point the sphere will be undocked from the screen and interact with the scene using physics.
I'm sure the solution is somewhere in the camera inverse matrix, however, that is beyond my abilities at the current moment.
Any help is greatly appreciated.
Thanks!
Your post includes too many questions, which is out of scope for StackOverflow. But I’ll try to answer just the main one:
Create a plane Mesh using PlaneGeometry.
Rotate it to face the camera, place it 5 units away from the camera.
Add it as a child with camera.add(plane); so whenever the camera moves, the plane moves with it.
Use a Raycaster's .setFromCamera(coords, camera) method, followed by .intersectObject(plane), to convert x, y screen coordinates into an x, y, z world position where the ray intersects the plane. You can read about both methods in the docs.
Once it's working, make the plane invisible with visible = false.
You can see the raycaster working in this official example: https://threejs.org/examples/#webgl_geometry_terrain_raycast
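A minimal sketch of that recipe (assuming scene, camera, sphere, and mouse coordinates screenX/screenY already exist in your code; the variable names are mine):

// Invisible plane, 5 units in front of the camera, that follows the camera around.
const plane = new THREE.Mesh(
  new THREE.PlaneGeometry(100, 100),
  new THREE.MeshBasicMaterial()
);
plane.position.z = -5; // the camera looks down its local -Z, so this sits 5 units in front of it
camera.add(plane);     // moves with the camera
scene.add(camera);     // the camera must be in the scene graph for its children to follow along

const raycaster = new THREE.Raycaster();
const coords = new THREE.Vector2(
  (screenX / window.innerWidth) * 2 - 1,   // pixels -> normalized device coordinates [-1, 1]
  -(screenY / window.innerHeight) * 2 + 1
);
raycaster.setFromCamera(coords, camera);
const hit = raycaster.intersectObject(plane)[0];
if (hit) {
  sphere.position.copy(hit.point); // world-space position, 5 units from the camera
}

plane.visible = false; // once it works; the raycaster still intersects invisible meshes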
In three.js, when we define the camera we use something like camera = new THREE.PerspectiveCamera(fov, window.innerWidth / window.innerHeight, near, far); If an object is moved outside the bounds set by the two planes, near and far, it gets clipped.
Suppose now I rotate the object and I zoom in/out.
How can I find the current plane my camera is on, please? I am learning three.js, so I don't know if I am explaining this clearly enough.
I thought camera.position.z would give me this info. In fact, I think it gives the correct value when my camera looks down the z-axis. However, when I rotate by 90 degrees (effectively moving my camera onto the x-axis), the value of camera.position.z changes by a lot.
I have added a graph in case it helps. The plane defined by the blue outline cuts through the data and is parallel to the far and near planes. As I zoom in and out, I am moving that plane forward and backward, right? Do I understand this correctly, or have I totally missed it? I would like to know the value (a float between near and far) indicating how far that blue plane is from the camera. If you rotate, that distance shouldn't change.
My end goal is to be able to find out how close the scene is to the viewer. If it gets too close, I will add some more elements to the scene; if it is too far, those elements will be removed.
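(As an aside, the distance being described here can be measured independently of rotation by projecting onto the camera's view direction instead of reading camera.position.z. A sketch, where target is a stand-in for a point on that blue plane:)

const viewDir = new THREE.Vector3();
camera.getWorldDirection(viewDir);                                        // unit vector the camera is looking along
const toTarget = new THREE.Vector3().subVectors(target, camera.position);
const distanceAlongView = toTarget.dot(viewDir);                          // comparable to camera.near / camera.far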
I'm using THREE.OrbitControls to dolly a THREE.OrthographicCamera. But even though the ortho camera renders correctly as repositioned, all that updates on the orthographic camera is the 'zoom' property, even after calling camera.updateProjectionMatrix(). Do I need to manually update the 'position' property of the camera based on the updated 'zoom' property? I want to display its position in my UI after dollying it.
(Note: this is a rewrite of my other question, THREE.js Orthographic camera position not updating after zoom with OrbitControl, in which I thought I was zooming with the OrbitControl but was actually dollying. Sorry about this.)
Dollying in/out with an ortho cam has no noticeable effect. With ortho cams there is no perception of proximity because there is no perspective: all objects appear the same size regardless of their distance from the lens, since the projection rays are all parallel. The only difference you'd notice is objects getting clipped once they pass the near or far plane.
So, the decision was made that scrolling with OrbitControls would change the zoom of the camera, narrowing in/out of the center.
If you want to force the camera to move closer to or further from its focus point, you can translate it back and forth along its z-axis with:
camera.translateZ(distance);
A negative distance moves it closer, and a positive distance moves it further from its focus point.
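Since it is the zoom that changes, one thing you could surface in the UI instead of a position is the visible world extent, which follows directly from the zoom (a rough sketch for a plain OrthographicCamera):

const viewWidth = (camera.right - camera.left) / camera.zoom;
const viewHeight = (camera.top - camera.bottom) / camera.zoom;
console.log('visible area: ' + viewWidth + ' x ' + viewHeight + ' world units');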
For three.js, is there a way to detect the clip position or when the far clip has been reached based on the camera zoom? Or, some way to convert camera zoom to the same units as the camera near and far clip float values?
It looks like the .zoom field on PerspectiveCamera only affects the camera's field of view and not the near or far clip planes. Here is where it's used when calculating the camera's projection matrix:
https://github.com/mrdoob/three.js/blob/master/src/cameras/PerspectiveCamera.js#L192
You can see how the zoom factor is affecting the field of view by using the getEffectiveFOV function:
https://threejs.org/docs/#api/en/cameras/PerspectiveCamera.getEffectiveFOV
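For example, with a standard PerspectiveCamera:

camera.fov = 45;
camera.zoom = 2;
camera.updateProjectionMatrix();
console.log(camera.getEffectiveFOV()); // roughly 23.4 degrees: zoom narrows the field of view
console.log(camera.near, camera.far);  // unchanged by zoom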
I'm working on a simple Three.js demo that uses OrbitControls.js.
I'd like to change the behavior of panning in OrbitControls. Currently, when you pan the camera, it moves the camera in a plane that is perpendicular to the viewing direction. I'd like to change it so that the camera stays a constant distance from the ground plane and moves parallel to it. Google Earth uses a similar control setup.
Edit: I should have mentioned this detail in the first place, but I'd also like the point where you click and start dragging to remain directly under the cursor throughout the entire drag. There needs to be that solid connection between the mouse movement and what the user expects to happen on the screen. Otherwise, it feels as though I'm 'slipping' when I try to move around the scene.
Can someone give me a high-level explanation of how this might be done (with or without OrbitControls.js)?
EDIT: OrbitControls now supports panning parallel to the "ground plane", and it is the default.
To pan parallel to screen-space (the legacy behavior), set:
controls.screenSpacePanning = true;
Also available is MapControls, which has an API similar to that of Google Earth.
three.js r.94
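A small usage sketch (exact constructor paths vary between the examples/js and module builds of three.js, so adjust to your setup):

const controls = new THREE.OrbitControls(camera, renderer.domElement);
controls.screenSpacePanning = false; // pan parallel to the ground plane (the r94 default)
// controls.screenSpacePanning = true; // legacy screen-space panning

// Or use the map-style controls instead:
// const controls = new THREE.MapControls(camera, renderer.domElement);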
Some time ago I was working on exactly this issue, i.e. adaptation of OrbitControls.js to map navigation.
Here's the code of MapControls.js.
Here's the demo of the controls.
I figured it out. Here's the overview:
Store the mousedown event somewhere.
When the mouse moves, get the new mouse position from the mousemove event.
For each of those points, find the point on the plane where that click lands (you'll need to put the points into camera space, transform them into world space, then fire a ray from the camera through each point to find its intersection with the plane. This page explains the ray-plane intersection test).
Subtract the world-space start intersection point from the world-space end intersection point to get the offset.
Subtract that offset from the camera's target point and you're done!
In the case of OrbitControls.js, the camera always looks at the target point, and its position is relative to that point, so when you change the target, the camera moves with it. Since the target always lies on the plane, the camera moves parallel to that plane (as long as you're panning).
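A sketch of those steps using three.js's built-in Raycaster and Plane (assumptions of mine: the ground plane is y = 0, controls is an OrbitControls-style object with a .target, and the controls' own panning is disabled so the two don't fight):

const groundPlane = new THREE.Plane(new THREE.Vector3(0, 1, 0), 0); // the plane y = 0
const raycaster = new THREE.Raycaster();
const startPoint = new THREE.Vector3();
const currentPoint = new THREE.Vector3();

// Fire a ray from the camera through the mouse position and return where it hits the ground.
function intersectGround(event, out) {
  const ndc = new THREE.Vector2(
    (event.clientX / window.innerWidth) * 2 - 1,
    -(event.clientY / window.innerHeight) * 2 + 1
  );
  raycaster.setFromCamera(ndc, camera);
  return raycaster.ray.intersectPlane(groundPlane, out); // null if the ray misses the plane
}

function onPointerDown(event) {
  intersectGround(event, startPoint); // world point under the cursor when the drag starts
}

function onPointerMove(event) {
  if (!intersectGround(event, currentPoint)) return;
  // Shift camera and target so the start point ends up back under the cursor.
  const offset = new THREE.Vector3().subVectors(startPoint, currentPoint);
  controls.target.add(offset);
  camera.position.add(offset);
}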
You should set your camera's 'up' to the z-axis:
camera.up.set(0, 0, 1);
Then the main problem with OrbitControls is its panUp() function; it needs to be fixed.
My pull request : https://github.com/mrdoob/three.js/pull/12727
The y-axis is relative to the camera's axes, but it should be relative to a fixed plane in the world. To define the expected y-axis, rotate the camera's x-axis by 90° around the world z-axis.
v.setFromMatrixColumn( objectMatrix, 0 ); // get X column of objectMatrix (the camera's x-axis in world space)
v.applyAxisAngle( new THREE.Vector3( 0, 0, 1 ), Math.PI / 2 ); // rotate it 90° around the world z-axis
v.multiplyScalar( distance );
panOffset.add( v );
Enjoy!