React Three Fiber: How do I change the camera angle?

Currently I have a PerspectiveCamera set up like so:
<Canvas>
<PerspectiveCamera
makeDefault
fov={50}
position={[0, 0, 80]}
/>
</Canvas>
It seems to be pointing straight along the z axis. I would like the camera to point straight down, in the negative y direction. How can I change the angle it points at?

You can add a rotation attribute to the camera, the same way you added position.
<Canvas>
<PerspectiveCamera
makeDefault
fov={50}
position={[0, 0, 80]}
rotation={[x, y, z]}
/>
</Canvas>
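For the specific goal in the question, pointing straight down the negative y axis, a rotation of -90° about the x axis does it. A minimal sketch, assuming you also want to move the camera above the scene so there is something beneath it to look at:
<Canvas>
  <PerspectiveCamera
    makeDefault
    fov={50}
    position={[0, 80, 0]}
    rotation={[-Math.PI / 2, 0, 0]}
  />
</Canvas>
Alternatively, camera.lookAt() or a controls component with a target will point the camera for you without setting Euler angles by hand.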

Related

Is instancing applicable to a Three.js scene consisting of ExtrudeGeometry meshes with varying geometries?

I have a Three.js scene consisting of many buildings which are formed by stacking ExtrudeGeometry meshes (think buildings in Mapbox GL JS).
I'm creating these meshes using THREE.Shape and THREE.ExtrudeGeometry (I'm using react-three-fiber):
function coordsToShape(coords) {
const shape = new Shape();
let [x, z] = coords[0];
shape.moveTo(x, z);
for (const [x, z] of coords.slice(1)) {
shape.lineTo(x, z);
}
return shape;
}
function Floor(props) {
const {coords, bottom, top, color} = props;
const shape = coordsToShape(coords);
const geom = new ExtrudeGeometry(shape, {depth: top - bottom, bevelEnabled: false});
return (
<mesh castShadow geometry={geom} position={[0, bottom, 0]} rotation-x={-Math.PI / 2}>
<meshPhongMaterial color={color} />
</mesh>
)
}
Then I stack the floors to produce the scene:
export default function App() {
return (
<Canvas>
{ /* lights, controls, etc. */ }
<GroundPlane />
<Floor coords={coords1} bottom={0} top={1} color="skyblue" />
<Floor coords={coords2} bottom={1} top={3} color="pink" />
<Floor coords={coords3} bottom={0} top={1} color="aqua" />
<Floor coords={coords4} bottom={1} top={3} color="orange" />
</Canvas>
)
}
Full code/demo here. This results in one mesh for the ground plane and one for each building section, so five total.
I've read that using instancing to reduce the number of meshes is a good idea. Is instancing relevant to this scene? Most instancing examples show identical geometries with colors, positions and rotations varying. But can the geometry vary? Should I be using mergeBufferGeometries? But if I do that, will I still get the performance wins? Since I have coordinate arrays already, I'd also be happy using them to construct a large buffer of coordinates directly.
Is instancing relevant to this scene?
Instancing in general is applicable when you are going to render a large number of objects with the same geometry and material but with different transformations (and other per-instance properties such as color).
Merging geometries only makes sense if they can all share the same material. If you need different colors per object, you can achieve this by defining per-vertex color data. Also, merged geometries should be considered static, since it is complicated to transform individual objects once their data have been merged.
Both approaches are intended to lower the number of draw calls in your app, which is an important performance metric. Try to use them whenever possible.
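For this scene, where every floor has a different footprint, merging is the more natural fit of the two. Below is a minimal sketch of that approach (not the original code); it assumes three.js's BufferGeometryUtils helper (exported as mergeGeometries in recent releases, mergeBufferGeometries in older ones) and reuses the coordsToShape helper from the question:
import * as THREE from 'three';
import { mergeGeometries } from 'three/examples/jsm/utils/BufferGeometryUtils.js';

function buildCityGeometry(floors) {
  const parts = floors.map(({coords, bottom, top, color}) => {
    const shape = coordsToShape(coords);
    const geom = new THREE.ExtrudeGeometry(shape, {depth: top - bottom, bevelEnabled: false});
    geom.rotateX(-Math.PI / 2);   // bake the rotation into the vertices
    geom.translate(0, bottom, 0); // bake the per-floor offset too
    // Per-vertex colors let one material render differently colored floors.
    const c = new THREE.Color(color);
    const count = geom.attributes.position.count;
    const colors = new Float32Array(count * 3);
    for (let i = 0; i < count; i++) c.toArray(colors, i * 3);
    geom.setAttribute('color', new THREE.BufferAttribute(colors, 3));
    return geom;
  });
  return mergeGeometries(parts); // one geometry, one draw call
}
The merged geometry can then be rendered with a single <mesh> and a material that has vertexColors enabled (e.g. <meshPhongMaterial vertexColors />), which brings all the buildings down to one draw call.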

ThreeJS world unit to pixel conversion

Is there a way to compute the ratio between world units and pixels in ThreeJS? I need to determine how many units apart my objects need to be in order to be rendered 1 pixel apart on the screen.
The camera is looking at the (x,y) plane from a (0, 0, 10) coordinate, and objects are drawn in 2D on the (x,y) plane at z=0.
<Canvas gl={{ alpha: true, antialias: true }} camera={{ position: [0, 0, 10] }}>
I cannot seem to figure out what the maths are or if there is any function that does it already...
I'm thinking I might have to compare the size of the canvas in pixels and in world units, but I don't know how to get that either. There's also this raycasting solution, but surely there has to be a way to just compute it, no?
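For a perspective camera there is a closed-form answer: the visible height of the frustum at distance d is 2 · d · tan(fov / 2), and dividing that by the canvas height in pixels gives world units per pixel. A minimal sketch (the function name and the canvas-height argument are illustrative):
// World units per screen pixel at a given distance from a PerspectiveCamera,
// here the distance from the camera at z = 10 to the z = 0 drawing plane.
function worldUnitsPerPixel(camera, canvasHeightPx, distance) {
  const vFov = (camera.fov * Math.PI) / 180;               // vertical fov in radians
  const visibleHeight = 2 * Math.tan(vFov / 2) * distance; // frustum height at that distance
  return visibleHeight / canvasHeightPx;
}

// Two objects on the z = 0 plane need to be at least this far apart in world units
// to land on different pixels (in react-three-fiber the canvas size is available
// from useThree().size):
const minSpacing = worldUnitsPerPixel(camera, size.height, 10);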

Three.js Orbit Controls in Equirectangular mapped sphere - maintaining position

I have mapped an equirectangular image onto a sphere using three.js, placed the camera in the middle of the sphere, and am using the OrbitControls to handle things like zooming & rotation.
This all works fantastically until I want to programmatically adjust what the camera is looking at (I tween camera.target), which, I believe, ends up changing the position of the camera. The issue is that afterwards, when you rotate, you rotate out of the sphere. What would be the proper way to achieve this by adjusting only camera.rotation and camera.zoom? I'm okay with stripping down the OrbitControls, but I don't fully understand how the rotation should work and am also open to other options.
If you are using OrbitControls in the center of a panorama, you should leave controls.target at the origin, and set the camera position close to the origin:
camera.position.set( 0, 0, 1 );
By setting
controls.enablePan = false;
controls.enableZoom = false;
the camera will always remain a distance of 1 from the origin, i.e., on the unit sphere.
Then, to look at ( x, y, z ), you programmatically set the camera's position like so:
camera.position.set( x, y, z ).normalize().negate();
That way, when the camera looks at the target (the origin), it will automatically be looking at ( x, y, z ), too.
three.js r.85
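Putting the pieces together, a minimal sketch (assuming OrbitControls is loaded from examples/js; the lookAtPoint helper is illustrative, not part of three.js):
const controls = new THREE.OrbitControls( camera, renderer.domElement );
controls.enablePan = false;
controls.enableZoom = false;
controls.target.set( 0, 0, 0 ); // keep the target at the centre of the sphere
camera.position.set( 0, 0, 1 ); // camera sits on the unit sphere around the origin

function lookAtPoint( x, y, z ) {
  // Placing the camera opposite the point means that looking at the origin
  // also looks toward ( x, y, z ).
  camera.position.set( x, y, z ).normalize().negate();
  controls.update();
}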

How do you tile a texture across a surface in React VR?

I have a 100 m x 100 m box to act as a floor in a React VR test I am working on. I'd like to add a texture to it, but the tile texture just stretches over the entire surface rather than tiling as desired. Here is my component code, nothing special:
<Box
dimWidth={100}
dimDepth={100}
dimHeight={0.5}
texture={asset('check_floor_tile.jpg')}
style={{
color:'#333333',
transform: [{translate: [0, -1, 0]}]
}}
lit
/>
I've had a look for examples without success; any help would be appreciated. Thanks.
You can now tile a texture across a surface by specifying repeat on the texture property of any component that extends BasicMesh (Box, Plane, Sphere, Cylinder, Model).
The functionality has been added to React VR via this PR.
<Plane
texture={{
...asset('texture.jpg'),
repeat: [4, 4],
}}
/>
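Applied to the 100 m x 100 m Box from the question, something like repeat: [100, 100] would tile the texture roughly once per metre (the exact values depend on how large one tile of check_floor_tile.jpg should appear):
<Box
  dimWidth={100}
  dimDepth={100}
  dimHeight={0.5}
  texture={{
    ...asset('check_floor_tile.jpg'),
    repeat: [100, 100],
  }}
  style={{
    color: '#333333',
    transform: [{translate: [0, -1, 0]}]
  }}
  lit
/>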

Three.js plane facing away from camera aligned with viewport

I'm trying to have a plane face away from the camera with the same orientation, so it's aligned in the viewport.
I have a plane in front of the camera, perfectly aligned to the cameras viewport, and I want to flip it in front of the camera, along the objects Y axis, regardless of camera orientation.
The following will orient my plane to face at the camera and works for any orientation:
target.rotation.copy(camera.rotation);
The following will then flip the plane along the plane's Y axis:
target.rotation.y += Math.PI;
All good so far? Except when the camera rotation has a funky tilt to it (say it's looking up and to the left, tilted slightly to the right), the plane's flip is tilted, but not the same way as the camera, leaving me with a plane tilted either to the left or to the right...
I've tried several things such as:
target.rotation.z -= camera.rotation.z;
Nothing... Thanks for your help.
So the problem I was running into was when the camera was in negative z coordinates. This causes the flip on the Y axis to get messed up.
So basically you would do something like this:
var target = new THREE.Object3D();
//position
target.position.copy(s.camera.position);
target.position.add(THREE.Utils.cameraLookDir(s.camera).multiplyScalar(300)); // cameraLookDir is a custom helper, not part of three.js core
//rotation
target.rotation.copy(s.camera.rotation);
target.rotation.y += Math.PI;
target.rotation.z = -s.camera.rotation.z;
if (s.camera.position.z < 0) {
target.rotation.z = s.camera.rotation.z;
}
EDIT:
Add the following to appropriate spots in your program.
camera.rotation.order = 'XZY';
target.rotation.order = 'XZY';
Seems to solve previously encountered tilt issues! (see below)
RESOLVED:
Flipped planes tilted the wrong way in some instances, for example when the camera is at negative z coordinates and its y rotation is not equal to 0 (e.g. a point in space hovering and looking at 0, 0, 0).
This is the solution I was looking for when I found this page (taken from this answer):
mesh.lookAt( camera.position );
The local z-axis of the mesh should then point toward the camera.
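For the original goal of facing away from the camera while staying aligned with the viewport, one alternative worth trying is to copy the camera's quaternion instead of its Euler angles and then flip the object half a turn about its local y axis; quaternions sidestep the Euler-order issues above. A short sketch:
// Align the plane with the camera (including any roll), then flip it away.
target.quaternion.copy( camera.quaternion );
target.rotateY( Math.PI );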
