Three.js - Drawing a torus but unable to understand the equation that defines it

I'm trying to create an animation of a sphere around which the camera rotates, and I have drawn a circle on the sphere (using a THREE.TorusGeometry).
Then I project a plane onto the current point defined by the direction from the camera position to the origin (0,0,0).
For a circle defined by y=0 and x²+z²=1 (i.e. a circle lying in the Oxz plane, the equatorial plane of the sphere), you can see the result at:
link 1: circle defined by y=0 and x²+z²=1
As you can see, the coordinates of the plane are drawn correctly, but I can't understand why the yellow circle is not drawn in the Oxz plane (in this link, you can see that it lies in the Oxy plane).
Before the matrix multiplication, I defined the torus vector as:
var coordTorus = new THREE.Vector3(radius*Math.cos(timer), 0, radius*Math.sin(timer));
i.e. by x'²+z'²=1 and y'=0 (choice 2). In this case, I don't get a valid result for the yellow circle: it is drawn in the Oxy plane and not in the Oxz plane as expected.
To get the correct result, I have to define x'²+y'²=1 and z'=0 in the local plane, but I can't understand why.
Could someone explain this?

It was hard to extract from all the code where exactly your problem was, so I cleaned things up and solved it differently; I think this Fiddle shows what you wanted.
Instead of rotating all the objects, I rotated only the camera, which seems much simpler than your solution:
/**
 * Rotate camera
 */
function rotateCamera() {
    // For camera rotation
    stepSize += 0.002;
    alpha = 2 * Math.PI * stepSize;
    if (alpha > 2 * Math.PI) {
        stepSize = 0;
    }
    // Rotate camera around a circle
    camera.position.x = center.x + distance * Math.cos(alpha);
    camera.position.z = center.y + distance * Math.sin(alpha);
    // Camera should look at center
    camera.lookAt(new THREE.Vector3(0, 0, 0));
}
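For context, a minimal animation loop that could drive the function above (only a sketch; the exact loop in the Fiddle may differ):
function animate() {
    requestAnimationFrame( animate );
    rotateCamera();
    renderer.render( scene, camera );
}
animate();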
And then I added your tangent plane to the camera instead of the scene, so it rotates with the camera:
camera.add(plane);
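Note that objects attached to the camera only render if the camera itself has been added to the scene, and the plane's position is now expressed in the camera's local space. A minimal sketch of what that implies (distanceToSurface is a hypothetical name for however far in front of the camera the tangent plane should sit):
scene.add( camera );                             // children of the camera are only rendered if the camera is in the scene
plane.position.set( 0, 0, -distanceToSurface );  // the camera looks down its local -Z axis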

Related

Three.js: rotate the object gradually to where the camera is looking using OrbitControls

I'm planning to use OrbitControls to do a simple third-person camera view, but I can't seem to figure out how to do it.
When I rotate the camera around an object and press, say, the "W" key to move forward, I want the object's "look" to gradually rotate and move toward the new direction the camera is facing.
How can I do that?
It's possible to do exactly that by gradually rotating the object toward the camera direction.
I made a CodePen here which uses a generic replacement for OrbitControls for simplicity:
https://codepen.io/cdeep/pen/QWMWyYW
// Reusable vectors (assumed to be created once, outside the render loop)
const tempCameraVector = new THREE.Vector3();
const tempModelVector = new THREE.Vector3();
const xAxis = new THREE.Vector3(1, 0, 0);
// Get the X-Z plane in which the camera is looking, to move the player
camera.getWorldDirection(tempCameraVector);
const cameraDirection = tempCameraVector.setY(0).normalize();
// Get the X-Z plane in which the player is looking, to compare with the camera
model.getWorldDirection(tempModelVector);
const playerDirection = tempModelVector.setY(0).normalize();
// Get each angle to the x-axis. The z component tells whether the angle is
// clockwise or anticlockwise, since angleTo only returns a positive value
const cameraAngle = cameraDirection.angleTo(xAxis) * (cameraDirection.z > 0 ? 1 : -1);
const playerAngle = playerDirection.angleTo(xAxis) * (playerDirection.z > 0 ? 1 : -1);
// Angle to rotate the player to face the camera. Clockwise positive
const angleToRotate = playerAngle - cameraAngle;
// Wrap to [-PI, PI] so the player always rotates through the shortest angle
let sanitisedAngle = angleToRotate;
if (angleToRotate > Math.PI) {
    sanitisedAngle = angleToRotate - 2 * Math.PI;
}
if (angleToRotate < -Math.PI) {
    sanitisedAngle = angleToRotate + 2 * Math.PI;
}
// Rotate the model by a small step towards the camera direction
model.rotateY(
    Math.max(-0.05, Math.min(sanitisedAngle, 0.05))
);
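An alternative worth mentioning (not part of the answer above): newer three.js revisions (roughly r104 and later) provide Quaternion.rotateTowards, which handles the shortest-angle clamping for you. A rough sketch, assuming the model is a direct child of the scene:
// Rotation that takes the player's current heading onto the camera's heading
const deltaQ = new THREE.Quaternion().setFromUnitVectors(playerDirection, cameraDirection);
// World-space orientation the player should end up with
const targetQ = deltaQ.multiply(model.quaternion);
// Step towards it by at most 0.05 radians per frame
model.quaternion.rotateTowards(targetQ, 0.05);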

Three.js: How to rotate a Sphere on Axis using camera rotation values

I have a custom control called SphericalControls. It's similar to OrbitControls, but it keeps the camera at position (0,0,0) and instead rotates the camera on x and y to look around a scene. It is placed in the middle of a SphereBufferGeometry which has a 360° equirectangular image projected onto it. The user can look around the 360° image, and as they do, the camera's x and y rotation values change.
When a user clicks a button, I need to take these x and y rotation values and rotate the sphere to the rotation of the camera. I then set camera back to x:0 and y:0.
The result is that the camera is reset and the 360 scene has now rotated to show the same rotation view that the camera was previously looking at. So to the user, the view stays basically static, just the values for camera.rotation and sphere rotation have swapped.
This works great if I offset the texture on the sphere:
sphereObj.material.map.wrapS = THREE.RepeatWrapping;
sphereObj.material.map.offset.x = ((camera.rotation.x) / (Math.PI * 2));
sphereObj.material.map.needsUpdate = true;
sphereObj.material.needsUpdate = true;
camera.rotation.set(0, 0, 0);
// Success!
But what I need to do is not offset the texture, but rotate the entire geometry. I have tried:
var axis = new THREE.Vector3(0, 1, 0).normalize();
var offsetRadian = ((camera.rotation.x) / (Math.PI * 2));
sphere.rotateOnAxis(axis, offsetRadian);
// Fail
But the result is that the sphere rotation is off by approx 30%. Any help is appreciated.
Every object's rotational data is stored in its .quaternion property. Both the camera and sphereObj have a quaternion, so what you could do is copy the camera's rotational data into the sphere:
// Get the camera's rotation (clone it so the camera's own quaternion isn't modified)
var targetRotation = camera.quaternion.clone();
// Invert the rotation
targetRotation.inverse();
// Set the sphere's rotation
sphereObj.quaternion.copy(targetRotation);
camera.rotation.set(0, 0, 0);
I'm not entirely sure you need the .inverse() line... if you notice the sphere rotating in the opposite direction, just remove it to get the desired result.
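Note that in newer three.js releases (r123 and later) Quaternion.inverse() was renamed to invert(), so the equivalent there would be something like:
sphereObj.quaternion.copy( camera.quaternion ).invert();
camera.rotation.set( 0, 0, 0 );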

How to preserve threejs texture scale while applying texture rotation

I'd like to enable a user to rotate a texture on a rectangle while keeping the aspect ratio of the texture image intact. I'm rotating a 1:1 aspect ratio image on a rectangular surface (say width 2 and length 1).
Steps to reproduce:
In the below texture rotation example
https://threejs.org/examples/?q=rotation#webgl_materials_texture_rotation
If we change one of the faces of the geometry like below:
https://github.com/mrdoob/three.js/blob/master/examples/webgl_materials_texture_rotation.html#L57
var geometry = new THREE.BoxBufferGeometry( 20, 10, 10 );
Then you can see that as you play around with the rotation control, the image's aspect ratio is distorted (from a square to a weird shape).
At 0 degrees:
At some angle between 0 and 90 degrees:
I understand that by changing the repeatX and repeatY factors I can control this. It's also easy to see what the values would be at 0 degree and 90 degree rotations.
But I'm struggling to come up with a formula for repeatX and repeatY that works for any texture rotation, given the length and width of the rectangular face.
Unfortunately when stretching geometry like that, you'll get a distortion in 3D space, not UV space. In this example, one UV.x unit occupies twice as much 3D space as one UV.y unit:
This is giving you those horizontally-skewed diamonds when in between rotations:
Sadly, there's no way to solve this with texture matrix transforms. The horizontal stretching will be applied after the texture transform, in 3D space, so texture.repeat won't help you avoid this. The only way to solve this is by modifying the UVs so the UV.x units take up as much 3D space as UV.y units:
With complex models, you'd do this kind of "equalizing" in a 3D editor, but since the geometry is simple enough, we can do it via code. See the example below. I'm using a width/height ratio variable in my UV.y remapping so the UV transformations match up, regardless of how much wider the face is.
//////// Boilerplate Three setup
const renderer = new THREE.WebGLRenderer({ canvas: document.querySelector("canvas") });
const camera = new THREE.PerspectiveCamera(50, 1, 1, 100);
camera.position.z = 3;
const scene = new THREE.Scene();
/////////////////// CREATE GEOM & MATERIAL
const width = 2;
const height = 1;
const ratio = width / height; // <- magic number that will help with UV remapping
const geometry = new THREE.BoxBufferGeometry(width, height, width);
let uvY;
const uvArray = geometry.getAttribute("uv").array;
// Re-map UVs to avoid distortion
for (let i2 = 0; i2 < uvArray.length; i2 += 2) {
    uvY = uvArray[i2 + 1]; // Extract Y value,
    uvY -= 0.5;            // center around 0
    uvY /= ratio;          // divide by w/h ratio
    uvY += 0.5;            // remove center around 0
    uvArray[i2 + 1] = uvY;
}
geometry.getAttribute("uv").needsUpdate = true;
const uvMap = new THREE.TextureLoader().load("https://raw.githubusercontent.com/mrdoob/three.js/dev/examples/textures/uv_grid_opengl.jpg");
// Now we can apply texture transformations as expected
uvMap.center.set(0.5, 0.5);
uvMap.repeat.set(0.25, 0.5);
uvMap.anisotropy = 16;
const material = new THREE.MeshBasicMaterial({ map: uvMap });
const mesh = new THREE.Mesh(geometry, material);
scene.add(mesh);
window.addEventListener("mousemove", onMouseMove);
window.addEventListener("resize", resize);
// Add rotation on mousemove
function onMouseMove(ev) {
    uvMap.rotation = (ev.clientX / window.innerWidth) * Math.PI * 2;
}
function resize() {
    const width = window.innerWidth;
    const height = window.innerHeight;
    renderer.setSize(width, height);
    camera.aspect = width / height;
    camera.updateProjectionMatrix();
}
function animate(time) {
    mesh.rotation.y = Math.cos(time / 3000) * 2;
    renderer.render(scene, camera);
    requestAnimationFrame(animate);
}
resize();
requestAnimationFrame(animate);
body { margin: 0; }
canvas { width: 100vw; height: 100vh; display: block; }
<script src="https://threejs.org/build/three.js"></script>
<canvas></canvas>
First of all, I agree with the solution @Marquizzo provided to your problem, and setting the UVs explicitly on the geometry should be the easiest way to solve it.
But @Marquizzo did not explain why changing the matrix of the texture (setting repeatX and repeatY) does not work.
We all know the 2D rotation matrix R:
R = | cos θ   -sin θ |
    | sin θ    cos θ |
UVs are calculated in the shader with a transform matrix T, which is the texture matrix from your question.
T * UV = new UV
To simplify the question, we only consider rotation, and assume we have an additional matrix X for calculating the new UV. Then we have
X * R * UV = new UV
The question now is whether we can find a solution for X such that, for any rotation, the new UV of any point in your question is calculated correctly. If there is a solution for X, then we can simply use
var X = new THREE.Matrix3();
// X.set(x, y, z, ...)
texture.matrix.premultiply(X);
Otherwise, we can't find the approach you expected.
Let's create several equations to figure out X.
In the pic below, ABCD is one face of your geometry, and the transparent green is the texture. The UV of point A is (0,1), point B is (0,0), and (1,0), (1,1) for C and D respectively.
The first equation comes from the consideration that, without any rotation, the original UV should never change (the UV for A is always (0,1)). So we should have
X * I * (0, 1) = (0, 1) // I is the identity matrix
From here we can see that X would have to be the identity matrix.
Then let's see whether the identity matrix X can satisfy the second equation. What is the second equation? Simplify again: let B be the rotation centre (the origin) and rotate the texture 90 degrees (counterclockwise). We use -90° to calculate the UV even though we rotate by 90 degrees.
The new UV for point A after rotating the texture 90 degrees should be the current value of E, which is (a/b, 0). Then we have
X * R(-90°) * (0, 1) = (a/b, 0), i.e. X * (1, 0) = (a/b, 0)
From this equation we can see that X cannot be the identity matrix, which means WE ARE NOT ABLE TO FIND A SOLUTION FOR X THAT SOLVES YOUR PROBLEM WITH
X * R * UV = new UV
Certainly, you could change the shader that calculates the new UVs, but that's even harder than the approach @Marquizzo provided.
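As a quick numeric sanity check (not part of the original answer; a/b = 0.5 is chosen arbitrarily, i.e. a face twice as wide as it is tall), the left-hand side can be evaluated with three.js itself:
// Rotation matrix R(-90°) in UV space
var R = new THREE.Matrix3().set(
    Math.cos(-Math.PI / 2), -Math.sin(-Math.PI / 2), 0,
    Math.sin(-Math.PI / 2),  Math.cos(-Math.PI / 2), 0,
    0, 0, 1
);
var uvA = new THREE.Vector2(0, 1).applyMatrix3(R);
console.log(uvA); // (1, 0), but the new UV needed for point A is (a/b, 0) = (0.5, 0), so X = I cannot satisfy both equations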

How to set up a camera that follows a circular path?

I'm trying to create a camera that follows an object that rotates on an orbit around a sphere. But every time the camera reaches the poles of the orbit, the direction changes. I just set the position of the camera according to the object it has to follow and call lookAt afterwards:
function render() {
    rotation += 0.002;
    // get the next point on the path
    pt = path.getPoint( t );
    // set the marker position
    marker.position.set( pt.x, pt.y, pt.z );
    marker.lookAt( new THREE.Vector3( 0, 0, 0 ) );
    // rotate the mesh that illustrates the orbit
    mesh.rotation.y = rotation;
    // set the camera position
    var cameraPt = cameraPath.getPoint( t );
    camera.position.set( cameraPt.x, cameraPt.y, cameraPt.z );
    camera.lookAt( marker.position );
    t = ( t >= 1 ) ? 0 : t += 0.002;
    renderer.render( scene, camera );
}
Here's a complete fiddle: http://jsfiddle.net/krw8nwLn/69/
I've created another fiddle with a second cube which represents the desired camera behaviour: http://jsfiddle.net/krw8nwLn/70/
What happens is that the camera's lookAt function will always try to align the camera with the horizontal plane (so that the "up" direction is always (0, 1, 0)). And when you reach the top and bottom of the elliptical path, the camera will instantaneously rotate 180° so that up is still up. You can also see this in your "desired behaviour" example, as the camera cube rotates so that the colors on the other side are shown.
A solution is not to use lookAt in this case, because it does not support cameras doing flips like this. Instead, set the camera's rotation directly. (This requires some math, but you look like a math guy.)
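A rough sketch of that idea (not from the original answer): derive the camera's up vector from the path tangent so that lookAt follows the orbit smoothly instead of snapping at the poles. This assumes cameraPath is the same curve used for the camera position in the render loop above:
// Direction of travel along the camera path
var tangent = cameraPath.getTangent( t ).normalize();
// Direction the camera should look in
var toMarker = marker.position.clone().sub( camera.position ).normalize();
// An "up" vector perpendicular to the view direction that rotates along with the orbit
camera.up.copy( tangent.cross( toMarker ) ).normalize();
camera.lookAt( marker.position );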

Rotate camera X on local axis using Three.js

I'm new to Three.js and fairly new to 3D engines, and what I'm trying to achieve is a 360° equirectangular image viewer.
What my script does so far is create a camera at (0,0,0) and a sphere mesh at the same location, with inverted normals and an emission map of my 360° image.
Representation of the scene using Blender's viewport.
The user should be able to rotate the camera using mouse drag or the keyboard arrows, so using mouse listeners I created the drag feature, which calculates the amount of rotation around the camera's Y axis (blue) and X axis (red) at each rendered frame. I also added min and max rotation limits on X (so the user can't flip over backwards), as follows:
var render = function () {
    requestAnimationFrame( render );
    if ((camera.rotation.x < Math.PI/6 && speedX >= 0) || (camera.rotation.x > -Math.PI/6 && speedX <= 0))
        camera.rotation.x += speedX * (Math.PI/180);
    camera.rotation.y += speedY * (Math.PI/180);
    renderer.render(scene, camera);
};
Where speedX and speedY represent the amount of rotation in each axis.
So far so good, but since those rotation coordinates are relative to the world and not to the camera itself, the X rotation makes the camera go wild: after a couple of degrees of rotation around the Y axis, the camera's X axis is no longer the same as the world's X axis.
My question, finally, is: how do I rotate the camera on its own X axis at each frame?
If you want a camera's rotation to have meaning in terms of yaw (heading), pitch, and roll, you need to set:
camera.rotation.order = 'YXZ'; // default is 'XYZ'
For more information, see this stackoverflow answer.
three.js r.82
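For completeness, a small sketch (not from the answer above) of how that setting fits into the render loop from the question, reusing the same speedX/speedY variables and pitch limits:
camera.rotation.order = 'YXZ'; // set once, outside the render loop
var render = function () {
    requestAnimationFrame( render );
    camera.rotation.y += speedY * (Math.PI/180); // yaw around the world Y axis
    // pitch around the camera's (now local) X axis, clamped to +/- 30 degrees
    if ((camera.rotation.x < Math.PI/6 && speedX >= 0) || (camera.rotation.x > -Math.PI/6 && speedX <= 0))
        camera.rotation.x += speedX * (Math.PI/180);
    renderer.render(scene, camera);
};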
