How to *set* (not increment) OrbitControls distance from target programmatically? - three.js

I'm trying to adjust the ThreeJS OrbitControls so that I can set the distance from the target programmatically.
https://github.com/mrdoob/three.js/blob/master/examples/jsm/controls/OrbitControls.js
The goal is to be able to call it like this:
const controls = new THREE.OrbitControls(camera);
controls.setDolly(1); // Near
controls.setDolly(10); // Far
A setDolly method doesn't exist, so I've adjusted the OrbitControls.js script, and have added:
this.setDolly = function(newDolly) {
  spherical.radius = newDolly;
};
However, due to something in the way OrbitControls is written that I don't understand, the camera does not budge.
Does anyone know what the problem is?

You can try this custom function I wrote:
function setDolly(value) {
  const positive = Math.abs(value); // absolute value to prevent a negative distance
  camera.position.set(1, 1, 1);     // reset to a unit offset so the scaling lands at the desired position
  camera.position.multiplyScalar(positive); // move the camera out along (1, 1, 1)
}
It's easier to add this to your own code than to OrbitControls.js, where you would need to make some substantial changes.
That should do it.
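For what it's worth, the underlying math doesn't need three.js at all. Here is a framework-free sketch (setDistanceFromTarget and the plain {x, y, z} objects are made up for illustration; with the real OrbitControls you would do the same with Vector3 and then call controls.update()): keep the camera's current direction from the target and rescale that offset to the desired length.

```javascript
// Move `camera` so it sits exactly `distance` away from `target`,
// along its current direction from the target.
function setDistanceFromTarget(camera, target, distance) {
  const dx = camera.x - target.x;
  const dy = camera.y - target.y;
  const dz = camera.z - target.z;
  const len = Math.sqrt(dx * dx + dy * dy + dz * dz) || 1; // avoid divide-by-zero
  const s = distance / len;
  camera.x = target.x + dx * s;
  camera.y = target.y + dy * s;
  camera.z = target.z + dz * s;
}

// e.g. a camera at (3, 4, 0) moved to distance 10 from the origin
// ends up at (6, 8, 0): same direction, new length.
```

Unlike the unit-offset reset above, this keeps the camera's orbit angle and changes only its distance.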

Related

Three.js - repositioning vertices in a 'particle' mesh

I have a basic three.js game working and I'd like to add particles. I've been searching online, including multiple questions here, and the closest I've come to getting a 'particle system' working is using a THREE.BufferGeometry, a THREE.BufferAttribute and a THREE.Points mesh. I set it up like this:
const particleMaterial = new THREE.PointsMaterial( { size: 10, map: particleTexture, blending: THREE.AdditiveBlending, transparent: true } );
const particlesGeometry = new THREE.BufferGeometry();
const particlesCount = 300;
const posArray = new Float32Array(particlesCount * 3);
for (let i = 0; i < particlesCount * 3; i++) {
  posArray[i] = Math.random() * 10;
}
const particleBufferAttribute = new THREE.BufferAttribute(posArray, 3);
particlesGeometry.setAttribute( 'position', particleBufferAttribute );
const particlesMesh = new THREE.Points(particlesGeometry, particleMaterial);
particlesMesh.counter = 0;
scene.add(particlesMesh);
This part works and displays the particles fine, at their initial positions, but of course I'd like to move them.
I have tried all manner of things in my 'animate' function, but I haven't hit upon the right combination. I'd like to move the particles, ideally one vertex per frame.
The current thing I'm doing in the animate function - which does not work! - is this:
particleBufferAttribute.setXYZ( particlesMesh.counter, objects[0].position.x, objects[0].position.y, objects[0].position.z );
particlesGeometry.setAttribute( 'position', particleBufferAttribute );
//posArray[particlesMesh.counter] = objects[0].position;
particlesMesh.counter ++;
if (particlesMesh.counter >= particlesCount) {
  particlesMesh.counter = 0;
}
If anyone has any pointers about how to move Points mesh vertices, that would be great.
Alternatively, if this is not at all the right approach, please let me know.
I did find Stemkoski's ShaderParticleEngine, but I could not find any information about how to make it work (the docs are very minimal and do not seem to include examples).
You don't need to re-set the attribute, but you do need to tell the renderer that the attribute has changed.
particleBufferAttribute.setXYZ( particlesMesh.counter, objects[0].position.x, objects[0].position.y, objects[0].position.z );
particleBufferAttribute.needsUpdate = true; // This is the kicker!
By setting needsUpdate to true, the renderer knows to re-upload that attribute to the GPU.
This might not be a concern for you, but be aware that moving particles this way is expensive: you re-upload the position attribute every single frame, including the position data for every particle you aren't moving.
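The counter bookkeeping itself can be sketched without three.js. In this plain-JavaScript sketch (makeRing and writeParticle are made-up names), a flat xyz array acts as a ring buffer, overwriting one vertex slot per frame and wrapping at the end, just like the counter in the question:

```javascript
// A flat xyz array, one vertex slot overwritten per call, wrapping at the end.
function makeRing(count) {
  return { data: new Float32Array(count * 3), count, cursor: 0 };
}

function writeParticle(ring, x, y, z) {
  const i = ring.cursor * 3; // three floats per vertex
  ring.data[i] = x;
  ring.data[i + 1] = y;
  ring.data[i + 2] = z;
  ring.cursor = (ring.cursor + 1) % ring.count; // wrap instead of running past the end
  // In three.js you would now set attribute.needsUpdate = true
  // so the renderer re-uploads the buffer to the GPU.
}
```

Note that BufferAttribute.setXYZ takes a vertex index (not a float offset), so the valid range is 0 to particlesCount - 1; the modulo above enforces that.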

threejs - raycasting in AR with controller after repositioning

I'm rather new to threejs, so what I'm doing might not be the most efficient way.
I have an object in AR on a mobile device and I want to know if I intersect with it when touching on the screen.
I use the following code to generate the raycast, and it works initially.
const tempMatrix = new THREE.Matrix4();
tempMatrix.identity().extractRotation(this.controller.matrixWorld);
this.raycaster.ray.origin.setFromMatrixPosition(this.controller.matrixWorld);
this.raycaster.ray.direction.set(0, 0, -1).applyMatrix4(tempMatrix);
However, I have the ability to reposition the object (i.e. reset the position so the object is in front, relative to the current camera direction and position) by moving and rotating the whole scene.
After the repositioning, the raycasting is completely offset and is not casting rays anywhere near where I touch the screen.
Repositioning is done like this (while it works, if there's a better way, let me know!):
public handleReposition(): void {
  const xRotation = Math.abs(this.camera.rotation.x) > Math.PI / 2 ? -Math.PI : 0;
  const yRotation = this.camera.rotation.y;
  this.scene.rotation.set(xRotation, yRotation, xRotation);
  this.scene.position.set(this.camera.position.x, this.camera.position.y, this.camera.position.z);
}
How can I get the raycast to hit the correct new location?
Thanks!
Assuming this.scene is actually the main three.js Scene, it's usually a bad idea to change its rotation or position, since that affects everything inside the scene, including the controller. I'd suggest moving your object instead, or adding your object(s) to a THREE.Group and moving that.
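A quick plain-JavaScript illustration of why rotating the root breaks the ray (2D for brevity; rot2d is a made-up helper): if the controller is a descendant of the rotated scene, its world-space direction rotates along with everything else, so a ray built from its matrixWorld no longer points where the user touched.

```javascript
// Rotate a 2D point about the origin (stand-in for a scene-level rotation).
function rot2d(p, angle) {
  const c = Math.cos(angle), s = Math.sin(angle);
  return { x: p.x * c - p.y * s, y: p.x * s + p.y * c };
}

// Controller forward direction (2D stand-in):
const forward = { x: 0, y: 1 };

// After rotating the whole scene (controller included) by 90 degrees,
// the controller's world direction has rotated too:
const rotated = rot2d(forward, Math.PI / 2);
// rotated is now (-1, 0): the ray no longer points where the touch is.
```

Putting only the movable objects in a Group leaves the controller's world transform untouched, so the ray stays correct.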

Three.js: Object3D added in scene, attached to another object3D doesn't update position on translation

The logic of my code is: the Object3Ds included in the scene get a Line (with BufferGeometry) attached to them on double click.
I am getting the object3D by using Raycaster intersect.
The way I am adding it is:
scene.add( newLine );
newLine.updateMatrixWorld();
THREE.SceneUtils.attach( newLine, scene, intersects[0].object );
The following is my mousemove code which helps me moving the object3D in XZ plane.
function onDocumentMouseMove( event ) {
  var mouseX = ( event.clientX / window.innerWidth ) * 2 - 1;
  var mouseY = - ( event.clientY / window.innerHeight ) * 2 + 1;
  var mouse = new THREE.Vector2( mouseX, mouseY );
  raycaster.setFromCamera( mouse, camera );
  if (selection) {
    var intersects = raycaster.intersectObject( plane );
    selection.position.copy( intersects[0].point.sub( offset ) );
  }
}
Nothing complicated. Simple code. And the movement is happening well. I can easily move the object3D around.
When I check the console for the Object3D's position while grabbing and moving it, it changes, which is what should happen. But I see no change at all in the position of the Line (newLine in my code), even when I also call .updateWorldMatrix(), which, per the THREE docs, should be called automatically in each render cycle anyway; still, I am calling it. Why am I not able to get the position of my newLine when its position clearly moves along with the Object3D as I drag it around?
Why does this matter? Unless I can observe the line's position changing, I can't update an HTML element that I attach to the end of that line; hence the position change is imperative. The attached GIF shows that when the cube/sphere/cone is moved, render(..) logs its changing position; logging the same for the Line shows no change. If anyone can help me with this, it would be amazing. Thanks much.
EDIT
When I attach the HTMLElement directly to the parent Object3D, it shows the expected result: it moves when I move the Object3D. This is because, as said, its position is updated continuously in the render cycle as I move it.
Gif:
The line's .position attribute is not changing because its local position remains the same. Since the line is attached to the parent, its relative position to the parent doesn't change, only the global position does. To get the global position of line, you can use the .getWorldPosition() method:
// Declare var to store world position
var worldPos = new THREE.Vector3();
// Get world position of line
line.getWorldPosition(worldPos);
// Now global position is stored in Vec3
console.log(worldPos);
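The local-vs-world distinction can be sketched without three.js (translation only; worldPosition is a made-up helper standing in for .getWorldPosition()): the attached line's local position stays constant, while the composed world position follows the parent.

```javascript
// Walk up the parent chain, summing translations (rotation/scale omitted
// for brevity; three.js composes full matrices instead).
function worldPosition(node) {
  let x = 0, y = 0, z = 0;
  for (let n = node; n; n = n.parent) {
    x += n.position.x; y += n.position.y; z += n.position.z;
  }
  return { x, y, z };
}

const parent = { position: { x: 0, y: 0, z: 0 }, parent: null };
const line = { position: { x: 1, y: 0, z: 0 }, parent };

// Drag the parent around:
parent.position.x = 5;

// line.position is unchanged (it is local), but the world position moved:
// line.position.x === 1, worldPosition(line).x === 6
```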

three.js and TrackballControls - keeping horizon flat

I'm attempting to modify TrackballControls.js so that its rotation is like that of OrbitControls.js, where the horizon stays flat, but maintain the ability to rotate over and around a scene (specifically, a collada building model). I've been trying to figure this out for the better part of a day now, but I'm a designer, not a programmer. :-) I'm not even sure if I should be focusing on this.rotateCamera and/or this.update.
(BTW, I would just use OrbitControls.js, but it doesn't support panning, which is necessary when looking at large collada building models.)
Any help would be much appreciated.
It's been a while since this question was asked, but I ran into the same problem and didn't find much discussion online, so I thought I'd post my solution.
If you must use TrackballControls and you want a flat horizon, you can simply edit the TrackballControls.js library by adding the following line to the end of the 'this.rotateCamera' method:
this.object.up = new THREE.Vector3(0,1,0);
This locks the camera's up direction to (0,1,0) (i.e. the y direction). The entire modified method would then read:
this.rotateCamera = function () {
  var angle = Math.acos( _rotateStart.dot( _rotateEnd ) / _rotateStart.length() / _rotateEnd.length() );
  if ( angle ) {
    var axis = ( new THREE.Vector3() ).crossVectors( _rotateStart, _rotateEnd ).normalize();
    var quaternion = new THREE.Quaternion();
    angle *= _this.rotateSpeed;
    quaternion.setFromAxisAngle( axis, -angle );
    _eye.applyQuaternion( quaternion );
    _this.object.up.applyQuaternion( quaternion );
    _rotateEnd.applyQuaternion( quaternion );
    if ( _this.staticMoving ) {
      _rotateStart.copy( _rotateEnd );
    } else {
      quaternion.setFromAxisAngle( axis, angle * ( _this.dynamicDampingFactor - 1.0 ) );
      _rotateStart.applyQuaternion( quaternion );
    }
  }
  // Lock the camera's up direction
  this.object.up = new THREE.Vector3( 0, 1, 0 );
};
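To see why the overwrite is needed, here is a framework-free sketch of the quaternion math (applyQuaternion and fromAxisAngle are hand-rolled stand-ins for the Vector3/Quaternion methods): incremental rotations with a roll component tilt the up vector, and resetting it each frame snaps the horizon back to world Y.

```javascript
// Rotate vector v by unit quaternion q = {x, y, z, w}:
// v' = v + w*t + cross(q.xyz, t), where t = 2 * cross(q.xyz, v)
function applyQuaternion(v, q) {
  const tx = 2 * (q.y * v.z - q.z * v.y);
  const ty = 2 * (q.z * v.x - q.x * v.z);
  const tz = 2 * (q.x * v.y - q.y * v.x);
  return {
    x: v.x + q.w * tx + (q.y * tz - q.z * ty),
    y: v.y + q.w * ty + (q.z * tx - q.x * tz),
    z: v.z + q.w * tz + (q.x * ty - q.y * tx),
  };
}

// Quaternion for a rotation of `angle` radians about unit `axis`.
function fromAxisAngle(axis, angle) {
  const s = Math.sin(angle / 2);
  return { x: axis.x * s, y: axis.y * s, z: axis.z * s, w: Math.cos(angle / 2) };
}

// A roll about the view axis tilts the camera's up vector...
let up = { x: 0, y: 1, z: 0 };
up = applyQuaternion(up, fromAxisAngle({ x: 0, y: 0, z: 1 }, Math.PI / 2));
// up is now (-1, 0, 0): the horizon has tilted.

// ...and the fix in the answer simply overwrites it every frame:
up = { x: 0, y: 1, z: 0 };
```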

Three.js - move custom geometry to origin

I have some custom geometries obtained from a STEP file conversion, and I use the mouse to rotate them. They rotate around the origin of the scene, but since they are far from it, they seem to rotate on a virtual sphere. How can I move them to the origin so that they don't seem to be "floating" around (i.e. I'd like to reduce the radius of that virtual sphere to zero)? This is the example I'd like to move. I've tried setting their position to (0, 0, 0) by doing:
object.position.x = 0;
object.position.y = 0;
object.position.z = 0;
but it didn't work.
The typical solution to this problem is to translate the geometry right after it is created. You do that by applying a translation matrix to the geometry like so:
geometry.applyMatrix( new THREE.Matrix4().makeTranslation( distX, distY, distZ ) );
EDIT: You can simply do this, instead:
geometry.translate( distX, distY, distZ ); // three.js r.72
The function geometry.computeBoundingBox() may be of help to you in determining an amount to translate.
However, I see that in your case you have multiple geometries, so it is a bit more complicated, but doable. You will need to translate each geometry by the same amount.
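The shared offset can be sketched without three.js (computeCenter is a made-up helper; geometry.computeBoundingBox() plus boundingBox.getCenter() are the three.js equivalent): find the center of the combined axis-aligned bounding box, then translate every geometry by its negation.

```javascript
// Center of the axis-aligned bounding box of a flat xyz position array.
function computeCenter(positions) {
  const min = [Infinity, Infinity, Infinity];
  const max = [-Infinity, -Infinity, -Infinity];
  for (let i = 0; i < positions.length; i += 3) {
    for (let k = 0; k < 3; k++) {
      min[k] = Math.min(min[k], positions[i + k]);
      max[k] = Math.max(max[k], positions[i + k]);
    }
  }
  return [(min[0] + max[0]) / 2, (min[1] + max[1]) / 2, (min[2] + max[2]) / 2];
}

// With several geometries, expand one box over all of their positions first,
// then apply geometry.translate(-center[0], -center[1], -center[2]) to each.
```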
EDIT
Tip: Instead of adding each object to the scene, create a parent object, add it to the scene, and then add the objects to the parent.
var parent = new THREE.Object3D();
scene.add( parent );
parent.add( object1 );
parent.add( object2 );
// and so on...
Then in your render function, just rotate the parent, not the individual objects.
