ThreeJS FPS drops - three.js

I've got a problem with my character's movement and the FPS of my scene. The more I move the character, the lower the FPS gets. There is a first period at the beginning where I can move without any problem, but after several seconds of movement the FPS drops to a really low level. I don't understand whether that comes from my movement function or the animation one. I followed some tutorials for the movement and none of them have FPS drops. Here are my movement and animate functions.
var xSpeed = 0.0001;
var ySpeed = 0.0001;

document.addEventListener("keydown", onDocumentKeyDown, false);
function onDocumentKeyDown(event) {
    var keyCode = event.which;
    if (keyCode == 90) {
        avatar.translateZ( -1 );
    } if (keyCode == 83) {
        avatar.translateZ( 1 );
    } if (keyCode == 81) {
        avatar.rotation.y -= 0.1;
    } if (keyCode == 68) {
        avatar.rotation.y += 0.1;
    }
    render();
}

var render = function() {
    requestAnimationFrame(render);
    renderer.render(scene, camera);
};

function animate() {
    requestAnimationFrame( animate );
    render();
    stats.update();
}

Pay attention to what your code does on each keydown event: it calls render(), which itself calls requestAnimationFrame(render) and so starts another self-perpetuating render loop. By the second keydown you have two render loops running every frame, and each further keystroke adds another one, until your computer can't keep up.
You only need one requestAnimationFrame loop; there is no need to start a new one on each keystroke.
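A minimal sketch of the fix, assuming avatar, scene, camera, renderer and stats already exist as in the question: the keydown handler only moves the avatar, and a single loop (started once) does all the rendering.

document.addEventListener("keydown", onDocumentKeyDown, false);
function onDocumentKeyDown(event) {
    var keyCode = event.which;
    if (keyCode == 90) {         // Z - forward
        avatar.translateZ(-1);
    } else if (keyCode == 83) {  // S - backward
        avatar.translateZ(1);
    } else if (keyCode == 81) {  // Q - turn left
        avatar.rotation.y -= 0.1;
    } else if (keyCode == 68) {  // D - turn right
        avatar.rotation.y += 0.1;
    }
    // no render() call here; the loop below redraws every frame anyway
}

function animate() {
    requestAnimationFrame(animate);  // schedule the next frame
    renderer.render(scene, camera);
    stats.update();
}
animate();  // start the one and only loop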

Related

How to reset the OrbitControls and when to use the update method?

I want to reset my camera, which has damping enabled. I've tried different approaches but I don't know if they are correct.
The goal is to stop the damping, set the camera back to its initial position, and then activate the damping again. I want to prevent the model/camera from still rotating a little after I reset the controls with a button. I would do it like this:
controls.enableDamping = false;
controls.update();
camera.position.set( 10, 13, 10 );
camera.lookAt( 0, 0, 0 );
controls.enableDamping = true;
controls.update();
My render function is called by an EventListener:
controls.addEventListener( "change", requestRenderer );
And the render function:
const renderer = new THREE.WebGLRenderer( { canvas: canvas, antialias: true, alpha: true } );
let renderRequested = false;

function render( time ) {
    time *= 0.001;
    renderRequested = false;
    resizeRenderer( renderer, camera );
    controls.update();
    renderer.render( scene, camera );
}

function requestRenderer() {
    if ( !renderRequested ) {
        renderRequested = true;
        requestAnimationFrame( render );
    }
}
This works pretty well. The question is whether this is the correct way, and when do I have to update the controls? I think the first update is necessary to tell the controls that damping isn't active anymore (but what does update() actually do?), and I think I don't need the second update.
From your code it seems you indeed don't need the second update: you aren't changing any key property of the controls at that point, and your render() loop already calls
controls.update();
which covers every other case.
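A sketch of how the reset handler could look under that reasoning, assuming controls.target stays at the origin. The final requestRenderer() call is my own addition, since setting the camera position directly does not fire the controls' "change" event:

function resetView() {
    controls.enableDamping = false;  // stop damping so the reset isn't smoothed away
    controls.update();               // flush the controls' internal (damped) state once
    camera.position.set( 10, 13, 10 );
    camera.lookAt( 0, 0, 0 );
    controls.enableDamping = true;   // back to normal interaction
    requestRenderer();               // the render path already calls controls.update()
}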

threejs mixer not updating with `setTime`

I have a threejs animation mixer set up as follows:
this.mixer = new THREE.AnimationMixer(this.object);
this.mixer.timeScale = 2; //play twice as fast
this.mixer.addEventListener('finished', this.handlePullAnimFinished);
this.pullAnim = this.mixer.clipAction(this.animations[0]);
this.pullAnim.clampWhenFinished = true;
this.pullAnim.setLoop(THREE.LoopOnce);
this.pullAnim.enable = true;
but if I try to do something like this.mixer.setTime(0.5), followed optionally by this.mixer.update(), nothing happens.
How do I programmatically set the mixer to a specific point in an animation (and not have it autoplay)?
I've seen the documentation here on setting up animations to autoplay (and successfully gotten that to work)
At first glance it looks like the mistake is that .update() is called without any arguments.
According to the docs, the mixer's update method expects to receive the elapsed time in seconds (the delta) on each frame.
According to this demo you can do something like this:
var clock = new THREE.Clock();
function animate() {
    var dt = clock.getDelta();
    mixer.update( dt );
    renderer.render( scene, camera );
    requestAnimationFrame( animate );
}
You could try this:

// r3f version
useEffect(() => {
    action.play()
}, [])

useFrame(() => {
    mixer.setTime(.3)
})

// vanilla version
actions['EmptyAction.001'].play()
let time = .3
function animate() {
    mixer.setTime(time)
    requestAnimationFrame(animate)
}
animate()
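If the goal is to jump to a fixed time without the clip continuing to play, another option (my own sketch, not taken from the answers above) is to freeze the mixer after seeking. Note that, as far as I can tell, setTime() goes through update() internally, so it is scaled by mixer.timeScale: with timeScale = 2, setTime(0.5) lands at 1.0 s of clip time.

this.pullAnim.play();        // the action must be active for the mixer to drive it
this.mixer.setTime(0.5);     // seek; this applies the pose immediately
this.mixer.timeScale = 0;    // freeze, so later mixer.update(dt) calls don't advance it
// render once (or let your normal loop do it) to see the result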

touch controls: repeat action until touchend

I am trying to add touch controls to a three.js scene. I want to move the camera in whatever direction the user touches. It works great with the keyboard, because you can press and hold a key and the camera moves continuously. But when I try the same thing with touchstart, you have to keep tapping the screen over and over to move; you can't just hold your finger down like you can with a keyboard or mouse.
I looked at touchmove, but if you just tap and hold without moving, there are no new touches.
Is there something in touch events similar to holding down a keyboard key or mouse button?
There is no built-in callback for a touch event that fires repeatedly the way the keyboard does. You can, however, simply track the start and end of the touch and apply the movement on every frame while the touch is active.
First, subscribe to the relevant events and set a bool to track the state:
var isTouching = false;
window.addEventListener("touchstart", () => isTouching = true);
window.addEventListener("touchend", () => isTouching = false);
In three.js you will most likely already have a render loop (e.g. a function called "animate"). Check the state variable on every iteration and apply the movement each time. You may also need to factor in deltaTime (the duration of the last frame) to make the movement framerate-independent.
function animate() {
    requestAnimationFrame(animate);
    mesh.rotation.x += 0.005;
    mesh.rotation.y += 0.01;
    if (isTouching) {
        console.log("move camera");
    }
    renderer.render(scene, camera);
}
Here is a snippet which shows the basic approach. Click and hold in the left or right half of the output window to move the camera.
var camera, scene, renderer, mesh, material, clock;

init();
animate();

var isTouching = false;
var mousePositionX;

window.addEventListener("mousedown", (e) => {
    isTouching = true;
    mousePositionX = e.clientX;
});
window.addEventListener("mouseup", (e) => isTouching = false);

function init() {
    renderer = new THREE.WebGLRenderer();
    renderer.setSize(window.innerWidth, window.innerHeight);
    document.body.appendChild(renderer.domElement);
    clock = new THREE.Clock();
    camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, 1, 1000);
    camera.position.z = 400;
    scene = new THREE.Scene();
    material = new THREE.MeshPhongMaterial();
    var geometry = new THREE.BoxGeometry(200, 200, 200);
    mesh = new THREE.Mesh(geometry, material);
    scene.add(mesh);
    var light = new THREE.AmbientLight(0x404040);
    scene.add(light);
    var directionalLight = new THREE.DirectionalLight(0xffffff);
    directionalLight.position.set(1, 1, 1).normalize();
    scene.add(directionalLight);
    window.addEventListener('resize', onWindowResize, false);
}

function animate() {
    requestAnimationFrame(animate);
    mesh.rotation.x += 0.005;
    mesh.rotation.y += 0.01;
    let deltaTime = clock.getDelta();
    if (isTouching) {
        let speed = 200; // px per second
        let movement = speed * deltaTime;
        if (mousePositionX > window.innerWidth / 2) {
            camera.translateX(-movement);
        } else {
            camera.translateX(movement);
        }
    }
    renderer.render(scene, camera);
}

function onWindowResize() {
    camera.aspect = window.innerWidth / window.innerHeight;
    camera.updateProjectionMatrix();
    renderer.setSize(window.innerWidth, window.innerHeight);
}
The accompanying CSS and HTML for the snippet:

body {
    padding: 0;
    margin: 0;
}
canvas {
    display: block;
}

<script src="https://cdnjs.cloudflare.com/ajax/libs/three.js/93/three.min.js"></script>

Rotate an object like OrbitControls but only the object itself

We love three.js! And here is a page we built using it a few years ago.
https://www.jgprolock.com
We are in the process of revising the animations on this site.
Once the page loads, the user has the ability to drag and rotate the object. But it is really a trick. We are using orbit controls to rotate the camera around our scene, and thus around our main object, which is centered in the scene (x, y and z positions all equal to 0). If we don't place the object at the center, its rotation starts to look uneven, because the camera is now orbiting a center the object doesn't share.
In order to make it look like the object is on the left side, we ended up moving the canvas to the left and then bringing it back to the right or left as the animation continues after scrolling.
So, my question is: does anyone have an example of how to achieve this functionality by rotating the actual object itself, instead of rotating the camera around the entire scene with the OrbitControls plugin?
Or is there a way to modify OrbitControls to rotate around an object and not the entire scene?
I had been searching for this for a while, but right after asking this question I came across this link, which actually has an example of what we are trying to do.
https://jsfiddle.net/n6u6asza/1205/
The key to making this work, as copied from the link (although I am not 100% sure what it all means):
/* */
var isDragging = false;
var previousMousePosition = {
    x: 0,
    y: 0
};

$(renderer.domElement).on('mousedown', function(e) {
    isDragging = true;
})
.on('mousemove', function(e) {
    //console.log(e);
    var deltaMove = {
        x: e.offsetX - previousMousePosition.x,
        y: e.offsetY - previousMousePosition.y
    };

    if (isDragging) {
        var deltaRotationQuaternion = new three.Quaternion()
            .setFromEuler(new three.Euler(
                toRadians(deltaMove.y * 1),
                toRadians(deltaMove.x * 1),
                0,
                'XYZ'
            ));

        cube.quaternion.multiplyQuaternions(deltaRotationQuaternion, cube.quaternion);
    }

    previousMousePosition = {
        x: e.offsetX,
        y: e.offsetY
    };
});
/* */
If you want an article on how to achieve this without unnecessary jQuery dependencies, you can have a look here.
This uses event listeners to catch mousemove events while a mousedown is in progress, and then passes the coordinates to a custom function.
var mouseDown = false,
    mouseX = 0,
    mouseY = 0;

var canvas = renderer.domElement;

canvas.addEventListener('mousemove', function (evt) {
    if (!mouseDown) { return; }
    //console.log('drag')
    evt.preventDefault();
    var deltaX = evt.clientX - mouseX,
        deltaY = evt.clientY - mouseY;
    mouseX = evt.clientX;
    mouseY = evt.clientY;
    // DO SOMETHING HERE WITH X and Y, e.g. rotate the object
    // (scale the pixel deltas down so you don't rotate a full radian per pixel)
    object.rotation.y += deltaX * 0.01;
    object.rotation.x += deltaY * 0.01;
}, false);

canvas.addEventListener('mousedown', function (evt) {
    evt.preventDefault();
    mouseDown = true;
    mouseX = evt.clientX;
    mouseY = evt.clientY;
}, false);

canvas.addEventListener('mouseup', function (evt) {
    evt.preventDefault();
    mouseDown = false;
}, false);

But note that this will not work if you have OrbitControls or DragControls imported!

Three.js multiple Canvases and Shader

1.
I am trying to set up multiple canvases on a page, like the examples given on threejs.org.
My basic code is like this:
var scene, camera, controls, renderer, pointLight, geometry, material;
var container, position, dimensions, apps = [];
var windowWidth = window.innerWidth;
var windowHeight = window.innerHeight;

function preView( id ){
    apps.push( new App( id ) );
    //animate(); // if I call this here, all canvases render once

    function App( id ) {
        container = $('#preView_' + id);
        dimensions = { width: container.width(), height: container.height()};

        camera = new THREE.PerspectiveCamera(45, dimensions.width/dimensions.height, 1, 5 * radius);
        camera.position.x = 0;
        camera.position.y = 0;
        camera.position.z = 100;

        scene = new THREE.Scene();

        /* add meshes */
        /* ======================= */

        renderer = new THREE.WebGLRenderer();
        renderer.setSize(dimensions.width, dimensions.height);
        container.append(renderer.domElement);

        this.animate = function() {
            if( camera.position.z > -(1/3) * 100 )
            {
                /* simple fly through the scene */
                camera.position.x += 0.05;
                camera.position.y += 0.05;
                camera.position.z -= 0.1;
            }
            camera.lookAt(new THREE.Vector3(0,0,0));
            render();
        };
    }
}

function animate(){
    for ( var i = 0; i < apps.length; ++i ) {
        apps[ i ].animate();
    }
    requestAnimationFrame(animate);
}

function render(){
    renderer.render(scene, camera);
}
The strange thing that happens is that only the last canvas renders (at all) if I call animate(); after all canvases are drawn. And if I call animate(); in the preView() function, all scenes are rendered once, but only the last canvas renders the 'camera fly through'. Yet a console.log(apps[i]); in the animate() function goes through all apps; they just don't render their scenes.
What do I do wrong here?
2.
I am also trying to achieve this shader effect for every object which I declare as a 'light', no matter which position it has in the scene.
I tried to play a little with all the position values in the shaders, with absolutely no effect.
The only effect came from the VolumetericLightShader on line 333.
I hope for any hints here.
Put all the variables, except apps = [], inside the App( id ) function. That makes them local to each App. As your code stands now, every time you call
new App( id )
you write into global variables that were created only once, so those variables always hold the data stored by the most recent call of App( id ): you keep overwriting the globals. The same goes for the render() method; put it inside the App() function too. As for the example from threejs.org you mentioned, notice where that method is stored there: it lives inside the App() function. Sample jsfiddle
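A minimal sketch of that refactor (radius is replaced by a placeholder value, since the original code defines it elsewhere): every view keeps its own camera, scene and renderer, and renders with them inside its own animate method.

function App( id ) {
    var container = $('#preView_' + id);
    var dimensions = { width: container.width(), height: container.height() };
    var radius = 50; // placeholder; the original code defines this elsewhere

    var camera = new THREE.PerspectiveCamera(45, dimensions.width / dimensions.height, 1, 5 * radius);
    camera.position.set(0, 0, 100);

    var scene = new THREE.Scene();
    /* add meshes */

    var renderer = new THREE.WebGLRenderer();
    renderer.setSize(dimensions.width, dimensions.height);
    container.append(renderer.domElement);

    this.animate = function() {
        if ( camera.position.z > -(1/3) * 100 ) {
            /* simple fly through the scene */
            camera.position.x += 0.05;
            camera.position.y += 0.05;
            camera.position.z -= 0.1;
        }
        camera.lookAt(new THREE.Vector3(0, 0, 0));
        renderer.render(scene, camera); // this instance's renderer, scene and camera
    };
}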
Maybe it would be easier to use the technique of lens flares. https://threejs.org/examples/webgl_lensflares.html
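For reference, a rough lens flare setup along the lines of that example; it assumes the Lensflare add-on (examples/jsm/objects/Lensflare.js in current three.js builds) is loaded, and the texture path is only a placeholder.

const textureLoader = new THREE.TextureLoader();
const flareTexture = textureLoader.load('textures/lensflare0.png'); // placeholder path

const light = new THREE.PointLight(0xffffff, 1.5, 2000);
light.position.set(0, 0, -100);

const lensflare = new Lensflare();
lensflare.addElement(new LensflareElement(flareTexture, 256, 0)); // texture, size in px, distance along the flare axis
light.add(lensflare); // the flare follows the light, wherever it sits in the scene
scene.add(light);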
