Ray intersection when using morphTargets not working - three.js

Slightly complex, so bear with me:
Ray intersect works perfectly when an object has no morphTargets.
When an object has morphTargets, only the original position can be intersected. That is, if I morph a model from (0, 0, 0) to (50, 50, 50), the ray will not intersect the object at (50, 50, 50); instead, when I mouse over (0, 0, 0) I get an intersection (even though the object is no longer there!).
Is there some sort of flag I need to turn on to make three.js aware that the verts have moved?
Edit: code added.
This makes my mesh and adds it to the objects array (which ray intersect uses):
function createDeer( deerGeometry, materials ) {
    mesh = new THREE.MorphAnimMesh( deerGeometry, new THREE.MeshLambertMaterial( { color: 0xE8E8E8, ambient: 0xE8E8E8, morphTargets: true, vertexColors: THREE.FaceColors } ) );
    mesh.scale.set( 3, 3, 3 );
    mesh.position.set( 0, -3, 0 );
    mesh.rotation.set( 0, 0, 0 );
    mesh.castShadow = true;
    mesh.receiveShadow = true;
    mesh.geometry.dynamic = true;
    scene.add( mesh );
    objects.push( mesh );
}
Ray intersection happens on mouseDown (there's a mouseOver handler as well, same thing). Like I said, the code works fine; it's just intersecting with the original, unmorphed mesh:
function onDocumentMouseDown( event ) {
    event.preventDefault();
    var vector = new THREE.Vector3( mouse.x, mouse.y, 0.5 );
    projector.unprojectVector( vector, camera );
    var ray = new THREE.Ray( camera.position, vector.subSelf( camera.position ).normalize() );
    var intersects = ray.intersectObjects( objects );
    if ( intersects.length > 0 ) {
        SELECTED = intersects[ 0 ].object;
        for ( var i = 0; i < objects.length; i++ ) {
            if ( SELECTED.position.x == objects[ i ].position.x ) { // was objects[0], which always compared against the first object
                thisObject = i;
            }
        }
    }
    var intersects = ray.intersectObject( plane );
    container.style.cursor = 'pointer';
}
I've decided the problem must be related to the fact that the position of the deer (the mesh transform) never changes, even though the vertices move away. Since the ray intersection compares object positions, perhaps the problem is here?

I've made a pull request that has been merged and fixes this.
Note that for it to work, the boundingSphere of the object needs to contain the full extent of the morphing.
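A minimal sketch of one way to guarantee that, assuming the legacy Geometry API where geometry.morphTargets is an array of { name, vertices } entries holding absolute vertex positions:
// Grow the bounding sphere until it contains every morph target's vertices,
// so the early sphere test can't cull the morphed mesh before the face test.
geometry.computeBoundingSphere();
var sphere = geometry.boundingSphere;
geometry.morphTargets.forEach( function ( target ) {
    target.vertices.forEach( function ( v ) {
        sphere.radius = Math.max( sphere.radius, sphere.center.distanceTo( v ) );
    } );
} );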

The morph target animation takes place entirely on the GPU (in the shader code), while the ray intersection is always computed on the CPU. So, in fact, there's no easy way to achieve what you're describing here.
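Before the fix mentioned above landed, one CPU-side workaround was to bake the current morph state into the geometry yourself before intersecting. A hedged sketch, assuming the legacy Geometry API; bakeMorph is a helper name of my own:
// Replay the morph mix on the CPU so the ray test sees the deformed shape.
// Keep a pristine copy of the base vertices, since we overwrite geometry.vertices.
var baseVertices = mesh.geometry.vertices.map( function ( v ) { return v.clone(); } );
function bakeMorph( mesh ) {
    var geo = mesh.geometry;
    for ( var vi = 0; vi < geo.vertices.length; vi++ ) {
        var result = baseVertices[ vi ].clone();
        for ( var ti = 0; ti < geo.morphTargets.length; ti++ ) {
            var influence = mesh.morphTargetInfluences[ ti ];
            if ( influence === 0 ) continue;
            var offset = geo.morphTargets[ ti ].vertices[ vi ].clone().sub( baseVertices[ vi ] );
            result.add( offset.multiplyScalar( influence ) );
        }
        geo.vertices[ vi ].copy( result );
    }
    geo.verticesNeedUpdate = true;
}
// Call bakeMorph( mesh ) right before ray.intersectObjects( objects ).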

Related

Three.js raycaster WITHOUT camera

I seem to find only examples that use the raycaster with the camera, but none that simply cast a ray from point A to point B.
I have a working raycaster; it retrieves my helpers, lines, etc., but it does not seem to recognize my sphere.
My first thought was that my points were off, so I decided to create a line from my pointA to my pointB with a direction, like so:
var pointA = new Vector3( 50, 0, 0 );
var direction = new Vector3( 0, 1, 0 );
direction.normalize();
var distance = 100;
var pointB = new Vector3();
pointB.addVectors ( pointA, direction.multiplyScalar( distance ) );
var geometry = new Geometry();
geometry.vertices.push( pointA );
geometry.vertices.push( pointB );
var material = new LineBasicMaterial( { color : 0xff0000 } );
var line = new Line( geometry, material );
This shows a line from my point (50, 0, 0) to (50, 100, 0), right through my sphere, which is at (50, 50, 0), so my pointA and direction values are correct.
Next I add a raycaster. To avoid conflicts with any side effects, I recreated my points here:
var raycaster = new Raycaster(new Vector3( 50, 0, 0 ), new Vector3( 0, 1, 0 ).normalize());
var intersects = raycaster.intersectObject(target);
console.log(intersects);
Seems pretty straightforward to me. I also tried raycaster.intersectObjects(scene.children), but it returns lines, helpers, etc., and not my sphere.
What am I doing wrong? I am surely missing something here.
[Image of the line and the sphere]
What you see is explained in the following GitHub issue:
https://github.com/mrdoob/three.js/issues/11449
The problem is that the ray emitted from THREE.Raycaster does not hit a face directly but one of its vertices, which results in no intersection.
There are several workarounds for this issue, e.g. slightly shifting the geometry or the ray. For your case:
var raycaster = new THREE.Raycaster( new THREE.Vector3( 50, 0, 0 ), new THREE.Vector3( 0, 1, 0.01 ).normalize() );
However, a better solution is to fix the engine and make the test more robust.
Demo: https://jsfiddle.net/kzwmoug2/3/
three.js R106

Three.js raycasting collision not working

I am working on an arcade style Everest Flight Simulator.
In my debugger, where I am building this, I have a Terrain and a Helicopter class, which generate the BufferGeometry terrain mesh, the Groups for the helipad geometries, and the Group for the helicopter camera and geometry.
My issue is that currently I can't seem to get any collision to detect. I imagine raycasting may not support BufferGeometries, which is a problem for me: the terrain has to be a BufferGeometry since it's far too expansive otherwise, and as a standard Geometry it causes a memory crash in the browser.
However, testing the helipad geometries alone, the collision still does not trigger. They are in a Group, so I add the Groups to a global window array and set the collision check to be recursive, but to no avail.
Ultimately, I am open to other forms of collision detection, and may need two kinds since I have to use BufferGeometries. Any ideas on how to fix this, or a better solution?
The Helicopter Object Itself
// Rect to Simulate Helicopter
const geometry = new THREE.BoxGeometry( 2, 1, 4 ),
      material = new THREE.MeshBasicMaterial(),
      rect = new THREE.Mesh( geometry, material );
rect.position.x = 0;
rect.position.y = terrain.returnCameraStartPosY();
rect.position.z = 0;
rect.rotation.order = "YXZ";
rect.name = "heli";

// Link Camera and Helicopter
const heliCam = new THREE.Group(),
      player = new Helicopter(heliCam, "OH-58 Kiowa", 14000);
heliCam.add(camera);
heliCam.add(rect);
heliCam.position.set( 0, 2040, -2000 );
heliCam.name = "heliCam";
scene.add(heliCam);
Adding Objects to Global Collision Array
// Add Terrain
const terrain = new Terrain.ProceduralTerrain(),
      terrainObj = terrain.returnTerrainObj(),
      helipadEnd = new Terrain.Helipad( 0, 1200, -3600, "Finish", true ),
      helipadStart = new Terrain.Helipad( 0, 2000, -2000, "Start", false ),
      helipadObjStart = helipadStart.returnHelipadObj(),
      helipadObjEnd = helipadEnd.returnHelipadObj();
window.collidableMeshList.push(terrainObj);
window.collidableMeshList.push(helipadObjStart);
window.collidableMeshList.push(helipadObjEnd);
Collision Detection Function Run Every Frame
collisionDetection(){
    const playerOrigin = this.heli.children[1].clone(); // Get Box Mesh from Player Group
    for (let i = playerOrigin.geometry.vertices.length - 1; i >= 0; i--) {
        const localVertex = playerOrigin.geometry.vertices[i].clone(),
              globalVertex = localVertex.applyMatrix4( playerOrigin.matrix ),
              directionVector = globalVertex.sub( playerOrigin.position ),
              ray = new THREE.Raycaster( playerOrigin, directionVector.clone().normalize() ),
              collisionResults = ray.intersectObjects( window.collidableMeshList, true ); // Recursive Boolean for children
        if ( collisionResults.length > 0 ){
            this.landed = true;
            console.log("Collision");
        }
        // if ( collisionResults.length > 0 && collisionResults[0].distance < directionVector.length() ){
        //     this.landed = true;
        //     console.log("Collision with vectorLength")
        // }
    }
}
It's hard to tell what's going on inside your custom classes, but it looks like you're using an Object3D as the first argument of the raycaster instead of a Vector3 when you use this.heli.children[1].clone(). Why don't you try something like:
var raycaster = new THREE.Raycaster();
var origin = this.heli.children[1].position;
raycaster.set(origin, direction);
Also, are you sure you're using a BufferGeometry? When you access a vertex like this: playerOrigin.geometry.vertices[i], it should give you an error. There is no vertices attribute on a BufferGeometry, so I don't know how you're determining the direction vector.
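For the BufferGeometry case, here is a sketch of how the per-vertex loop in collisionDetection could read from the position attribute instead (assuming this.heli.children[1] is the box mesh, as in the question's code):
const mesh = this.heli.children[ 1 ];
const posAttr = mesh.geometry.attributes.position; // flat array of vertex positions
const origin = new THREE.Vector3();
mesh.getWorldPosition( origin );
const raycaster = new THREE.Raycaster();
const vertex = new THREE.Vector3();
for ( let i = 0; i < posAttr.count; i++ ) {
    vertex.fromBufferAttribute( posAttr, i ).applyMatrix4( mesh.matrixWorld );
    const direction = vertex.clone().sub( origin );
    raycaster.set( origin, direction.clone().normalize() );
    raycaster.far = direction.length(); // ignore hits farther away than the vertex itself
    const hits = raycaster.intersectObjects( window.collidableMeshList, true );
    if ( hits.length > 0 ) {
        this.landed = true;
        console.log( "Collision" );
    }
}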

Raycasting against a mesh doesn't hit it where it's visible in the scene

I'm having a strange problem with raycasting. My scene consists of a room with a couple of components that you can move around inside that room. When a component is moving, I measure the distances to the walls, an invisible roof, and the floor. The problem is that the roof, which is a ShapeGeometry, is visible where it should be, at the top of the walls, but is not hit when raycasting.
Here's where I create the mesh for the invisible roof:
const roofShape = new THREE.Shape();
roofShape.moveTo(roofPoints[0].x, roofPoints[0].y);
for (let i = 1; i < roofPoints.length; i++) {
    roofShape.lineTo(roofPoints[i].x, roofPoints[i].y);
}
roofShape.lineTo(roofPoints[0].x, roofPoints[0].y);
const geometry = new THREE.ShapeGeometry(roofShape);
const material = new THREE.MeshBasicMaterial({color: 0x000000, side: THREE.DoubleSide});
material.opacity = 0;
material.transparent = true;
const mesh = new THREE.Mesh(geometry, material);
mesh.position.x = 0;
mesh.position.y = 0;
mesh.position.z = room._height;
mesh.name = "ROOF";
mesh.userData = <Object3DUserData> {
    id: IntersectType.INVISIBLE_ROOF,
    intersectType: IntersectType.INVISIBLE_ROOF,
};
The function that invokes the raycasting. The direction vector is (0, 0, 1) in this case, and the surfaces parameter is an array containing only the mesh created above:
function getDistanceToSurface(componentPosition: THREE.Vector3, surfaces: THREE.Object3D[], direction: THREE.Vector3): number {
    const rayCaster = new THREE.Raycaster(componentPosition, direction.normalize());
    const intersections = rayCaster.intersectObjects(surfaces);
    if (!intersections || !intersections.length) {
        return 0;
    }
    const val = intersections[0].distance;
    return val;
}
By changing the z direction to -1 i found that the raycaster found the roof at z=0. It seems that the geometry is still at position z=0.
I then tried to translate the geometry shape
geometry.translate(0, 0, room._height);
Now the raycaster finds it where I expect it to be, but visually it sits at double the z position (with mesh opacity = 1). Setting the mesh's z position to 0 makes it visually correct, and the raycasting still works.
I've been looking at the raycasting examples but can't find anywhere that a ShapeGeometry needs to do this.
Am I doing something wrong? Have I missed something? Do I have to set the z position on the geometry; is it not enough to position the mesh?
As hinted in the comment by @radio, the solution was as described in How to update vertices geometry after rotate or move object:
mesh.position.z = room._height;
mesh.updateMatrix(); // recompute the local matrix from position/rotation/scale
mesh.geometry.applyMatrix(mesh.matrix); // bake the transform into the vertices
mesh.matrix.identity(); // reset the matrix so the transform isn't applied twice
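As an aside (my own assumption, not part of the accepted fix): THREE.Raycaster intersects against each object's matrixWorld, so if getDistanceToSurface runs before the first render has updated the scene graph, forcing an update may be enough on its own:
mesh.updateMatrixWorld( true ); // push mesh.position into matrixWorld before raycasting
const intersections = rayCaster.intersectObjects( surfaces );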

Draw line in direction of raycaster in three.js

In three.js, I'm using PointerLock controls to make a basic first-person shooter.
I use
function onDocumentMouseDown( event ) {
    var raycaster = new THREE.Raycaster();
    mouse3D.normalize();
    controls.getDirection( mouse3D );
    raycaster.set( controls.getObject().position, mouse3D );
    var intersects = raycaster.intersectObjects( objects );
    ...
}
to detect a collision with an object, which means you "shot" the object.
Now, I want to visualize the path the bullet took. I was thinking of drawing a line from where the user is looking, in the direction of the raycaster, but I can't figure out how to do this. Can anyone help me? I'm new to three.js; I never thought drawing a line would be this hard.
Update:
I'm trying to draw a line using:
var geometry = new THREE.Geometry();
geometry.vertices.push(...);
geometry.vertices.push(...);
var line = new THREE.Line(geometry, material);
scene.add(line);
but I can't figure out what to put in place of the "...". How can I determine which point the line should go to, and which point it starts from? The player is able to move and even jump, so the starting point is always different too.
You can use the following (using r83):
// Draw a line from pointA in the given direction at distance 100
var pointA = new THREE.Vector3( 0, 0, 0 );
var direction = new THREE.Vector3( 10, 0, 0 );
direction.normalize();
var distance = 100; // at what distance to determine pointB
var pointB = new THREE.Vector3();
pointB.addVectors ( pointA, direction.multiplyScalar( distance ) );
var geometry = new THREE.Geometry();
geometry.vertices.push( pointA );
geometry.vertices.push( pointB );
var material = new THREE.LineBasicMaterial( { color : 0xff0000 } );
var line = new THREE.Line( geometry, material );
scene.add( line );
Codepen at: https://codepen.io/anon/pen/evNqGy
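Since the shooter code above already has a raycaster, both endpoints can also be derived from raycaster.ray itself. A sketch along the same lines (same r83-era Geometry API; the distance value is just a placeholder):
// Derive the bullet line straight from the raycaster's ray
var distance = 100;
var pointA = raycaster.ray.origin.clone();
var pointB = raycaster.ray.at( distance, new THREE.Vector3() ); // point 'distance' units along the ray
var geometry = new THREE.Geometry();
geometry.vertices.push( pointA, pointB );
var line = new THREE.Line( geometry, new THREE.LineBasicMaterial( { color: 0xff0000 } ) );
scene.add( line );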
You can use something like this:
function animate_Line(frame, totalFrames) {
    // Calculate how much of the line should be drawn every iteration
    var delta = lineDistance / totalFrames;
    var deltaSpeed = delta * frame;
    for (var i = 0; i < f_Ray_List[0].length; i++) {
        for (var j = 0; j < f_Ray_List[1].length; j++) {
            // Change offsets
            line.geometry.vertices[1].y = line.geometry.vertices[0].y - deltaSpeed;
            // Update rays = true (make FRT rays draw-able)
            line.geometry.verticesNeedUpdate = true;
        }
    }
}
where frame is the current frame (a counter in your animate function) and totalFrames is the number of frames the line should take to animate. The lineDistance can be calculated like this:
lineDistance = line.geometry.vertices[0].y - line.geometry.vertices[1].y; // Add this line where you create the line object.
Remember to set line.geometry.verticesNeedUpdate = true; on every line individually so that the line can be animated.
Note that this only animates along the line's y axis, so it won't look great at first; I'm currently working on converting it to polar coordinates instead.

Unexpected mesh results from ThreeCSG boolean operation

I am creating a scene and have used a boolean function to cut holes out of my wall. However, the lighting reveals that the resulting shapes have messed-up faces. I want the surface to look like one solid piece, rather than fragmented and lit backwards. Does anyone know what could be going wrong with my geometry?
The code that booleans objects is as follows:
// Boolean-subtract two shapes: convert meshes to BSPs, subtract, then convert back to a mesh
var booleanSubtract = function (Mesh1, Mesh2, material) {
    // Mesh1 conversion
    var mesh1BSP = new ThreeBSP( Mesh1 );
    // Mesh2 conversion
    var mesh2BSP = new ThreeBSP( Mesh2 );
    var subtract_bsp = mesh1BSP.subtract( mesh2BSP );
    var result = subtract_bsp.toMesh( material );
    result.geometry.computeVertexNormals();
    return result;
};
I have two lights in the scene:
var light = new THREE.DirectionalLight( 0xffffff, 0.75 );
light.position.set( 0, 0, 1 );
scene.add( light );
//create a point light
var pointLight = new THREE.PointLight(0xFFFFFF);
// set its position
pointLight.position.x = 10;
pointLight.position.y = 50;
pointLight.position.z = 130;
// add to the scene
scene.add(pointLight);
EDIT: Using WestLangley's suggestion, I was able to partially fix the wall rendering. And by using material.wireframe=true; I can see that after the boolean operation my wall faces are not merged. Is there a way to merge them?
Your problems are due to two issues.
First, you should be using FlatShading.
Second, as explained in this Stack Overflow post, MeshLambertMaterial only calculates the lighting at each vertex and interpolates the color across each face; MeshPhongMaterial calculates the color at each texel.
You need to use MeshPhongMaterial to avoid the lighting artifacts you are seeing.
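For example, a minimal sketch of that material for the r.68-era API (wallMesh and holeMesh are stand-ins for your own meshes):
// Phong material with flat shading: per-face normals hide the CSG triangulation
var material = new THREE.MeshPhongMaterial( {
    color: 0x808080,
    shading: THREE.FlatShading
} );
var result = booleanSubtract( wallMesh, holeMesh, material );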
three.js r.68
