When I check if points are within a frustum using a Perspective camera, it works. But when I check using an Orthographic camera, the frustum box appears inaccurate. The planes are set correctly. Is there something else I'm overlooking that needs to change when using an Orthographic camera?
Here's how I'm setting the planes:
if ( ( this.mouse.screenx > mouse.screen.x && this.mouse.screeny < mouse.screen.y ) ||
     ( this.mouse.screenx < mouse.screen.x && this.mouse.screeny > mouse.screen.y ) ) {
    topPlane.setFromCoplanarPoints( camera.position, topRight, topLeft );
    rightPlane.setFromCoplanarPoints( camera.position, bottomRight, topRight );
    bottomPlane.setFromCoplanarPoints( camera.position, bottomLeft, bottomRight );
    leftPlane.setFromCoplanarPoints( camera.position, topLeft, bottomLeft );
} else {
    topPlane.setFromCoplanarPoints( camera.position, topLeft, topRight );
    rightPlane.setFromCoplanarPoints( camera.position, topRight, bottomRight );
    bottomPlane.setFromCoplanarPoints( camera.position, bottomRight, bottomLeft );
    leftPlane.setFromCoplanarPoints( camera.position, bottomLeft, topLeft );
}
nearPlane.setFromNormalAndCoplanarPoint( vector, camera.position );
vector.set( 0, 0, 1 );
vector.applyQuaternion( camera.quaternion );
var vector2 = new THREE.Vector3( 0, 0, -config.camera.far );
vector2.applyQuaternion( camera.quaternion );
vector2.add( camera.position );
farPlane.setFromNormalAndCoplanarPoint( vector, vector2 );
Okay, I figured the issue out.
I was creating my planes so that they extend from the camera's position; however, this creates a frustum that is tapered.
In orthographic mode there is no tapering; everything is straight, so the frustum planes shouldn't extend from the camera.
What I did was get the direction of the camera:
var vector = new THREE.Vector3( 0, 0, -1 );
vector.applyQuaternion( camera.quaternion );
And then I used this to create each plane. For example:
var top = topRight.clone(), right = bottomRight.clone(), bottom = bottomLeft.clone(), left = topLeft.clone();
top.add( vector );
right.add( vector );
bottom.add( vector );
left.add( vector );
topPlane.setFromCoplanarPoints( top, topLeft, topRight );
rightPlane.setFromCoplanarPoints( right, topRight, bottomRight );
bottomPlane.setFromCoplanarPoints( bottom, bottomRight, bottomLeft );
leftPlane.setFromCoplanarPoints( left, bottomLeft, topLeft );
I set it up so the frustum can be inverted, so be sure to check that the planes are facing the right way.
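For reference, a minimal containment test using the six planes above could look like the sketch below; it assumes every plane normal faces into the frustum (flip any plane with plane.negate() if it does not) and that `point` is the THREE.Vector3 being tested.
// Sketch: a point is inside when it lies on the positive side of all six planes.
function pointInFrustum( point ) {
    var planes = [ topPlane, rightPlane, bottomPlane, leftPlane, nearPlane, farPlane ];
    for ( var i = 0; i < planes.length; i ++ ) {
        if ( planes[ i ].distanceToPoint( point ) < 0 ) return false; // outside this plane
    }
    return true;
}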
I can only find examples that use the raycaster with the camera, but none that simply cast a ray from point A to point B.
I have a working raycaster; it picks up my helpers, lines, etc., but it does not seem to recognize my sphere.
My first thought was that my points were off, so I decided to create a line from pointA to pointB using a direction, like so:
var pointA = new Vector3( 50, 0, 0 );
var direction = new Vector3( 0, 1, 0 );
direction.normalize();
var distance = 100;
var pointB = new Vector3();
pointB.addVectors( pointA, direction.multiplyScalar( distance ) );
var geometry = new Geometry();
geometry.vertices.push( pointA );
geometry.vertices.push( pointB );
var material = new LineBasicMaterial( { color : 0xff0000 } );
var line = new Line( geometry, material );
scene.add( line ); // add the line to the scene so it is visible
This will show a line from my point (50, 0, 0) to (50, 100, 0), right through my sphere, which is at point (50, 50, 0), so my pointA and direction values are correct.
Next I add a raycaster.
To avoid conflicts with any side effects, I recreated my points here:
var raycaster = new Raycaster(new Vector3( 50, 0, 0 ), new Vector3( 0, 1, 0 ).normalize());
var intersects = raycaster.intersectObject(target);
console.log(intersects);
Seems pretty straightforward to me. I also tried raycaster.intersectObjects( scene.children ), but it returns the lines, helpers, etc. and not my sphere.
What am I doing wrong? I am surely missing something here.
[Image of the line and the sphere]
What you see is explained in the following GitHub issue:
https://github.com/mrdoob/three.js/issues/11449
The problem is that the ray emitted from THREE.Raycaster does not hit a face directly but one of its vertices, which results in no intersection.
There are several workarounds for this issue, e.g. slightly shifting the geometry or the ray. For your case:
var raycaster = new THREE.Raycaster( new THREE.Vector3( 50, 0, 0 ), new THREE.Vector3( 0, 1, 0.01 ).normalize() );
However, a better solution is to fix the engine and make the test more robust.
Demo: https://jsfiddle.net/kzwmoug2/3/
three.js R106
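For completeness, the other workaround mentioned above, shifting the geometry instead of the ray, could look like the following sketch (it assumes `target` is the sphere mesh from the question):
// Nudge the sphere slightly so the ray no longer passes exactly through a vertex.
target.position.x += 0.001;
var intersects = raycaster.intersectObject( target );
console.log( intersects ); // should now report the hit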
I want to separate each face of an icosahedron, as shown in the image above. Can anybody point me to a related example, or share any idea about how to make it work? In my example, each face of the icosahedron has a different image texture as its material, so I cannot use ShaderMaterial.
Thanks
You can use ExplodeModifier to convert your geometry into a so-called "triangle soup" and then translate the vertices as you wish.
var geometry = new THREE.IcosahedronGeometry( 4, 2 );
var modifier = new THREE.ExplodeModifier();
modifier.modify( geometry ); // gives every face its own three vertices
var normal = new THREE.Vector3();
for ( var i = 0, l = geometry.faces.length; i < l; i ++ ) {
    var face = geometry.faces[ i ];
    normal.copy( face.normal ).multiplyScalar( 1 ); // 1 = distance to push the face along its normal
    geometry.vertices[ face.a ].add( normal );
    geometry.vertices[ face.b ].add( normal );
    geometry.vertices[ face.c ].add( normal );
}
examples/js/modifiers/ExplodeModifier.js must be explicitly included in your project.
three.js r.87
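Note that THREE.Geometry and ExplodeModifier were removed in later three.js releases. On current versions, a similar "triangle soup" plus per-face translation could be sketched with BufferGeometry; the code below is an adaptation under that assumption, not the original answer's code.
// IcosahedronGeometry is built non-indexed, so every three consecutive
// vertices already form one independent face that can be moved on its own.
var geometry = new THREE.IcosahedronGeometry( 4, 2 );
if ( geometry.index !== null ) geometry = geometry.toNonIndexed(); // ensure triangle soup
var position = geometry.getAttribute( 'position' );
var a = new THREE.Vector3(), b = new THREE.Vector3(), c = new THREE.Vector3();
var normal = new THREE.Vector3();
for ( var i = 0; i < position.count; i += 3 ) {
    a.fromBufferAttribute( position, i );
    b.fromBufferAttribute( position, i + 1 );
    c.fromBufferAttribute( position, i + 2 );
    THREE.Triangle.getNormal( a, b, c, normal ); // face normal
    normal.multiplyScalar( 1 );                  // explode distance
    for ( var j = 0; j < 3; j ++ ) {
        position.setXYZ( i + j,
            position.getX( i + j ) + normal.x,
            position.getY( i + j ) + normal.y,
            position.getZ( i + j ) + normal.z );
    }
}
position.needsUpdate = true;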
I want to aim at objects with the camera's vision (as if the user were looking at the object rather than pointing at it with the mouse).
I'm casting a ray from the camera like this:
rotation.x = camera.rotation.x;
rotation.y = camera.rotation.y;
rotation.z = camera.rotation.z;
raycaster.ray.direction.copy( direction ).applyEuler(rotation);
raycaster.ray.origin.copy( camera.position );
var intersections = raycaster.intersectObjects( cubes.children );
This gets me the intersections, but it seems to wander off sometimes. So I'd like to add an aim (crosshair): some kind of object (mesh) at the end of, or in the middle of, the ray.
How can I add it? When I created a regular line, it was right in front of the camera, so the screen would go black.
You can add a crosshair constructed from simple geometry to your camera like this:
var material = new THREE.LineBasicMaterial({ color: 0xAAFFAA });
// crosshair size
var x = 0.01, y = 0.01;
var geometry = new THREE.Geometry();
// crosshair
geometry.vertices.push(new THREE.Vector3(0, y, 0));
geometry.vertices.push(new THREE.Vector3(0, -y, 0));
geometry.vertices.push(new THREE.Vector3(0, 0, 0));
geometry.vertices.push(new THREE.Vector3(x, 0, 0));
geometry.vertices.push(new THREE.Vector3(-x, 0, 0));
var crosshair = new THREE.Line( geometry, material );
// place it in the center
var crosshairPercentX = 50;
var crosshairPercentY = 50;
var crosshairPositionX = (crosshairPercentX / 100) * 2 - 1;
var crosshairPositionY = (crosshairPercentY / 100) * 2 - 1;
crosshair.position.x = crosshairPositionX * camera.aspect;
crosshair.position.y = crosshairPositionY;
crosshair.position.z = -0.3;
camera.add( crosshair );
scene.add( camera );
Three.js r107
http://jsfiddle.net/5ksydn6u/2/
In case you don't have a special use case where you need to retrieve the position and rotation from your camera like you are doing, I guess your "wandering off" could be fixed by calling your raycaster with these arguments:
raycaster.set( camera.getWorldPosition( new THREE.Vector3() ), camera.getWorldDirection( new THREE.Vector3() ) );
var intersections = raycaster.intersectObjects( cubes.children );
Cast visible ray
Then you can visualize your raycast in 3D space by drawing an arrow with the arrow helper. Do this after your raycast:
scene.remove( arrow );
arrow = new THREE.ArrowHelper( camera.getWorldDirection( new THREE.Vector3() ), camera.getWorldPosition( new THREE.Vector3() ), 100, Math.random() * 0xffffff );
scene.add( arrow );
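If you also want a marker at the point the ray actually hits, rather than a fixed crosshair, a small sketch could place a hypothetical `markerMesh` (any small mesh you have added to the scene) at the first intersection:
// Move the hypothetical marker mesh to the first hit point, if any.
if ( intersections.length > 0 ) {
    markerMesh.position.copy( intersections[ 0 ].point );
    markerMesh.visible = true;
} else {
    markerMesh.visible = false;
}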
I have been trying for weeks to raycast from the centre of a PerspectiveCamera, without event.clientX and event.clientY, because I use the device orientation:
var raycaster = new THREE.Raycaster();
var vector = new THREE.Vector3( 0, 0, 0 ); // ray origin, instead of event.clientX and event.clientY
var direction = new THREE.Vector3( 0, 0, -1 ).transformDirection( camera.matrixWorld );
raycaster.set( vector, direction );
var intersects = raycaster.intersectObjects(objects);
I have a PerspectiveCamera at (0, 0, 0). I use the device orientation instead of keyboard and mouse to obtain the rotation, and I want to hit a cube with a ray cast from the centre of my camera. I have tried a lot of three.js raycasting examples, but with no success.
What is wrong with my raycast?
For your raycast origin and direction, use the camera's world vectors:
raycaster.set( camera.getWorldPosition( new THREE.Vector3() ), camera.getWorldDirection( new THREE.Vector3() ) );
Or simply use:
raycaster.setFromCamera( new THREE.Vector2(), camera );
http://jsfiddle.net/gLbkg21e/
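Put together, a minimal usage sketch (assuming `objects` is the array of meshes from the question):
// An empty Vector2 is (0, 0), i.e. the centre of the screen in
// normalized device coordinates.
var raycaster = new THREE.Raycaster();
raycaster.setFromCamera( new THREE.Vector2(), camera );
var intersects = raycaster.intersectObjects( objects );
if ( intersects.length > 0 ) {
    console.log( 'hit', intersects[ 0 ].object );
}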
I am trying to build a CH4 molecule with three.js, but I run into trouble when I try to build the 109.5° angle:
methanum = function(x, y, z) {
molecule = new THREE.Object3D();
var startPosition = new THREE.Vector3( 0, 0, 0 );
molecule.add(atom(startPosition, "o"));
var secondPosition = new THREE.Vector3( -20, 10, 00 );
molecule.add(atom(secondPosition, "h"));
var angle = 109.5;
var matrix = new THREE.Matrix4().makeRotationAxis( new THREE.Vector3( 0, 1, 0 ), angle * ( Math.PI / 180 ));
var thirdPosition = secondPosition.applyMatrix4( matrix );
molecule.add(atom(thirdPosition, "h"));
var fourthPosition = thirdPosition.applyMatrix4( matrix );
molecule.add(atom(thirdPosition, "h"));
molecule.position.set(x, y, z);
molecule.rotation.set(x, y, z);
scene.add( molecule );
}
Demo: https://dl.dropboxusercontent.com/u/6204711/3d/ch4.html
But my atoms are not uniformly distributed as in the drawing.
Any ideas?
Well, there are three errors in your molecule code:
1. You place an oxygen at the center of the CH4 instead of a carbon.
2. When you add your fourth hydrogen, you pass thirdPosition even though you have just created fourthPosition.
3. You rotate around the wrong axis when you place your third hydrogen.
My hints are the following: first, place your carbon, then move along the Z-axis and place your first hydrogen. Rotate by 109.5° around the X-axis and place your second hydrogen. Rotate the position of your second hydrogen by 120° around the Z-axis and place your third hydrogen. Finally, rotate the position of your third hydrogen by 120° around the Z-axis once more and place your last hydrogen.
Here is the CH4 I tried:
methanum3 = function(x, y, z) {
molecule = new THREE.Object3D();
var startPosition = new THREE.Vector3( 0, 0, 0 );
molecule.add(atom(startPosition, "c"));
var axis = new THREE.AxisHelper( 50 );
axis.position.set( 0, 0, 0 );
molecule.add( axis );
var secondPosition = new THREE.Vector3( 0, 0, -40 );
molecule.add(atom(secondPosition, "h"));
var angle = 109.5;
var matrixX = new THREE.Matrix4().makeRotationAxis( new THREE.Vector3( 1, 0, 0 ), angle * ( Math.PI / 180 ));
var thirdPosition = secondPosition.applyMatrix4( matrixX );
molecule.add(atom(thirdPosition, "h"));
var matrixZ = new THREE.Matrix4().makeRotationAxis( new THREE.Vector3( 0, 0, 1 ), 120 * ( Math.PI / 180 ));
var fourthPosition = thirdPosition.applyMatrix4( matrixZ );
molecule.add(atom(fourthPosition, "h"));
var fifthPosition = fourthPosition.applyMatrix4( matrixZ );
molecule.add(atom(fifthPosition, "h"));
molecule.position.set(x, y, z);
//molecule.rotation.set(x, y, z);
scene.add( molecule );
}
//water(0,0,0);
//water(30,60,0);
methanum3(-30,60,0);
Explanation:
Let's call H1 one hydrogen and H2 another one. The given angle of 109.5° is defined in the plane spanned by the vectors CH1 and CH2. Therefore, when you look along the normal of that plane, you see the 109.5° angle (cf. the right part of the image below). But when you look along the normal of another plane, you get the projection of that angle onto that plane. In your case, when you look in the direction of the Z-axis, you see an angle of 120° (cf. the left part of the image below).
The two angles look different depending on the direction of the camera.
Hope this helps.
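To make the two angles concrete, here is a small numeric check (a sketch, independent of the answer's variables):
// Two C-H directions are ~109.5° apart in 3D, but their projections onto the
// X/Y plane (what you see when looking along the Z-axis) are 120° apart.
var deg = Math.PI / 180;
var h1 = new THREE.Vector3( 0, 0, -1 );                                          // first C-H direction
var h2 = h1.clone().applyAxisAngle( new THREE.Vector3( 1, 0, 0 ), 109.5 * deg ); // rotated 109.5° about X
var h3 = h2.clone().applyAxisAngle( new THREE.Vector3( 0, 0, 1 ), 120 * deg );   // rotated 120° about Z
console.log( h2.angleTo( h3 ) / deg );                                           // ≈ 109.5 (true H-C-H angle)
console.log( h2.clone().setZ( 0 ).angleTo( h3.clone().setZ( 0 ) ) / deg );       // ≈ 120 (projected angle)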