Plane constant definition in three.js

The documentation for the Plane constructor says that constant is the signed distance from the origin to the plane. However, if I construct a simple "floor" below the XZ plane like this:
var plane = new THREE.Plane();
plane.setFromCoplanarPoints(
new THREE.Vector3(0, -10, 0),
new THREE.Vector3(0, -10, 1),
new THREE.Vector3(1, -10, 0)
);
console.log(plane);
I get the following output:
Plane {normal: Vector3, constant: 10, constructor: Object}
normal: Vector3
x: 0
y: 1
z: 0
<constructor>: "Vector3"
constant: 10
<constructor>: "Plane"
My question: why is constant +10 and not -10? The Y axis points up and the floor normal points up, so movement from the origin to a plane at Y = -10 should be negative, shouldn't it?
Functions like Plane.distanceToPoint give the correct answer, so I suspect I'm not reading the documentation correctly.
CodePen Sample
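For reference, a quick sketch using the same plane as above; the comments show what distanceToPoint reports:
// Same "floor" plane as above: normal (0, 1, 0), constant 10.
// distanceToPoint(p) evaluates normal.dot(p) + constant.
console.log(plane.distanceToPoint(new THREE.Vector3(0, -10, 0))); // 0, the point lies on the plane
console.log(plane.distanceToPoint(new THREE.Vector3(0, 0, 0))); // 10, the origin is 10 units above the floor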

Related

How do I correctly render a texture in a quad in Metal?

My issue comes from the fact that a quad is just two triangles. The texture is not rendered consistently on each triangle, and it breaks across the border between the two triangles. I'm using a screenshot of my lovely Minecraft house as an example texture:
Rendered textured quad
As you can see, toward the top left and bottom right of the screenshot the texture seems to have been cut or folded, or somehow distorted. And the distortion I'm talking about is NOT from the fact that it's being applied to a trapezoid; it's that it is being applied inconsistently to the two triangles that make up the trapezoid.
Original screenshot
So how can I fix this?
In viewDidLoad:
let VertexDescriptor = MTLVertexDescriptor()
let Attribute1Offset = MemoryLayout<simd_float3>.stride
let Attribute2Offset = Attribute1Offset+MemoryLayout<simd_float4>.stride
VertexDescriptor.attributes[0].format = .float3
VertexDescriptor.attributes[1].format = .float4
VertexDescriptor.attributes[1].offset = Attribute1Offset
VertexDescriptor.attributes[2].format = .float2
VertexDescriptor.attributes[2].offset = Attribute2Offset
VertexDescriptor.layouts[0].stride = Attribute2Offset+MemoryLayout<simd_float2>.stride
PipelineDescriptor.vertexDescriptor = VertexDescriptor
let TextureLoader = MTKTextureLoader(device: Device)
Texture = try? TextureLoader.newTexture(URL: Bundle.main.url(forResource: "Texture.png", withExtension: nil)!)
Vertices:
//First four = position, second four = color, last two = texture coordinates
let Vertices: [Float] = [-0.5, 0.5, 0, 0, 1, 1, 0, 1, 0, 0,
0.5, 0.5, 0, 0, 0, 1, 1, 1, 1, 0,
1, -1, 0, 0, 0, 1, 0, 1, 1, 1,
-1, -1, 0, 0, 0, 0, 1, 1, 0, 1]
Types in Shaders.metal
typedef struct {
float4 Position [[attribute(0)]];
float4 Color [[attribute(1)]];
float2 TexCoord [[attribute(2)]];
} VertexIn;
typedef struct {
float4 Position [[position]];
float4 Color;
float2 TexCoord;
} VertexOut;
Bear with me: I use PascalCase because I think camelCase is ugly; I just don't like it. Anyway, how do I correctly place a texture on a quad made of two triangles so it won't look all weird?
As you know, Metal performs perspective-correct vertex attribute interpolation on your behalf by using the depth information provided by the z coordinate of your vertex positions.
You're subverting this process by distorting the "projected" shape of the quad without providing perspective information to the graphics pipeline. This means that you need to pass along a little extra information in order to get correct interpolation. Specifically, you need to include "depth" information in your texture coordinates and perform the "perspective" divide manually in the fragment shader.
For a fully-general solution, consult this answer, but for a simple fix in the case of symmetrical scaling about the vertical axis of the quad, use float3 texture coordinates instead of float2 and set the x and z coordinates such that z is the scale factor introduced by your pseudo-perspective projection and when x is divided by z, the result is what x would have been without the divide.
For example, if the distance between the top two vertices is half that of the bottom two vertices (as appears to be the case in your screenshot), set the upper-left texture coordinate to (0, 0, 0.5) and the upper-right texture coordinate to (0.5, 0, 0.5). Pass these "3D" texture coordinates through to your fragment shader, then divide by z before sampling:
half4 color = myTexture.sample(mySampler, in.texCoords.xy / in.texCoords.z);
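To pass that third component through, the texture-coordinate attribute needs to become a float3 as well. A minimal sketch of the struct changes, assuming the same names as in the question (the vertex descriptor's attributes[2].format would likewise change to .float3 and the stride would grow by one float):
typedef struct {
float4 Position [[attribute(0)]];
float4 Color [[attribute(1)]];
float3 TexCoord [[attribute(2)]]; // now carries the extra divisor in z
} VertexIn;
typedef struct {
float4 Position [[position]];
float4 Color;
float3 TexCoord; // passed through to the fragment shader for the manual divide
} VertexOut;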

Problem in understanding three.js coordinate and axes system

I have following code:
// coordinate values
var x1 = -815723.5125568421;
var y1 = 20538442.534868136;
var z1 = -17.439584224846456;
var x2 = -815723.5125568421;
var y2 = 20538443.164575472;
var z2 = -16.620415776398275;
// make a rectangular face parallel to y-z plane
var dummySquare = new THREE.Geometry();
dummySquare.vertices.push(new THREE.Vector3(x1,y1,z1));
dummySquare.vertices.push(new THREE.Vector3(x1,y1,z2));
dummySquare.vertices.push(new THREE.Vector3(x2,y2,z1));
dummySquare.vertices.push(new THREE.Vector3(x2,y2,z2));
dummySquare.faces.push(new THREE.Face3(0,1,2));
dummySquare.faces.push(new THREE.Face3(1,2,3));
var dummySquareMaterial = new THREE.MeshBasicMaterial( { color: "#0000FF", side: THREE.DoubleSide } );
var dummySquareMesh = new THREE.Mesh(dummySquare, dummySquareMaterial);
So I am making a rectangular face parallel to the y-z plane.
During debugging I observe the following:
vertices: Array(4)
0: p {x: -815723.5125568421, y: 20538442.534868136, z: -17.439584224846456}
1: p {x: -815723.5125568421, y: 20538442.534868136, z: -16.620415776398275}
2: p {x: -815723.5125568421, y: 20538443.164575472, z: -17.439584224846456}
3: p {x: -815723.5125568421, y: 20538443.164575472, z: -16.620415776398275}
position: p {x: 0, y: 0, z: 0}
So the vertices are as expected, but position is (0, 0, 0). I expected position to be the midpoint of the plane defined by the four vertices above.
What is missing here in my understanding?
Another observation is as follows.
I make two faces just like the one above (same vertices).
For one of the two faces, I determine the centre of the geometry, move the geometry to the origin (translate by the negative of the centre), create a mesh from it, and then move the mesh back to its original position:
var face = new THREE.Geometry();
// ... add vertices as in the code snippet above
face.computeBoundingBox(); // the bounding box must be computed before it can be read
var faceCentre = new THREE.Vector3();
face.boundingBox.getCenter(faceCentre);
face.translate(-faceCentre.x, -faceCentre.y, -faceCentre.z);
// make mesh
var faceMaterial = new THREE.MeshBasicMaterial({ color: "#FF0000", side: THREE.DoubleSide });
var faceMesh = new THREE.Mesh(face, faceMaterial);
// move mesh back by setting its position to the original centre of the face
faceMesh.position.x = faceCentre.x;
faceMesh.position.y = faceCentre.y;
faceMesh.position.z = faceCentre.z;
The unmoved face has the same vertices as the face above, as expected.
But the other face now has totally different vertices, even though both are displayed at the same position and in the same orientation.
Why this difference in vertices?
THREE.js uses a hierarchical representation of objects and their transformations. In particular, the .position of an Object3D is not, as you expected, its middle point in world space; it is better viewed as a variable that stores the current translation of the object, called its local position. This translation is (0, 0, 0) by default.
So when you define an object by the vertices of its geometry, the object's vertices will render at those positions. However, if you .translate() the object by (dx, dy, dz), then a vertex (vx, vy, vz) will render at position (vx+dx, vy+dy, vz+dz).
Similarly, other transformations are also stored as members of the object. The vertices of the geometry do not change when an object is transformed; instead the object keeps track of its current local transformations, which are applied to the vertices, typically as a series of matrix multiplications. Using this logic, you can define a tree of objects nested inside each other, each with local transformations relative to its parent, which in turn may be transformed relative to its own parent, and so on. This sort of representation proves very useful for more complicated scenes.
This should explain your results. For example, in your first test you successfully create the object exactly where you want it, but its position is still (0, 0, 0) because it has undergone no transformations.
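To make the distinction concrete, here is a minimal sketch (material is assumed to be defined, and BoxGeometry is just a stand-in for your face geometry). Both meshes render at the same place, but only the second reports a non-zero .position:
var geometry = new THREE.BoxGeometry(1, 1, 1);
// A: bake the offset into the geometry's vertices
var meshA = new THREE.Mesh(geometry.clone().translate(0, 10, 0), material);
console.log(meshA.position); // (0, 0, 0): the object itself has not been transformed
// B: keep the geometry at the origin and translate the object
var meshB = new THREE.Mesh(geometry, material);
meshB.position.set(0, 10, 0);
console.log(meshB.position); // (0, 10, 0): the translation is stored on the object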

Three.js - Can I 'apply' position, rotation, and scale to the geometry?

I'd like to edit an object's position, rotation, and scale vectors, then 'apply' them to the geometry, which would zero-out those vectors, but retain the transformation.
For example, let's say I import a cube with a side-length of 1. The min and max vertices of the cube are located at (0, 0, 0) and (1, 1, 1). I set the object's scale to (2, 2, 2) and then apply the transformation to the geometry.
After applying, the scale is reset to (1, 1, 1) and the min and max vertices of the cube are (0, 0, 0) and (2, 2, 2), respectively.
Is there some built-in way to do this?
You can apply an object's transform to the object's geometry directly, and then reset the position, rotation, and scale like so:
object.updateMatrix();
object.geometry.applyMatrix( object.matrix );
object.position.set( 0, 0, 0 );
object.rotation.set( 0, 0, 0 );
object.scale.set( 1, 1, 1 );
object.updateMatrix();
three.js r.69
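As a quick sanity check, a sketch against the r.69-era Geometry API (material is assumed to be defined; BoxGeometry(1, 1, 1) is centered on the origin): after baking, the geometry's bounding box reflects the scale and the object's own transform is back to the identity.
var mesh = new THREE.Mesh(new THREE.BoxGeometry(1, 1, 1), material);
mesh.scale.set(2, 2, 2);
// bake the current transform into the geometry, then reset the object
mesh.updateMatrix();
mesh.geometry.applyMatrix(mesh.matrix);
mesh.position.set(0, 0, 0);
mesh.rotation.set(0, 0, 0);
mesh.scale.set(1, 1, 1);
mesh.updateMatrix();
mesh.geometry.computeBoundingBox();
console.log(mesh.geometry.boundingBox.max); // roughly (1, 1, 1): the unit cube now spans 2 units
console.log(mesh.scale); // (1, 1, 1)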
Another solution is to create a wrapper object: the child object carries the "default" rotation, position, and scale you want baked in, and any further transforms are applied to the wrapper:
const ChildObject = load(/*...*/);
ChildObject.rotation.set(Math.PI / 2, 0, 0); // default rotation goes here (rotations are in radians)
const MainObject = new THREE.Object3D();
MainObject.add(ChildObject);

Rotate camera based on angle

I would like to rotate an object by a certain angle around the Y axis.
Based on this answer, How to rotate a Three.js Vector3 around an axis?, I expected to get an updated vector.
My code is:
var vec = new THREE.Vector3( 0,0,0 );
var axis = new THREE.Vector3( 0,1,0 );
var angle = Math.PI / 2;
vec.applyAxisAngle( axis, angle );
I'm using r67 and it returns 0, 0, 0. I've tried r69 as well and it returns the same. I'm not quite ready to move to r69. Could you please tell me how to do the same thing using r67? Thanks.
You are rotating the vector (0, 0, 0), which is the origin, and whatever angle you use to rotate the origin around any axis, you will always get (0, 0, 0) back. Just imagine you are doing a simple 2D rotation; after all, rotation around the Y axis can be viewed as a 2D rotation in the X-Z plane.
Try some other values for the vec variable, for example (1, 0, 0) or (0, 0, 1), and you will see results.
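For example, a quick sketch of the same calls with a non-zero vector (same API; this works the same way in r67 and r69):
var vec = new THREE.Vector3( 1, 0, 0 );
var axis = new THREE.Vector3( 0, 1, 0 );
var angle = Math.PI / 2;
vec.applyAxisAngle( axis, angle );
console.log( vec ); // approximately (0, 0, -1)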

Rotate a plane on its edge in three.js

I wanted to rotate a plane, but I can't figure out how to set the rotation axis. I'd like to rotate the plane around its edge.
I've seen solutions suggesting matrix transformations, but they lacked explanation, so I couldn't apply them.
OK, I figured it out. What you have to do is create a parent Object3D and add the plane to it. Once added, translate the plane by half its height so the edge you want to pivot on sits at the parent's origin, then rotate the parent object.
var object = THREE.SceneUtils.createMultiMaterialObject( new THREE.PlaneGeometry( 200, 50, 4, 4 ), [material] );
var parent = new THREE.Object3D();
object.applyMatrix( new THREE.Matrix4().makeTranslation( 0, 25, 0 ) );
parent.add(object);
parent.rotation.x = 1;
scene.add(parent)
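An alternative, if you'd rather not introduce a wrapper object, is to bake the same offset into the geometry itself so the mesh's local origin lies on the pivot edge. A sketch under the same assumptions as above (a 200 x 50 plane, with material and scene already defined):
var geometry = new THREE.PlaneGeometry( 200, 50, 4, 4 );
geometry.applyMatrix( new THREE.Matrix4().makeTranslation( 0, 25, 0 ) ); // bottom edge now at y = 0
var mesh = new THREE.Mesh( geometry, material );
mesh.rotation.x = 1; // rotates the plane about its bottom edge
scene.add( mesh );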
