Cesium primitives (triangles) shading - opengl-es

I use this code to draw triangles in Cesium:
var mypositions = Cesium.Cartesian3.fromDegreesArrayHeights(triangles);
// unroll 'mypositions' into a flat array here
var numPositions = mypositions.length;
var pos = new Float64Array(numPositions * 3);
var normals = new Float32Array(numPositions * 3);
for (var i = 0; i < numPositions; ++i) {
    pos[i * 3]     = mypositions[i].x;
    pos[i * 3 + 1] = mypositions[i].y;
    pos[i * 3 + 2] = mypositions[i].z;
    normals[i * 3]     = 0.0;
    normals[i * 3 + 1] = 0.0;
    normals[i * 3 + 2] = 1.0;
}
console.log(normals)
var geometry = new Cesium.Geometry({
    vertexFormat: Cesium.VertexFormat.ALL,
    attributes: {
        position: new Cesium.GeometryAttribute({
            componentDatatype: Cesium.ComponentDatatype.DOUBLE, // not FLOAT
            componentsPerAttribute: 3,
            values: pos
        }),
        normal: new Cesium.GeometryAttribute({
            componentDatatype: Cesium.ComponentDatatype.FLOAT,
            componentsPerAttribute: 3,
            values: normals
        })
    },
    // Don't need the following line if no vertices are shared.
    // indices: new Uint32Array([0, 1, 2, 3, 4, 5]),
    primitiveType: Cesium.PrimitiveType.TRIANGLES,
    boundingSphere: Cesium.BoundingSphere.fromVertices(pos)
});
var myInstance = new Cesium.GeometryInstance({
    geometry: geometry,
    attributes: {
        color: new Cesium.ColorGeometryInstanceAttribute(0.0039215697906911,
                                                         0.7333329916000366,
                                                         0,
                                                         1)
    },
    show: new Cesium.ShowGeometryInstanceAttribute(true)
});
var TIN = viewer.scene.primitives.add(new Cesium.Primitive({
    geometryInstances: [myInstance],
    asynchronous: false,
    appearance: new Cesium.PerInstanceColorAppearance({
        closed: true,
        translucent: false,
        flat: false
        //,
        //vertexShaderSource: "",
        //fragmentShaderSource: ""
    })
}));
This is what I get:
I would like to enable shading, so the result should look similar to the figure below:
I tried to write vertex and fragment GLSL shaders, but without success. I am not familiar with GLSL and I was getting a compilation error. Is there another way to create this kind of shading?
Thanks!

Setting aside the fact that you haven't posted your GLSL shaders or gotten them to work, your problem (once you eventually sort out the GLSL) is that you're setting every normal to point in the +Z direction instead of making it actually normal to its triangle, as your second screenshot shows.
var normals = new Float32Array(numPositions * 3);
for (var i = 0; i < numPositions; ++i) {
    pos[i * 3] = mypositions[i].x;
    pos[i * 3 + 1] = mypositions[i].y;
    pos[i * 3 + 2] = mypositions[i].z;
    normals[i * 3] = 0.0;
    normals[i * 3 + 1] = 0;
    normals[i * 3 + 2] = 1.0;
}
What you need to do instead is set the positions as you do now, but for the normals, operate on sets of three vertices (one triangle per set) rather than on individual vertices. That way you can actually calculate a surface normal. This is one of many explanations of how to do that:
https://math.stackexchange.com/questions/305642/how-to-find-surface-normal-of-a-triangle
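For the unindexed triangle list in the question (three consecutive entries of mypositions per triangle), a rough sketch of that calculation using Cesium's Cartesian3 helpers might look like the following. It assumes mypositions really is an array of Cartesian3 objects and that the triangles have a consistent winding order:
// Sketch only (not tested against a specific Cesium version): walk the vertices
// three at a time, compute a face normal from the cross product of two edges,
// and write that same normal for all three vertices of the triangle.
var normals = new Float32Array(numPositions * 3);
var scratchEdge1 = new Cesium.Cartesian3();
var scratchEdge2 = new Cesium.Cartesian3();
var scratchNormal = new Cesium.Cartesian3();
for (var i = 0; i < numPositions; i += 3) {
    var p0 = mypositions[i];
    var p1 = mypositions[i + 1];
    var p2 = mypositions[i + 2];
    // Two edges of the triangle: (p1 - p0) and (p2 - p0)
    var edge1 = Cesium.Cartesian3.subtract(p1, p0, scratchEdge1);
    var edge2 = Cesium.Cartesian3.subtract(p2, p0, scratchEdge2);
    // The face normal is the normalized cross product of the two edges
    var n = Cesium.Cartesian3.cross(edge1, edge2, scratchNormal);
    Cesium.Cartesian3.normalize(n, n);
    // Give all three vertices of this triangle the same normal
    for (var j = 0; j < 3; ++j) {
        normals[(i + j) * 3]     = n.x;
        normals[(i + j) * 3 + 1] = n.y;
        normals[(i + j) * 3 + 2] = n.z;
    }
}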
I'm not familiar with Cesium, but I'd imagine there'd be some "default" shader that did basic lighting. If not, then the basis for simple lighting like this is called Lambertian reflectance. Your vertex shader would output a color that is calculated as dot(N, L) where N is the normal of your vertex and L is the vector from that vertex to your light source (or just the negative of the direction of your light if it's a directional/environment/sun/etc light). The fragment shader would simply pass that color back out.
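Just to illustrate the math (this is not the shader itself, only the same dot product evaluated on the CPU with Cesium's Cartesian3; the light direction below is a made-up example value):
// Hypothetical directional light; 'lightDirection' points from the surface toward the light.
var lightDirection = Cesium.Cartesian3.normalize(
    new Cesium.Cartesian3(0.5, 0.2, 1.0), new Cesium.Cartesian3());
// 'n' is a unit face normal like the ones computed above.
// The Lambertian term is clamped to zero for faces pointing away from the light.
var lambert = Math.max(Cesium.Cartesian3.dot(n, lightDirection), 0.0);
// Multiplying the surface color by 'lambert' gives the basic shading:
// bright where the face points at the light, dark where it faces away.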

Related

How to convert `geometry.faceVertexUvs` in `THREE.js` current version?

const geometry = new THREE.SphereGeometry(100, 64, 64, Math.PI / 2, Math.PI, 0, Math.PI);
const uvs = geometry.faceVertexUvs[0];
const axis = 'x';
for (let i = 0; i < uvs.length; i += 1) {
    for (let j = 0; j < 3; j += 1) {
        uvs[i][j][axis] *= 0.5;
    }
}
geometry.faceVertexUvs is deprecated.
How to convert this reference to THREE.js current version?
The new approach uses BufferGeometry instead of Geometry. It stores each vertex attribute (position, normal, uv) in flat arrays, so you can fetch the UVs with BufferGeometry.getAttribute("uv").
Once you've retrieved the attribute, you'll end up with a BufferAttribute, where you can access the .array property for the individual components:
const geometry = new THREE.SphereGeometry(100, 64, 64, Math.PI / 2, Math.PI, 0, Math.PI);
const uvAttribute = geometry.getAttribute("uv");
const uvArray = uvAttribute.array;
// Loop through all UVs.
// UVs have 2 components, so we jump by 2 on each iteration.
for (let i = 0; i < uvArray.length; i += 2) {
    uvArray[i + 0] *= 0.5; // the 'x' (u) component, scaled as in the old code
    // uvArray[i + 1] is the 'y' (v) component
}
// Now we set the update flag to true so the GPU gets the new values
uvAttribute.needsUpdate = true;

ThreeJS - THREE.BufferGeometry.computeBoundingSphere() Gives Error: NaN Position Values

I am creating a simple THREE.PlaneBufferGeometry using Threejs. The surface is a geologic surface in the earth.
This surface has local gaps or 'holes' in it represented by NaN's. I have read another similar, but older, post where the suggestion was to fill the position Z component with 'undefined' rather than NaN. I tried that but get this error:
THREE.BufferGeometry.computeBoundingSphere(): Computed radius is NaN. The "position" attribute is likely to have NaN values.
PlaneBufferGeometry {uuid: "8D8EFFBF-7F10-4ED5-956D-5AE1EAD4DD41", name: "", type: "PlaneBufferGeometry", index: Uint16BufferAttribute, attributes: Object, …}
Here is the TypeScript function that builds the surface:
AddSurfaces(result) {
    let surfaces: Surface[] = result;
    if (this.surfaceGroup == null) {
        this.surfaceGroup = new THREE.Group();
        this.globalGroup.add(this.surfaceGroup);
    }
    surfaces.forEach(surface => {
        var material = new THREE.MeshPhongMaterial({ color: 'blue', side: THREE.DoubleSide });
        let mesh: Mesh2D = surface.arealMesh;
        let values: number[][] = surface.values;
        let geometry: PlaneBufferGeometry = new THREE.PlaneBufferGeometry(mesh.width, mesh.height, mesh.nx - 1, mesh.ny - 1);
        var positions = geometry.getAttribute('position');
        let node: number = 0;
        // Surfaces in Three JS are ordered from top left corner x going fastest left to right
        // and then Y ('j') going from top to bottom. This is backwards in Y from how we do the
        // modelling in the backend.
        for (let j = mesh.ny - 1; j >= 0; j--) {
            for (let i = 0; i < mesh.nx; i++) {
                let value: number = values[i][j];
                if (!isNaN(values[i][j])) {
                    positions.setZ(node, -values[i][j]);
                }
                else {
                    positions.setZ(node, undefined); /// This does not work? Any ideas?
                }
                node++;
            }
        }
        geometry.computeVertexNormals();
        var plane = new THREE.Mesh(geometry, material);
        plane.receiveShadow = true;
        plane.castShadow = true;
        let xOrigin: number = mesh.xOrigin;
        let yOrigin: number = mesh.yOrigin;
        let cx: number = xOrigin + (mesh.width / 2.0);
        let cy: number = yOrigin + (mesh.height / 2.0);
        // translate point to origin
        let tempX: number = xOrigin - cx;
        let tempY: number = yOrigin - cy;
        let azi: number = mesh.azimuth;
        let aziRad = azi * Math.PI / 180.0;
        // now apply rotation
        let rotatedX: number = tempX * Math.cos(aziRad) - tempY * Math.sin(aziRad);
        let rotatedY: number = tempX * Math.sin(aziRad) + tempY * Math.cos(aziRad);
        cx += (tempX - rotatedX);
        cy += (tempY - rotatedY);
        plane.position.set(cx, cy, 0.0);
        plane.rotateZ(aziRad);
        this.surfaceGroup.add(plane);
    });
    this.UpdateCamera();
    this.animate();
}
Thanks!
I have read another similar, but older, post where the suggestion was to fill the position Z component with 'undefined' rather than NaN.
Using undefined will fail in the same way as using NaN. BufferGeometry.computeBoundingSphere() computes the radius based on Vector3.distanceToSquared(). If you call this method with a vector that contains no valid numerical data, it returns NaN.
Hence, you can't represent the gaps in a geometry with NaN or undefined position data. The better way is to generate a geometry that actually represents the shape of your geologic surface. ShapeBufferGeometry might be a better candidate, since shapes support the concept of holes.
three.js r117
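A minimal sketch of that idea, assuming made-up outline and hole coordinates (in practice you would trace the boundary of your surface and of each gap):
// Outline of the surface in the XY plane (example coordinates)
const shape = new THREE.Shape();
shape.moveTo(0, 0);
shape.lineTo(100, 0);
shape.lineTo(100, 100);
shape.lineTo(0, 100);
// A circular hole where the data has a gap
const hole = new THREE.Path();
hole.absarc(50, 50, 10, 0, Math.PI * 2, true);
shape.holes.push(hole);
// Triangulates the outline minus the holes
const geometry = new THREE.ShapeBufferGeometry(shape);
const mesh = new THREE.Mesh(geometry, new THREE.MeshPhongMaterial({ color: 'blue', side: THREE.DoubleSide }));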
THREE.PlaneBufferGeometry :: parameters: {
    width: number;
    height: number;
    widthSegments: number;
    heightSegments: number;
};
widthSegments and heightSegments should be 1 or greater; if widthSegments is less than 1, it may end up as 0 or NaN.
In my case, it was happening when I tried to create a beveled shape based on a single vector or a bunch of identical vectors - so there was only a single point. Filtering out such shapes solved the issue.

THREE.JS glDrawElements: attempt to access out of range vertices in attribute 1

I'm having this error while adding my own geometry's attribute.
I've already read this WebGL GL ERROR :GL_INVALID_OPERATION : glDrawElements: attempt to access out of range vertices in attribute 1, and I understand what the problem is, but I can't figure out why it happens.
I'm building a BufferGeometry, a tree, starting from 1000 objects. 300 objects use a LeafGeometry, 700 objects use a BoxGeometry.
I want to fill a buffer containing a value that tells whether a vertex belongs to the trunk or to the foliage. This is what I'm doing:
1) First I calculate the size of the buffer (and here I think I'm doing it wrong) by calling: getTotNumVertices(LeafGeometry.new(options), BoxGeometry.new(options), 1000, 3000)
function getTotNumVertices(foliage_geometry, trunk_geometry, tot_objects, foliage_start_at){
    let n_vertices_in_leaf = foliage_geometry.vertices.length * 3;
    let n_vertices_in_trunk = trunk_geometry.vertices.length * 3;
    let n_vertices_in_leafs = foliage_start_at * n_vertices_in_leaf;
    let n_vertices_in_stam = (tot_objects - foliage_start_at) * n_vertices_in_trunk;
    return {
        tot_vertices: (n_vertices_in_stam + n_vertices_in_leafs),
        n_vertices_leaf: n_vertices_in_leaf,
        n_vertices_trunk: n_vertices_in_trunk
    };
}
2) Once I've got the total number of vertices, I create the buffer:
function createBuffers(n_vert){
    // I'm returning an object because in my real code I'm returning
    // more than one buffer
    return {
        isLeafBuffer: new Float32Array(n_vert)
    };
}
3) Then I build my BufferGeometry, merging together the 1000 objects:
let hash_vertex_info = getTotNumVertices(leafGeom, geometries["box"], 1000, 300);
let buffers = createBuffers(hash_vertex_info.tot_vertices);
let geometry = new THREE.Geometry();
let objs = buildTheTree(1000, 300);
for (let i = 0; i < objs.length; i++){
    // here code that fills the buffers
    let mesh = objs[i];
    mesh.updateMatrix();
    geometry.merge(mesh.geometry, mesh.matrix);
}
let bufGeometry = new THREE.BufferGeometry().fromGeometry(geometry);
console.log(bufGeometry.attributes.position.count);
console.log(hash_vertex_info.tot_vertices);
And here is the problem: the value of bufGeometry.attributes.position.count is 623616, while hash_vertex_info.tot_vertices is 308940.
When drawing, WebGL tries to access an index bigger than 308940, hence the error.
What am I doing wrong?
///////////EDIT AFTER A WHILE
Basically, I'm facing the same problem explained in this question:
Does converting a Geometry to a BufferGeometry in Three.js increase the number of vertices?
I need to calculate the total number of vertices in order to create a buffer that will contain values for my shader. This is my code; the number of vertices still differs between the merged geometry and the buffer geometry obtained from it.
let tot_objects = 100;
let material = new THREE.MeshStandardMaterial( {color: 0x00ff00} );
let geometry = new THREE.BoxGeometry(5, 5, 5, 4, 4, 4);
let objs = populateGroup(geometry, material, tot_objects);

// let's merge all the objects in one geometry
let mergedGeometry = new THREE.Geometry();
for (let i = 0; i < objs.length; i++){
    let mesh = objs[i];
    mesh.updateMatrix();
    mergedGeometry.merge(mesh.geometry, mesh.matrix);
}
let bufGeometry = new THREE.BufferGeometry().fromGeometry(mergedGeometry);
let totVerticesMergedGeometry = (mergedGeometry.vertices.length) + (mergedGeometry.faces.length * 3);
console.log(bufGeometry.attributes.position.count); // 57600
console.log(totVerticesMergedGeometry); // 67400 !!!
scene.add(new THREE.Mesh(bufGeometry, material));

function populateGroup(selected_geometry, selected_material, tot_objects) {
    let objects = [];
    for (var i = 0; i < tot_objects; i++) {
        let coord = {x: i, y: i, z: i};
        let object = new THREE.Mesh(selected_geometry, selected_material);
        object.position.set(coord.x, coord.y, coord.z);
        object.rotateY( (90 + 40 + i * 100/tot_objects) * -Math.PI/180.0 );
        objects.push(object);
    }
    return objects;
}
totVerticesMergedGeometry and bufGeometry.attributes.position.count should be the same, but they are still different.
Is my way of counting vertices wrong? Actually it is the same one used here https://github.com/mrdoob/three.js/blob/master/src/core/DirectGeometry.js#L166, meaning (geometry.vertices.length) + (geometry.faces.length * 3).
What I was doing wrong was the way I calculated the number of vertices.
The number of vertices used for the buffer is calculated as MyObjectGeometry.faces.length * 3 * NumberOfObjectsThatWillBeMerged.
A more detailed answer is here: Why the number of vertices in a merged Geometry differs from the number of vertices in the BufferedGeometry obtained from it?
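Applied to the example above, a short sketch of that count (the names come from the question's code; the formula is the one from this answer):
// After BufferGeometry.fromGeometry(), every face contributes 3 unshared vertices,
// so the position attribute holds faces.length * 3 entries per merged object.
let verticesPerObject = geometry.faces.length * 3;    // 192 * 3 = 576 for BoxGeometry(5, 5, 5, 4, 4, 4)
let totalVertices = verticesPerObject * tot_objects;  // 576 * 100 = 57600, matching position.count above
// Size any custom per-vertex attribute buffer accordingly (1 float per vertex here)
let isLeafBuffer = new Float32Array(totalVertices);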
I had this error because I was calling the constructor with the values themselves instead of an array of values:
- var colors = new Float32Array(
+ var colors = new Float32Array( [
1.0, 0.0, 0.0,
0.0, 1.0, 0.0,
0.0, 0.0, 1.0,
1.0, 0.0, 1.0,
0.0, 1.0, 1.0,
1.0, 1.0, 1.0,
- );
+ ] );

Correct transformation order for scene graph

I am working on a quick WebGL engine with a scene graph to quickly prototype my game idea on reddit (https://www.reddit.com/r/gameideas/comments/3dsy8m/revolt/). Now that I have some basic rendering done, I can't figure out the correct transformation order, that is, the one that looks right to most people, for the nodes in the scene graph.
It's hard to explain what is happening, but I hope you can see that it just isn't rotating the way most people would expect it to in any other engine.
Here is a simplified version of what I am currently doing.
Mat4 = glMatrix 0.9.5
Utils = Custom utilities
Node(Render):
    #param {parentMatrix}

    // Create Local Matrix
    self.lMatrix = mat4.create();
    mat4.identity(self.lMatrix);
    mat4.translate(self.lMatrix, self.position);
    mat4.rotate(self.lMatrix, self.rotation[0], [1, 0, 0]);
    mat4.rotate(self.lMatrix, self.rotation[1], [0, 1, 0]);
    mat4.rotate(self.lMatrix, self.rotation[2], [0, 0, 1]);

    var wMatrix = mat4.create();
    mat4.identity(wMatrix);
    if(parentMatrix){
        mat4.multiply(self.lMatrix, parentMatrix, wMatrix);
    }
    else{
        mat4.set(self.lMatrix, wMatrix);
    }

    // Render
    var children = this.children,
        numChildren = children.length,
        child;
    for(var i = 0; i < numChildren; i++){
        child = children[i];
        child.render(wMatrix);
    }
Entity(Render):
    #param {parentMatrix}

    // Set Transformation matrix
    var tMatrix = mat4.create();
    mat4.identity(tMatrix);
    mat4.translate(tMatrix, self.position);
    mat4.rotate(tMatrix, self.rotation[0], [1, 0, 0]);
    mat4.rotate(tMatrix, self.rotation[1], [0, 1, 0]);
    mat4.rotate(tMatrix, self.rotation[2], [0, 0, 1]);
    mat4.scale(tMatrix, self.scale || [1, 1, 1]);

    var wMatrix = mat4.create();
    mat4.identity(wMatrix);
    mat4.multiply(tMatrix, parentMatrix, wMatrix);

    Utils.loadTMatrix(wMatrix);
    this.texture.bind();
    this.mesh.render();
The usual order is SRT, or scale, rotate then translate.
Also I am not sure if you can just do
mat4.rotate(tMatrix, self.rotation[0], [1, 0, 0]);
mat4.rotate(tMatrix, self.rotation[1], [0, 1, 0]);
mat4.rotate(tMatrix, self.rotation[2], [0, 0, 1]);
with Euler angles and get the correct resulting orientation. I don't use Euler angles, so I don't fully grasp the details; somebody please correct me if I'm wrong. See this page for conversions between Euler angles and rotation matrices: http://www.euclideanspace.com/maths/geometry/rotations/conversions/eulerToMatrix/.
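For what it's worth, here is a rough sketch of that order with a glMatrix 2.x-style API (the node's position, rotation and scale fields are placeholders). The call order below builds T · R · S, so a vertex is effectively scaled first, then rotated, then translated:
var local = mat4.create();                     // starts as identity
mat4.translate(local, local, node.position);   // T
mat4.rotateX(local, local, node.rotation[0]);  // R
mat4.rotateY(local, local, node.rotation[1]);
mat4.rotateZ(local, local, node.rotation[2]);
mat4.scale(local, local, node.scale);          // S
// World matrix: parent's world matrix times this node's local matrix
var world = mat4.create();
mat4.multiply(world, parentWorldMatrix, local);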
I didn't end up with the original approach I was hoping for, because I was previously caching matrices and wanted to keep doing that, but I have found a much easier way after recreating my old engine from scratch.
Engine.prototype.NODE.prototype.render = function(parentMatrix){
    var children = this.children,
        numChildren = children.length,
        child, pos, rot, scale;

    // If there is a parent matrix, set this matrix to a copy of it
    if(parentMatrix){
        this.matrix = mat4.clone(parentMatrix);
    }
    else{
        // Else set it to an identity matrix
        mat4.identity(this.matrix);
    }

    // If the matrix needs updating, reconstruct it
    pos = [this.position.x,
           this.position.y,
           this.position.z];
    rot = [this.rotation.x,
           this.rotation.y,
           this.rotation.z];
    scale = [this.scale.x,
             this.scale.y,
             this.scale.z];

    // Recreate Transformation matrix
    mat4.translate(this.matrix, this.matrix, pos);
    mat4.rotate(this.matrix, this.matrix, rot[0], [1, 0, 0]);
    mat4.rotate(this.matrix, this.matrix, rot[1], [0, 1, 0]);
    mat4.rotate(this.matrix, this.matrix, rot[2], [0, 0, 1]);
    mat4.scale(this.matrix, this.matrix, scale);

    // Render Children with this matrix
    for(var i = 0; i < numChildren; i++){
        child = children[i];
        child.render(this.matrix);
    }
}
What I am basically doing is: if the node has a parent (it isn't the root node), I start the matrix off as a clone of its parent's matrix; otherwise I set it to the identity matrix. Then I apply the regular transformations to it. If I find a way to keep caching matrices, I will upload it as soon as possible.

Shading on PlaneBufferGeometry in Three.JS

I'm generating a random plane that animates movement in the vertices to give a crystalline effect. When I use regular PlaneGeometry, shading is not a problem: http://codepen.io/shshaw/pen/GJppEX
However, I tried to switch to PlaneBufferGeometry to see if I could get better performance, but the shading disappeared.
http://codepen.io/shshaw/pen/oXjyJL?editors=001
var planeGeometry = new THREE.PlaneBufferGeometry(opts.planeSize, opts.planeSize, opts.planeDefinition, opts.planeDefinition),
    planeMaterial = new THREE.MeshLambertMaterial({
        color: 0x555555,
        emissive: 0xdddddd,
        shading: THREE.NoShading
    }),
    plane = new THREE.Mesh(planeGeometry, planeMaterial),
    defaultVertices = planeGeometry.attributes.position.clone().array;

function randomVertices() {
    var vertices = planeGeometry.attributes.position.clone().array;
    for (var i = 0; i < vertices.length; i += 3) {
        // x
        vertices[i] = defaultVertices[i] + (rand(-opts.variance.x, opts.variance.x));
        // y
        vertices[i + 1] = defaultVertices[i + 1] + (rand(-opts.variance.y, opts.variance.y));
        // z
        vertices[i + 2] = rand(-opts.variance.z, -opts.variance.z);
    }
    return vertices;
}

plane.geometry.attributes.position.array = randomVertices();
As I saw suggested in this answer to 'Shading on a plane', I tried:
plane.geometry.computeVertexNormals();
On render, I have tried setting all of the following on the geometry to make sure it updates the normals and vertices, as I did in the working PlaneGeometry example:
plane.geometry.verticesNeedUpdate = true;
plane.geometry.normalsNeedUpdate = true;
plane.geometry.computeVertexNormals();
plane.geometry.computeFaceNormals();
plane.geometry.normalizeNormals();
What has happened to the shading? Can I bring it back on a PlaneBufferGeometry mesh, or do I need to stick with PlaneGeometry?
Thanks!
