Is there a way to reproject/interpolate vertex colors with OpenMesh?

I'm using OpenMesh to remesh/manage some mesh objects.
With subdivide/decimate/smooth and other tools from OpenFlipper, I can change the mesh topology.
This, however, causes the vertex colors to lose their meaning: new vertices are all black, and since colors are not interpolated when the topology changes, the result is visual artifacts.
Is there a way to tell OpenMesh to project new vertices back onto the old mesh and interpolate the vertex colors there?
If not, what would be a good way to do it manually? Is there an established state-of-the-art approach for vertex back-projection?

In OpenFlipper, you can request a BSP tree for your original mesh object using requestTriangleBsp(). (You will have to keep a copy of your original mesh as long as you want to use that BSP tree.) Whenever you want to project a point onto your original mesh, you can then use the nearest() member function on the BSP tree to get the closest face to the supplied point. After that it's only a matter of projecting your point into that face, computing barycentric coordinates, and interpolating the vertex colors.
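For the last step, here is a minimal sketch of the barycentric interpolation, assuming an OpenMesh TriMesh with vertex colors requested. The nearest-face query itself comes from the BSP tree's nearest() in OpenFlipper and is taken as given here; the helper names are mine:

```cpp
#include <OpenMesh/Core/Mesh/TriMesh_ArrayKernelT.hh>
#include <OpenMesh/Core/Utils/vector_cast.hh>

typedef OpenMesh::TriMesh_ArrayKernelT<> TriMesh;

// Barycentric coordinates of p with respect to triangle (a, b, c).
static void barycentric(const TriMesh::Point& p, const TriMesh::Point& a,
                        const TriMesh::Point& b, const TriMesh::Point& c,
                        float& u, float& v, float& w)
{
    TriMesh::Point v0 = b - a, v1 = c - a, v2 = p - a;
    float d00 = v0 | v0, d01 = v0 | v1, d11 = v1 | v1;  // '|' is OpenMesh's dot product
    float d20 = v2 | v0, d21 = v2 | v1;
    float denom = d00 * d11 - d01 * d01;
    v = (d11 * d20 - d01 * d21) / denom;
    w = (d00 * d21 - d01 * d20) / denom;
    u = 1.0f - v - w;
}

// Color at point p inside face fh of the original mesh (fh would come from
// the BSP tree's nearest() query, p from projecting the new vertex into fh).
TriMesh::Color interpolatedColor(const TriMesh& original,
                                 TriMesh::FaceHandle fh,
                                 const TriMesh::Point& p)
{
    TriMesh::ConstFaceVertexIter it = original.cfv_iter(fh);
    TriMesh::VertexHandle va = *it; ++it;
    TriMesh::VertexHandle vb = *it; ++it;
    TriMesh::VertexHandle vc = *it;

    float u, v, w;
    barycentric(p, original.point(va), original.point(vb), original.point(vc), u, v, w);

    // The default Color type is Vec3uc, so blend in float and cast back.
    OpenMesh::Vec3f ca = OpenMesh::vector_cast<OpenMesh::Vec3f>(original.color(va));
    OpenMesh::Vec3f cb = OpenMesh::vector_cast<OpenMesh::Vec3f>(original.color(vb));
    OpenMesh::Vec3f cc = OpenMesh::vector_cast<OpenMesh::Vec3f>(original.color(vc));
    return OpenMesh::vector_cast<TriMesh::Color>(ca * u + cb * v + cc * w);
}
```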

I think what you want for each vertex of the output mesh is this information: VertexInfo = {origin face id, barycentric coordinates}. You can compute VertexInfo by projecting the vertex onto the original mesh, but recovering topology info from geometry info is not recommended. Imagine a box mesh that is nearly flat: reprojection can easily snap a vertex onto the wrong side, so you won't get the right VertexInfo. The best way to get VertexInfo is to have each concrete command (subdivide, decimate, etc., as you mentioned) compute it from the topology information it already has.
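To illustrate the suggested bookkeeping, here is a hypothetical sketch (all names are mine): a 1-to-4 subdivision step can fill in the VertexInfo of a new edge midpoint directly from its two parent corners, with no reprojection involved.

```cpp
// Hypothetical bookkeeping: track each new vertex's origin on the input
// mesh while subdividing, instead of reprojecting afterwards.
struct VertexInfo {
    int   originFace;   // face id on the ORIGINAL mesh
    float bary[3];      // barycentric coordinates within that face
};

// A 1-to-4 triangle split creates an edge midpoint between two parent
// vertices. Its VertexInfo is the average of the parents' infos, assuming
// both parents already lie in the same original face.
VertexInfo edgeMidpointInfo(const VertexInfo& a, const VertexInfo& b)
{
    VertexInfo m;
    m.originFace = a.originFace;  // caller must ensure a and b share a face
    for (int i = 0; i < 3; ++i)
        m.bary[i] = 0.5f * (a.bary[i] + b.bary[i]);
    return m;
}
```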

Related

ThreeJS and non-triangular mesh faces

I need to run some calculations on meshes using ThreeJS.
The calculation should involve the faces of the mesh, but not the triangular ones.
For example, in the attached image, I'd like to consider both of the triangles of the top face as a single face.
Is there a way to know which triangles go together?
I've seen that the geometry has a "groups" property.
https://threejs.org/docs/#api/en/core/BufferGeometry.groups
But it just says it is used to split the rendering.
Can I rely on it to determine that the vertices in the group form the "face" that I need?
Is there any other way to get it?

Surface mesh triangles: query space within

I have a surface mesh of triangles. Assume, the surface mesh is closed.
I intend to run a spatial query to figure out whether a given region of space is inside my surface mesh or not. The region can be represented by a bounding box, a voxel, or any other spatial primitive.
What data structures are available to do the above query?
What algorithms are available to implement the query from scratch?
Are any ready-to-use libraries available?
Thanks =)
I don't think an R-tree will help directly to find what's inside a closed mesh.
If the data has separate "bubbles" (chunks of space enclosed by meshes), those could be represented by bounding boxes and put in an R-tree index. That would help find which bubbles may intersect the query object, so that further checking can be done (really, it would eliminate the bubbles that cannot intersect, so they don't need to be checked).
If you can somehow break up the space inside your mesh into smaller chunks, those could be indexed. OK if they overlap or extend outside the mesh.
If the mesh is totally closed, and for a single point, you can use ray tracing: shoot a ray from your point in any direction and count how many times it hits the mesh. An odd number of hits means the point is inside; an even number means it is outside. For other shapes, however, you might need collision detection.
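A minimal sketch of that even-odd test over a triangle soup, using the standard Möller-Trumbore ray/triangle intersection (the container layout and names are mine):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

struct Vec3 { double x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
static double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Möller-Trumbore: does the ray (orig, dir) hit triangle (v0, v1, v2) in front of orig?
static bool rayHitsTriangle(Vec3 orig, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2)
{
    const double eps = 1e-9;
    Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    Vec3 pvec = cross(dir, e2);
    double det = dot(e1, pvec);
    if (std::fabs(det) < eps) return false;    // ray parallel to the triangle
    double inv = 1.0 / det;
    Vec3 tvec = sub(orig, v0);
    double u = dot(tvec, pvec) * inv;
    if (u < 0.0 || u > 1.0) return false;
    Vec3 qvec = cross(tvec, e1);
    double v = dot(dir, qvec) * inv;
    if (v < 0.0 || u + v > 1.0) return false;
    return dot(e2, qvec) * inv > eps;          // hit strictly in front of the origin
}

// Even-odd rule: a point is inside a closed mesh iff the ray crosses the
// surface an odd number of times. 'tris' holds 3 consecutive vertices per triangle.
bool pointInsideMesh(Vec3 p, const std::vector<Vec3>& tris)
{
    Vec3 dir = {0.57735, 0.57735, 0.57735};  // arbitrary, skewed to avoid grazing axis-aligned edges
    std::size_t hits = 0;
    for (std::size_t i = 0; i + 2 < tris.size(); i += 3)
        if (rayHitsTriangle(p, dir, tris[i], tris[i + 1], tris[i + 2]))
            ++hits;
    return hits % 2 == 1;
}
```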
Existing libraries will depend on which platform/programming language you're doing this for, but if you have freedom to choose, maybe start with Unity?
As Antonin mentioned in their answer, an R-Tree index will help you with that but won't directly check if a point or other shape is inside your mesh. If you can break up the space inside your mesh into boxes, R-Trees will help you do "quick checks" for the positive case where your point or shape is inside the mesh.
VDB: hollow vs filled
In the following video, it is demonstrated how we can create two types of VDB volumes in Houdini:
Distance (hollow volume): creates a shell of voxels around the outside of the geometry
Fog (solid volume): fills the geometry with voxels
https://youtu.be/jqNRu8BYYXo?t=259
Implication
This implies that VDB can tag voxels as belonging to the hollow shell or to the filled interior. But I don't know how to do this programmatically with voxel code.
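If "programmatically" can mean OpenVDB's C++ API rather than Houdini's VEX, here is a minimal sketch, assuming a level-set (Distance) FloatGrid: the stored value is a signed distance, so its sign distinguishes inside from outside.

```cpp
#include <openvdb/openvdb.h>

// Sketch using OpenVDB's C++ API (not Houdini VEX). A level-set ("Distance")
// grid stores a signed distance, so the sign of the value tags a voxel as
// inside or outside the surface.
bool voxelIsInside(const openvdb::FloatGrid& grid, const openvdb::Coord& xyz)
{
    // Interior voxels beyond the narrow band are inactive but store the
    // negative background value, so the sign test still holds there.
    return grid.tree().getValue(xyz) < 0.0f;
}
```

The hollow shell itself is the set of active narrow-band voxels (iterate with grid.cbeginValueOn()), and openvdb::tools::sdfToFogVolume() converts such a distance grid into the filled Fog representation.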

Three.js Map Texture to irregular Geometry (UV-Coordinates)

I have a problem with mapping a texture in THREE.js which, as extensive searching indicates, is probably related to creating custom UV coordinates.
The following picture shows a geometry which was created from THREE.BoxGeometry by manipulating the lower vertices of the box. The texture on the side looks stretched (although this is correct I guess).
picture1
Is there a way of "projecting" the texture onto the side, e.g. by creating custom uv-coordinates to look like in the second (photoshopped) picture?
picture2
Thanks for your help!
You will need to remap your UVs manually to perform what is called a "box mapping" or "triplanar mapping".
Here is an example I threw together: https://codesandbox.io/s/qxk8xvnrvj
It creates a cube with some subdivisions, perturbs the vertices on top, and then iterates through the faces' UVs and vertices to remap each face's UVs with a box mapping: it finds the dominant axis the face normal points along and uses the other two axes as the U and V axes for unwrapping.
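The dominant-axis step boils down to a few comparisons. A library-agnostic sketch (the names and the uniform scale parameter are mine):

```cpp
#include <cmath>

struct V3 { float x, y, z; };
struct UV { float u, v; };

// Box mapping: pick the dominant axis of the face normal, then use the
// positions along the other two axes (scaled) as the face's texture coordinates.
UV boxMapUV(const V3& position, const V3& faceNormal, float scale)
{
    float ax = std::fabs(faceNormal.x);
    float ay = std::fabs(faceNormal.y);
    float az = std::fabs(faceNormal.z);

    if (ax >= ay && ax >= az)        // normal mostly along X:
        return { position.y * scale, position.z * scale };  // project onto YZ
    else if (ay >= ax && ay >= az)   // mostly along Y:
        return { position.x * scale, position.z * scale };  // project onto XZ
    else                             // mostly along Z:
        return { position.x * scale, position.y * scale };  // project onto XY
}
```

Use the face normal (not per-vertex normals) so all three corners of a triangle project along the same axis; otherwise the UVs of a single face can tear apart.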

Creating Heatmap Over 3D Model From Vector 3 Point Data

I am attempting to render a flat, dynamically created heatmap on top of a 3D model that is loaded from an OBJ (or STL).
I am currently loading and rendering an OBJ with Three.js. I have Vector3 points that I am currently drawing as simple red cubes (image below). These data points are all raycast onto my OBJ's mesh and lie on its surface. The Vector3 points are loaded from an external data source and will change depending on what data is being viewed/collected.
I would like to render my vector3 point data into a heatmap on the surface of my OBJ. Here are some examples illustrating the type of visual effects I am trying to achieve:
I feel like vertex coloring is the way to achieve this, but my issue is that my OBJ model does not have enough tessellation for it. As you can see, many red dots fall on a single face. I am struggling to find a way to draw colors over my object's mesh exactly where my red point data is. I was assuming I would need to convert my random Vector3 points into a mesh, but cannot find a method to do so.
I've looked at the possibility of generating a texture, but 1) I do not have UV maps for my OBJs and do not see a way to generate them programmatically, and 2) I am a bit lost on how I would correlate Vector3 point data to UV points.
I've looked at using shaders, but my Vector3 point data appears to be too large to pass to a shader (there could be hundreds of thousands of points). I also feel it is not the right approach to re-render the heatmap every frame; I would rather render it only once, on load.
I've looked into isosurfaces from point clouds and the marching cubes algorithm, but I didn't think this was the right direction, since my data is only somewhat like a point cloud and I am unsure how I would keep the result smooth along the surface of my OBJ mesh.
Although I would prefer to keep everything in JavaScript for viewing in the browser, I am open to doing server side processing in any language/program with REST so long as it can be automated without human intervention, and pushed back to the browser for rendering.
Any suggestions or guidance is appreciated.
I'm only guessing, but it seems like first you need UV coordinates that map every triangle to the texture. Rather than doing this by hand, I'd suggest using a modeling package; most modeling packages have some way of automatically and uniformly mapping every triangle to a texture (in Blender, for example, Smart UV Project).
Next, put the heatmap into the texture by computing which triangles are affected by each dot (your raycasting), looking up their texture coordinates, projecting the dot into texture space, and then writing the colors into that part of the texture. I'm only guessing, but you probably need to handle not just exact points: heat that lands near the edge of a triangle needs to bleed over into the adjacent triangle, and that adjacent triangle might be using a completely different part of the texture.
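A sketch of that projection-and-write step, assuming the raycast already produced the hit triangle and the barycentric coordinates of the hit point, with a simple linear falloff so the heat bleeds a little around the hit (all names are mine):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct UV { float u, v; };

// Splat one heat sample into a single-channel texture. 'bary' are the
// barycentric coordinates of the raycast hit inside its triangle, and
// uv0/uv1/uv2 are that triangle's texture coordinates.
void splatHeat(std::vector<float>& tex, int width, int height,
               UV uv0, UV uv1, UV uv2, const float bary[3],
               float radiusPx, float strength)
{
    // Interpolate the hit point's UV from the triangle's corner UVs.
    float u = bary[0]*uv0.u + bary[1]*uv1.u + bary[2]*uv2.u;
    float v = bary[0]*uv0.v + bary[1]*uv1.v + bary[2]*uv2.v;
    float cx = u * (width - 1), cy = v * (height - 1);

    // Write a radial splat with linear falloff around the projected texel.
    int r = static_cast<int>(std::ceil(radiusPx));
    for (int y = std::max(0, (int)cy - r); y <= std::min(height - 1, (int)cy + r); ++y)
        for (int x = std::max(0, (int)cx - r); x <= std::min(width - 1, (int)cx + r); ++x) {
            float d = std::hypot(x - cx, y - cy);
            if (d < radiusPx)
                tex[y * width + x] += strength * (1.0f - d / radiusPx);
        }
}
```

Once all the points are splatted, map the accumulated heat through a color ramp and upload the result as the model's texture, so the heatmap is rendered only once, on load.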

How to set one coordinate of a geometry from another mesh in three.js

What I'm trying to do is to "drape" some points on a PlaneGeometry. I have the planar coordinates of the points in the same coordinate system as my plane geometry; what I need is to get the "height" from the plane to position the points on top of it.
What's the best way to achieve this? Querying the planar mesh in JavaScript would be too heavy. Should it be done (and could it be done) using the vertex shader?
EDIT
Probably using a ray caster is the right solution, something like shown in this example: http://threejs.org/examples/#webgl_geometry_terrain_raycast
EDIT2
Raycasting does the job, but it's quite slow for a lot of objects. I suppose there are more efficient ways to do it (one is sketched below)...
https://dl.dropboxusercontent.com/u/13861666/2014-01-17%2011_27_15-firenze.png
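On EDIT2: if the PlaneGeometry is a regular grid (which it is unless its vertices were also moved in x/y), one more efficient way is to skip raycasting entirely and interpolate the height from the four surrounding grid vertices. A library-agnostic sketch, assuming a row-major array with one height per grid vertex and a query point within the plane's extent (names are mine):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Bilinear height lookup on a regular grid of (nx x ny) vertices spanning
// [0, sizeX] x [0, sizeY]. 'heights' is row-major, one value per grid vertex.
// O(1) per query instead of one ray/triangle intersection per face.
float sampleHeight(const std::vector<float>& heights, int nx, int ny,
                   float sizeX, float sizeY, float px, float py)
{
    // Convert the point to fractional grid coordinates and clamp to the grid.
    float gx = px / sizeX * (nx - 1);
    float gy = py / sizeY * (ny - 1);
    int x0 = std::max(0, std::min(nx - 2, (int)std::floor(gx)));
    int y0 = std::max(0, std::min(ny - 2, (int)std::floor(gy)));
    float fx = gx - x0, fy = gy - y0;

    float h00 = heights[y0 * nx + x0],       h10 = heights[y0 * nx + x0 + 1];
    float h01 = heights[(y0 + 1) * nx + x0], h11 = heights[(y0 + 1) * nx + x0 + 1];

    // Blend along x, then along y.
    float hx0 = h00 + (h10 - h00) * fx;
    float hx1 = h01 + (h11 - h01) * fx;
    return hx0 + (hx1 - hx0) * fy;
}
```

Note this blends the quad bilinearly rather than following its exact two-triangle surface, so it can differ very slightly from the raycast result.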
