I have a problem with mapping a texture in THREE.js, which, as extensive searching indicates, is probably related to creating custom UV coordinates.
The following picture shows a geometry that was created from THREE.BoxGeometry by manipulating the lower vertices of the box. The texture on the side looks stretched (although I suppose this is the expected behavior).
[picture 1]
Is there a way of "projecting" the texture onto the side, e.g. by creating custom UV coordinates, so that it looks like the second (photoshopped) picture?
[picture 2]
Thanks for your help!
You will need to remap your vertices manually to perform what is called a "box mapping" or a "triplanar mapping".
Here is an example I threw together: https://codesandbox.io/s/qxk8xvnrvj
It creates a cube with some subdivisions, perturbs the top vertices, and then iterates over the faces' UVs and vertices to remap each face's UVs with a box mapping: it finds the dominant axis the face normal points along and uses the other two axes as the U and V axes for unwrapping.
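For reference, here is a minimal sketch of that remapping step, assuming a non-indexed BufferGeometry (call geometry.toNonIndexed() first if necessary); boxUnwrapUVs is just a name I picked, and the resulting UVs are in world units, so you may want to scale them to taste:

import * as THREE from 'three';

// Hypothetical helper: box-map UVs onto a non-indexed BufferGeometry by
// projecting each triangle along the dominant axis of its face normal.
function boxUnwrapUVs(geometry) {
  const pos = geometry.attributes.position;
  const uvs = new Float32Array(pos.count * 2);
  const a = new THREE.Vector3(), b = new THREE.Vector3(), c = new THREE.Vector3();
  const ab = new THREE.Vector3(), ac = new THREE.Vector3(), n = new THREE.Vector3();

  for (let i = 0; i < pos.count; i += 3) {
    a.fromBufferAttribute(pos, i);
    b.fromBufferAttribute(pos, i + 1);
    c.fromBufferAttribute(pos, i + 2);

    // Face normal; no need to normalize just to compare component magnitudes.
    n.copy(ab.copy(b).sub(a)).cross(ac.copy(c).sub(a));
    const nx = Math.abs(n.x), ny = Math.abs(n.y), nz = Math.abs(n.z);

    for (let j = 0; j < 3; j++) {
      const v = [a, b, c][j];
      let u, w;
      if (nx >= ny && nx >= nz)      { u = v.z; w = v.y; } // X dominant: unwrap on ZY
      else if (ny >= nx && ny >= nz) { u = v.x; w = v.z; } // Y dominant: unwrap on XZ
      else                           { u = v.x; w = v.y; } // Z dominant: unwrap on XY
      uvs[(i + j) * 2] = u;
      uvs[(i + j) * 2 + 1] = w;
    }
  }
  geometry.setAttribute('uv', new THREE.BufferAttribute(uvs, 2));
}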
I need to run some calculations on meshes using ThreeJS.
The calculation should involve the faces of the mesh, but the logical faces, not the individual triangles.
For example, in the attached image I'd like to consider both triangles of the top face as a single face.
Is there a way to know which triangles go together?
I've seen that the geometry has a "groups" property.
https://threejs.org/docs/#api/en/core/BufferGeometry.groups
But the docs just say it is used to split the rendering into separate draw calls.
Can I rely on it to determine that the vertices in a group form the "face" that I need?
Is there any other way to get it?
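Groups only map index ranges to materials, so in general you cannot rely on them to mark logical faces. A hedged sketch of reconstructing the faces yourself (facesFromTriangles is a made-up helper; it assumes an indexed geometry in which coplanar triangles actually share vertex indices, so for flat-shaded geometry with duplicated vertices you would key the edges on vertex positions instead): merge triangles that share an edge and have nearly identical normals.

import * as THREE from 'three';

// Hypothetical helper: cluster the triangles of an indexed BufferGeometry
// into logical faces by merging edge-neighbors with matching normals.
function facesFromTriangles(geometry, epsilon = 1e-4) {
  const index = geometry.index.array;
  const pos = geometry.attributes.position;
  const triCount = index.length / 3;

  // One (normalized) normal per triangle.
  const a = new THREE.Vector3(), b = new THREE.Vector3(), c = new THREE.Vector3();
  const normals = [];
  for (let t = 0; t < triCount; t++) {
    a.fromBufferAttribute(pos, index[t * 3]);
    b.fromBufferAttribute(pos, index[t * 3 + 1]);
    c.fromBufferAttribute(pos, index[t * 3 + 2]);
    normals.push(b.clone().sub(a).cross(c.clone().sub(a)).normalize());
  }

  // Union-find over triangles.
  const parent = Array.from({ length: triCount }, (_, i) => i);
  const find = i => (parent[i] === i ? i : (parent[i] = find(parent[i])));

  // Merge the two triangles on each shared edge if they are coplanar.
  const edgeOwner = new Map();
  for (let t = 0; t < triCount; t++) {
    for (let e = 0; e < 3; e++) {
      const i0 = index[t * 3 + e], i1 = index[t * 3 + (e + 1) % 3];
      const key = Math.min(i0, i1) + '_' + Math.max(i0, i1);
      const other = edgeOwner.get(key);
      if (other === undefined) {
        edgeOwner.set(key, t);
      } else if (normals[t].distanceTo(normals[other]) < epsilon) {
        parent[find(t)] = find(other);
      }
    }
  }

  // Collect triangle indices per logical face.
  const faces = new Map();
  for (let t = 0; t < triCount; t++) {
    const root = find(t);
    if (!faces.has(root)) faces.set(root, []);
    faces.get(root).push(t);
  }
  return [...faces.values()]; // arrays of triangle indices, one per face
}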
I am using THREE and I am trying to intersect a box mesh with a custom geometry I am creating and converting to Geometry using:
const g = new THREE.Geometry().fromBufferGeometry(shape3d)
I aim to add faces to the custom geometry; that is why I do the conversion. So I expect to get back from the intersection my custom geometry plus the polygons of the box.
I do get that, but I also get some holes, as you can see in the image below.
I have used many of the CSG versions out there (the manthrax one, ThreeCSG, etc.) but no luck!
Thank you.
I suggest you set bevelEnabled: false on your mesh extrusion, because I am psychic and I can see your code in my head. :D
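In case the hint is too cryptic: bevels add lots of tiny sliver faces along the edges, which CSG booleans tend to choke on. A sketch of the option, assuming the geometry really is built with THREE.ExtrudeGeometry (the shape and depth here are stand-ins):

import * as THREE from 'three';

// A stand-in shape; the real one comes from the question's own code.
const shape = new THREE.Shape()
  .moveTo(0, 0).lineTo(10, 0).lineTo(10, 10).lineTo(0, 10);

// Disabling the bevel keeps the extruded solid simple and watertight.
const geometry = new THREE.ExtrudeGeometry(shape, {
  depth: 10,          // assumed extrusion depth
  bevelEnabled: false // the suggestion above
});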
I am attempting to render a flat, dynamically created heatmap on top of a 3D model that is loaded from an OBJ (or STL).
I am currently loading and rendering an OBJ with Three.js. I have Vector3 points that I am currently drawing as simple red cubes (image below). These data points have all been raycast onto my OBJ's mesh and lie on its surface. The Vector3 points are loaded from an external data source and will change depending on what data is being viewed/collected.
I would like to render my vector3 point data into a heatmap on the surface of my OBJ. Here are some examples illustrating the type of visual effects I am trying to achieve:
I feel like vertex coloring is the way to achieve this, but my issue is that my OBJ model does not have enough tessellation for it. As you can see, many red dots fall on each face. I am struggling to find a way to paint colors onto my object's mesh exactly where my red point data is. I assumed I would need to convert my random Vector3 points into a mesh, but I cannot find a method to do so.
I've looked at the possibility of generating a texture, but 1) I do not have a UV map for my OBJs and do not see a way to programmatically generate them and 2) I am a bit lost on how I would correlate vector3 point data to UV points.
I've looked at using shaders, but my vector3 point data appears to be too large for using a shader (could be hundreds of thousands of points). I also feel it is not the right approach to render the heatmap every frame and would rather only render it once on load.
I've looked into isosurfaces with point clouds and the marching cubes algorithm, but I didn't think this was the right direction, since my data is only somewhat like a point cloud and I am unsure how I would keep the result smooth along the surface of my OBJ mesh.
Although I would prefer to keep everything in JavaScript for viewing in the browser, I am open to doing server side processing in any language/program with REST so long as it can be automated without human intervention, and pushed back to the browser for rendering.
Any suggestions or guidance is appreciated.
I'm only guessing, but it seems like you first need UV coordinates that map every triangle to a texture. Rather than do this by hand, I'd suggest using a modeling package. Most modeling packages have some way of automatically and uniformly mapping every triangle to a texture; in Blender, for example, Smart UV Project does this.
Next, put the heatmap into the texture by computing which triangles are affected by each dot (your raycasting), looking up their texture coordinates, projecting each dot into texture space, and then painting the colors into that part of the texture. I'm only guessing here, but you probably need to consider not just exact points: heat that lands near the edge of one triangle needs to bleed over into the adjacent triangle, and that adjacent triangle might be using a completely different part of the texture.
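As a sketch of the "project that dot into texture space" step, assuming three.js: a Raycaster intersection already carries the interpolated UV of the hit point when the geometry has UVs, so you can splat directly into a canvas-backed texture. splatHeat and the blob styling are my own invention:

import * as THREE from 'three';

// Heat blobs are painted into a 2D canvas that backs the model's texture.
const size = 1024;
const canvas = document.createElement('canvas');
canvas.width = canvas.height = size;
const ctx = canvas.getContext('2d');
const heatTexture = new THREE.CanvasTexture(canvas);

// "intersection" is one entry from THREE.Raycaster#intersectObject(mesh);
// it includes an interpolated .uv when the geometry has a uv attribute.
function splatHeat(intersection, radius = 12) {
  const uv = intersection.uv;
  if (!uv) return;
  const x = uv.x * size;
  const y = (1 - uv.y) * size; // canvas Y runs opposite to V (default flipY)

  const grad = ctx.createRadialGradient(x, y, 0, x, y, radius);
  grad.addColorStop(0, 'rgba(255, 0, 0, 0.35)');
  grad.addColorStop(1, 'rgba(255, 0, 0, 0)');
  ctx.fillStyle = grad;
  ctx.fillRect(x - radius, y - radius, radius * 2, radius * 2);
  heatTexture.needsUpdate = true; // re-upload the canvas to the GPU
}

// Usage with an existing raycast hit:
// const hits = raycaster.intersectObject(objMesh);
// if (hits.length > 0) splatHeat(hits[0]);

This paints exact points only; the bleeding across triangle edges described above would still need the adjacent-triangle handling.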
I'm currently working on a project that uses shadow textures to render shadows.
It was pretty easy for spotlights, since only one texture in the direction of the spotlight is needed, but point lights are a little more difficult: they need either six textures in all directions or one texture that somehow captures all the objects around the point light.
And that's where my problem is. How can I generate a projection matrix that somehow renders all the objects in a 360-degree angle around the point light?
Basically, how do I create a fisheye (or any other 360-degree camera) vertex shader?
How can I generate a projection matrix that somehow renders all the objects in a 360-degree angle around the point light?
You can't. A 4x4 projection matrix in homogeneous space cannot represent any operation that would bend the edges of polygons: a straight line stays a straight line.
Basically, how do I create a fisheye (or any other 360-degree camera) vertex shader?
You can't do that either, at least not in the general case. This is not a limitation of the projection matrix in use, but a general limitation of the rasterizer. You could of course put the formula for fisheye distortion into the vertex shader, but the rasterizer will still rasterize each triangle with straight edges; you only distort the positions of the corner points of each triangle. This means the result is only correct for tiny triangles covering roughly a single pixel each. For larger triangles, you completely screw up the image. If you have things like T-joints, this even results in holes or overlaps in objects that should actually be perfectly closed.
It was pretty easy for spotlights, since only one texture in the direction of the spotlight is needed, but point lights are a little more difficult: they need either six textures in all directions or one texture that somehow captures all the objects around the point light.
The correct solution for this is a single cube map texture, which provides six faces. In a perfect cube, each face can then be rendered with a standard symmetric perspective projection with a field of view of 90 degrees both horizontally and vertically.
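If you are working from three.js rather than raw OpenGL, THREE.CubeCamera wraps exactly this setup: six 90-degree perspective cameras rendering into a cube render target. A minimal sketch (pointLight, scene, and renderer are assumed to exist, and this renders color, so a real shadow pass would render depth into each face instead):

import * as THREE from 'three';

// Six 90-degree faces around the point light, rendered into one cube map.
const target = new THREE.WebGLCubeRenderTarget(1024);
const cubeCamera = new THREE.CubeCamera(0.1, 1000, target); // near, far, target
cubeCamera.position.copy(pointLight.position);

// One call renders all six faces (internally six passes, fov 90, aspect 1).
cubeCamera.update(renderer, scene);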
In modern OpenGL, you can use layered rendering: attach the six faces of the cube map as the layers of a single FBO, and use the geometry shader to amplify your geometry six times, transforming it with the six different projection matrices, so that you still only need one render pass for the complete shadow map.
There are some other vendor-specific extensions that can be used to further optimize cube map rendering, like Nvidia's NV_viewport_swizzle (available on Maxwell and newer GPUs), but I only mention this for completeness.
I'm trying to make an environment map in the form of a cube that has images mapped onto particular faces, to give the illusion of being in the area (sort of like Google's Street View).
I'm trying to do it in GLGE; however, with my limited experience, I only know how to map one texture to a whole mesh (which is what I'm doing at the moment). If I were to create six different textures, would it be possible to specify which faces those textures should be applied to?
You could generate the six faces of the cube as separate objects and use a different texture for each. An alternative is to set different texture coordinates for the different faces of the cube.
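In three.js, for instance, the first approach is almost free: BoxGeometry already defines six groups, one per face, so you can pass an array of six materials. A minimal sketch with assumed file names:

import * as THREE from 'three';

const loader = new THREE.TextureLoader();

// BoxGeometry defines six groups in the order +X, -X, +Y, -Y, +Z, -Z,
// so an array of six materials puts one texture on each face.
const materials = ['px.jpg', 'nx.jpg', 'py.jpg', 'ny.jpg', 'pz.jpg', 'nz.jpg']
  .map(url => new THREE.MeshBasicMaterial({
    map: loader.load(url),
    side: THREE.BackSide // we view the cube from the inside
  }));

const skybox = new THREE.Mesh(new THREE.BoxGeometry(100, 100, 100), materials);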
If you want ready-to-run code, three.js has a couple of skybox examples, e.g. http://mrdoob.github.com/three.js/examples/webgl_panorama_equirectangular.html
You should look at "UV mapping". Check this example. Roughly, UVs describe how the polygons are mapped (in x, y) onto the texture.
Sounds like you want a cube map texture: it takes six separate images, and you look it up with a direction vector rather than (u, v) coordinates. Cube maps are the usual way to do environments, and cube map textures are available in WebGL.
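A minimal three.js sketch of that approach, with assumed paths and file names:

import * as THREE from 'three';

// The six images become one cube map; lookups use a direction vector,
// so no UV coordinates are needed on the geometry.
const cubeTexture = new THREE.CubeTextureLoader()
  .setPath('textures/') // assumed path
  .load(['px.jpg', 'nx.jpg', 'py.jpg', 'ny.jpg', 'pz.jpg', 'nz.jpg']);

scene.background = cubeTexture; // drawn as a skybox behind everything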