Threejs remap UV after CSG subtract - three.js

I was able to subtract a mesh from my main mesh. However, the texture is not mapped well in the engraved part.
The texture inside the engraved part is quite small compared to the rest of my main mesh.
Is there a way to redo the UV mapping, or any other solution to this?
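One possible workaround (this is only a sketch, not a built-in CSG feature; resultMesh and the one-repeat-per-unit scale are assumptions): discard the UVs that come out of the CSG operation and regenerate them with a simple box projection from the vertex positions and normals, so the texture density is the same on the outer surface and on the engraved faces.

function boxProjectUVs(geometry, repeatsPerUnit = 1) {
  if (!geometry.attributes.normal) geometry.computeVertexNormals();
  const pos = geometry.attributes.position;
  const nor = geometry.attributes.normal;
  const uv = new Float32Array(pos.count * 2);

  for (let i = 0; i < pos.count; i++) {
    const x = pos.getX(i), y = pos.getY(i), z = pos.getZ(i);
    const nx = Math.abs(nor.getX(i));
    const ny = Math.abs(nor.getY(i));
    const nz = Math.abs(nor.getZ(i));
    let u, v;
    // Project along the dominant axis of the vertex normal.
    if (nx >= ny && nx >= nz)      { u = z; v = y; }
    else if (ny >= nx && ny >= nz) { u = x; v = z; }
    else                           { u = x; v = y; }
    uv[2 * i]     = u * repeatsPerUnit;
    uv[2 * i + 1] = v * repeatsPerUnit;
  }
  geometry.setAttribute('uv', new THREE.BufferAttribute(uv, 2));
}

boxProjectUVs(resultMesh.geometry); // the mesh returned by the CSG subtract
resultMesh.material.map.wrapS = THREE.RepeatWrapping;
resultMesh.material.map.wrapT = THREE.RepeatWrapping;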

Related

How can I use a whole sphere image to texture a portion of a sphere in THREE.js?

I'm simulating an earth using THREE.SphereGeometry with only 32x16 or 64x32 segments. It has very large flat spots several km deep. 1024x512 is nicer but not really an option. For the area I'm working in, I'd like to fill that area in with more segments so that the surface and the graphics match within centimeters.
I could use the phi and theta of SphereGeometry to define the portion I want...
But then I want to texture it. When I use the whole sphere texture for a portion, it squishes the whole image into that portion. Is there a way to just get the texture of the portion I want?
I'm new to UV mapping; are there options there that can help?
This is more about geometry than level of detail. The LOD is very crude and I'm fine with that for now.
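A sketch of one way to do this (the region angles, texture file name and segment counts below are assumptions): build only the patch with SphereGeometry's phiStart/phiLength/thetaStart/thetaLength parameters, then use texture.offset and texture.repeat so the patch samples the matching sub-rectangle of the full equirectangular texture instead of squeezing the whole image into it.

const phiStart    = THREE.MathUtils.degToRad(10);
const phiLength   = THREE.MathUtils.degToRad(20);
const thetaStart  = THREE.MathUtils.degToRad(40);
const thetaLength = THREE.MathUtils.degToRad(20);

// A dense patch of the sphere covering only the area of interest.
const geometry = new THREE.SphereGeometry(
  6371, 64, 64,               // radius (km here) and a much denser segment count
  phiStart, phiLength,
  thetaStart, thetaLength
);

const texture = new THREE.TextureLoader().load('earth_equirectangular.jpg');
texture.wrapS = texture.wrapT = THREE.ClampToEdgeWrapping;
// The partial geometry's UVs still span [0, 1], so shift and scale the texture
// so that the [0, 1] range covers only the matching part of the whole-earth image.
texture.repeat.set(phiLength / (2 * Math.PI), thetaLength / Math.PI);
texture.offset.set(phiStart / (2 * Math.PI), 1 - (thetaStart + thetaLength) / Math.PI);

const patch = new THREE.Mesh(geometry, new THREE.MeshPhongMaterial({ map: texture }));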

ThreeJS Texture fit UV Map

I'm trying to develop a configurator. It's about cups, which should be displayed in 3D, and a design should be uploaded. It works if I upload a texture like this.
Otherwise the design will not fit. Is there a way to load a full-size rectangular image as a texture? The texture may be stretched if necessary. The user shouldn't have to make the texture square themselves; ideally that would happen automatically in the background. I hope you understand what I mean.
This is the OBJ-File
Your UV mapping looks difficult to apply a texture to, especially because it has so much empty space and is skewed into an arc, so you would need to warp all your textures for them to fit nicely.
You should make the UV mapping work for you. Why not use the built-in CylinderBufferGeometry class to apply a texture on top of your cup geometry? You could use its constructor parameters to match the shape of your cup's side:
CylinderBufferGeometry(
radiusTop,
radiusBottom,
height,
radialSegments,
heightSegments,
openEnded,
thetaStart,
thetaLength
);
With this approach, you could leave your cup geometry untouched, then apply a "sticker" texture on top of it. It could wrap all the way around the cup if you wanted, or it could be constrained to only the front. You could scale it up, rotate it around, and it would be independent of a baked-in UV mapping done in Blender. Another benefit is that this approach occupies the entire [0, 1] UV range, so you could simply use square textures, and you wouldn't be wasting data with empty space.
Look at this demo to see how you can play with the geometry's configuration.
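Here is a minimal sketch of that "sticker" idea (the radii, height and cupMesh below are assumptions, not values from the actual OBJ file): an open-ended partial cylinder sits just outside the cup and carries the uploaded design as its own texture, so the cup's baked-in UV mapping never has to change.

const stickerTexture = new THREE.TextureLoader().load('uploaded-design.png');

// CylinderGeometry is called CylinderBufferGeometry in older three.js releases.
const stickerGeometry = new THREE.CylinderGeometry(
  4.05,        // radiusTop: a hair larger than the cup so it sits on top of it
  4.05,        // radiusBottom
  7,           // height of the printable area
  64,          // radialSegments: enough for a smooth curve
  1,           // heightSegments
  true,        // openEnded: no caps
  0,           // thetaStart
  Math.PI      // thetaLength: wrap only halfway around (the "front" of the cup)
);

const stickerMaterial = new THREE.MeshStandardMaterial({
  map: stickerTexture,
  transparent: true,           // empty parts of the design show the cup underneath
  side: THREE.DoubleSide
});

const sticker = new THREE.Mesh(stickerGeometry, stickerMaterial);
cupMesh.add(sticker);          // assumes cupMesh is the loaded OBJ cup

Because the cylinder's UVs cover the full [0, 1] range, any square (or stretched rectangular) upload maps straight onto it without the user having to prepare the image.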

THREE.js adding bullets as sprites and rotating each individually

I have been working on programming a game where everything is rendered in 3D, but the bullets are 2D sprites. This poses a problem: I have to rotate a bullet sprite by rotating its material, which turns every bullet that shares that material rather than the individual sprite I want to turn. It also seems inefficient to create a new sprite clone for every bullet. Is there a better way to do this? Thanks in advance.
Rotate the sprite itself instead of the texture.
Edit:
As the OP mentioned, the SpriteMaterial controls the sprite's rotation, so setting rotation.y on the sprite manually does nothing.
So instead of using the Sprite type, you could use a regular PlaneGeometry mesh with a MeshBasicMaterial or similar, and update the matrices yourself to keep the sprite facing the camera while rotating it toward its trajectory.
Then at least you can share the material amongst all instances.
The performance bottleneck then becomes the number of draw calls (one per sprite).
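A minimal sketch of that shared-material approach (bulletTexture, the bullet size and the roll field are assumptions): every bullet is a small PlaneGeometry mesh that copies the camera's orientation each frame so it stays camera-facing, and then rolls around the view axis toward its own trajectory. One geometry and one material are shared by all bullets.

const bulletGeometry = new THREE.PlaneGeometry(0.2, 0.2);
const bulletMaterial = new THREE.MeshBasicMaterial({
  map: bulletTexture,
  transparent: true,
  depthWrite: false
});

function spawnBullet(position, rollAngle) {
  const bullet = new THREE.Mesh(bulletGeometry, bulletMaterial);
  bullet.position.copy(position);
  bullet.userData.roll = rollAngle;             // angle of the bullet's trajectory
  scene.add(bullet);
  return bullet;
}

// In the render loop:
function updateBullets(bullets, camera) {
  for (const bullet of bullets) {
    bullet.quaternion.copy(camera.quaternion);  // face the camera
    bullet.rotateZ(bullet.userData.roll);       // per-bullet rotation, no material clone
  }
}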
You can improve on that by using a single BufferGeometry and computing the 4 screen-space vertices for each sprite, each frame. This moves the bottleneck away from draw calls; instead you are limited by how quickly you can transform vertices in JavaScript, which is slow but not the end of the world. This is also how many THREE.js particle systems are implemented.
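A sketch of that single-geometry version (the sprite record layout and the maximum count are assumptions): all quads live in one BufferGeometry and one draw call, and their corner vertices are recomputed in JavaScript every frame from the camera's right and up vectors.

const MAX_SPRITES = 1000;
const positions = new Float32Array(MAX_SPRITES * 4 * 3);
const uvs = new Float32Array(MAX_SPRITES * 4 * 2);
const indices = new Uint16Array(MAX_SPRITES * 6);
for (let i = 0; i < MAX_SPRITES; i++) {
  uvs.set([0, 0, 1, 0, 1, 1, 0, 1], i * 8);
  const v = i * 4;
  indices.set([v, v + 1, v + 2, v, v + 2, v + 3], i * 6);
}

const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3).setUsage(THREE.DynamicDrawUsage));
geometry.setAttribute('uv', new THREE.BufferAttribute(uvs, 2));
geometry.setIndex(new THREE.BufferAttribute(indices, 1));

const batch = new THREE.Mesh(geometry, new THREE.MeshBasicMaterial({ map: bulletTexture, transparent: true }));
batch.frustumCulled = false;   // vertices move every frame, skip stale bounds checks
scene.add(batch);

const camRight = new THREE.Vector3(), camUp = new THREE.Vector3();
const rx = new THREE.Vector3(), ry = new THREE.Vector3(), corner = new THREE.Vector3();

function updateQuads(sprites, camera) {        // sprites: [{ position, roll, size }, ...]
  camRight.set(1, 0, 0).applyQuaternion(camera.quaternion);
  camUp.set(0, 1, 0).applyQuaternion(camera.quaternion);
  for (let i = 0; i < sprites.length; i++) {
    const s = sprites[i];
    const c = Math.cos(s.roll), n = Math.sin(s.roll), h = s.size * 0.5;
    // Camera-space basis rotated by the sprite's roll angle.
    rx.copy(camRight).multiplyScalar(c).addScaledVector(camUp, n).multiplyScalar(h);
    ry.copy(camUp).multiplyScalar(c).addScaledVector(camRight, -n).multiplyScalar(h);
    const base = i * 12;
    corner.copy(s.position).sub(rx).sub(ry).toArray(positions, base);
    corner.copy(s.position).add(rx).sub(ry).toArray(positions, base + 3);
    corner.copy(s.position).add(rx).add(ry).toArray(positions, base + 6);
    corner.copy(s.position).sub(rx).add(ry).toArray(positions, base + 9);
  }
  geometry.setDrawRange(0, sprites.length * 6);
  geometry.attributes.position.needsUpdate = true;
}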
The next step beyond that is to use a custom vertex shader to do the heavy vertex computation. You still update the BufferGeometry each frame, but instead of transforming vertices, you just write the sprite's position into each of its 4 vertices and let the vertex shader figure out which of the 4 corners it is transforming (based on the UV coordinate, for example, or stored in one of the vertex color channels, .r for instance) and which sprite to render from your sprite atlas (a single texture/canvas with all your sprites laid out on a grid), encoded in the .g of the vertex color.
The step beyond that is to not update the BufferGeometry every frame, but to store both the position and velocity of each sprite in the vertex data and only pass a time-offset uniform into the vertex shader. The vertex shader can then integrate the sprite's position over a longer time period. This only works for sprites with deterministic behavior, or behavior that can be derived from a texture data source such as a noise or warping texture: things like smoke, explosions, etc.
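Here is a sketch that combines those last two steps (the attribute names, the 4x4 atlas layout, atlasTexture and the particle spread are all assumptions): each sprite's start position, velocity, corner id and atlas cell are written into vertex attributes once, and from then on only a time uniform changes per frame; the vertex shader billboards the quad and integrates the position itself.

const COUNT = 200;
const positions = new Float32Array(COUNT * 4 * 3);
const velocities = new Float32Array(COUNT * 4 * 3);
const corners = new Float32Array(COUNT * 4 * 2);
const cells = new Float32Array(COUNT * 4);
const indices = [];
const CORNERS = [[-0.5, -0.5], [0.5, -0.5], [0.5, 0.5], [-0.5, 0.5]];

for (let i = 0; i < COUNT; i++) {
  const p = [Math.random() * 10 - 5, 0, Math.random() * 10 - 5];
  const v = [Math.random() - 0.5, 2 + Math.random(), Math.random() - 0.5];
  const cell = Math.floor(Math.random() * 16);          // which sprite in the 4x4 atlas
  for (let j = 0; j < 4; j++) {
    positions.set(p, (i * 4 + j) * 3);
    velocities.set(v, (i * 4 + j) * 3);
    corners.set(CORNERS[j], (i * 4 + j) * 2);
    cells[i * 4 + j] = cell;
  }
  const b = i * 4;
  indices.push(b, b + 1, b + 2, b, b + 2, b + 3);
}

const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
geometry.setAttribute('velocity', new THREE.BufferAttribute(velocities, 3));
geometry.setAttribute('corner', new THREE.BufferAttribute(corners, 2));
geometry.setAttribute('atlasCell', new THREE.BufferAttribute(cells, 1));
geometry.setIndex(indices);

const material = new THREE.ShaderMaterial({
  uniforms: {
    time:  { value: 0 },
    size:  { value: 0.5 },
    atlas: { value: atlasTexture }   // a single texture with sprites on a 4x4 grid
  },
  vertexShader: `
    uniform float time;
    uniform float size;
    attribute vec3 velocity;
    attribute vec2 corner;      // which of the 4 quad corners this vertex is
    attribute float atlasCell;  // which sprite in the 4x4 atlas
    varying vec2 vUv;
    void main() {
      vec3 centre = position + velocity * time;          // integrate in the shader
      vec4 viewPos = modelViewMatrix * vec4(centre, 1.0);
      viewPos.xy += corner * size;                       // billboard expansion
      gl_Position = projectionMatrix * viewPos;
      vec2 cell = vec2(mod(atlasCell, 4.0), floor(atlasCell / 4.0));
      vUv = (cell + (corner + 0.5)) / 4.0;
    }
  `,
  fragmentShader: `
    uniform sampler2D atlas;
    varying vec2 vUv;
    void main() {
      vec4 colour = texture2D(atlas, vUv);
      if (colour.a < 0.1) discard;
      gl_FragColor = colour;
    }
  `,
  transparent: true,
  depthWrite: false
});

const particles = new THREE.Mesh(geometry, material);
particles.frustumCulled = false;   // positions are computed in the shader
scene.add(particles);

// Per frame, only the uniform changes:
material.uniforms.time.value = performance.now() / 1000;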
You can extend these techniques to draw gigantic scrolling tilemaps. I've used them to make multi-layer scrolling/zoomable hexmaps that were 2048 hexes square (a pretty huge map, roughly 4 million triangles), with multiple layers of sprites on top of that, at 60 Hz.
Here is the original Stemkoski particle system for reference:
http://stemkoski.github.io/Three.js/Particle-Engine.html
and:
https://stemkoski.github.io/Three.js/ParticleSystem-Dynamic.html

Three.js Merge objects and textures

My question is related to this article:
http://blog.wolfire.com/2009/06/how-to-project-decals/
If my understanding is correct, a mesh made from the intersection of the original mesh and a cube is added to the scene to make a decal appear.
I need to save the final texture. So I was wondering if there is a way to 'merge' the texture of the original mesh and the added decal mesh?
You'd need to do some tricky stuff to convert from model geometry space into UV coordinate space so you could draw the new pixels into the texture map. If you want to be able to use more than one material that way, you'd also probably need to implement some kind of "material map", similar to how some deferred rendering systems work. Otherwise you're limited to, at most, one material per face, which wouldn't work for detailed decals with alpha.
I guess you could copy the UV coordinates from the original mesh into the decal mesh, and then use that information to reproject the decal texture into the original texture.
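A sketch of that reprojection (everything here is an assumption on top of the decal-projection setup: originalTexture, decalTexture, decalMesh, originalMesh, renderer, and a 'bakeUv' attribute holding the UVs copied from the original mesh): the decal is drawn in UV space into a render target the size of the original texture, on top of the original texture itself, and the render target then becomes the merged texture you can save.

const SIZE = 2048;                                     // resolution of the original texture
const bakeTarget = new THREE.WebGLRenderTarget(SIZE, SIZE);
const bakeScene = new THREE.Scene();
const bakeCamera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0.1, 10);
bakeCamera.position.z = 1;

// 1. A full-screen quad that lays down the original texture first.
const background = new THREE.Mesh(
  new THREE.PlaneGeometry(2, 2),
  new THREE.MeshBasicMaterial({ map: originalTexture, depthTest: false, depthWrite: false })
);
bakeScene.add(background);

// 2. The decal mesh, drawn not at its 3D position but at the UVs copied from
//    the original mesh ('bakeUv'), while sampling its own decal image.
//    decalMesh.geometry.setAttribute('bakeUv', new THREE.BufferAttribute(copiedUvs, 2)); // done beforehand
const bakeMaterial = new THREE.ShaderMaterial({
  uniforms: { decalMap: { value: decalTexture } },
  vertexShader: `
    attribute vec2 bakeUv;     // position in the original texture's UV space
    varying vec2 vDecalUv;
    void main() {
      vDecalUv = uv;                                       // the decal's own UVs
      gl_Position = vec4(bakeUv * 2.0 - 1.0, 0.0, 1.0);    // place the triangle in texture space
    }
  `,
  fragmentShader: `
    uniform sampler2D decalMap;
    varying vec2 vDecalUv;
    void main() { gl_FragColor = texture2D(decalMap, vDecalUv); }
  `,
  transparent: true,
  depthTest: false,
  depthWrite: false
});
const decalInUvSpace = new THREE.Mesh(decalMesh.geometry, bakeMaterial);
decalInUvSpace.frustumCulled = false;
bakeScene.add(decalInUvSpace);

// Bake once, then use (or read back and save) the merged texture.
renderer.setRenderTarget(bakeTarget);
renderer.render(bakeScene, bakeCamera);
renderer.setRenderTarget(null);
originalMesh.material.map = bakeTarget.texture;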

Fixed texture size in Three.js

I am building quite a complex 3D environment in Three.js (FPS-a-like). For this purpose I wanted to structure the loading of textures and materials in an object-oriented way. For example, materials.wood.brownplank is a reusable material with a certain texture and other properties. Below is a simplified visualisation of the process, where models use materials and materials use textures.
loadTextures();
loadMaterials();
loadModels();
//start doing stuff in the scene
I want to use that material on differently sized objects. However, in Three.js you can't (AFAIK) set a fixed texture scale; you have to set the repeat to scale it appropriately for each object. But I don't want to do that for every plane of every object I use.
Here is how it looks now
As you can see, the textures are not uniform in size.
Is there an easy way to achieve this? Cloning the texture and/or material every time and setting the repeat according to the geometry won't do :)
I hope someone can help me.
Conclusion:
There is no real easy way to do this. I ended up changing my loading methods: things like materials.wood.brownplank are now, for example, getMaterial('wood', 'brownplank'). New objects are instantiated inside that function.
You should be able to do this by modifying your geometry UV coordinates according to the "real" dimensions of each face.
In Three.js, UV coordinates are relative to the face and texture (0.0 = one edge, 1.0 = the other edge), no matter the actual size of the texture or face. But by modifying the UVs in the geometry (multiplying them by some factor based on each face's physical size), you can use the same material and texture at different sizes (and orientations) per face.
You just need to figure out the mapping between UVs, geometry scale, and your desired working units (e.g. mm or m). Sorry, I don't have or know of a ready algorithm for it, but that's the approach you probably need to take. It should be quite doable with a bit of experimentation and google-fu.
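As a small sketch of that idea (the helper name, the shared material setup and the one-repeat-per-unit convention are my assumptions): scale each plane's UVs by its real size, with a repeating texture, so the same shared material looks uniformly sized on differently sized objects.

const texture = new THREE.TextureLoader().load('brownplank.jpg');
texture.wrapS = texture.wrapT = THREE.RepeatWrapping;
const material = new THREE.MeshStandardMaterial({ map: texture });   // shared, never cloned

function makePlank(width, height, repeatsPerUnit = 1) {
  const geometry = new THREE.PlaneGeometry(width, height);
  const uv = geometry.attributes.uv;
  for (let i = 0; i < uv.count; i++) {
    // PlaneGeometry UVs span [0, 1]; stretch them so one repeat covers one world unit.
    uv.setXY(i, uv.getX(i) * width * repeatsPerUnit, uv.getY(i) * height * repeatsPerUnit);
  }
  uv.needsUpdate = true;
  return new THREE.Mesh(geometry, material);
}

scene.add(makePlank(2, 1));    // small plank: texture repeats 2 x 1 times
scene.add(makePlank(10, 4));   // much bigger plank, same texture density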
