My question is related to this article:
http://blog.wolfire.com/2009/06/how-to-project-decals/
If my understanding is correct, a mesh made from the intersection of the original mesh and a cube is added to the scene to make a decal appear.
I need to save the final texture. So I was wondering if there is a way to 'merge' the texture of the original mesh and the added decal mesh?
You'd need to do some tricky stuff to convert from model geometry space into UV coordinate space so you could draw the new pixels into the texture map. If you want to be able to use more than one material that way, you'd also probably need to implement some kind of "material map", similar to how some deferred rendering systems work. Otherwise you're limited to at most one material per face, which wouldn't work for detailed decals with alpha.
I guess you could copy the UV coordinates from the original mesh into the decal mesh, and then use that information to reproject the decal texture into the original texture; a rough sketch of that idea follows.
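Something along these lines, assuming the decal triangles were clipped from the original triangles so the original UVs can be interpolated onto them. All names here (decalUv, decalTexture, textureWidth, textureHeight) are hypothetical, and this is only a sketch of the reprojection idea, not the article's method:

var bakeMaterial = new THREE.ShaderMaterial({
    uniforms: { decalMap: { type: 't', value: decalTexture } },
    transparent: true,
    vertexShader: [
        'attribute vec2 decalUv;', // decal-projection coordinates (custom attribute)
        'varying vec2 vDecalUv;',
        'void main() {',
        '    vDecalUv = decalUv;',
        // Place each vertex at its spot in the texture map: the built-in uv
        // (0..1) remapped to clip space (-1..1); depth is irrelevant here.
        '    gl_Position = vec4(uv * 2.0 - 1.0, 0.0, 1.0);',
        '}'
    ].join('\n'),
    fragmentShader: [
        'uniform sampler2D decalMap;',
        'varying vec2 vDecalUv;',
        'void main() {',
        '    gl_FragColor = texture2D(decalMap, vDecalUv);',
        '}'
    ].join('\n')
});

// Render the original texture into a render target first, then render the
// decal mesh on top with bakeMaterial; the target is then the merged texture.
var mergedTexture = new THREE.WebGLRenderTarget(textureWidth, textureHeight);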
Related
I was able to subtract a mesh from my main mesh. However, the texture is not well mapped in the engraved part.
The texture inside the engraved part is quite small compared to the rest of my main mesh.
Is there a way to redo the UV mapping, or is there any other solution? A sketch of one possible direction follows.
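One possible direction (not from the thread, just an illustration): regenerate the UVs with a simple planar projection from object-space coordinates, so the engraved faces get texture coordinates at the same scale as everything else. This assumes the old THREE.Geometry API:

geometry.computeBoundingBox();
var min = geometry.boundingBox.min;
var size = new THREE.Vector3().subVectors(geometry.boundingBox.max, min);

// Project every face's vertices onto the XY plane, normalized to 0..1.
geometry.faceVertexUvs[0] = geometry.faces.map(function (face) {
    return [face.a, face.b, face.c].map(function (i) {
        var v = geometry.vertices[i];
        return new THREE.Vector2((v.x - min.x) / size.x, (v.y - min.y) / size.y);
    });
});
geometry.uvsNeedUpdate = true;

A planar projection only suits roughly flat engravings; for deeper cuts, a box (per-face-normal) projection would distort less.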
When using a cube camera one normally sets the envMap of the material to the cubeCamera.renderTarget, e.g.:
var myMaterial = new THREE.MeshBasicMaterial({
    color: 0xffffff,
    envMap: myCubeCamera.renderTarget,
    side: THREE.DoubleSide
});
This works great for meshes that are meant to reflect or refract what the cube camera sees. However, I'd like to simply create a texture and apply that to my mesh. In other words, I don't want my object to reflect or refract. I want the face normals to be ignored.
I tried using a THREE.WebGLRenderTarget, but it won't handle a cube camera. And using a single perspective camera with a WebGLRenderTarget does not give me a 360° texture, obviously.
Finally, simply assigning the cubeCamera.renderTarget to the 'map' property of the material doesn't work either.
Is it possible to do what I want?
r73.
Edit: this is not what the author of the question is looking for; I'll keep my answer below for other people.
Your envMap is already a texture, so there's no need to apply it as a map. Also, cube maps and 2D textures are structurally different, so you can't simply swap one for the other; even if you manage to, the result won't be what you expect.
From what you're asking, I understand you want a static envMap instead of one that is updated each frame. If that's the case, simply don't call myCubeCamera.updateCubeMap() in your render function. Instead, place it at the end of your scene initialization, with the cube camera at your desired position; the envMap will then show only that one frame.
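A minimal sketch of that setup, assuming the r73-era CubeCamera API:

// During initialization: position the cube camera and capture a single frame.
myCubeCamera.position.set(0, 10, 0); // wherever the envmap should be captured from
myCubeCamera.updateCubeMap(renderer, scene);

// In the render loop, do NOT call updateCubeMap() again,
// so the envmap stays frozen at that one capture.
function render() {
    requestAnimationFrame(render);
    renderer.render(scene, camera);
}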
See examples below:
Dynamic Cubemap Example
Static Cubemap Example
The answer is: set the refractionRatio on the material to 1.0. Then face normals are ignored, since no refraction is occurring.
In a normal situation, where the cube camera is in the same scene as the mesh, this would be pointless, because the mesh would be invisible. But in cases where the cube camera is looking at a different scene, this is a useful feature.
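In code, that would look something like the sketch below (my assumption: depending on the revision, the env map's mapping may also need to be THREE.CubeRefractionMapping for refractionRatio to have any effect):

var myMaterial = new THREE.MeshBasicMaterial({
    envMap: myCubeCamera.renderTarget,
    refractionRatio: 1.0, // rays pass straight through, so face normals are effectively ignored
    side: THREE.DoubleSide
});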
If I have a geometry, say
var geometry = new THREE.PlaneGeometry(400, 400);
or
var material = new THREE.MeshBasicMaterial({
    map: new THREE.MeshFaceMaterial(materials) // trying to get multiple textures on a single face
});
How would I make it so that I have multiple textures on the same side of the plane?
Furthermore, how would I go about setting the coordinates of the texture and position of the texture on the Plane (or face)?
It should look something like this:
You can use a shader material with the textures as uniforms (a minimal sketch follows), or look at other approaches there, there, and there.
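A minimal sketch of the shader-material approach; baseTexture and overlayTexture are hypothetical, and overlayOffset/overlayRepeat position and scale the second texture on the plane, which also addresses the second part of the question:

var material = new THREE.ShaderMaterial({
    uniforms: {
        baseMap:       { type: 't',  value: baseTexture },
        overlayMap:    { type: 't',  value: overlayTexture },
        overlayOffset: { type: 'v2', value: new THREE.Vector2(0.25, 0.25) },
        overlayRepeat: { type: 'v2', value: new THREE.Vector2(2.0, 2.0) }
    },
    vertexShader: [
        'varying vec2 vUv;',
        'void main() {',
        '    vUv = uv;',
        '    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);',
        '}'
    ].join('\n'),
    fragmentShader: [
        'uniform sampler2D baseMap;',
        'uniform sampler2D overlayMap;',
        'uniform vec2 overlayOffset;',
        'uniform vec2 overlayRepeat;',
        'varying vec2 vUv;',
        'void main() {',
        '    vec4 base = texture2D(baseMap, vUv);',
        '    vec2 ouv = (vUv - overlayOffset) * overlayRepeat;',
        '    vec4 over = texture2D(overlayMap, ouv);',
        // Only blend where the remapped coordinates fall inside the overlay's 0..1 region.
        '    float inside = step(0.0, ouv.x) * step(ouv.x, 1.0) * step(0.0, ouv.y) * step(ouv.y, 1.0);',
        '    gl_FragColor = mix(base, over, over.a * inside);',
        '}'
    ].join('\n')
});

var plane = new THREE.Mesh(new THREE.PlaneGeometry(400, 400), material);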
I needed to refactor my custom mesh creation a bit
from:
create meshes of a unified size (SIZE, SIZE, SIZE), then scale them as needed (setting the scale for each axis)
to:
create each mesh with the correct size, and do not scale it later
The meshes are custom generated (vertices, faces, normals, uvs); nothing in this process was altered, and it worked like a charm before.
=> the resulting meshes have the same size, position, etc.
The whole scene setup stays the same: lights, shadowing, materials. Yet when using the second approach, the whole lighting is very bright and super reflective. Is that a known issue?
The material used is MeshPhongMaterial with map, bumpMap, specularMap, and envMap.
Using three.js r68; no errors or warnings in the console.
before:
https://cloud.githubusercontent.com/assets/3647854/3876053/76b8f260-2158-11e4-9e96-c8de55eaec9a.png
after:
https://cloud.githubusercontent.com/assets/3647854/3876052/76b7fa86-2158-11e4-9393-8f3eece04c0b.png
Did you rescale the normals in the mesh?
The mesh format probably needs normalized normals; in that case the new normals are now incorrect, but they would have been correct if you hadn't rescaled.
Alternatively, you say the lights haven't been changed; maybe they need to be appropriately redirected in the scene. (Assuming you're applying different scaling factors on each axis.)
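If that is the cause, here is a minimal sketch of the fix with the r68-era THREE.Geometry API:

// Rebuild the normals from the final, correctly sized vertices...
geometry.computeFaceNormals();
geometry.computeVertexNormals();

// ...or, if the normals were generated by hand, renormalize them in place:
geometry.faces.forEach(function (face) {
    face.normal.normalize();
    face.vertexNormals.forEach(function (n) { n.normalize(); });
});
geometry.normalsNeedUpdate = true;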
Is it possible to apply a texture to a mesh without specifying UVs in the geometry in three.js?
There are classes such as THREE.CubeGeometry, THREE.SphereGeometry, etc. that automatically generate the UV coordinates for you. However, if you are creating your own geometry from scratch (i.e., specifying vertex locations, creating faces, etc.) then the answer is no. Either you need to set the UV coordinates manually when creating the geometry, or you need to write a custom shader which determines the UV coordinates for any given point. Think about it this way: if you don't specify UV coordinates, the points on your geometry have no idea which point on your texture they should display.
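For reference, a minimal sketch of setting UVs manually with the old THREE.Geometry API, for a single custom triangle:

var geometry = new THREE.Geometry();
geometry.vertices.push(
    new THREE.Vector3(0, 0, 0),
    new THREE.Vector3(1, 0, 0),
    new THREE.Vector3(0, 1, 0)
);
geometry.faces.push(new THREE.Face3(0, 1, 2));

// One Vector2 per face vertex, telling each corner which point
// of the texture (in 0..1 space) it should display.
geometry.faceVertexUvs[0].push([
    new THREE.Vector2(0, 0),
    new THREE.Vector2(1, 0),
    new THREE.Vector2(0, 1)
]);
geometry.computeFaceNormals();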