I have an existing 3D application, developed back in the days of ThreeJS r92, that made use of a separate alpha map to simulate holes cut in wooden panels. This allowed re-use of a single high-quality woodgrain texture across all models, while using easily compressible black-and-white images to create the 'holes' in the wooden panels.
Now that I have begun migration of the project to the current glTF format, I find that the base color and alpha map now have to be combined. The result for my project is that now each of 130+ wooden panels will need their own "wood grain + alpha" texture, rather than being able to share a single wood grain texture.
From all my research, it seems there is no obvious solution to this situation using glTF. My question now is: does anyone know of ANY workaround with glTF that allows me to keep the base color texture separate from the alpha map texture?
At this point the best way ahead sadly seems to be to avoid glTF and go back to using ObjectLoader, which is a pain, as the binary format of glTF is a huge plus.
You could export the glTF with only the base color. Once imported into Three.js, you can manually assign the black-and-white texture to the material's .alphaMap property.
// Load the shared alpha texture
const texLoader = new THREE.TextureLoader();
const alphaTexture = texLoader.load("path/to/alphaTexture");
// glTF models use a top-left UV origin, so the externally loaded map
// usually needs flipY = false to line up with the model's UVs
alphaTexture.flipY = false;

// Assumes GLTFLoader is available (e.g. as THREE.GLTFLoader from the examples build)
const gltfLoader = new THREE.GLTFLoader();
gltfLoader.load("path/to/model", function (gltf) {
    // Find the mesh you want to assign an alphaMap to (loaded objects live under gltf.scene)
    const myMesh = gltf.scene.getObjectByName("meshName");
    // Bind the texture to the alphaMap property and enable transparency
    myMesh.material.alphaMap = alphaTexture;
    myMesh.material.transparent = true;
});
The docs state that only the green channel of the alphaMap is used, so the texture doesn't strictly have to be black and white.
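As a side note on the design choice: since the holes are hard-edged cutouts rather than soft fades, an alternative to enabling transparency is alpha testing, which avoids transparency sorting issues. A minimal, hedged example (the 0.5 threshold is an arbitrary cutoff):
myMesh.material.alphaTest = 0.5; // discard fragments where the sampled alpha is below 0.5 instead of blending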
I am new to three.js. I created a classic sphere wrapped with a world color map, a bump map, and an alpha map for clouds, lit by directional sunlight. How can I now add an earth-at-night texture only on the shadow side of the globe? The globe is rotating, so I can't just create two half-spheres.
I tried adding this grayscale mask to the texture, but it is also visible in daytime. I then tried illuminating the map with a second light aimed at the dark side, but I couldn't selectively target only one material. I didn't quite understand whether I need to use "emissiveMap".
Could I shine a light through a semi-transparent map/mask from the center of the earth to make the cities visible, or is there some kind of "black light" that makes only selected color/map areas shine in the dark?
I don't need any glowing effects, I just want it to be visible. Will I have to create individual light points, or learn to use fragment shaders?
Coded using:
ThreeJS v0.130.1
Framework: Angular 12, but that's not relevant to the issue.
Testing in the Chrome browser.
I am building an application that gets more than 100K points. I use these points to render a THREE.Points object on the screen.
I found that default THREE.PointsMaterial does not support lighting (the points are visible with or without adding lights to the scene).
So I tried to implement a custom ShaderMaterial. But I could not find a way to add lighting to the rendered object.
Here is a sample of what my code is doing:
Sample App on StackBlitz showing my current attempt
In this code, I am using sample values for the point cloud data, normals, and color, but everything else is similar to my actual application. I can see the 3D object, but I need proper lighting based on the normals.
I need help or guidance to implement the following:
Add lighting to the custom shader material. I have Googled and tried many things, with no success so far.
Use the normals to show the effect of the lighting. (In this sample code the normals are fixed to the Y-axis direction, but in the actual application I calculate them with some vector logic.) So calculating the normals is already done; I want to use them to produce a shine/shading effect in the custom shader material.
In this sample the color attribute is set to a fixed red, but in the actual application I apply colors to the color attribute from a texture, using a UV range.
Please advise how/if I can get lighting based on normals for Point Cloud. Thanks.
Note: I looked at this Stack Overflow question, but it only deals with changing the alpha/transparency of points, not with lighting.
Adding lighting to a custom material is a very complex process, especially since you could use Phong, Lambert, or Physical lighting methods, and there are a lot of calculations that need to pass from the vertex shader to the fragment shader. For instance, this segment of shader code is just a small part of what you'd need.
Instead of trying to re-create lighting from scratch, I recommend you create a PlaneGeometry with the material you'd like (Phong, Lambert, Physical, etc...) and use an InstancedMesh to create thousands of instances, just like in this example.
Based on that example, the pseudo-code of how you could achieve a similar effect is something like this:
const count = 100000;
const geometry = new THREE.PlaneGeometry();
const material = new THREE.MeshPhongMaterial();

const mesh = new THREE.InstancedMesh( geometry, material, count );
mesh.instanceMatrix.setUsage( THREE.DynamicDrawUsage ); // will be updated every frame
scene.add( mesh );

const dummy = new THREE.Object3D();

function update() {
    // Rotate the dummy so each plane always faces the camera
    dummy.lookAt( camera.position );
    // Update the position of each plane instance
    for ( let i = 0; i < count; i ++ ) {
        dummy.position.set( x, y, z ); // per-point position goes here
        dummy.updateMatrix();
        mesh.setMatrixAt( i, dummy.matrix );
    }
    // Tell the renderer the instance matrices have changed
    mesh.instanceMatrix.needsUpdate = true;
}
The for() loop would be the most expensive part of each frame, so if you need to update it on each frame, you might want to calculate this in the vertex shader, but that's another question altogether.
The map that I am attempting to create is isometric and is plotted in 44x44 pixel tiles. Each tile material is a loaded image.
Material with ID 1 might be reused at various x/y positions within the view. Material with ID 2 might only be used once. The same goes for any remaining tiles that need to be plotted.
I haven’t found anything in the docs that would be helpful and suggestions found on the web to use createMultiMaterialObject seem to be outdated.
If you have only a single geometry and want to apply multiple materials to it, you have to define so-called groups. Groups allow you to render different parts of the geometry with different materials. After defining the groups, you can create your (multi-material) mesh like so:
const mesh = new THREE.Mesh( geometry, [ material1, material2, material3 ] );
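For reference, groups are defined with BufferGeometry.addGroup( start, count, materialIndex ). A minimal sketch, assuming an indexed geometry where the first 6 indices (one tile quad) should use material1 and the rest material2:
// Each group covers a range of indices and points at a slot in the material array
geometry.addGroup( 0, 6, 0 ); // first 6 indices -> material1
geometry.addGroup( 6, geometry.index.count - 6, 1 ); // remaining indices -> material2
const mesh = new THREE.Mesh( geometry, [ material1, material2 ] );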
three.js R111
Hello, I am new to ThreeJS and texture mapping.
Let's say I have a 3D plane with a size of (1000x1000x1). When I apply a texture to it, it is either repeated or scaled so that it at least fills the full plane.
What I am trying to achieve is to change the scaling of the picture on the plane at runtime. I want the image to get smaller and stop filling the full plane.
I know there is a way to map each face to a part of a picture, but is it also possible to map it to a negative coordinate outside the picture, so that area becomes transparent?
My question is:
I UV-mapped a model in Blender and imported it with the UV coordinates into my ThreeJS code. Now I need to scale the texture down, as described above. Do I have to remap the UV coordinates, or do I have to manipulate the image and add a transparent edge?
Furthermore, will I be able to move the image across the surface in the same way?
I already achieved this kind of usage in Java3D by manipulating BufferedImages and drawing them onto transparent ones. I am not sure this will be possible using JavaScript, so I want to know whether it is possible via texture mapping.
Thank you for your time and your suggestions!
This can be done by mapping the 3D plane to a canvas on which the image is drawn (fabric.js can be used for the canvas drawing). In short, set the canvas as the texture for the 3D model:
yourmodel.material.map = new THREE.CanvasTexture( document.getElementById("yourCanvas") );
Hope it helps :)
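A slightly fuller sketch of that approach (the canvas id "yourCanvas" and the mesh yourmodel are placeholders); whenever the canvas is redrawn, the texture has to be flagged for re-upload:
const canvas = document.getElementById( "yourCanvas" );
const canvasTexture = new THREE.CanvasTexture( canvas );
yourmodel.material.map = canvasTexture;
// ...after drawing something new onto the canvas (with fabric.js or plain 2D context calls):
canvasTexture.needsUpdate = true;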
Yes. In THREE, there are some controls on the texture object:
texture.repeat and texture.offset. They are both Vector2()s.
To repeat the texture twice you can do texture.repeat.set(2,2);
Now if you just want to scale but NOT repeat, there is also the "wrapping mode" for the texture.
texture.wrapS (U axis) and texture.wrapT (V axis) and these can be set to:
texture.wrapS = texture.wrapT = THREE.ClampToEdgeWrapping;
This will make the edge pixels of the texture extend off to infinity when sampling, so you can position a single small texture anywhere on the surface of your UV-mapped object.
https://threejs.org/docs/#api/textures/Texture
Between those two options (including texture.rotation) you can position/repeat a texture pretty flexibly.
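Putting those together, a minimal sketch (assuming texture is a loaded THREE.Texture used as the plane material's .map) that shrinks the image to roughly a quarter of each UV axis and centers it:
// Clamp so the image is not tiled once it no longer covers the whole surface
texture.wrapS = texture.wrapT = THREE.ClampToEdgeWrapping;
// repeat > 1 makes the image smaller on the surface (the UV range is sampled more times)
texture.repeat.set( 4, 4 );
// offset shifts where in UV space the image starts; these values roughly center it
texture.offset.set( -1.5, -1.5 );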
If you need something even more complex, like warping the texture or changing its colors, you may want to change the UVs in your modeller, or draw your texture image into a canvas, modify the canvas, and use the canvas as your texture image, as described in ArUn's answer. Then you can modify it at runtime as well.
I am using a custom shader which creates a sort of environment-map effect by sampling a texture based on the normals, and the result is a perfect reflection.
I would now like to be able to add color and specular highlights so that the material is not always 100% reflective. Is there any way to do this without reinventing the wheel and creating a new phong shader from scratch?
If I can set the color or allow lighting, how would I then "mix" it with the texture based output? In Three.js there are lots of mixing methods and it would be nice if I could reuse those too so that I can test each method.
Update:
Found something very similar, just looking for a way to use color, ambient & emissive properties.
Using
var phongShader = THREE.ShaderLib.phong;
var uniforms = THREE.UniformsUtils.clone(phongShader.uniforms);
http://jsfiddle.net/edgxtnmv/1/
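Building on that, here is a minimal, hedged sketch of turning the cloned Phong program into a lit ShaderMaterial and driving its color, emissive, and specular uniforms (these uniform names come from the built-in Phong shader; mixing the custom reflection texture into the result would still require editing the fragment shader):
var phongShader = THREE.ShaderLib.phong;
var uniforms = THREE.UniformsUtils.clone( phongShader.uniforms );
// Reuse the stock Phong uniforms for tint, self-illumination and highlights
uniforms.diffuse.value.set( 0x996633 );
uniforms.emissive.value.set( 0x111111 );
uniforms.specular.value.set( 0xffffff );
uniforms.shininess.value = 30;
var material = new THREE.ShaderMaterial({
    uniforms: uniforms,
    vertexShader: phongShader.vertexShader,
    fragmentShader: phongShader.fragmentShader,
    lights: true // let the scene's lights feed the built-in Phong lighting code
});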