Three.js add color and specular highlights to ShaderMaterial

I am using a custom shader which creates a sort of environment-map effect by sampling a texture based on the normals, and the result is a perfect reflection.
I would now like to be able to add color and specular highlights so that the material is not always 100% reflective. Is there any way to do this without reinventing the wheel and writing a new phong shader from scratch?
If I can set the color or allow lighting, how would I then "mix" it with the texture-based output? Three.js has lots of mixing methods, and it would be nice if I could reuse those too so that I can test each method.
Update:
Found something very similar, just looking for a way to use color, ambient & emissive properties.
Using
var phongShader = THREE.ShaderLib.phong;
var uniforms = THREE.UniformsUtils.clone(phongShader.uniforms);
http://jsfiddle.net/edgxtnmv/1/
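For reference, here is a minimal sketch of how that cloned phong shader can be turned into a ShaderMaterial with color and specular control (the uniform values below are illustrative, not taken from the fiddle):

var phongShader = THREE.ShaderLib.phong;
var uniforms = THREE.UniformsUtils.clone(phongShader.uniforms);

// Reuse the built-in phong uniforms for color and specular highlights
uniforms.diffuse.value = new THREE.Color(0x556677);
uniforms.specular.value = new THREE.Color(0x999999);
uniforms.shininess.value = 30;

var material = new THREE.ShaderMaterial({
    uniforms: uniforms,
    vertexShader: phongShader.vertexShader,
    fragmentShader: phongShader.fragmentShader,
    lights: true // required so the scene's lights feed the phong uniforms
});

From there, the custom environment-map lookup can be spliced into the fragment shader and mixed with the lit color using whichever blend method is being tested.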

Related

simulate separate alpha map in glTF

I have an existing 3D application, developed back in the days of ThreeJS r92, that made use of a separate alpha map to simulate holes cut in wooden panels. This allowed re-use of a high quality woodgrain texture across all models, while using easily compressible black and white images to create the 'holes' in the wooden panels.
Now that I have begun migration of the project to the current glTF format, I find that the base color and alpha map now have to be combined. The result for my project is that now each of 130+ wooden panels will need their own "wood grain + alpha" texture, rather than being able to share a single wood grain texture.
From all my research, it seems like there are no obvious options for this situation using glTF. My question now is: does anyone know of ANY workaround using glTF that allows me to separate the base color texture from the alpha map texture?
At this point the best way ahead sadly seems to be to avoid glTF and go back to using ObjectLoader, which is a pain as the binary objects of glTF are a huge plus.
You could export the GLTF with only the base color. Once imported into Three.js, you can manually assign the black/white texture as the material's .alphaMap property.
// Fetch texture
const texLoader = new THREE.TextureLoader();
const alphaTexture = texLoader.load("path/to/alphaTexture");

gltfLoader.load("path/to/model", function (gltf) {
    // Find the mesh you want to assign an alphaMap to
    // (GLTFLoader returns the scene graph in gltf.scene)
    const myMesh = gltf.scene.getObjectByName("meshName");

    // Now just bind your texture to the alphaMap property
    myMesh.material.alphaMap = alphaTexture;
    myMesh.material.transparent = true;
    myMesh.material.needsUpdate = true; // recompile the shader with the new map
});
The docs state that only the green channel is used, so the map doesn't have to be black and white.

ThreeJS Points (Point Cloud) with Lighting using custom Shader Material

Coded using:
- ThreeJS v0.130.1
- Framework: Angular 12, but that's not relevant to the issue.
- Testing on Chrome browser.
I am building an application that gets more than 100K points. I use these points to render a THREE.Points object on the screen.
I found that default THREE.PointsMaterial does not support lighting (the points are visible with or without adding lights to the scene).
So I tried to implement a custom ShaderMaterial. But I could not find a way to add lighting to the rendered object.
Here is a sample of what my code is doing:
Sample App on StackBlitz showing my current attempt
In this code, I am using sample values for the point cloud data, normals, and color, but everything else is similar to my actual application. I can see the 3D object, but I need proper lighting based on the normals.
I need help or guidance to implement the following:
Add lighting to the custom shader material. I have Googled and tried many things, with no success so far.
Use the normals to show the effects of lighting (in this sample code the normals are fixed to the Y-axis direction, but in the actual application I calculate them with some vector logic). So calculating the normals is already done; I want to use them to produce a light shine/shading effect in the custom shader material.
And in this sample the color attribute is set to a fixed red color, but in the actual application I am able to apply colors to the color attribute using a UV range from a texture.
Please advise how/if I can get lighting based on normals for Point Cloud. Thanks.
Note: I looked at this Stack Overflow question, but it only deals with changing the alpha/transparency of points, not lighting.
Adding lighting to a custom material is a very complex process, especially since you could use Phong, Lambert, or Physical lighting methods, and a lot of calculations need to pass from the vertex to the fragment shader. For instance, this segment of shader code is just a small part of what you'd need.
Instead of trying to re-create lighting from scratch, I recommend you create a PlaneGeometry with the material you'd like (Phong, Lambert, Physical, etc...) and use an InstancedMesh to create thousands of instances, just like in this example.
Based on that example, the pseudo-code of how you could achieve a similar effect is something like this:
const count = 100000;
const geometry = new THREE.PlaneGeometry();
const material = new THREE.MeshPhongMaterial();
const mesh = new THREE.InstancedMesh( geometry, material, count );
mesh.instanceMatrix.setUsage( THREE.DynamicDrawUsage ); // will be updated every frame
scene.add( mesh );
const dummy = new THREE.Object3D();

function update() {
    // Updates the position of each plane
    for ( let i = 0; i < count; i++ ) {
        dummy.position.set( x, y, z ); // position of the i-th point
        // Sets the rotation so the plane is always perpendicular to the camera
        dummy.lookAt( camera.position );
        dummy.updateMatrix();
        mesh.setMatrixAt( i, dummy.matrix );
    }
    mesh.instanceMatrix.needsUpdate = true; // re-upload the instance matrices
}
The for() loop would be the most expensive part of each frame, so if you need to update it on each frame, you might want to calculate this in the vertex shader, but that's another question altogether.

Apply a material over an existing one

I'm trying to add a texture to a 3D object, but if the object has a material with a white background, I can see the texture correctly, while if the object is dark or transparent, the texture becomes dark or transparent like the object.
I can't find a way to solve this problem. I thought of applying another material over the object, but I don't see how that is possible.

How can I get per-vertex light emission and per-vertex lighting in ThreeJS?

I want to see a chart with color specified per vertex and to get little bit of shading too.
But if I use MeshBasicMaterial I only get VertexColor with no dynamic shading.
On the other hand, if I use MeshPhongMaterial I just get shading but without emissiveness from my vertex colors.
As the THREE.JS PhongMaterial supports vertexColors, giving you a nice combination of dynamic lighting and vertex colors, I'm not quite sure I understand your question. Perhaps that is something you should investigate more?
However, as an alternative to writing a custom shader you could try rendering your model in multiple passes.
This will not give you as much control over the way the vertex colors and phong lighting are combined as a shader would, but often a simple add/multiply blend can give pretty decent results.
Algorithm:
- Create two meshes for the BufferGeometry, one with the BasicMaterial and one with the PhongMaterial.
- For the PhongMaterial, set:
  depthFunc = THREE.EqualDepth
  transparent = true
  blending = THREE.AdditiveBlending (or THREE.MultiplyBlending)
- Render the first mesh.
- Render the second mesh at the exact same spot.
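A minimal sketch of that setup (geometry and scene names are placeholders; in older three.js releases vertexColors takes THREE.VertexColors instead of true):

const geometry = myChartGeometry; // your BufferGeometry with a 'color' attribute

// First pass: unlit vertex colors act as the per-vertex emission
const basicMesh = new THREE.Mesh(
    geometry,
    new THREE.MeshBasicMaterial({ vertexColors: true })
);

// Second pass: phong lighting blended on top of the first pass
const phongMesh = new THREE.Mesh(
    geometry,
    new THREE.MeshPhongMaterial({
        depthFunc: THREE.EqualDepth,     // only draw where the first pass already drew
        transparent: true,               // render in the second (transparent) pass
        blending: THREE.AdditiveBlending // or THREE.MultiplyBlending
    })
);

// Both meshes share the same geometry, so they render at the exact same spot
scene.add(basicMesh);
scene.add(phongMesh);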

three.js create texture from cubecamera

When using a cube camera one normally sets the envMap of the material to the cubeCamera.renderTarget, e.g.:
var myMaterial = new THREE.MeshBasicMaterial({
    color: 0xffffff,
    envMap: myCubeCamera.renderTarget,
    side: THREE.DoubleSide
});
This works great for meshes that are meant to reflect or refract what the cube camera sees. However, I'd like to simply create a texture and apply that to my mesh. In other words, I don't want my object to reflect or refract. I want the face normals to be ignored.
I tried using a THREE.WebGLRenderTarget, but it won't handle a cube camera. And using a single perspective camera with a WebGLRenderTarget does not give me a 360 texture, obviously.
Finally, simply assigning the cubeCamera.renderTarget to the 'map' property of the material doesn't work either.
Is it possible to do what I want?
r73.
Edit: this is not what the author of the question is looking for; I'll keep my answer below for other people.
Your envmap is already a texture, so there's no need to apply it as a map. Also, cube maps and 2D textures are structurally different, so it won't be possible to swap them, and even if you succeeded the result would probably not be what you expect.
From what you're asking, I understand you want a static envmap instead of one that is updated at each frame. If that's the case, simply don't run myCubeCamera.updateCubeMap() in your render function. Instead, place it at the end of your scene initialization with your desired cube camera position; your envmap will then show only that frame.
See examples below:
Dynamic Cubemap Example
Static Cubemap Example
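For the static case, the setup is roughly this (a sketch using the r73-era API, where updateCubeMap() is the update method):

// At the end of scene initialization, render the cube camera exactly once
myCubeCamera.position.set( 0, 0, 0 ); // wherever the capture should happen
myCubeCamera.updateCubeMap( renderer, scene );
// ...and never call updateCubeMap() inside the render loop,
// so the envMap keeps showing that single captured frame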
The answer is: Set the refractionRatio on the material to 1.0. Then face normals are ignored since no refraction is occurring.
In a normal situation where the Cube Camera is in the same scene as the mesh, this would be pointless because the mesh would be invisible. But in cases where the Cube Camera is looking at a different scene, then this is a useful feature.
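Put together, a minimal sketch of that trick, assuming the r73-era API from the question (note that refractionRatio only takes effect when the environment map uses refraction mapping, THREE.CubeRefractionMapping):

var myMaterial = new THREE.MeshBasicMaterial({
    color: 0xffffff,
    envMap: myCubeCamera.renderTarget,
    refractionRatio: 1.0, // rays pass straight through, so face normals are effectively ignored
    side: THREE.DoubleSide
});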
