I have copied the example given in the official three.js repo of how to load a DAE (Collada) model, modified it to remove the animations, and loaded my own Collada model.
The problem is that the model loads in black, as you can see in this codesandbox, and I wasn't able to change the material color!
You can see that the same model file (after I converted it from Collada to glTF) is parsed with the blue color!
Can you please tell me how to load the color correctly? Thanks in advance.
I looked at your geometry with console.log(child.geometry) and noticed that it doesn't have any normals data. It only has position, uv, and uv2 attributes, but you need normals to calculate how light reflects off your faces.
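For reference, a quick way to run this check yourself (a minimal sketch, assuming your loaded model root is called avatar, as in the snippets below):
// Log which attributes each mesh's geometry carries; materials that
// react to light need a 'normal' attribute next to 'position' and 'uv'.
avatar.traverse( function ( child ) {
    if ( child.isMesh ) {
        console.log( child.name, Object.keys( child.geometry.attributes ) );
    }
} );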
There are 3 things you can do to solve this:
1. Add your own normals:
Re-export your Collada file with normals data from your 3D editor software.
2. Flat shading:
Use flat shading so every face is flat, without smooth or rounded edges. You don't need normals data for this, because the shader just assumes every triangle's normal points perpendicular to its surface.
avatar.traverse( function ( child ) {
    if ( child.isMesh ) {
        child.material.flatShading = true;
        child.material.needsUpdate = true; // needed if the material has already rendered once
    }
} );
3. Calculate normals:
You could ask three.js to calculate the normals for you. Just be aware that it does this by averaging the normals of the faces that share each vertex, so surfaces might accidentally look too smooth or too hard in places you don't expect.
avatar.traverse( function ( child ) {
    if ( child.isMesh ) {
        child.geometry.computeVertexNormals();
    }
} );
Coded using:
ThreeJS v0.130.1
Framework: Angular 12, but that's not relevant to the issue.
Testing on the Chrome browser.
I am building an application that gets more than 100K points. I use these points to render a THREE.Points object on the screen.
I found that default THREE.PointsMaterial does not support lighting (the points are visible with or without adding lights to the scene).
So I tried to implement a custom ShaderMaterial. But I could not find a way to add lighting to the rendered object.
Here is a sample of what my code is doing:
Sample App on StackBlitz showing my current attempt
In this code, I am using sample values for the point cloud data, normals, and color, but everything else is similar to my actual application. I can see the 3D object, but it needs proper lighting based on the normals.
I need help or guidance to implement the following:
Add lighting to a custom shader material. I have Googled and tried many things, with no success so far.
Use the normals to show the effects of lighting. (In this sample code the normals are fixed to the Y-axis direction, but in my actual application I calculate them with some vector logic.) So calculating the normals is already done; I want to use them to show a light shine/shading effect in the custom shader material.
And in this sample, color attribute is set to fixed red color, but in actual application I am able to apply colors using UV range from a texture to color attribute.
Please advise how/if I can get lighting based on normals for Point Cloud. Thanks.
Note: I looked at this Stack Overflow question, but it only deals with changing the alpha/transparency of points, not with lighting.
Adding lighting to a custom material is a very complex process, especially since you could use Phong, Lambert, or Physical lighting methods, and there are a lot of calculations that need to pass from the vertex shader to the fragment shader. For instance, this segment of shader code is just a small part of what you'd need.
Instead of trying to re-create lighting from scratch, I recommend you create a PlaneGeometry with the material you'd like (Phong, Lambert, Physical, etc...) and use an InstancedMesh to create thousands of instances, just like in this example.
Based on that example, the pseudo-code for achieving a similar effect would look something like this:
const count = 100000;
const geometry = new THREE.PlaneGeometry();
const material = new THREE.MeshPhongMaterial();
const mesh = new THREE.InstancedMesh( geometry, material, count );
mesh.instanceMatrix.setUsage( THREE.DynamicDrawUsage ); // will be updated every frame
scene.add( mesh );

const dummy = new THREE.Object3D();

function update() {
    // Sets the rotation so each plane is always perpendicular to the camera
    dummy.lookAt( camera.position );

    // Updates the position of each plane
    for ( let i = 0; i < count; i ++ ) {
        dummy.position.set( x, y, z ); // x, y, z: coordinates of your i-th point
        dummy.updateMatrix();
        mesh.setMatrixAt( i, dummy.matrix );
    }

    mesh.instanceMatrix.needsUpdate = true; // tell three.js the matrices changed
}
The for() loop would be the most expensive part of each frame, so if you need to update it on each frame, you might want to calculate this in the vertex shader, but that's another question altogether.
Trying to mimic the behaviour of this site.
http://experience.mausoleodiaugusto.it/it/chapter-02
I can successfully load a DAE model with a texture and display it in the three.js scene. I have also played around a bit with the tessellate and explode modifiers, with reasonable success. However, the method I am using takes the DAE geometry, creates faces from it, assigns each face a colour, turns them into a new mesh, and adds that to the scene alongside my original DAE model. So I end up with two objects: my original DAE model with its texture, and the new exploded mesh with coloured faces.
How would I map the texture from the DAE file to each individual face in the new mesh, so that when it explodes it looks accurate as per the example? Or am I going about this the wrong way? Perhaps each face needs its individual texture already mapped in the DAE file?
Any suggestions would be welcome.
I am trying to render both sides of a transparent object with three.js. Other objects located within the transparent object should show through too. Sadly, I get artifacts I don't know how to handle. Here is a test page: https://dl.dropbox.com/u/3778149/webgl_translucency/test.html
Here is an image of the said artifacts. They seem to stem from the underlying sphere geometry.
Interestingly, the artifacts are not visible with blending mode THREE.SubtractiveBlending = 2.
Any help appreciated!
Alex
Self-transparency is particularly difficult in WebGL and three.js. You just have to really understand the issues, and then adapt your code to achieve the effect you want.
You can achieve the look of a double-sided, transparent sphere in three.js with a trick: you need to render two transparent spheres -- one with material.side = THREE.BackSide, and one with material.side = THREE.FrontSide.
Using such methods is generally required if you want self-transparency without artifacts -- especially if you allow the camera or the object to move.
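A minimal sketch of that trick (the geometry, color, and opacity values here are placeholders, not taken from the original scene):
// Render the same sphere geometry twice: back faces first, then front faces.
const geometry = new THREE.SphereGeometry( 1, 32, 16 );

const backMaterial = new THREE.MeshPhongMaterial( {
    color: 0x2194ce,
    transparent: true,
    opacity: 0.5,
    side: THREE.BackSide // inside surface of the sphere
} );

const frontMaterial = backMaterial.clone();
frontMaterial.side = THREE.FrontSide; // outside surface of the sphere

const backSphere = new THREE.Mesh( geometry, backMaterial );
const frontSphere = new THREE.Mesh( geometry, frontMaterial );
frontSphere.renderOrder = 1; // draw the front-face sphere after the back-face one
scene.add( backSphere, frontSphere );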
three.js r.143
Generally, to draw transparent objects you need to sort them back to front (I'm guessing three.js already does this). If your objects are convex (like both of those are), then you can sometimes get away with rendering each object twice, once with gl.cullFace(gl.FRONT) and again with gl.cullFace(gl.BACK). So, for example, if the cube is inside the sphere you'd effectively do
gl.enable(gl.CULL_FACE);
gl.cullFace(gl.FRONT);  // cull front faces...
drawSphere();           // ...so this draws the back of the sphere
drawCube();             // draws the back of the cube
gl.cullFace(gl.BACK);   // cull back faces...
drawCube();             // draws the front of the cube
drawSphere();           // draws the front of the sphere
I have no idea how to do that in three.js
This only handles objects that are convex and not intersecting (one object is contained entirely inside the other).
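For what it's worth, here is a hedged three.js sketch of that same four-pass idea, using material.side in place of gl.cullFace() and renderOrder to force the draw sequence (the geometry sizes and material settings are assumptions):
// One mesh per pass; renderOrder forces the back-to-front sequence.
function addPass( geometry, side, order ) {
    const material = new THREE.MeshPhongMaterial( {
        transparent: true,
        opacity: 0.5,
        side: side
    } );
    const mesh = new THREE.Mesh( geometry, material );
    mesh.renderOrder = order;
    scene.add( mesh );
}

const sphereGeometry = new THREE.SphereGeometry( 2, 32, 16 );
const cubeGeometry = new THREE.BoxGeometry( 1, 1, 1 );

addPass( sphereGeometry, THREE.BackSide, 0 );  // back of the sphere
addPass( cubeGeometry, THREE.BackSide, 1 );    // back of the cube
addPass( cubeGeometry, THREE.FrontSide, 2 );   // front of the cube
addPass( sphereGeometry, THREE.FrontSide, 3 ); // front of the sphere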
To render that scene correctly with alpha blending, the triangles would have to be rendered from back to front each frame. Your scene is particularly challenging since you have one object inside another, and rendering both sides, which would require rendering part of the sphere, then the cube, then the rest of the sphere. I doubt three.js (or any other scene graph library) can handle this case.
Additive or subtractive blending will work without sorting, but doesn't look as nice.
Make a clone of the original mesh and flip its normals; then make two identical one-sided materials, one for each mesh, with different names (see the sketch after the JSON below). Not the classiest approach, but it worked just fine. I struggled with the same problem, and this is what I did :P
The .json file looks like this:
{
    "materials": [
        { "name": "ext", "texture": "f_03.jpg", "ambient": [255.0,255.0,255.0], "diffuse": [255.0,255.0,255.0], "specular": [255.0,255.0,255.0], "opacity": 0.7 },
        { "name": "int", "texture": "f_03.jpg", "ambient": [255.0,255.0,255.0], "diffuse": [255.0,255.0,255.0], "specular": [255.0,255.0,255.0], "opacity": 0.7 }
    ],
    "meshes": [
        {
            "name": "Cylinder001",
            "material": "ext", ...
        {
            "name": "Cylinder002",
            "material": "int", ...
I've been struggling with this one for hours, and have found nothing, either in the docs or here on SO, that would point me in the right direction to achieve what I'm aiming at.
I'm loading a scene containing several meshes. The first one is used as an actual mesh, rendered on the scene, the other ones are just used as morph targets (their geometries, properly speaking).
loader.load("scene.json", function (loadedScene) {
camera.lookAt( scene.position );
var basis = loadedScene.getObjectByName( "Main" ).geometry;
var firstTarget = loadedScene.getObjectByName( "Target1" ).geometry;
// and so on for the rest of the "target" meshes
basis.morphTargets[0] = {name: 'fstTarget', vertices: firstTarget.vertices};
var MAIN = new THREE.Mesh(basis);
This works very well, and I can morph the MAIN mesh with no hassle by playing with the influence values. The differences between the basis mesh and the target are not huge, basically they're just XY adjustments (2D shape variations).
Now, I'm using a textured material: the UVs are properly projected (Blender export) and the result is nice with the MAIN mesh as is.
The problem comes when the basis shape is morphed towards a target geometry: as expected, the texture follows the morphed vertices automatically, but this is not what I need to achieve => I'd need the UVs to "morph" towards the morph target's UVs for the texture to look the same.
Here is an example of what I have now (left: basis mesh, right: morphTargetInfluences = 1 for the first morph target)
[Image: morph target and texture]
What I'd like to have is the exact same texture projection on the final, morphed mesh...
I can't figure out how to do that the right way. Should I reassign the target UVs to the MAIN mesh (and how would I do that)?
The result would be like having a cloth over a shape that morphs beneath it, with the cloth staying "shrink-wrapped" against that underlying shape the whole time => you can actually see the shape change, but the cloth itself is not deformed, just wrapping itself properly and consistently around the shape...
Any help would be much appreciated! Thanks in advance :)