I've got a question about exporting Blender scenes for loading into Three.js, with a focus on lighting.
We're using Blender to create our 3D environments, interiors in this case. In Blender the scenes look the way they should. Here's an example I've put together with a single point light with Energy: 50 and Distance: 30. I've made these values this high so that the problem is clearly visible inside Three.js. Here is a screenshot from Blender:
Now, when exported using the Three.js exporter for Blender and imported using the SceneLoader, the result in Three.js is:
Don't mind the ugly brightness; the problem is that it lights only parts of the scene. It looks like Three.js incorrectly lights individual triangles of an object. Our 3D artist builds the objects in Blender using quads.
To make sure the problem doesn't lie in the export and import process, I've created a PointLight within Three.js with the same position, distance and brightness. This gives exactly the same result as above.
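For reference, this is roughly what that manual test looks like. The intensity and distance match the Blender values above; the position is a placeholder:

```js
// Recreate the Blender point light by hand to rule out the exporter.
var light = new THREE.PointLight(0xffffff, 50, 30); // color, intensity, distance
light.position.set(0, 5, 0); // placeholder: use the position from the Blender scene
scene.add(light);
```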
I've tried using different lights as well. So far only the Sun light (Directional in Three.js) seems to give the correct result. The other lights don't work at all when exported from Blender, but that is a problem outside the scope of this post.
My question is: is it in fact the triangles Three.js creates that cause the problem? Would modeling with triangles in Blender to begin with fix it, or is there a different approach that might?
EDIT: Using Phong materials fixed it, but the lighting still seems to be divided incorrectly across individual objects:
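(For anyone hitting the same wall: at the time, MeshLambertMaterial computed lighting per vertex, i.e. Gouraud shading, while MeshPhongMaterial computes it per fragment, which is why swapping materials smooths out the per-triangle look. A minimal sketch of the swap; the color is a placeholder:)

```js
// Replace per-vertex (Lambert/Gouraud) lighting with per-fragment (Phong) lighting.
scene.traverse(function (child) {
  if (child instanceof THREE.Mesh) {
    child.material = new THREE.MeshPhongMaterial({ color: 0xcccccc }); // placeholder color
  }
});
```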
After simplifying a glb successfully with the answer in this post, the textures are no longer being applied to the model (it appears completely black and unreflective, as if it has no material).
How would I programmatically get the textures to work with this new simplified geometry?
I think it's something to do with the UVs, but I'm not sure how to make them work with the simplified geometry, if that's even possible.
THREE.SimplifyModifier currently does not preserve UVs in the geometry, which you'll need for textures to work. See: https://github.com/mrdoob/three.js/issues/14058. There is a workaround suggested in that issue (via changes to SimplifyModifier) but as discussed there, some artifacts will likely be visible in the result. If you can do the simplification in Blender or another modeling tool, you may have more control over the process and can get better results.
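To illustrate, a minimal sketch of the current behavior; the reduction ratio is arbitrary, and the import path assumes the examples/jsm build of three.js:

```js
import { SimplifyModifier } from 'three/examples/jsm/modifiers/SimplifyModifier.js';

const modifier = new SimplifyModifier();
// Remove roughly half of the vertices. The returned geometry keeps positions
// but drops the 'uv' attribute, so textured materials have nothing to sample with.
const count = Math.floor(mesh.geometry.attributes.position.count * 0.5);
mesh.geometry = modifier.modify(mesh.geometry, count);
console.log(mesh.geometry.attributes.uv); // undefined after simplification
```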
I'm trying to load a skinned mesh from blender into a THREE.js scene, but it... looks a little odd:
In the screenshot there is a skeleton that was loaded from the animation alone, the mesh as modified for the game, and a little one that was loaded directly with the three.js ObjectLoader.
It's supposed to look more like this (from Blender):
I found it!
When exporting from Blender, I changed the number of bone influences to 4. It appears that some vertices were being influenced by more than two bones, so when only two influencing bones per vertex were exported, the mesh distorted.
I've been fighting with a three.js issue for a few 12-hour days, trying to determine why some outward-facing object faces are missing. It only seems to happen if I've modified the mesh model: extruded a plane, or knife-projected a hole into a mesh.
I've found a few solutions online that don't seem to be working for me. I've added the double-sided hack for all materials (see the sketch below), and this does allow me to see into objects with holes, so it is partially working. I've also fiddled with different loaders (JSONLoader, OBJLoader), which all seem to have the same issues listed above, leading me to believe it is indeed the model itself.
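For context, this is roughly what that double-sided hack looks like, assuming loadedObject is the root object returned by the loader:

```js
// Render both sides of every face so triangles with flipped normals stay visible.
// This hides the symptom; it doesn't repair the normals themselves.
loadedObject.traverse(function (child) {
  if (child instanceof THREE.Mesh) {
    child.material.side = THREE.DoubleSide;
  }
});
```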
The research I've seen online says that modifying a mesh can leave the normals screwed up, so any faces I CAN'T see in my model viewer I flip and redo the UV map for, but this doesn't fix it.
I'm hoping someone who knows Blender and three.js will know what the problem is. I know it's simple and I'm just missing a step because I'm new.
Here's a link to the demo site and code: http://guitar.dodgestat.com/objloader/
It seems that OBJMTLLoader can handle only triangles and quadrilaterals, while OBJ files can describe faces with any number of vertices; on top of that, the faces should be convex.
If you check your model with http://3dviewer.net, you can see that every face exists, but there are some issues with non-convex faces.
So I recommend triangulating your model before export (in Blender: select everything in Edit Mode and press Ctrl+T, or apply a Triangulate modifier).
I've been trying to figure out how three.js works and have run it through a shader debugger.
I've added two simple planes with a basic material (a single color, without any shading model), which rotate during rendering.
First of all, my question was: why is three.js using a single shader program (see the WebGL context function .useProgram()) for both meshes?
I supposed that the objects are the same, and that for performance reasons a single shader program is used for similar objects.
But then I changed my three.js application source code, and now there are a plane and a cube in the scene, both rotating.
Let's look at the shader debugger again:
Here you can see that three.js is again using one shader program, even though the objects are now different. This is the part that isn't clear to me.
Looking at that shader, it seems to be a very generic and huge shader program, and there are also two other shader programs that were compiled but never used.
So why is three.js using a single shader program, and what are the reasons for it?
Most of the work done in a shader is related to the material part of the mesh, not the geometry.
In WebGL (or OpenGL, for that matter), the geometry as you understand it (whether it is a cube, a sphere, or whatever) is pretty irrelevant.
It would be a little more relevant if you were talking about how the geometry is constructed. But in these days, when faces of more than three vertices are gone and triangle strips are seldom used, there are few different kinds of geometry: face3 geometries, line geometries, particle geometries, and buffer geometries.
Most of the time, the thing that triggers the use of a different shader will be the material.
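You can check this yourself; a minimal sketch, assuming a scene, camera and renderer are already set up (renderer.info is three.js's built-in diagnostics object):

```js
// Two different geometries sharing one material type compile a single program.
const material = new THREE.MeshBasicMaterial({ color: 0x44aa88 });
const plane = new THREE.Mesh(new THREE.PlaneGeometry(2, 2), material);
const cube = new THREE.Mesh(new THREE.BoxGeometry(1, 1, 1), material);
scene.add(plane, cube);

renderer.render(scene, camera);
console.log(renderer.info.programs.length); // 1: a different geometry adds no program
```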
I'm wondering why the mesh in lesson 10 looks more three-dimensional than mine. My meshes look like they have no surface and no depth. Here is an example picture:
Any suggestions? I can't see a difference in how the meshes are loaded (XTK's version compared to mine). I don't think it depends on the (type of) data, because in ParaView the mesh looks properly three-dimensional.
It is because your mesh files have no normals.
ParaView will create normals if you don't provide them; XTK will not.
You can generate normals for your meshes fairly easily by chaining three VTK classes (see the sketch below):
http://www.vtk.org/doc/nightly/html/classvtkPolyDataNormals.html
1. vtkPolyDataReader
2. vtkPolyDataNormals
3. vtkPolyDataWriter
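A minimal Python sketch of that pipeline, assuming the meshes are in legacy .vtk format (file names are placeholders):

```python
import vtk

# Read the mesh, compute per-point normals, and write the result back out.
reader = vtk.vtkPolyDataReader()
reader.SetFileName("mesh.vtk")  # placeholder input file

normals = vtk.vtkPolyDataNormals()
normals.SetInputConnection(reader.GetOutputPort())
normals.ComputePointNormalsOn()  # smooth shading needs per-point normals

writer = vtk.vtkPolyDataWriter()
writer.SetInputConnection(normals.GetOutputPort())
writer.SetFileName("mesh_with_normals.vtk")  # placeholder output file
writer.Write()
```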
Alternatively, maybe you can export the meshes from ParaView or Slicer? The exported meshes might then contain the normals.