After successfully simplifying a glb with the answer in this post, the textures are no longer being applied to the model (it appears completely black and unreflective, as if it has no material).
How would I programmatically get the textures to work with this new simplified geometry?
I think it's something to do with the UVs, but I'm not sure how to make them work with the simplified geometry, if that's even possible.
THREE.SimplifyModifier currently does not preserve UVs in the geometry, which you'll need for textures to work. See: https://github.com/mrdoob/three.js/issues/14058. There is a workaround suggested in that issue (via changes to SimplifyModifier) but as discussed there, some artifacts will likely be visible in the result. If you can do the simplification in Blender or another modeling tool, you may have more control over the process and can get better results.
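For reference, a minimal sketch of where the UVs get lost, assuming the glb has already been loaded (for example with GLTFLoader into a `gltf` object) and that you simplify with the modifier shipped in the three.js examples; the 50% reduction is just a placeholder:

```js
import { SimplifyModifier } from 'three/examples/jsm/modifiers/SimplifyModifier.js';

const modifier = new SimplifyModifier();

gltf.scene.traverse((child) => {
  if (!child.isMesh) return;

  // remove roughly half of the vertices
  const count = Math.floor(child.geometry.attributes.position.count * 0.5);
  child.geometry = modifier.modify(child.geometry, count);

  // the simplified geometry typically has no `uv` attribute any more,
  // so any material with a texture map renders untextured/black
  console.log('uv after simplify:', child.geometry.attributes.uv); // likely undefined
});
```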
I worked with a freelancer to create a few 3D models and I'm now trying to apply materials to them in Xcode, but some of the models are behaving weirdly. When I import the .dae file, instead of the default gray color models usually have, they are part green, part black, and part transparent. When I try to apply a material, the material only applies to the green parts, and any color seems to blend with the green.
The freelancer is at a loss because he says there shouldn't be anything different about the models that are working and the ones that aren't.
I've attached a screenshot of one of the broken models and one of the working ones, both with the same material applied.
Any help would be much appreciated!
I tried playing around with different material settings and searching for materials that were secretly applied to the broken model, but to no avail.
I fixed this issue by removing a Colors item that was in Geometry Sources. I'm still not sure exactly what it is (I'm a SceneKit noob), but removing it did the trick.
This answer is just to illustrate the issue you might have with the texture coordinates. If you use one single color and not textures, you'll be fine. But if you plan to add some texturing, you might want to unwrap your model in a different way. The following images should illustrate that.
This model shows texture coordinates that look like a perspective projection, which wastes a lot of space. It cannot be textured well; it should be unwrapped differently (illustration from Blender).
This model is properly unwrapped, with non-overlapping UV islands, and almost all of the free space on the image texture is used (illustration from Blender).
Hope I could help in some manner.
I have a photo-realistic scene already created in 3ds Max. I want to render the scene on the web using WebGL and three.js. To get the realistic effects created in 3ds Max with the mental ray renderer, I tried to bake the light maps from 3ds Max to JPEG files and then map objects in three.js to those texture (exported JPEG) files. But the effects in three.js appear stretched and incorrectly positioned. Is my approach correct in the first place? If yes, could it be a problem with the UV mapping from 3ds Max? Please provide some links, if possible, on how to map UVs properly in 3ds Max while baking, if that's the issue.
Also, do I need to use any custom shaders to get such effects? (I honestly know nothing about shaders if this question seems silly)
Thanks in advance.
I would highly recommend using the THREEjs exporter:
https://github.com/mrdoob/three.js/tree/dev/utils/exporters/max
I have had a lot of trouble with Maya and other programs using any of the built-in export options. Face winding, UVs and other stuff seem pretty iffy. The exporter helps.
Once you've done that, there's something else to keep in mind - THREEjs allows only two sets of UVs per piece of geometry: one for the map, bumpmap, displacementmap, etc., and another for the lightmap. So if those two UV sets end up different from one another, you might need to swap which you assign as map and which as lightmap.
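As a rough sketch of that mapping, assuming a BufferGeometry and that both the diffuse map and the baked lightmap are already loaded as textures (diffuseTexture and bakedLightmap are placeholder names):

```js
// the lightmap samples the second UV set (uv2), while map/bumpMap/etc. sample the first (uv);
// if the baked UVs are the only set you have, copy them across
geometry.setAttribute('uv2', new THREE.BufferAttribute(geometry.attributes.uv.array, 2));

const material = new THREE.MeshLambertMaterial({
  map: diffuseTexture,     // sampled with `uv`
  lightMap: bakedLightmap, // sampled with `uv2`
});
```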
Link a fiddle with the results you have so far and we might be able to help more. Without seeing the code you're using, the only thing I can recommend is the THREEjs exporter.
Exporter for 3ds Max has been dropped from the official Three.js repos, you should use glTF format instead. See this official page for the list of glTF-compatible Max exporters:
https://github.com/KhronosGroup/glTF
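For completeness, a minimal loading sketch on the three.js side, assuming the Max scene has been exported to a file called scene.gltf:

```js
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

const loader = new GLTFLoader();
loader.load('scene.gltf', (gltf) => {
  scene.add(gltf.scene);
});
```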
I've got a question regarding the exporting of Blender scenes to be loaded into Three.js, with the focus on lighting.
We're using Blender to create our 3D environments, interiors in this case. In Blender the scenes look like they should. Here's an example I've put together with a single point light with Energy: 50 and Distance: 30. I've made these values so high so that the problem is clearly visible inside Three.js. Here is a screenshot from Blender:
Now, when exported using the Three.js exporter for Blender and imported using the SceneLoader, the result in Three.js is:
Don't mind the ugly brightness; the problem is the fact that it lights only parts of the scene. It looks like Three.js incorrectly lights different triangles of an object. Our 3D artist makes the objects in Blender using quads.
To make sure the problem lies not within the exporting and importing process, I've created a PointLight within Three with the same position, distance and brightness. This gives the exact same result as above.
I've tried using different lights as well. So far only the Sun light (Directional in Three) seems to give the correct result. The other lights don't work at all when exported from Blender, but that is a problem outside of the scope of this post.
My question is: Is it in fact the triangles Three.js creates that cause the problem? Would making the triangles in Blender to begin with fix this problem, or is there a different approach that might fix the problem?
EDIT: Using Phong materials fixed it, but the lighting still seems to be divided incorrectly across individual objects:
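For reference, a minimal sketch of the change described in the edit above; in older three.js versions MeshLambertMaterial was lit per vertex, which is one reason a point light can appear to hit only some triangles of a coarse mesh, while MeshPhongMaterial is lit per fragment. The light values mirror the Blender settings mentioned above; the position is a placeholder:

```js
// point light roughly matching the Blender values (Energy 50, Distance 30)
const light = new THREE.PointLight(0xffffff, 50, 30);
light.position.set(0, 4, 0); // placeholder position
scene.add(light);

// switch every mesh to a per-fragment-lit material
scene.traverse((child) => {
  if (child.isMesh) {
    child.material = new THREE.MeshPhongMaterial({ color: 0xcccccc });
  }
});
```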
I've always found lighting and shading to be the most difficult part of WebGL / Three.js to get right – I feel like I'm making it up.
What combination of lighting and shading do you think I would need to achieve a similar look and feel to the following image? It's so soft and yet well-defined, whereas my attempts always come out harsh and haphazard.
http://threejs.org/examples/#webgl_shadowmap
http://threejs.org/examples/#webgl_morphtargets_horse
These examples use the same lighting effect; check out their code.
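If it helps, here's a rough sketch of the kind of setup those examples use: a hemisphere light for the soft ambient fill plus one shadow-casting directional light for definition. The colors and intensities are just starting values to tweak:

```js
renderer.shadowMap.enabled = true;

// soft sky/ground fill so unlit areas never go fully black
const hemiLight = new THREE.HemisphereLight(0xffffff, 0x444444, 0.6);
scene.add(hemiLight);

// one key light that provides the well-defined shadows
const dirLight = new THREE.DirectionalLight(0xffffff, 0.8);
dirLight.position.set(5, 10, 7);
dirLight.castShadow = true;
scene.add(dirLight);
```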
I'm struggling with a visualization I'm working on that involves a stream of repeated images. I have it working with a single sprite with a ParticleSystem, but I can only apply a single material to the system. Since I want to choose between textures I tried creating a pool of Particle objects so that I could choose the materials individually, but I can't get an individual Particle to show up with the WebGL renderer.
This is my first foray into WebGL/Three.js, so I'm probably doing something bone-headed, but I thought it would be worth asking what the proper way to go about this is. I'm seeing three possibilities:
I'm using Particle wrong (initializing with a mapped material, adding to the scene, setting position) and I need to fix what I'm doing.
I need a ParticleSystem for each sprite I want to display.
What I'm doing doesn't fit into particles at all and I really should be using another object type.
All the examples I see using the canvas renderer use Particle directly, but I can't find an example using the WebGL renderer that doesn't use ParticleSystem. Any hints?
OK, I am going from what I have read elsewhere on this GitHub issues page; you should start by reading it. It seems that Particle is only for the CanvasRenderer and will become Sprite in a future version of Three.js. ParticleSystem, however, is not going to fulfill your needs either, it seems. I don't think these classes are going to help you accomplish this in WebGL in 3D. Depending on what you are doing, you might be better off with the CanvasRenderer anyway. As you suggested, ParticleSystem only lets you apply a single material, which serves as the material for every particle in the system.
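If you do stay with the WebGL renderer, the closest workaround to your option 2 is one particle system per texture. A rough sketch, assuming textures is an array of already-loaded THREE.Texture objects and positionsFor is a hypothetical helper returning a flat array of xyz coordinates for that texture's particles (in later three.js versions ParticleSystem/ParticleBasicMaterial were renamed Points/PointsMaterial):

```js
const systems = textures.map((texture) => {
  const geometry = new THREE.BufferGeometry();
  geometry.setAttribute(
    'position',
    new THREE.Float32BufferAttribute(positionsFor(texture), 3) // hypothetical helper
  );

  // one material, and therefore one texture, per system
  const material = new THREE.PointsMaterial({ map: texture, size: 10, transparent: true });

  const points = new THREE.Points(geometry, material); // THREE.ParticleSystem in old builds
  scene.add(points);
  return points;
});
```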
Short answer:
You can render THREE.Particle using THREE.CanvasRenderer only.