I have a photo-realistic scene already created in 3ds Max. I want to render the scene on the web using WebGL and three.js. To get the realistic effects created in 3ds Max with the mental ray renderer, I tried baking the light maps from 3ds Max to JPEG files and then mapping objects in three.js to the exported JPEG textures. But the effects in three.js look stretched out and are not positioned properly. Is my approach correct in the first place? If yes, could it be a problem with the UV mapping from 3ds Max? Please provide some links, if possible, on how to map UVs properly in 3ds Max while baking, if that's the issue.
Also, do I need to use any custom shaders to get such effects? (I honestly know nothing about shaders, in case this question seems silly.)
Thanks in advance.
I would highly recommend using the THREEjs exporter:
https://github.com/mrdoob/three.js/tree/dev/utils/exporters/max
I have had a lot of trouble with Maya and other programs using any of the built-in export options. Face winding, UVs and other stuff seem pretty iffy. The exporter helps.
Once you've done that, there's something else to keep in mind - THREEjs allows only two UV sets per piece of geometry: one for the map, bumpMap, displacementMap, etc., and another for the lightMap. So if those two UV sets end up different from one another, you might need to swap which you assign as map and which as lightMap.
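For reference, a minimal sketch of that assignment with the BufferGeometry API - assuming colorTexture and bakedLightmap are THREE.Texture objects you have already loaded, and noting that in most three.js versions the lightMap samples the second UV set ('uv2'; newer releases let you pick the channel on the texture instead):

    // If the geometry only has one UV set, clone it into 'uv2' so the
    // lightMap has something to sample (swap the two if they differ).
    geometry.setAttribute('uv2', geometry.attributes.uv.clone());

    const material = new THREE.MeshBasicMaterial({
        map: colorTexture,       // sampled with 'uv'
        lightMap: bakedLightmap  // sampled with 'uv2'
    });
    scene.add(new THREE.Mesh(geometry, material));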
Link a fiddle with what results you have so far and we might be able to help more. Without seeing what code you're using, the only thing I can recommend is the THREEjs exporter.
The exporter for 3ds Max has been dropped from the official Three.js repo; you should use the glTF format instead. See this official page for the list of glTF-compatible Max exporters:
https://github.com/KhronosGroup/glTF
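Once you have a glTF file out of Max, loading it in three.js is straightforward. A minimal sketch, assuming your export is named scene.glb (a hypothetical path) and you use the example GLTFLoader module:

    import * as THREE from 'three';
    import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

    const loader = new GLTFLoader();
    loader.load(
        'scene.glb',                      // hypothetical path to your Max export
        (gltf) => scene.add(gltf.scene),  // add the whole imported hierarchy
        undefined,
        (err) => console.error('glTF load failed:', err)
    );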
What is the current solution in r136 to blend lights, shadows and color in a ShaderMaterial? I already found the solution for fog support.
I found some examples in a previous revision (r108), like this codesandbox.
Actually, I'm looking for this kind of result: codesandbox.
Should I copy the MeshPhongMaterial shaders as a code base for my own shaders?
The usage of custom shaders is mandatory in my projects, which is why I'm not using the built-in materials.
Any idea or example ?
Thanks !
This question is huge, and does not have a single answer. Creating lights, shadows, and color varies from material to material, and includes so many elements that it would require a full course to learn.
However, you can look at the segments of shader code used by Three.js in the folder called /ShaderChunk. If you search for "light", you'll see shader segments (or "chunks") for each material, like toon, lambert, physical, etc. Some materials need parameters to be defined at the beginning of the shader code (those are the _pars files), some calculations happen in the vertex shader, some in the fragment shader, some need to split the code between _begin and _end, and so on.
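To give a feel for how those chunks plug into a custom material, here is a minimal sketch of a ShaderMaterial that reuses the shared light uniforms and the lights_pars_begin chunk. The uniform names and NUM_DIR_LIGHTS define are the ones three.js injects around r136; treat this as a starting point under those assumptions, not a drop-in replacement for MeshPhongMaterial:

    const material = new THREE.ShaderMaterial({
        lights: true, // ask the renderer to populate the shared light uniforms
        uniforms: THREE.UniformsUtils.merge([
            THREE.UniformsLib.lights,
            { diffuse: { value: new THREE.Color(0x8888ff) } }
        ]),
        vertexShader: `
            varying vec3 vNormal;
            void main() {
                vNormal = normalMatrix * normal;
                gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
            }
        `,
        fragmentShader: `
            uniform vec3 diffuse;
            varying vec3 vNormal;
            #include <common>
            #include <lights_pars_begin>
            void main() {
                // Very rough Lambert term over directional lights only.
                vec3 n = normalize(vNormal);
                vec3 light = ambientLightColor;
                #if NUM_DIR_LIGHTS > 0
                for (int i = 0; i < NUM_DIR_LIGHTS; i++) {
                    light += directionalLights[i].color *
                             max(dot(n, directionalLights[i].direction), 0.0);
                }
                #endif
                gl_FragColor = vec4(diffuse * light, 1.0);
            }
        `
    });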
Shadows are even more complex because they require a separate render pass to build the shadowmap. Like I said, re-building your own lights, shadows, and color is a huge undertaking, and it would need a full course to learn. I hope this answer at least points you in the right direction.
After successfully simplifying a glb with the answer in this post,
the textures are no longer being applied to the model (it appears completely black and unreflective, as if it has no material).
How would I programmatically get the textures to work with this new simplified geometry?
I think it's something to do with the UVs, but I'm not too sure how to make them work with the simplified geometry, if that's even possible.
THREE.SimplifyModifier currently does not preserve UVs in the geometry, which you'll need for textures to work. See: https://github.com/mrdoob/three.js/issues/14058. There is a workaround suggested in that issue (via changes to SimplifyModifier) but as discussed there, some artifacts will likely be visible in the result. If you can do the simplification in Blender or another modeling tool, you may have more control over the process and can get better results.
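To see the problem concretely, here is a small sketch using the example SimplifyModifier; the 50% ratio and the mesh variable are placeholders for your own scene:

    import { SimplifyModifier } from 'three/examples/jsm/modifiers/SimplifyModifier.js';

    const modifier = new SimplifyModifier();
    // modify() takes the number of vertices to REMOVE, not the number to keep.
    const removeCount = Math.floor(mesh.geometry.attributes.position.count * 0.5);
    const simplified = modifier.modify(mesh.geometry, removeCount);

    console.log(simplified.attributes.uv); // undefined - the UVs are gone,
                                           // so textured materials render black
    mesh.geometry = simplified;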
I'm struggling to find the most suitable workflow for 3DS Max and THREE.JS.
I've been using the Max exporter:
https://github.com/mrdoob/three.js/blob/master/utils/exporters/max/ThreeJSExporter.ms
but it seems it's unable to export more than one UV set (channel).
I can't use OBJ format because it doesn't support multiple UV channels at all.
I suppose I could save the file to .fbx, open it in Blender and use the Blender exporter:
https://github.com/mrdoob/three.js/tree/master/utils/exporters/blender
which seems to support multiple UV channels, but this is quite a long path and I'm not sure the graphic team would follow it through.
Is there an easier way of doing this?
I create a THREE.PlaneGeometry with varying heights and place a THREE.PointLight at the highest point, but the light illuminates areas that are not visible from that point.
Why?
I want the light to illuminate only the areas that are visible from its position.
By default, the appearance of any given point on a surface is calculated using the lights, their properties and of course the material properties - it does not take the rest of the scene into account, as that would be very computationally expensive. Various ray tracing renderers do this, but they are really slow, and that's not how WebGL and Three.js work.
What you want is shadows. Three.js is capable of rendering shadows using the shadow map method. There are various examples of using shadow maps both on the net and in the Three.js examples folder.
A word of warning though: getting shadows to work well can be hard if you don't have the basics down - you may need to do some studying. Shadows can slow your application down (especially with many lights) and look ugly if not properly configured and fine-tuned. Also, I think shadow maps are only supported for SpotLight and DirectionalLight; PointLights are trickier.
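As a starting point, a minimal sketch of enabling shadow maps, assuming you already have a renderer, a scene, and a mesh called terrain. The property names below match recent three.js releases; much older ones used renderer.shadowMapEnabled and light.shadowMapWidth instead:

    renderer.shadowMap.enabled = true;

    const light = new THREE.SpotLight(0xffffff, 1.0);
    light.position.set(0, 50, 0);          // e.g. the highest point of your terrain
    light.castShadow = true;
    light.shadow.mapSize.set(1024, 1024);  // higher = crisper shadows, slower render
    scene.add(light);

    terrain.castShadow = true;    // the terrain blocks the light...
    terrain.receiveShadow = true; // ...and shows the resulting shadows on itself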
I'm struggling with a visualization I'm working on that involves a stream of repeated images. I have it working with a single sprite using a ParticleSystem, but I can only apply a single material to the system. Since I want to choose between textures, I tried creating a pool of Particle objects so that I could choose the materials individually, but I can't get an individual Particle to show up with the WebGL renderer.
This is my first foray into WebGL/Three.js, so I'm probably doing something bone-headed, but I thought it would be worth asking what the proper way to go about this is. I'm seeing three possibilities:
1. I'm using Particle wrong (initializing with a mapped material, adding to the scene, setting position) and I need to fix what I'm doing.
2. I need a ParticleSystem for each sprite I want to display.
3. What I'm doing doesn't fit into particles at all and I really should be using another object type.
All the examples I see using the canvas renderer use Particle directly, but I can't find an example using the WebGL renderer that doesn't use ParticleSystem. Any hints?
OK, I am going from what I have read elsewhere on this GitHub issues page; you should start by reading it. It seems that Particle is simply for the CanvasRenderer, and it will become Sprite in a future version of Three.js. ParticleSystem, however, is not going to fulfill your needs either, it seems. I don't think these classes are going to help you accomplish this in WebGL in 3D. Depending on what you are doing, you might be better off with the CanvasRenderer anyway. ParticleSystem will only allow you to apply a single material, which serves as the material for every particle in the system, as you suggested.
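For what it's worth, in later three.js versions Particle did become Sprite, and Sprite works with the WebGL renderer with one material per object. A sketch of that approach, assuming two hypothetical texture files imageA.png and imageB.png:

    const loader = new THREE.TextureLoader();
    const materials = [
        new THREE.SpriteMaterial({ map: loader.load('imageA.png') }), // hypothetical textures
        new THREE.SpriteMaterial({ map: loader.load('imageB.png') })
    ];

    for (let i = 0; i < 100; i++) {
        // Each Sprite gets its own material, so the texture can vary per object.
        const sprite = new THREE.Sprite(materials[i % materials.length]);
        sprite.position.set(Math.random() * 100 - 50, Math.random() * 100 - 50, 0);
        scene.add(sprite);
    }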
Short answer:
You can render THREE.Particle using THREE.CanvasRenderer only.