It might be me, as I'm not very experienced with Three.js, but the Three.js instancing example seems very vague.
I thought BufferGeometries were automatically created from Geometries in the new Three.js, and I don't understand why a shader is being added to the example.
The documentation says nothing about instancing; even searching for 'inst' in the Three.js docs gives zero results.
Is there someone who would give me a simple example?
My scene is pretty simple: a ground plane with a sphere on top of it. I am setting an emissive material on the sphere. Now I want the light emitted by the sphere to be reflected on the plane.
It looks like that is not possible in a straightforward way. (I totally wish THREE.js were like Maya or Blender.) From what I have seen, point lights were suggested, but I am using a gradient emission map, so a point light will not satisfy my needs here.
Some other answers have suggested something along the lines of capturing the scene with a reflective surface, and others have pointed me to cube camera rendering. I do not fully understand what they mean; I am quite new to THREE.js and to this low a level of graphics. This answer in particular looks daunting to a beginner like me (I am just starting to mess with shaders): Emissive light and reflection in Three.js
If anyone can explain the method or provide resources where I can learn about them, I would be grateful.
After successfully simplifying a glb with the answer in this post, the textures are no longer being applied to the model (it appears completely black and unreflective, as if it has no material).
How would I programmatically get the textures to work with this new simplified geometry?
I think it's something to do with the UVs, but I'm not sure how to make them work with the simplified geometry, if that's even possible.
THREE.SimplifyModifier currently does not preserve UVs in the geometry, which you'll need for textures to work. See: https://github.com/mrdoob/three.js/issues/14058. There is a workaround suggested in that issue (via changes to SimplifyModifier) but as discussed there, some artifacts will likely be visible in the result. If you can do the simplification in Blender or another modeling tool, you may have more control over the process and can get better results.
Do I need to write a shader for Phong shading? Well, I know I would in raw WebGL, but when using THREE.js do I need one?
I have been reading this book (http://shop.oreilly.com/product/9781849699792.do) it explains how to write a phong shader.
I thought I would try it out, but then found this: http://threejs.org/docs/#Reference/Materials/MeshPhongMaterial
Are there any differences between the two? How about if I had a large detailed model, as opposed to a sphere?
Thanks
You do not need to write a Phong shader. The Phong material provided by three.js is an implementation of the Blinn–Phong shading model, and you can use it for any of your models.
Implementing your own would not produce a visibly different result, but it would be a great exercise to further your understanding.
I've always found lighting and shading to be the most difficult part of WebGL / Three.js to get right – I feel like I'm making it up.
What combination of lighting and shading do you think I would need to achieve a similar look and feel to the following image? It's so soft and yet well-defined, whereas my attempts always come out harsh and haphazard.
http://threejs.org/examples/#webgl_shadowmap
http://threejs.org/examples/#webgl_morphtargets_horse
These examples use the same lighting setup; check out the code.
For a while now I've been having difficulty correctly applying textures to three.js custom objects, specifically extruded elliptical objects. From what I can see, I need to generate the object's UV mapping using its bounding box. However, I am unsure exactly how to do this, and I can't find any detailed explanation of how it works. Can anyone explain this process to me in detail?
Thanks,
zaidi92