Using three.js, I am trying to create a floor that reflects the objects that sit upon it. Preferably, the floor material should reflect not like a mirror but in a more 'matte' or diffused way.
To achieve this I looked to Jaume Sanchez Elias, who has made a great example using a cube camera; look for the "smooth material" example on this page:
http://www.clicktorelease.com/blog/making-of-cruciform
Here is my attempt using the same technique. But as you can see, the reflections are misplaced; they do not appear underneath the mountain objects as expected.
http://dev.udart.dk/stackoverflow_reflections/
I am looking to correct this or to use any other technique that will achieve a more correct diffused reflection.
There are three.js examples using the cube camera technique, but they all create mirror-like effects, not a soft reflection.
Vibber, parallax-corrected cubemaps, the technique used in cru·ci·form, only work for closed volumes, like cubes. They work really well to simulate correct reflections inside a room, but not so much for outdoor or open/large scenes. They also can't reflect anything that's inside the cubemap; you'd have to split the volume into many sub-volumes.
I can think of a couple of solutions for what you want to achieve:
SSR: Screen-space reflections, you can find more info in many places on the internet. It's not the most trivial of effects to implement, and you might have to change the way you render your scene.
Simpler post-processing approach: since you have a flat floor, render the mountains vertically flipped on a framebuffer object, blur it, and render the regular scene on top. For extra effect, render the depth of the flipped mountains, and use that value as the blur radius, to get diffuse reflections.
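A rough sketch of that second approach, assuming an existing `renderer`, `scene`, `camera`, and a `mountains` group resting on a floor at y = 0 (all names here are illustrative, not from the original demo):

```javascript
import * as THREE from 'three';

// Offscreen target that will hold the mirrored mountains.
const reflectionTarget = new THREE.WebGLRenderTarget(1024, 1024);

function renderReflection(renderer, scene, camera, mountains) {
  mountains.scale.y = -1;                      // mirror about the floor plane (may require
                                               // material.side = THREE.DoubleSide, since winding flips)
  renderer.setRenderTarget(reflectionTarget);
  renderer.render(scene, camera);              // you would typically hide the floor itself for this pass
  renderer.setRenderTarget(null);
  mountains.scale.y = 1;                       // restore
}

// Blur reflectionTarget.texture (e.g. with a separable Gaussian pass), show it on the
// floor plane, then render the regular scene on top of it.
```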
As always, there's a ton of ways to achieve the (un)expected result :)
Related
Trying to achieve this kind of effect, but not sure which direction to head to.
I have tried to use a Multi-side Refraction technique using shaders, but can’t really seem to achieve the effect. Is there a simpler approach by any chance?
What I’ll have is a plane in the background, using shaders to achieve the marquee effect. That’s all fine and working. However, I need that geometry to have some sort of frosted-glass effect and, at the same time, distort the text in the background. Would using some sort of material on the geometry and adding transparency with some parameters work?
Hoping for some guidance
This effect (as opposed to simpler alpha-blending transparency) is called "transmission", and the frosted part is called "transmission roughness". THREE.MeshPhysicalMaterial is the preferred way to do that in three.js; see these examples:
https://threejs.org/examples/?q=transmission#webgl_materials_physical_transmission
However, the material type does not yet support refraction, the distortion of the background shown above. three.js#21000 includes some discussion of supporting that in the future.
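A minimal sketch of such a material, assuming a reasonably recent three.js build; the values here are illustrative and will need tuning, and availability of individual parameters depends on your three.js version:

```javascript
import * as THREE from 'three';

const frostedGlass = new THREE.MeshPhysicalMaterial({
  transmission: 1.0, // use transmission instead of plain alpha-blended transparency
  roughness: 0.4,    // the "frost": higher roughness blurs what is seen through the surface
  thickness: 0.5,    // depth assumed by the transmission model (may not exist in older builds)
  transparent: true,
});

const panel = new THREE.Mesh(new THREE.PlaneGeometry(2, 1), frostedGlass);
scene.add(panel);    // assumes an existing `scene`
```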
I'm trying to render a fairly complex lamp using Three.js: https://sayduck.com/3d/xhcn
The product is split up in multiple meshes similar to this one:
The main issue is that I also need to use transparent PNG textures (in order to achieve the complex shape while keeping polygon counts low) like this:
As you can see from the live demo, this gives really weird results, especially when rotating the camera around the lamp - I believe due to z-ordering of the meshes.
I've been reading answers to similar questions on SO, like https://stackoverflow.com/a/15995475/5974754 or https://stackoverflow.com/a/37651610/5974754 to get an understanding of the underlying mechanism of how transparency is handled in Three.js and WebGL.
I think that in theory, what I need to do is, each frame, explicitly define a renderOrder for each mesh with a transparent texture (because the order based on distance to camera changes when moving around), so that Three.js knows which pixel is currently closest to the camera.
However, even ignoring for the moment that explicitly setting the order each frame seems far from trivial, I am not sure I understand how to set this order theoretically.
My meshes have fairly complex shapes and are quite intertwined, which means that from a given camera angle, some parts of mesh A can be closer to the camera than some parts of mesh B, while somewhere else, parts of mesh B are closer.
In this situation, it seems impossible to define a closer mesh, and thus a proper renderOrder.
Have I understood correctly, and this is basically reaching the limits of what WebGL can handle?
Otherwise, if this is doable, is the approach with two render scenes (one for opaque meshes first, then one for transparent ones ordered back to front) the right one? How should I go about defining the back-to-front renderOrder the way that Three.js expects?
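For illustration, here is roughly the kind of per-frame sorting I had in mind (just a sketch; `transparentMeshes` would be an array of the lamp's transparent meshes):

```javascript
import * as THREE from 'three';

const worldPos = new THREE.Vector3();

// Called once per frame: farthest meshes get the lowest renderOrder.
function updateRenderOrder(transparentMeshes, camera) {
  transparentMeshes
    .map((mesh) => {
      mesh.getWorldPosition(worldPos);
      return { mesh, dist: worldPos.distanceTo(camera.position) };
    })
    .sort((a, b) => b.dist - a.dist)
    .forEach((entry, i) => { entry.mesh.renderOrder = i; });
}
```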
Thanks a lot for your help!
I'm currently rendering a skybox to a THREE.CubeCamera target and am then using that target as the environment map on a material. The idea being that I want to have the colour of a cube affected by the colour of the sky around it, though not fully reflecting it (like how a white matte cube would look in the real world).
For example, here is what I have so far applying the environment map to a THREE.MeshLambertMaterial or THREE.MeshPhongMaterial with reflectivity set to 0.7 (same results):
Notice in the first image that the horizon line is clearly visible (this is at sunset when it's most obvious) and that the material is very reflective. The second image shows the same visible horizon line, which moves with the camera as you orbit. The third image shows the box at midday with blue sky above it (notice how the blue is reflected very strongly).
The effect I'm trying to aim for is a duller, perhaps blurred representation of what we can already see working here. I want the sky to affect the cube but I don't want to fully reflect it, instead I want each side of the cube to have a much more subtle effect without a visible horizon line.
I've experimented with the reflectivity property of the materials without much luck. Yes, it reduces the reflection effect, but it also removes most of the colouring taken from the skybox. I've also tried the shininess property of THREE.MeshPhongMaterial, but that didn't seem to do much, if anything.
I understand that environment maps are meant to be reflections, however my hope is that there is a way to achieve what I'm after. I want a reflection of the sky, I just need it to be much less precise and instead more blurred / matte.
What could I do to achieve this?
I achieved this by writing my own custom shader based on a physically based rendering shading model.
I use the Cook-Torrance model, which takes the roughness of the material into account for the specular contribution. It's not a topic I can cover fully in this answer; you can find great references at http://graphicrants.blogspot.it/ in the specular BRDF article.
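For reference (not part of the original answer), the Cook-Torrance specular BRDF is usually written as

$$ f_{\text{spec}}(l, v) = \frac{D(h)\, F(v, h)\, G(l, v, h)}{4\,(n \cdot l)(n \cdot v)} $$

where $D$ is the normal distribution function (this is where roughness enters), $F$ is the Fresnel term, $G$ is the geometry/shadowing term, and $h$ is the half vector between the light direction $l$ and the view direction $v$.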
In this question you can find how I achieved the blurry reflection depending on the material roughness.
Hope it helps.
I solved this by passing a different set of textures that were blurred to be the cubemap for the object.
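In code that could look roughly like this (the blurred face images are hypothetical file names, and the material values mirror the question):

```javascript
import * as THREE from 'three';

// Load a pre-blurred set of cube faces instead of the sharp skybox faces.
const blurredEnv = new THREE.CubeTextureLoader().load([
  'px_blur.jpg', 'nx_blur.jpg',
  'py_blur.jpg', 'ny_blur.jpg',
  'pz_blur.jpg', 'nz_blur.jpg',
]);

const matteMaterial = new THREE.MeshLambertMaterial({
  color: 0xffffff,
  envMap: blurredEnv,
  reflectivity: 0.7, // same value as in the question
});
```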
Does three.js have any function or capability for AI (artificial intelligence)? Specifically, let's say in an FPS game, I want enemies to look for me and try to kill me; is that possible in three.js? Does it have such functionality or a system for it?
WebGL
create buffer
bind buffer
allocate data
set up state
issue draw call
run GLSL shaders
three.js
create a 3d context using WebGL
create 3 dimensional objects
create a scene graph
create primitives like spheres, cubes, toruses
move objects around, rotate them, scale them
test for intersections between rays, triangles, planes, spheres, etc.
create 'materials' (rather than shaders)
JavaScript
write algorithms
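To make that contrast concrete, a minimal sketch of the three.js side might look like this (purely illustrative):

```javascript
import * as THREE from 'three';

const scene = new THREE.Scene();                       // scene graph
const camera = new THREE.PerspectiveCamera(60, innerWidth / innerHeight, 0.1, 100);
camera.position.z = 5;

const cube = new THREE.Mesh(                           // primitive + 'material'
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshStandardMaterial({ color: 0x6699ff })
);
scene.add(cube, new THREE.DirectionalLight(0xffffff, 1));

const renderer = new THREE.WebGLRenderer();            // buffers, state and shaders handled for you
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

renderer.setAnimationLoop(() => {
  cube.rotation.y += 0.01;                             // move/rotate objects
  renderer.render(scene, camera);
});
```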
I want enemies to look for me and try to kill me
Yes, three.js is capable of doing this; you just have to write an algorithm using three's classes. Your enemies would be 3d objects, casting rays, intersecting with other objects, etc.
You would be building a game engine, and you could use three.js as your rendering framework within that engine. Rendering is just one part of it. Think of a 2d shooter, you could make it using a 2d context, but you could also enhance it and make it 2.5d, by working with a 3d context. Everything else can stay the same.
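For instance, a minimal line-of-sight check with three's raycasting might look like this (all names here are illustrative):

```javascript
import * as THREE from 'three';

const raycaster = new THREE.Raycaster();
const direction = new THREE.Vector3();

// Can this enemy "see" the player, or is an obstacle in the way?
function enemyCanSeePlayer(enemy, player, obstacles) {
  direction.subVectors(player.position, enemy.position).normalize();
  raycaster.set(enemy.position, direction);
  const hits = raycaster.intersectObjects(obstacles, true);
  const distanceToPlayer = enemy.position.distanceTo(player.position);
  return hits.length === 0 || hits[0].distance > distanceToPlayer;
}
```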
Any WebGL engine that might have it? Or is it just not a WebGL thing?
Unity probably has everything you can possibly think of. Unity is capable of outputting WebGL, so it could be considered a 'WebGL engine'.
Babylon.js is more engine-like.
Three.js is the best and most powerful WebGL 3d engine, with no equal on the market, and it's missing out on such an ability.
Three.js isn't exactly a 3d engine. Wikipedia says:
Three.js is a lightweight cross-browser JavaScript library/API used to create and display animated 3D computer graphics on a Web browser.
Three.js uses WebGL.
So if I just need to draw a car, or a spinning logo, I don't need them to come looking for me or try to shoot me; I just need them to stay in one place and rotate.
For a graphics demo you don't even need this - with a few draw instructions, you could render a full screen quad with a very elaborate pixel shader. Three gives you a ton of options, especially if you consider all the featured examples.
It works both ways: while you can expand three.js any way you want, you can also strip it down for a very specific purpose.
If you need to build an app that does image processing and features no '3d' graphics, you could still leverage WebGL with three.js.
You don't need any vector, matrix, ray, or geometry classes.
If you don't have Vector3, you probably can't keep PlaneGeometry, but you could use BufferGeometry and manually construct a plane. No transformations need to happen, so there's no need for matrix classes. You'd use shaders and textures, and perhaps something like the EffectComposer.
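As a sketch of that stripped-down use (illustrative only): a full-screen triangle plus a fragment shader needs none of the transform machinery.

```javascript
import * as THREE from 'three';

// One oversized triangle that covers the whole screen in clip space.
const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(
  new Float32Array([-1, -1, 0, 3, -1, 0, -1, 3, 0]), 3));

const material = new THREE.ShaderMaterial({
  uniforms: { resolution: { value: new THREE.Vector2(window.innerWidth, window.innerHeight) } },
  vertexShader: 'void main() { gl_Position = vec4(position, 1.0); }',
  fragmentShader: `
    uniform vec2 resolution;
    void main() { gl_FragColor = vec4(gl_FragCoord.xy / resolution, 0.0, 1.0); }
  `,
});

const quad = new THREE.Mesh(geometry, material);
quad.frustumCulled = false;                 // the vertex shader ignores cameras entirely

const scene = new THREE.Scene();
scene.add(quad);
// renderer.render(scene, new THREE.Camera()); // the camera's matrices are never used
```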
I’m afraid not. Three.js is just an engine for displaying 3d content.
Using it to create games is just one possibility. A few engines do come with pre-coded features like AI (among other things) to attract game creators, but using them is more restrictive than writing the exact code you need.
Three.js itself doesn't; however, https://mugen87.github.io/yuka/ is a great AI engine that can work in collaboration with three.js to add AI.
They have line-of-sight and shooting game logic, as well as car logic, which I've been playing around with recently; there's a React Three Fiber example here: https://codesandbox.io/s/loving-tdd-u1fs9o
I create a THREE.PlaneGeometry with heights and, at the highest point, place a THREE.PointLight, but this illuminates areas that are not visible from that point.
Why?
I want the light to illuminate only the areas that are visible from that point.
By default, the appearance of any given point on a surface is calculated using the lights, their properties and of course the material properties - it does not take the rest of the scene into account, as that would be very computationally expensive. Various ray tracing renderers do this, but they are really slow, and that's not how WebGL and Three.js work.
What you want is shadows. Three.js is capable of rendering shadows using the shadow map method. There are various examples of using shadow maps both on the net and in the Three.js examples folder.
A word of warning though: getting shadows to work well can be hard if you don't have the basics down, and you may need to do some studying. Shadows can slow your application down (especially with many lights) and look ugly if not properly configured and fine-tuned. Also, I think shadow maps are only supported for SpotLight and DirectionalLight; PointLights are trickier.
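A minimal sketch of such a shadow setup, assuming an existing `renderer`, `scene`, and a `terrainMesh` built from the height data (names and values here are illustrative):

```javascript
import * as THREE from 'three';

renderer.shadowMap.enabled = true;

const light = new THREE.SpotLight(0xffffff, 1);
light.position.copy(highestPoint);        // hypothetical Vector3 of the terrain's highest point
light.castShadow = true;
light.shadow.mapSize.set(1024, 1024);
scene.add(light);

terrainMesh.castShadow = true;            // the terrain blocks its own light...
terrainMesh.receiveShadow = true;         // ...and shows the resulting shadows
```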