I create a THREE.PlaneGeometry with varying heights and place a THREE.PointLight at the highest point, but the light illuminates areas that are not visible from that point.
Why?
I want the point light to illuminate only the areas that can actually be seen from its position.
By default, the appearance of any given point on a surface is calculated using the lights, their properties and of course the material properties - it does not take the rest of the scene into account, as that would be very computationally expensive. Various ray tracing renderers do this, but they are really slow, and that's not how WebGL and Three.js work.
What you want is shadows. Three.js is capable of rendering shadows using the shadow map method. There are various examples of using shadow maps, both on the net and in the Three.js examples folder.
A word of warning though: getting shadows to work well can be hard if you don't have the basics down - you may need to do some studying. Shadows can slow your application down (especially with many lights) and look ugly if not properly configured and fine-tuned. Also, I think shadow maps are only supported for SpotLight and DirectionalLight; PointLights are trickier.
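For reference, a minimal sketch of enabling shadow maps (assuming an existing renderer, scene and a terrainMesh built from the displaced PlaneGeometry; all the values here are illustrative):

renderer.shadowMap.enabled = true;
renderer.shadowMap.type = THREE.PCFSoftShadowMap;

const light = new THREE.SpotLight(0xffffff, 1);
light.position.set(0, 50, 0); // roughly the highest point of the terrain
light.castShadow = true;
light.shadow.mapSize.set(1024, 1024);
scene.add(light);

terrainMesh.castShadow = true;
terrainMesh.receiveShadow = true;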
In Three.js, I have a standard-material mesh cube sitting between two RectAreaLight strips. By default, it renders like this:
I wanted to add shadows under the cube, so I set the renderer's shadowMap:
renderer.shadowMap.enabled = true;
When I do this, suddenly my cube is reflecting the light strips:
(to clarify, only the two horizontal strips are RectAreaLight instances. The 3 vertical strips shown are just planes.)
This effect is kind of neat, but it's too sharp: it looks pixelated because my cube is relatively low-poly and each facet has a different angle.
My understanding is that MeshStandardMaterial's 'roughness' property can be used to diffuse and soften reflections, but neither roughness nor metalness changes this behaviour; no matter what I set them to, these harsh reflections remain.
I'd love to find a way to soften the reflections. I've also noticed that lines in general tend to look pretty aliased, even though I've set renderer.antialias to true. Perhaps a better anti-aliasing strategy would kill two birds with one stone?
The docs mention that RectAreaLight has no shadow support:
https://threejs.org/docs/#api/en/lights/RectAreaLight
I wouldn't be surprised if there were other issues with the materials as well... RectAreaLights are implemented in a different branch of the render pipeline, as indicated by the requirement of including custom uniforms, but I don't know enough about the details to give you a better answer.
I would love to see other responses to this though, because RectAreaLights can create some really cool looks...
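For reference, this is roughly the extra setup a RectAreaLight needs (a hedged sketch with made-up sizes and positions; note that it casts no shadows and only MeshStandardMaterial / MeshPhysicalMaterial respond to it):

import { RectAreaLightUniformsLib } from 'three/examples/jsm/lights/RectAreaLightUniformsLib.js';

RectAreaLightUniformsLib.init(); // provides the custom uniforms mentioned above; call once before rendering

const strip = new THREE.RectAreaLight(0xffffff, 5, 4, 0.5); // color, intensity, width, height
strip.position.set(0, 2, 0);
strip.lookAt(0, 0, 0);
scene.add(strip);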
I'm currently rendering a skybox to a THREE.CubeCamera target and am then using that target as the environment map on a material. The idea being that I want to have the colour of a cube affected by the colour of the sky around it, though not fully reflecting it (like how a white matte cube would look in the real world).
For example, here is what I have so far, applying the environment map to a THREE.MeshLambertMaterial or THREE.MeshPhongMaterial with reflectivity set to 0.7 (same results with both):
Notice in the first image that the horizon line is clearly visible (this is at sunset when it's most obvious) and that the material is very reflective. The second image shows the same visible horizon line, which moves with the camera as you orbit. The third image shows the box at midday with blue sky above it (notice how the blue is reflected very strongly).
The effect I'm trying to aim for is a duller, perhaps blurred representation of what we can already see working here. I want the sky to affect the cube but I don't want to fully reflect it, instead I want each side of the cube to have a much more subtle effect without a visible horizon line.
I've experimented with the reflectivity property of the materials without much luck. Yes, it reduces the reflection effect, but it also removes most of the colouring taken from the skybox. I've also tried the shininess property of THREE.MeshPhongMaterial, but that didn't seem to do much, if anything.
I understand that environment maps are meant to be reflections, however my hope is that there is a way to achieve what I'm after. I want a reflection of the sky, I just need it to be much less precise and instead more blurred / matte.
What could I do to achieve this?
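For reference, the setup described above is roughly the following (a hedged reconstruction using the newer WebGLCubeRenderTarget API; none of the exact values come from the original post):

const cubeRenderTarget = new THREE.WebGLCubeRenderTarget(256);
const cubeCamera = new THREE.CubeCamera(1, 1000, cubeRenderTarget);
scene.add(cubeCamera);

const material = new THREE.MeshPhongMaterial({
  color: 0xffffff,
  envMap: cubeRenderTarget.texture,
  reflectivity: 0.7
});
const box = new THREE.Mesh(new THREE.BoxGeometry(1, 1, 1), material);
scene.add(box);

// Whenever the sky changes, refresh the environment map:
cubeCamera.update(renderer, scene);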
I achieved this by writing my own custom shader based on a physically based rendering shading model.
I use the Cook-Torrance model, which takes the roughness of the material into account for the specular contribution. It's not a topic I can cover fully in this answer; you can find great references at http://graphicrants.blogspot.it/ in the specular BRDF article.
In this question you can find how I achieve the blurry reflection depending on material roughness.
Hope it can help.
I solved this by passing a different, pre-blurred set of textures as the cubemap for the object.
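A hedged sketch of that approach (the blurred face images are placeholders you would generate offline):

const blurredEnv = new THREE.CubeTextureLoader().load([
  'blur_px.jpg', 'blur_nx.jpg',
  'blur_py.jpg', 'blur_ny.jpg',
  'blur_pz.jpg', 'blur_nz.jpg'
]);

const material = new THREE.MeshLambertMaterial({
  color: 0xffffff,
  envMap: blurredEnv,  // a blurred copy of the sky, so the reflection reads as matte
  reflectivity: 0.3    // keeps some sky tint without a hard horizon line
});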
Using three.js, I am trying to create a floor that reflects the objects that sit upon it. Preferably the floor material should reflect not like a mirror but in a more 'matte' or diffused way.
To achieve this I looked to Jaume Sanchez Elias who has made a great example using a cube camera: Look for the "smooth material" example on this page:
http://www.clicktorelease.com/blog/making-of-cruciform
Here is my attempt using the same technique. But as you can see, the reflections are misplaced; they do not appear underneath the mountain objects as expected.
http://dev.udart.dk/stackoverflow_reflections/
I am looking to correct this or to use any other technique that will achieve a more correct diffused reflection.
There are three.js examples using the cube camera technique, but they all create mirror-like effects, not a soft reflection.
Vibber: parallax-corrected cubemaps, the technique used in cru·ci·form, only work for closed volumes, like cubes. The technique works really well for simulating correct reflections inside a room, but not so much for outdoor or open/large scenes. They also can't reflect anything that is inside the cubemap; you'd have to split the volume into many sub-volumes.
I can think of a couple of solutions for what you want to achieve:
SSR: Screen-space reflections, you can find more info in many places on the internet. It's not the most trivial of effects to implement, and you might have to change the way you render your scene.
Simpler post-processing approach: since you have a flat floor, render the mountains vertically flipped to a framebuffer object, blur it, and render the regular scene on top. For extra effect, render the depth of the flipped mountains and use that value as the blur radius to get diffuse reflections (a rough sketch of this follows below).
As always, there's a ton of ways to achieve the (un)expected result :)
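A very rough sketch of the second idea (assuming the floor lies at y = 0 and that mountains and floorMesh already exist; the blur pass and the screen-space projection on the floor are deliberately left out):

const reflectionTarget = new THREE.WebGLRenderTarget(window.innerWidth, window.innerHeight);

function renderWithReflection() {
  // 1. Render the mountains mirrored about the floor plane into an offscreen target.
  //    Negative scale flips the triangle winding, so the material may need side: THREE.DoubleSide.
  mountains.scale.y = -1;
  floorMesh.visible = false;
  renderer.setRenderTarget(reflectionTarget);
  renderer.render(scene, camera);
  renderer.setRenderTarget(null);
  mountains.scale.y = 1;
  floorMesh.visible = true;

  // 2. A blur pass over reflectionTarget would go here (e.g. a full screen quad with a blur shader).

  // 3. Render the regular scene on top; the floor material samples reflectionTarget.texture in screen space.
  renderer.render(scene, camera);
}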
Does Three.js have a function or capability for AI (artificial intelligence)? Specifically, let's say an FPS game: I want enemies to look for me and try to kill me. Is that possible in three.js? Does it have such functionality or a system for it?
Webgl
create buffer
bind buffer
allocate data
set up state
issue draw call
run GLSL shaders
three.js
create a 3d context using WebGL
create 3 dimensional objects
create a scene graph
create primitives like spheres, cubes, toruses
move objects around, rotate them, scale them
test for intersections between rays, triangles, planes, spheres, etc.
create 'materials' (rather than shaders)
javascript
write algorithms
I want enemies to look for me and try to kill me
Yes, three.js is capable of doing this, you just have to write an algorithm using three's classes. Your enemies would be 3d objects, casting rays, intersecting with other objects, etc.
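For example, a line-of-sight check is just a raycast - a hedged sketch (enemy, player and the obstacles array are hypothetical objects in your scene):

const raycaster = new THREE.Raycaster();

function enemyCanSeePlayer(enemy, player, obstacles) {
  const direction = new THREE.Vector3()
    .subVectors(player.position, enemy.position)
    .normalize();
  raycaster.set(enemy.position, direction);
  const hits = raycaster.intersectObjects(obstacles, true);
  const distanceToPlayer = enemy.position.distanceTo(player.position);
  // The player is visible if nothing blocks the ray before it reaches them.
  return hits.length === 0 || hits[0].distance > distanceToPlayer;
}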
You would be building a game engine, and you could use three.js as your rendering framework within that engine. Rendering is just one part of it. Think of a 2d shooter, you could make it using a 2d context, but you could also enhance it and make it 2.5d, by working with a 3d context. Everything else can stay the same.
Is there any WebGL engine that might have it? Or is it just not a WebGL thing?
Unity probably has everything you can possibly think of. Unity is capable of outputting WebGL, so it could be considered a 'webgl engine'.
Babylon.js is more engine-like.
Three.js is the best and most powerful WebGL 3D engine, with no equal on the market, and it's missing out on such an ability.
Three.js isn't exactly a 3d engine. Wikipedia says:
Three.js is a lightweight cross-browser JavaScript library/API used to
create and display animated 3D computer graphics on a Web browser.
Three.js uses WebGL.
So if I just need to draw a car, or a spinning logo, I don't need them to come looking for me or try to shoot me. I just need them to stay in one place and rotate.
For a graphics demo you don't even need this - with a few draw instructions, you could render a full screen quad with a very elaborate pixel shader. Three gives you a ton of options, especially if you consider all the featured examples.
It works both ways: while you can expand three.js any way you want, you can also strip it down for a very specific purpose.
If you need to build an app that does image processing and features no '3d' graphics, you could still leverage WebGL with three.js.
You don't need any vector, matrix, ray or geometry classes.
If you don't have Vector3, you probably can't keep PlaneGeometry, but you would use BufferGeometry and manually construct a plane. No transformations need to happen, so there is no need for matrix classes. You'd use shaders and textures, and perhaps something like the EffectComposer.
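A minimal sketch of that stripped-down idea (assuming an existing renderer and a loaded THREE.Texture called sourceTexture; the grayscale shader is just a stand-in for whatever image processing you need):

const scene = new THREE.Scene();
const camera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);

// One oversized triangle covering the whole screen, built by hand with BufferGeometry.
const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(
  new Float32Array([-1, -1, 0, 3, -1, 0, -1, 3, 0]), 3));
geometry.setAttribute('uv', new THREE.BufferAttribute(
  new Float32Array([0, 0, 2, 0, 0, 2]), 2));

const material = new THREE.ShaderMaterial({
  uniforms: { tDiffuse: { value: sourceTexture } },
  vertexShader: `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = vec4(position, 1.0); // no camera transform needed
    }`,
  fragmentShader: `
    uniform sampler2D tDiffuse;
    varying vec2 vUv;
    void main() {
      vec4 c = texture2D(tDiffuse, vUv);
      gl_FragColor = vec4(vec3(dot(c.rgb, vec3(0.299, 0.587, 0.114))), 1.0); // grayscale, as an example
    }`
});

const triangle = new THREE.Mesh(geometry, material);
triangle.frustumCulled = false; // the vertices ignore the camera, so skip culling
scene.add(triangle);
renderer.render(scene, camera);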
I'm afraid not. Three.js is just an engine for displaying 3d content.
Using it to create games is only one possibility. A few engines do come with pre-coded features like AI (among other things) to attract game creators, but using them is more restrictive than writing the exact code you need.
Three.js itself doesn't, but https://mugen87.github.io/yuka/ is a great AI engine that can work in collaboration with three.js to create AI.
It has line-of-sight and shooting game logic, as well as car logic, which I've been playing around with recently; there is a React Three Fiber example here: https://codesandbox.io/s/loving-tdd-u1fs9o
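A hedged sketch of pairing Yuka with three.js (enemyMesh is assumed to be an existing THREE.Mesh, and the seek target is made up):

import * as YUKA from 'yuka';

const entityManager = new YUKA.EntityManager();
const time = new YUKA.Time();

const enemy = new YUKA.Vehicle();
enemy.setRenderComponent(enemyMesh, (entity, renderComponent) => {
  renderComponent.matrix.copy(entity.worldMatrix); // keep the three.js mesh in sync with the AI entity
});
enemyMesh.matrixAutoUpdate = false;

enemy.steering.add(new YUKA.SeekBehavior(new YUKA.Vector3(10, 0, 5)));
entityManager.add(enemy);

function animate() {
  requestAnimationFrame(animate);
  entityManager.update(time.update().getDelta());
  renderer.render(scene, camera);
}
animate();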
I'm struggling with a visualization I'm working on that involves a stream of repeated images. I have it working with a single sprite with a ParticleSystem, but I can only apply a single material to the system. Since I want to choose between textures I tried creating a pool of Particle objects so that I could choose the materials individually, but I can't get an individual Particle to show up with the WebGL renderer.
This is my first foray into WebGL/Three.js, so I'm probably doing something bone-headed, but I thought it would be worth asking what the proper way to go about this is. I'm seeing three possibilities:
I'm using Particle wrong (initializing with a mapped material, adding to the scene, setting position) and I need to fix what I'm doing.
I need a ParticleSystem for each sprite I want to display.
What I'm doing doesn't fit into particles at all and I really should be using another object type.
All the examples I see using the canvas renderer use Particle directly, but I can't find an example using the WebGL renderer that doesn't use ParticleSystem. Any hints?
OK, I am going from what I have read elsewhere, on this GitHub issues page - you should start by reading it. It seems that Particle is simply for the CanvasRenderer, and it will become Sprite in a future edition of Three.js. ParticleSystem, however, is not going to fulfill your needs either, it seems. I don't think these classes are going to help you accomplish this in WebGL in 3D. Depending on what you are doing, you might be better off with the CanvasRenderer anyway. ParticleSystem will only allow you to apply a single material, which will serve as the material for each particle in the system, as you suggested.
Short answer:
You can render THREE.Particle using THREE.CanvasRenderer only.
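In later versions of three.js, the WebGL-friendly way to get one texture per sprite is a THREE.Sprite with its own SpriteMaterial (the Sprite class that the answer above says Particle turns into) - a hedged sketch with placeholder texture paths:

const loader = new THREE.TextureLoader();
const textureA = loader.load('imageA.png');
const textureB = loader.load('imageB.png');

function makeSprite(texture, x, y, z) {
  const sprite = new THREE.Sprite(new THREE.SpriteMaterial({ map: texture }));
  sprite.position.set(x, y, z);
  return sprite;
}

scene.add(makeSprite(textureA, 0, 0, 0));
scene.add(makeSprite(textureB, 2, 0, 0)); // a different texture per sprite, unlike ParticleSystem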