We are trying to create a sphere/globe that sends out translucent rays. At the surface a ray is solid white, but the further it goes out into space the more transparent it becomes, as you can see in the image: what we have at the moment is shown on the left, and what we want is depicted on the right. We also want each ray to look like a cone of light that disperses the further it goes out into space.
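One possible way to sketch that fade (this is not from the original post; the ray length, cone radii, and the assumption that the globe is a unit sphere centered at the origin are all made up for illustration) is a cone-shaped mesh with a ShaderMaterial whose alpha goes from 1 at the globe surface to 0 at the far end:

import * as THREE from 'three';

const rayLength = 2.0; // assumed length of one ray in world units

// CylinderGeometry(radiusTop, radiusBottom, ...): small radius at the bottom (surface),
// large radius at the top (space) gives the dispersing cone shape.
const rayGeo = new THREE.CylinderGeometry(0.5, 0.05, rayLength, 32, 1, true);
rayGeo.translate(0, rayLength / 2, 0); // narrow end sits at the local origin

const rayMat = new THREE.ShaderMaterial({
  transparent: true,
  depthWrite: false,
  side: THREE.DoubleSide,
  blending: THREE.AdditiveBlending,
  uniforms: { uRayLength: { value: rayLength } },
  vertexShader: `
    uniform float uRayLength;
    varying float vAlpha;
    void main() {
      // position.y runs from 0 (globe surface) to uRayLength (far end in space)
      vAlpha = 1.0 - position.y / uRayLength;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: `
    varying float vAlpha;
    void main() {
      // solid white at the surface, fully transparent at the far end
      gl_FragColor = vec4(1.0, 1.0, 1.0, vAlpha);
    }
  `
});

const ray = new THREE.Mesh(rayGeo, rayMat);
// Assuming the globe is a unit sphere centered at the origin: place the ray on the
// surface and point its +Y axis outward along the surface normal.
const surfacePoint = new THREE.Vector3(1, 0, 0); // hypothetical point on the globe
ray.position.copy(surfacePoint);
ray.quaternion.setFromUnitVectors(new THREE.Vector3(0, 1, 0), surfacePoint.clone().normalize());
// scene.add(ray); // add to your existing scene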
Related
I want to render a room with a floor + roof that is open to one side. The room contains a point light, and the "outside" is lit by an ambient light (the sun). There is one additional requirement: the user should be able to look inside the room to see what's going on. But I cannot simply remove the roof, because then the room is fully lit by the ambient light.
I think my problem could be solved by having 3D objects that are transparent but still block the light.
To give you an idea of my current scene, this is what it looks like:
The grey thing is the wall of my room. The black thing is the floor of the room. The green thing is the ground of the scene. The room contains a point light.
I am currently using two scenes (see Exclude Area from Directional/Ambient Lighting) because I wanted the inside of the room to be unaffected by the ambient light. But now my lights can only affect either the inside of my room (the point light) OR the outside (the ambient light) but not both.
A runnable sample of my scene can be found here:
https://codesandbox.io/s/confident-worker-64kg7m?file=/src/index.js
Again: I think that my problem could be solved by having transparent objects that still block the light. If I had that, I would simply put a 3D plane on top of my room (as the roof) and make it transparent... It would block the light that is inside of the room (but still let it escape where the room is open), and it would also block the ambient light (partially, since the room is open)...
Maybe there is also another solution that I am not seeing.
Just use one scene instead of two, then enable shadows across the relevant meshes so a light doesn't cross from inside to outside. Once you're using only one scene, the steps to take in your demo are:
Disable AmbientLight, and use DirectionalLight only, since AmbientLight illuminates everything indiscriminately, and that's not what you want.
Place the directional light above your structure, so it shines from the top-down.
Enable shadow-casting on the walls (see the sketch after the ceiling snippet below).
Add a ceiling mesh with the material's side set to THREE.BackSide. This renders only the back side of the mesh, which means it won't be visible from above, but it will still cast shadows.
const roomCeilMat = new MeshStandardMaterial({
  side: BackSide // render only the back face: invisible from above, still casts shadows
});
// Reuse the floor geometry for the ceiling, lifted to the top of the walls
const roomCeiling = new Mesh(roomFloorGeo, roomCeilMat);
roomCeiling.position.set(0, 0, 1);
roomCeiling.castShadow = true;
scene1.add(roomCeiling);
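For the other steps (disabling the AmbientLight, positioning the DirectionalLight, and enabling shadow-casting on the walls), a rough sketch could look like the following; the names wall, roomFloor, ground and renderer are assumptions, not the actual names in your sandbox:

renderer.shadowMap.enabled = true; // shadow maps are off by default

const sun = new DirectionalLight(0xffffff, 1);
sun.position.set(0, 0, 10); // above the structure, shining top-down (z is up in your scene)
sun.castShadow = true;
scene1.add(sun);

wall.castShadow = true;        // the walls block the light
roomFloor.receiveShadow = true;
ground.receiveShadow = true;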
See here for a working copy of your demo:
https://codesandbox.io/s/stupefied-williams-qd7jmi?file=/src/index.js
I would assign a flat, emissive material to the room (or a depth gradient if it becomes terrain), since ambient light doesn't cast shadows anyway. That saves a light and extra geometry or groups, and web model viewers would probably render it better. If you're doing a reveal transition, use a clip plane or a texture alpha mask.
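A rough sketch of that idea (the plane orientation, the clip constant, and the material values are assumptions, not from a working demo): give the room a material that looks lit without any light reaching it, and reveal the interior with a clipping plane instead of removing the roof:

// Flat, emissive material: visible even with no lights in the scene
const roomMat = new MeshStandardMaterial({
  color: 0x000000,
  emissive: 0x888888
});

// Reveal transition via a local clipping plane
renderer.localClippingEnabled = true;
const revealPlane = new Plane(new Vector3(0, 0, -1), 1.5); // clips everything above z = 1.5
roomMat.clippingPlanes = [revealPlane];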
It depends on the presentation versus the output format. Also it depends on the complexity of the final floorplan. If your process is simple it will run Sims Lite on a Raspberry Voxel.
I'm trying to make a very simple shadow: a plane with a shader showing a radial gradient in color and alpha.
Beneath this shadow lies another plane with the same kind of shader, but with a linear gradient.
And as a background of all this, a linear gradient from dark blue to light blue.
The problem is that when my camera approaches the ground, the plane of the shadow masks the floor.
Why does it happen and what can I do to prevent that?
https://codesandbox.io/s/epic-sun-po9j3
https://po9j3.csb.app/
You'd need to post code to check for sure, but it likely happens because three.js sorts the order in which it draws things based on the center of each object and its distance from the camera.
You can force a different order by setting Object3D.renderOrder.
three.js also generally draws opaque things before transparent things, so my guess is that your ground plane and your shadow plane are both set to transparent: true. The ground can be set to transparent: false, in which case it will be drawn first.
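As a quick sketch of that (ground and shadowPlane are just guesses at your object names):

ground.material.transparent = false;     // opaque: drawn first, writes depth normally

shadowPlane.material.transparent = true;
shadowPlane.material.depthWrite = false; // common extra tweak so the shadow never blocks things behind it in the depth buffer
shadowPlane.renderOrder = 1;             // force the shadow to be drawn after the ground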
You might find this article useful. It shows a similar example.
As for why there is a hole: it's because of the depth buffer. If something in front gets drawn first, then the pixels behind it are not drawn. So if the shadow happens to be drawn first, it ends up looking like a hole, because the pixels of the plane behind it are not drawn.
See this
I am currently using three.js and trying to create a 3D experience with a semi-transparent material that can be viewed from all angles. I've noticed that, depending on the camera angle, only certain portions of the mesh are semi-transparent and will show the content behind them. In the example below I've created two half cylinders and applied the same transparent material with the Stack Overflow logo. The half cylinder on the left properly shows the logo on the closest surface as well as on the surface behind it. The half cylinder on the right only shows the logo on the closest surface and fails to render the logo that wraps behind it. However, it does properly render the background image, so the material is still being treated as transparent:
If I spin the orbital camera around 180 degrees, the side that originally failed to see through now works, and the other side exhibits the wrong behavior. This leads me to believe it's related to the camera position / depth sorting. The material in this case is a standard MeshPhongMaterial with transparent set to true, side set to DoubleSide, and a single map for the transparent Stack Overflow logo. The geometry is formed from an open-ended CylinderGeometry. Any help would be greatly appreciated!
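One common workaround for this kind of sorting problem (not from the original question; logoTexture and halfCylinderGeo are placeholder names) is to split the double-sided material into a BackSide mesh and a FrontSide mesh sharing the same geometry, so the inner surface is always drawn before the outer one:

const backMat = new THREE.MeshPhongMaterial({ map: logoTexture, transparent: true, side: THREE.BackSide });
const frontMat = new THREE.MeshPhongMaterial({ map: logoTexture, transparent: true, side: THREE.FrontSide });

const inside = new THREE.Mesh(halfCylinderGeo, backMat);
const outside = new THREE.Mesh(halfCylinderGeo, frontMat);
inside.renderOrder = 1;  // draw the far (inside) surface first...
outside.renderOrder = 2; // ...then blend the near (outside) surface on top
scene.add(inside, outside);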
Basically I have a hexagonal mesh on the XY plane, upon which I draw a pseudo-randomly generated landscape.
Then to decide which face is going to be water and which land, I check for white pixels per face. If white pixels > black pixels, it's land, otherwise water.
The way I do it right now is to render the buffer offscreen, and then for each pixel on the canvas I raycast to find which face the pixel belongs to, and then sum up all the pixels per face.
Problem is... the canvas is 1000x700 pixels, and it takes AGES to raycast 700,000 pixels.
So the question is... is there any faster/easier way to know which face is located at an arbitrary (x, y) pixel on the canvas, without having to raycast the entire mesh to death?
I've found another solution which performs much faster: 10-15 seconds for a 1000x700 viewport, instead of 7+ minutes.
I render to an offscreen buffer, then I calculate the center of each face in screen coordinates. Then I just run a simple flood fill algorithm starting from each face center, with the yellow wireframe pixels as the bounds. That way I account for every pixel in every face.
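A rough sketch of the face-center projection step (assuming an indexed BufferGeometry; hexMesh, camera, width and height are placeholder names matching the offscreen render, and the flood fill itself is omitted):

const pos = hexMesh.geometry.getAttribute('position');
const index = hexMesh.geometry.getIndex();
const a = new THREE.Vector3(), b = new THREE.Vector3(), c = new THREE.Vector3();
const center = new THREE.Vector3();

const faceCenters = [];
for (let i = 0; i < index.count; i += 3) {
  // centroid of the face's three vertices, in local space
  a.fromBufferAttribute(pos, index.getX(i));
  b.fromBufferAttribute(pos, index.getX(i + 1));
  c.fromBufferAttribute(pos, index.getX(i + 2));
  center.copy(a).add(b).add(c).divideScalar(3);

  hexMesh.localToWorld(center); // into world space
  center.project(camera);       // into normalized device coordinates (-1..1)

  faceCenters.push({
    face: i / 3,
    x: Math.round((center.x + 1) * 0.5 * width),  // NDC -> pixel coordinates
    y: Math.round((1 - center.y) * 0.5 * height)
  });
}
// Each entry then seeds a flood fill bounded by the yellow wireframe pixels.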
I'm trying to create a shadow in my orthographic scene in three.js. I'd like to have a directional light so the shadow is offset from all objects equally in the scene. I am however having problems using DirectionalLight.
My first problem is that I can't get the shadow to cover the entire scene; only part of it ever has a shadow. I played with the light's frustum settings, but can't figure out how to get it to cover the scene. Ideally I'd want the frustum to match that of the camera.
The second problem is that the shadows aren't "clean". If I use a SpotLight the shadows have nice crisp borders (but obviously not the universal directionality I want). When I use a DirectionalLight the borders are misshapen and blurry.
In the samples the tile is simply a box created with CubeGeometry.
How can I create an orthographic directional light source for my scene?
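In case it helps, here is a sketch of the usual setup with the current three.js API (the bounds, position and the name camera for your orthographic camera are assumptions; the idea is to make the light's shadow box match what the camera sees):

const light = new THREE.DirectionalLight(0xffffff, 1);
light.position.set(50, 100, 50);
light.castShadow = true;

// DirectionalLight shadows use an OrthographicCamera internally; widen its box
// so it covers the whole visible scene instead of the small default region.
light.shadow.camera.left = camera.left;
light.shadow.camera.right = camera.right;
light.shadow.camera.top = camera.top;
light.shadow.camera.bottom = camera.bottom;
light.shadow.camera.near = 1;
light.shadow.camera.far = 500;

// A larger shadow map sharpens the blurry, misshapen edges
light.shadow.mapSize.width = 2048;
light.shadow.mapSize.height = 2048;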