Properly scaling textures in three.js / proctree.js

Apologies for the vague title, I'm not sure how to describe my issue.
I'm trying to create a forest in three.js with the very cool proctree.js. The library seems to create a 3d model of the tree's trunk and main branches, then adds simple flat textures for the leaves (or 'twigs').
The resulting tree looks very nice from up close but as you zoom out the leaves visually disappear almost entirely. This is a problem as I'm trying to create a dense looking forest. See the following two screengrabs (or this online viewer):
Is there a way to prevent the leaves from becoming very pixelated and thin looking from a distance? Or, to phrase the question differently, how would one create good-looking leaves that look as dense from a distance as they do from up close?
The material used looks like this:
this.twigMaterial = new THREE.MeshStandardMaterial({
    color: this.config.twigColor,
    roughness: 1.0,
    metalness: 0.0,
    map: this.textureLoader.load('assets/twig-1.png'),
    alphaTest: 0.9
});

Your problem sounds very similar to this one
I'm pretty certain that the smaller-resolution mipmaps (used when you zoom out) are blending your leaf textures and changing the alpha values against which the alphaTest threshold is compared. The further away you are, the more of your texture's area is considered "transparent".
You can modify your texture properties as follows to disable mipmaps:
var texture = this.textureLoader.load('assets/twig-1.png');
texture.minFilter = THREE.LinearFilter;
this.twigMaterial = new THREE.MeshStandardMaterial({
    color: this.config.twigColor,
    roughness: 1.0,
    metalness: 0.0,
    map: texture,
    alphaTest: 0.9
});
However, this might give your leaves an aliased look. Alternatively, you could build your own mipmaps and assign them to the texture's .mipmaps array to get a more custom look.
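For the custom-mipmap route, a rough sketch might look like the following. It assumes the twig texture is square and power-of-two, and the alpha-hardening step is just one hypothetical way to keep distant leaves from falling under the alphaTest threshold:

var image = new Image();
image.src = 'assets/twig-1.png';
image.onload = function () {
    var mips = [];
    // Build each mip level by drawing the twig into progressively smaller canvases.
    for (var size = image.width; size >= 1; size = Math.floor(size / 2)) {
        var canvas = document.createElement('canvas');
        canvas.width = canvas.height = size;
        var ctx = canvas.getContext('2d');
        ctx.drawImage(image, 0, 0, size, size);
        // Push partially transparent pixels back towards opaque so the small
        // levels still pass alphaTest: 0.9.
        var pixels = ctx.getImageData(0, 0, size, size);
        for (var i = 3; i < pixels.data.length; i += 4) {
            if (pixels.data[i] > 64) pixels.data[i] = 255;
        }
        ctx.putImageData(pixels, 0, 0);
        mips.push(canvas);
    }
    var texture = new THREE.Texture(mips[0]);
    texture.mipmaps = mips;          // use the hand-built levels
    texture.generateMipmaps = false; // keep the renderer from regenerating them
    texture.needsUpdate = true;
    twigMaterial.map = texture;      // twigMaterial stands in for this.twigMaterial above
    twigMaterial.needsUpdate = true;
};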

Related

Weird effects with AddEquation

I'm trying to render multiple radial gradients, with the goal of having their colors add up where they overlap. I'm using a very simple three.js setup for that; see the fiddle here.
However, I'm getting an unexpected effect - the borders of overlapping gradients seem to be making the other gradients darker, see the dark lines in this screenshot:
I don't understand why this is happening. If I understand the OpenGL documentation correctly, GL_FUNC_ADD should simply add the component values (and, I assume, clamp to 1.0). I'm using GL_SRC_ALPHA and GL_ONE (or rather, their equivalents in three.js) for source/destination factors, e.g.
const mat = new THREE.MeshBasicMaterial({
    alphaMap: grad2,
    blending: THREE.CustomBlending, // similar with THREE.AdditiveBlending
    blendEquation: THREE.AddEquation,
    blendSrc: THREE.SrcAlphaFactor,
    blendDst: THREE.OneFactor,
});
What am I missing?
It is doing what you want, I think. The borders are not darker; it is the other areas that are brighter.
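To make that concrete, this is roughly the per-pixel arithmetic that THREE.AddEquation with SrcAlphaFactor and OneFactor implies (illustrative numbers only, not taken from the fiddle):

// framebuffer = src.rgb * src.a + dst.rgb * 1.0, clamped to 1.0
function blend(src, dst) {
    return src.rgb.map(function (c, i) {
        return Math.min(1.0, c * src.a + dst.rgb[i]);
    });
}

// Near the faint edge of a red gradient drawn over a green one:
blend({ rgb: [1, 0, 0], a: 0.1 }, { rgb: [0, 0.4, 0] }); // ~[0.1, 0.4, 0]
// Deeper inside the overlap the source alpha is higher, so the sum is larger:
blend({ rgb: [1, 0, 0], a: 0.6 }, { rgb: [0, 0.4, 0] }); // ~[0.6, 0.4, 0]
// Nothing is ever subtracted; the apparent dark lines are just lower sums
// sitting next to brighter overlap regions.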

Workaround of disabling depth testing for transparent objects?

In my scene I've got only transparent objects, so with depth testing enabled the objects hide each other. I know depth testing doesn't consider transparency; it just writes z values to the depth buffer. How, then, do I render two transparent objects correctly?
I tried renderer.context.disable(renderer.context.DEPTH_TEST); but nothing changed.
An illustration of my concrete problem:
the cube is MeshLambertMaterial({color: ..., transparent: true, opacity: 0.6})
and the plane is MeshLambertMaterial({color: ..., transparent: true, opacity: 0.4})
The cube is rendered after the plane, but if the cube were opaque the whole of it would be rendered correctly without anything being discarded (note the points: they are also opaque and hence visible).
So how do I make rendering take transparency into account, regardless of the render order, so that two transparent objects don't hide each other?
In three.js, you can turn off depth testing by setting
material.depthTest = false;
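Applied to the two materials from the question, that might look like this (a sketch; the colors are placeholders):

var cubeMaterial = new THREE.MeshLambertMaterial({
    color: 0x2194ce,   // placeholder
    transparent: true,
    opacity: 0.6,
    depthTest: false   // stop the depth buffer from discarding whatever is behind
});
var planeMaterial = new THREE.MeshLambertMaterial({
    color: 0x999999,   // placeholder
    transparent: true,
    opacity: 0.4,
    depthTest: false
});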
Don't be surprised if you have artifacts when the camera position is changed.
You might also want to read this answer.
three.js r.80

Threejs doesn't render pointcloud shading, but only flat colors

I am trying to show a point cloud in three.js, but the result is always flat (not affected by light) and no shading is rendered. Is there a way to make it more realistic, with shading and shadows (something like MeshLab, for example), or is this a limitation of three.js?
I am using a THREE.Points object with a THREE.PointsMaterial material. I tried the option vertexColors: THREE.VertexColors, but only flat colors appear.
points = new THREE.Points(geometry, new THREE.PointsMaterial({
    size: 1.2,
    vertexColors: THREE.VertexColors
}));
Compare the three.js rendering on the left with the MeshLab rendering on the right.

three.js create texture from cubecamera

When using a cube camera one normally sets the envMap of the material to the cubeCamera.renderTarget, e.g.:
var myMaterial = new THREE.MeshBasicMaterial({
    color: 0xffffff,
    envMap: myCubeCamera.renderTarget,
    side: THREE.DoubleSide
});
This works great for meshes that are meant to reflect or refract what the cube camera sees. However, I'd like to simply create a texture and apply that to my mesh. In other words, I don't want my object to reflect or refract. I want the face normals to be ignored.
I tried using a THREE.WebGLRenderTarget, but it won't handle a cube camera. And using a single perspective camera with a WebGLRenderTarget obviously does not give me a 360° texture.
Finally, simply assigning the cubeCamera.renderTarget to the 'map' property of the material doesn't work either.
Is it possible to do what I want?
r73.
Edit: this is not what the author of the question is looking for, I'll keep my answer below for other people
Your envmap is already a texture, so there's no need to apply it as a map. Also, cubemaps and regular textures are structurally different, so you can't simply swap them; even if you managed to, the result would probably not be what you expect.
From what you're asking, I understand you want a static envmap rather than one that is updated every frame. If that's the case, simply don't call myCubeCamera.updateCubeMap() in your render function. Instead, place it at the end of your scene initialization, with the cube camera at the desired position; your envmap will then show only that one frame.
See examples below:
Dynamic Cubemap Example
Static Cubemap Example
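In code, that boils down to running the cube camera update once during setup instead of inside the render loop (a sketch using the r73-era updateCubeMap API mentioned above; the position is whatever vantage point you want baked into the envmap):

// During initialization, after the scene is built:
myCubeCamera.position.set(0, 10, 0);          // hypothetical vantage point
myCubeCamera.updateCubeMap(renderer, scene);  // render the six faces once

var myMaterial = new THREE.MeshBasicMaterial({
    color: 0xffffff,
    envMap: myCubeCamera.renderTarget,        // now effectively static
    side: THREE.DoubleSide
});

// ...and no updateCubeMap() call inside the render loop.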
The answer is: Set the refractionRatio on the material to 1.0. Then face normals are ignored since no refraction is occurring.
In a normal situation where the Cube Camera is in the same scene as the mesh, this would be pointless because the mesh would be invisible. But in cases where the Cube Camera is looking at a different scene, then this is a useful feature.
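As a rough sketch of that accepted approach, reusing the question's cube camera (note: depending on the three.js revision, the cubemap may also need its mapping set to THREE.CubeRefractionMapping before refractionRatio has any effect):

var myMaterial = new THREE.MeshBasicMaterial({
    color: 0xffffff,
    envMap: myCubeCamera.renderTarget,
    refractionRatio: 1.0,   // a ratio of 1.0 means no bending, so face normals stop mattering
    side: THREE.DoubleSide
});
// If needed for your revision:
// myCubeCamera.renderTarget.texture.mapping = THREE.CubeRefractionMapping;
// The cube camera should be looking at a different scene than the one containing
// this mesh, as described above.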

Create a concave halfsphere with Three.js

I'm a web developer with a good JavaScript experience, and I'm currently exploring Three.js possibilities. However, I'm not very familiar with 3D shapes and vocabulary, and I can't figure out how to build the shape I need.
I want to create a halfsphere and be able to project a video inside this sphere. I have a panoramic spherical video, and I need to distort it to make it look like a "plane".
Thanks to Paul's tutorial, I have drawn a sphere and projected my video onto it. But the outer surface of the sphere is convex, and I need a concave one! How can I achieve that? By extruding a smaller sphere out of my initial one?
You can create a half-sphere by setting the additional SphereGeometry parameters:
const geometry = new THREE.SphereGeometry( radius, widthSegments, heightSegments, phiStart, phiLength, thetaStart, thetaLength )
Experiment until you get exactly what you want.
You will also have to set the side property of the material you use for the sphere to either be THREE.BackSide or THREE.DoubleSide.
material.side = THREE.DoubleSide;
three.js r.143
You can use SphereBufferGeometry to create a half sphere (hemisphere). The last argument does it: 0.5 * Math.PI. Also, to be able to see something, you need to use THREE.DoubleSide for the material.
var geometry = new THREE.SphereBufferGeometry(5, 8, 6, 0, 2*Math.PI, 0, 0.5 * Math.PI);
...
material.side = THREE.DoubleSide;
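Tying it back to the video-projection goal, a minimal sketch could look like this (it assumes a playing <video> element with id "panorama" and an existing scene; the geometry arguments mirror the answer above):

var video = document.getElementById('panorama');
var videoTexture = new THREE.VideoTexture(video);

var geometry = new THREE.SphereBufferGeometry(5, 32, 16, 0, 2 * Math.PI, 0, 0.5 * Math.PI);
var material = new THREE.MeshBasicMaterial({
    map: videoTexture,
    side: THREE.BackSide   // render the inside of the hemisphere, where the camera sits
});
var dome = new THREE.Mesh(geometry, material);
scene.add(dome);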
