three.js transparent objects in multiple scenes not working - three.js

I am using multiple scenes as a workaround for selective lighting. Now I am running into a difficulty with transparent objects.
For simplicity, I created a jsfiddle illustration:
https://jsfiddle.net/curisiro/w9ke75ma/2/
I have two transparent squares which are in different scenes. The problem is I can see the blue square behind the red square (figure 1) but I can NOT see the red square behind the blue square (figure 2).
Because the materials are used with other effects, depthTest and depthWrite must stay at their default value of true.
Is there a way to solve this problem?

Edit: If you insist on using two scenes, you can fix this problem by clearing the depth between the renders:
function render() {
  requestAnimationFrame(render);
  renderer.clear(); // assumes renderer.autoClear has been set to false elsewhere
  renderer.render(scene, camera);
  renderer.clearDepth(); // <--- Like this
  renderer.render(scene1, camera);
}
However, this is limiting if you plan to add more complexity to the scene and need depth testing to take place between them. Alternatively, just render to the same scene:
// blue cube
let geometry = new THREE.BoxGeometry(1, 1, 1);
let material = new THREE.MeshStandardMaterial({ color: 0x0000ff, transparent: true, opacity: 0.4 });
let mesh = new THREE.Mesh(geometry, material);
scene.add(mesh);
// red cube, offset along z, added to the same scene
let geometry1 = new THREE.BoxGeometry(1, 1, 1);
let material1 = new THREE.MeshStandardMaterial({ color: 0xff0000, transparent: true, opacity: 0.4 });
let mesh1 = new THREE.Mesh(geometry1, material1);
mesh1.position.z = 2;
scene.add(mesh1);
(see forked fiddle). In this case, you would handle selective lighting some other way (layers, or custom materials perhaps, depending on what you need).
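For completeness, the merged scene then only needs a single render call per frame, which is what lets three.js depth-test and sort the two transparent meshes against each other. A minimal sketch, assuming the renderer, scene and camera from the snippet above:
// One pass over one scene: three.js sorts the transparent meshes together,
// so each cube stays visible through the other.
function render() {
  requestAnimationFrame(render);
  renderer.render(scene, camera);
}
render();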

Related

Attempts to load a texture show no error but the texture does not display

I have a model, a background sky and a ground surface. Texturing the ground surface results in the surface not appearing at all.
I've tried the basic approach and concluded that the scene is probably being rendered before the texture has finished loading. Having searched and found various possible solutions, I have tried several of them without really understanding how they are supposed to work, and none of them has worked. One problem is that this is an old issue and most of the suggestions involve outdated versions of the three.js library.
// Ground
// create a textured Ground based on an answer in Stackoverflow.
var loader = new THREE.TextureLoader();
loader.load('Textures/Ground128.jpg',
  function (texture) {
    var groundGeometry = new THREE.PlaneBufferGeometry(2000, 2000, 100, 100);
    var groundMaterial = new THREE.MeshLambertMaterial({ map: texture });
    var ground = new THREE.Mesh(groundGeometry, groundMaterial);
    ground.receiveShadow = true; // Illumination addition
    ground.rotation.x = -0.5 * Math.PI; // rotate into the horizontal.
    scene.add(ground);
  }
);
// This variation does not work either
http://lhodges.users37.interdns.co.uk/me/downloads/Aphaia/Temple.htm
http://lhodges.users37.interdns.co.uk/me/downloads/Aphaia/Temple7jsV0.15b.htm
The first of the above is the complete page in which the ground is a plain billiard table green. The second is the page containing the above code.
There appear to be no errors (last time I tried).
By the time your texture loads and you add the ground, your scene has already rendered (and there is no other render call).
You need to call renderer.render(scene, camera); after adding the ground to the scene.
// Ground
// create a textured Ground based on an answer in Stackoverflow.
var loader = new THREE.TextureLoader();
loader.load('Textures/Ground128.jpg',
  function (texture) {
    var groundGeometry = new THREE.PlaneBufferGeometry(2000, 2000, 100, 100);
    var groundMaterial = new THREE.MeshLambertMaterial({ map: texture });
    var ground = new THREE.Mesh(groundGeometry, groundMaterial);
    ground.receiveShadow = true; // Illumination addition
    ground.rotation.x = -0.5 * Math.PI; // rotate into the horizontal.
    scene.add(ground);
    renderer.render(scene, camera); // <--- add this line
  }
);
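Alternatively (a sketch only, since as noted above the page has no other render call), running a continuous animation loop means anything added after an asynchronous load, such as the textured ground, shows up on a later frame without an explicit extra render call:
// Sketch: with a render loop, the ground appears automatically once the
// texture has loaded and the mesh has been added to the scene.
function animate() {
  requestAnimationFrame(animate);
  renderer.render(scene, camera);
}
animate();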

ThreeJS: White PNG image loaded as texture, used as material and rendered as plane has grey edges

I'm having an issue when rendering a white material in ThreeJS version 87.
Here are the steps to replicate:
A white PNG image that is loaded as texture
This texture is used to create a MeshBasicMaterial (passed as parameter map)
The MeshBasicMaterial is used along a plane Geometry to create a Mesh
The Mesh is added to an empty Scene and rendered on a WebGLRenderer with alpha: true and clearColor as white
The problem is that the rendered texture now has grey edges on parts that should be fully white.
This happens with any image with white edges. I've also tried many different configurations for the renderer and the material but to no avail.
I've made a very simple CodePen that replicates the behavior as simply as possible. Does anyone know how this problem can be solved?
CodePen:
https://codepen.io/ivan-i1/pen/pZxwZX
var renderer, width, height, scene, camera, dataUrl, threeTexture, geometry, material, mesh;
width = window.innerWidth;
height = window.innerHeight;
dataUrl = '//data url from image';
threeTexture = new THREE.ImageUtils.loadTexture(dataUrl);
material = new THREE.MeshBasicMaterial({
  map: threeTexture,
  transparent: true,
  alphaTest: 0.1
});
material.needsUpdate = true;
geometry = new THREE.PlaneGeometry(5, 5);
mesh = new THREE.Mesh(geometry, material);
mesh.position.z = -5;
scene = new THREE.Scene();
scene.add(mesh);
camera = new THREE.PerspectiveCamera( 70, window.innerWidth / window.innerHeight, 1, 1000 );
renderer = new THREE.WebGLRenderer({
  alpha: true
});
document.body.appendChild( renderer.domElement );
renderer.setSize(width, height);
renderer.setClearColor( 0xffffff, 1 );
//renderer.render(scene, camera);
function render() {
  // Finally, draw to the screen
  requestAnimationFrame(render);
  renderer.render(scene, camera);
}
render();
Any help is truly appreciated.
Edit:
I think my post was lacking some precision.
This is the original full alpha image:
It might not show here because it's all white.
And this is the same image with different transparencies on 4 quadrants:
This one too might not show because it's all white.
I got a helpful answer telling me to raise the alphaTest, but the problem is that doing so wipes the transparent parts out of the images, and I need to preserve those parts.
Here is a copy of the codepen with the updated images and showing the same (but slight) grey edges:
codepen
Sorry for not being as precise the first time; any further help is even more appreciated.
Set alphaTest to 0.9 or higher and observe the improvement.
Your star texture has gray or black in the area outside the star, which is why you're seeing a gray halo. You can fix it by filling the image with white (but not changing the alpha channel) in your image editing tool.
Also, you should upgrade to the latest three.js (r95).
Edit:
I'm not sure what your exact expectation is, but there are many different settings that control alpha blending in three.js. There is renderer.premultipliedAlpha = true/false (defaults to true) and material.transparent = true/false; material.alphaTest is a threshold below which a fragment's alpha causes it to be discarded completely. There are also material.blending, .blendEquation, .blendEquationAlpha, .blendDst and .blendSrc, and so on. You probably need to read up on those.
https://threejs.org/docs/#api/materials/Material
For instance, here is your texture with:
renderer.premultipliedAlpha = false;
Notice the black border on one quadrant of your texture.
https://codepen.io/manthrax/pen/KBraNB
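For reference, a rough sketch of where those settings live (the values here are illustrative, not a recommendation; premultipliedAlpha is shown as a WebGLRenderer constructor option):
// Illustrative only: the alpha/blending knobs mentioned above.
var renderer = new THREE.WebGLRenderer({
  alpha: true,
  premultipliedAlpha: false // defaults to true
});

var material = new THREE.MeshBasicMaterial({
  map: threeTexture,
  transparent: true,              // enable alpha blending for this material
  alphaTest: 0.5,                 // discard fragments whose alpha is below this threshold
  blending: THREE.NormalBlending  // or THREE.CustomBlending with blendSrc/blendDst/blendEquation
});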

Displacement map on BoxGeometry only moves edges?

I'm just starting to get my bearings with threejs and I'm having an issue using Displacement Maps.
http://codepen.io/jpschwinghamer/pen/BWPebJ
I have a simple BoxGeometry that I'm trying to apply textures to a phong material. All seem to work correctly except for the displacement map. I made sure to add segments to the BoxGeometry instantiation. Is there some bit of magic that I'm missing to make my displacement map work correctly?
Consider this code:
var animate, camera, displacement, geometry, light, light1, map, material, mesh, normal, reflection, renderer, roughness, textureLoader;
renderer = new THREE.WebGLRenderer({
  canvas: document.querySelector('canvas'),
  antialiased: true
});
renderer.setClearColor(0xfff000);
renderer.setSize(window.innerWidth, window.innerHeight);
renderer.setPixelRatio(window.devicePixelRatio);
camera = new THREE.PerspectiveCamera(35, window.innerWidth / window.innerHeight, 0.1, 3000);
window.scene = new THREE.Scene();
light = new THREE.AmbientLight(0xffffff, 0.5);
scene.add(light);
light1 = new THREE.PointLight(0xffffff, 0.6);
scene.add(light1);
textureLoader = new THREE.TextureLoader();
textureLoader.setCrossOrigin("anonymous");
map = textureLoader.load("https://s3-us-west-2.amazonaws.com/s.cdpn.io/65874/WoodFlooring044_COL_2K.jpg");
normal = textureLoader.load("https://s3-us-west-2.amazonaws.com/s.cdpn.io/65874/WoodFlooring044_NRM_2K.jpg");
roughness = textureLoader.load("https://s3-us-west-2.amazonaws.com/s.cdpn.io/65874/WoodFlooring044_GLOSS_2K.jpg");
reflection = textureLoader.load("https://s3-us-west-2.amazonaws.com/s.cdpn.io/65874/WoodFlooring044_REFL_2K.jpg");
displacement = textureLoader.load("https://s3-us-west-2.amazonaws.com/s.cdpn.io/65874/WoodFlooring044_DISP_2K.jpg");
geometry = new THREE.BoxGeometry(100, 100, 100, 10, 10, 10);
material = new THREE.MeshPhongMaterial({
  map: map,
  normalMap: normal,
  normalScale: new THREE.Vector2(30, -1),
  roughnessMap: roughness,
  reflectionMap: reflection,
  displacementMap: displacement,
  displacementScale: 1,
  displacementBias: 0
});
mesh = new THREE.Mesh(geometry, material);
mesh.position.set(0, 0, -700);
scene.add(mesh);
animate = function() {
  mesh.rotation.x += 0.01;
  mesh.rotation.y += 0.01;
  requestAnimationFrame(animate);
  return renderer.render(scene, camera);
};
animate();
The problem with your displacement map is that there is very little variation in the shades of grey (see it here).
The lowest points should be black and the highest should be white (or the other way around, I forget).
You would also need enough vertices for this to have an effect.
I think, though, that this map should be assigned to the bump map.
But... you probably don't want to use both a bump map and a normal map; it's usually one or the other.
Having a look at the normal map you are loading, it is having no effect because it is all the same color (see it here), so it's only the bump map that you need.
Also, MeshPhongMaterial does not seem to have roughnessMap or reflectionMap uniforms/properties, so these are doing nothing.
So basically, you only need to load your map and your displacement map, but put the displacement map on the bump map instead.
You could maybe also use your roughness image as the specularMap.
Edit:
Embedding code here does not seem to run because of CORS on the images.
See a FIDDLE HERE
textureLoader = new THREE.TextureLoader();
textureLoader.setCrossOrigin("anonymous");
map = textureLoader.load("https://s3-us-west-2.amazonaws.com/s.cdpn.io/65874/WoodFlooring044_COL_2K.jpg");
//normal = textureLoader.load("https://s3-us-west-2.amazonaws.com/s.cdpn.io/65874/WoodFlooring044_NRM_2K.jpg");
roughness = textureLoader.load("https://s3-us-west-2.amazonaws.com/s.cdpn.io/65874/WoodFlooring044_GLOSS_2K.jpg");
//reflection = textureLoader.load("https://s3-us-west-2.amazonaws.com/s.cdpn.io/65874/WoodFlooring044_REFL_2K.jpg");
displacement = textureLoader.load("https://s3-us-west-2.amazonaws.com/s.cdpn.io/65874/WoodFlooring044_DISP_2K.jpg");
// Create material
material = new THREE.MeshPhongMaterial({
  map: map,
  //normalMap: normal,
  //normalScale: new THREE.Vector2(30, -1),
  //roughnessMap: roughness,
  //reflectionMap: reflection,
  bumpMap: displacement,
  bumpScale: 100,
  //displacementBias: 0
  specularMap: roughness
});
I believe the problem is that the BoxGeometry only contains 4 vertices per side. When you apply a displacement map, you actually move the positions of the vertices where a normal map would "fake" this on a fragment level.
To solve your problem, I would try a geometry with a higher number of vertices. A sphere works fine for testing. Then import or generate a box geometry with more vertices.
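As an illustration only (reusing the map and displacement textures loaded in the question; the segment counts and displacementScale are arbitrary values chosen to make the effect obvious):
// Sketch: a dense sphere gives the displacement map real vertices to move.
var denseGeometry = new THREE.SphereGeometry(50, 128, 128); // radius, widthSegments, heightSegments
var displacedMaterial = new THREE.MeshPhongMaterial({
  map: map,
  displacementMap: displacement,
  displacementScale: 10, // exaggerated so the displacement is easy to see
  displacementBias: 0
});
var displacedMesh = new THREE.Mesh(denseGeometry, displacedMaterial);
displacedMesh.position.set(0, 0, -700);
scene.add(displacedMesh);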
Hope this helps!

Why does this ThreeJs plane appear to get a kink in it as the camera moves down the y-axis?

I have an instance of THREE.PlaneBufferGeometry that I apply an image texture to like this:
var camera, scene, renderer;
var geometry, material, mesh, light, floor;
scene = new THREE.Scene();
THREE.ImageUtils.loadTexture( "someImage.png", undefined, handleLoaded, handleError );
function handleLoaded(texture) {
  var geometry = new THREE.PlaneBufferGeometry(
    texture.image.naturalWidth,
    texture.image.naturalHeight,
    1,
    1
  );
  var material = new THREE.MeshBasicMaterial({
    map: texture,
    overdraw: true
  });
  floor = new THREE.Mesh(geometry, material);
  floor.material.side = THREE.DoubleSide;
  scene.add(floor);
  camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 1, texture.image.naturalHeight * A_BUNCH);
  camera.position.z = texture.image.naturalWidth * 0.5;
  camera.position.y = SOME_INT;
  camera.lookAt(floor.position);
  renderer = new THREE.CanvasRenderer();
  renderer.setSize(window.innerWidth, window.innerHeight);
  appendToDom();
  animate();
}
function handleError() {
  console.log(arguments);
}
function appendToDom() {
  document.body.appendChild(renderer.domElement);
}
function animate() {
  requestAnimationFrame(animate);
  renderer.render(scene, camera);
}
Here's the code pen: http://codepen.io/anon/pen/qELxvj?editors=001
(Note: three.js "pollutes" the global scope, to use a harsh term, and then decorates THREE using a decorator pattern, relying on scripts loading in the correct order without using a module loader system. So, for brevity's sake, I simply copy-pasted the source code of a few required decorators into the code pen to ensure they load in the right order. You'll have to scroll down several thousand lines to the bottom of the code pen to play with the code that instantiates the plane, paints it and moves the camera.)
In the code pen, I simply lay the plane flat in the x-y plane, facing straight up the z-axis, as it were. Then I slowly pan the camera down along the y-axis, continuously pointing it at the plane.
As you can see in the code pen, as the camera moves along the y-axis in the negative direction, the texture on the plane appears to develop a kink in it around West Texas.
Why? How can I prevent this from happening?
I've seen similar behaviour, not in three.js or in a browser with WebGL, but with DirectX and vvvv; still, I think you'll just have to set the widthSegments/heightSegments of your PlaneBufferGeometry to a higher value (> 4) and you're set!
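In the question's handleLoaded callback, that would mean changing only the two segment arguments; a sketch (8 is an arbitrary starting value):
// Sketch: more segments give the texture mapping more triangles to follow,
// which reduces the visible distortion as the camera moves.
var geometry = new THREE.PlaneBufferGeometry(
  texture.image.naturalWidth,
  texture.image.naturalHeight,
  8, // widthSegments (was 1)
  8  // heightSegments (was 1)
);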

How to draw a custom cube in three.js?

I just want to draw one cube, such as a building. I have its footprint on the ground, which consists of four points, and I also know the height of the building. How do I draw this? Thanks very much.
This is pretty straightforward:
http://threejs.org/docs/#Reference/Extras.Geometries/CubeGeometry
// Note: CubeGeometry was renamed to BoxGeometry in later three.js releases.
var building = new THREE.CubeGeometry(width, height, depth, 1, 1, 1);
var material = new THREE.MeshLambertMaterial({ color: 0xFF0000 });
var mesh = new THREE.Mesh(building, material);
scene.add(mesh);
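If the four ground points form an axis-aligned rectangle (an assumption; for an arbitrary quadrilateral footprint you would extrude a THREE.Shape instead), a sketch for sizing and placing the box from the footprint and the known height might look like this:
// Sketch: size and place a box from four footprint corners (points with x/z
// coordinates on the ground plane, y = 0) and a known building height.
function buildingFromFootprint(corners, height) {
  var xs = corners.map(function (p) { return p.x; });
  var zs = corners.map(function (p) { return p.z; });
  var minX = Math.min.apply(null, xs), maxX = Math.max.apply(null, xs);
  var minZ = Math.min.apply(null, zs), maxZ = Math.max.apply(null, zs);

  var geometry = new THREE.CubeGeometry(maxX - minX, height, maxZ - minZ);
  var material = new THREE.MeshLambertMaterial({ color: 0xFF0000 });
  var mesh = new THREE.Mesh(geometry, material);

  // Center the box over the footprint and lift it so its base sits on the ground.
  mesh.position.set((minX + maxX) / 2, height / 2, (minZ + maxZ) / 2);
  return mesh;
}

scene.add(buildingFromFootprint(corners, height)); // corners: array of four points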
