Does anyone know how to make the spotlight reflection here: http://web251.merkur.ibone.ch/webgl/three/ look like the one here: http://web251.merkur.ibone.ch/webgl ? That is, so that it gets reflected toward the camera? It's strange that it doesn't do so automatically; if you move around the planet with the mouse you'll notice. In this scene everything is static except the camera, and moving the camera also moves the eye point E, right? So I'd expect the spotlight reflection on the planet to be recalculated continuously, e.g. with Blinn's half-vector, producing a highlight on the planet between E and the spotlight.
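(To spell out what I mean by Blinn's half-vector, here is a rough sketch in three.js vector math; lightPos, cameraPos, P, N and shininess are placeholders, not code from our scene:)
// All THREE.Vector3 placeholders: P = surface point, N = its unit normal.
var L = lightPos.clone().sub( P ).normalize();  // direction from P to the light
var V = cameraPos.clone().sub( P ).normalize(); // direction from P to the eye E
var H = L.clone().add( V ).normalize();         // Blinn's half-vector
var specular = Math.pow( Math.max( N.dot( H ), 0 ), shininess );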
Help is really appreciated; we've searched for hours but couldn't find a clue as to what was wrong with our code!
Thanks in advance
Doidel
The first bit of starting code I used for playing with this was:
http://mrdoob.github.com/three.js/examples/webgl_materials_shaders.html
The key is adding a specular map via the specularMap property of the (Phong) material.
This can be done as follows:
var mySpecularMap = THREE.ImageUtils.loadTexture( "MySpecularImage.jpg" );
var myColorMap = THREE.ImageUtils.loadTexture( "MyColorImage.jpg" );
var mappedMaterial = new THREE.MeshPhongMaterial( { color: 0xffffff, map: myColorMap, specular: 0xffffff, specularMap: mySpecularMap } );
sphere = new THREE.SphereGeometry( 600, 32, 32 );
globe = new THREE.Mesh( sphere, mappedMaterial );
scene.add( globe );
Also, for this type of demo, OrbitControls seems to be the best fit.
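A typical setup looks something like this (a sketch; OrbitControls lives in the three.js examples folder and has to be included as a separate script):
var controls = new THREE.OrbitControls( camera, renderer.domElement );
controls.target.set( 0, 0, 0 ); // orbit around the globe's center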
Here's a finished example with {ColorMap, SpecularMap, BumpMap, Clouds, SkyDome}:
http://randompast.github.io/randomtests/three.js/earth/1/index.html
Related
I'm trying to render shadows using the latest version of three.js (r102), and I'm not sure what I am doing wrong. I am using MeshPhongMaterial with castShadow and receiveShadow set on all relevant meshes, and a directional light facing towards the scene content. Could someone take a look at this and help me figure out how to get these shadows working? Thanks!
Live demo (toggle shadows in the menu):
https://argonjs.github.io/three-web-layer/
Source:
https://github.com/argonjs/three-web-layer
If you add a small cube in front of your WebLayer3D, it correctly casts shadows on rendered DOM layers:
// in app.ts, just after the light with the shadow camera is set up:
let geometryBox = new THREE.BoxBufferGeometry( 0.01, 0.01, 0.01 )
let materialRed = new THREE.MeshPhongMaterial( {color: 0xff0000} )
let cubeSmall = new THREE.Mesh( geometryBox, materialRed )
cubeSmall.position.set( 0.1, -0.03, 0.1 )
cubeSmall.castShadow = true
cubeSmall.receiveShadow = true
scene.add( cubeSmall )
So only the planes produced by WebLayer3D fail to cast shadows; the scene/camera/light setup itself is correct.
Update: the explanation below is not the actual reason; see the solution using material.shadowSide in another answer.
If you look at the tree of objects on the three.js side (i.e. traverse through children[]), starting with todoLayer, a lot of them will have castShadow set to false. You will have to rethink your strategy here. Also note that castShadow = false on a parent Object3D turns it off for its children.
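A quick way to inspect and flip those flags is Object3D.traverse. A sketch (whether enabling them wholesale is the right strategy for WebLayer3D is up to you):
// Walk todoLayer and all of its descendants, enabling the shadow flags.
todoLayer.traverse( function ( obj ) {
    obj.castShadow = true;
    obj.receiveShadow = true;
} );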
I figured it out after the hint from Alex (thanks Alex!).
Basically, as strange as it seems, a plane in three.js will not cast shadows unless it is double-sided (update: or unless material.shadowSide is set to THREE.FrontSide). Once I set THREE.DoubleSide on the plane material, it worked as expected. In short, for a textured plane to cast shadows, the following is needed (as of three.js r102):
var mesh = new THREE.Mesh(
    new THREE.PlaneGeometry( 1, 1, 2, 2 ),
    new THREE.MeshPhongMaterial( {
        map: texture,
        side: THREE.DoubleSide, // important!
        alphaTest: 0.1
    } )
);
mesh.customDepthMaterial = new THREE.MeshDepthMaterial( {
    map: texture, // so alpha-tested holes are respected in the shadow map
    depthPacking: THREE.RGBADepthPacking,
    alphaTest: 0.1
} );
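Per the update above, the material can instead stay single-sided if its shadowSide is set explicitly. A minimal sketch (material stands for the MeshPhongMaterial instance above):
// Alternative to side: THREE.DoubleSide, available as of r102:
material.shadowSide = THREE.FrontSide;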
I also had to adjust the light's shadow bias in order to eliminate artifacts.
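For reference, that adjustment looks something like this (the exact value is scene-dependent; -0.0001 here is purely illustrative):
// Small negative bias to suppress self-shadowing artifacts ("shadow acne"):
light.shadow.bias = -0.0001;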
I just discovered this piece of code, which produces a nice Halo effect:
Shader-Halo
I'm planning to remove the texture and replace the SphereGeometry with a pipe or a line or something, which will then act as part of the lightning (I'm planning to chain a couple of glowing pipes/lines so they appear as a lightning bolt).
So my question is: how do I change this part of the code so it draws a pipe/line instead of a sphere:
var ballGeometry = new THREE.SphereGeometry( 120, 32, 16 );
var ball = new THREE.Mesh( ballGeometry, customMaterial );
scene.add( ball );
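(One possible direction, sketched with THREE.CylinderGeometry as a stand-in for the pipe; customMaterial is the halo material from the linked code, and all dimensions are placeholders:)
// radiusTop, radiusBottom, height, radialSegments, heightSegments, openEnded
var pipeGeometry = new THREE.CylinderGeometry( 10, 10, 240, 16, 1, true );
var pipe = new THREE.Mesh( pipeGeometry, customMaterial );
scene.add( pipe );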
I have some code similar to the following...
this.texture = THREE.ImageUtils.loadTexture( 'spritesheet.png' ); // loadTexture is a plain function call, not a constructor
this.material = new THREE.MeshBasicMaterial( { map: this.texture, side:THREE.DoubleSide } );
this.geometry = new THREE.PlaneGeometry(32, 32, 1, 1);
this.sprite = new THREE.Mesh( this.geometry, this.material );
game.scene.add( this.sprite );
I've also tried something along the lines of...
this.material = new THREE.SpriteMaterial( {
    map: image,
    useScreenCoordinates: true,
    alignment: THREE.SpriteAlignment.center
} );
this.sprite = new THREE.Sprite( this.material );
These display the full spritesheet (sort of), as I would expect without further settings.
How do I align the sprite so it only displays, say, a 32x32px region starting at offset (50, 60)? The three.js documentation doesn't seem to have much information, and the examples I've seen tend to use one image per sprite (which may be preferable, or the only way possible?).
Edit: I've spotted uvOffset and uvScale properties on the material that I suspect are related to alignment in a Sprite object, if anyone knows how these work. Will dig further.
Well, there are uvOffset and uvScale parameters on SpriteMaterial; I think you could use those, but I cannot present any tested source code to you.
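Still, an untested sketch of what that might look like, assuming both are THREE.Vector2 values and the sheet measures sheetW x sheetH pixels (note the v axis runs bottom-up):
// Show the 32x32 region whose top-left corner sits at pixel (50, 60).
this.material.uvScale.set( 32 / sheetW, 32 / sheetH );
this.material.uvOffset.set( 50 / sheetW, 1 - ( 60 + 32 ) / sheetH );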
What you can of course do is use PlaneGeometry and calculate the UV coordinates for its two triangles yourself. The top-left corner comes from your offset, and the bottom-right corner from the offset plus the region size (32x32); dividing both by the whole image size in pixels gives UV coordinates between 0 and 1.
For example, the top-left is (50/imgSize, 60/imgSize) and the bottom-right is ((50+32)/imgSize, (60+32)/imgSize). I think this should work, although I am not quite sure you would get exactly the result you want, as OpenGL treats images "upside down" (the v axis runs bottom-up). But you can try and go on from here. Hope this helps.
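The same arithmetic can also be applied through the texture transform rather than by editing face UVs, which avoids touching the geometry at all. A sketch (sheetW/sheetH are the sheet's pixel dimensions; clone the texture if several sprites share one sheet):
var tex = texture.clone(); // per-sprite copy so each can have its own offset
tex.needsUpdate = true;
tex.repeat.set( 32 / sheetW, 32 / sheetH );
tex.offset.set( 50 / sheetW, 1 - ( 60 + 32 ) / sheetH );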
I use CubeCamera to build a simple reflection model. The setup can be seen in the picture below.
If the camera is close enough to the cube, the reflection looks fine. However, if I move away from the objects, the reflection just gets bigger. See the picture below.
This is not the way I want it; I'd like the reflection to get proportionally smaller. I tried playing with different settings, then thought this could be achieved with a proper shader program (squishing the cube texture, kind of), so I tried to modify the existing Phong shader, but no luck there; I'm too much of a newbie for that.
Also, I've noticed that if I change the width and height of cubeCamera.renderTarget, i.e.
cubeCamera.renderTarget.width = cubeCamera.renderTarget.height = 150;
I can get the proper dimensions of the reflection, but its position on the surface is wrong. It's visible from the angle shown in the picture below, but not if I place the camera straight on. It looks like the texture needs to be centered.
The actual code is pretty straightforward:
var cubeCamera = new THREE.CubeCamera( 1, 520, 512 );
cubeCamera.position.set( 0, 1, 0 );
cubeCamera.renderTarget.format = THREE.RGBAFormat;
scene.add( cubeCamera );

var reflectorObj = new THREE.Mesh(
    new THREE.CubeGeometry( 20, 20, 20 ),
    new THREE.MeshPhongMaterial( {
        envMap: cubeCamera.renderTarget,
        reflectivity: 0.3
    } )
);
reflectorObj.position.set( 0, 0, 0 );
scene.add( reflectorObj );

var reflectionObj = new THREE.Mesh(
    new THREE.SphereGeometry( 5 ),
    new THREE.MeshBasicMaterial( {
        color: 0x00ff00
    } )
);
reflectionObj.position.set( 0, -5, 20 );
scene.add( reflectionObj );

function animate () {
    // hide the reflector while rendering the cube map so it doesn't see itself
    reflectorObj.visible = false;
    cubeCamera.updateCubeMap( renderer, scene );
    reflectorObj.visible = true;

    renderer.render( scene, camera );
    requestAnimationFrame( animate );
}
Appreciate any help!
Environment mapping in three.js is based on the assumption that the object being reflected is "infinitely" far away from the reflective surface.
The reflected ray used in the environment map look-up does not emanate from the surface of the reflective material, but from the CubeCamera's center. This approximation is OK, as long as the reflected object is sufficiently far away. In your case it is not.
You can read more about this topic in this tutorial.
three.js r.58
I would like somebody to explain to me how I can achieve the blue semi-transparent blinking sphere in this example (the one next to the blinking red sphere):
http://threejs.org/examples/webgl_materials.html
I believe, in the first place, that this is a matter of using the right material with the right settings (especially since the example is about materials), but I'm not sure.
Hopefully you do not feel my question does not deserve to be asked here. I tried to analyze the example, but it is definitely not written in a newbie-friendly way; I've not been able to separate this part from the rest, nor have I found an explanation anywhere else.
To create, for example, a partially transparent blue sphere, you could try:
var sphereGeom = new THREE.SphereGeometry( 40, 32, 16 );
var blueMaterial = new THREE.MeshBasicMaterial( { color: 0x0000ff, transparent: true, opacity: 0.5 } );
var sphere = new THREE.Mesh( sphereGeom, blueMaterial );
For more examples of creating semi-transparent materials, check out
http://stemkoski.github.io/Three.js/Translucence.html
If you want the sphere to fade in and out, you can change the opacity in your update or render function. Make the sphere a global object, and also create a (global) clock object in your initialization to keep track of time, for example with
clock = new THREE.Clock();
and then in your update, you could, for example, write
sphere.material.opacity = 0.5 * (1 + Math.sin( clock.getElapsedTime() ) );
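Putting it together, a minimal render-loop sketch (renderer, scene, camera, sphere and clock are assumed to exist from your setup code):
function animate() {
    requestAnimationFrame( animate );
    // sin() swings between -1 and 1, so the opacity sweeps smoothly between 0 and 1
    sphere.material.opacity = 0.5 * ( 1 + Math.sin( clock.getElapsedTime() ) );
    renderer.render( scene, camera );
}
animate();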