Make environment map scale when moving away from the object - three.js

I use CubeCamera to build a simple reflection model. The setup can be seen in the picture below.
If the camera is close enough to the cube, the reflection looks fine. However, if I move away from the objects, the reflection just gets bigger. See the picture below.
This is not what I want. I'd like the reflection to get proportionally smaller instead. I tried playing with different settings, then I thought this could be achieved with a proper shader program (squishing the cube texture, kind of), so I tried to modify the existing Phong shader, but no luck there; I'm too new to this.
Also, I've noticed that if I change the width and height of cubeCamera.renderTarget, i.e.
cubeCamera.renderTarget.width = cubeCamera.renderTarget.height = 150;
I can get the proper dimensions of the reflection, but its position on the surface is wrong. It's visible from the angle shown in the picture below, but not visible if I place the camera straight on. It looks like the texture needs to be centered.
The actual code is pretty straightforward:
var cubeCamera = new THREE.CubeCamera(1, 520, 512);
cubeCamera.position.set(0, 1, 0);
cubeCamera.renderTarget.format = THREE.RGBAFormat;
scene.add(cubeCamera);

var reflectorObj = new THREE.Mesh(
    new THREE.CubeGeometry(20, 20, 20),
    new THREE.MeshPhongMaterial({
        envMap: cubeCamera.renderTarget,
        reflectivity: 0.3
    })
);
reflectorObj.position.set(0, 0, 0);
scene.add(reflectorObj);

var reflectionObj = new THREE.Mesh(
    new THREE.SphereGeometry(5),
    new THREE.MeshBasicMaterial({
        color: 0x00ff00
    })
);
reflectionObj.position.set(0, -5, 20);
scene.add(reflectionObj);

function animate () {
    // Hide the reflector while updating the cube map so it doesn't capture itself.
    reflectorObj.visible = false;
    cubeCamera.updateCubeMap(renderer, scene);
    reflectorObj.visible = true;

    renderer.render(scene, camera);
    requestAnimationFrame(animate);
}
Appreciate any help!

Environment mapping in three.js is based on the assumption that the object being reflected is "infinitely" far away from the reflective surface.
The reflected ray used in the environment map look-up does not emanate from the surface of the reflective material, but from the CubeCamera's center. This approximation is OK, as long as the reflected object is sufficiently far away. In your case it is not.
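To see why, here is the look-up math CPU-side in plain three.js (a minimal sketch; viewDir and normal are hypothetical values, not taken from your scene). Only directions enter the computation, so moving the camera further away while keeping the same viewing direction samples the exact same texel:

// Unit direction from the camera toward the surface point.
var viewDir = new THREE.Vector3( 0, 0, -1 );
// Surface normal of the reflective face.
var normal = new THREE.Vector3( 0, 0, 1 );

// Standard reflection formula: r = v - 2 * (v . n) * n.
// The surface point's position never appears here, which is why
// the reflected image does not shrink with distance.
var lookup = viewDir.clone().sub( normal.clone().multiplyScalar( 2 * viewDir.dot( normal ) ) );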
You can read more about this topic in this tutorial.
three.js r.58

Related

Why doesn't transparency work on my mesh?

I have two meshes in my scene: a cylinder and a classic plane in the middle.
I applied a PNG texture to my cylinder so we can see through it. It seems to work for the cylinder.
On this screenshot you'll easily see my issue: I don't understand why my image is not visible behind my cylinder.
The code I used for my cylinder:
myCylinderMesh.material.transparent = true;
myCylinderMesh.material.side = THREE.DoubleSide;
How can I manage to see the part of the image hidden behind the cylinder?
EDIT 1:
I added the code that @ScieCode sent me:
myCylinderMesh.material.alphaTest = 0.5;
Here's the result:
It works better: now I can see the missing part of my image. But there's still one thing missing: the opacity of my cylinder. I'm supposed to see my image behind the letters too.
Currently I have this opacity:
myCylinderMesh.material.opacity = 0.7;
Do you know what I am missing? Thanks
EDIT 2:
Here's the code for my two meshes:
Cylinder:
geoCylinder = new THREE.CylinderBufferGeometry( 0.4, 0.4, 2*Math.PI*0.4/(2048/128), 64, 1, true );
matCylinder = new THREE.MeshBasicMaterial( { map:texture, transparent:true, color:0x000000, alphaTest: 0.5, opacity: 0.6, side: THREE.DoubleSide } );
meshCylinder = new THREE.Mesh( geoCylinder, matCylinder );
Plane:
geoPlane = new THREE.PlaneBufferGeometry( 0.8, 0.8 );
matPlane = new THREE.MeshBasicMaterial( { map: texturePlane, transparent:true} );
meshPlane = new THREE.Mesh( geoPlane, matPlane );
This behavior happens because of how transparency rendering works internally. Transparent objects need to be sorted and rendered separately from opaque objects; this ensures that objects render as expected in the final image (not always, though).
The problem here is that your plane geometry sits inside the cylinder geometry, so when sorted it gets rendered either entirely before or entirely after the cylinder, which in turn causes the artifacts you are experiencing. Transparency rendering as a whole is a lot more complex than I'm making it out to be here.
Since your plane object doesn't need to be translucent, you can simply set the alphaTest property of its material, so that only fragments with alpha greater than this value are rendered. This also prevents the object from being treated as transparent, so it is rendered together with the opaque objects, fixing the artifacts in your scene.
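A minimal sketch of that change, using the plane material from your EDIT 2 (dropping transparent: true is intentional here, since alpha-tested fragments are either fully kept or fully discarded):

// Only fragments with alpha > 0.5 are rendered; the plane is then
// sorted and drawn together with the opaque objects.
matPlane = new THREE.MeshBasicMaterial( { map: texturePlane, alphaTest: 0.5 } );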
JSFiddle
Additional info: when using a transparent material with DoubleSide, you might experience self-transparency problems. This happens for the same reason I just explained, but between faces of the same object. A possible solution is to set depthWrite = false, which prevents the object from writing to the depth buffer; every face then gets rendered, regardless of whether another face occludes it.
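Applied to the cylinder, that workaround could look like this (a sketch, not necessarily the final settings you want):

// Keep the cylinder translucent, but stop it from occluding its own back faces.
matCylinder.transparent = true;
matCylinder.depthWrite = false;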

Difficulty with sprite texture alignment

I have some code similar to the following...
this.texture = THREE.ImageUtils.loadTexture( 'spritesheet.png' );
this.material = new THREE.MeshBasicMaterial( { map: this.texture, side:THREE.DoubleSide } );
this.geometry = new THREE.PlaneGeometry(32, 32, 1, 1);
this.sprite = new THREE.Mesh( this.geometry, this.material );
game.scene.add( this.sprite );
I've also tried along the lines of...
this.material = new THREE.SpriteMaterial( {
    map: image,
    useScreenCoordinates: true,
    alignment: THREE.SpriteAlignment.center
} );
this.sprite = new THREE.Sprite( this.material );
These display the full spritesheet (sort of), as I would expect without further settings.
How do I align the sprite so it only displays, say, 32x32px starting at offset 50,60? The three.js documentation doesn't seem to have much information on this, and the examples I've seen tend to use one image per sprite (which may be preferable, or the only way possible?).
Edit: I've spotted a material uvOffset and uvScale that I suspect are related to alignment in a Sprite object, if anyone knows how these work. Will dig further.
Well, there are "uvOffset" and "uvScale" parameters in SpriteMaterial; I think you could use those, but I cannot present any source code to you.
What you can of course do is use a PlaneGeometry and calculate UV coordinates for the two triangles (the plane). The top-left is your offset, and the bottom-right is calculated from the given offset and the size (32x32), using the whole image size in pixels to map the UV coordinates into the range 0 to 1.
For example, the top-left is (50/imageSize, 60/imageSize) and the bottom-right is ((50+32)/imageSize, (60+32)/imageSize). I think this should work, although I am not quite sure you would get exactly the result you want, as OpenGL treats images "upside down". But you can try and go on from here. Hope this helps.
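Here is a minimal sketch of that math, applied through the texture's offset/repeat transform instead of editing the UVs of each triangle (sheetWidth and sheetHeight are assumed dimensions of spritesheet.png; note the Y flip, since the UV origin is at the bottom-left):

var sheetWidth = 256, sheetHeight = 256; // assumed spritesheet size in pixels
var texture = THREE.ImageUtils.loadTexture( 'spritesheet.png' );

// Show a 32x32 region whose top-left corner is at pixel (50, 60).
texture.repeat.set( 32 / sheetWidth, 32 / sheetHeight );
texture.offset.set( 50 / sheetWidth, 1 - ( 60 + 32 ) / sheetHeight );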

Intermittent semi-transparent sphere in Three.js

I would like somebody to explain to me how I can achieve the blue semi-transparent intermittent sphere in this example (the one next to the intermittent red sphere):
http://threejs.org/examples/webgl_materials.html
I believe, first of all, that this is a matter of using the right material with the right settings (especially since the example is about materials), but I'm not sure.
I hope you don't feel this question doesn't deserve to be asked here. I tried to analyze the example, but it is definitely written in a way that is unfriendly to newbies; I haven't been able to separate this part from the rest, nor have I found an explanation anywhere else.
To create, for example, a partially transparent blue sphere, you could try:
var sphereGeom = new THREE.SphereGeometry( 40, 32, 16 );
var blueMaterial = new THREE.MeshBasicMaterial( { color: 0x0000ff, transparent: true, opacity: 0.5 } );
var sphere = new THREE.Mesh( sphereGeom, blueMaterial );
For more examples of creating semi-transparent materials, check out
http://stemkoski.github.io/Three.js/Translucence.html
If you want the sphere to fade in and out, you can change the transparency in your update or render function. Make the sphere a global object, and also create a (global) clock object in your initialization to keep track of the time, for example with
clock = new THREE.Clock();
and then in your update, you could, for example, write
sphere.material.opacity = 0.5 * (1 + Math.sin( clock.getElapsedTime() ) );
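Putting it together, the render loop could look like this (a sketch; renderer, scene, camera and clock are assumed to exist from your setup):

function animate() {
    // Oscillates the opacity between 0.0 and 1.0.
    sphere.material.opacity = 0.5 * ( 1 + Math.sin( clock.getElapsedTime() ) );
    renderer.render( scene, camera );
    requestAnimationFrame( animate );
}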

How to make reflective materials change when camera rotates

I was able to make some nice metal- and glass-looking materials by using a skybox cube / environment mapping.
I have made my own controls, which allow one to both orbit and move/look around, as in FirstPersonControls.
The problem is, the reflections only look convincing when I move around: I can see them move and change according to my camera movement. However, when I look around (rotate the camera / change its target), there is no change in the reflections; they are just static.
I can see the same behaviour in, for example, three.js/examples/webgl_materials_cubemap_escher.html: if I modify it to use FirstPersonControls, the material does not look reflective/refractive at all when I look around.
Here's how I set up the cubemaps. To be honest, it's copied from some example and I don't understand all of it, but it works, except for this one issue...
createSkyBox = function(urlPrefix) {
    var sceneCube = new THREE.Scene();

    var path = urlPrefix;
    var format = '.jpg';
    var urls = [
        path + 'px' + format, path + 'nx' + format,
        path + 'py' + format, path + 'ny' + format,
        path + 'pz' + format, path + 'nz' + format
    ];

    var reflectionCube = THREE.ImageUtils.loadTextureCube( urls );
    reflectionCube.format = THREE.RGBFormat;

    var refractionCube = new THREE.Texture( reflectionCube.image, new THREE.CubeRefractionMapping() );
    refractionCube.format = THREE.RGBFormat;

    // Skybox
    var shader = THREE.ShaderUtils.lib[ "cube" ];
    shader.uniforms[ "tCube" ].value = reflectionCube;

    var material = new THREE.ShaderMaterial( {
        fragmentShader: shader.fragmentShader,
        vertexShader: shader.vertexShader,
        uniforms: shader.uniforms,
        depthWrite: false,
        side: THREE.BackSide
    } );

    var size = 8000;
    mesh = new THREE.Mesh( new THREE.CubeGeometry( size, size, size ), material );
    mesh.geometry.computeBoundingBox();
    sceneCube.add( mesh );

    this._threejs_cube_scene = sceneCube;
    this._threejs_cube_mesh = mesh;
    this._threejs_envmap = reflectionCube;
    this._threejs_envmap_refraction = refractionCube;

    this._threejs_scene.add( sceneCube );
}
And here's the way I create the material:
var material = new THREE.MeshLambertMaterial( { color: 0xff00, ambient: 0xaaaaaa, envMap: this._threejs_envmap } );
I then use the material in renderer.overrideMaterial (I'm using EffectComposer, if it makes any difference).
EDIT: Now that I think about it, I'm not sure... my brain melts... it might be how real life works :) At least intuitively, when I see the code in action, the staticness while rotating the camera doesn't feel right. But maybe that's because in real life it's hard to look around (eye.lookAt()) without also moving ever so slightly (eye.position = xyz).
You should calculate the reflection vector in world space (inside your code for 'fragmentShader', which you don't show here). If it's in object space or view (camera) space, it won't move naturally.
Yes, this may mean some finagling with the surface normals. To convert object-space normals to world-space normals, use the inverse transpose of the world matrix. You'll also need the view vector in world-space coordinates in order to calculate the final world-space reflection vector.
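For reference, here is that math CPU-side in plain three.js (a sketch with assumed variables: mesh, objectNormal and worldPoint are hypothetical; in your shader this happens per fragment):

// Inverse transpose of the world matrix, as a 3x3 normal matrix.
var normalMatrix = new THREE.Matrix3().getNormalMatrix( mesh.matrixWorld );
var worldNormal = objectNormal.clone().applyMatrix3( normalMatrix ).normalize();

// View vector in world space: from the camera toward the surface point.
var worldView = worldPoint.clone().sub( camera.position ).normalize();

// World-space reflection vector for the cube map look-up:
// r = v - 2 * (v . n) * n
var worldReflect = worldView.clone()
    .sub( worldNormal.clone().multiplyScalar( 2 * worldView.dot( worldNormal ) ) );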
Another thing to consider, simpler than changing the shader, is giving your camera an offset if you want it to rotate like a human head. Add the camera to an Object3D, offset it from the Object3D's position by a small amount (equivalent to the distance from the center of the head to the eye), and then rotate the Object3D instead of the camera, as sketched below.
It's sort of hard to tell from your description what effect you want, though, because when you simply turn your eyeballs, a reflection doesn't change. It's the slight tilt of your head that changes it.
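A sketch of that head-pivot setup (the 0.1 eye offset is an arbitrary assumed value in your scene's units):

var head = new THREE.Object3D();
head.add( camera );
camera.position.set( 0, 0, 0.1 ); // the eye sits slightly away from the pivot
scene.add( head );

// Rotating the pivot now also translates the eye a little,
// which is what makes the reflections change.
head.rotation.y += 0.01;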

Three.js custom objLoader geometry lighting

I have this object I'm loading with THREE.objLoader and then create a mesh with it like so:
mesh = THREE.SceneUtils.createMultiMaterialObject(
    geometry,
    [
        new THREE.MeshBasicMaterial( { color: 0xFEC1EA } ),
        new THREE.MeshBasicMaterial( {
            color: 0x999999,
            wireframe: true,
            transparent: true,
            opacity: 0.85
        } )
    ]
);
In my scene I then add a DirectionalLight. It works and I can see my object; however, it's as if the DirectionalLight were an ambient one: no face gets darker or lighter as it should.
The object is filled with the color, but no lighting is applied to it.
If someone can help me with that it would be much appreciated :)
What could I be missing?
Jsfiddle here: http://jsfiddle.net/5hcDs/
Ok folks, thanks to Maël Nison and mr doob I was able to understand the few things I was missing, being the total 3d noob that I am... I believe people starting to get into 3d may find a little recap useful:
Basic 3d concepts
A 3d face is made of some points (vertices) and a vector called the normal, which indicates the direction of the face (which side is the front and which is the back).
Not having normals can be really bad, because by default lighting is applied to the front side only. Hence the black model when trying to apply a LambertMaterial or PhongMaterial.
An OBJ file is a way to describe 3D information. Want more info on this? Read this Wikipedia article (en). The French page also provides a cube example which can be useful for testing.
Three.js tips and tricks
When normals are not present, lighting can't be applied, hence the black render of the model. Three.js can actually compute vertex and face normals with geometry.computeVertexNormals() and/or geometry.computeFaceNormals(), depending on what's missing.
When you do so, there's a chance Three.js' normal calculation will be wrong and your normals will be flipped; to fix this, you can simply loop through your geometry's faces array like so:
/* Compute normals */
geometry.computeFaceNormals();
geometry.computeVertexNormals();

/* The next 3 lines seem not to be mandatory */
mesh.geometry.dynamic = true;
mesh.geometry.__dirtyVertices = true;
mesh.geometry.__dirtyNormals = true;

mesh.flipSided = true;
mesh.doubleSided = true;

/* Flip normals */
for ( var i = 0; i < mesh.geometry.faces.length; i++ ) {
    mesh.geometry.faces[i].normal.x = -1 * mesh.geometry.faces[i].normal.x;
    mesh.geometry.faces[i].normal.y = -1 * mesh.geometry.faces[i].normal.y;
    mesh.geometry.faces[i].normal.z = -1 * mesh.geometry.faces[i].normal.z;
}
You have to use a MeshPhongMaterial: MeshBasicMaterial does not take light into account when computing a fragment's color.
However, when using a MeshPhongMaterial, your mesh becomes black. I've never used the OBJ loader, but are you sure your model's normals are right?
Btw: you probably want to use a PointLight instead, and its position should probably be set to the camera position (light.position = camera.position should do the trick, as it will allow the light to move whenever the camera position is changed by the controls).
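A minimal sketch of that suggestion (assumed names; scene and camera come from your setup):

// Phong materials react to lights; basic materials don't.
var material = new THREE.MeshPhongMaterial( { color: 0xFEC1EA } );

// A point light that follows the camera by sharing its position object.
var light = new THREE.PointLight( 0xffffff );
light.position = camera.position;
scene.add( light );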