Using the default built-in collision animator of Irrlicht, I have found that it works only on one side of the polygons of my geometry.
I have used the following code:
selector = smgr->createOctreeTriangleSelector(
q3node->getMesh(), q3node, 128);
q3node->setTriangleSelector(selector);
ICameraSceneNode* camera =
smgr->addCameraSceneNodeFPS(0, 100.0f, .3f, ID_IsNotPickable, 0, 0, true, 3.f);
ISceneNodeAnimator* anim = smgr->createCollisionResponseAnimator(
selector, camera, core::vector3df(30,50,30),
core::vector3df(0,-10,0), core::vector3df(0,30,0));
selector->drop();
camera->addAnimator(anim);
anim->drop();
Moreover I have noticed that the geometry is not textured on the other side.
Any suggestions?
Thanks in advance.
The concept of hiding the back side of polygons is called backface culling.
To disable it on your geometry, do the following before performing any other actions:
q3node->setMaterialFlag(EMF_BACKFACE_CULLING, false);
Alternatively, you may create your geometry with normals on both sides. This would even make Irrlicht do fewer computations.
No need to disable backface culling then.
Here is a picture of my problem; as you can see, my shadows aren't smooth on my ShaderMaterial, which is a copy of PhongMaterial.
So far I have tried all of these solutions found on the internet, but nothing seems to work:
groundGeometry.computeFaceNormals();
groundGeometry.computeVertexNormals();
groundGeometry.verticesNeedUpdate = true;
renderer.shadowMapType = THREE.PCFShadowMap; //And all options possible
renderer.shadowMapCullFace = THREE.CullFaceBack; //And the other one
renderer = new THREE.WebGLRenderer({ antialias : true });
I also played with the different parameters of my SpotLight (and also tried with a DirectionalLight):
shadowMapWidth
shadowMapHeight
shadowCameraLeft
shadowCameraRight
shadowCameraBottom
shadowCameraTop
shadowCameraNear
shadowCameraFar
shadowBias
To be honest I don't know what to do now :/
So any help would be very appreciated!
It is probably caused by self-shadowing. Try setting a small shadowBias value like 0.05 or -0.05.
If your scene permits, reduce the shadow map frustum as much as possible by changing the shadowCamera options so that it fits exactly the objects you want shadowed.
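With the legacy shadow properties you already list, those two suggestions could look roughly like this (only a sketch; the light type and all the numbers are placeholders you would tune for your scene):
var light = new THREE.DirectionalLight(0xffffff, 1);
light.castShadow = true;
light.shadowBias = -0.05; // small bias against self-shadowing; tune the sign and magnitude
light.shadowMapWidth = 2048; // higher resolution gives smoother shadow edges
light.shadowMapHeight = 2048;
light.shadowCameraNear = 1; // shrink the shadow camera frustum so it
light.shadowCameraFar = 200; // barely encloses the objects casting/receiving shadows
light.shadowCameraLeft = -50;
light.shadowCameraRight = 50;
light.shadowCameraTop = 50;
light.shadowCameraBottom = -50;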
If the angle between the light and the normal at some point of the object is nearly 90 degrees, you can try scaling the bias depending on the slope. There is an approach called slope-scaled bias that consists in scaling the bias according to the angle between the light vector and the normal vector. The formula is:
float slope_bias = bias * tan(acos(dot(normal,-lightDirection)));
I hope it helps a little bit.
In my current project, I need a way to outline a mesh. This colored outline will represent the object's current state, which is relevant for me.
The problem is that it is a custom mesh, loaded using JSONLoader.
I've tried different approaches, following (mainly) these 2 examples: https://stemkoski.github.io/Three.js/Outline.html and
THREEx.geometricglow. In both cases, I scale the mesh outline to be a bit bigger than the original. My main problem is that scaling equally on all axes will not cover my object the way I intended.
Here is the code I'm using:
var outlineMaterial2 = new THREE.MeshBasicMaterial( { color: 0x00ff00, side: THREE.BackSide, transparent: true, opacity:0.5 } );
var outlineMesh2 = new THREE.Mesh( object.geometry, outlineMaterial2 );
outlineMesh2.position.copy(object.position);
outlineMesh2.scale.copy(object.scale);
outlineMesh2.scale.multiplyScalar(1.1)
scene.add( outlineMesh2 );
With a simple cube mesh, the outline will be good.
But with my custom mesh, the scale will not fit the shape correctly.
Here is a image demonstrating: http://s13.postimg.org/syujtd75z/print1.png
Also, using Stemkoski's approach, the outline mesh shows in front of the object as well, not just around it (as seen in the picture above).
My question is: how should I resize the mesh? From what I know, it might have something to do with face normals.
Thanks for your time.
I have a problem with masking in Three.js.
I want to have an outline around an object, and I did it using this tutorial:
http://www.codeproject.com/Articles/8499/Generating-Outlines-in-OpenGL
I wrote this code:
renderer.autoClear = false;
...
renderer.render(scene, camera);
...
var gl = this.world.renderer.domElement.getContext('webgl') || this.world.renderer.domElement.getContext('experimental-webgl');
gl.clearStencil(0);
gl.clear(gl.STENCIL_BUFFER_BIT);
gl.enable(gl.STENCIL_TEST);
// First pass: render the mask scene into the stencil buffer only (color writes disabled),
// writing 1 wherever a mask fragment is drawn
gl.stencilFunc(gl.ALWAYS, 1, 1);
gl.stencilOp(gl.KEEP, gl.REPLACE, gl.REPLACE);
gl.colorMask(0, 0, 0, 0);
renderer.render(sceneMask, camera);
gl.colorMask(1, 1, 1, 1);
// Second pass: render the outline scene only where the stencil is not 1, i.e. outside the mask
gl.stencilFunc(gl.NOTEQUAL, 1, 1);
gl.stencilOp(gl.KEEP, gl.REPLACE, gl.REPLACE);
renderer.render(sceneOutlines, camera);
gl.disable(gl.STENCIL_TEST);
and it works like a charm.
But I want the outline to be thicker. On Windows, web browsers use ANGLE and DirectX, so I can't render thicker lines.
(I know that I could use a copy of the object scaled along its vertex normals, but that way the outline would be thicker in some places and thinner in others.)
Then I got the idea to blur the outline.
I found this tutorial:
stemkoski.blogspot.com/2013/03/using-shaders-and-selective-glow.html
and I added a MaskPass before rendering the scene with the objects that will be blurred.
What happened then? Nothing.
I tried inverting the mask and disabling buffer clearing for the mask and render passes, but overall I don't really know what I'm doing.
Here is a jsFiddle with an example that I made:
http://jsfiddle.net/9MtGR/15/
It looks like the outline works, but I'm using an additive shader, so the green cube (which should act as the outline) is added on top of the red cube (which should receive the outline).
Is it possible to use Three.js masking in such a way that the red cube gets a green blurred outline?
Or maybe is there another way to achieve the same effect without Three.js methods?
P.S. This is a matter of life and death so it's not a joke.
When I was working on an animation that required Star Wars-like lasers, this is what helped me in the end: http://bkcore.com/blog/3d/webgl-three-js-animated-selective-glow.html
Especially this example: http://demo.bkcore.com/threejs/webgl_tron_iso.html
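For reference, the core idea of that selective glow is: render only the glowing objects into an off-screen target, blur it, and then blend it additively over the normal render. A very rough sketch with the old examples/js postprocessing passes (glowScene is assumed to hold only the objects that should glow; the final additive combination uses the small custom blend shader from the tutorial):
var glowTarget = new THREE.WebGLRenderTarget(window.innerWidth, window.innerHeight);
var glowComposer = new THREE.EffectComposer(renderer, glowTarget);
glowComposer.addPass(new THREE.RenderPass(glowScene, camera)); // glowing objects only
glowComposer.addPass(new THREE.ShaderPass(THREE.HorizontalBlurShader)); // blur the result
glowComposer.addPass(new THREE.ShaderPass(THREE.VerticalBlurShader));
// Each frame: update the blurred glow target, render the normal scene,
// then composite the two with an additive-blend ShaderPass, as the tutorial does.
glowComposer.render();
renderer.render(scene, camera);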
I'm loading a JPEG file as a light map:
var texture = new THREE.ImageUtils.loadTexture("textures/metal.jpg");
Then I apply the texture to THREE.MeshPhongMaterial
var frontMaterial = new THREE.MeshPhongMaterial( {
color: 0xfade7e,
specular: 0xffffff,
ambient: 0xaa0000,
lightMap:texture
} )
The full error message is: WebGLRenderingContext: GL ERROR: GL_INVALID_OPERATION: glDrawElements: attempt to access out of range vertices in attribute 2
Is something wrong here? The error occurs in all browsers. Three.js r.56.
As explained by #alteredq in this thread, a LightMap requires a second set of UVs.
The point of lightmaps is that they can live independently of other textures, thus giving the other textures a chance to be much higher detail. Lightmaps use their own set of UV coordinates (usually auto-generated by some light baking solution, as opposed to the artist-created primary UV set).
Using lightmaps with the same UVs as everything else doesn't make much sense, as then you could achieve basically the same result for less texture cost simply by baking the light map together with the color map (this is e.g. what Rage does; it looks fantastic but needs a boatload of textures).
Also, lightmaps should be multiplicative, not additive. A big use case for lightmaps is pre-baked shadows and ambient occlusion, so you need to be able to darken things.
So the answer to your question is that geometry.faceVertexUvs[0] contains the usual set of UVs; you need to add a second set, geometry.faceVertexUvs[1], to your geometry.
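If you don't have a dedicated lightmap UV layout yet, a quick (if crude) way to satisfy this, and to make the attribute 2 error go away, is to reuse the first set as the second one:
geometry.faceVertexUvs[1] = geometry.faceVertexUvs[0]; // second UV channel, sampled by the lightMap
geometry.uvsNeedUpdate = true; // only needed if the geometry has already been rendered once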
three.js r.56
This error occurs because the Three.js buffers are outdated. When you add textures (map, bumpMap, ...) to a mesh, you must recompose the buffers and update the UVs like this,
where ob is a THREE.Mesh, mt is a material, and tex is a texture:
tex.needsUpdate = true;
mt.map = tex;
ob.material = mt;
ob.geometry.buffersNeedUpdate = true;
ob.geometry.uvsNeedUpdate = true;
mt.needsUpdate = true;
That's all, folks!
Hope it helps.
Regards.
Sayris
I have this object that I'm loading with THREE.objLoader, and then I create a mesh with it like so:
mesh = new THREE.SceneUtils.createMultiMaterialObject(
geometry,
[
new THREE.MeshBasicMaterial({color: 0xFEC1EA}),
new THREE.MeshBasicMaterial({
color: 0x999999,
wireframe: true,
transparent: true,
opacity: 0.85
})
]
);
In my scene I then add a DirectionalLight. It works and I can see my object; however, it's as if the DirectionalLight were an ambient one: no face is getting darker or lighter as it should.
The object is filled with the color, but no lighting is applied to it.
If someone can help me with that, it would be much appreciated :)
What could I be missing?
Jsfiddle here: http://jsfiddle.net/5hcDs/
OK folks, thanks to Maël Nison and mr doob I was able to understand the few things I was missing, being the total 3D noob that I am... I believe people starting to get into 3D may find a little recap useful:
Basic 3D concepts
A 3D face is made of points (vertices) and a vector called the normal, which indicates the direction of the face (which side is the front and which is the back).
Not having normals can be really bad, because by default lighting is applied to the front side only. Hence the black model when trying to apply a LambertMaterial or PhongMaterial.
An OBJ file is a way to describe 3D information. Want more info on this? Read this Wikipedia article (en). Also, the French page provides a cube example which can be useful for testing.
Three.js tips and tricks
When normals are not present, lighting can't be applied, hence the black model render. Three.js can actually compute vertex and face normals with geometry.computeVertexNormals() and/or geometry.computeFaceNormals(), depending on what's missing.
When you do so, there's a chance Three.js' normal calculation will be wrong and your normals will be flipped. To fix this, you can simply loop through your geometry's faces array like so:
/* Compute normals */
geometry.computeFaceNormals();
geometry.computeVertexNormals();
/* The next 3 lines don't seem to be mandatory */
mesh.geometry.dynamic = true
mesh.geometry.__dirtyVertices = true;
mesh.geometry.__dirtyNormals = true;
mesh.flipSided = true;
mesh.doubleSided = true;
/* Flip normals*/
for(var i = 0; i<mesh.geometry.faces.length; i++) {
mesh.geometry.faces[i].normal.x = -1*mesh.geometry.faces[i].normal.x;
mesh.geometry.faces[i].normal.y = -1*mesh.geometry.faces[i].normal.y;
mesh.geometry.faces[i].normal.z = -1*mesh.geometry.faces[i].normal.z;
}
You have to use a MeshPhongMaterial. MeshBasicMaterial does not take light into account when computing fragment color.
However, when using a MeshPhongMaterial, your mesh becomes black. I've never used the OBJ loader, but are you sure your model's normals are right?
Btw: you probably want to use a PointLight instead, and its position should probably be set to the camera position (light.position = camera.position should do the trick, as it will allow the light to move when the camera position is edited by the controls).
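For what it's worth, that last suggestion could look roughly like this (scene and camera as in your fiddle):
var pointLight = new THREE.PointLight(0xffffff);
pointLight.position = camera.position; // share the camera's position vector so the light follows it
scene.add(pointLight);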