Three.js texture not loading from .png file

I am new to 3D technology. By referring to a few sites I have managed to get some code running to render a 3D object. The object was generated by the 3D software Maya. I received these files:
.obj (object file)
.mtl (material file)
.png (texture files).
Using three.js I tried to render the object in the browser, and for that I used the three approaches given below:
1. OBJLoader and explicitly load texture file.
2. OBJMTLLoader
3. JSON loader by converting .obj file into .js.
In all three cases the 3D object was rendered successfully, but the common problem is that the texture is not loaded at all. The object file, material file and texture file are all in the same location. Shading is applied to the object, so I can see some parts in colour, but the texture (the .png file) is not applied to the object, so parts of it look black and dark. There is no error; the browser doesn't show any.
Snippet using the JSON loader:
var oLoader = new THREE.JSONLoader();
oLoader.load('Main_unit_01.js', function (geometry, materials) {
    // get original materials
    var material = new THREE.MeshFaceMaterial(materials);
    //var material = new THREE.MeshPhongMaterial();
    //material.map = THREE.ImageUtils.loadTexture(
    //    "final_texture_v01.png");
    var mesh = new THREE.Mesh(geometry, material);
    mesh.scale.set(10, 10, 10);
    Test.scene.add(mesh);
});
The OBJMTLLoader needs the material file. Using this approach it loads both the object file and the material file, but still not the texture. Here is a small part of the information present in the MTL file:
newmtl phongE1SG
illum 4
Kd 0.00 0.00 0.00
Ka 0.00 0.00 0.00
Tf 1.00 1.00 1.00
map_Kd final_texture_v01.png
Ni 1.00
Ks 0.50 0.50 0.50
Here is some of the information present in the .js file that resulted from the .obj-to-.js conversion:
"DbgColor" : 238,
"DbgIndex" : 3,
"DbgName" : "phongE1SG",
"colorAmbient" : [0.0, 0.0, 0.0],
"colorDiffuse" : [0.0, 0.0, 0.0],
"colorSpecular" : [0.5, 0.5, 0.5],
"illumination" : 4,
"mapDiffuse" : "final_texture_v01.png",
"opticalDensity" : 1.0
In both the material file and the .js file above we can see that the .png file is referenced by a tag. After reading suggestions on some blogs I tried to explicitly set a relative path in these files, but with no success. I have checked most of the blogs here and applied most of the suggestions, but it didn't work. I have also looked on github.com for help but could not find any. No browser gives an error such as a particular file not being found.
Here are a few of the links which I have followed:
1. three.js JSONLoader Material Not Showing
2. Loading textures with JSONloader
I am not able to post more than two links, but I have followed many more sites.
Please suggest a solution to get the texture onto the object.

The MTL file's Kd attribute corresponds to the js attribute colorDiffuse, and the Ka attribute to colorAmbient. As you can see, they are both 0. The diffuse colour multiplies the texture, so a black diffuse colour makes the textured surface black; that is why you cannot see the model. Try making the ambient term (0.2, 0.2, 0.2) and the diffuse term (0.8, 0.8, 0.8).
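A minimal sketch of how those values can be overridden in code instead of editing the converted files, building on the JSONLoader snippet from the question (this assumes an older three.js release where Phong/Lambert materials still expose an .ambient colour, and reuses the Test.scene object from the question):
var oLoader = new THREE.JSONLoader();
oLoader.load('Main_unit_01.js', function (geometry, materials) {
    // Brighten the converted materials so the texture map is no longer
    // multiplied by a black diffuse/ambient colour.
    materials.forEach(function (m) {
        m.color.setRGB(0.8, 0.8, 0.8);   // was colorDiffuse [0, 0, 0]
        m.ambient.setRGB(0.2, 0.2, 0.2); // was colorAmbient [0, 0, 0]
    });
    var mesh = new THREE.Mesh(geometry, new THREE.MeshFaceMaterial(materials));
    mesh.scale.set(10, 10, 10);
    Test.scene.add(mesh);
});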

Related

Why transparency doesn't work on my mesh?

I have two meshes in my scene: a cylinder and a classic plane in the middle.
I applied a PNG texture to my cylinder so we can see through it. It seems to work for the cylinder.
On the screenshot you'll easily see my issue: I don't understand why my image is not visible behind my cylinder.
The code I used for my cylinder:
myCylinderMesh.material.transparent = true;
myCylinderMesh.material.side = THREE.DoubleSide;
How can I manage to see the part of the image hidden behind the cylinder?
EDIT 1:
I added the code that @ScieCode sent me:
myCylinderMesh.material.alphaTest = 0.5;
Here's the result:
It works better: now I can see the missing part of my image. But one thing is still missing: the opacity of my cylinder. I'm supposed to see my image behind the letters too.
Currently I have this opacity:
myCylinderMesh.material.opacity = 0.7;
Do you know what I am missing? Thanks
EDIT 2 :
Here's the code for my two meshes :
Cylinder :
geoCylinder = new THREE.CylinderBufferGeometry( 0.4, 0.4, 2*Math.PI*0.4/(2048/128), 64, 1, true );
matCylinder = new THREE.MeshBasicMaterial( { map:texture, transparent:true, color:0x000000, alphaTest: 0.5, opacity: 0.6, side: THREE.DoubleSide } );
meshCylinder = new THREE.Mesh( geoCylinder, matCylinder );
Plane :
geoPlane = new THREE.PlaneBufferGeometry( 0.8, 0.8 );
matPlane = new THREE.MeshBasicMaterial( { map: texturePlane, transparent:true} );
meshPlane = new THREE.Mesh( geoPlane, matPlane );
This behavior happens because of how transparency rendering works internally. Transparent objects need to be sorted and rendered separately from opaque objects. This ensures that objects render as expected in the final image (though not always).
The problem here is that your plane geometry sits inside the cylinder geometry, so when sorting it will be rendered either before or after the cylinder. This, in turn, causes the artifacts you are experiencing here. Transparency rendering as a whole is a lot more complex than I'm making it out to be.
Since your plane object doesn't need to be translucent, you can simply set the alphaTest property of its material, which will only render fragments with an alpha value greater than this threshold. This also prevents the object from being treated as transparent, so it will always be rendered first, fixing the artifacts in your scene.
JSFiddle
Additional info: when using a transparent material with DoubleSide, you might experience self-transparency problems. This happens for the same reason I just explained, but between faces of the same object. A possible solution is to set depthWrite = false, which prevents the object from writing to the depth buffer, so every face gets rendered regardless of whether another face occludes it.
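A minimal sketch of both suggestions applied to the materials from the question (matPlane and matCylinder as defined in EDIT 2):
// Plane: discard fully transparent texels instead of blending them, so the
// plane no longer needs to be sorted as a transparent object.
matPlane.transparent = false;
matPlane.alphaTest = 0.5;

// Cylinder: keep real transparency (opacity as set in EDIT 2), but stop it
// from writing to the depth buffer so its far side stays visible through
// its near side.
matCylinder.depthWrite = false;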

How to set camera coordinates to object in three.js? Using example "webgl obj + mtl loader"

I have a 3D model in .obj format. However, the coordinates for this 3D model are not (0,0,0). This is a 3D render of drone imagery, so the coordinates are actual georeferenced coordinates.
I'm following the example in three.js on how to load an obj with its mtl in WebGL. I use the original HTML except that I simply replace the obj listed as male02 with CerroPelaoLow, and the files are placed in the obj directory. Firefox displays the model correctly, but the position is the problem.
Note that this render is generated by a program in this way, and even though I can manipulate the model with a program such as MeshLab I'd still prefer as little manual manipulation as possible.
So how can I use local coordinates of my object or focus the camera and then use a different set of controls?
You can use the boundingSphere or boundingBox of your object's geometry to determine the position of your camera. I have already implemented functionality to focus an object or a set of objects, so here I share some code:
// assuming following variables:
// object -> your obj model (THREE.Mesh)
// camera -> PerspectiveCamera
// controls -> I'm also using OrbitControls
// if the boundingSphere isn't computed yet
object.geometry.computeBoundingSphere();
var sphere = object.geometry.boundingSphere.clone();
sphere.applyMatrix4( object.matrixWorld );
// vector from current center to camera position (shouldn't be zero in length)
var s = new THREE.Vector3().subVectors( camera.position, controls.center );
var h = sphere.radius / Math.tan( camera.fov / 2 * Math.PI / 180 );
var newPos = new THREE.Vector3().addVectors( sphere.center, s.setLength(h) );
camera.position.copy( newPos );
controls.center.copy( sphere.center );
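Note that newer OrbitControls releases renamed center to target and expect update() to be called after changing it; under that assumption the last line becomes:
// newer OrbitControls use .target instead of .center
controls.target.copy( sphere.center );
controls.update();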

Preserve UV map when using mesh merge in Three.js

I am currently following this tutorial by Jerome Etienne on generating a procedural city using Three.js. The tutorial uses revision 59 of Three.js while I am working with revision 73.
The problem comes from this line in the tutorial,
THREE.GeometryUtils.merge( cityGeometry, buildingMesh );
The method is no longer available. The new way to accomplish this according to this answer is,
buildingMesh.updateMatrix();
cityGeometry.merge( buildingMesh.geometry, buildingMesh.matrix );
However, when I do this, the location of the roof in the UV map changes.
This is what it looks like when I render the buildings individually.
And this is what it looks like when I merge them. Notice the roof location in the UV map.
Specification of the roof's UV map is per the tutorial. Specifically,
geometry.faceVertexUvs[0][4][0].set( 0, 0 );
geometry.faceVertexUvs[0][4][1].set( 0, 0 );
geometry.faceVertexUvs[0][4][2].set( 0, 0 );
geometry.faceVertexUvs[0][5][0].set( 0, 0 );
geometry.faceVertexUvs[0][5][1].set( 0, 0 );
geometry.faceVertexUvs[0][5][2].set( 0, 0 );
and the buildingMesh is created as follows (in a for loop where n is the number of buildings),
var buildingMesh = new THREE.Mesh( geometry );
What do I need to change or do differently in order for the merged mesh to respect the geometry's UV map?
Here is an example that uses the latest three.js version (r79). From your code, I don't see how my update differs from yours, but all roofs are rendered correctly:
https://codepen.io/Sphinxxxx/pen/WrbvEz?editors=0010
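For reference, a condensed sketch of that merge loop (buildingCount, the random placement, and cityMaterial are placeholders here, not taken from the CodePen):
var cityGeometry = new THREE.Geometry();
for (var i = 0; i < buildingCount; i++) {
    var geometry = new THREE.BoxGeometry(1, 1, 1);
    // Collapse the roof UVs (faces 4 and 5) onto a single texel, per the tutorial.
    [4, 5].forEach(function (f) {
        geometry.faceVertexUvs[0][f].forEach(function (uv) { uv.set(0, 0); });
    });
    var buildingMesh = new THREE.Mesh(geometry);
    buildingMesh.position.set(Math.random() * 200 - 100, 0, Math.random() * 200 - 100);
    // Bake the building's transform into the shared city geometry.
    buildingMesh.updateMatrix();
    cityGeometry.merge(buildingMesh.geometry, buildingMesh.matrix);
}
var cityMesh = new THREE.Mesh(cityGeometry, cityMaterial);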

THREE.js OBJMTLLoader - Request for simple example files

As detailed in a previous question, I have learned how to use the THREE.js OBJMTLLoader with the same objects and materials as the official example.
That example uses a (for me) complex model with the DDSLoader.
I would like to load simpler OBJ+MTL models and have been trying with several free models obtained from the web. I have managed to load the OBJ files OK (by applying further THREE.js code such as defining normals) but there is a problem loading material textures from the MTL files.
Here is a simple example of my code.
//...DolphinB
var posX = -3445; var posY = 750; var posZ = -100;
var common_scale = 100;
var loader = new THREE.OBJMTLLoader();
loader.load(
    'TRI_VP_files/models/dolphin/DOLPHIN_B.obj',
    'TRI_VP_files/models/dolphin/DOLPHIN_B.mtl',
    function (object)
    {
        object.position.set( posX, posY, posZ );
        scene222.add( object );
        object.scale.set( common_scale, common_scale, common_scale );
    } );
Here is the MTL code
# Wavefront material library
# Tue Aug 03 07:47:56 1999
# Created with Viewpoint Interchange www.viewpoint.com
newmtl dl_body
Ka 0 0 0
Kd 0 0.8 0.9
Ks 0 0 0
illum 1
map_Kd DOLPHIN2.JPG
My Question
Please could someone point me to some simple OBJ + MTL files which are known to load OK with OBJMTLLoader.
You can use the following free-for-private-use hand fileset created by Mohammad Alizadeh (nice work, thank you Mohammad).
It uses a single .JPG image file as the source for the material texture.
It uses a single material.
Here are the .MTL file contents:
# Blender MTL File: 'Hand.blend'
# Material Count: 1
newmtl defaultMat
Ns 96.078431
Ka 0.000000 0.000000 0.000000
Kd 0.640000 0.640000 0.640000
Ks 0.500000 0.500000 0.500000
Ni 1.000000
d 1.000000
illum 2
map_Kd hand_mapNew.jpg
You will need to change the top few lines of the .OBJ file from...
# Blender v2.74 (sub 0) OBJ File: 'Hand.blend'
# www.blender.org
mtllib Hand.mtl
o ZBrushPolyMesh3D
v 0.614360 0.281365 -0.675872
v 0.684894 0.445729 -0.634615
to
# Blender v2.74 (sub 0) OBJ File: 'Hand.blend'
# www.blender.org
## mtllib Hand.mtl <===== commented out
usemtl defaultMat ## <===== added usemtl command,note proper name of material
## o ZBrushPolyMesh3D <===== commented out
v 0.614360 0.281365 -0.675872
v 0.684894 0.445729 -0.634615
Note that many free 3D object filesets use .TIF image files, but .TIFs cannot be displayed in browsers (or in THREE.js). Converting them to .JPG format is possible, but the UV mapping is not preserved.
Also note that some free 3D object filesets need to be edited so that the material names in the .OBJ file match the names given in the .MTL file.
Also note that some .OBJ files (like the hand example above) need to be edited so that the material is indicated by a usemtl command, e.g.:
usemtl defaultMat
Child processing
For the Hand fileset there are vertex normals (vn) in the .OBJ file, but for some reason smoothing is not applied. Applying the following code will produce smoothing (and adjust shininess and set rootObject references for object picking):
object.traverse( function (child)
{
    if (child instanceof THREE.Mesh)
    {
        child.material.shininess = 10; //... range 0.1 to 30 (default) to 1000 or more; applies to Phong materials.
        //child.userData.rootObject = object; //... see West Langley's answer at http://stackoverflow.com/questions/22228203/picking-object3d-loaded-via-objmtlloader
        //... used for object picking so that, for further operations, we can select the picked child object or the child's rootObject.
        child.rootObject = object; //... avoids an infinite loop when cloning 3D objects.
        child.geometry.computeFaceNormals();
        child.geometry.computeVertexNormals();
        //child.geometry.normalsNeedUpdate = true; //... only required if the object has already been rendered.
    }
} );
DISCLAIMER
These tricks got things working for me in this particular scenario. I don't claim this is the best way of doing things.

Threejs - Applying simple texture on a shader material

Using three.js (r67) with the WebGL renderer, I can't seem to get a plane with a shader material to display its texture. No matter what I do the material just stays black.
My code at the moment looks quite basic:
var grassT = new Three.Texture(grass); // grass is an already loaded image.
grassT.wrapS = grassT.wrapT = Three.ClampToEdgeWrapping;
grassT.flipY = false;
grassT.minFilter = Three.NearestFilter;
grassT.magFilter = Three.NearestFilter;
grassT.needsUpdate = true;
var terrainUniforms = {
grassTexture : { type: "t", value: grassT},
}
Then I just have this relevant part in the vertex shader:
vUv = uv;
And on the fragment shader side:
gl_FragColor = texture2D(grassTexture, vUv);
This results in:
Black material.
No error in console.
gl_FragColor value is always (0.0, 0.0, 0.0, 1.0).
What I tried / checked:
Everything works fine if I just apply custom plain colors.
All is ok if I use vertexColors with plain colors too.
My texture width / height is indeed a power of 2.
The image is on the same server than the code.
Tested others images with same result.
The image is actually loading in the browser debugger.
UVs for the mesh are correct.
Played around with wrapT, wrapS, minFilter, magFilter
Adapted the mesh size so the texture has a 1:1 ratio.
Preloaded the image with requirejs image plugin and created the texture from THREE.Texture() instead of using THREE.ImageUtils();
Played around with needsUpdate : true;
Tried to add defines['USE_MAP'] during material instantiation.
Tried to add material.dynamic = true.
I have a correct rendering loop (interaction with the terrain is working).
What I still wonder:
It's a multiplayer game using a custom port with express + socket.io. Am I hit by any WebGL security policy?
I have no lighting logic at the moment; is that a problem?
Maybe the shader material needs other "defines" at instantiation?
I guess I'm overlooking something simple; this is why I'm asking...
Thanks.
I am applying various effects on the same shader. I have a custom API that merges the uniforms of the different effects using Three.UniformsUtils.merge(). However, this function calls the clone() method on the texture, which resets needsUpdate to false before the texture reaches the renderer.
It appears that you should set the texture's needsUpdate property to true at the material level. If you set it at the texture level and the uniform later gets merged, and therefore cloned, it loses its needsUpdate flag.
The issue is also detailled here: https://github.com/mrdoob/three.js/issues/3393
In my case the following wasn't working (grassT is my texture):
grassT.needsUpdate = true
while the following is running perfectly later on in the code:
material.uniforms.grassTexture.value.needsUpdate = true;
Image loading is asynchronous. Most likely, you are rendering your scene before the texture image loads.
You must set the texture.needsUpdate flag to true after the image loads. three.js has a utility that will do that for you:
var texture = THREE.ImageUtils.loadTexture( "texture.jpg" );
Once rendered, the renderer sets the texture.needsUpdate flag back to false.
three.js r.68
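If you keep building the texture from a preloaded image yourself, as in the question, a minimal sketch of the same idea (render() stands for whatever redraws your scene):
// Assumes `grass` is an HTMLImageElement that may still be loading.
var grassT = new THREE.Texture(grass);

function textureReady() {
    grassT.needsUpdate = true; // upload the image data on the next render
    render();                  // redraw now that the texture has real data
}

if (grass.complete) {
    textureReady();              // the image had already finished loading
} else {
    grass.onload = textureReady; // wait for the asynchronous load
}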
