I'm trying to make the following in three.js:
I made the model in SketchUp with some simple coloured textures and used the Collada importer; the result looks like this:
Now I want to dynamically load some photographs onto each of the different planes, however what I end up with is this:
So as you can see, each image is loaded but they are very small and repeated across the rest of the surface.
This is how I load the textures: (preloadTexture() is just a simple preloader)
for (var i in cubeSidesArray)
{
    preloadTexture(modelThumbsArray[i]);
    var newTexture = new THREE.MeshPhongMaterial( { map: THREE.ImageUtils.loadTexture(modelThumbsArray[i]) } );
    cubeSidesArray[i].material = newTexture;
}
How do I get the textures to fill the surface?
Thanks!
Edit - I played with the model in SketchUp and managed to get it a little better, but not much!
Edit 2 - Still no luck, I'm starting to think building it in code from scratch would be simpler
Option 1: I would advise you to do the following:
1. Import the model into Blender.
2. Export it from Blender to the three.js (JSON) format.
3. Use this loading method (a usage sketch follows the list):
// Loader callback: receives the geometry and materials parsed from the exported file
function AgregarModeloBlender(geometry, materials) {
    console.log(materials);
    // Wrap all exported materials so each face keeps its own material
    material = new THREE.MeshFaceMaterial( materials );
    modelo3d_ = new THREE.Mesh( geometry, material );
    escenario.add(modelo3d_);   // add the model to the scene
    modelo3d_.add(camera);      // attach the camera to the model
    modelo3d_.scale.set(5, 5, 5);
    modelo3d_.position.set(-900, 25, 850);
    modelo3d_.rotation.y = Math.PI;
}
4. Then work with the textures independently.
Example: http://all.develoteca.com/builder/
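As a rough sketch of step 3 above, the function is passed as the callback to THREE.JSONLoader (the file name 'modelo.js' is a placeholder, and escenario and camera are assumed to already exist):
// Minimal sketch: load the Blender-exported three.js JSON file and hand the
// parsed geometry and materials to the callback defined above.
var loader = new THREE.JSONLoader();
loader.load('modelo.js', AgregarModeloBlender); // 'modelo.js' is a placeholder path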
Option 2: Alternatively, I would advise this:
1. Create the geometric shape (vertices) yourself, so that you can control the texture of each face independently, as in the sketch below.
Example: http://develoteca.com/Panel/
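A minimal sketch of that idea (the photo URL, the plane size, and the scene variable are placeholders): when each face is built as its own plane, its default 0-1 UVs make the photo fill the whole surface instead of tiling:
// Hypothetical helper: builds one face as its own plane so the photo's
// UVs run from 0 to 1 across the whole surface (no repetition).
function makePhotoPlane(url, width, height) {
    var texture = THREE.ImageUtils.loadTexture(url);              // placeholder URL
    var material = new THREE.MeshPhongMaterial({ map: texture });
    var geometry = new THREE.PlaneGeometry(width, height);
    return new THREE.Mesh(geometry, material);
}

var photoPlane = makePhotoPlane('photos/side1.jpg', 100, 60);      // placeholder values
scene.add(photoPlane);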
Greetings.
I'm working on an app where I visualize ATV trails in a 3d perspective (NAIP imagery draped over elevation data). I am using three.js for the rendering engine.
In the above image, the white line you see is just a THREE.Line instance, where I convert the trail's GPS coordinates into three.js coordinates. I'd like to add more of a 3D perspective to this line. I tried implementing a THREE.TubeGeometry where the path was a THREE.CatmullRomCurve3 using the same Vector3 points I used to build the line you see in the image above. That did not produce a desirable result...
From the many, many THREE examples I have looked at, I really think an extruded geometry would achieve the look I am after... But I can't for the life of me figure out how to extrude a geometry for the line. Any suggestions/thoughts?
UPDATE 1:
Here is my desired look (same trail - no imagery). This image was produced in QGIS using the Q2Threejs plugin
UPDATE 2: Here is the code with which I have attempted to create a TubeGeometry. Maybe I am messing something up in there...
// trailVectors are an array of Vector3 - same as ones used to create line
var trailCurve = new THREE.CatmullRomCurve3(trailVectors);
var tubeGeometry = new THREE.TubeGeometry(trailCurve,80,1,15,false);
var material = new THREE.MeshBasicMaterial({color:0x00ff00});
var tubeMesh = new THREE.Mesh(tubeGeometry,material);
var wireframeMaterial = new THREE.LineBasicMaterial({color:0xffffff,lineWidth:2});
var wireframe = new THREE.Mesh(tubeGeometry,wireframeMaterial);
tubeMesh.add(wireframe);
scene.add(tubeMesh);
UPDATE 3
THREE.TubeGeometry(trailCurve,80,4,2,false) per mzartman's request
I think that you should be able to achieve what you want with a TubeGeometry. I think the big thing is that your example (from the picture shown) has more than 2 radial segments. That gives it the tubular shape and makes it look sort of like a blob. If you set the radial segment count to 2 (as shown below) then I think it would look a lot better.
tubeGeometry = new THREE.TubeBufferGeometry(
    [YOUR_PATH_HERE],
    params.extrusionSegments, // <--- Edit this for higher resolution on the spline
    3,                        // <--- The radius; with 2 radial segments it acts as the ribbon's height
    2,                        // <--- This 2 keeps it 2D (i.e. not a tube!!!!)
    true );
var mesh = new THREE.Mesh( tubeGeometry, material );
var wireframe = new THREE.Mesh( tubeGeometry, wireframeMaterial );
mesh.add( wireframe );
scene.add( mesh );
Update:
I think that you might do better with a material that responds to light, like MeshPhongMaterial. Also, to get the wireframe you want, add it as an option in the material initialization. Give it a shot with the following:
var tubeGeometry = new THREE.TubeGeometry(curve,80,1,2,false);
var material = new THREE.MeshPhongMaterial({color:0x00ff00, wireframe: true});
var tubeMesh = new THREE.Mesh(tubeGeometry,material);
scene.add(tubeMesh);
I have a square-shaped .obj model and 2 textures. How do I apply one texture to its top face and another to the rest of the faces?
There are a ton of ways to do what you're asking, all with varying complexity depending on your needs. It looks like you want to apply two materials to your object, not two textures.
It looks this way because you seem to want the textures to be interchangeable, so there's no way you're going to combine the images and keep their resolution, and OBJ & THREE.Material only support one set of UV attributes, so you can't use a single material with multiple textures. So multiple materials it is...
Multiple materials
If you have two materials (2 THREE.Materials which correlate to 2 WebGL programs) then each face needs to know what material it's assigned to.
While the THREE.js multi-material API has been in flux for quite a while and there are differences between THREE.Geometry and THREE.BufferGeometry, fortunately for you THREE.OBJLoader supports material groups out of the box. To get this into THREE.js, you want to apply multiple materials to your object in your 3D editor and then export the OBJ to get everything. Doing it by hand is a little harder and requires calling addGroup as shown in the docs/the link above.
In THREE.js you simply pass all the materials as an array to your object, as demonstrated in this answer. I also updated your fiddle to do the same thing. The relevant code is shown below:
var loadingManager = new THREE.LoadingManager();
var ObjLoader = new THREE.OBJLoader(loadingManager);
var textureLoader = new THREE.TextureLoader(loadingManager);
// Material 1 with first texture
var material = new THREE.MeshLambertMaterial({ map: textureLoader.load('https://dl.dropboxusercontent.com/s/nvnacip8fhlrvm4/BoxUV.png?dl=0') });
// Material 2 with second texture
var material2 = new THREE.MeshLambertMaterial({ map: textureLoader.load('https://i.imgur.com/311w7oZ.png') });
ObjLoader.load(
    'https://dl.dropboxusercontent.com/s/hiazgei0rxeirr4/cabinet30.obj?dl=0',
    function ( object ) {
        var geo = object.children[0].geometry;
        var mats = [material, material2];
        // These are just some random groups to demonstrate multi-material;
        // you need to set these up so they actually work for your object,
        // either in code or in your 3D editor
        geo.addGroup(0, geo.getAttribute("position").count / 2, 0);
        geo.addGroup(geo.getAttribute("position").count / 2,
                     geo.getAttribute("position").count / 2, 1);
        // Mesh with multiple materials for the material parameter
        obj = new THREE.Mesh(geo, mats);
        obj.position.y = 3;
    });
I am a newbie in Three.js. In my project, I need to draw a ground with a lot of textures. The ground has many layers; every layer has 4 textures, and textures in different layers have different sizes. The picture below describes the ground:
The ground is one mesh with multiple materials:
this.mesh = new THREE.Mesh(geometry, new THREE.MultiMaterial(materials));
Suppose I have a car that is always at the center of the ground, in other words, at the center of all layers. When it moves, the ground translates so that the car stays at the center. So every time the ground translates, I need to update the textures for the new position.
The picture shows 3 layers for illustration, but my project has 6 layers. So every time all the textures change, that means changing 6 * 4 = 24 textures, and that causes low FPS in my program.
This is my function to load a texture from IndexedDB every time a texture changes:
Ground.prototype.loadTextureFromIndexedDB = function (url, materialIndex) {
    var loader = new THREE.TextureLoader();
    loader.crossOrigin = '';
    loader.load(url,
        function (texture) {
            var groundMaterial = ground.mesh.material.materials[materialIndex];
            groundMaterial.map.dispose();
            groundMaterial.map = texture;
            groundMaterial.map.anisotropy = ground.maxAnisotropy;
            groundMaterial.map.minFilter = THREE.LinearFilter;
            groundMaterial.map.needsUpdate = true;
            img = null;
            window.URL.revokeObjectURL(url);
        });
}
I have tried many solutions. One of them is to make a mesh with a BufferGeometry and a MultiMaterial holding an array of ShaderMaterials. As far as I know, that is the best option for performance in Three.js, isn't it? If it is, then maybe Three.js is not as powerful as I thought. Should I change to another API for my project?
Can anyone suggest a solution to get higher performance in my program? Thanks a lot!
In my game that uses Three.js (r52) I'm having some trouble getting the lighting right.
This dungeon level uses simple cuboids as the walls and the roof. For some reason the lighting is bright at the beginning of each mesh, but then fades to dark towards the other side.
Notice that the floor doesn't have artifacts, this is because it is one huge quad.
The light used is a PointLight. The materials for my meshes are simply created like this:
var texture = new THREE.Texture( image,
                                 new THREE.UVMapping(),
                                 THREE.RepeatWrapping,
                                 THREE.RepeatWrapping,
                                 THREE.NearestFilter,
                                 THREE.NearestMipMapLinearFilter );
return new THREE.MeshLambertMaterial({
    map: texture
});
The cuboids are exported OBJ models from 3ds max, converted using gw::OBJ-exporter. These are my export settings:
Any ideas?
Apparently you hit the same issue as in this thread: https://github.com/mrdoob/three.js/issues/1258
You need to use something like material.shading = THREE.FlatShading;
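For example, a minimal sketch applied to the material creation from the question (the texture variable is the one built in the question's snippet):
// Sketch: the same Lambert material as in the question, but with flat shading,
// which avoids the per-vertex light interpolation artifacts across each cuboid.
var material = new THREE.MeshLambertMaterial({ map: texture });
material.shading = THREE.FlatShading;
return material;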
I have this object I'm loading with THREE.objLoader and then create a mesh with it like so:
mesh = new THREE.SceneUtils.createMultiMaterialObject(
    geometry,
    [
        new THREE.MeshBasicMaterial({ color: 0xFEC1EA }),
        new THREE.MeshBasicMaterial({
            color: 0x999999,
            wireframe: true,
            transparent: true,
            opacity: 0.85
        })
    ]
);
In my scene I then add a DirectionalLight; it works and I can see my object, but it's as if the DirectionalLight were an ambient one. No face is getting darker or lighter as it should.
The object is filled with the color, but no lighting is applied to it.
If someone can help me with that it would be much appreciated :)
What could I be missing?
Jsfiddle here: http://jsfiddle.net/5hcDs/
Ok folks, thanks to Maël Nison and mrdoob I was able to understand the few things I was missing, being the total 3D noob that I am... I believe people starting to get into 3D may find a little recap useful:
Basic 3d concepts
A 3D face is made of some points (vertices) and a vector called the normal, indicating the direction of the face (which side is the front and which one is the back).
Not having normals can be really bad, because lighting is applied on the frontside only by default. Hence the black model when trying to apply a LambertMaterial or PhongMaterial.
An OBJ file is a way to describe 3D information. Want more info on this? Read this Wikipedia article (en). Also, the French page provides a cube example which can be useful for testing.
Three.js tips and tricks
When normals are not present, the lighting can't be applied, hence the black model render. Three.js can actually compute vertex and face normals with geometry.computeVertexNormals() and/or geometry.computeFaceNormals(), depending on what's missing.
When you do so, there's a chance Three.js' normal calculation will be wrong and your normals will be flipped; to fix this you can simply loop through your geometry's faces array like so:
/* Compute normals */
geometry.computeFaceNormals();
geometry.computeVertexNormals();
/* The next 3 lines seem not to be mandatory */
mesh.geometry.dynamic = true;
mesh.geometry.__dirtyVertices = true;
mesh.geometry.__dirtyNormals = true;
mesh.flipSided = true;
mesh.doubleSided = true;
/* Flip normals */
for (var i = 0; i < mesh.geometry.faces.length; i++) {
    mesh.geometry.faces[i].normal.x = -1 * mesh.geometry.faces[i].normal.x;
    mesh.geometry.faces[i].normal.y = -1 * mesh.geometry.faces[i].normal.y;
    mesh.geometry.faces[i].normal.z = -1 * mesh.geometry.faces[i].normal.z;
}
You have to use a MeshPhongMaterial. MeshBasicMaterial does not take light into account when computing the fragment color.
However, when using a MeshPhongMaterial, your mesh becomes black. I've never used the OBJ loader, but are you sure your model's normals are right?
Btw: you probably want to use a PointLight instead. And its position should probably be set to the camera position (light.position = camera.position should do the trick, as it will allow the light to move along when the camera position is changed by the controls).
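A minimal sketch combining both suggestions (geometry, scene and camera are assumed to come from the fiddle's existing setup; the colors and opacity are taken from the question):
// Sketch: a lit Phong material instead of MeshBasicMaterial, plus a PointLight
// that shares the camera's position object so it follows the camera around.
mesh = THREE.SceneUtils.createMultiMaterialObject(
    geometry,
    [
        new THREE.MeshPhongMaterial({ color: 0xFEC1EA }),
        new THREE.MeshBasicMaterial({
            color: 0x999999,
            wireframe: true,
            transparent: true,
            opacity: 0.85
        })
    ]
);
scene.add(mesh);

var light = new THREE.PointLight(0xffffff);
light.position = camera.position; // the light moves with the camera
scene.add(light);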