I have made a closed hemisphere by merging the geometries of a hemisphere and a circle. I have a 360-degree image for the texture. I want the image to be applied as the texture to the combined geometry. Currently it is applying the texture twice: once to the hemisphere and once to the circle.
I have seen some answers on editing the UV mapping, but I am not sure how to go about it.
Here is the code.
var loader = new THREE.TextureLoader();
loader.setPath(srcPath);
loader.load("./texture.jpg", function(texture) {
    // top half of a sphere, open at the bottom
    var hemiSphereGeom = new THREE.SphereGeometry(radius, radialSegments, Math.round(radialSegments / 4), 0, Math.PI * 2, 0, Math.PI * 0.5);
    var objMaterial = new THREE.MeshPhongMaterial({
        map: texture,
        shading: THREE.FlatShading
    });
    objMaterial.side = THREE.BackSide; // texture is seen from inside the hemisphere
    // flat cap that closes the hemisphere
    var capGeom = new THREE.CircleGeometry(radius, radialSegments);
    capGeom.rotateX(Math.PI * 0.5);
    // merge the hemisphere and the cap into a single geometry
    var singleGeometry = new THREE.Geometry();
    var cap = new THREE.Mesh(capGeom);
    var hemiSphere = new THREE.Mesh(hemiSphereGeom);
    hemiSphere.updateMatrix();
    singleGeometry.merge(hemiSphere.geometry, hemiSphere.matrix);
    cap.updateMatrix();
    singleGeometry.merge(cap.geometry, cap.matrix);
    el.setObject3D('hemisphere', new THREE.Mesh(singleGeometry, objMaterial));
});
It appears that the code is still treating the closed hemisphere as two separate entities. I would try a 3D modeling program, make the shape there, and load it into the A-Frame code, then load the texture onto the back side of the geometry.
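One way to go about the UV editing mentioned above (a hedged sketch, not tested against this exact scene): before the merge, overwrite the cap's planar UVs so every cap vertex samples the equirectangular image along the horizon row at its own longitude, instead of receiving its own full copy of the image. Something like:
// Hedged sketch: run after capGeom.rotateX(...) and before singleGeometry.merge(...).
// Each cap vertex samples the 360-degree image at v = 0.5 (the horizon) at the
// longitude matching its angle around the cap, so the cap no longer shows a
// second copy of the whole texture.
capGeom.faceVertexUvs[0].forEach(function (faceUvs, faceIndex) {
    var face = capGeom.faces[faceIndex];
    [face.a, face.b, face.c].forEach(function (vertexIndex, i) {
        var v = capGeom.vertices[vertexIndex];
        var angle = Math.atan2(v.z, v.x); // longitude of this cap vertex
        faceUvs[i].set((angle + Math.PI) / (2 * Math.PI), 0.5);
    });
});
capGeom.uvsNeedUpdate = true;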
I'm trying to get the position of a hole in an extruded geometry. I created a plane and made a hole in its geometry. I want to get the x, y, z coordinates of the center of the hole. Is there a method to get them?
Here is a demo: https://codepen.io/DYDOI-NSK/pen/XWqJzXG?editors=0011
Here is the code:
I created the shape of the plane:
let shape = new THREE.Shape();
let width = 30;
let height = 30;
shape.moveTo(-width, height);
shape.lineTo(-width, -height);
shape.lineTo(width, -height);
shape.lineTo(width, height);
shape.lineTo(-width, height);
I created the hole path and added it to the shape:
let hole = new THREE.Path();
hole.absarc(20, 10, 10, 0, Math.PI * 2, false); // first two arguments are the x, y coords of the hole
shape.holes.push(hole);
I created the plane and added the extruded geometry:
let geometry = new THREE.PlaneGeometry( 30, 30 );
let material = new THREE.MeshBasicMaterial( { color: new THREE.Color('#cea6a6'), side: THREE.DoubleSide } );
let mesh = new THREE.Mesh( geometry, material );
// `settings` holds the ExtrudeGeometry options (e.g. depth, bevelEnabled) defined elsewhere in the demo
let newGeometry = new THREE.ExtrudeGeometry(shape, settings);
mesh.geometry.dispose();
mesh.geometry = newGeometry;
After 4 days I figured out how to do it. I simply created a line from the mesh center to the hole's configured position, applied the mesh's quaternion to the line, and got the x, y, z coords of the hole.
Maybe there are more optimized solutions, but this is the only one I could come up with. I will be glad if someone shares a more optimized solution :D
Here is the codepen: https://codepen.io/DYDOI-NSK/pen/XWqJzXG?editors=0011
Here is the code:
/*
 * findCoords - function to find the hole coords in 3d space
 * data - object containing the x, y of the hole
*/
let findCoords = function (data) {
    let vertexes = [];
    // Set the coords where the hole was created
    let hole = new THREE.Vector3(
        data.x,
        data.y,
        0
    );
    vertexes.push(new THREE.Vector3()); // mesh center (local origin)
    vertexes.push(hole);
    // Create the line
    const material = new THREE.LineBasicMaterial({
        color: 0x0000ff
    });
    const geometry = new THREE.BufferGeometry().setFromPoints(vertexes);
    const line = new THREE.Line(geometry, material);
    scene.add(line);
    // Move the line to the center of the mesh
    line.position.copy(mesh.position);
    // Rotate the line the same way the mesh is rotated
    line.applyQuaternion(mesh.quaternion);
    // Extract the hole coords from the second vertex of the line
    let holeCoord = new THREE.Vector3();
    const positionAttribute = line.geometry.getAttribute('position');
    holeCoord.fromBufferAttribute(positionAttribute, 1);
    return holeCoord;
}
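A possibly simpler alternative (a hedged sketch, not from the original demo): skip the helper line entirely and transform the hole's local position with the mesh's world matrix:
// Hedged alternative: convert the hole's local (x, y, 0) position straight to world space.
let holeLocal = new THREE.Vector3(data.x, data.y, 0);
mesh.updateMatrixWorld();
let holeWorld = mesh.localToWorld(holeLocal); // world-space center of the hole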
I use some data to create a model in three.js. It loads the texture, but with a weird problem. I load the texture this way:
function createMesh(geom, imageFile) {
  var loader = new THREE.TextureLoader();
  texture = loader.load(imageFile);
  var mat = new THREE.MeshLambertMaterial({
    side: THREE.DoubleSide,
  });
  mat.map = texture;
  var mesh = new THREE.Mesh(geom, mat);
  return mesh;
}

var geom = new THREE.Geometry();
geom.vertices = vertices;
geom.faces = faces;
geom.computeFaceNormals();
var myModel = createMesh(geom, './tex1.jpg');
scene.add(myModel);
Here is the screenshot before the texture is loaded.
Here is the screenshot after the texture is loaded.
My texture file (2048*2048 .jpg):
I have tested loading the texture on a plain cube, and it works. So I can't figure out why the texture can't be loaded on my model. Any suggestions? Thank you very much!
For custom geometry in Three.js, we need to do UV mapping to define how the texture is mapped to the geometry.
Firstly, load the texture with var material = new THREE.MeshPhongMaterial( { map: THREE.ImageUtils.loadTexture('texture.jpg') } );
Then here is how I did the UV mapping:
Create n arrays, each corresponding to a sub-image of the texture and containing 4 points which define the boundaries of the sub-image. (0, 0) represents the lower left corner and (1, 1) represents the upper right corner.
Clear the existing UV mapping with geometry.faceVertexUvs[0] = [];
Map a sub-image of the texture to each triangle face of the geometry:
geometry.faceVertexUvs[0][0] = [ sub_image_1[0], sub_image_1[1], sub_image_1[3] ];
geometry.faceVertexUvs[0][1] = [ sub_image_1[1], sub_image_1[2], sub_image_1[3] ];
Finally, mesh = new THREE.Mesh(geometry, material);
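To make the steps above concrete, here is a minimal sketch of what one such array could look like (the name sub_image_1 and the coordinates are illustrative, assuming the sub-image occupies the left half of the texture):
// Illustrative sub-image covering the left half of the texture (u: 0..0.5, v: 0..1)
var sub_image_1 = [
    new THREE.Vector2(0.0, 0.0), // lower left
    new THREE.Vector2(0.5, 0.0), // lower right
    new THREE.Vector2(0.5, 1.0), // upper right
    new THREE.Vector2(0.0, 1.0)  // upper left
];
geometry.faceVertexUvs[0] = [];
geometry.faceVertexUvs[0][0] = [ sub_image_1[0], sub_image_1[1], sub_image_1[3] ];
geometry.faceVertexUvs[0][1] = [ sub_image_1[1], sub_image_1[2], sub_image_1[3] ];
geometry.uvsNeedUpdate = true;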
I'm quite new to three.js, building 3d globe with satellite sprite moving around it.
The problem is that the sprite image gets distorted after loading: the aspect ratio is ignored and the initial scale is about 5x its size. I'm downscaling manually, but the image still gets distorted because I pick the numbers for each axis by hand, e.g. satelliteSprite.scale.set(0.14, 0.075, 1);
My satellite image:
I have 2 questions here:
How do I load it correctly so that at least the aspect ratio is respected?
When I add an AxesHelper to the sprite, why is it not aligned to the image plane?
original <img> vs downscaled sprite (0.1, 0.1, 0.1):
My code for the satellite, earth, and camera is here:
//.........SPRITE
var satelliteTexture = new THREE.TextureLoader().load('assets/img/sections/section-astronaut/small-satellite.png');
var satelliteSprite = new THREE.Sprite(new THREE.SpriteMaterial({
    map: satelliteTexture,
    color: 0xffffff,
    fog: false
}));
satelliteSprite.scale.set(0.14, 0.075, 1);
satelliteSprite.position.setFromSpherical(new THREE.Spherical().set(0.565, Math.PI * 2 - 0.9, Math.PI * 2 + 0.5));
earth.add(satelliteSprite);
window['satelliteSprite'] = satelliteSprite;
var axesHelperSatelliteSprite = new THREE.AxesHelper(5);
satelliteSprite.add(axesHelperSatelliteSprite);
//..........CAMERA
var camera = window['camera'] = new THREE.PerspectiveCamera(45, width / height, 0.01, 100);
camera.position.z = 1.5;
//..........EARTH
var earth = window['earth'] = new THREE.Mesh(
    new THREE.SphereGeometry(earthRadius, earthSegments, earthSegments),
    new THREE.MeshPhongMaterial({
        map: THREE.ImageUtils.loadTexture('assets/img/sections/section-astronaut/2_no_clouds_4k.jpg'),
        bumpMap: THREE.ImageUtils.loadTexture('assets/img/sections/section-astronaut/elev_bump_4k.jpg'),
        bumpScale: 0.002,
        specularMap: THREE.ImageUtils.loadTexture('assets/img/sections/section-astronaut/water_4k.png'),
        specular: new THREE.Color(0x111111),
        wireframe: false
    })
);
earth.rotation.x = 0.4;
earth.rotation.y = -1.95;
1) Create a square png texture with transparency so it will not get stretched/distorted; then you can scale equally on each axis, e.g. satelliteSprite.scale.set(0.08, 0.08, 0.08).
2) Set the filters and anisotropy for the texture (the second minFilter/magFilter line below overrides the first, so pick one or the other):
satelliteTexture.anisotropy = renderer.capabilities.getMaxAnisotropy();
satelliteTexture.minFilter = satelliteTexture.magFilter = THREE.LinearFilter;
// or:
satelliteTexture.minFilter = satelliteTexture.magFilter = THREE.NearestFilter;
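As for the aspect-ratio part of the question, one option (a sketch, assuming the TextureLoader onLoad callback) is to read the image's dimensions once the texture has loaded and derive the sprite scale from them:
// Hedged sketch: scale the sprite from the loaded image's aspect ratio.
new THREE.TextureLoader().load(
    'assets/img/sections/section-astronaut/small-satellite.png',
    function (texture) {
        var aspect = texture.image.width / texture.image.height;
        var spriteHeight = 0.075; // world-space height, pick to taste
        satelliteSprite.material.map = texture;
        satelliteSprite.material.needsUpdate = true;
        satelliteSprite.scale.set(spriteHeight * aspect, spriteHeight, 1);
    }
);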
I need to apply a texture on a ExtrudeGeometry object.
The shape is a circle and the extrude path is composed of 2 vectors:
One for the top.
One for the bottom.
I didn't choose CylinderGeometry because I need to place the top/bottom sections of my geometry at precise positions, and because the created geometry will not always be purely vertical (like an oblique cylinder, for example).
Here is a picture of a section (one top vector, one bottom vector and a shape extruded between these 2 vectors).
and a picture of the texture I'm trying to apply.
All I want to do is to wrap this picture on the vertical sides of my object just one time.
Here is my code:
var biVectors = [ new THREE.Vector3( this.startVector.x, this.startVector.y, this.startVector.z ) , new THREE.Vector3( this.endVector.x, this.endVector.y, this.endVector.z ) ];
var wellSpline = new THREE.SplineCurve3(biVectors);
var extrudeSettings = {
    steps: 1,
    material: 0,
    extrudeMaterial: 1,
    extrudePath: wellSpline
};
var pts = [];
for (var i = 0; i <= this.segments; i++) {
    var theta = (i / this.segments) * Math.PI * 2;
    pts.push( new THREE.Vector3(Math.cos(theta) * this.diameter, Math.sin(theta) * this.diameter, 0) );
}
var shape = new THREE.Shape( pts );
var geometry = new THREE.ExtrudeGeometry( shape, extrudeSettings );
var texture = THREE.ImageUtils.loadTexture( 'textures/sampleTexture2.jpg' );
texture.wrapS = texture.wrapT = THREE.RepeatWrapping;
texture.flipY = false;
var material = new THREE.MeshBasicMaterial( { map: texture } );
var slice = new THREE.Mesh( geometry, material );
var faceNormals = new THREE.FaceNormalsHelper( slice );
console.log("face normals: ", faceNormals);
myCanvas.scene.add( faceNormals );
slice.parentObject = this;
myCanvas.scene.add( slice );
this.object3D = slice;
}
Now, as you can see, the mapping is not correct at all.
I've read a lot of information about this problem over the last 3 days. But I'm running out of options as I'm new to three.js.
I think I have to redefine the UV coordinates but I have no clue how to do this.
It seems that wrapping a texture on a cylinder-like object is anything but easy in three.js.
Can someone please help me with this issue?
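One way to redefine the UV coordinates (a hedged sketch, written against the modern BufferGeometry API rather than the older Geometry/faceVertexUvs API used above) is a cylindrical projection: U comes from the angle around the extrusion axis and V from the position along it.
// Hedged sketch: cylindrical UV projection for a roughly vertical extruded tube.
// Assumes the axis runs along Y between yMin and yMax; adapt the axis to your path.
function assignCylindricalUVs(geometry, yMin, yMax) {
    var pos = geometry.attributes.position;
    var uv = new Float32Array(pos.count * 2);
    for (var i = 0; i < pos.count; i++) {
        var x = pos.getX(i), y = pos.getY(i), z = pos.getZ(i);
        uv[2 * i]     = (Math.atan2(z, x) + Math.PI) / (2 * Math.PI); // angle -> U
        uv[2 * i + 1] = (y - yMin) / (yMax - yMin);                   // height -> V
    }
    geometry.setAttribute('uv', new THREE.BufferAttribute(uv, 2));
}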
I am using three.js with WebGL. I have a single texture file called support.jpg, 100x100.
I am creating planes on the fly with different heights and widths. I need the support.jpg texture to scale to the width and then repeat down the plane (as shown in the image below).
For example: if the plane was (height: 10, width: 10), it would fit the texture once. If the plane was (height: 100, width: 10), it would have 10 of the textures repeating 10 by 10. If the plane was (height: 100, width: 50), it would have 2 of the textures repeating 50 by 50.
Question: How do I create a plane that will have the correct texture mapping?
Here is what I have so far, but it is only rendering a single texture (this is width 200, height 800).
function CreateSupportBeam() {
    var mesh, texture, material;
    texture = THREE.ImageUtils.loadTexture("images/support.png");
    material = new THREE.MeshBasicMaterial({ map: texture, transparent: true });
    var uvs = [];
    uvs.push(new THREE.Vector2(0, 0));
    uvs.push(new THREE.Vector2(1, 0));
    uvs.push(new THREE.Vector2(1, 4));
    uvs.push(new THREE.Vector2(0, 4));
    var geo = new THREE.PlaneGeometry(200, 800);
    geo.faceVertexUvs[0].push([uvs[0], uvs[1], uvs[2], uvs[3]]);
    mesh = new THREE.Mesh(geo, material);
    scene.add(mesh);
}
rollercoaster.dickinsonbros.com/ <- This is the project I am working on.
You do not need to change the UVs.
Use a pattern like the following to avoid distortion and ensure that the pattern repeats and starts at the "top".
var geometry = new THREE.PlaneGeometry( length, height );
var scale = height / length;
var offset = Math.floor( scale ) - scale;
var texture = THREE.ImageUtils.loadTexture( ... );
texture.wrapT = THREE.RepeatWrapping;
texture.wrapS = THREE.ClampToEdgeWrapping;
texture.repeat.set( 1, scale );
texture.offset.set( 0, offset );
If that is not exactly what you are looking for, experiment until you get it the way you want it.
three.js r.66
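For reference, here is a sketch of the same idea wired into the question's CreateSupportBeam (hypothetical width/height parameters; the manual UVs are dropped since they are not needed):
function CreateSupportBeam(width, height) {
    var texture = THREE.ImageUtils.loadTexture("images/support.png");
    var scale = height / width;
    texture.wrapT = THREE.RepeatWrapping;              // repeat down the plane
    texture.wrapS = THREE.ClampToEdgeWrapping;         // no horizontal repeat
    texture.repeat.set(1, scale);
    texture.offset.set(0, Math.floor(scale) - scale);  // start the pattern at the "top"
    var material = new THREE.MeshBasicMaterial({ map: texture, transparent: true });
    var mesh = new THREE.Mesh(new THREE.PlaneGeometry(width, height), material);
    scene.add(mesh);
}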