In Three.js, how can I change the way in which a texture gets mapped onto a plane?
Let's assume we have a 1x1 plane and a 16:9 image. How can I control the way in which that image gets mapped onto the plane?
By default, the image gets "squished". I would like it to maintain its aspect ratio and have any overlap get "cut off". Is there a way to configure the material or texture to do this, or would I use a shader? If so, what would it need to look like?
const planeMesh = new THREE.Mesh(
  new THREE.PlaneBufferGeometry(1, 1),
  new THREE.MeshBasicMaterial({
    map: texture,
  })
);
PS: In the future, I would also like to be able to zoom into and out of the image on mouse hover without affecting the size of the plane, so I'm thinking a shader might be better?
A Texture already has several properties built-in that can do what you're looking for.
const texture = textureLoader.load("whatever.png");
const planeMesh = new THREE.Mesh(
  new THREE.PlaneBufferGeometry(1, 1),
  new THREE.MeshBasicMaterial({
    map: texture,
  })
);
// Sets the pivot point to the center of the texture
texture.center.set(0.5, 0.5);
// Make the texture repeat 0.5625 times in the x-axis to match 16:9 ratio
let ratio = 9 / 16;
texture.repeat.set(ratio, 1);
// Scale texture up to "zoom" into it
let zoom = 0.5;
texture.repeat.set(ratio * zoom, 1 * zoom);
You can read more about the .repeat, .center, and even .rotation properties in the Texture docs. Just keep in mind that repeating a texture is a bit counter-intuitive, because it is the inverse of scaling: to scale a texture by 2, you tell it to repeat 1/2 times.
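For the hover-zoom mentioned in the question, here is a minimal sketch building on the snippet above; the event wiring on the canvas and the zoom factor of 2 are assumptions, and a real version would probably raycast to detect hovering over the mesh itself:
const baseRatio = 9 / 16; // crop the 16:9 image to the square plane, "cover" style

function setZoom(zoom) {
  // Repeating fewer times samples a smaller window of the image,
  // which reads as zooming in; the plane itself never changes size.
  texture.repeat.set(baseRatio / zoom, 1 / zoom);
}

renderer.domElement.addEventListener("mouseenter", () => setZoom(2)); // zoom in
renderer.domElement.addEventListener("mouseleave", () => setZoom(1)); // reset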
Related
I get seams between the horizontal faces of a cube when using a texture atlas in three.js.
Here is a demo: http://jsfiddle.net/rnix/gtxcj3qh/7/ or http://jsfiddle.net/gtxcj3qh/8/ (from the comments)
Screenshot of the problem:
Here I use repeat and offset:
var materials = [];
var t = [];
var imgData = document.getElementById("texture_atlas").src;
for ( var i = 0; i < 6; i++ ) {
  t[i] = THREE.ImageUtils.loadTexture( imgData ); // 2048x256
  t[i].repeat.x = 1 / 8;
  t[i].offset.x = i / 8;
  //t[i].magFilter = THREE.NearestFilter;
  t[i].minFilter = THREE.NearestFilter;
  t[i].generateMipmaps = false;
  materials.push( new THREE.MeshBasicMaterial( { map: t[i], overdraw: 0.5 } ) );
}
var skyBox = new THREE.Mesh( new THREE.CubeGeometry( 1024, 1024, 1024), new THREE.MeshFaceMaterial(materials) );
skyBox.applyMatrix( new THREE.Matrix4().makeScale( 1, 1, -1 ) );
scene.add( skyBox );
The atlas is 2048x256 (a power of two). I also tried manual UV mapping instead of repeat, but the result is the same. I use 8 tiles instead of 6 because I thought the precision of dividing by 6 might be the cause, but it is not.
The pixels on this line come from the next tile in the atlas. I tried a completely white atlas and there were no artifacts, which explains why there are no seams on the vertical borders of the Z-faces. I have played with filters, wrapT, wrapS, and mipmaps, but it does not help. Increasing the resolution does not help either; here is an 8192x1024 atlas: http://s.getid.org/jsfiddle/atlas.png. I tried another atlas, and the result is the same.
I know that I can split the atlas into separate files, and that works perfectly, but it is not convenient.
What's wrong?
I think the issue is the filtering problem common to texture sheets. On image borders within a sheet, the GPU may pick the texel from either the correct image or the neighboring image due to limited precision. Because the colors are usually very different, this results in visible seams. For regular textures, this is solved with CLAMP_TO_EDGE.
If you must use a texture atlas, then you need to fake CLAMP_TO_EDGE behavior by padding the image borders. See this answer: https://gamedev.stackexchange.com/questions/61796/sprite-sheet-textures-picking-up-edges-of-adjacent-texture. It should look something like this (borders exaggerated for clarity):
Otherwise, the simpler solution is to use a different texture for each face. WebGL supports cube textures, which are what skyboxes are usually implemented with.
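If you do go per-face, here is a sketch of the cube-texture route in more recent three.js versions (the six file names are assumptions):
// One image per face, loaded as a single cube texture.
var skyTexture = new THREE.CubeTextureLoader().load([
  'px.png', 'nx.png', // +x, -x
  'py.png', 'ny.png', // +y, -y
  'pz.png', 'nz.png'  // +z, -z
]);
scene.background = skyTexture;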
Hacking the UVs (replacing every value of 1.0 with 0.999 and every value of 0 with 0.001) partially works around this problem.
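A rough sketch of that hack against the legacy Geometry API the question uses, where geometry stands for the cube's geometry:
// Nudge edge UVs inward so the sampler never reads the neighbouring tile.
geometry.faceVertexUvs[0].forEach(function (faceUvs) {
  faceUvs.forEach(function (uv) {
    if (uv.x === 1) uv.x = 0.999;
    if (uv.x === 0) uv.x = 0.001;
    if (uv.y === 1) uv.y = 0.999;
    if (uv.y === 0) uv.y = 0.001;
  });
});
geometry.uvsNeedUpdate = true;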
I've gotten sprites to work fine in my game engine, but I'm noticing some odd issues when moving the camera. Mainly, the sprites' pixels 'shake' or go 'wavy' when the camera moves. You can see an example here (please watch in HD): http://youtu.be/om3EhKsGd9M
I'm setting the following properties on my sprite texture and sprite material, and my textures are 64 x 64 pixels in size:
spriteTexture.magFilter = THREE.NearestFilter;
spriteTexture.minFilter = THREE.NearestMipMapNearestFilter;
var spriteMaterial = new THREE.SpriteMaterial({
  map: sheet,
  useScreenCoordinates: true,
  transparent: true,
  side: THREE.DoubleSide,
});
Any ideas on a solution for this issue? I'd wager that this is somehow related to how the sprites are rendered when mapped into 3D space.
I am composing 2D planes with textures, on 3 levels:
a background plane at z=0,
black shapes for connections at z=0.1, and
small planes with textures at z=0.2.
The problem is that when I move the camera, the planes seem to change z position: they are drawn at an incorrect Z that depends on the camera's position, and moving the camera changes it again, which looks very ugly.
Maybe I need to activate some Z-buffer property for correct drawing?
The WebGL init looks like this, and the planes are all the same (only the Z coordinate changes):
renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
renderer._microCache = new MicroCache(); // image cache
renderer.setClearColor(0xeeeeee, 1);
document.body.appendChild(renderer.domElement);
// add directional light source
var directionalLight = new THREE.DirectionalLight(0xffffff, 1);
directionalLight.position.set(1, 1, 1300).normalize();
scene.add(directionalLight);
//background plane
plane = new THREE.Mesh(
  new THREE.PlaneGeometry(200000, 200000, 1, 1),
  new THREE.MeshLambertMaterial({ color: 0xffffff, opacity: planeOpacity, transparent: true })
);
plane.position.z = 0;
scene.add(plane);
The other planes are exactly the same, just at greater Z positions.
Help please!
Thanks!
Palomo
The thing you're seeing is probably z-fighting. Internally, the GPU represents depth with an integer, so there is only a fixed number of distinct depth values between the camera's near and far planes. The solution is either to move your planes further apart or to narrow the camera's near-far range.
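As a rough sketch of both options (the near/far values and the wider spacing are assumptions; what matters is shrinking the far-to-near ratio, or widening the gaps between layers):
// Option 1: narrow the depth range so close-together planes
// still map to distinct depth values.
camera = new THREE.PerspectiveCamera(
  45, window.innerWidth / window.innerHeight,
  10,   // near: push out as far as the scene allows
  5000  // far: pull in as close as the scene allows
);

// Option 2: spread the three layers further apart
// (backgroundPlane, connectionShapes and texturedPlanes stand in for your levels).
backgroundPlane.position.z = 0;
connectionShapes.position.z = 10;
texturedPlanes.position.z = 20;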
I have a mesh with plane geometry that I texture with a 2D canvas element. This is working perfectly.
var landTexture = new THREE.Texture(land.canvas); // animated canvas element
var material = new THREE.MeshLambertMaterial({map: landTexture});
var plane = new THREE.Mesh( new THREE.PlaneGeometry( this.land_width, this.land_height ), material );
landTexture.needsUpdate = true;
landObject.add(plane);
The 2D canvas has an animated pattern which I want to use as a texture on a pentagon instead of a plane. How do I go about texturing more complex polygons, e.g. a pentagon, with a 2D canvas texture?
Edit: how I generate the pentagon
var pentagon = new THREE.Mesh( new THREE.CircleGeometry(this.land_width, 5), material );
Screenshot of the animated texture on PlaneGeometry and the same texture on a CircleGeometry with 5 sides. Notice the "stretched" canvas texture on the pentagon, which is not what I want. It should fit proportionally.
I think the UVs are already set up, so that code such as:
var grid = THREE.ImageUtils.loadTexture( 'images/uvgrid01.jpg' );
var pentagon = new THREE.Mesh( new THREE.CircleGeometry(50, 5), new THREE.MeshBasicMaterial({ map: grid }) );
when the grid is the image:
will produce an image like:
Is that what you're looking for? Then all that is left is to update the texture repeatedly, setting its update flag each time. I can go into more detail, but I would need to know how the canvas is being animated.
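For reference, a minimal sketch of that loop, reusing landTexture from your snippet and assuming the canvas is redrawn elsewhere each frame:
function animate() {
  requestAnimationFrame(animate);
  // Flag the texture so three.js re-uploads the canvas to the GPU.
  landTexture.needsUpdate = true;
  renderer.render(scene, camera);
}
animate();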
I am trying to use the Three.js library to display a large number of colored points on the screen (about half a million to million for example). I am trying to use the Canvas renderer rather than the WebGL renderer if possible (The web pages would also be displayed in the Google Earth Client bubbles, which seems to work with Canvas renderer but not the WebGL renderer.)
While I have the problem solved for a small number of points (tens of thousands) by modifying the code from here, I am having trouble scaling it beyond that.
But with the following code, using WebGL and the particle system, I can render half a million random points, but without colors.
...
var particles = new THREE.Geometry();
var pMaterial = new THREE.ParticleBasicMaterial({
  color: 0xFFFFFF,
  size: 1,
  sizeAttenuation: false
});

// now create the individual particles
for (var p = 0; p < particleCount; p++) {
  // create a particle with random position values, -250 -> 250
  var pX = Math.random() * POSITION_RANGE - (POSITION_RANGE / 2),
      pY = Math.random() * POSITION_RANGE - (POSITION_RANGE / 2),
      pZ = Math.random() * POSITION_RANGE - (POSITION_RANGE / 2),
      particle = new THREE.Vertex(new THREE.Vector3(pX, pY, pZ));

  // add it to the geometry
  particles.vertices.push(particle);
}

var particleSystem = new THREE.ParticleSystem(particles, pMaterial);
scene.add(particleSystem);
...
Is the better performance of the above code due to the particle system? From what I have read in the documentation, it seems the particle system can only be used with the WebGL renderer.
So my question(s) are
a) Can I render such a large number of particles using the Canvas renderer, or is it always going to be slower than the WebGL/ParticleSystem version? If so, how do I go about it? What objects and/or tricks can I use to improve performance?
b) Is there a compromise I can reach if I give up some features? In other words, can I still use the Canvas renderer for the large dataset if I give up the need to color the individual points?
c) If I have to give up the Canvas and use the WebGL version, is it possible to change the colors of the individual points? It seems the color is set by the material passed to the ParticleSystem and that sets the color for all the points.
EDIT: ParticleSystem and PointCloud have been renamed to Points. In addition, ParticleBasicMaterial and PointCloudMaterial have been renamed to PointsMaterial.
This answer only applies to versions of three.js prior to r.125.
To have a different color for each particle, you need to have a color array as a property of the geometry, and then set vertexColors to THREE.VertexColors in the material, like so:
// vertex colors
var colors = [];
for ( var i = 0; i < geometry.vertices.length; i++ ) {
  // random color
  colors[i] = new THREE.Color();
  colors[i].setHSL( Math.random(), 1.0, 0.5 );
}
geometry.colors = colors;

// material
material = new THREE.PointsMaterial( {
  size: 10,
  transparent: true,
  opacity: 0.7,
  vertexColors: THREE.VertexColors
} );

// point cloud
pointCloud = new THREE.Points( geometry, material );
Your other questions are a little too general for me to answer, and besides, it depends on exactly what you are trying to do and what your requirements are. Yes, you can expect Canvas to be slower.
EDIT: Updated for three.js r.124
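If you are on r.125 or later, where the legacy Geometry class was removed, a rough BufferGeometry equivalent might look like this (an untested sketch; particleCount is taken from the question):
// Per-point positions and colors stored as flat typed arrays.
const positions = new Float32Array(particleCount * 3);
const colors = new Float32Array(particleCount * 3);
const color = new THREE.Color();

for (let i = 0; i < particleCount; i++) {
  positions[3 * i]     = Math.random() * 500 - 250;
  positions[3 * i + 1] = Math.random() * 500 - 250;
  positions[3 * i + 2] = Math.random() * 500 - 250;

  color.setHSL(Math.random(), 1.0, 0.5);
  color.toArray(colors, 3 * i);
}

const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
geometry.setAttribute('color', new THREE.BufferAttribute(colors, 3));

// In newer versions, vertexColors is a boolean rather than THREE.VertexColors.
const material = new THREE.PointsMaterial({ size: 10, vertexColors: true });
const pointCloud = new THREE.Points(geometry, material);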