I tried to change a texture inside a .glb file containing a map with many objects (meshes). The texture is changed, but for a few seconds before it finishes loading, that part of the model appears black; then it displays correctly.
It seems to be a loading problem caused by the size of the glb.
The glb file is 20 MB and the texture that I replace on the fly is 1.8 MB. How can I eliminate this effect?
const texture = new TextureLoader().load(this.config.service.baseURL + '/' + this.config.projectCurrent.path + '/' + immagine)
const plane = this.terrainDef.scene.getObjectByName(this.config.projectCurrent.mapLayer.object)
plane.material = new MeshLambertMaterial({ map: texture })
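One way to avoid the black flash is to keep the old material on the mesh until the new texture has finished loading, by swapping materials only inside TextureLoader's onLoad callback. A minimal sketch, assuming the same config object as above; buildTextureURL and swapTexture are hypothetical helper names:

```javascript
// Hypothetical helper mirroring the URL concatenation in the question.
function buildTextureURL(baseURL, projectPath, fileName) {
  return [baseURL, projectPath, fileName].join('/');
}

// Sketch, assuming TextureLoader and MeshLambertMaterial are imported
// from 'three' as in the snippet above.
function swapTexture(scene, config, immagine) {
  const url = buildTextureURL(config.service.baseURL, config.projectCurrent.path, immagine);
  new TextureLoader().load(url, texture => {
    // onLoad: the image is fully loaded, so the old texture stays
    // visible until this moment and no black frame appears.
    const plane = scene.getObjectByName(config.projectCurrent.mapLayer.object);
    plane.material = new MeshLambertMaterial({ map: texture });
  });
}
```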
I am working on a glb viewer with three.js r116. In order to display metalness correctly, I add an environment map to the mesh.
Without the environment map, the model displays black, as expected. With the envMap it shows correctly, but the console throws: RENDER WARNING: there is no texture bound to the unit 1.
model.traverse(n => {
  if (n.isMesh) {
    if (n.material.metalness === 1) {
      n.material.envMap = cubeTexture
    }
    n.castShadow = true
    n.receiveShadow = true
  }
})
I tried setting a 1x1px white texture but couldn't figure out how to make that work.
This is how I create the cube texture:
let loader = new THREE.CubeTextureLoader()
let cubeTexture = loader.load([
  './images/envMap/posx.jpg', './images/envMap/negx.jpg',
  './images/envMap/posy.jpg', './images/envMap/negy.jpg',
  './images/envMap/posz.jpg', './images/envMap/negz.jpg'
])
You can safely ignore this warning. It happens because you are trying to use an environment map, among other textures, before it has actually loaded.
three.js allocates a texture unit as soon as a texture is applied to a material. However, the actual texture binding and upload can only happen after the texture has completely loaded. Firefox and Safari don't even log a warning in this case (at least on macOS).
If you want to avoid the warning, start loading your glTF asset in the onLoad() callback of CubeTextureLoader.load().
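The sequencing described above can be sketched like this, reusing the face paths from the question (assumes THREE and GLTFLoader are in scope; loadModelAfterEnvMap is a hypothetical name):

```javascript
// The six cube faces from the question, in three.js order:
// +x, -x, +y, -y, +z, -z.
const faces = [
  './images/envMap/posx.jpg', './images/envMap/negx.jpg',
  './images/envMap/posy.jpg', './images/envMap/negy.jpg',
  './images/envMap/posz.jpg', './images/envMap/negz.jpg'
];

// Sketch: start the glTF load only inside the cube texture's onLoad
// callback, so the envMap can never be sampled before it is bound.
// Assumes THREE and GLTFLoader are already in scope.
function loadModelAfterEnvMap(url, onReady) {
  new THREE.CubeTextureLoader().load(faces, cubeTexture => {
    new GLTFLoader().load(url, gltf => {
      gltf.scene.traverse(n => {
        if (n.isMesh && n.material.metalness === 1) {
          n.material.envMap = cubeTexture;
        }
      });
      onReady(gltf.scene, cubeTexture);
    });
  });
}
```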
I'm trying to load spritesheets from Texturepacker in ThreeJS, which comprises an image and json. The image has a bunch of small sprites packed together and the json defines the location and size of the small sprites in the image.
I have tried 3 methods for loading:
1. using three.js loaders for the JSON and image and assigning new textures with different repeat and offset values
2. using WebGLRenderTarget buffers to crop the source image into
3. using Canvas buffers to crop the source image into
The method using multiple texture instances with different offsets should work, since I'm not copying the source image, but when I run an animation by switching a material's texture it uses a huge amount of RAM, as if the entire source spritesheet were copied into memory for each texture. If I instead change the texture offsets for the animation, it works, but an offset change is applied to every object using the same source spritesheet.
The WebGLRenderTarget method needs a camera and scene for cropping the textures and a sprite added to the scene. The output from this is unusable as it doesn't generate a 1:1 crop of the original texture and it's really slow to load. Is there a way to render textures 1:1 to smaller buffers in ThreeJS?
The Canvas method worked best where I create a canvas element for each sprite and crop the spritesheet into each. This is 1:1 and good quality but the point of using a spritesheet is that the GPU only has a single image to address and this needs an HTML loader process. Ideally I don't want to crop the spritesheet to smaller texture buffers.
Why does using the same large source image with multiple THREE.Texture objects use so much memory? I expected it would only need to keep a single texture in memory and the Texture objects would just display the same texture with different offsets.
I found a way that works.
First, I load the spritesheet image via a three.js ImageLoader (it gets stored in _spritesheets[textureID].texture) and make a WebGLTexture from it.
let texture = this._spritesheets[textureID].texture; // the loaded spritesheet image
let gl = this._renderer.getContext();

// Upload the spritesheet to the GPU once, as a raw WebGL texture
let webGLTexture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, webGLTexture);
gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, texture);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
Then I point each three.js texture's internal WebGL texture property at this shared texture and set its __webglInit flag to true so three.js doesn't create a new buffer for it.
let frames = textureJSON.frames;
for (let frameID of Object.keys(frames)) {
  let frame = frames[frameID];
  let t = new THREE.Texture(texture);
  let data = frame.frame;

  // Map this frame's pixel rect to UV repeat/offset (flip Y for UV space)
  t.repeat.set(data.w / texture.width, data.h / texture.height);
  t.offset.x = data.x / texture.width;
  t.offset.y = 1 - data.h / texture.height - data.y / texture.height;

  // Point the three.js texture at the shared WebGL texture
  let textureProperties = this._renderer.properties.get(t);
  textureProperties.__webglTexture = webGLTexture;
  textureProperties.__webglInit = true;

  this._textures[frameID] = {};
  this._textures[frameID].texture = t;
  this._textures[frameID].settings = { wrapS: 1, wrapT: 1, magFilter: THREE.LinearFilter, minFilter: THREE.NearestFilter };
}
The spritesheet JSON is loaded via a ThreeJS FileLoader. I then store the sprites by frame id in a _textures object and can assign those to a material's map property.
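The repeat/offset arithmetic in that loop can be isolated into a small pure function (a sketch; frameToUV is a hypothetical name), which makes the Y-flip between image space and UV space explicit:

```javascript
// Sketch: convert a TexturePacker frame rect (pixels, y measured down
// from the top of the sheet) into three.js UV repeat/offset values
// (y measured up from the bottom). `frameToUV` is a hypothetical helper.
function frameToUV(frame, sheetWidth, sheetHeight) {
  return {
    repeatX: frame.w / sheetWidth,
    repeatY: frame.h / sheetHeight,
    offsetX: frame.x / sheetWidth,
    // Flip Y: TexturePacker measures from the top, UVs from the bottom
    offsetY: 1 - (frame.y + frame.h) / sheetHeight
  };
}
```

For example, a 64x64 frame at the top-left corner of a 256x256 sheet ({ x: 0, y: 0, w: 64, h: 64 }) maps to repeat (0.25, 0.25) and offset (0, 0.75), matching the formula used in the loop above.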
I'm struggling with textures on objects that are a bit farther back in the scene. The textures become very jagged and create a disturbing effect as the camera moves. I've tried changing the anisotropy, and I've tried changing the min and mag filters, but nothing seems to help at all.
Code I'm using to load textures (all textures are 1024px by 1024px):
var texture = new THREE.Texture();
var texloader = new THREE.ImageLoader(manager);
texloader.load('static/3d/' + name + '.jpg', function (image) {
  texture.image = image;
  texture.needsUpdate = true;
  texture.anisotropy = 1;
  texture.minFilter = THREE.LinearFilter;
  texture.magFilter = THREE.LinearMipmapLinearFilter;
});
You can see it in action here: http://www.90595.websys.sysedata.no/
gaitat is wrong, you do want the mipmaps.
The problem with your code is that they are not generated.
Using the console, I found that while "generateMipmaps" in your textures is set to "true", mipmaps are not generated, as seen in this screenshot: http://imgur.com/hAUEaur.
I looked at your textures, and I believe the mipmaps weren't generated because your texture dimensions are not powers of 2 (e.g. 128x128, 256x256, 512x512). Try making your texture width and height powers of 2, and I think the mipmaps will be generated and the textures won't look jagged anymore.
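If resizing the source files isn't an option, the snap-to-power-of-two can also be done at load time. A sketch, where nearestPowerOfTwo and makePOT are hypothetical helpers and the canvas step assumes a browser environment:

```javascript
// Round a dimension to the nearest power of two (1000 -> 1024, 600 -> 512).
function nearestPowerOfTwo(n) {
  return Math.pow(2, Math.round(Math.log(n) / Math.LN2));
}

// Redraw an image onto a power-of-two canvas so WebGL can generate
// mipmaps for it. Assumes a browser environment (document, 2D context).
function makePOT(image) {
  const canvas = document.createElement('canvas');
  canvas.width = nearestPowerOfTwo(image.width);
  canvas.height = nearestPowerOfTwo(image.height);
  canvas.getContext('2d').drawImage(image, 0, 0, canvas.width, canvas.height);
  return canvas; // a canvas can be used as a THREE.Texture image
}
```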
As objects move farther away from the camera, WebGL uses automatically generated lower-resolution versions of your textures called mipmaps. If you don't like them, disable them with:
texture.generateMipmaps = false;
Okay. So I thought I'd tried all the different mipmap filters, but apparently not. This is what ended up doing the trick:
texture.minFilter = THREE.NearestMipMapNearestFilter;
texture.magFilter = THREE.LinearMipMapLinearFilter;
Didn't need the anisotropy at all.
I have an effectsComposer creating a result that is heavy in white.
var composer = new THREE.EffectComposer(renderer);
var renderPass = new THREE.RenderPass(shaderHeavyScene, camera);
composer.addPass(renderPass); // attach the pass so the composer renders it
composer.render(delta);
In the same project I have a material and scene with a second image loaded into it.
When I replace: composer.render(delta);
with: renderer.render( secondImageScene, camera );
I can see my secondary image loaded into the three.js canvas.
My plan was to multiply the white heavy image of effectsComposer over the secondImageScene. (Revealing secondImageScene through the white)
My question is this: How would I go about multiplying the end result of the effectsComposer overtop of secondImageScene?
Use another pass that takes as input the two textures rendered in the previous passes and multiplies them.
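A sketch of such a pass, following the tDiffuse convention of three.js's ShaderPass shaders (the uniform name tSecond and the shader object are assumptions, not library code): render secondImageScene to a render target, then add a final pass whose fragment shader multiplies the composer's output with that target's texture.

```javascript
// Sketch: a multiply-blend shader in the style of three.js ShaderPass
// shaders. `tDiffuse` receives the previous pass's output; `tSecond`
// (a name chosen here) is set to the second scene's render target.
const MultiplyBlendShader = {
  uniforms: {
    tDiffuse: { value: null }, // white-heavy composer result
    tSecond:  { value: null }  // secondImageScene rendered to a target
  },
  vertexShader: `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: `
    uniform sampler2D tDiffuse;
    uniform sampler2D tSecond;
    varying vec2 vUv;
    void main() {
      // Multiply: white (1.0) in tDiffuse reveals tSecond unchanged,
      // darker areas darken it toward black.
      gl_FragColor = texture2D(tDiffuse, vUv) * texture2D(tSecond, vUv);
    }
  `
};

// Usage sketch (assuming THREE.ShaderPass and THREE.WebGLRenderTarget):
//   const target = new THREE.WebGLRenderTarget(width, height);
//   renderer.setRenderTarget(target);
//   renderer.render(secondImageScene, camera);
//   renderer.setRenderTarget(null);
//   const multiplyPass = new THREE.ShaderPass(MultiplyBlendShader);
//   multiplyPass.uniforms.tSecond.value = target.texture;
//   composer.addPass(multiplyPass);
```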
I have created a PlaneGeometry and a texture made with ImageUtils, using CanvasRenderer, and then created a mesh and set all the relevant parameters. The result is a PlaneGeometry with the image texture in the space, but a diagonal line shows up; moreover, the image is divided into two parts by this diagonal line, so the images on the two parts become mismatched. How should I resolve this? The key code is as follows:
var planeGeo = new THREE.PlaneGeometry(2000, 3000);
var map = THREE.ImageUtils.loadTexture('images/greenLines.png');
map.wrapS = map.wrapT = THREE.RepeatWrapping;
map.repeat.set(2, 2);
map.anisotropy = 16;
var planeMaterial = new THREE.MeshBasicMaterial({ color: 0xFFBBEE, map: map, overdraw: true });
var planeMesh = new THREE.Mesh(planeGeo, planeMaterial);
planeMesh.position.set(2000, -2000, 5000);
scene.add(planeMesh);
NOTICE: the images are constructed from green lines. The result shows that the lines are cut, and one line becomes two lines located on the two divided triangles.
How should I modify my code so that the two parts of this image match up without displacement?
Can the diagonal line be removed?