Threejs: There is no texture bound to the unit 1 - three.js

I am working on a glb viewer with threejs r116. In order to display metalness correctly, I add an environment map to the mesh.
Without the environment map, the model displays black, as expected. With the envMap it shows correctly, but the console throws: RENDER WARNING: there is no texture bound to the unit 1.
model.traverse(n => {
  if (n.isMesh) {
    if (n.material.metalness === 1) {
      n.material.envMap = cubeTexture
    }
    n.castShadow = true
    n.receiveShadow = true
  }
})
I tried setting a 1x1px white texture but couldn't figure out how to make that work.
This is how I create the cube texture:
let loader = new THREE.CubeTextureLoader()
let cubeTexture = loader.load([
  './images/envMap/posx.jpg', './images/envMap/negx.jpg',
  './images/envMap/posy.jpg', './images/envMap/negy.jpg',
  './images/envMap/posz.jpg', './images/envMap/negz.jpg'
])

You can safely ignore this warning. It happens because you are trying to use the environment map, alongside the material's other textures, before it has actually loaded.
three.js allocates a texture unit as soon as a texture is applied to a material. However, the actual texture binding and upload can only be done after the texture has completely loaded. Firefox and Safari don't even log a warning in this case (at least on macOS).
If you want to avoid the warning, start loading your glTF asset in the onLoad() callback of CubeTextureLoader.load().
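For example, a minimal sketch of that ordering (assuming THREE.GLTFLoader from the three.js examples is available, and that the './model.glb' path and the scene variable are placeholders):
let loader = new THREE.CubeTextureLoader()
let cubeTexture = loader.load([
  './images/envMap/posx.jpg', './images/envMap/negx.jpg',
  './images/envMap/posy.jpg', './images/envMap/negy.jpg',
  './images/envMap/posz.jpg', './images/envMap/negz.jpg'
], () => {
  // the cube texture is completely loaded here, so assigning it as envMap no longer warns
  new THREE.GLTFLoader().load('./model.glb', gltf => {
    gltf.scene.traverse(n => {
      if (n.isMesh && n.material.metalness === 1) {
        n.material.envMap = cubeTexture
      }
    })
    scene.add(gltf.scene)
  })
})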

Related

Three.js PMREMGenerator has incorrect texture filtering

I have a scene in Three (using AFrame) that requires environment mapping for lighting. Using a standard HDR cubemap, I get the following results:
This is correct as far as blurring based on roughness goes, since I have mipmaps being generated and the minFilter set to LinearMipmapLinearFilter. The issue with this approach is that ambient lighting isn't being applied - the directional light in the scene is the only thing providing any lighting information. Unfortunately, this results in entirely black shadows, no matter how bright the HDRI.
However, if I use the PMREMGenerator from Three in addition to the above, the ambient lighting issue is solved. Unfortunately, this is what happens as a result:
As shown here, the texture filtering is now out of whack. According to the comments left in the PMREMGenerator script itself:
This class generates a Prefiltered, Mipmapped Radiance Environment Map (PMREM) from a cubeMap environment texture. This allows different levels of blur to be quickly accessed based on material roughness. It is packed into a special CubeUV format that allows us to perform custom interpolation so that we can support nonlinear formats such as RGBE. Unlike a traditional mipmap chain, it only goes down even more filtered 'mips' at the same LOD_MIN resolution, associated with higher roughness levels. In this way we maintain resolution to smoothly interpolate diffuse lighting while limiting sampling computation.
...which leads me to believe the output should be smoothed, like in my first example.
Here's my code for the first example:
const ctl = new HDRCubeTextureLoader();
ctl.setPath(hdrPath);
ctl.setDataType(THREE.UnsignedByteType);
const hdrUrl = [
  `${src}/px.hdr`,
  `${src}/nx.hdr`,
  `${src}/py.hdr`,
  `${src}/ny.hdr`,
  `${src}/pz.hdr`,
  `${src}/nz.hdr`
];
const hdrSky = ctl.load(hdrUrl, tex => this.skies[src] = hdrSky );
// Then later...
obj.material.envMap = this.skies[this.skySources[index]];
obj.material.needsUpdate = true;
Here's my code for the second example:
const ctl = new HDRCubeTextureLoader();
ctl.setPath(hdrPath);
ctl.setDataType(THREE.UnsignedByteType);
const hdrUrl = [
  `${src}/px.hdr`,
  `${src}/nx.hdr`,
  `${src}/py.hdr`,
  `${src}/ny.hdr`,
  `${src}/pz.hdr`,
  `${src}/nz.hdr`
];
const pmremGen = new PMREMGenerator(this.el.sceneEl.renderer);
pmremGen.compileCubemapShader();
const hdrSky = ctl.load(hdrUrl, tex => {
  const hdrRenderTarget = pmremGen.fromCubemap(hdrSky);
  this.skies[src] = hdrRenderTarget.texture;
});
// Then later...
obj.material.envMap = this.skies[this.skySources[index]];
obj.material.needsUpdate = true;
I seem to have hit a wall in regard to the filtering. Even when I explicitly change the filtering type and turn on mipmap generation inside of PMREMGenerator.js, the results appear to be the same. Here's an official example that uses the PMREMGenerator without any issue: https://threejs.org/examples/webgl_materials_envmaps_hdr.html
As a closing remark I will note that we're using Three.js r111 (and there are reasons we can't fully switch it out), so I brought in the PMREMGenerator from the latest version of Three as of this writing (r122; a later version is needed because the r111 one is written completely differently). Thus, I wouldn't be surprised if this was all being caused by some conflict between versions.
EDIT: I just put the resulting envmap as a standard map on some planes, and much to my surprise none of the blurred LODs even show up. Here's what mine look like:
And here's what it should resemble (don't mind the torus knot):
EDIT: I've found a workaround for now (essentially, not using PMREMGenerator), but will leave this up in case a solution is discovered.
I believe the problem is with the argument you're using in fromCubemap(). Right now your code isn't using the tex cube texture variable that's passed to the onLoad callback. Try this:
const hdrSky = ctl.load(hdrUrl, tex => {
  // use tex instead of hdrSky
  const hdrRenderTarget = pmremGen.fromCubemap(tex);
  this.skies[src] = hdrRenderTarget.texture;
});

How to disable three.js resizing images to a power of two?

three.js automatically resizes the texture image if it is not a power of two.
In my case I am using a custom canvas as a texture, which is not a power of two, and the resizing makes the texture not appear properly. Is there any way to disable this resizing of images in three.js?
three.js actually is trying to do you a favor.
Since it is open source, we can read the source code of WebGLRenderer.js and see that the setTexture method calls the (non-public) method uploadTexture.
The latter has this check:
if ( textureNeedsPowerOfTwo( texture ) && isPowerOfTwo( image ) === false ) {
  image = makePowerOfTwo( image );
}
Which is quite explanatory itself.
You may wonder now what textureNeedsPowerOfTwo actually checks. Let's see.
function textureNeedsPowerOfTwo( texture ) {
  if ( texture.wrapS !== THREE.ClampToEdgeWrapping || texture.wrapT !== THREE.ClampToEdgeWrapping ) return true;
  if ( texture.minFilter !== THREE.NearestFilter && texture.minFilter !== THREE.LinearFilter ) return true;
  return false;
}
If you use a wrapping mode for the texture coordinates other than clamp, or a filter that is neither nearest nor linear, the texture gets scaled.
If you are surprised by this code, I strongly suggest you take a look at the MDN page on using textures in WebGL.
Quoting
The catch: these textures [Non Power Of Two textures] cannot be used with mipmapping and they must not "repeat" (tile or wrap).
[...]
Without performing the above configuration, WebGL requires all samples of NPOT [Non Power Of Two] textures to fail by returning solid black: rgba(0,0,0,1).
So using an NPOT texture with incorrect texture parameters would give you the good old solid black.
Since three.js is open source, you can edit your local copy and remove the "offending" check.
However, a better, simpler, and more maintainable approach is to simply scale the UV mapping; after all, it is there just for this use case.
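Alternatively, you can keep three.js from resizing at all by choosing texture parameters that pass the check shown above. A minimal sketch (myCanvas is a hypothetical non-power-of-two canvas element):
var canvasTexture = new THREE.Texture(myCanvas); // myCanvas is a hypothetical NPOT canvas
canvasTexture.wrapS = THREE.ClampToEdgeWrapping; // no repeat wrapping
canvasTexture.wrapT = THREE.ClampToEdgeWrapping;
canvasTexture.minFilter = THREE.LinearFilter;    // no mipmapping
canvasTexture.magFilter = THREE.LinearFilter;
canvasTexture.needsUpdate = true;                // upload the canvas contents
With these settings textureNeedsPowerOfTwo() returns false and the image is uploaded at its original size.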

Threejs - Applying simple texture on a shader material

Using Threejs (67) with a WebGL renderer, I can't seem to get a plane with a shader material to display its texture. No matter what I do, the material just stays black.
My code at the moment looks quite basic:
var grassT = new Three.Texture(grass); // grass is an already loaded image.
grassT.wrapS = grassT.wrapT = Three.ClampToEdgeWrapping;
grassT.flipY = false;
grassT.minFilter = Three.NearestFilter;
grassT.magFilter = Three.NearestFilter;
grassT.needsUpdate = true;
var terrainUniforms = {
  grassTexture: { type: "t", value: grassT },
}
Then I just have this relevant part in the vertexShader:
vUv = uv;
And on the fragmentShader side:
gl_FragColor = texture2D(grassTexture, vUv);
This results in:
Black material.
No error in console.
gl_FragColor value is always (0.0, 0.0, 0.0, 1.0).
What I tried / checked:
Everything works fine if I just apply custom plain colors.
All is ok if I use vertexColors with plain colors too.
My texture width / height is indeed a power of 2.
The image is on the same server as the code.
Tested other images with the same result.
The image is actually loading in the browser debugger.
UVs for the mesh are correct.
Played around with wrapT, wrapS, minFilter, magFilter.
Adapted the mesh size so the texture has a 1:1 ratio.
Preloaded the image with the requirejs image plugin and created the texture with THREE.Texture() instead of using THREE.ImageUtils().
Played around with needsUpdate: true.
Tried to add defines['USE_MAP'] during material instantiation.
Tried to add material.dynamic = true.
I have a correct rendering loop (interaction with the terrain is working).
What I still wonder:
It's a multiplayer game using a custom port with express + socket.io. Am I hit by any WebGL security policy?
I have no lighting logic at the moment; is that a problem?
Maybe the shader material needs other "defines" at instantiation?
I guess I'm overlooking something simpler, which is why I'm asking...
Thanks.
I am applying various effects on the same shader. I have a custom API that merges the uniforms of the different effects simply by using Three.UniformsUtils.merge(). However, this function calls the clone() method on the texture, which resets needsUpdate to false before the texture reaches the renderer.
It appears that you should set the texture's needsUpdate property to true at the material level. If the uniform you set at the texture level gets merged, and therefore cloned, later in the process, it will lose its needsUpdate flag.
The issue is also detailled here: https://github.com/mrdoob/three.js/issues/3393
In my case the following wasn't working (grassT is my texture):
grassT.needsUpdate = true
while the following runs perfectly later on in the code:
material.uniforms.grassTexture.value.needsUpdate = true;
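As a hedged illustration of the scenario described above (effectAUniforms and the shader strings are hypothetical), the flag has to be set on the cloned value the material ends up holding:
var mergedUniforms = THREE.UniformsUtils.merge([effectAUniforms, terrainUniforms]); // merge() clones the texture uniform
var material = new THREE.ShaderMaterial({
  uniforms: mergedUniforms,
  vertexShader: terrainVertexShader,
  fragmentShader: terrainFragmentShader
});
// flag the clone the material holds, not the original grassT
material.uniforms.grassTexture.value.needsUpdate = true;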
Image loading is asynchronous. Most likely, you are rendering your scene before the texture image loads.
You must set the texture.needsUpdate flag to true after the image loads. three.js has a utility that will do that for you:
var texture = THREE.ImageUtils.loadTexture( "texture.jpg" );
Once rendered, the renderer sets the texture.needsUpdate flag back to false.
three.js r.68
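For reference, a minimal working setup along the lines the question describes might look like this (a sketch against the old r67/r68 API used above; the grass image path and material name are assumptions):
var grassT = THREE.ImageUtils.loadTexture('grass.png'); // sets needsUpdate once the image loads
var terrainMaterial = new THREE.ShaderMaterial({
  uniforms: {
    grassTexture: { type: 't', value: grassT }
  },
  vertexShader: [
    'varying vec2 vUv;',
    'void main() {',
    '  vUv = uv;',
    '  gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );',
    '}'
  ].join('\n'),
  fragmentShader: [
    'uniform sampler2D grassTexture;', // the sampler uniform must be declared in the shader
    'varying vec2 vUv;',
    'void main() {',
    '  gl_FragColor = texture2D( grassTexture, vUv );',
    '}'
  ].join('\n')
});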

ThreeJs and Blender (using colladaLoader): first contact

How can I render an exported scene (with many objects, each with different colors and different properties, such as rotation around an axis in the scene) from Blender (with colladaLoader --> .dae) in ThreeJs?
So, the first step is to learn how to create a scene in threeJs and learn some features of Blender. When you are ready, create your first model, and before exporting keep this in mind:
you need an object with vertices, so if you just create text in Blender, you have to convert it to a mesh, otherwise threeJs will not render it
be sure to choose the Blender Render engine and not Cycles, otherwise the .dae you export will not be rendered in threeJs
when applying a texture, use just colors and basic materials (basic, phong and lambert) - the others will not work with the colladaLoader
to see if the object will be rendered with color in threeJs with the colladaLoader, just look at the object in Blender in object mode (solid) - if it's gray and not the color you chose, it will be rendered the same way in threeJs
if you apply the 'solidify' modifier to the object and then set it to transparent in threeJs, it will be rendered as wireframe
if you append multiple objects in the scene and 'join' them, the respective positions and rotations will be respected in threeJs, otherwise not: for example, if you want to render a flower in a bottle (and those objects are different Blender files which are appended/linked in the scene), the flower will not fit in the bottle in threeJs, but will have a different position and rotation than the bottle
grouping the objects will not solve this: to see the scene as you see it in Blender you have to 'join' the objects (with the consequences that this entails) or manually change position and rotation in threeJs
the .dae export options don't matter for the rendering of the object in threeJs
and now, the part that concerns threeJs:
be sure to import the colladaLoader with:
<script src="jsLib/ColladaLoader.js"></script>
insert this code into your init() function so the loader will load your .dae model:
var loader = new THREE.ColladaLoader();
loader.options.convertUpAxis = true;
loader.load( 'model.dae', function ( collada ) {
  // with this you can get the objects of the scene; the [0] is not necessarily the first object
  // you see in Blender in case of many objects (which means you didn't join them)
  var obj1 = collada.scene.children[0];
  // you can name the object so you can use it even outside of this function,
  // for example if you want to animate it: obj1.name = "daeObj1";
  // you can set some material properties here, such as transparency
  obj1.material.needsUpdate = true;
  obj1.material.transparent = true;
  obj1.material.opacity = 0.5;
  obj1.material.wireframe = false;
  // and now some position and rotation for good visualization (rotation is in radians)
  obj1.position.set(0, -5, -0.6); // x, y, z
  obj1.rotation.set(0, 45, 0);
  // and add the obj to the threeJs scene
  scene.add(obj1);
});
and add some code to the animate() function if you want to update some of your objects, for example with a rotation:
scene.traverse(function (object) {
  if (object.name === 'daeObj1') {
    object.rotation.z -= 0.01;
  }
});
I hope someone will benefit from this post

How can I get rid of INVALID_VALUE warning when loading a three.js texture?

Using Three.Js r66 and a setup like so:
var texture = THREE.ImageUtils.loadTexture('/data/radar.png');
texture.wrapS = THREE.RepeatWrapping;
var radar = new THREE.Mesh(
  sphere,
  new THREE.MeshBasicMaterial({
    map: texture,
    transparent: true,
  }));
I'm getting the following warnings on the console:
WebGL: INVALID_VALUE: texImage2D: invalid image dev_three.js:26190
[.WebGLRenderingContext]RENDER WARNING: texture bound to texture unit 0 is not renderable. It maybe non-power-of-2 and have incompatible texture filtering or is not 'texture complete'
I'm pretty sure this is because the object is being rendered before the texture has been loaded, thus WebGL is trying to access a null texture in: three.js line 26190.
_gl.texImage2D( _gl.TEXTURE_2D, 0, glFormat, glFormat, glType, texture.image );
The code as quoted above works - once the texture has been loaded it displays just fine. I'd like to get rid of the warnings though - any ideas? Other materials (e.g. phong) seem to handle asynchronous texture loading better. They show up black until the texture arrives. Noticeably, they do not spam the console with warnings.
This demo (http://jeromeetienne.github.io/tunnelgl/) exhibits the same problem.
Wait for the texture to load
var safeToRender = false; // start false; flipped once the texture has loaded
var texture = THREE.ImageUtils.loadTexture('/data/radar.png',
    undefined,
    textureHasLoaded);

function textureHasLoaded() {
  safeToRender = true;
}
Then don't start rendering until safeToRender is true.
Of course if you're loading more than 1 image you'll need to use a count or something instead of a flag.
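A hedged sketch of that counting approach (the second texture path and the render function are assumptions):
var texturesPending = 2;
function textureHasLoaded() {
  texturesPending -= 1;
  if (texturesPending === 0) {
    // every texture has arrived, safe to start rendering
    requestAnimationFrame(render);
  }
}
var radarTexture = THREE.ImageUtils.loadTexture('/data/radar.png', undefined, textureHasLoaded);
var cloudTexture = THREE.ImageUtils.loadTexture('/data/clouds.png', undefined, textureHasLoaded);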
