I am trying to use a custom attribute on a shaderMaterial, but I can't get it to work.
My simplified code is
var attributes = {
    aColor: { type: "f", value: [] },
};

for ( var i = 0; i < points.length; i ++ ) {
    attributes.aColor.value.push( 0.9 );
}
var uniforms = THREE.UniformsLib['lights'];
sMaterial = new THREE.ShaderMaterial({
    attributes: attributes,
    uniforms: uniforms,
    vertexShader: vShader,
    fragmentShader: fShader,
    lights: true,
});

var line2 = new THREE.Line( geometry, sMaterial );
scene.add( line2 );
In my shader I set a debug statement
attribute float aColor;

void main() {
    if ( aColor == 0.0 ) {
        // debug code
    }
    // ...
}
and the debugcode is always executed.
Inspecting the WebGLProgram, I can see aColor among the ACTIVE_ATTRIBUTES, and it looks OK.
What is going wrong here ?
Or, even better, how can I debug a problem like this ?
Just by experimenting, I found the problem.
I was reusing a geometry that I had already used for another mesh, and somehow that was causing the problem.
Anyway, I am still interested in learning techniques to deal with this kind of problem.
Following on from vals' answer, I've just solved a similar problem, not with shared geometries but with trying to assign a new shader material with attributes to an existing object. If three.js detects that the geometry and its owning object are already initialised (via the geometry.__webglInit flag), it won't re-upload the geometry's buffers, including attributes, to GPU memory, even if the material has changed. The initialisation also only runs when the renderer detects that objects have been added to the scene.
The solution I used:
// Create the new shader
var shader = new THREE.ShaderMaterial({
    attributes: { intensity: { type: 'f', value: [] } },
    vertexShader: vShaderText,
    fragmentShader: fShaderText
});

// Populate the intensity attribute
// ...

// Remove the existing scene object
var existingObject = myObjects[0]; // ... retrieve from scene or variable
var geometry = existingObject.geometry;
scene.remove(existingObject);

// Recreate the scene object (important!: note geometry.clone() below)
var pc = new THREE.PointCloud(geometry.clone(), shader);
scene.add(pc);

// Clean up memory
geometry.dispose();
There is probably a more memory-efficient way to do this by reusing the existing vertex buffers that are already in GPU memory, but this works well enough for our application.
I've dug around looking for a solution to this, but couldn't find the answer on here. I have a glTF/GLB model loading into three.js like so:
createScene: function() {
    this.renderer = new Three.WebGLRenderer({
        antialias: true,
        alpha: true
    });
    let container = document.getElementById('container');
    this.renderer.setSize(container.clientWidth, container.clientHeight);
    this.renderer.setPixelRatio(window.devicePixelRatio);
    this.renderer.setClearColor(new Three.Color('#fff'));
    this.renderer.setClearAlpha( 0 );
    //this.renderer.shadowMap.type = Three.PCFSoftShadowMap;
    container.appendChild(this.renderer.domElement);
},
createCamera: function() {
    this.camera = new Three.PerspectiveCamera(6, container.clientWidth/container.clientHeight, 1, 1000);
    this.camera.position.set(0, 0, 40);
},
createShape: function() {
    let me = this;
    const loader = new GLTFLoader();
    loader.load( 'model/fbf3.glb', function ( gltf ) {
        me.bottle = gltf.scene;
        me.scene.add( gltf.scene );
        gltf.scene.traverse(function(obj) { obj.frustumCulled = false; });
        me.pivot = new Three.Group();
        me.pivot.position.set( 0.0, 0.0, 0 );
        me.bottle.add( me.pivot );
        console.log(gltf.scene);
        console.log(gltf.scene.children[1]);
        //me.pivot.add( me.bottle );
        me.animate();
    }, undefined, function ( error ) {
        console.error( error );
    });
},
I found a post that said to loop through the scene and set frustumCulled to false on every object, so that's added in (and when I log the child, that object's frustumCulled is indeed false). To have a label on my object that is easy to map and can take a different material/glossiness, I've created another object in the group that sits slightly in front of my other object on the y-axis. When it is facing the camera, it works well; it is when the object rotates that it disappears. Working:
Rotate enough, and gone:
Is there a setting in three.js that I need to add to make sure the render order is correct? Or is something wrong with my object setup in Blender? Ideally I wouldn't have to UV-wrap the whole bottle as one object and add the label to the bottle texture, because I want the label to have less specularity (is that a word?). Any help would be appreciated.
Solved: the tutorial is slightly old (2+ years), and Blender (the latest version, 2.9+) now has the Principled BSDF shader built in. Instead of using the PBR shaders appended from glTF-Blender-Exporter-Master as in the listed tutorial, I used the Principled BSDF shader: I linked my material to its Base Color input and connected that to the Material Output (all in the node editor, now called the Shader Editor). No issues anymore.
I am using the latest three.js build from GitHub; it was updated the night before.
This code worked a few days back, but without any changes it stopped working yesterday. It gives the error message `mapTexelToLinear: no matching overloaded function found` on line 6 in the map_fragment shader chunk:
vec4 texelColor = texture2D( map, vUv );
texelColor = mapTexelToLinear( texelColor ); //here
Did something change? Is this still the correct way of creating a standard material from a ShaderMaterial, with the defines, extensions and map uniform?
https://jsfiddle.net/EthanHermsey/c4sea1rg/119/
let texture = new THREE.TextureLoader().load(
    document.getElementById( 'blockDiff' ).src
);

// this works fine
/* let mat = new THREE.MeshStandardMaterial( {
    map: texture
} ) */

// this does not
let mat = new THREE.ShaderMaterial( {
    // custom shaders
    // vertexShader: document.getElementById( 'blockVertexShader' ).textContent,
    // fragmentShader: document.getElementById( 'blockFragmentShader' ).textContent,

    // The standard shaders do not even work :/
    vertexShader: THREE.ShaderLib[ 'standard' ].vertexShader,
    fragmentShader: THREE.ShaderLib[ 'standard' ].fragmentShader,
    uniforms: THREE.UniformsUtils.merge( [
        THREE.ShaderLib[ 'standard' ].uniforms,
        {
            blockScale: { value: new THREE.Vector3() } // used in custom shaders
        }
    ] ),
    defines: {
        "STANDARD": '',
        "USE_UV": '',
        "USE_MAP": ''
    },
    lights: true
} );

mat.uniforms.map.value = texture;
mat.extensions.derivatives = true;
mat.uniformsNeedUpdate = true;
There was actually an error in previous three.js versions that injected mapTexelToLinear() with a wrong implementation into the shader code. This problem was fixed with r118. However, it requires that your custom shader material has a property called map.
Updated code: https://jsfiddle.net/og8Lmp6e/
This way, it's also no longer necessary to set custom defines like USE_MAP or USE_UV; that happens automatically. And of course the implementation of mapTexelToLinear() now respects the encoding of your texture.
BTW: It's actually best to modify built-in materials with Material.onBeforeCompile().
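A minimal sketch of that onBeforeCompile approach. The shader object below is a hand-made stand-in for what three.js actually passes to the callback (so the snippet runs on its own); `begin_vertex` is a real three.js shader chunk name, and `blockScale` is the custom uniform from the question:

```javascript
// Patch the built-in shader source before it is compiled. In real code
// this function is assigned to material.onBeforeCompile and three.js
// passes the shader object in; here we stub it for illustration.
function patchShader(shader) {
  shader.uniforms.blockScale = { value: [1, 1, 1] }; // a THREE.Vector3 in real code
  shader.vertexShader =
    'uniform vec3 blockScale;\n' +
    shader.vertexShader.replace(
      '#include <begin_vertex>',
      '#include <begin_vertex>\n  transformed *= blockScale;'
    );
}

// Stand-in shader object, for illustration only.
const shader = {
  uniforms: {},
  vertexShader: 'void main() {\n  #include <begin_vertex>\n}',
};

patchShader(shader);
console.log(shader.vertexShader.includes('transformed *= blockScale;')); // → true

// Wired up in real code, roughly:
// const mat = new THREE.MeshStandardMaterial({ map: texture });
// mat.onBeforeCompile = patchShader;
```

The advantage over rebuilding the material from ShaderLib is that the lights, map handling and defines all stay managed by three.js; you only inject the extra uniform and the extra vertex transform.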
I have a "logic" shader that updates the program state writing it in a texture, and a "rendering" shader that reads the state from the texture and renders the scene to the screen.
While trying to implement this, I ran into the problem that when I read a RenderTarget's texture from a shader with texture2D(), I always get a black pixel, even if the RenderTarget has been written to by a previous shader.
To try to understand the problem, I wrote the following code, where an EffectComposer fills a RenderTarget with red, and then another EffectComposer reads the RenderTarget and writes to screen:
var renderer;
var composer1, pass1, renderTarget1;
var composer2, pass2;

init();
animate();

function init() {
    renderer = new THREE.WebGLRenderer();
    renderer.setSize(window.innerWidth, window.innerHeight);

    const vertexShader = `
        void main() {
            gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
        }
    `;

    renderTarget1 = new THREE.WebGLRenderTarget(64, 64);

    pass1 = new THREE.ShaderPass({
        uniforms: {},
        vertexShader,
        fragmentShader: `
            void main() {
                gl_FragColor = vec4(1, 0, 0, 1);
            }
        `
    });

    composer1 = new THREE.EffectComposer(renderer, renderTarget1);
    composer1.addPass(pass1);

    pass2 = new THREE.ShaderPass({
        uniforms: {
            tRenderTarget1: { type: 't', value: renderTarget1.texture },
        },
        vertexShader,
        fragmentShader: `
            uniform sampler2D tRenderTarget1;
            void main() {
                gl_FragColor = texture2D(tRenderTarget1, vec2(0.5));
            }
        `
    });
    pass2.renderToScreen = true;

    composer2 = new THREE.EffectComposer(renderer);
    composer2.addPass(pass2);

    document.body.appendChild(renderer.domElement);
}

function animate() {
    requestAnimationFrame(animate);
    render();
}

function render() {
    composer1.render();
    composer2.render();
}
I expect the screen to be red, but it is black. The code can be tested here: http://jsfiddle.net/matfer/f6qwr9yb/
Just to clarify, I did not put the two ShaderPasses in the same EffectComposer because they are not post-processing passes applied to the same image; as I said, they are "logic" and "rendering" computations that have to be written to different buffers (with different sizes, filtering, etc.).
What is wrong? Am I doing this the wrong way?
It's not a good approach to use a render target in that way if you pass it to an EffectComposer. The composer internally clones the target and uses both targets to manage the read and write buffers for post-processing. Since these buffers are swapped at certain points, you never know what your render target actually contains.
I've refactored your fiddle a bit so it has a similar style to post-processing effects with multiple passes.
http://jsfiddle.net/f6qwr9yb/7/
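To see why the reference goes stale, here is a toy pure-JavaScript model of the read/write buffer swap described above; the names are illustrative, not the actual EffectComposer internals:

```javascript
// Toy model: the composer clones the render target you pass in and then
// ping-pongs between the two buffers, so the object you kept a reference
// to may be either the read or the write buffer at any given moment.
function makeComposer(target) {
  return {
    writeBuffer: target,
    readBuffer: { name: 'internal clone of ' + target.name },
    swapBuffers() {
      const tmp = this.readBuffer;
      this.readBuffer = this.writeBuffer;
      this.writeBuffer = tmp;
    },
  };
}

const myTarget = { name: 'renderTarget1' };
const composer = makeComposer(myTarget);

// After certain passes the composer swaps buffers...
composer.swapBuffers();

// ...and the target you passed in is no longer the write buffer:
console.log(composer.writeBuffer === myTarget); // → false
console.log(composer.readBuffer === myTarget);  // → true
```

This is why a render target you want to sample from later should be rendered to directly (or read via the composer's own output), rather than handed to a composer as its working buffer.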
Your program is fine. You just need to define your uniforms after the constructor:
pass2.uniforms.tRenderTarget1.value = renderTarget1.texture;
Also, you can define your uniforms like this, instead.
uniforms: {
tRenderTarget1: { value: null },
},
Specifying type is no longer required.
Updated fiddle: http://jsfiddle.net/f6qwr9yb/9/
three.js r.98
UPDATE: The issue was that the texData object was recreated each time, and thus the reference held by the DataTexture was lost. The solution from WestLangley was to overwrite the data in texData instead of recreating the texData object.
I have a simple threejs scene with a DataTexture in a ShaderMaterial. The data array passed to it once during initialization is updated on mouse events. However the DataTexture does not seem to update.
Did I assign the uniforms or texture data wrongly? Or am I using the needsUpdate flags wrongly? It does work when deleting and recreating the texture, material, mesh and scene objects each time, but this shouldn't really be necessary, judging from the many examples I have seen but could not reproduce.
Note that the data itself is updated nicely, just not the DataTexture.
// mouse event triggers a request to the server
// the server then replies and this code here is called
// NOTE: this code **is** indeed called on every mouse update!

// this is the updated data from the msg received
// NOTE: texData **does** contain the correct updated data on each event
texData = new Float32Array(evt.data.slice(0, msgByteLength));

// init should happen only once
if (!drawContextInitialized) {
    // init data texture
    dataTexture = new THREE.DataTexture(texData, texWidth, texHeight, THREE.LuminanceFormat, THREE.FloatType);
    dataTexture.needsUpdate = true;

    // shader material
    material = new THREE.ShaderMaterial({
        vertexShader: document.querySelector('#vertexShader').textContent.trim(),
        fragmentShader: document.querySelector('#fragmentShader').textContent.trim(),
        uniforms: {
            dataTexture: { value: dataTexture }
        }
    });

    // mesh with quad geometry and material
    geometry = new THREE.PlaneGeometry(width, height, 1, 1);
    mesh = new THREE.Mesh(geometry, material);

    // scene
    scene = new THREE.Scene();
    scene.add(mesh);

    // camera + renderer setup
    // [...]

    drawContextInitialized = true;
}

// these lines seem to have no effect
dataTexture.needsUpdate = true;
material.needsUpdate = true;
mesh.needsUpdate = true;
scene.needsUpdate = true;

renderer.render(scene, camera);
When updating the DataTexture data, do not instantiate a new array. Instead, update the array elements like so:
texData.set( javascript_array );
Also, the only flag you need to set when you update the texture data is:
dataTexture.needsUpdate = true;
three.js r.83
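The pattern boils down to keeping one Float32Array alive for the texture's whole lifetime. A small sketch; the three.js calls are shown in comments so the array behaviour stands on its own:

```javascript
// The array handed to THREE.DataTexture must stay the same object; set()
// overwrites its contents in place without replacing the reference.
const texData = new Float32Array(4); // created once, at init time
// const dataTexture = new THREE.DataTexture(texData, 2, 2,
//     THREE.LuminanceFormat, THREE.FloatType);

const heldByTexture = texData; // stands in for the reference the texture keeps

// On every incoming message, copy the new values into the existing array...
texData.set([0.1, 0.2, 0.3, 0.4]);
// ...and flag the texture for re-upload before the next render:
// dataTexture.needsUpdate = true;

// The reference "the texture holds" sees the new data:
console.log(heldByTexture === texData);              // → true
console.log(Math.abs(heldByTexture[2] - 0.3) < 1e-6); // → true
```

Writing `texData = new Float32Array(...)` instead, as in the question, rebinds the variable but leaves the texture pointing at the old, stale array, which is exactly why `needsUpdate = true` appeared to do nothing.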
I had a real hard time for some reason seeing any change from modifications. In desperation I just made a new DataTexture. It's important to set needsUpdate to true:
imgData.set( updatedData );
var newDataTex = new THREE.DataTexture( imgData, ... );
newDataTex.needsUpdate = true;
renderMesh.material.uniforms.texture.value = newDataTex;
I want to adapt this Shader here:
https://aerotwist.com/tutorials/an-introduction-to-shaders-part-2/
to a standard Lambert or Phong so that it works with all the lights in my scene.
My current state is that I extend the Lambert with this code:
var attributes = {
    displacement: {
        type: 'f', // a float
        value: [] // an empty array
    }
};

var uniforms = {
    amplitude: {
        type: 'f', // a float
        value: 0
    }
};

var shaders = { mylambert: THREE.ShaderLib[ 'lambert' ] };

var materials = {};
materials.mylambert = function( parameters, myUniforms ) {
    var material = new THREE.ShaderMaterial( {
        vertexShader: $('#vertexLambert').text(),
        fragmentShader: shaders.mylambert.fragmentShader,
        uniforms: THREE.UniformsUtils.merge( [ shaders.mylambert.uniforms, myUniforms ] ),
        attributes: attributes,
        lights: true,
        shading: THREE.FlatShading
    } );
    material.setValues( parameters );
    return material;
};

var myProperties = {
    lights: true,
    fog: true,
    transparent: true
};

var myMaterial = new materials.mylambert( myProperties, uniforms );
Which I got from this Post:
extending lambert material, opacity not working
The vertexShader is basically the same as shaders.mylambert.vertexShader, but with the additional code from the shader example on top.
It works somehow, in that the vertices move, but the faces aren't re-shaded when they change shape, so when I use a plane as the mesh, for example, it always has the same shading.
In short:
I need a Lambert/Phong shader that moves the vertices up and down over time to simulate a low-poly water surface.
If this is still relevant, you can solve this issue much more simply:
Have your model render with a Lambert, Phong, Standard or whatever lit material you like.
Create another Scene, Camera and a WebGLRenderTarget, create a plane and apply your ShaderMaterial to it. Position your camera so that the plane exactly fills the frame.
Render the other scene to the WebGLRenderTarget and apply it as a map to your original Lambert material this way:
let mat = new THREE.MeshLambertMaterial({
map: renderTarget.texture
})
Voilà! You now have a fully lit ShaderMaterial, just as you wanted.