I did the obvious, but it doesn't work:
myshader.fragmentShader = myfragmentshader; //string
As mentioned in the documentation:
Built in attributes and uniforms are passed to the shaders along with
your code. If you don't want the WebGLProgram to add anything to your
shader code, you can use RawShaderMaterial instead of this class. You
can use the directive #pragma unroll_loop in order to unroll a for
loop in GLSL by the shader preprocessor. The directive has to be
placed right above the loop. The loop formatting has to correspond to
a defined standard.
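For reference, the unroll directive expects the loop to be written in a fixed shape, roughly like this (a sketch; newer releases spell it #pragma unroll_loop_start / #pragma unroll_loop_end, and the loop body with samplers, vUv and color is illustrative):

#pragma unroll_loop
for ( int i = 0; i < 4; i ++ ) {

    color += texture2D( samplers[ i ], vUv );

}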
The procedure for three.js can be found in the RawShaderMaterial documentation.
This is a good approach because it works at the material level: you can set it up per material for each custom shader, rather than for all objects in the WebGL app.
var material = new THREE.RawShaderMaterial( {
    uniforms: {
        time: { value: 1.0 }
    },
    vertexShader: document.getElementById( 'vertexShader' ).textContent,
    fragmentShader: document.getElementById( 'fragmentShader' ).textContent
} );
At runtime, just assign the material in classic three.js style:
object3d.material = material;
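For completeness, a minimal sketch of what the raw shaders themselves have to declare, since nothing is injected for you (the shader code here is illustrative):

var rawMaterial = new THREE.RawShaderMaterial( {

    uniforms: {
        time: { value: 1.0 }
    },

    // precision, built-in uniforms and attributes must all be declared by hand
    vertexShader: [
        'precision mediump float;',
        'uniform mat4 projectionMatrix;',
        'uniform mat4 modelViewMatrix;',
        'attribute vec3 position;',
        'void main() {',
        '    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );',
        '}'
    ].join( '\n' ),

    fragmentShader: [
        'precision mediump float;',
        'uniform float time;',
        'void main() {',
        '    gl_FragColor = vec4( abs( sin( time ) ), 0.0, 0.0, 1.0 );',
        '}'
    ].join( '\n' )

} );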
I'm using the Three.js 3D library and I would like to use a fragment shader to generate a THREE.Texture or THREE.Image, because it's far faster in GLSL than in TypeScript.
So I have a square Plane with a ShaderMaterial.
My fragmentShader makes an image appear on the plane and I would like to get/extract this image as a Texture (or Image) so I can reuse it as a static Texture elsewhere.
Is there a way to do this?
const tileGeometry = new THREE.PlaneBufferGeometry( 500, 500 );
const tileUniforms = {};
const tileMaterial = new THREE.ShaderMaterial({
    uniforms: tileUniforms,
    vertexShader: this.shadersService.getCode('testVertShader'),
    fragmentShader: this.shadersService.getCode('testFragShader'),
    side: THREE.FrontSide,
    blending: THREE.NormalBlending,
    wireframe: false,
    transparent: true
});

const tileMesh = new THREE.Mesh( tileGeometry, tileMaterial );
this.scene.add( tileMesh );
I know that a possible solution was to use a WebGLRenderTarget, but perhaps there is a more straightforward solution now?
No, you have to render into a render target and then use its texture property with another material.
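A minimal sketch of the render-target approach (rtScene and rtCamera are assumed to be a scene and camera set up to frame only the tile mesh):

// render the shader material into an offscreen target once
const renderTarget = new THREE.WebGLRenderTarget( 512, 512 );

renderer.setRenderTarget( renderTarget );
renderer.render( rtScene, rtCamera );
renderer.setRenderTarget( null ); // back to rendering on the canvas

// reuse the result as a static texture elsewhere
const staticMaterial = new THREE.MeshBasicMaterial( { map: renderTarget.texture } );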
I am using the latest three.js build from GitHub; it was updated the night before.
This code worked a few days ago, but without changing the code it stopped working yesterday. It gives the error message mapTexelToLinear: no matching overloaded function found, on line 6 in the map_fragment shader chunk:
vec4 texelColor = texture2D( map, vUv );
texelColor = mapTexelToLinear( texelColor ); //here
Did something change? Is this still the correct way of creating a standard material from a ShaderMaterial, with the defines, extensions and map uniform?
https://jsfiddle.net/EthanHermsey/c4sea1rg/119/
let texture = new THREE.TextureLoader().load(
    document.getElementById( 'blockDiff' ).src
);

// this works fine
/* let mat = new THREE.MeshStandardMaterial( {
    map: texture
} ) */

// this does not
let mat = new THREE.ShaderMaterial( {

    // custom shaders
    // vertexShader: document.getElementById( 'blockVertexShader' ).textContent,
    // fragmentShader: document.getElementById( 'blockFragmentShader' ).textContent,

    // the standard shaders do not even work :/
    vertexShader: THREE.ShaderLib[ 'standard' ].vertexShader,
    fragmentShader: THREE.ShaderLib[ 'standard' ].fragmentShader,

    uniforms: THREE.UniformsUtils.merge( [
        THREE.ShaderLib[ 'standard' ].uniforms,
        {
            blockScale: { value: new THREE.Vector3() } // used in custom shaders
        }
    ] ),

    defines: {
        "STANDARD": '',
        "USE_UV": '',
        "USE_MAP": ''
    },

    lights: true

} );

mat.uniforms.map.value = texture;
mat.extensions.derivatives = true;
mat.uniformsNeedUpdate = true;
There was actually an error in previous three.js versions that injected mapTexelToLinear() with a wrong implementation into the shader code. This problem was fixed with r118. However, it requires that your custom shader material has a property called map.
Updated code: https://jsfiddle.net/og8Lmp6e/
In this way, it's also no longer necessary to set custom defines like USE_MAP or USE_UV; that happens automatically. And of course the implementation of mapTexelToLinear() now respects the encoding of your texture.
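In practice that means something like this, using the variables from the question's code:

mat.map = texture;                 // seen by the renderer when it generates the program
mat.uniforms.map.value = texture;  // still feeds the sampler in the shader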
BTW: It's actually best to modify built-in materials with Material.onBeforeCompile().
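A minimal sketch of that approach (the injected uniform and the shader-chunk edit are illustrative, not taken from the question):

const stdMat = new THREE.MeshStandardMaterial( { map: texture } );

stdMat.onBeforeCompile = function ( shader ) {

    // add a custom uniform
    shader.uniforms.blockScale = { value: new THREE.Vector3( 1, 1, 1 ) };

    // declare it and patch a known shader chunk
    shader.vertexShader = 'uniform vec3 blockScale;\n' + shader.vertexShader;
    shader.vertexShader = shader.vertexShader.replace(
        '#include <begin_vertex>',
        '#include <begin_vertex>\n\ttransformed *= blockScale;'
    );

};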
I'm using Three.js and a ShaderMaterial to draw Points Geometry with a Fragment Shader.
I want each point to have a blurred edge, but I can't get the alpha to work, it just turns white.
I can discard pixels when they're completely transparent to make a circle, but the blur fades to white and is then abruptly cut off.
Here's what I see on screen (screenshot not reproduced here).
Three.js code:
var shaderMaterial = new THREE.ShaderMaterial( {
    uniforms: uniforms,
    vertexShader: document.getElementById( 'vertexshader' ).textContent,
    fragmentShader: document.getElementById( 'fragmentshader' ).textContent,
    blending: THREE.NormalBlending,
    depthTest: true,
    transparent: true,
    clipping: true
} );

var points = new THREE.Points( geometry, shaderMaterial );
Fragment Shader:
// translate gl_PointCoord to be centered at 0,0
vec2 cxy = 2.0 * gl_PointCoord - 1.0;

// calculate alpha based on distance from the centre
// (I'm doubling it to get a less gradual fade)
float newAlpha = ( 1.0 - distance( cxy, vec2( 0.0, 0.0 ) ) ) * 2.0;

// discard pixels that have ~0 alpha
if ( newAlpha < 0.01 ) {
    discard;
}

gl_FragColor = vec4( newR, newG, newB, newAlpha );
Thanks in advance for any help :) This has been puzzling me for AGES.
Edit: I compared screenshots with depthTest false and depthTest true (images not reproduced here). It looks to me like the depth test does put them in the right order?
Your JSFiddle example has several instances where it fights with itself. You're trying to set the material blending mode in Three.js, but then you override that with:
var gl = renderer.context;
gl.enable(gl.BLEND);
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
You're requesting an animation frame, but then you call setTimeout inside of it, removing all the benefit of using requestAnimationFrame.
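A cleaner pattern is to let requestAnimationFrame drive the loop directly (a sketch; the u_time uniform name is illustrative):

function animate( time ) {
    requestAnimationFrame( animate );
    uniforms.u_time.value = time * 0.001; // milliseconds to seconds
    renderer.render( scene, camera );
}
requestAnimationFrame( animate );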
I was able to get something slightly improved with
blending: THREE.NormalBlending,
transparent: true,
depthWrite: false,
depthTest: true
You can see the fixes live here: https://jsfiddle.net/yxj8zvmp/
... although there are too many attributes and uniforms being passed to your geometry to really get to the bottom of why the depth isn't working as expected. Why are you passing u_camera, when you already have cameraPosition, and your position attribute is of length 1? It feels like a blend of raw WebGL fighting with Three.js.
I am trying to use a custom attribute on a ShaderMaterial, but I can't get it to work.
My simplified code is:
attributes = {
    aColor: { type: "f", value: [] }
};

for ( i = 0; i < points.length; i ++ ) {
    attributes.aColor.value.push( 0.9 );
}
var uniforms = THREE.UniformsLib[ 'lights' ];

sMaterial = new THREE.ShaderMaterial( {
    attributes: attributes,
    uniforms: uniforms,
    vertexShader: vShader,
    fragmentShader: fShader,
    lights: true
} );

var line2 = new THREE.Line( geometry, sMaterial );
scene.add( line2 );
In my shader I added a debug statement:

attribute float aColor;

void main() {
    if ( aColor == 0.0 ) {
        // debug code
    }
}

and the debug code is always executed.
Inspecting the WebGLProgram, I can see aColor among the ACTIVE_ATTRIBUTES, and it looks OK.
What is going wrong here?
Or, even better, how can I debug a problem like this?
Just by experimenting, I found the problem.
I was reusing a geometry that I had already used for another mesh, and somehow that was causing the problem.
Anyway, I am still interested in learning techniques to deal with this kind of problem.
Following on from vals' answer, I've just solved a similar problem, not with shared geometries, but with trying to assign a new shader material with attributes to an existing object. If three.js detects that the geometry and its owning object are already initialised (via the geometry.__webglInit flag), then even if the material has changed, it won't re-upload the geometry's buffers, including attributes, to GPU memory. The initialisation also won't run unless the renderer detects that objects have been added to the scene.
The solution I used:
// Create the new shader
var shader = new THREE.ShaderMaterial({
attributes: {intensity: {type: 'f', value: []}},
vertexShader: vShaderText,
fragmentShader: fShaderText
});
// Populate the intensity attribute
// ...
// Remove the existing scene object
var existingObject = myObjects[0]; // ... retrieve from scene or variable
var geometry = existingObject.geometry;
scene.remove(existingObject);
// Recreate the new scene object (important!: note geometry.clone() below)
var pc = new THREE.PointCloud(geometry.clone(), shader);
scene.add(pc);
// Clean up memory
geometry.dispose();
There is probably a more memory-efficient way to do this by reusing the existing vertex buffers that are already in GPU memory, but this works well enough for our application.
I'm trying to build the phong shader from THREE.ShaderLib.
This is what I got:
var phongShader = THREE.ShaderLib.phong;
var uniforms = THREE.UniformsUtils.clone( phongShader.uniforms );

material = new THREE.ShaderMaterial( {
    uniforms: uniforms,
    vertexShader: phongShader.vertexShader,
    fragmentShader: phongShader.fragmentShader
} );
It doesn't seem to work. What am I doing wrong?
Fiddle: http://jsfiddle.net/Jvf9k/2/
Similar SO question: Three js - Cloning a shader and changing uniform values
Edit: Updated the fiddle with the help of Tapio's answer. It now works!
Your JSFiddle is using THREE.CanvasRenderer, which doesn't (and can't) support shader materials (it can support the built-in materials). Change that to THREE.WebGLRenderer. Also, it doesn't make sense to use a phong material without lights, as the result will be all black. Phong with wireframe doesn't sound very useful either.
Add lights and fog:

var shaderMaterial = new THREE.ShaderMaterial( {
    uniforms: uniforms,
    vertexShader: phongShader.vertexShader,
    fragmentShader: phongShader.fragmentShader,
    lights: true,
    fog: true
} );