I'm trying to build the phong shader from THREE.ShaderLib.
This is what I got:
var phongShader = THREE.ShaderLib.phong;
var uniforms = THREE.UniformsUtils.clone(phongShader.uniforms);
material = new THREE.ShaderMaterial({
    uniforms: uniforms,
    vertexShader: phongShader.vertexShader,
    fragmentShader: phongShader.fragmentShader
});
It doesn't seem to work. What am I doing wrong?
Fiddle: http://jsfiddle.net/Jvf9k/2/
Similar SO question: Three js - Cloning a shader and changing uniform values
Edit: Updated the fiddle with the help of Tapio's answer. It now works!
Your JSFiddle is using THREE.CanvasRenderer, which doesn't (and can't) support shader materials (it does support the built-in materials). Change it to THREE.WebGLRenderer. Also, it doesn't make sense to use a Phong material without lights, as the result will be all black. Phong with wireframe doesn't sound very useful either.
Then enable lights and fog on the material:
var shaderMaterial = new THREE.ShaderMaterial({
    uniforms: uniforms,
    vertexShader: phongShader.vertexShader,
    fragmentShader: phongShader.fragmentShader,
    lights: true,
    fog: true
});
Related
I'm using the Three.js 3D library, and I would like to use a fragment shader to generate a THREE.Texture or THREE.Image, because it's far faster in GLSL than in TypeScript.
So I have a square Plane with a ShaderMaterial.
My fragment shader makes an image appear on the plane, and I would like to get/extract this image as a Texture (or Image) so I can reuse it as a static Texture elsewhere.
Is there a way to do this?
const tileGeometry = new THREE.PlaneBufferGeometry( 500, 500 );
const tileUniforms = {};
const tileMaterial = new THREE.ShaderMaterial({
    uniforms: tileUniforms,
    vertexShader: this.shadersService.getCode('testVertShader'),
    fragmentShader: this.shadersService.getCode('testFragShader'),
    side: THREE.FrontSide,
    blending: THREE.NormalBlending,
    wireframe: false,
    transparent: true
});
const tileMesh = new THREE.Mesh( tileGeometry, tileMaterial );
this.scene.add( tileMesh );
I know that a possible solution is to use a WebGLRenderTarget, but perhaps there is a more straightforward solution now?
No, you have to render into a render target and then use its texture property with another material.
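A minimal sketch of that approach (the names `renderer`, `rttScene`, and `rttCamera`, and the 512×512 size, are illustrative assumptions, not from the question):

// Render the shader output once into an offscreen target
const renderTarget = new THREE.WebGLRenderTarget( 512, 512 );

renderer.setRenderTarget( renderTarget );
renderer.render( rttScene, rttCamera ); // scene containing the ShaderMaterial plane
renderer.setRenderTarget( null );       // restore rendering to the canvas

// renderTarget.texture can now be reused as a static texture elsewhere
const staticMaterial = new THREE.MeshBasicMaterial( { map: renderTarget.texture } );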
I am using the latest three.js build from GitHub; it was updated the night before.
This code worked a few days back, but without any code changes it stopped working yesterday. It gives the error message `mapTexelToLinear: no matching overloaded function found` on line 6 of the map_fragment shader chunk:
vec4 texelColor = texture2D( map, vUv );
texelColor = mapTexelToLinear( texelColor ); //here
Did something change? Is this still the correct way of creating a standard material from a shadermaterial? With the defines, extensions and map uniform?
https://jsfiddle.net/EthanHermsey/c4sea1rg/119/
let texture = new THREE.TextureLoader().load(
    document.getElementById( 'blockDiff' ).src
);
// this works fine
/* let mat = new THREE.MeshStandardMaterial( {
    map: texture
} ) */
// this does not
let mat = new THREE.ShaderMaterial( {
    // custom shaders
    // vertexShader: document.getElementById( 'blockVertexShader' ).textContent,
    // fragmentShader: document.getElementById( 'blockFragmentShader' ).textContent,
    // The standard shaders do not even work :/
    vertexShader: THREE.ShaderLib[ 'standard' ].vertexShader,
    fragmentShader: THREE.ShaderLib[ 'standard' ].fragmentShader,
    uniforms: THREE.UniformsUtils.merge( [
        THREE.ShaderLib[ 'standard' ].uniforms,
        {
            blockScale: { value: new THREE.Vector3() } // used in custom shaders
        }
    ] ),
    defines: {
        "STANDARD": '',
        "USE_UV": '',
        "USE_MAP": ''
    },
    lights: true
} );

mat.uniforms.map.value = texture;
mat.extensions.derivatives = true;
mat.uniformsNeedUpdate = true;
There was actually a bug in previous three.js versions that injected mapTexelToLinear() with a wrong implementation into the shader code. This problem was fixed with r118. However, it requires that your custom shader material has a property called map.
Updated code: https://jsfiddle.net/og8Lmp6e/
In this way, it's also no longer necessary to set custom defines like USE_MAP or USE_UV; that happens automatically. And of course the implementation of mapTexelToLinear() now respects the encoding of your texture.
BTW: It's actually best to modify built-in materials with Material.onBeforeCompile().
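For reference, a minimal onBeforeCompile sketch; the uniform name `blockScale` and the GLSL patch are illustrative, the general pattern (prepend a declaration, then splice code in around a known `#include` chunk) is what matters:

const mat = new THREE.MeshStandardMaterial( { map: texture } );

mat.onBeforeCompile = function ( shader ) {
    // register a custom uniform (name is illustrative)
    shader.uniforms.blockScale = { value: new THREE.Vector3( 1, 1, 1 ) };

    // declare it, then patch the built-in vertex shader at a known chunk
    shader.vertexShader = 'uniform vec3 blockScale;\n' + shader.vertexShader;
    shader.vertexShader = shader.vertexShader.replace(
        '#include <begin_vertex>',
        '#include <begin_vertex>\n\ttransformed *= blockScale;'
    );
};

This keeps all of the built-in material's lighting, maps, and defines intact, and you only maintain the lines you actually changed.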
I'm using Three.js and a ShaderMaterial to draw Points Geometry with a Fragment Shader.
I want each point to have a blurred edge, but I can't get the alpha to work, it just turns white.
I can discard pixels when they're completely transparent to make a circle, but the blur fades to white and is then abruptly cut off.
Here's what I see on screen:
Three.JS code
var shaderMaterial = new THREE.ShaderMaterial( {
    uniforms: uniforms,
    vertexShader: document.getElementById( 'vertexshader' ).textContent,
    fragmentShader: document.getElementById( 'fragmentshader' ).textContent,
    blending: THREE.NormalBlending,
    depthTest: true,
    transparent: true,
    clipping: true
});
var points = new THREE.Points(geometry, shaderMaterial);
Fragment Shader:
// translate gl_PointCoord to be centered at 0,0
vec2 cxy = 2.0 * gl_PointCoord - 1.0;

// calculate alpha based on distance from the centre
// (doubled to get a less gradual fade)
float newAlpha = ( 1.0 - distance( cxy, vec2( 0.0, 0.0 ) ) ) * 2.0;

if ( newAlpha < 0.01 ) {
    // discard pixels that have ~0 alpha
    discard;
}

gl_FragColor = vec4( newR, newG, newB, newAlpha );
Thanks in advance for any help :) This has been puzzling me for AGES.
Edit: Images of depthTest on and off. It looks to me like depth test does put them in the right order?
depthTest false:
depthTest true:
Your JSFiddle example has several instances where it fights with itself. You're trying to set the material blending mode in Three.js, but then you override that with:
var gl = renderer.context;
gl.enable(gl.BLEND);
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
You're requesting an animationFrame, but then do a setTimeout inside of that, removing all benefits of using animationFrame.
I was able to get something slightly improved with
blending: THREE.NormalBlending,
transparent: true,
depthWrite: false,
depthTest: true
You can see the fixes live here: https://jsfiddle.net/yxj8zvmp/
... although there are too many attributes and uniforms being passed to your geometry to really get to the bottom of why the depth isn't working as expected. Why are you passing u_camera when you already have cameraPosition, and why is your position attribute of length 1? It feels like raw WebGL fighting with Three.js.
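As a side note on the white fade itself: one common technique for a soft point-sprite edge is to drive only the alpha with smoothstep and leave the RGB untouched, letting normal blending do the fading. A sketch, reusing the color variables from the question's shader:

vec2 cxy = 2.0 * gl_PointCoord - 1.0;
float r = length( cxy );

// 1.0 inside the core, fading smoothly to 0.0 at the sprite's edge
// (the 0.6 inner radius is an illustrative choice)
float newAlpha = 1.0 - smoothstep( 0.6, 1.0, r );
if ( newAlpha < 0.01 ) discard;

gl_FragColor = vec4( newR, newG, newB, newAlpha );

Unlike the original `(1.0 - distance) * 2.0`, this never produces alpha values above 1.0, which is one way the fade can wash out toward white.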
I did the obvious, but it doesn't work:
myshader.fragmentShader = myfragmentshader; //string
As mentioned in the documentation:
Built-in attributes and uniforms are passed to the shaders along with your code. If you don't want the WebGLProgram to add anything to your shader code, you can use RawShaderMaterial instead of this class. You can use the directive #pragma unroll_loop in order to unroll a for loop in GLSL by the shader preprocessor. The directive has to be placed right above the loop. The loop formatting has to correspond to a defined standard.
The procedure for three.js can be found in the RawShaderMaterial documentation.
This is a good approach because it works at the material level: you can set up a custom shader per material rather than for every object in the WebGL app.
var material = new THREE.RawShaderMaterial( {
    uniforms: {
        time: { value: 1.0 }
    },
    vertexShader: document.getElementById( 'vertexShader' ).textContent,
    fragmentShader: document.getElementById( 'fragmentShader' ).textContent
} );
At runtime, just assign the material in the classic three.js style:
object3d.material = material;
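Keep in mind that with RawShaderMaterial nothing is prepended to your GLSL, so the shaders must declare their own precision, attributes, and the built-in uniforms they use. A minimal WebGL 1 style pair might look like this (a sketch; only `time` comes from the material above, the rest are standard three.js names you now have to declare yourself):

// vertex shader
precision highp float;

uniform mat4 modelViewMatrix;  // normally injected by three.js
uniform mat4 projectionMatrix; // declared manually for RawShaderMaterial
attribute vec3 position;

void main() {
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}

// fragment shader
precision highp float;

uniform float time;

void main() {
    gl_FragColor = vec4( abs( sin( time ) ), 0.0, 0.0, 1.0 );
}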
I'm currently customizing the base Phong material using THREE.ShaderMaterial, rebuilding the material with most of three.js's base chunks plus a customized fragment shader chunk. The problem I'm having is with the #defines in many parts of three.js and finding the proper way to set them.
In the actual program it goes like this
// Clone the uniforms
var uniforms = THREE.UniformsUtils.clone(shader['uniforms']);
// Set uniform values
uniforms["map"].value = texture;
uniforms["diffuse"].value = new THREE.Color( 0xff0000 );
uniforms["envMap"].value = envMapt;
uniforms["reflectivity"].value = 0.7;
// Create material using shader
var material = new THREE.ShaderMaterial( {
    vertexShader: shader['vertexShader'],
    fragmentShader: shader['fragmentShader'],
    uniforms: uniforms,
    lights: true,
    //map: true,   // These don't seem to do anything
    //envMap: true // These don't seem to do anything
} );
With a custom shader built like this
fragmentShader: [
    "#define USE_MAP",
    //"#define USE_ENVMAP",
    "uniform vec3 diffuse;",
    "uniform float opacity;",
    .......
    "void main() {",
        THREE.ShaderChunk[ "alphatest_fragment" ],
        THREE.ShaderChunk[ "specularmap_fragment" ],
        ......
        // NDJ - Using custom frag shader
        //THREE.ShaderChunk[ "lights_phong_fragment" ],
        CustomShaderChunk[ "lights_phong_fragment" ],
        ......
        THREE.ShaderChunk[ "fog_fragment" ],
    "}"
].join("\n")
By manually adding the required #defines at the start of the shaders I can get it to do what I want, but that doesn't seem like the proper way to set them, and it isn't very flexible.
I want something similar to this, but I only need the base definitions. I've looked through the API and examples for how to set these, but can't get it to work.
Just do it exactly as you described. Here is a sample:
ph = new THREE.MeshPhongMaterial( {
    ambient: 0x000000,
    color: 0x0020ff,
    specular: 0x2040ff,
    shininess: 30,
    map: theMap,
    side: THREE.DoubleSide
} );

ph.defines = { waldo_waldo_three: '(dx+3)', wonke: 7 };
If you're being prudent, be careful not to overwrite any existing `defines` object.
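To see what this does: each key in `defines` is prepended to both shaders as a `#define` line before compilation, so with the sample above the generated shader source effectively begins with:

#define waldo_waldo_three (dx+3)
#define wonke 7

That's why flags like USE_MAP work when set this way: the built-in shader chunks are full of `#ifdef USE_MAP` style guards that these lines switch on.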