How to get Image (or Texture) generated by a fragmentShader - three.js

I'm using the Three.js 3D library and I would like to use a fragment shader to generate a THREE.Texture or THREE.Image, because it's far faster in GLSL than in TypeScript.
So I have a square Plane with a ShaderMaterial.
My fragmentShader makes an image appear on the plane and I would like to get/extract this image as a Texture (or Image) so I can reuse it as a static Texture elsewhere.
Is there a way to do this?
const tileGeometry = new THREE.PlaneBufferGeometry( 500, 500 );
const tileUniforms = {};
const tileMaterial = new THREE.ShaderMaterial({
  uniforms: tileUniforms,
  vertexShader: this.shadersService.getCode('testVertShader'),
  fragmentShader: this.shadersService.getCode('testFragShader'),
  side: THREE.FrontSide,
  blending: THREE.NormalBlending,
  wireframe: false,
  transparent: true
});
const tileMesh = new THREE.Mesh( tileGeometry, tileMaterial );
this.scene.add( tileMesh );
I know that a possible solution is to use a WebGLRenderTarget, but perhaps there is a more straightforward solution now?

No, you have to render into a render target and then use its texture property with another material.
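For example, a minimal sketch (the scene and tile mesh are the ones from the question; the renderer and camera names and the render-target size are assumptions):
const renderTarget = new THREE.WebGLRenderTarget( 512, 512 ); // size is an assumption

// Render the scene containing the shader plane into the off-screen target once
this.renderer.setRenderTarget( renderTarget );
this.renderer.render( this.scene, this.camera );
this.renderer.setRenderTarget( null ); // restore rendering to the canvas

// renderTarget.texture is a regular THREE.Texture and can be reused elsewhere
const staticMaterial = new THREE.MeshBasicMaterial( { map: renderTarget.texture } );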

Related

Artefacts when rendering multiple intersecting transparent meshes with threejs

I am trying to render a stack of transparent planes using THREE.ShaderMaterial:
const renderMaterial = new ShaderMaterial({
  uniforms: {
    color: { value: color },
  },
  vertexShader: VERTEX_SHADER,
  fragmentShader: FRAGMENT_SHADER, // fragment shader sets alpha to 0.3
  transparent: true,
  side: THREE.FrontSide
});
This looks fine, but as I rotate the camera around the scene, at certain angles I get rendering artefacts.
Correct render:
Corrupted render when I rotate the camera. It gets more obvious as I rotate the camera:
Can anyone point me in the right direction? I tried experimenting with renderer.sortObjects = false and setting a custom renderOrder on each transparent mesh, without much success. I don't understand why it happens only at certain camera angles.
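For reference, roughly what I tried (a sketch; transparentPlanes is a placeholder name for the list of plane meshes):
renderer.sortObjects = false; // disable three.js's automatic render-order sorting

transparentPlanes.forEach( ( mesh, i ) => {
  mesh.renderOrder = i; // force an explicit draw order per plane
} );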
Thanks in advance

Alpha in Fragment Shader gl_FragColor not working

I'm using Three.js and a ShaderMaterial to draw Points Geometry with a Fragment Shader.
I want each point to have a blurred edge, but I can't get the alpha to work, it just turns white.
I can discard pixels when they're completely transparent to make a circle, but the blur fades to white and is then abruptly cut off.
Here's what I see on screen:
Three.js code
var shaderMaterial = new THREE.ShaderMaterial( {
  uniforms: uniforms,
  vertexShader: document.getElementById( 'vertexshader' ).textContent,
  fragmentShader: document.getElementById( 'fragmentshader' ).textContent,
  blending: THREE.NormalBlending,
  depthTest: true,
  transparent: true,
  clipping: true
});
var points = new THREE.Points(geometry, shaderMaterial);
Fragment Shader:
//translate gl_PointCoord to be centered at 0,0
vec2 cxy = 2.0 * gl_PointCoord - 1.0;
//calculate alpha based on distance from centre
//(I'm doubling it to get a less gradual fade)
float newAlpha = (1.0-distance(cxy, vec2(0.0,0.0))) * 2.0;
if (newAlpha < 0.01)
{
  //discard pixels that have ~0 alpha
  discard;
}
gl_FragColor = vec4( newR, newG, newB, newAlpha);
Thanks in advance for any help :) This has been puzzling me for AGES.
Edit: Images of depthTest on and off. It looks to me like depth test does put them in the right order?
depthTest false:
depthTest true:
Your JSFiddle example has several instances where it fights with itself. You're trying to set the material blending mode in Three.js, but then you override that with:
var gl = renderer.context;
gl.enable(gl.BLEND);
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
You're requesting an animation frame, but then call setTimeout inside of it, which removes the benefit of using requestAnimationFrame.
I was able to get something slightly improved with
blending: THREE.NormalBlending,
transparent: true,
depthWrite: false,
depthTest: true
You can see the fixes live here: https://jsfiddle.net/yxj8zvmp/
... although there are too many attributes and uniforms being passed to your geometry to really get to the bottom of why the depth isn't working as expected. Why are you passing u_camera, when you already have cameraPosition, and your position attribute is of length 1? It feels like a blend of raw WebGL fighting with Three.js.
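Put together, the improved material looks roughly like this (a sketch using the settings above; the uniforms and shader sources are the ones from the question):
var shaderMaterial = new THREE.ShaderMaterial( {
  uniforms: uniforms,
  vertexShader: document.getElementById( 'vertexshader' ).textContent,
  fragmentShader: document.getElementById( 'fragmentshader' ).textContent,
  blending: THREE.NormalBlending,
  transparent: true,
  depthWrite: false, // don't write depth for the transparent points...
  depthTest: true    // ...but still test against the rest of the scene
} );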

Color a tetrahedron in three.js

In three.js, I'm trying to draw a tetrahedron using THREE.TetrahedronGeometry where each face is a different color. When I use MeshNormalMaterial, each vertex has a different color, but the faces are color gradients between the vertices. This works for a BoxGeometry, but not for TetrahedronGeometry.
I tried using MeshPhongMaterial with shading: THREE.FlatShading, but that just gives me black or white faces.
I tried writing my own ShaderMaterial and, in the fragment shader, coloring using the normal vector, but that also gives the gradient effect.
I'm sure I'm missing something obvious, but can't see it...
For versions of three.js prior to r125*, this is how you do it:
var geo = new THREE.TetrahedronGeometry(sphereRadius, 0);
for ( var i = 0; i < geo.faces.length; i ++ ) {
  geo.faces[ i ].color.setHex( Math.random() * 0xffffff );
}
var material = new THREE.MeshBasicMaterial({
  side: THREE.DoubleSide,
  shading: THREE.FlatShading,
  vertexColors: THREE.VertexColors
})
var mesh = new THREE.Mesh( geo, material );
So you need THREE.FlatShading, THREE.VertexColors, and then you need to assign the face colors.
For later versions, see how to render a tetrahedron with different texture on each face (using three.js)?.
* THREE.Geometry will be removed from core with r125
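For r125 and later, where TetrahedronGeometry is a non-indexed BufferGeometry (each face has its own three vertices), a minimal sketch of the same idea using a per-vertex color attribute (sphereRadius is kept from the snippet above):
var geo = new THREE.TetrahedronGeometry( sphereRadius, 0 );
var colors = [];
var faceColor = new THREE.Color();
for ( var i = 0; i < geo.attributes.position.count; i += 3 ) {
  faceColor.setHex( Math.random() * 0xffffff );
  // same color on all three vertices of a face -> flat face color
  for ( var j = 0; j < 3; j ++ ) colors.push( faceColor.r, faceColor.g, faceColor.b );
}
geo.setAttribute( 'color', new THREE.Float32BufferAttribute( colors, 3 ) );
var material = new THREE.MeshBasicMaterial({
  side: THREE.DoubleSide,
  vertexColors: true
});
var mesh = new THREE.Mesh( geo, material );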

ThreeJS - Extend Lambert Shader with Custom VertexShader

I want to adapt this Shader here:
https://aerotwist.com/tutorials/an-introduction-to-shaders-part-2/
to a standard Lambert or Phong so that it works with all the lights in my scene.
My current state is that I extend the Lambert material with this code:
var attributes = {
  displacement: {
    type: 'f', // a float
    value: []  // an empty array
  }
};
var uniforms = {
  amplitude: {
    type: 'f', // a float
    value: 0
  }
};
var shaders = { mylambert: THREE.ShaderLib[ 'lambert' ] };
var materials = {};
materials.mylambert = function( parameters, myUniforms ) {
  var material = new THREE.ShaderMaterial( {
    vertexShader: $('#vertexLambert').text(),
    fragmentShader: shaders.mylambert.fragmentShader,
    uniforms: THREE.UniformsUtils.merge( [ shaders.mylambert.uniforms, myUniforms ] ),
    attributes: attributes,
    lights: true,
    shading: THREE.FlatShading
  } );
  material.setValues( parameters );
  return material;
};
var myProperties = {
  lights: true,
  fog: true,
  transparent: true
};
var myMaterial = new materials.mylambert( myProperties, uniforms );
Which I got from this Post:
extending lambert material, opacity not working
The vertexShader is basically the same as shaders.mylambert.vertexShader, but with the additional code from the shader example on top.
It works to some extent: the vertices move, but the faces aren't re-shaded when they change shape, so the shading always looks the same when I use a plane, for example, as the mesh.
In short:
I need a Lambert/Phong shader that moves the vertices up and down over time to simulate a low-poly water surface.
If this is still relevant, you can solve this much more simply:
Have your model render with a Lambert, Phong, Standard or whatever lit material you like.
Create another Scene, a Camera and a WebGLRenderTarget, create a plane and apply your ShaderMaterial to it. Position the camera so that the plane exactly fills its frame.
Render that other scene to the WebGLRenderTarget and apply its texture as a map to your original Lambert material this way:
let mat = new THREE.MeshLambertMaterial({
  map: renderTarget.texture
});
Voilà! You now have a fully lit ShaderMaterial effect.
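A rough sketch of the off-screen part (the names, sizes and the myShaderMaterial variable are assumptions):
// Off-screen scene whose only job is to run the ShaderMaterial
const rtScene = new THREE.Scene();
const rtCamera = new THREE.OrthographicCamera( -1, 1, 1, -1, 0.1, 10 );
rtCamera.position.z = 1; // look down -z at the plane sitting at z = 0
const renderTarget = new THREE.WebGLRenderTarget( 1024, 1024 );

// A 2x2 plane exactly fills the [-1, 1] orthographic frustum
rtScene.add( new THREE.Mesh( new THREE.PlaneGeometry( 2, 2 ), myShaderMaterial ) );

// Each frame (or just once, if the shader output is static):
renderer.setRenderTarget( renderTarget );
renderer.render( rtScene, rtCamera );
renderer.setRenderTarget( null );
// renderTarget.texture now feeds the Lambert material above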

three js envMap using one texture

I have this code:
var urls = [ 'img/effects/cloud.png','img/effects/cloud.png','img/effects/cloud.png','img/effects/cloud.png','img/effects/cloud.png','img/effects/cloud.png' ];
var textureCube = THREE.ImageUtils.loadTextureCube( urls, new THREE.CubeRefractionMapping );
var cubeMaterial3 = new THREE.MeshBasicMaterial( { color: 0xffffff, envMap: textureCube, refractionRatio: 0.98, reflectivity:0.9 } );
mesh = new THREE.Mesh( wormholeGeom, cubeMaterial3 );
scene.add(mesh);
This works and successfully adds a sphere with a refraction map.
But I am not using a skybox; I am using a skysphere, where the whole sky is represented by one texture.
Is there a way to do refraction mapping from a single texture
rather than from an array of six textures?
I tried many things (THREE.ImageUtils.loadTexture, THREE.SphericalRefractionMapping too) but had no luck.
The documentation is "TODO".
This is my goal, but with one texture on a skydome. That example uses six square textures to make the sky.
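One possible direction (a sketch, assuming a reasonably recent three.js where equirectangular refraction mapping is available; the texture path, wormholeGeom and scene are taken from the snippet above):
// Load the single sky texture and mark it as an equirectangular refraction map
var loader = new THREE.TextureLoader();
var skyTexture = loader.load( 'img/effects/cloud.png' );
skyTexture.mapping = THREE.EquirectangularRefractionMapping;

var material = new THREE.MeshBasicMaterial( {
  color: 0xffffff,
  envMap: skyTexture,
  refractionRatio: 0.98,
  reflectivity: 0.9
} );
mesh = new THREE.Mesh( wormholeGeom, material );
scene.add( mesh );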
