Artefacts when rendering multiple intersecting transparent meshes with three.js

I am trying to render a stack of transparent planes using THREE.ShaderMaterial:
const renderMaterial = new ShaderMaterial({
    uniforms: {
        color: { value: color },
    },
    vertexShader: VERTEX_SHADER,
    fragmentShader: FRAGMENT_SHADER, // fragment shader sets alpha to 0.3
    transparent: true,
    side: THREE.FrontSide
});
This looks fine, but as I rotate the camera around the scene, at certain angles I get rendering artefacts.
Correct render:
Corrupted render when I rotate the camera (it gets more obvious the further I rotate):
Can anyone point me in the right direction? I tried experimenting with renderer.sortObjects = false and setting a custom renderOrder on each transparent mesh, without much success. I don't understand why it happens only at certain camera angles.
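Roughly what I experimented with (planes here is just a placeholder name for my array of transparent meshes):
renderer.sortObjects = false;
// give each plane in the stack an explicit draw order (back to front)
planes.forEach((mesh, i) => {
    mesh.renderOrder = i;
});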
Thanks in advance

Related

How to get Image (or Texture) generated by a fragmentShader

I'm using the Three.js 3D library and I would like to use a fragment shader to generate a THREE.Texture or THREE.Image, because it's far faster in GLSL than in TypeScript.
So I have a square plane with a ShaderMaterial.
My fragment shader makes an image appear on the plane, and I would like to get/extract this image as a Texture (or Image) so I can reuse it as a static texture elsewhere.
Is there a way to do this?
const tileGeometry = new THREE.PlaneBufferGeometry( 500, 500 );
const tileUniforms = {};
const tileMaterial = new THREE.ShaderMaterial({
    uniforms: tileUniforms,
    vertexShader: this.shadersService.getCode('testVertShader'),
    fragmentShader: this.shadersService.getCode('testFragShader'),
    side: THREE.FrontSide,
    blending: THREE.NormalBlending,
    wireframe: false,
    transparent: true
});
const tileMesh = new THREE.Mesh( tileGeometry, tileMaterial );
this.scene.add( tileMesh );
I know that a possible solution was to use a WebGLRenderTarget, but perhaps there is a more straightforward solution now?
No, you have to render into a render target and then use its texture property with another material.
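A minimal sketch of that approach with a recent three.js (rtScene, rtCamera and the target size are placeholders; tileMaterial is the ShaderMaterial from your question):
const renderTarget = new THREE.WebGLRenderTarget( 512, 512 );

// a throwaway scene/camera pair whose only job is to run the shader over a full-screen quad
const rtScene = new THREE.Scene();
const rtCamera = new THREE.OrthographicCamera( -1, 1, 1, -1, 0.1, 10 );
rtCamera.position.z = 1;
rtScene.add( new THREE.Mesh( new THREE.PlaneBufferGeometry( 2, 2 ), tileMaterial ) );

// render once into the target instead of the screen
renderer.setRenderTarget( renderTarget );
renderer.render( rtScene, rtCamera );
renderer.setRenderTarget( null );

// reuse the result as a static texture anywhere else
const staticMaterial = new THREE.MeshBasicMaterial( { map: renderTarget.texture } );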

Alpha in Fragment Shader gl_FragColor not working

I'm using Three.js and a ShaderMaterial to draw Points Geometry with a Fragment Shader.
I want each point to have a blurred edge, but I can't get the alpha to work; it just turns white.
I can discard pixels when they're completely transparent to make a circle, but the blur fades to white and is then abruptly cut off.
Here's what I see on screen:
Three.JS code
var shaderMaterial = new THREE.ShaderMaterial( {
    uniforms: uniforms,
    vertexShader: document.getElementById( 'vertexshader' ).textContent,
    fragmentShader: document.getElementById( 'fragmentshader' ).textContent,
    blending: THREE.NormalBlending,
    depthTest: true,
    transparent: true,
    clipping: true
});
var points = new THREE.Points(geometry, shaderMaterial);
Fragment Shader:
// translate gl_PointCoord to be centered at 0,0
vec2 cxy = 2.0 * gl_PointCoord - 1.0;
// calculate alpha based on distance from centre
// (I'm doubling it to get a less gradual fade)
float newAlpha = (1.0 - distance(cxy, vec2(0.0, 0.0))) * 2.0;
if (newAlpha < 0.01) {
    // discard pixels that have ~0 alpha
    discard;
}
// newR, newG, newB are computed earlier in the shader (not shown)
gl_FragColor = vec4(newR, newG, newB, newAlpha);
Thanks in advance for any help :) This has been puzzling me for AGES.
Edit: Images of depthTest on and off. It looks to me like depth test does put them in the right order?
depthTest false:
depthTest true:
Your JSFiddle example has several instances where it fights with itself. You're trying to set the material blending mode in Three.js, but then you override that with:
var gl = renderer.context;
gl.enable(gl.BLEND);
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
You're requesting an animationFrame, but then calling setTimeout inside of it, which removes most of the benefit of using animationFrame.
I was able to get something slightly improved with
blending: THREE.NormalBlending,
transparent: true,
depthWrite: false,
depthTest: true
You can see the fixes live here: https://jsfiddle.net/yxj8zvmp/
... although there are too many attributes and uniforms being passed to your geometry to really get to the bottom of why the depth isn't working as expected. Why are you passing u_camera when you already have cameraPosition, and why is your position attribute of length 1? It feels like raw WebGL fighting with Three.js.
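Put together, the material from the fiddle with those settings looks roughly like this (uniforms and shader sources unchanged):
var shaderMaterial = new THREE.ShaderMaterial({
    uniforms: uniforms,
    vertexShader: document.getElementById( 'vertexshader' ).textContent,
    fragmentShader: document.getElementById( 'fragmentshader' ).textContent,
    blending: THREE.NormalBlending,
    transparent: true,
    depthWrite: false, // points no longer write depth, so overlapping soft edges blend instead of clipping each other
    depthTest: true    // but they are still occluded by opaque geometry
});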

ThreeJS: White PNG image loaded as texture, used as material and rendered as plane has grey edges

I'm having an issue when rendering a white material in ThreeJS version 87.
Here are the steps to replicate:
A white PNG image is loaded as a texture.
The texture is used to create a MeshBasicMaterial (passed as the map parameter).
The MeshBasicMaterial is used along with a plane geometry to create a Mesh.
The Mesh is added to an empty Scene and rendered by a WebGLRenderer with alpha: true and clearColor set to white.
The problem is that the rendered texture now has grey edges on parts that should be fully white.
This happens with any image with white edges. I've also tried many different configurations for the renderer and the material but to no avail.
I've made a very simple CodePen that replicates the behavior as simply as possible. Does anyone know how this problem can be solved?
CodePen:
https://codepen.io/ivan-i1/pen/pZxwZX
var renderer, width, height, scene, camera, dataUrl, threeTexture, geometry, material, mesh;
width = window.innerWidth;
height = window.innerHeight;
dataUrl = '//data url from image';
threeTexture = new THREE.ImageUtils.loadTexture(dataUrl);
material = new THREE.MeshBasicMaterial({
    map: threeTexture,
    transparent: true,
    alphaTest: 0.1
});
material.needsUpdate = true;
geometry = new THREE.PlaneGeometry(5, 5);
mesh = new THREE.Mesh(geometry, material);
mesh.position.z = -5;
scene = new THREE.Scene();
scene.add(mesh);
camera = new THREE.PerspectiveCamera( 70, window.innerWidth / window.innerHeight, 1, 1000 );
renderer = new THREE.WebGLRenderer({
    alpha: true
});
document.body.appendChild( renderer.domElement );
renderer.setSize(width, height);
renderer.setClearColor( 0xffffff, 1 );
//renderer.render(scene, camera);
function render() {
    // Finally, draw to the screen
    requestAnimationFrame(render);
    renderer.render(scene, camera);
}
render();
Any help is truly appreciated.
ThreeJS/87
Edit:
I think my post was lacking precision.
This is the original full-alpha image:
(It might not show up here because it's all white.)
And this is the same image with different transparencies in its 4 quadrants:
(This one too might not show up because it's all white.)
I got a helpful answer telling me to make the alphaTest higher, but the problem is that doing so wipes out the transparent parts of the images, and I need to preserve those parts.
Here is a copy of the codepen with the updated images, showing the same (but slight) grey edges:
codepen
Sorry for not being as precise the first time; any further help is even more appreciated.
Set alphaTest to 0.9 or higher and observe the improvement.
Your star texture has gray or black in the area outside the star, which is why you're seeing a gray halo. You can fix it by filling the image with white (but not changing the alpha channel) in your image-editing tool.
Also, you should upgrade to the latest three.js (r95).
edit:
I'm not sure what your exact expectation is, but there are many different settings that control alpha blending in three.js. There is renderer.premultipliedAlpha (defaults to true) and material.transparent; material.alphaTest is a threshold that controls the alpha level below which a fragment is ignored completely. There are also material.blending, .blendEquation, .blendEquationAlpha, .blendSrc and .blendDst, etc. You probably need to read up on those.
https://threejs.org/docs/#api/materials/Material
For instance, here is your texture with:
renderer.premultipliedAlpha = false;
notice the black border on one quadrant of your texture.
https://codepen.io/manthrax/pen/KBraNB
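As a rough sketch of where those knobs live (the values here are purely illustrative, not a recommendation):
// premultipliedAlpha is a constructor option of WebGLRenderer
var renderer = new THREE.WebGLRenderer({ alpha: true, premultipliedAlpha: false });

// per-material blending controls
var material = new THREE.MeshBasicMaterial({ map: threeTexture });
material.transparent = true;
material.alphaTest = 0.1;                  // discard fragments below this alpha
material.blending = THREE.CustomBlending;  // enables the blendSrc/blendDst/blendEquation settings below
material.blendEquation = THREE.AddEquation;
material.blendSrc = THREE.SrcAlphaFactor;
material.blendDst = THREE.OneMinusSrcAlphaFactor;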

Three.js use framebuffer as texture

I'm using an image in a canvas element as a texture in Three.js, performing image manipulations on the canvas using JavaScript, and then calling needsUpdate() on the texture. This works, but it's quite slow.
I'd like to perform the image calculations in a fragment shader instead. I've found many examples which almost do this:
Shader materials: http://mrdoob.github.io/three.js/examples/webgl_shader2.html This example shows image manipulations performed in a fragment shader, but that shader is functioning as the fragment shader of an entire material. I only want to use the shader on a texture, and then use the texture as a component of a second material.
Render to texture: https://threejsdoc.appspot.com/doc/three.js/examples/webgl_rtt.html This shows rendering the entire scene to a WebGLRenderTarget and using that as the texture in a material. I only want to pre-process an image, not render an entire scene.
Effects composer: http://www.airtightinteractive.com/demos/js/shaders/preview/ This shows applying shaders as a post-process to the entire scene.
Edit: Here's another one:
Render to another scene: http://relicweb.com/webgl/rt.html This example, referenced in Three.js Retrieve data from WebGLRenderTarget (water sim), uses a second scene with its own orthographic camera to render a dynamic texture to a WebGLRenderTarget, which is then used as a texture in the primary scene. I guess this is a special case of the first "render to texture" example listed above, and would probably work for me, but seems over-complicated.
As I understand it, ideally I'd be able to make a new framebuffer object with its own fragment shader, render it on its own, and use its output as a texture uniform for another material's fragment shader. Is this possible?
Edit 2: It looks like I might be asking something similar to this: Shader Materials and GL Framebuffers in THREE.js ...though the question doesn't appear to have been resolved.
Render to texture and Render to another scene as listed above are the same thing, and are the technique you want. To explain:
In vanilla WebGL the way you do this kind of thing is by creating a framebuffer object (FBO) from scratch, binding a texture to it, and rendering it with the shader of your choice. Concepts like "scene" and "camera" aren't involved, and it's kind of a complicated process. Here's an example:
http://learningwebgl.com/blog/?p=1786
But this also happens to be essentially what Three.js does when you use it to render a scene with a camera: the renderer outputs to a framebuffer, which in its basic usage goes straight to the screen. So if you instruct it to render to a new WebGLRenderTarget instead, you can use whatever the camera sees as the input texture of a second material. All the complicated stuff is still happening, but behind the scenes, which is the beauty of Three.js. :)
So: To replicate a WebGL setup of an FBO containing a single rendered texture, as mentioned in the comments, just make a new scene containing an orthographic camera and a single plane with a material using the desired texture, then render to a new WebGLRenderTarget using your custom shader:
// new render-to-texture scene
myScene = new THREE.Scene();
// you may need to modify these parameters
var renderTargetParams = {
    minFilter: THREE.LinearFilter,
    stencilBuffer: false,
    depthBuffer: false
};
myImage = THREE.ImageUtils.loadTexture( 'path/to/texture.png',
    new THREE.UVMapping(), function() { myCallbackFunction(); } );
imageWidth = myImage.image.width;
imageHeight = myImage.image.height;
// create buffer sized to the source image
myTexture = new THREE.WebGLRenderTarget( imageWidth, imageHeight, renderTargetParams );
// custom RTT material: runs your image-processing shader over the source texture
myUniforms = {
    colorMap: { type: "t", value: myImage },
};
myTextureMat = new THREE.ShaderMaterial({
    uniforms: myUniforms,
    vertexShader: document.getElementById( 'my_custom_vs' ).textContent,
    fragmentShader: document.getElementById( 'my_custom_fs' ).textContent
});
// Setup render-to-texture scene
myCamera = new THREE.OrthographicCamera( imageWidth / - 2, imageWidth / 2,
    imageHeight / 2, imageHeight / - 2, -10000, 10000 );
var myTextureGeo = new THREE.PlaneGeometry( imageWidth, imageHeight );
myTextureMesh = new THREE.Mesh( myTextureGeo, myTextureMat );
myTextureMesh.position.z = -100;
myScene.add( myTextureMesh );
// older three.js render() signature: draws myScene into the render target (true = force clear)
renderer.render( myScene, myCamera, myTexture, true );
Once you've rendered the new scene, myTexture will be available for use as a texture in another material in your main scene. Note that you may want to trigger the first render with the callback function in the loadTexture() call, so that it won't try to render until the source image has loaded.
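For example, a minimal sketch of reusing it in the main scene (in current three.js the rendered image is exposed as the render target's .texture property):
// use the offscreen result as an ordinary texture in the main scene
var finalMaterial = new THREE.MeshBasicMaterial( { map: myTexture.texture } );
var finalMesh = new THREE.Mesh( new THREE.PlaneGeometry( imageWidth, imageHeight ), finalMaterial );
scene.add( finalMesh );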

3d object that looks the same from every direction

I have a 3D maze with walls and a floor.
I have an image of a key (or some other object, it doesn't matter; all of them are images, not 3D models).
I want to display it on the floor, and when the camera moves around, the object needs to look the same without me rotating it manually. How can I achieve this?
Update1:
I created a plane geometry, added the image (it's a transparent PNG) and rotate it at render time. It works well, but if I turn the camera, the plane sometimes loses its transparency for a few milliseconds and shows a solid black background (blinking).
Any idea why?
here is the code:
var texture = new THREE.ImageUtils.loadTexture('assets/images/sign.png');
var material = new THREE.MeshBasicMaterial( {map: texture, transparent: true} );
plane = new THREE.Mesh(new THREE.PlaneGeometry(115, 115,1,1), material );
plane.position.set(500, 0, 1500);
scene.add(plane);
// at render:
plane.rotation.copy( camera.rotation );
This can be achieved with:
function animate() {
    not3dObject.rotation.z = camera.rotation.z;
    not3dObject.rotation.x = camera.rotation.x;
    not3dObject.rotation.y = camera.rotation.y;
    ...
    render();
}
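Copying the camera's quaternion instead does the same thing in one step and avoids rotation-order surprises (a sketch using the plane mesh from the question):
function animate() {
    // make the plane face the camera every frame (billboarding)
    plane.quaternion.copy( camera.quaternion );
    renderer.render( scene, camera );
    requestAnimationFrame( animate );
}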
