three.js bloom is a blur?

I am currently experimenting a bit with three.js and I am trying to add a bloom effect. However, when I add the bloom, it comes out more as a blur than an actual bloom:
the code:
composer = new THREE.EffectComposer(renderer, renderTarget);
effectBloom = new THREE.BloomPass(1, 25, 5);
composer.addPass(renderModel);
composer.addPass(effectBloom);
composer.addPass(copyPass);
and its being rendered with:
composer.render( delta )
I would like to get closer to this:

I had a similar issue: bloom was just blurring the rendered image. To fix it, I had to set the renderer's autoClear to false:
renderer.autoClear = false;
And in my render loop I had to do the clearing manually just before using composer to render the scene:
renderer.clear();
composer.render();
Check my pen to see this in action:
http://codepen.io/jaamo/pen/BoKXrL
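For illustration, a minimal sketch of the loop described above (scene, camera, and composer are assumed to be set up as in the question):
renderer.autoClear = false;

function animate() {
    requestAnimationFrame(animate);
    renderer.clear();      // clear manually, since autoClear is off
    composer.render();     // the bloom now adds on top of the scene instead of replacing it
}
animate();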

A bloom is a blurred image superimposed over your main rendering. You probably need more dynamic range to create the effect seen in the attached image.
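A rough sketch of that dynamic-range point (the material and values here are just illustrative, not from the original post): keep the scene mostly dark and give only the glowing object a bright, emissive material, so the additive blur reads as a glow around it rather than an overall soft focus.
renderer.setClearColor(0x000000);               // dark backdrop: only bright areas will bloom
var glowMaterial = new THREE.MeshPhongMaterial({
    color: 0x111111,
    emissive: 0x66ccff                          // the bright part that the blur picks up
});
// With the composer from the question, a moderate strength is usually enough:
effectBloom = new THREE.BloomPass(1.5, 25, 4);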

Related

How to disable three.js resizing images to a power of two?

three.js automatically resizes the texture image if it is not a power of two.
In my case I am using a custom canvas as a texture, which is not a power of two, and the resizing makes the texture not appear properly. Is there any way to disable the resizing of images in three.js?
three.js is actually trying to do you a favor.
Since it is open source, we can read the source code of WebGLRenderer.js and see that the setTexture method calls the (non-publicly visible) method uploadTexture.
The latter has this check:
if ( textureNeedsPowerOfTwo( texture ) && isPowerOfTwo( image ) === false ){
image = makePowerOfTwo( image );
}
That check is quite self-explanatory.
You may wonder now what textureNeedsPowerOfTwo actually checks. Let's see.
function textureNeedsPowerOfTwo( texture ) {
if ( texture.wrapS !== THREE.ClampToEdgeWrapping || texture.wrapT !== THREE.ClampToEdgeWrapping ) return true;
if ( texture.minFilter !== THREE.NearestFilter && texture.minFilter !== THREE.LinearFilter ) return true;
return false;
}
If you use a wrapping mode for the texture coordinates other than clamp-to-edge, or a minification filter that is neither nearest nor linear, the texture gets scaled.
If you are surprised by this code, I strongly suggest you take a look at the MDN page on using textures in WebGL.
Quoting
The catch: these textures [Non Power Of Two textures] cannot be used with mipmapping and they must not "repeat" (tile or wrap).
[...]
Without performing the above configuration, WebGL requires all samples of NPOT [Non Power Of Two] textures to fail by returning solid black: rgba(0,0,0,1).
So using an NPOT texture with incorrect texture parameters would give you the good old solid black.
Since three.js is open source, you can edit your local copy and remove the "offending" check.
However, a better, simpler, and more maintainable approach is to simply scale the UV mapping. After all, it is there just for this use case.
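As an illustration, a rough sketch of both options (assuming canvas is your non-power-of-two canvas; the sizes are made up): either give the texture parameters that pass the textureNeedsPowerOfTwo check, or keep a power-of-two texture and scale the UVs with repeat.
var texture = new THREE.Texture(canvas);
texture.wrapS = THREE.ClampToEdgeWrapping;
texture.wrapT = THREE.ClampToEdgeWrapping;
texture.minFilter = THREE.LinearFilter;   // no mipmapping, so no power-of-two requirement
texture.generateMipmaps = false;
texture.needsUpdate = true;

// Or: draw into a power-of-two canvas (e.g. 1024x1024) and scale the UV mapping
// so only the used 900x600 region is sampled:
texture.repeat.set(900 / 1024, 600 / 1024);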

three.js - Objects farther away from camera get jagged textures

I'm struggling with textures on objects that are a bit farther back in the scene. The textures become very jagged and create a disturbing effect as the camera moves. I've tried changing the anisotropy, and I've tried changing the min and mag filters, but nothing seems to help at all.
Code I'm using to load textures (all textures are 1024px by 1024px):
var texture = new THREE.Texture();
var texloader = new THREE.ImageLoader(manager);
texloader.load('static/3d/' + name + '.jpg', function (image) {
texture.image = image;
texture.needsUpdate = true;
texture.anisotropy = 1;
texture.minFilter = THREE.LinearFilter;
texture.magFilter = THREE.LinearMipmapLinearFilter;
});
You can see it in action here: http://www.90595.websys.sysedata.no/
gaitat is wrong, you do want the mipmaps.
The problem with your code is that they are not generated.
Using the console, I found that while "generateMipmaps" in your textures is set to "true", mipmaps are not generated, as seen in this screenshot: http://imgur.com/hAUEaur.
I looked at your textures, and I believe the mipmaps weren't generated due to your textures not being a power of 2 (e.g. 128x128, 256x256, 512x512). Try making your textures of width and height that are powers of 2 and I think the mipmaps will be generated and they won't look jagged anymore.
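A rough sketch of that idea (the nearestPow2 helper below is hypothetical, not part of three.js): scale the loaded image onto an offscreen canvas with power-of-two dimensions before assigning it to the texture, so mipmaps can be generated.
function nearestPow2(n) {
    return Math.pow(2, Math.round(Math.log(n) / Math.LN2));
}

texloader.load('static/3d/' + name + '.jpg', function (image) {
    var canvas = document.createElement('canvas');
    canvas.width = nearestPow2(image.width);
    canvas.height = nearestPow2(image.height);
    canvas.getContext('2d').drawImage(image, 0, 0, canvas.width, canvas.height);
    texture.image = canvas;
    texture.needsUpdate = true;    // mipmaps can now be generated on upload
});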
As objects move farther away from the camera, WebGL uses automatically generated lower-resolution textures called mipmaps. If you don't like them, disable them with:
texture.generateMipmaps = false;
Okay, so I thought I'd tried all the different mipmap filters, but apparently not. This is what ended up doing the trick:
texture.minFilter = THREE.NearestMipMapNearestFilter;
texture.magFilter = THREE.LinearMipMapLinearFilter;
Didn't need the anisotropy at all.

Three.js - Applying a simple texture on a shader material

Using Three.js (r67) with a WebGL renderer, I can't seem to get a plane with a shader material to display its texture. No matter what I do, the material just stays black.
My code at the moment looks quite basic:
var grassT = new Three.Texture(grass); // grass is an already loaded image.
grassT.wrapS = grassT.wrapT = Three.ClampToEdgeWrapping;
grassT.flipY = false;
grassT.minFilter = Three.NearestFilter;
grassT.magFilter = Three.NearestFilter;
grassT.needsUpdate = true;
var terrainUniforms = {
grassTexture : { type: "t", value: grassT},
}
Then I just have this relevant part in the vertex shader:
vUv = uv;
And on the fragment shader side:
gl_FragColor = texture2D(grassTexture, vUv);
This results in:
Black material.
No error in console.
gl_FragColor value is always (0.0, 0.0, 0.0, 1.0).
What I tried / checked:
Everything works fine if I just apply custom plain colors.
All is ok if I use vertexColors with plain colors too.
My texture width / height is indeed a power of 2.
The image is on the same server as the code.
Tested other images with the same result.
The image is actually loading in the browser debugger.
UVs for the mesh are correct.
Played around with wrapT, wrapS, minFilter, magFilter.
Adapted the mesh size so the texture has a 1:1 ratio.
Preloaded the image with requirejs image plugin and created the texture from THREE.Texture() instead of using THREE.ImageUtils();
Played around with needsUpdate : true;
Tried to add defines['USE_MAP'] during material instantiation.
Tried to add material.dynamic = true.
I have a correct rendering loop (interaction with the terrain is working).
What I still wonder:
It's a multiplayer game using a custom port with express + socket.io. Am I hit by any WebGL security policy?
I have no lighting logic at the moment; is that a problem?
Maybe the shader material needs other defines at instantiation?
I guess I'm overlooking something simple, which is why I'm asking...
Thanks.
I am applying various effects with the same shader. I have a custom API that merges the uniforms of the different effects using THREE.UniformsUtils.merge(). However, this function calls the clone() method on the texture, which resets needsUpdate to false before the texture reaches the renderer.
It appears that you should set the texture's needsUpdate property to true at the material level. If you set it on the texture itself and that uniform later gets merged, and therefore cloned, it will lose its needsUpdate flag.
The issue is also detailed here: https://github.com/mrdoob/three.js/issues/3393
In my case the following wasn't working (grassT is my texture):
grassT.needsUpdate = true
while the following works perfectly later on in the code:
material.uniforms.grassTexture.value.needsUpdate = true;
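For illustration, a rough sketch of that situation (the uniform names follow the earlier snippet; otherEffectUniforms and the shader sources are placeholders): merge() clones the uniforms, so set needsUpdate on the value that actually ends up in the material.
var uniforms = THREE.UniformsUtils.merge([terrainUniforms, otherEffectUniforms]);
var material = new THREE.ShaderMaterial({
    uniforms: uniforms,
    vertexShader: vertexShaderSource,
    fragmentShader: fragmentShaderSource
});
// The merged uniform holds a clone of grassT, so flag that clone, not the original:
material.uniforms.grassTexture.value.needsUpdate = true;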
Image loading is asynchronous. Most likely, you are rendering your scene before the texture image loads.
You must set the texture.needsUpdate flag to true after the image loads. three.js has a utility that will do that for you:
var texture = THREE.ImageUtils.loadTexture( "texture.jpg" );
Once rendered, the renderer sets the texture.needsUpdate flag back to false.
three.js r.68
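A rough sketch of handling this explicitly (names follow the question; a static scene is assumed): use the loader's onLoad callback to render again once the image has arrived, so the visible frame isn't the still-black texture.
var texture = THREE.ImageUtils.loadTexture('texture.jpg', undefined, function () {
    // onLoad: the loader has already flagged the texture for upload
    renderer.render(scene, camera);
});
terrainUniforms.grassTexture.value = texture;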

Outline object (normal scale + stencil mask) three.js

For some time, I've been trying to figure out how to do an object selection outline in my game. (So the player can see the object over everything else, on mouse-over)
This is how the result should look:
The solution I would like to use goes like this:
Layer 1: Draw model in regular shading.
Layer 2: Draw a copy in red color, scaled along normals using vertex shader.
Mask: Draw the model in flat black/white to use as a stencil mask for the second layer, hiding its insides and showing layer 1.
And here comes the problem. I can't really find any good learning materials about masks. Can I subtract the insides from the outline shape? What am I doing wrong?
I can't figure out how to stack my render passes to make the mask work. :(
Here's a jsfiddle demo
renderTarget = new THREE.WebGLRenderTarget(window.innerWidth, window.innerHeight, renderTargetParameters)
composer = new THREE.EffectComposer(renderer, renderTarget)
// composer = new THREE.EffectComposer(renderer)
normal = new THREE.RenderPass(scene, camera)
outline = new THREE.RenderPass(outScene, camera)
mask = new THREE.MaskPass(maskScene, camera)
// mask.inverse = true
clearMask = new THREE.ClearMaskPass
copyPass = new THREE.ShaderPass(THREE.CopyShader)
copyPass.renderToScreen = true
composer.addPass(normal)
composer.addPass(outline)
composer.addPass(mask)
composer.addPass(clearMask)
composer.addPass(copyPass)
Also, I have no idea whether to give the composer a render target or just the renderer. :( Should I have the first pass in the composer at all? Why do I need the copy pass? So many questions, I know, but there are just not enough resources to learn from; I've been googling for days.
Thanks for any advice!
Here's a jsfiddle with a working solution. You're welcome. :)
http://jsfiddle.net/Eskel/g593q/6/
Update with only two render passes (credit to WestLangley):
http://jsfiddle.net/Eskel/g593q/9/
The pieces missing were these:
composer.renderTarget1.stencilBuffer = true;
composer.renderTarget2.stencilBuffer = true;
outline.clear = false;
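For context, a rough sketch of how those pieces can fit into a full pass chain (one ordering that works for this approach; it assumes maskScene contains the un-scaled model and outScene the scaled red copy):
composer = new THREE.EffectComposer(renderer);
composer.renderTarget1.stencilBuffer = true;
composer.renderTarget2.stencilBuffer = true;

normal = new THREE.RenderPass(scene, camera);
mask = new THREE.MaskPass(maskScene, camera);   // write the model's silhouette to the stencil
mask.inverse = true;                            // later passes draw only outside the silhouette
outline = new THREE.RenderPass(outScene, camera);
outline.clear = false;                          // draw on top of the normal pass
clearMask = new THREE.ClearMaskPass();
copyPass = new THREE.ShaderPass(THREE.CopyShader);
copyPass.renderToScreen = true;

composer.addPass(normal);
composer.addPass(mask);
composer.addPass(outline);
composer.addPass(clearMask);
composer.addPass(copyPass);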
Now I think I've found a somewhat simpler solution in the THREEx library. It pre-scales the mesh, so you don't need a real-time shader for it.
http://jeromeetienne.github.io/threex.geometricglow/examples/geometricglowmesh.html

Supersample antialiasing with three.js

I want to render my scene at twice the resolution of my canvas and then downscale it before displaying it. How would I do that using threejs?
For me, the best way to get near-perfect AA without much work is to raise the renderer's pixel ratio (see the code below).
PS: if you increase it beyond 2, it starts to look too sharp.
renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setPixelRatio( window.devicePixelRatio * 1.5 );
This is my solution. The source comments should explain what's going on. Setup (init):
var renderer;
var composer;
var renderModel;
var effectCopy;
renderer = new THREE.WebGLRenderer({canvas: canvas});
// Disable autoclear, we do this manually in our animation loop.
renderer.autoClear = false;
// Double resolution (twice the size of the canvas)
var sampleRatio = 2;
// This render pass will render the big result.
renderModel = new THREE.RenderPass(scene, camera);
// Shader to copy result from renderModel to the canvas
effectCopy = new THREE.ShaderPass(THREE.CopyShader);
effectCopy.renderToScreen = true;
// The composer will compose a result for the actual drawing canvas.
composer = new THREE.EffectComposer(renderer);
composer.setSize(canvasWidth * sampleRatio, canvasHeight * sampleRatio);
// Add passes to the composer.
composer.addPass(renderModel);
composer.addPass(effectCopy);
Change your animation loop to:
// Manually clear your canvas.
renderer.clear();
// Tell the composer to produce an image for us. It will provide our renderer with the result.
composer.render();
Note: EffectComposer, RenderPass, ShaderPass and CopyShader are not part of the default three.js file. You have to include them in addition to three.js. At the time of writing they can be found in the threejs project under the examples folder:
/examples/js/postprocessing/EffectComposer.js
/examples/js/postprocessing/RenderPass.js
/examples/js/postprocessing/ShaderPass.js
/examples/js/shaders/CopyShader.js
Here's how you might be able to work it out: in your three.js initialization code, when you create your renderer, set it to render to a secondary, hidden canvas element that is twice as large as your primary canvas. Perform the necessary image manipulation on the secondary canvas, and then display the result on the primary canvas.
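A rough sketch of that approach (mainCanvas, scene, and camera are placeholders for your own objects): render into a hidden canvas at twice the size, then let the 2D context downscale it onto the visible canvas every frame.
var scale = 2;
var hidden = document.createElement('canvas');            // offscreen, never added to the DOM
var renderer = new THREE.WebGLRenderer({ canvas: hidden });
renderer.setSize(mainCanvas.width * scale, mainCanvas.height * scale);

var ctx = mainCanvas.getContext('2d');

function animate() {
    requestAnimationFrame(animate);
    renderer.render(scene, camera);
    // Drawing the big canvas into the small one downsamples (and thus smooths) the image.
    ctx.drawImage(hidden, 0, 0, mainCanvas.width, mainCanvas.height);
}
animate();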
