Thanks to @Mugen87, I managed to create a dynamic mask on my scene using a post-processing method: https://jsfiddle.net/40gef6sz/ (if you move the camera you will see the cube disappear behind an invisible circle, made with this image: https://i.imgur.com/sJwKYvZ.jpg).
I would like to add some effects to my mask using a fragment shader. In this fiddle you can see my texture moving (I replaced a THREE.MeshBasicMaterial with a THREE.ShaderMaterial): https://jsfiddle.net/grc_michael/ghmf5sdo/
When I combine this new material with the post-processing method, I can't achieve the same result as in the first fiddle. I don't know where to add my #define ALPHATEST value inside my fragment shader.
You can see the result here : https://jsfiddle.net/grc_michael/82bkLn96/
I think I'm quite close to the expected result. Does anyone know where to add the ALPHATEST value properly inside my fragment shader? Thank you.
Here is the original fiddle, updated with your custom shader material: https://jsfiddle.net/k2c5upfo/1/
When using an image as an alpha map, you can sample it in the same way the built-in materials do. That means the alpha test looks like this:
if ( texture2D( texture1, xy ).g < 0.5 ) discard;
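In context, the discard goes right after sampling the animated mask in the fragment shader, before anything is written to gl_FragColor. Here is a minimal sketch of such a fragment shader as a JavaScript string (the uniform names texture1 and time are assumptions based on the fiddles, not code copied from them):

var fragmentShader = [
    'uniform sampler2D texture1;',
    'uniform float time;',
    'varying vec2 vUv;',
    'void main() {',
    '    vec2 uv = vUv + vec2( time * 0.05, 0.0 );', // scroll the mask over time
    '    vec4 texel = texture2D( texture1, uv );',
    '    if ( texel.g < 0.5 ) discard;', // the alpha test from above
    '    gl_FragColor = texel;',
    '}'
].join( '\n' );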
I need to post-process the scene that I rendered previously into textureA (as a render target) with my custom shader, and save the result to textureB (input: textureA, output: textureB). Therefore, I don't need a scene and a camera. I think it's too simple to even bother with three.js classes like EffectComposer, ShaderPass, CopyShader, TexturePass, etc.
So, how do I set up this computing-like post-processing in a simple way?
I've created a fiddle for you that shows a basic post-processing effect without EffectComposer. The idea behind this code is to work with an instance of WebGLRenderTarget.
First, you draw the scene into this render target. In the next step, you use this render target as a texture for a plane which is rendered with an orthographic camera. The code in the render loop looks like this:
renderer.clear();
renderer.render( scene, camera, renderTarget ); // draw the scene into the render target
renderer.render( sceneFX, cameraFX ); // draw the post-processing plane to the screen
The corresponding material of this plane is your custom shader. I've used a luminosity shader from the official repo.
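For reference, here is a minimal sketch of the setup those three lines rely on (the names renderTarget, sceneFX and cameraFX match the loop above; the grayscale math below is only an illustration, not the exact luminosity shader from the repo; passing the render target as third argument to renderer.render() is the R91-era API):

var renderTarget = new THREE.WebGLRenderTarget( window.innerWidth, window.innerHeight );

// Fullscreen quad that displays the render target through the custom shader.
var cameraFX = new THREE.OrthographicCamera( - 1, 1, 1, - 1, 0, 1 );
var sceneFX = new THREE.Scene();

var materialFX = new THREE.ShaderMaterial( {
    uniforms: {
        tDiffuse: { value: renderTarget.texture } // the scene rendered in step one
    },
    vertexShader: [
        'varying vec2 vUv;',
        'void main() {',
        '    vUv = uv;',
        '    gl_Position = vec4( position.xy, 0.0, 1.0 );', // the 2x2 plane is already in clip space
        '}'
    ].join( '\n' ),
    fragmentShader: [
        'uniform sampler2D tDiffuse;',
        'varying vec2 vUv;',
        'void main() {',
        '    vec4 texel = texture2D( tDiffuse, vUv );',
        '    float l = dot( texel.rgb, vec3( 0.299, 0.587, 0.114 ) );', // perceived brightness
        '    gl_FragColor = vec4( vec3( l ), texel.a );',
        '}'
    ].join( '\n' )
} );

sceneFX.add( new THREE.Mesh( new THREE.PlaneBufferGeometry( 2, 2 ), materialFX ) );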
"Therefore, I don't need a scene and a camera."
Please do it like in the example. That's the intended way to use the library.
Demo: https://jsfiddle.net/f2Lommf5/5149/
three.js R91
I am trying to implement new custom features in MeshStandardMaterial; in particular, I would like to add the possibility to use two normal maps with different UV sets. I will then combine them inside the fragment shader.
So far I have "doubled" MeshStandardMaterial and made WebGLProgram insert a keyword like "Use NormalMap2". The next step would be to mess around with the actual GLSL code.
Is there some way to print the fragment shader, or somehow see what has been passed to it?
The easiest way to debug my code was to use WebGL Inspector; it shows me which textures have been passed to the shader, and it also shows me all the shader code.
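If you would rather stay inside three.js, onBeforeCompile on the material also lets you log (and patch) the generated GLSL. A minimal sketch; note that the source at this point still contains the #include <...> chunk directives, which the renderer resolves afterwards:

var material = new THREE.MeshStandardMaterial();

material.onBeforeCompile = function ( shader ) {
    // Inspect or modify the shader template before it is compiled.
    console.log( shader.vertexShader );
    console.log( shader.fragmentShader );
};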
I'm trying to resolve some stitching that occurs when using a mesh in wireframe mode together with lines placed at the same coordinates as the mesh. The idea is that they serve as two representation modes of the same data in my software.
I created a jsFiddle for testing purposes: http://jsfiddle.net/Ludobaka/c0tq3gd0/
I already tried some tips from three.js topics here, such as:
Using renderOrder on THREE.Mesh to render my wireframe first, and using depthFunc on THREE.Material with THREE.LessDepth for the line representation (see the sketch after the snippet below).
Using the polygonOffset properties of the mesh's THREE.Material, like the following code:
mat.polygonOffset = true;
mat.polygonOffsetFactor = 1.0;
mat.polygonOffsetUnits = 4.0; // note: the property is polygonOffsetUnits, not polygonOffsetUnit
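For reference, the renderOrder/depthFunc attempt from the first point looked roughly like this (mesh and lines stand for the two objects in the fiddle; this is a sketch, not code from it):

mesh.renderOrder = 0; // the wireframe mesh is rendered first
lines.renderOrder = 1; // the lines are rendered afterwards
lines.material.depthFunc = THREE.LessDepth; // equal depths fail the test, so the lines lose ties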
But it looks like, because the wireframe is drawn with gl.LINES, it is not affected by these properties.
I want my wireframe to always be on top of the lines. Do you know another way to proceed?
Thanks
In two cases, I have a THREE.ShaderMaterial that doesn't correctly render an object, omitting its texture.
In both examples, the middle object uses a basic THREE.MeshPhongMaterial.
Example 1: http://jsfiddle.net/sG9MP/4/ The object that's closest to the screen never shows.
In this one, it works with renderer.render(...) but not with composer.render(...):
renderer.render( scene, camera ); // this works
//composer.render(); // this does not
Example 2: http://jsfiddle.net/sG9MP/5/ Here I'm trying to duplicate the MeshPhongMaterial shader as a base so I can modify it. I tried to replicate it exactly: I copied the uniforms, vertex shader, and fragment shader, and replicated what's in the material object. I can't see anything different, so I don't get why it isn't working the same as the standard three.js Phong shader.
So these are two cases where I'm using THREE.ShaderMaterial and it isn't rendering correctly, and I can't figure out why. In the second example (the one I really need fixed; the first was an old test), WebGL Inspector shows that the scene often looks fine until a bindTexture(TEXTURE_2D, null) call that happens under the hood in three.js, though sometimes it just draws without it. In the first example, it always draws without it.
I feel like I must be missing some sort of flag on the renderer, or the composer, or something. Or, in my second example, where I'm trying to copy the three.js Phong shader, maybe I didn't copy something exactly.
The goal here is just to copy the Phong shader so I can modify its uniforms, vertex shader, and fragment shader. Sadly, I can't simply .clone() it, since the vertex and fragment shaders can't be modified after the material is compiled.
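For reference, the usual way to get a modifiable copy of the built-in Phong shader is to assemble a ShaderMaterial from THREE.ShaderLib rather than copying the source by hand. A sketch (texture stands for whatever map should be applied):

var phongShader = THREE.ShaderLib.phong;

var phongMat = new THREE.ShaderMaterial( {
    uniforms: THREE.UniformsUtils.clone( phongShader.uniforms ),
    vertexShader: phongShader.vertexShader,
    fragmentShader: phongShader.fragmentShader,
    lights: true // without this flag the scene's lights are not passed to the shader
} );

// The uniform is what the shader actually samples ...
phongMat.uniforms.map.value = texture;
// ... but the renderer also checks the top-level property when
// deciding which #defines (like USE_MAP) to set.
phongMat.map = texture;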
It looks like while ShaderMaterial.map was being set, ShaderMaterial.uniforms.map.value was not consistently set.
I really don't understand this, though. In some cases I had issues when not setting things at the top level of the ShaderMaterial; in other cases I had issues when not setting the uniforms.
In my material, I just went and added this:
// Keep the top-level material properties and the corresponding
// uniform values in sync.
for ( var k in phongMat ) {
    if ( typeof phongMat.uniforms[ k ] !== 'undefined' ) {
        phongMat.uniforms[ k ].value = phongMat[ k ];
    }
}
I want to render all the vertices and the lines of the mesh.
I tried:
1) A custom shader, following this link:
https://github.com/mrdoob/three.js/issues/2028
2) Setting transparency on the material, like:
new THREE.MeshNormalMaterial( { transparent: true, opacity: 0.5 } )
3) Setting an ink material on the model in 3ds Max, exporting to an OBJ file, and loading it in three.js with OBJLoader.
None of them works well.
Is there any solution to load meshes from a 3ds Max model (using OBJLoader) and apply an ink material to them in three.js, just like what we can do in 3ds Max?
See below for an example:
http://makeitcg.com/wireframe-rendering-techniques-in-3ds-max/160/
First of all, number 3) will NEVER work with OBJ, because OBJ will not export the ink shader from 3ds Max. And just to be clear, I don't think there is ANY way to export the 3ds Max shader.
Then, I don't understand what 2) is about. What do you want to achieve anyway? Do you want to make an object transparent? What does that have to do with Ink 'n Paint? By the way, I am not sure whether transparency works with MeshNormalMaterial; I guess it should, but Lambert and Phong will definitely work with transparency.
The problem is that you do not state what you want to achieve. From 1) it seems you want an outline around an object. For an outline, see this example: http://stemkoski.github.io/Three.js/Outline.html or this more advanced one: http://stemkoski.blogspot.de/2013/07/shaders-in-threejs-glow-and-halo.html
If you want transparent objects, try MeshLambertMaterial or MeshBasicMaterial if MeshNormalMaterial does not work (I am not sure right now and can't check).
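For what it's worth, a minimal sketch of such a transparent material (the color value is a placeholder):

var material = new THREE.MeshLambertMaterial( {
    color: 0x2194ce, // placeholder color
    transparent: true, // enable alpha blending for this material
    opacity: 0.5 // 0.0 is fully transparent, 1.0 is fully opaque
} );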
As for 3): if you want an Ink 'n Paint shader, you need a custom shader. See an example here:
http://www.chromeexperiments.com/detail/cel-shader/
and already done with three.js here:
http://learningthreejs.com/data/THREEx/docs/THREEx.CelShader.html
Just apply the shader; see the examples for setting up a ShaderMaterial. A minimal sketch follows below.
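To make that concrete, here is a minimal cel-shading sketch in the spirit of those examples (not the THREEx shader itself; the light direction uniform and the three-band quantization are arbitrary choices, and the light direction is treated as a view-space vector for simplicity):

var celMaterial = new THREE.ShaderMaterial( {
    uniforms: {
        lightDir: { value: new THREE.Vector3( 0.5, 1.0, 0.75 ).normalize() },
        color: { value: new THREE.Color( 0xff8800 ) }
    },
    vertexShader: [
        'varying vec3 vNormal;',
        'void main() {',
        '    vNormal = normalize( normalMatrix * normal );',
        '    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );',
        '}'
    ].join( '\n' ),
    fragmentShader: [
        'uniform vec3 lightDir;',
        'uniform vec3 color;',
        'varying vec3 vNormal;',
        'void main() {',
        '    float d = max( dot( normalize( vNormal ), normalize( lightDir ) ), 0.0 );',
        '    d = ceil( d * 3.0 ) / 3.0;', // quantize the lighting into 3 discrete bands
        '    gl_FragColor = vec4( color * d, 1.0 );',
        '}'
    ].join( '\n' )
} );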