OpenGL ES 2.0: Alpha blending issues - opengl-es

I'm trying to blend my GUI components against the background, but at the moment it looks very strange; even when I set the alpha to 1.0 in the shader, the blending still looks wrong.
I enable the blending with the following code:
GL.Enable(All.Blend);
GL.BlendFunc(All.SrcAlpha, All.One);
With alpha set to 1.0 the buttons look like this (note: one button is missing):
The pixel shader is simple:
varying lowp vec2 textureCoordinates;
uniform sampler2D texture;
void main()
{
gl_FragColor = vec4(texture2D(texture, textureCoordinates).bgr, 1.0); // r and b swapped, because otherwise the colors come out reversed
}
One of the buttons:

Using OneMinusSrcAlpha instead of One as the destination factor solves the problem. SrcAlpha/One is additive blending, so the background always shows through; SrcAlpha/OneMinusSrcAlpha is standard alpha blending, where an opaque source fully covers the background.
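The difference between the two blend functions can be sketched on the CPU. Here is a minimal Python model of the per-channel blend equations (values in 0..1; the sample colors are just illustrative):

```python
def blend_additive(src, dst, a):
    # GL.BlendFunc(SrcAlpha, One): result = src * a + dst * 1, clamped to 1.0
    return tuple(min(s * a + d, 1.0) for s, d in zip(src, dst))

def blend_alpha(src, dst, a):
    # GL.BlendFunc(SrcAlpha, OneMinusSrcAlpha): result = src * a + dst * (1 - a)
    return tuple(s * a + d * (1.0 - a) for s, d in zip(src, dst))

src, dst = (0.8, 0.2, 0.2), (0.5, 0.5, 0.5)
print(blend_additive(src, dst, 1.0))  # (1.0, 0.7, 0.7) -- background still leaks through
print(blend_alpha(src, dst, 1.0))     # (0.8, 0.2, 0.2) -- opaque source replaces it
```

Even with alpha forced to 1.0, the additive function keeps adding the destination color, which is exactly the "strange" look described in the question.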

Related

Blending issues between InstancedMesh using a ShaderMaterial (trying to reproduce mix-blend-mode: overlay from Figma)

I've been working on blending between InstancedMesh objects using a ShaderMaterial. The blending between the InstancedMesh objects works fine with the built-in THREE.JS blending modes (Additive, Subtractive, ...), but I'm struggling to match the Figma design with those (see the attached screenshots: the first is the current result with THREE.JS using Additive blend mode, the second is the result I need from Figma).
Current Result using Additive Blending
Result needed from Figma
Figma uses mix-blend-mode: overlay between all the circles (the InstancedMesh in Three.js), so I tried adding some GLSL overlay blend code (https://github.com/jamieowen/glsl-blend) like this in the ShaderMaterial used by the InstancedMesh:
uniform vec3 uColors[6];
uniform float uThresholds[6];
varying vec2 vUv;
float blendOverlay(float base, float blend) {
return base<0.5?(2.0*base*blend):(1.0-2.0*(1.0-base)*(1.0-blend));
}
vec3 blendOverlay(vec3 base, vec3 blend) {
return vec3(blendOverlay(base.r,blend.r),blendOverlay(base.g,blend.g),blendOverlay(base.b,blend.b));
}
vec3 blendOverlay(vec3 base, vec3 blend, float opacity) {
return (blendOverlay(base, blend) * opacity + base * (1.0 - opacity));
}
void main() {
vec3 color = mix(uColors[0], uColors[1], smoothstep(uThresholds[0], uThresholds[1], vUv.y));
color = mix(color, uColors[2], smoothstep(uThresholds[1], uThresholds[2], vUv.y));
color = mix(color, uColors[3], smoothstep(uThresholds[2], uThresholds[3], vUv.y));
color = mix(color, uColors[4], smoothstep(uThresholds[3], uThresholds[4], vUv.y));
color = mix(color, uColors[5], smoothstep(uThresholds[4], uThresholds[5], vUv.y));
gl_FragColor = vec4(blendOverlay(color.rgb, ???), 1.0);
}
I understand that I would need some sampler2D uniforms to use with texture2D for the blendOverlay method to work, but my problem is: how can I render those textures?
If the overlay were just between a "background" and all the InstancedMesh objects, I could render the InstancedMesh to a render target once and use it as a texture. But here I need the overlay blending between all the InstancedMesh objects themselves. Should I render the InstancedMesh objects one by one into render targets and store every texture? I'm a bit lost here.
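As a side note, the per-channel overlay formula from the shader above can be sanity-checked on the CPU. Here is a minimal Python port (a sketch only, independent of the Three.js setup):

```python
# CPU port of the shader's per-channel overlay blend (values in 0..1).
def blend_overlay(base, blend):
    # Dark base values are multiplied toward black, bright ones screened toward white.
    return (2.0 * base * blend if base < 0.5
            else 1.0 - 2.0 * (1.0 - base) * (1.0 - blend))

print(blend_overlay(0.25, 0.5))  # 2 * 0.25 * 0.5 = 0.25
print(blend_overlay(0.75, 0.5))  # 1 - 2 * 0.25 * 0.5 = 0.75
# Blending with mid-gray (0.5) leaves the base unchanged -- a defining
# property of overlay, useful as a quick correctness check.
```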

Why does overriding one value in the shader make the cube turn white?

I've been trying to work with WebGL and finally managed to find a 1-line change that can break one of the demos.
https://github.com/KhronosGroup/WebGL/blob/master/sdk/demos/webkit/SpiritBox.html
has a vertex shader :
uniform mat4 u_modelViewProjMatrix;
uniform mat4 u_normalMatrix;
uniform vec3 lightDir;
attribute vec3 vNormal;
attribute vec4 vTexCoord;
attribute vec4 vPosition;
varying float v_Dot;
varying vec2 v_texCoord;
void main()
{
gl_Position = u_modelViewProjMatrix * vPosition;
v_texCoord = vTexCoord.st;
vec4 transNormal = u_normalMatrix * vec4(vNormal, 1);
v_Dot = max(dot(transNormal.xyz, lightDir), 0.0);
}
The demo shows a spinning box with a picture of a puppy on each face.
If we add a single line to the end of the shader function:
v_Dot = 1.0;
then the box now renders as white. Switching from =1.0 to
v_Dot = max(v_Dot, 1.0);
makes the puppy reappear.
Here's a copy of the fragment shader just in case the link is broken:
precision mediump float;
uniform sampler2D sampler2d;
varying float v_Dot;
varying vec2 v_texCoord;
void main()
{
vec2 texCoord = vec2(v_texCoord.s, 1.0 - v_texCoord.t);
vec4 color = texture2D(sampler2d, texCoord);
color += vec4(0.1, 0.1, 0.1, 1);
gl_FragColor = vec4(color.xyz * v_Dot, color.a);
}
What the heck is going on here? I am using Firefox version 24.7.0.
Yes! I ran into exactly this bug, trying to add this exact same line to this exact same demo!
It is certainly a bug. Furthermore it manifests only on certain platforms.
E.g. when I view the page on my macbook pro, it looks fine
(spinning textured cube with bright non-directional lighting, as intended)
but when I run it on my ubuntu box it fails (no texture, as you described).
This is using chrome "Version 46.0.2490.33 beta (64-bit)" on both machines.
Here is my explanation (essentially confirming @DietrichEpp's comment).
There are 3 attributes: 0:vNormal, 1:vTexCoord, 2:vPosition,
and the main script makes hard-coded assumptions that the indices are precisely these
(Note that it currently says "vColor" instead of "vTexCoord", but I think that part is harmless.)
But when you add "v_Dot=1.", the optimizer (on the problem platform) notices vNormal is no longer used,
so it removes it, and the attributes are now 0:vTexCoord, 1:vPosition,
and so the main script's indexing becomes incorrect.
Sort that out (i.e. remove all now-unused stuff and adjust the main script's indexing accordingly),
and the program works again.
This is obviously really fragile, and I'm not familiar enough with webgl or opengl ES to say
whether this shows a bug in the shader compiler/optimizer, or the main script, or the utility library,
or some subset of the above, or something else.
I was just now doing a web search trying to figure out where to report this bug, and that's how I found your question.
The dot product here represents the angle between the surface normal and the light. By forcing v_Dot to 1.0 everywhere, you're essentially asserting that there is no angle between the surface normal and the light. So for every point on the surface, it acts as if the light were directly above it and the surface is fully illuminated.
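The Lambert term the vertex shader computes can be checked numerically. A minimal Python version of `max(dot(n, l), 0.0)` (vector names are illustrative):

```python
# Lambertian lighting term, as in the shader: v_Dot = max(dot(n, l), 0.0)
def lambert(normal, light_dir):
    d = sum(n * l for n, l in zip(normal, light_dir))
    return max(d, 0.0)

up = (0.0, 1.0, 0.0)
print(lambert(up, (0.0, 1.0, 0.0)))  # light directly above: 1.0 (fully lit)
print(lambert(up, (1.0, 0.0, 0.0)))  # light at 90 degrees: 0.0 (unlit)
# Forcing v_Dot = 1.0 makes every fragment behave like the first case.
```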

How can I create a laser beam effect with additive blending?

I would like to create a laser beam effect as described here: http://codepoke.net/2011/12/27/opengl-libgdx-laser-fx/
But when I set blending to THREE.AdditiveBlending, the color of the laser beam gets blended with the background color. http://i.imgur.com/kSCFB3U.png (3rd pic on the right).
I don't know whether it's related to the wrong blending or whether something is wrong with the color mixing in my shader; however, I suspect it has something to do with blending.
My shader code is just a few lines; I use the same texture as in the blog entry.
uniform sampler2D uTex;
varying vec2 vUv;
void main() {
vec4 texelColor = texture2D( uTex, vUv );
vec4 color = vec4(1.0,0.0,0.0,1.0);
gl_FragColor = vec4(mix(color.rgb,texelColor.rgb,texelColor.a),texelColor.a);
}
How can I achieve the cool looks like in the video so that the laser color stays red if it's red but still use additive blending to make it look better?
What you're doing is already right. Replace the grass background with a dark background like in the example you're using as a reference and you should get similar results.
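The washed-out look follows directly from the additive blend equation. A short Python sketch (illustrative colors, values in 0..1):

```python
# Additive blending per channel: result = min(src + dst, 1.0)
def additive(src, dst):
    return tuple(min(s + d, 1.0) for s, d in zip(src, dst))

laser = (1.0, 0.0, 0.0)                  # pure red beam
print(additive(laser, (0.0, 0.0, 0.0)))  # over black: (1.0, 0.0, 0.0), stays red
print(additive(laser, (0.3, 0.6, 0.3)))  # over green grass: (1.0, 0.6, 0.3), washed out
```

Over a dark background the destination contributes almost nothing, so the beam keeps its color; over bright grass, the green channel of the background adds in and shifts the red toward yellow, which is exactly the artifact in the screenshot.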

OpenGL ES 2.0, how to animate texture's opacity

I'm using OpenGL ES 2.0 to create the following scene:
Draw a background image on the entire screen and above it draw another overlay image that fades in and out (alpha changes)
I also need to use "Screen blend" to blend the overlay and the background textures.
So I created a shader that blends the two textures. I thought I could use a uniform (randomAlpha) to change the overlay texture's alpha over time and create a fade animation, but the following code doesn't do the trick: the overlay texture's opacity doesn't change!
I know there is an "alpha blend" I could use to blend the overlay and the background textures, but the problem is that I want the final overlay (after the opacity change) to blend with the background using a "screen blend", not an "alpha blend".
This is what my fragment shader's main method looks like:
void main()
{
mediump vec4 background = texture2D(backgroundTexture, backgroundCoords);
mediump vec4 overlay = texture2D(overlayTexture, overlayCoords);
overlay.a = randomAlpha;
// add overlay to background (using screen blend)
mediump vec4 whiteColor = vec4(1.0);
gl_FragColor = whiteColor - ((whiteColor - overlay) * (whiteColor - background));
}
Clearly I'm missing something important here.
How can I change the texture's opacity? How can I create the fade effect?
As you already figured, it's just a matter of blending.
What you're missing is that you need to re-calculate the RGB channels when you change the alpha value, if you want to blend the two textures together.
void main()
{
mediump vec4 background = texture2D(backgroundTexture, backgroundCoords);
mediump vec4 overlay = texture2D(overlayTexture, overlayCoords);
overlay.a = randomAlpha;
background.a = 1.0 - overlay.a;
gl_FragColor = vec4(background.rgb * background.a + overlay.rgb * overlay.a, 1.0);
}
Here is a simplified version of the above code (the version above is just easier to understand and read):
void main()
{
mediump vec4 background = texture2D(backgroundTexture, backgroundCoords);
mediump vec4 overlay = texture2D(overlayTexture, overlayCoords);
overlay.rgb *= randomAlpha;
background.rgb *= 1.0 - randomAlpha;
gl_FragColor = vec4(background.rgb + overlay.rgb, 1.0);
}

GLSL: gl_FragCoord issues

I am experimenting with GLSL for OpenGL ES 2.0. I have a quad and a texture I am rendering. I can successfully do it this way:
//VERTEX SHADER
attribute highp vec4 vertex;
attribute mediump vec2 coord0;
uniform mediump mat4 worldViewProjection;
varying mediump vec2 tc0;
void main()
{
// Transforming The Vertex
gl_Position = worldViewProjection * vertex;
// Passing The Texture Coordinate Of Texture Unit 0 To The Fragment Shader
tc0 = vec2(coord0);
}
//FRAGMENT SHADER
varying mediump vec2 tc0;
uniform sampler2D my_color_texture;
void main()
{
gl_FragColor = texture2D(my_color_texture, tc0);
}
So far so good. However, I'd like to do some pixel-based filtering, e.g. a median filter. So I'd like to work in pixel coordinates rather than in normalized ones (tc0) and then convert the result back to normalized coordinates. Therefore, I'd like to use gl_FragCoord instead of a UV attribute (tc0), but I don't know how to go back to normalized coordinates, because I don't know the range of gl_FragCoord. Any idea how I could get it? I have gotten this far, using a fixed value for 'normalization', though it's not working perfectly as it causes stretching and tiling (but at least it shows something):
//FRAGMENT SHADER
varying mediump vec2 tc0;
uniform sampler2D my_color_texture;
void main()
{
gl_FragColor = texture2D(my_color_texture, vec2(gl_FragCoord) / vec2(256, 256));
}
So, the simple question is: what should I use in place of vec2(256, 256) so that I get the same result as if I were using the UV coords?
Thanks!
gl_FragCoord is in screen coordinates, so to get normalized coords you need to divide by the viewport width and height. You can use a uniform variable to pass that information to the shader, since there is no built in variable for it.
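The conversion is just a division by the viewport size, which can be sketched in Python (viewport dimensions here are illustrative):

```python
# gl_FragCoord.xy is in window pixels (pixel centers at x + 0.5),
# so dividing by the viewport size recovers normalized 0..1 coordinates.
def frag_to_uv(frag_x, frag_y, viewport_w, viewport_h):
    return (frag_x / viewport_w, frag_y / viewport_h)

# For a 512x256 viewport, the center of the top-right pixel:
print(frag_to_uv(511.5, 255.5, 512, 256))  # close to (1.0, 1.0)
```

In the shader, the viewport size would arrive as a uniform (e.g. a hypothetical `uniform mediump vec2 uViewportSize;`) and the division becomes `gl_FragCoord.xy / uViewportSize`.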
You can also sample the texture with un-normalized coordinates by:
sampling with texture() from a GL_TEXTURE_RECTANGLE
sampling with texelFetch() from a regular texture or texture buffer
(Note that neither option is available in OpenGL ES 2.0, which the question targets.)