I'm working in WebGL and I'm pretty new to OpenGL. I'm having trouble with the blending function. My options look like:
gl.enable(gl.BLEND)
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA)
I render a source rectangle with color [0,0,0,0.5] on top of a destination background with color [0,0,0,1]. Based on everything I've read, I expect the result to be black. Instead it looks to be about 25% white. Here's what I get when I render red and black rectangles with alpha values ranging from 0.0 to 1.0.
View live demo and source here. Am I misunderstanding the blending function, and if so, how do I get what I expect? Thanks!
You should specify a separate blend function for alpha:
gl.enable(gl.BLEND);
gl.blendEquation(gl.FUNC_ADD);
gl.blendFuncSeparate(
  gl.SRC_ALPHA,
  gl.ONE_MINUS_SRC_ALPHA,
  gl.ONE,
  gl.ONE_MINUS_SRC_ALPHA
);
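For context: with the single blendFunc, the destination alpha gets blended the same way as the colors, so drawing [0,0,0,0.5] over [0,0,0,1] leaves the canvas pixel at alpha 0.5*0.5 + 1.0*0.5 = 0.75. The browser then composites that 75%-opaque black canvas over the page background (white by default), which is what reads as roughly 25% white. If you would rather keep the simple blendFunc, a rough alternative (sketch only, untested here) is to force the canvas alpha back to 1.0 after the frame is drawn:
// After all draw calls for the frame, overwrite just the alpha channel so the
// browser treats the canvas as fully opaque.
gl.colorMask(false, false, false, true); // write alpha only
gl.clearColor(0, 0, 0, 1);               // clear value with alpha = 1.0
gl.clear(gl.COLOR_BUFFER_BIT);
gl.colorMask(true, true, true, true);    // restore normal writes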
I'm trying to implement a shader that can round the corners of a texture, and apply a border around it. This would be used in a window manager to have rounded windows with borders around them. The current fragment shader is as shown below (pardon the formatting).
const GLchar surface_fragment_src_rgba[] =
"precision mediump float;\n"
"varying vec2 v_texcoord;\n"
"uniform sampler2D tex;\n"
"\n"
"uniform vec2 half_size;\n"
"uniform vec2 position;\n"
"uniform float alpha;\n"
"uniform float radius;\n"
"uniform float half_border_thickness;\n"
"uniform vec4 border_color;\n"
"\n"
"float RectSDF(vec2 p, vec2 b, float r) {\n"
" vec2 d = abs(p) - b + vec2(r);\n"
" return min(max(d.x, d.y), 0.0) + length(max(d, 0.0)) - r;\n"
"}\n"
"\n"
"void main() {\n"
" vec2 center = gl_FragCoord.xy - position - half_size;\n"
" float dist = RectSDF(center, half_size + half_border_thickness, radius - half_border_thickness);\n"
"\n"
" vec4 tex_color = texture2D(tex, v_texcoord) * alpha;\n"
" vec4 from_color = border_color;\n"
" vec4 to_color = vec4(0.0);\n"
" if (half_border_thickness > 0.0) {\n"
" if (dist < 0.0) {\n"
" to_color = tex_color;\n"
" }\n"
" dist = abs(dist) - half_border_thickness;\n"
" } else {\n"
" from_color = tex_color;\n"
" }\n"
"\n"
" float blend_amount = smoothstep(-1.0, 1.0, dist);\n"
" gl_FragColor = mix(from_color, to_color, blend_amount);\n"
"}\n";
I have a mockup on shadertoy.
I am pretty confident that the shader works as intended, but my problem is that the shader doesn't operate outside of the original bounds of the texture. As an example, this is what it looks like when rendering a window. If anyone knows how I can render the region of the window plus the borders, please let me know!
As a side note, if this is a suboptimal way to go about this, please let me know :) Originally, I rendered borders with a separate shader, but I figured I'd need knowledge of the texture's colors to blend the borders for antialiased rounded corners.
I would like to map an image I have to the face of a box in POV-Ray.
The image's dimensions are 1500x1125
(Example Image)
So I set up a scene with a light source above a camera looking at a box:
camera{location <3,1.8,0> look_at <3,1.8,1>}
light_source{<3,20,0> color rgb <1,1,1>}
box{<0,0,0> <1,0.75,1> texture{pigment{image_map{png "Test1.png"}}} translate <2.5,1.425,3>}
The face of the box is 1 x 0.75 (z is not relevant), which is the same 4:3 ratio as the image.
However, when the scene is rendered, the width of the image maps perfectly onto the box but some of the height is cut off. The image does not look stretched and I am confused why it does not fit.
IIRC, POV-Ray will always read images as if they had a 1:1 aspect ratio: an image_map is mapped onto the unit square in the x-y plane regardless of the image's pixel dimensions.
If you insert a scale inside your pigment statement before using it, that should fix it:
box{
  <0,0,0> <1,0.75,1>
  texture{
    pigment {
      image_map{png "Test1.png"}
      scale <1, 0.75, 1>
    }
  }
  translate <2.5,1.425,3>
}
(I apologize for not testing this to be really sure right now).
I am trying to implement WebGL for my game. The problem is I use multiple canvases, and my WebGL-enabled canvas is semi-transparent over my terrain canvas (which is drawn once and just moves with the player). Here is a picture:
I've searched for the past 2 hours on Google and can't find anything that has helped.
I have tried:
getContext("experimental-webgl",{alpha: false});
This just hides the terrain completely (now all black), but my WebGL-drawn objects have the correct color.
gl.pixelStorei(gl.UNPACK_PREMULTIPLY_ALPHA_WEBGL, false);
gl.pixelStorei(gl.UNPACK_COLORSPACE_CONVERSION_WEBGL, false);
Neither of these did anything noticeable.
gl.clearColor(0, 0, 0, 0)
Didn't affect the outcome, still looks like the screenshot above.
Everything else here: http://games.greggman.com/game/webgl-and-alpha/
Nothing seems to work. Why is what is drawn on the canvas semi-transparent? There is no CSS affecting the canvas element.
Figured it out! I am using WebGL-2D, which is a JavaScript file that implements the 2D context API on top of WebGL. So if I call drawImage, it actually handles that with WebGL. In the getContext definition I had to change:
gl.colorMask(1,1,1,0);
to
gl.colorMask(1,1,1,1);
The colorMask() method specifies whether red, green, blue, and alpha can or cannot be written into the frame buffer.
I have no idea why it was 0 before.
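Presumably, with the alpha write mask off, the drawing buffer kept its initial alpha of 0 everywhere, so the browser composited the whole canvas as semi-transparent over the terrain canvas underneath. For reference, the idiomatic form of the call takes booleans (the 1/0 form works because WebGL coerces the values):
// Allow writes to all four channels, including alpha, so drawn pixels end up
// opaque in the drawing buffer.
gl.colorMask(true, true, true, true);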
Created a simple scene with a cube in it. I was able to see the color of the containing element (body) in the background.
Added an FXAA shader and the antialiasing works well. However, the background is now black, so I can no longer see the color of the containing element.
Added the following code:
var target = new THREE.WebGLRenderTarget(512, 512);
var composer = new THREE.EffectComposer( renderer, target );
in order to set the effect composer render target format to THREE.RGBAFormat, rather than the default THREE.RGBFormat.
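For reference, the same thing with the format spelled out explicitly (a sketch only; whether the options object is actually needed depends on the three.js version in use):
var target = new THREE.WebGLRenderTarget(512, 512, {
  minFilter: THREE.LinearFilter,
  magFilter: THREE.LinearFilter,
  format: THREE.RGBAFormat
});
var composer = new THREE.EffectComposer(renderer, target);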
This makes the background work properly, but then there are black and white edges around the cube and the antialiasing does not look very good.
Repeated the above but used the Sepia shader instead of the FXAA shader. This works correctly. The cube looks sepia and the background containing element color is correct.
Are there any workarounds to allow antialiasing and transparent background?
Thanks for any help
I read your issue and there seems to be a good source that can solve it, or at least lead you down the right path. Check out: https://github.com/mrdoob/three.js/issues/2125
Hope this helps.
Check my answer at https://stackoverflow.com/a/21056080/2482976: the FXAA shader needed to be updated to handle the transparent background.
I am having a problem with WebGL. Semi-transparent textures appear semi-transparent, but they also come out white; the color from the texture isn't taken into account while rendering...
This is what I've set:
gl.blendFunc(BlendingFactorSrc.SRC_ALPHA, BlendingFactorDest.ONE_MINUS_SRC_ALPHA);
What are the possible solutions?
Typically if white appears unexpectedly in a WebGL scene, it's blending in from the CSS background of the web page itself (which defaults to white unless you change it).
In other words, with the default WebGL settings, any shader that writes alpha values less than 1.0 to the color buffer will make the <canvas> itself become translucent and show the web page's background. I believe you can change the WebGLContextAttributes settings to disable this behavior, or just make sure your shaders always output alpha values of 1.0.
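For example, a minimal sketch of the first option, assuming canvas is your existing <canvas> element: requesting an opaque drawing buffer means the alpha values your shaders write can never expose the page background.
// Opaque drawing buffer: the page can no longer show through the canvas,
// regardless of the alpha values the shaders write.
const gl = canvas.getContext('webgl', { alpha: false });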
For me setting these WebGL context attributes solved the issue:
const gl = canvas.getContext('webgl', {
  alpha: true,
  premultipliedAlpha: false,
});
Without premultipliedAlpha: false, WebGL rendered pixels that were not fully transparent as white instead of at their intended opacity (for example 0.1); with the default premultipliedAlpha: true, the browser treats the canvas colors as already multiplied by alpha, so un-premultiplied output can look brighter than intended.
With premultipliedAlpha disabled, the pixels render at the proper opacity level.