OpenGL transparency in texture when rendering with the stencil buffer - opengl-es

The question has been updated thanks to the comments.
[Screenshot of how the textures overlap]
To draw two points with a brush texture, the stencil buffer is used to keep the transparent textures from overlapping; the following code is used:
glEnable(GL_STENCIL_TEST.gluint)
glClear(GL_STENCIL_BUFFER_BIT.gluint | GL_DEPTH_BUFFER_BIT.gluint)
// First point: always pass the stencil test and write 1 where the point is drawn
glStencilOp(GL_KEEP.gluint, GL_KEEP.gluint, GL_REPLACE.gluint)
glStencilFunc(GL_ALWAYS.gluint, 1, 1)
glStencilMask(1)
glDrawArrays(GL_POINTS.gluint, 0, 1)
// Second point: draw only where the stencil buffer does not already contain 1
glStencilFunc(GL_NOTEQUAL.gluint, 1, 1)
glStencilMask(1)
glDrawArrays(GL_POINTS.gluint, 1, 1)
glDisable(GL_STENCIL_TEST.gluint)
The stencil buffer works; however, each point fills a full rectangle in the stencil buffer, even though the texture image has transparency. So maybe the texture is used in the wrong way?
The texture is loaded like this:
glGenTextures(1, &gl_id)
glBindTexture(GL_TEXTURE_2D.gluint, gl_id)
glTexParameteri(GL_TEXTURE_2D.gluint, GL_TEXTURE_MIN_FILTER.gluint, GL_LINEAR)
glTexImage2D(GL_TEXTURE_2D.gluint, 0, GL_RGBA, gl_width.int32, gl_height.int32, 0, GL_RGBA.gluint, GL_UNSIGNED_BYTE.gluint, gl_data)
Blending is set up as:
glEnable(GL_BLEND.gluint)
glBlendFunc(GL_ONE.gluint, GL_ONE_MINUS_SRC_ALPHA.gluint)
Could you please advise where to look in order to write 1s into the stencil buffer for exactly the non-transparent area of the brush image?

I recommend discarding the transparent parts of the texture in the fragment shader. A fragment can be skipped entirely in the fragment shader with the discard keyword.
See Fragment Shader - Special operations.
Use a small threshold and discard a fragment if the alpha channel of the texture color is below that threshold:
vec4 texture_color = .....;
float threshold = 0.01;
if ( texture_color.a < threshold )
    discard;
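Put together, a complete minimal fragment shader built around this test could look like the sketch below, written here as a C string as it would be handed to glShaderSource. The names u_texture and v_uv are assumptions for illustration, not names taken from the question.
/* Hedged sketch: a GLSL ES 1.00 fragment shader that discards fully
   transparent brush texels so they never write to the stencil or color
   buffers. u_texture and v_uv are assumed, illustrative names. */
static const char *brush_fragment_src =
    "precision mediump float;\n"
    "uniform sampler2D u_texture;\n"
    "varying vec2 v_uv;\n"
    "void main() {\n"
    "    vec4 texture_color = texture2D(u_texture, v_uv);\n"
    "    if (texture_color.a < 0.01)\n"
    "        discard;\n"
    "    gl_FragColor = texture_color;\n"
    "}\n";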
Another possibility would be to use an alpha test. This is only available in the OpenGL compatibility profile, not in the core profile or OpenGL ES.
See Khronos OpenGL-Refpages glAlphaFunc:
The alpha test discards fragments depending on the outcome of a comparison between an incoming fragment's alpha value and a constant reference value.
With the following alpha test, fragments whose alpha channel is below the threshold are discarded:
float threshold = 0.01;
glAlphaFunc(GL_GEQUAL, threshold);
glEnable(GL_ALPHA_TEST);
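Applied to the stencil-writing pass from the question, the alpha test would be enabled around the draw call so that fully transparent texels never mark the stencil buffer. A minimal sketch, assuming a compatibility-profile context where glAlphaFunc is available:
/* Hedged sketch (compatibility profile only): reject transparent texels
   before they can write 1s into the stencil buffer. */
glAlphaFunc(GL_GEQUAL, 0.01f);
glEnable(GL_ALPHA_TEST);
glStencilFunc(GL_ALWAYS, 1, 1);
glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
glDrawArrays(GL_POINTS, 0, 1);  /* only sufficiently opaque texels pass */
glDisable(GL_ALPHA_TEST);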

Related

If alpha is 0, RGB is 0 even with premultiplyAlpha set to false

I have a 3D model with an RGBA texture whose alpha values should be ignored (I must use those alpha values differently, for some reflection effects).
The PNG spec says: colors are not premultiplied by alpha.
So, in shaders, we should be able to ignore the alpha values and display only the RGB colors.
I have tried many things, disabling premultipliedAlpha in WebGLRenderer, Texture, and Material, but the RGB values are 0 if alpha is 0.
In the example below, the models should have skin-colored hands and neck, but they appear white.
In the center, the texture is loaded natively by the FBXLoader.
On the right, the texture is loaded by the TextureLoader.
On the left, another texture without alpha is loaded by the TextureLoader for demonstration purposes.
For ShaderMaterials, the fragment shader is:
void main() {
    FragColor = vec4(texture2D(map, vUv).rgb, 1.);
}
Codesandbox
What am I doing wrong? Is this a WebGL or THREE limitation?
PS: using 2 textures is not an option

Three.js render depth from texture

Is it possible to somehow render to the depth buffer from a pre-rendered texture?
I am pre-rendering a scene, like the original Resident Evil games, and I would like to apply both the pre-rendered depth and color textures to the screen.
I previously used the technique of making a simpler proxy scene for the depth, but I am wondering if there is a way to use a precise pre-rendered depth texture instead.
three.js provides a DepthTexture class which can be used to save the depth of a rendered scene into a texture. Typical use cases for such a texture are post-processing effects like depth of field or SSAO.
If you bind a depth texture to a shader, you can sample it like any other texture. However, the sampled depth value is sometimes converted to different representations for further processing. For instance you could compute the viewZ value (which is the z-distance between the rendered fragment and the camera) or convert between perspective and orthographic depth and vice versa. three.js provides helper functions for such tasks.
The official depth texture example uses these helper functions in order to visualize the scene's depth texture. The important function is:
float readDepth( sampler2D depthSampler, vec2 coord ) {
    float fragCoordZ = texture2D( depthSampler, coord ).x;
    float viewZ = perspectiveDepthToViewZ( fragCoordZ, cameraNear, cameraFar );
    return viewZToOrthographicDepth( viewZ, cameraNear, cameraFar );
}
In the example, the resulting depth value is used to compute the final color of the fragment.

Geometry Shader Quad Post Processing

Using DirectX 11, I'm working on a graphics effect system that uses a geometry shader to build quads in world space. These quads then use a fragment shader in which the main texture is the rendered scene texture, effectively producing post-process effects on world-space quads. The simplest of these is a tint effect.
The vertex shader only passes the data through to the geometry shader.
The geometry shader calculates extra vertices based on a normal. Using the cross product, I find the x and z axes and append the tri-stream with 4 new vertices in each diagonal direction from the original position (generating a quad from the given position and size).
The pixel shader (tint effect) simply multiplies the scene texture colour with the colour variable set.
The quad generates and displays correctly on screen. However, the problem I am facing is that the mapping of the UV coordinates fails to align with the image on the back buffer. That is, when using the tint shader with half alpha as the given colour, you can see that the image displayed on the quad does not overlay the image on the back buffer perfectly, unless the quad is facing towards the camera. The closer the quad's normal is to the camera's y axis, the more the image is skewed.
I am currently using the formula below to calculate the uv coordinates:
float2 uv = vert0.position.xy / vert0.position.w;
vert0.uv.x = uv.x * 0.5f + 0.5f;
vert0.uv.y = -uv.y * 0.5f + 0.5f;
I have also used the formula below, which resulted (IMO) in the UVs not taking perspective into consideration.
float2 uv = vert0.position.xy / SourceTextureResolution;
vert0.uv.x = uv.x * ASPECT_RATIO + 0.5f;
vert0.uv.y = -uv.y + 0.5f;
Question:
How can I obtain screen space uv coordinates based on a vertex position generated in the geometry shader?
If you would like me to elaborate on any points, please ask and I will try my best :)
Thanks in advance.
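A common fix for this kind of skew (an assumption here, not something stated in the question) is to pass the clip-space position through to the pixel shader and perform the divide by w per fragment, after interpolation, rather than per vertex in the geometry shader. For illustration, the mapping itself can be written as a small C helper:
/* Hedged sketch: clip-space position -> screen-space UV, mirroring the
   math the pixel shader would run per fragment. float2/float4 are
   minimal stand-ins for the HLSL vector types. */
typedef struct { float x, y; } float2;
typedef struct { float x, y, z, w; } float4;

static float2 clip_to_screen_uv(float4 clip_pos)
{
    float2 uv;
    uv.x =  (clip_pos.x / clip_pos.w) * 0.5f + 0.5f;  /* NDC [-1, 1] -> [0, 1] */
    uv.y = -(clip_pos.y / clip_pos.w) * 0.5f + 0.5f;  /* flip y for texture space */
    return uv;
}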

Masking with glBlendFunc, without frame buffers

I'm hoping this is possible to do without using framebuffers or shaders, just by using glBlendFunc or glBlendFuncSeparate directly.
I'm rendering my scene normally with my standard blend mode:
glBlendFunc(GL.GL_SRC_ALPHA, GL.GL_ONE_MINUS_SRC_ALPHA);
On top of that scene, I want to draw a texture which is masked by some other texture. These are drawn at arbitrary positions (not necessarily rectangular, not necessarily the same size / position as each other).
The order is: render the masked texture, then the mask texture.
The masked texture is a regular image, with alpha.
The mask texture is either black, RGBA(0, 0, 0, 255), or transparent, RGBA(0, 0, 0, 0).
I want anything that the black does NOT touch to be invisible. Basically, the final result should be:
RGBA(masked.r, masked.g, masked.b, masked.a * mask.a)
Below are images of the ordering, to explain what I mean. I'm really looking for a solution to avoid having to use a different shader or stick things onto a framebuffer. If it absolutely isn't possible, please let me know.
I'll explain why this isn't possible. Masking with blending requires three passes, because it has three parts: the destination, the source, and the mask. No matter what you do, you must blend the source and the mask into a framebuffer and THEN render to the destination.
The stencil buffer, however, is built into the default window framebuffer, provided you tell OpenGL to allocate it (like you would a depth buffer), and appears to do exactly what you want. As a GLUT call, initializing the stencil buffer in your window along with the alpha-enabled color and depth buffers, in a double-buffered window, would look like this:
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_ALPHA | GLUT_DEPTH | GLUT_STENCIL);
The stencil buffer is able to do exactly what you need - you can draw a shape in it, and selectively tell it to either discard or keep pixels inside that shape. Here's an example of how to use it, modified from the OpenGL Red Book:
GLdouble dRadius = 0.1; // Initial radius of spiral
GLdouble dAngle; // Looping variable
// Use 0 for clear stencil, enable stencil test
glClearStencil(0);
glEnable(GL_STENCIL_TEST);
// Clear stencil buffer
glClear(GL_STENCIL_BUFFER_BIT);
// All drawing commands fail the stencil test, and are not
// drawn, but increment the value in the stencil buffer.
// glStencilFunc takes three arguments: the stencil function, the reference value, and the mask value. Whenever the stencil function is tested (for example GL_LESS), both the reference and the stencil value being tested from the buffer is bitwise ANDed with the mask: GL_LESS returns true if (ref & mask) < (stencil & mask).
glStencilFunc(GL_NEVER, 0x0, 0x0);
// glStencilOp takes three arguments: the stencil operation to call when the stencil test fails, the stencil operation to call when the stencil test passes but the depth test fails, and the stencil operation to call when the stencil test passes AND the depth test passes (or the depth test is disabled or no depth buffer is allocated).
glStencilOp(GL_INCR, GL_INCR, GL_INCR);
// Spiral pattern will create stencil pattern
// Draw the spiral pattern with white lines. We
// make the lines white to demonstrate that the
// stencil function prevents them from being drawn
glColor3f(1.0f, 1.0f, 1.0f);
glBegin(GL_LINE_STRIP);
for (dAngle = 0; dAngle < 400.0; dAngle += 0.1)
{
    glVertex2d(dRadius * cos(dAngle), dRadius * sin(dAngle));
    dRadius *= 1.002;
}
glEnd();
// Now, allow drawing, except where the stencil pattern is 0x1
// and do not make any further changes to the stencil buffer
glStencilFunc(GL_NOTEQUAL, 0x1, 0x1);
glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
// Now draw the red square
glColor3f(1.0f, 0.0f, 0.0f);
glRectf(0, 0, 200, 200);
The output of this drawing code is a red square with a spiral across it, starting at (1, 1). The spiral is made up of discarded pixels, and as such will be the same color as the cleared color buffer. To use this code for your purposes, you would draw your mask shape where the spiral code is written, then use GL_EQUAL as the stencil function and draw your masked texture where the red square is drawn. More information on the stencil buffer can be found here.
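Translated to the mask/masked-texture setup from the question, the two passes might look like the sketch below; drawMask() and drawMaskedTexture() are hypothetical placeholders for the application's own draw calls:
/* Hedged sketch: stencil-based masking in two passes. drawMask() and
   drawMaskedTexture() are hypothetical helpers, not real API calls.
   Note: if the mask is a texture with transparent texels, those texels
   must be rejected during the stencil write (e.g. with an alpha test or
   discard), or the whole quad will be marked. */
glEnable(GL_STENCIL_TEST);
glClearStencil(0);
glClear(GL_STENCIL_BUFFER_BIT);

/* Pass 1: write 1s into the stencil where the mask shape is drawn,
   without touching the color buffer. */
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
glStencilFunc(GL_ALWAYS, 0x1, 0x1);
glStencilOp(GL_REPLACE, GL_REPLACE, GL_REPLACE);
drawMask();

/* Pass 2: draw the masked texture only where the stencil equals 1. */
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
glStencilFunc(GL_EQUAL, 0x1, 0x1);
glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
drawMaskedTexture();

glDisable(GL_STENCIL_TEST);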

OpenGL ES Texture Masking

I want to implement an effect like the color-splash effect with OpenGL ES, so I searched the web and found this guide (http://www.idevgames.com/forums/thread-899.html).
Now I am stuck at the third step, at draw time: I don't know how to create the multi-texture the guide describes. Can you help me? Please give me some suggestions or some code for it.
To do multi-texturing, you need to put different textures into the various texture units, and set texture coordinates for them. Something like this:
// Put a texture into texture unit 0
glActiveTexture (GL_TEXTURE0);
glBindTexture (GL_TEXTURE_RECTANGLE_EXT, texID0);
...
// Put a texture into texture unit 1
glActiveTexture (GL_TEXTURE1);
glBindTexture (GL_TEXTURE_RECTANGLE_EXT, texID1);
...
// Now draw our textured quad - you could also use VBOs
glBegin (GL_QUADS);
    // Set up the texture coordinates for each texture unit for the first vertex
    glMultiTexCoord2f (GL_TEXTURE0, x0, y0);
    glMultiTexCoord2f (GL_TEXTURE1, x1, y1);
    // Define the first vertex's location
    glVertex2f (x, y);
    ... // Do the other 3 vertexes
glEnd();
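Note that glBegin/glEnd and GL_TEXTURE_RECTANGLE_EXT are desktop OpenGL and do not exist in OpenGL ES, which the question asks about. A minimal sketch of the equivalent setup for ES 2.0 follows; program, texID0 and texID1 are assumed to exist already, and u_texture0/u_texture1 are assumed uniform names:
/* Hedged sketch for OpenGL ES 2.0: bind one texture per texture unit and
   point each of the shader's sampler uniforms at its unit. */
glUseProgram(program);

glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, texID0);
glUniform1i(glGetUniformLocation(program, "u_texture0"), 0);

glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, texID1);
glUniform1i(glGetUniformLocation(program, "u_texture1"), 1);

/* Positions and per-unit texture coordinates are then supplied as vertex
   attributes (glVertexAttribPointer + glDrawArrays) instead of
   glMultiTexCoord2f/glVertex2f. */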
