I am learning OpenGL ES2.0. I need a stencil buffer in my project.
What I am going to do:
1) Create a stencil buffer.
2) Load an 8-bit grayscale image into this stencil buffer (the stencil buffer is also 8 bits per pixel).
3) The grayscale image marks out different areas (each area is filled with a different value), so I can render each area separately by changing the stencil test reference value.
I've searched for a long time and still have no idea how to load the image into the stencil buffer.
So for the image above, I want to set the stencil value to 1 for the blue area and to 2 for the green area. How do I implement this?
If your bitmap were 1-bit, you could just write a shader that either discards or allows fragments based on an alpha test (or use glAlphaFunc to do the same thing under the fixed pipeline) and draw a suitable quad with the appropriate glStencilFunc.
If it's 8-bit and you genuinely want all 8 bits transferred to the stencil, the best cross-platform solutions I can think of are either 8 draws (start from 0, use glStencilMask to isolate individual bits, set glStencilOp to invert, and test for non-zero in the relevant bit in your shader) or simply using the regular texture and implementing the equivalent of a stencil test directly in your shader.
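If you only have a few distinct region values (like the 1 and 2 above), here is a minimal sketch of a per-region variant: one stencil-only draw per region, with a fragment shader that discards everything outside that region and glStencilOp set to GL_REPLACE so the region id lands in the stencil buffer. This assumes your context or FBO actually has a stencil attachment; u_mask, u_region, uRegionLoc and drawFullScreenQuad are placeholders, and the mask is assumed to be uploaded as a GL_LUMINANCE texture.

// Fragment shader: keep only fragments belonging to the current region.
precision mediump float;
varying vec2 v_uv;
uniform sampler2D u_mask;    // the 8-bit grayscale image
uniform float u_region;      // region value / 255.0
void main() {
    if (abs(texture2D(u_mask, v_uv).r - u_region) > 0.5 / 255.0)
        discard;             // not this region: stencil stays untouched
    gl_FragColor = vec4(0.0);
}

// C side: one full-screen quad per region id, writing the id into stencil.
glEnable(GL_STENCIL_TEST);
glStencilMask(0xFF);
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);   // stencil-only pass
GLubyte regions[] = { 1, 2 };                          // blue = 1, green = 2
for (int i = 0; i < 2; ++i) {
    glStencilFunc(GL_ALWAYS, regions[i], 0xFF);
    glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
    glUniform1f(uRegionLoc, regions[i] / 255.0f);
    drawFullScreenQuad();                              // hypothetical helper
}
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);

Afterwards you can render each area with glStencilFunc(GL_EQUAL, id, 0xFF) and glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP).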
I have a small sample, es-300-fbo-srgb, that is supposed to show how to manage gamma correction in OpenGL ES 3.
Essentially I have:
a GL_SRGB8_ALPHA8 texture TEXTURE_DIFFUSE
a framebuffer with another GL_SRGB8_ALPHA8 texture on GL_COLOR_ATTACHMENT0 and a GL_DEPTH_COMPONENT24 texture on GL_DEPTH_ATTACHMENT
the back buffer of my default fbo is GL_LINEAR
GL_FRAMEBUFFER_SRGB initially disabled.
I get [image: an incorrect result] instead of [image: the expected result].
Now, if I recap the display method, this is what happens:
I render the TEXTURE_DIFFUSE texture to the sRGB fbo. Since the source texture is in sRGB space, my fragment shader automatically reads linear values and writes them to the fbo. The fbo should now contain linear values, even though it is sRGB, because GL_FRAMEBUFFER_SRGB is disabled and so no linear-to-sRGB conversion is executed.
I blit the content of the fbo to the default fbo's back buffer (through a program). But since this fbo's texture has an sRGB format, a wrong gamma operation is performed on the values read from it, because they are assumed to be in sRGB space when they are not.
A second gamma operation is performed by my monitor when it displays the content of the default fbo.
So, if I am right, my image is wrong twice over.
Now, if I glEnable(GL_FRAMEBUFFER_SRGB); I instead get [image: another incorrect result].
The image looks like it has been sRGB-corrected too many times.
If I instead leave GL_FRAMEBUFFER_SRGB disabled and change the format of my fbo's GL_COLOR_ATTACHMENT0 texture, I finally get the right image.
Why do I not get the correct image with glEnable(GL_FRAMEBUFFER_SRGB);?
I think you are basically right: you get the net effect of two decoding conversions where one (the one your monitor performs) would be enough. I suppose either your driver or your code breaks something, so OpenGL doesn't 'connect the dots' properly; perhaps this answer helps you:
When to call glEnable(GL_FRAMEBUFFER_SRGB)?
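For what it's worth, a minimal sketch of the rule discussed in that answer, written with desktop-GL semantics (note that in core ES 3.0 the linear-to-sRGB encode is always on for sRGB framebuffers, and toggling it requires the EXT_sRGB_write_control extension); srgbFbo, renderScene and drawFullscreenBlit are placeholders:

// Pass 1: render into the sRGB fbo with encoding ON, so the
// attachment really stores sRGB-encoded values.
glBindFramebuffer(GL_FRAMEBUFFER, srgbFbo);
glEnable(GL_FRAMEBUFFER_SRGB);    // linear shader output -> sRGB on write
renderScene();                    // sampling GL_SRGB8_ALPHA8 decodes to linear

// Pass 2: blit to the linear default framebuffer with encoding OFF;
// sampling the sRGB attachment decodes it back to linear automatically,
// so only one encode (the final one for the display) remains.
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glDisable(GL_FRAMEBUFFER_SRGB);
drawFullscreenBlit();             // this program can apply the final encode itself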
Using OpenGL ES 1.1.
The picture I am generating is being color-keyed later on down the line. It replaces one color (magenta) with something else.
So I clear the background with that color, draw on top of it, and everything is good.
However, textures with an alpha channel cause some complications. I effectively want to use only maximum or minimum alpha, and show the background OR show the image blended with, say, black.
My mostly-working hack manually adjusts the texture data to force the alpha channel to either min or max and pre-multiplies the actual color values.
However, when the texture is drawn at a different size, filtering kicks in and some magenta bleeds through.
So:
1) Is there some combination of glBlendFunc and glTexEnv combiner functions that will let me stop manually editing the textures?
or, failing that....
2) What parameters should I use when drawing the texture to keep alpha at either 0 or 1 when it's scaling?
Use alpha testing instead of blending. Use glAlphaFunc to select a comparison function and reference value, and enable it with glEnable(GL_ALPHA_TEST) (and disable it once you no longer need it during rendering).
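A minimal sketch for ES 1.1 (the 0.5 threshold is an assumption; pick whatever cutoff suits your textures):

glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER, 0.5f);   // keep fragments with alpha > 0.5, drop the rest
// ... draw the textured quads ...
glDisable(GL_ALPHA_TEST);        // restore state for anything that needs blending

Because failing fragments are discarded outright, scaling can never leave an in-between alpha in the framebuffer, so no magenta leaks through.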
I am trying to render 2D (flat) sprites in a 3D environment using OpenGL ES 2. The way I create each sprite is pretty standard: I create a quad consisting of two triangles, and I map the texture onto that. Everything works fine, except I noticed something strange: when depth testing is turned on (which it should be in 3D mode), the corners of my sprites are painted using the background color.
The easiest way to show this is by illustration:
When I turn off depth testing (on the left) it looks fine, but when I turn it on (on the right) you can see the green sprite's rectangle overlapping on top of the yellow sprite. They both use the same code, the same PNG file, the same shader. Everything is the same except depth testing.
I'm hoping someone might know a way to work around this.
What you can do is alpha testing. Basically, your texture has to have an alpha value of 0 where it should be transparent (which it may already have). Then you configure the alpha test, e.g.:
glAlphaFunc(GL_GREATER, 0.5f);
glEnable(GL_ALPHA_TEST);
This way every pixel (or rather, fragment) with an alpha value <= 0.5 will not be written into the framebuffer (and therefore not into the depth buffer). You can also do the alpha test yourself in the fragment shader by just discarding the fragment:
precision mediump float;
varying vec2 v_texCoord;   // names assumed from a typical sprite shader
uniform sampler2D u_texture;
void main() {
    vec4 color = texture2D(u_texture, v_texCoord);
    if (color.a < 0.5) discard;   // no color or depth write for this fragment
    gl_FragColor = color;
}
Then you don't need the fixed-function alpha test (I think that is the reason it is deprecated in modern desktop GL; I wasn't sure about ES).
EDIT: After looking into the ES 2.0 spec, it seems there is no fixed-function alpha test any more, so you will have to do it in the fragment shader as written above. This way you can also make it dependent on a specific color or any other computable property instead of the alpha channel.
I am doing my iPhone graphics using OpenGL. In one of my projects, I need to use an image as a texture. The .png image is 512 × 512, its background is transparent, and it has a thick blue line in its center.
When I apply my image to a polygon in OpenGL, the texture appears as if the transparent part in the image is black and the thick blue line is seen as itself. In order to remove the black part, I used blending:
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
Then the black part of the texture on the polygon is removed and only the blue band is seen, so that problem is solved.
But I want to add many such images and make many objects in OpenGL. I can do that, but the frame rate drops sharply as I add more and more images. When I comment out the blending, the frame rate is normal, but the images are not visible.
Since the fps is low, the graphics are a bit slow and I get a shaky effect.
So:
1) Is there any other method than blending to solve my problem?
2) How can I improve the frame rate of my OpenGL app? What steps should I take to implement my graphics properly?
If you want to have transparent parts of an object, the only way is to blend the pixel data for the triangle with what is currently in the buffer (which is what you are doing). Normally, with solid textures, the new pixel data for a triangle just overwrites whatever was in the buffer (as long as it is closer, i.e. passes the z-buffer test). But with transparency, the GPU has to look at the transparency of that part of the texture, then at what is behind it, all the way back to something solid, and combine all of those overlapping layers of transparent stuff until it gets the final image.
If all you want transparency for is something like a simple tree sprite, removing the 'stuff' from the sides of the trunk and so on, then you may be better off providing more complex geometry that actually defines the shape of the trunk, and thus avoid transparency altogether.
Sadly, I don't think there is much you can do to speed up your FPS other than cutting down the amount of transparency you are calculating, for example an optimization that checks each image to see whether alpha blending can be turned off for it. Depending on how much you are trying to push through, that may save time in the long run.
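A minimal sketch of that per-image optimization (hasTransparency and drawSprite are hypothetical; the idea is just to skip the blend state for images that don't need it):

if (sprite->hasTransparency) {
    glEnable(GL_BLEND);
    glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
} else {
    glDisable(GL_BLEND);   // opaque sprites skip the read-modify-write cost
}
drawSprite(sprite);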
I have created a bitmap using GDI+. I am drawing text onto that bitmap using GDI's DrawText. With DrawText I am unable to apply transparency.
Any help or code will be appreciated.
If you want to draw text without a background fill, SetBkMode(hdc, TRANSPARENT) will tell GDI to leave the background alone when drawing text.
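For example (a minimal sketch; hdc and rc are whatever DC and rectangle you are already drawing with):

SetBkMode(hdc, TRANSPARENT);    // pixels behind the glyphs are left untouched
DrawTextW(hdc, L"Sample", -1, &rc, DT_SINGLELINE | DT_NOCLIP);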
Actually rendering the foreground color of the text with alpha is going to be more complicated. GDI does not support alpha channels all that widely in its APIs; outside of AlphaBlend, all it really does is preserve the channel. It is not valid to set the upper bits of a COLORREF to alpha values, as the high byte is actually used for flags indicating whether the COLORREF is a palette entry rather than an RGB value.
So, unfortunately, your only real way forward is to:
Create a 32-bit DIBSection (CreateDIBSection). This gives you an HBITMAP that is guaranteed to be able to hold alpha information. If you create a bitmap via one of the other bitmap-creation functions, it is going to be at the device color depth, which might not be 32bpp.
DrawText onto the DIBSection.
When you created the DIBSection you got a pointer to the actual memory. At this point you need to go through the memory and set the alpha values. I don't think DrawText does anything to the alpha channel by itself. I'm thinking of a simple check on the RGB components of each DWORD of the bitmap: if they're the foreground color, rewrite the DWORD with 50% (or whatever) alpha in the alpha byte; if they're the background color, rewrite with 100% alpha in the alpha byte. *
AlphaBlend the bitmap onto the final destination. AlphaBlend requires the alpha channel in the source to be pre-multiplied.
* It might be sufficient to simply memset the DIBSection with 50% alpha before doing the DrawText, and ensure that the BkColor is black. I don't know what DrawText might do to the alpha channel, though; some experimentation is called for.
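A hedged sketch of those steps (error handling omitted; width, height, x, y and the destination hdc are assumed; here every non-black pixel is treated as text and made fully opaque, which is crude for antialiased edges):

BITMAPINFO bmi = {0};
bmi.bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
bmi.bmiHeader.biWidth       = width;
bmi.bmiHeader.biHeight      = -height;           // top-down rows
bmi.bmiHeader.biPlanes      = 1;
bmi.bmiHeader.biBitCount    = 32;
bmi.bmiHeader.biCompression = BI_RGB;

VOID *bits = NULL;
HDC memDc = CreateCompatibleDC(hdc);
HBITMAP dib = CreateDIBSection(memDc, &bmi, DIB_RGB_COLORS, &bits, NULL, 0);
HBITMAP old = (HBITMAP)SelectObject(memDc, dib);

SetBkMode(memDc, TRANSPARENT);
SetTextColor(memDc, RGB(255, 255, 255));         // white keeps premultiply trivial
RECT rc = {0, 0, width, height};
DrawTextW(memDc, L"Hello", -1, &rc, DT_SINGLELINE | DT_CENTER | DT_VCENTER);
GdiFlush();                                      // make sure GDI is done writing

// DrawText leaves alpha at 0, so set it by hand.
DWORD *px = (DWORD *)bits;
for (int i = 0; i < width * height; ++i)
    if (px[i] & 0x00FFFFFF)                      // any text pixel
        px[i] |= 0xFF000000;                     // make it opaque

BLENDFUNCTION bf = {AC_SRC_OVER, 0, 255, AC_SRC_ALPHA};   // per-pixel alpha
AlphaBlend(hdc, x, y, width, height, memDc, 0, 0, width, height, bf);

SelectObject(memDc, old);
DeleteObject(dib);
DeleteDC(memDc);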
A SIMPLE and EASY solution :)
I had this problem. I tried changing the alpha values and premultiplying, but there was another problem: antialiased and ClearType fonts were not shown correctly (ugly edges). So what I did:
Compose your scene (bitmaps, graphics, etc.)
BitBlt the required rectangle from this scene (the same rectangle where you want your text to be) into a memory DC with a compatible bitmap selected, at destination coordinates 0,0.
Draw your text into that rectangle in the memory DC.
Now AlphaBlend that rectangle from the memory DC back to your scene DC, without AC_SRC_ALPHA in the BLENDFUNCTION and with the desired SourceConstantAlpha.
I think you've got it :)
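A minimal sketch of those four steps (sceneDc and textRc are assumed; a SourceConstantAlpha of 160 is an arbitrary example, roughly 63% opacity):

int w = textRc.right - textRc.left, h = textRc.bottom - textRc.top;
HDC memDc = CreateCompatibleDC(sceneDc);
HBITMAP bmp = CreateCompatibleBitmap(sceneDc, w, h);
HBITMAP old = (HBITMAP)SelectObject(memDc, bmp);

// 2) copy the scene under the text rectangle into the memory DC
BitBlt(memDc, 0, 0, w, h, sceneDc, textRc.left, textRc.top, SRCCOPY);

// 3) draw the text normally; antialiasing/ClearType blends against the copy
SetBkMode(memDc, TRANSPARENT);
RECT rc = {0, 0, w, h};
DrawTextW(memDc, L"Hello", -1, &rc, DT_SINGLELINE);

// 4) blend the rectangle back with constant alpha, no per-pixel alpha
BLENDFUNCTION bf = {AC_SRC_OVER, 0, 160, 0};
AlphaBlend(sceneDc, textRc.left, textRc.top, w, h, memDc, 0, 0, w, h, bf);

SelectObject(memDc, old);
DeleteObject(bmp);
DeleteDC(memDc);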
Hmmm - trying to do the same here. I see that when you create a DIB section you specify the "masks", that is, an R, G, B (and alpha) mask.
IF - and that's a big if - DrawText really does not alter the alpha channel, then you might specify the masks differently in two bitmap headers. One specifies the R, G and B bits in the proper places; the other assigns all of the bits to the alpha channel (set the text color to white in this case). Then render in two passes: one to load the color values, the other to load the alpha values.
Anyway, just musing :)
While this question is about making text semi-transparent, I had the opposite problem.
DrawText was making the text in my layered window (UpdateLayeredWindow) semi-transparent ... and I didn't want it to be.
Take a look at this other question ... in it I post some code that you could easily modify ... and it is almost exactly what Chris Becke suggests in his answer.
A limited answer for a specific situation:
If you have a graphic with an alpha channel and you want to draw opaque text over a locally opaque background, first prepare your 32-bit bitmap with 32-bit brushes created with CreateDIBPatternBrushPt. Then scan through the bitmap bits, inverting the alpha channel; draw your text as you normally would (including SetBkMode to TRANSPARENT); then invert the alpha in the bitmap again. You can skip the first inversion if you invert the alpha of your brushes.
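A minimal sketch of the inversion trick (bits points at the 32bpp DIBSection pixels, w * h of them; GDI writes alpha = 0, so the double inversion leaves text pixels opaque and everything else unchanged):

GdiFlush();                                 // let GDI finish before touching bits
for (int i = 0; i < w * h; ++i)
    ((DWORD *)bits)[i] ^= 0xFF000000;       // first inversion
SetBkMode(hdc, TRANSPARENT);
DrawTextW(hdc, L"Opaque text", -1, &rc, DT_SINGLELINE);
GdiFlush();
for (int i = 0; i < w * h; ++i)
    ((DWORD *)bits)[i] ^= 0xFF000000;       // second inversion: text alpha -> 0xFF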