I'm currently using colour picking in my application.
This works on the PC, but I'm having trouble getting it to work on a variety of devices.
This is probably because the context is set up differently depending on the device. For example, as far as I'm aware the PC uses an 888 colour format, whereas a device might default to 565.
I was wondering if there's a way in OpenGL to get the current pixel/colour format, so that I can retrieve the colour data properly?
This is the function I'm using which works fine on the PC:
inline void ProcessColourPick(GLubyte *out, KDfloat32 x, KDfloat32 y)
{
    GLint viewport[4];
    GLubyte pixel[3];
    glGetIntegerv(GL_VIEWPORT, viewport);

    // Read the colour of the pixel at (x, y), flipping y because window
    // coordinates have their origin at the top-left while GL's is bottom-left.
    glReadPixels((GLint)x, viewport[3] - (GLint)y, 1, 1,
                 GL_RGB, GL_UNSIGNED_BYTE, (void *)pixel);

    out[0] = pixel[0];
    out[1] = pixel[1];
    out[2] = pixel[2];
}
Any ideas?
Yes, but it's a bit complicated.
Querying the bit depth of the current framebuffer is fairly easy in ES 2.0 (note: this is also legal in desktop GL, but the functionality was removed in GL 3.1 core; it is still accessible from a compatibility profile). You have to get the bit depth of each colour component:
GLint bitdepth;
glGetIntegerv(GL_x_BITS, &bitdepth);
Where x is one of RED, GREEN, BLUE, or ALPHA (i.e. GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS, or GL_ALPHA_BITS).
Once you have the bitdepth, you can test to see if it's 565 and use appropriate pixel transfer parameters and color values.
The format parameter for glReadPixels must be either GL_RGBA (always supported) or the format reported by GL_IMPLEMENTATION_COLOR_READ_FORMAT_OES (which differs between devices). This is an OpenGL ES restriction.
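Putting the two together, the body of ProcessColourPick could branch on the reported read format, roughly like this (a sketch, assuming an ES 2.0 context; core ES 2.0 spells these queries without the _OES suffix):
// Sketch only: pick pixel-transfer parameters based on the framebuffer's
// reported read format instead of hard-coding GL_RGB / GL_UNSIGNED_BYTE.
GLint readFormat = 0, readType = 0;
glGetIntegerv(GL_IMPLEMENTATION_COLOR_READ_FORMAT, &readFormat);
glGetIntegerv(GL_IMPLEMENTATION_COLOR_READ_TYPE,   &readType);

if (readFormat == GL_RGB && readType == GL_UNSIGNED_SHORT_5_6_5)
{
    // 565 framebuffer: read one 16-bit pixel and expand it to 8-bit channels.
    GLushort pixel = 0;
    glReadPixels((GLint)x, viewport[3] - (GLint)y, 1, 1,
                 GL_RGB, GL_UNSIGNED_SHORT_5_6_5, &pixel);
    out[0] = (GLubyte)(((pixel >> 11) & 0x1F) << 3);   // R: 5 bits -> 8 bits
    out[1] = (GLubyte)(((pixel >>  5) & 0x3F) << 2);   // G: 6 bits -> 8 bits
    out[2] = (GLubyte)(( pixel        & 0x1F) << 3);   // B: 5 bits -> 8 bits
}
else
{
    // GL_RGBA + GL_UNSIGNED_BYTE is always a legal combination in ES 2.0.
    GLubyte pixel[4] = {0};
    glReadPixels((GLint)x, viewport[3] - (GLint)y, 1, 1,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixel);
    out[0] = pixel[0]; out[1] = pixel[1]; out[2] = pixel[2];
}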
Related
I've been experimenting with OpenGL on OS X, using an ES implementation as reference code. The goal is to render an image buffer (CVImageBuffer) that is in YUV format.
I need to know how to specify the color format (or is it fixed to a particular type?) of the texture that gets created using CVOpenGLTextureCacheCreateTextureFromImage() API. I'm trying to figure this out so that I can appropriately access/process the colors in the fragment shader (RGB vs YUV).
Also, I can see that there is an "internalFormat" option that can be used to control this in the OpenGL-ES version of the API.
Thanks!
In my program I am using Qt's function: qApp->primaryScreen()->grabWindow(qApp->desktop()->winId(), x_offset, y_offset, w, h);
But it's a little too slow for the main task, which is how I ran into the question above. The program has to work under both Windows and Mac OS X. I've heard of OpenGL as a nice screen grabber, since it is closer to the GPU than the native APIs, plus it's a cross-platform solution. This is the first thing I want to know: is OpenGL realistic as a desktop screen grabber? I mean something like the Print Screen button.
If it is, how? If it's not:
Windows: can you please advise how to do it? BitBlt, GetDC, something like that? (see the sketch right after this question)
Mac OS X: AVFoundation? Please, can you describe this or give some link about how to capture a screenshot using this class? (It's the hard way for me, since I know almost nothing about Objective-C(++).)
UPDATE: I have read a lot about ways to capture a screenshot. Here is what I've learned:
1. OpenGL may work as a screen grabber, but using it would not be the right fit for this software. That said, I don't really mind; if there is a working solution I will accept it.
2. DirectX is not a way to solve my problem, since this software must also run under Mac OS X.
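For reference, the BitBlt/GetDC route mentioned above looks roughly like this (a sketch, not from the original post; Windows only, error handling omitted):
#include <windows.h>

// Capture the primary screen into a screen-compatible HBITMAP using plain GDI.
HBITMAP CaptureScreen()
{
    int w = GetSystemMetrics(SM_CXSCREEN);
    int h = GetSystemMetrics(SM_CYSCREEN);

    HDC screenDC = GetDC(NULL);                      // DC for the whole screen
    HDC memDC    = CreateCompatibleDC(screenDC);     // off-screen DC
    HBITMAP bmp  = CreateCompatibleBitmap(screenDC, w, h);

    HGDIOBJ old = SelectObject(memDC, bmp);
    BitBlt(memDC, 0, 0, w, h, screenDC, 0, 0, SRCCOPY);  // copy screen -> bitmap
    SelectObject(memDC, old);

    DeleteDC(memDC);
    ReleaseDC(NULL, screenDC);
    return bmp;   // caller owns the bitmap (DeleteObject when done)
}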
Just to expand on #Zhenyi Luo's answer, here is a code snippet I have used in the past.
It also uses FreeImage for exporting the screenshot.
void Display::SaveScreenShot (std::string FilePath, SCREENSHOT_FORMAT Format){
    // Create pixel array (3 bytes per pixel, tightly packed)
    GLubyte* pixels = new GLubyte[3 * Window::width * Window::height];

    // Read pixels from the framebuffer into the array.
    // Note: glReadPixels is governed by the *pack* alignment, not unpack.
    glPixelStorei(GL_PACK_ALIGNMENT, 1);
    glReadPixels(0, 0, Window::width, Window::height, GL_BGR, GL_UNSIGNED_BYTE, pixels);

    // Convert to a FreeImage bitmap and save
    FIBITMAP* image = FreeImage_ConvertFromRawBits(pixels, Window::width,
                          Window::height, 3 * Window::width, 24,
                          0x0000FF, 0xFF0000, 0x00FF00, false);
    FreeImage_Save((FREE_IMAGE_FORMAT) Format, image, FilePath.c_str(), 0);

    // Free resources
    FreeImage_Unload(image);
    delete[] pixels;
}
glReadPixels(GLint x, GLint y, GLsizei width, GLsizei height,
             GLenum format, GLenum type, GLvoid *data)
reads a block of pixels from the framebuffer into client memory starting at location data.
I want to make reproducible tests of some custom widget renderings. In order to do that, I paint them onto a QImage and save the result as PNG. The output is really different on Windows compared to Mac OS X.
I took care of :
Selecting the same font on all platform (I provide the "TTF" font file and point the code to it)
Drawing onto a QImage and not a QPixmap, as the documentation says the QImage painter is supposed to be platform independent
I also selected the Antialiasing and TextAntialiasing hints
Requesting the font via QFontDatabase::font() so that pointSize is specified and not pixelSize
How can I make sure the rendering is exactly the same on all platforms so that my test runs are reproducible? In other words, is it perhaps possible to force Qt 5 to use the same font engine on all platforms (for instance FreeType)?
I nailed down the issue to a simple rendering test program.
So the code looks like:
QFontDatabase fontDb;
fontDb.addApplicationFont(".../fonts/Vera.ttf");
QImage result(width, height, QImage::Format_RGB32);
QPainter painter(&result);
painter.setRenderHint(QPainter::Antialiasing);
painter.setRenderHint(QPainter::TextAntialiasing);
QBrush background(QColor(205, 205, 205));
painter.fillRect(0, 0, 800, 600, background);
QFont font = fontDb.font("Bitstream Vera Sans", "Normal", 10);
painter.setFont(font);
painter.setPen(QColor(0, 0, 0));
painter.drawText(10, 10, "ABCD abcd 01234567");
The Bitstream Vera font can be downloaded from fontsquirrel.com, for instance.
See the result on MacOSX (left) and on Win32 (right), which are very different:
Following the answer and comments by N1ghtLight below, and after reading the links he suggested, I changed the code that gets the font to:
QFont font = fontDb.font("Bitstream Vera Sans", "Normal", -1);
qreal screenDPI = QApplication::primaryScreen()->physicalDotsPerInch();
qreal RENDER_DPI = 72;
int pixelSize = (int)((qreal)10 * screenDPI / RENDER_DPI);
font.setPixelSize(pixelSize);
This seems to mostly solve the problem of the font being a very different size. At least on Mac OS X, the font is exactly 10 pixels high now. On Windows, though, the font renders much thinner and a bit smaller too. I'm still lost and confused...
Here is the new result (left MacOSX, right Windows). The white scale indicates true 10 pixels size.
Following the answer by G_G below I adapted the code (what about Linux? Mobile platforms? This gets very complicated...). Now the fonts are 10 pixels high in the output on both Windows and Mac OS X, yet the rendering remains very different (still Mac OS X on the left, Windows on the right).
Thanks.
Your render DPI variable should be 96 for Windows and 72 for OS X, according to:
http://www.rfwilmut.clara.net/about/fonts.html
On a Macintosh monitor, the notional resolution is 72 dots-per-inch
(dpi), so that a graphic 72 pixels wide would notionally be 1 inch
wide - though obviously the actual size would depend on the individual
monitor. However it will always print one inch wide.
But on a Windows monitor the resolution is (usually) 96 dpi. This
means that though the picture is still 72 pixels wide, it will print
at 0.75 inches.
QFont font = fontDb.font("Bitstream Vera Sans", "Normal", -1);
qreal screenDPI = QApplication::primaryScreen()->physicalDotsPerInch();
#ifdef Q_OS_WIN // Qt defines Q_OS_WIN on Windows builds
qreal RENDER_DPI = 96;
#else
qreal RENDER_DPI = 72;
#endif
int pixelSize = (int)((qreal)10 * screenDPI / RENDER_DPI);
font.setPixelSize(pixelSize);
It happens because you set size in pixels. You need to use setPointSize() instead.
From the Qt 5 docs:
void QFont::setPixelSize(int pixelSize)
Sets the font size to pixelSize pixels.
Using this function makes the font device dependent. Use setPointSize() or setPointSizeF() to set the size of the font in a device independent manner.
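For example, with the question's font lookup, the device-independent version might look like this (just a sketch; fontDb and painter follow the names used in the question's code):
QFont font = fontDb.font("Bitstream Vera Sans", "Normal", -1);
font.setPointSizeF(10.0);   // point size; Qt maps it to pixels using the screen's DPI
painter.setFont(font);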
Also, for more info, you can check this post. This difference happens because of the different display densities on different OSes. I have experienced this problem myself on an OS X / Windows cross-platform project.
[Updated] After additional research, I've found that the current behavior of Qt fonts is simply a bug. That's why the solution above doesn't work (it worked in Qt 4).
Different workarounds for this problem are described here. Good luck!
You can try this. It will generate an image with the text that you write. I am using this code.
QPixmap photo;                            // assumed to already hold the image to draw
QFont qfont;
QPainter painter;
QString txt;
QImage img(x, y, QImage::Format_RGB32);   // x and y are the image width and height
img.fill(0xffffffff);                     // clear to white before painting begins
painter.begin(&img);
painter.drawPixmap(0, 0, x, y, photo);
qfont.setFamily("Sampige");
qfont.setPixelSize(28);
painter.setFont(qfont);
txt = QString::fromUtf8("ನಮಸ್ಕಾರ");       // "Namaskara" (Kannada greeting)
painter.drawText(1, 26, txt);
painter.end();
img.save("abcd.jpg");
I am trying to set up an OpenGL window with an alpha channel in its color buffer. Unfortunately, my current setup is creating a GL_RGB back and front buffer (as reported by gDEBugger, and as shown by my experiments).
I set up the window like so:
PIXELFORMATDESCRIPTOR pfd;
ZeroMemory(&pfd,sizeof(pfd));
pfd.nSize = sizeof(pfd);
pfd.nVersion = 1;
pfd.dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.cColorBits = 24; //note by docs that alpha doesn't go here (though putting 32 changes nothing)
pfd.cDepthBits = 24;
pfd.iLayerType = PFD_MAIN_PLANE;
I have also tried more specifically:
PIXELFORMATDESCRIPTOR pfd = {
sizeof(PIXELFORMATDESCRIPTOR),
1,
PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER,
PFD_TYPE_RGBA,
24,
8,0, 8,8, 8,16, 8,24, //A guess for lil endian; usually these are 0 (making them 0 doesn't help)
0, 0,0,0,0,
24, //depth
8, //stencil
0, //aux
PFD_MAIN_PLANE,
0,
0, 0, 0
};
My understanding is that when I call ChoosePixelFormat later, it's returning a format without an alpha channel (though why, I don't know).
I should clarify that when I say alpha buffer, I mean just a simple alpha buffer for rendering purposes (each color has an alpha value that fragments can be tested against, and so on). I do NOT mean a semi-transparent window or some other effect. [EDIT: So no, I am not at this time interested in making the window itself transparent. I just want an alpha channel for the default framebuffer.]
[EDIT: This is part of a cross-platform windowing backend I'm writing. My code is always portable. I am not using a library that provides this functionality since I need more control.]
There are two key points to consider here:
You can do this with the default framebuffer. However, you need to request it properly. Windows's default selection mechanism doesn't seem to weight having RGBA very highly. The best course seems to be to enumerate all possible pixel formats and then select the one you want "manually", as it were (see the sketch after these two points). By doing this, I was also able to specify that I wanted a 24-bit depth buffer, an accumulation buffer, and an 8-bit stencil buffer.
The OpenGL context must be fully valid for alpha blending, depth testing, and any even remotely advanced techniques to work. Curiously, I was able to get rendered triangles without having a fully valid OpenGL context, leading to my confusion! Maybe it was being emulated in software? I figured it out when glewInit (not to mention making a few VBOs) failed miserably. The key point is that the OpenGL context must be valid. In my case, I wasn't setting it up properly.
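As an illustration of point 1, here is a sketch of the enumerate-and-pick approach (hdc is assumed to be the window's device context, and the selection criteria are only an example):
// Walk every pixel format the DC offers and pick the first one that has an
// 8-bit alpha channel plus the depth/stencil configuration we want.
int ChoosePixelFormatWithAlpha(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    // DescribePixelFormat returns the total number of formats for this DC.
    int count = DescribePixelFormat(hdc, 1, sizeof(pfd), &pfd);
    const DWORD required = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;

    for (int i = 1; i <= count; ++i)
    {
        DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);
        if ((pfd.dwFlags & required) == required &&
            pfd.iPixelType == PFD_TYPE_RGBA &&
            pfd.cColorBits >= 24 && pfd.cAlphaBits == 8 &&
            pfd.cDepthBits >= 24 && pfd.cStencilBits >= 8)
        {
            return i;   // pass this index to SetPixelFormat
        }
    }
    return 0;   // no matching format found
}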
This problem was in the context of my currently writing a lightweight cross-platform windowing toolkit. Currently, it supports Windows through the Win32 API, Linux through X11, and I just started porting to Mac OS through X11.
At the request of some in the comments, I hereby present my humble effort thereof that others may benefit. Mac support doesn't work yet, menus aren't implemented on Linux, and user input on Linux is only half-there. As of 1/18/2013, it may temporarily be found here (people of the future, I may have put new versions on my website (look for "Portcullis")).
We are having an application ported from Windows to Mac OS, and colors are being displayed differently on both platforms. Here's an example:
In this case, we're telling the application to use green 0,140,0 and blue 25,0,75. On windows, this works great (top image). On the Mac, apparently OS X decides to "reinterpret" the colors and displays them differently (bottom image).
Is there something we can do to tell the operating system to stop being creative with our color definitions? It's going to be hard to make things look good on both platforms if the mac arbitrarily changes our color definitions by ~10%.
Edit: Here's an example of the code we're using to set the color for the blue used above:
m_colour = CGColorCreateGenericRGB(25 / 255.0, //r
0 / 255.0, //g
75 / 255.0, //b
1.0); //a
Thanks.
The Mac uses a complex colourspace system called ColorSync to ensure colours appear identical on different devices. As a result, colours may sometimes be shifted slightly in the RGB space so that they appear to the eye identical on properly calibrated displays, printers, etc.
If you show us the code you use to generate that shade of green, we can show you how to modify it to avoid this colour correction. However, unless there's a pressing reason why you want to avoid it, it's usually better to let it happen, as you do not have a wide range of display models to test against.
Edit: CGColorCreateGenericRGB() creates a colour in the generic RGB colour space, so it's going to end up shifting slightly depending on your display calibration. Unfortunately for you, it is no longer possible (as of Mac OS X 10.4) to create an instance of CGColor that is device-dependent (and therefore not subject to calibration.) You can, however, create a CGColor in the colour space of the target drawing context--that will tell Quartz that no conversion is necessary.
If you've created the context yourself, you should keep a reference to the colour space you used (of type CGColorSpaceRef). If it's at the Cocoa level (such as the context created by -[NSImage lockFocus] or by -[NSView drawRect:]), then you should use the relevant NSColor APIs instead of the CGColor APIs (i.e. +[NSColor colorWithDeviceRed:green:blue:alpha:]).
If you must use Quartz drawing, you can call CGContextSetRenderingIntent() to tell the context how you want to have colours converted, but there is no guarantee that a conversion will not take place.
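For example, a minimal sketch of that approach when you created the context yourself (context and deviceSpace are placeholder names, not from the original answer):
// Create the colour directly in the drawing context's colour space so that
// Quartz has no conversion to perform.
CGFloat components[4] = { 25.0 / 255.0, 0.0 / 255.0, 75.0 / 255.0, 1.0 };
CGColorRef colour = CGColorCreate(deviceSpace, components);   // deviceSpace: the CGColorSpaceRef you kept
CGContextSetFillColorWithColor(context, colour);              // context: your CGContextRef
CGColorRelease(colour);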