Qt5 font rendering different on various platforms - Windows

I want to make reproducible tests of some custom widget renderings. To do that, I paint them onto a QImage and save the result as a PNG. The output is really different on Windows compared to MacOSX.
I took care of:
Selecting the same font on all platforms (I provide the TTF font file and point the code to it)
Drawing onto a QImage and not a QPixmap, as the documentation says the QImage painter is supposed to be platform independent
Selecting the Antialiasing and TextAntialiasing render hints
Requesting the font via QFontDatabase::font() so that pointSize is specified and not pixelSize
How can I make sure the rendering is exactly the same on all platforms so that my test runs are reproducible? In other words, is it possible to force Qt5 to use the same font engine on all platforms (for instance FreeType)?
I nailed down the issue to a simple rendering test program.
So the code looks like:
QFontDatabase fontDb;
fontDb.addApplicationFont(".../fonts/Vera.ttf");
QImage result(width, height, QImage::Format_RGB32);
QPainter painter(&result);
painter.setRenderHint(QPainter::Antialiasing);
painter.setRenderHint(QPainter::TextAntialiasing);
QBrush background(QColor(205, 205, 205));
painter.fillRect(0, 0, 800, 600, background);
QFont font = fontDb.font("Bitstream Vera Sans", "Normal", 10);
painter.setFont(font);
painter.setPen(QColor(0, 0, 0));
painter.drawText(10, 10, "ABCD abcd 01234567");
The Bitstream Vera font can be downloaded from fontsquirrel.com, for instance.
See the result on MacOSX (left) and on Win32 (right), which are very different:
Following the answer and comments by N1ghtLight below, and after reading the links he suggested, I changed the code that gets the font to:
QFont font = fontDb.font("Bitstream Vera Sans", "Normal", -1);
qreal screenDPI = QApplication::primaryScreen()->physicalDotsPerInch();
qreal RENDER_DPI = 72;
int pixelSize = (int)((qreal)10 * screenDPI / RENDER_DPI);
font.setPixelSize(pixelSize);
This mostly solves the problem of the very different font sizes. At least on MacOSX, the font is now exactly 10 pixels high. On Windows, though, the font renders much thinner and a bit smaller. I'm still lost and confused...
Here is the new result (MacOSX on the left, Windows on the right). The white scale indicates a true 10-pixel size.
Following the answer by G_G below, I adapted the code (what about Linux? Mobile platforms? This gets very complicated...). Now the fonts are 10 pixels high in the output on both Windows and MacOSX, yet the rendering remains very different (still MacOSX on the left, Windows on the right).
Thanks.

Your render DPI variable should be 96 for Windows and 72 for OS X, according to:
http://www.rfwilmut.clara.net/about/fonts.html
On a Macintosh monitor, the notional resolution is 72 dots-per-inch
(dpi), so that a graphic 72 pixels wide would notionally be 1 inch
wide - though obviously the actual size would depend on the individual
monitor. However it will always print one inch wide.
But on a Windows monitor the resolution is (usually) 96 dpi. This
means that though the picture is still 72 pixels wide, it will print
at 0.75 inches.
QFont font = fontDb.font("Bitstream Vera Sans", "Normal", -1);
qreal screenDPI = QApplication::primaryScreen()->physicalDotsPerInch();
#ifdef Q_OS_WIN
qreal RENDER_DPI = 96;
#else
qreal RENDER_DPI = 72;
#endif
int pixelSize = (int)((qreal)10 * screenDPI / RENDER_DPI);
font.setPixelSize(pixelSize);
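To sanity-check the result, a small sketch continuing from the font object above (QFontMetrics and qDebug are standard Qt; the 10 px target is the one from the question):
// needs <QFontMetrics> and <QDebug>
QFontMetrics metrics(font);
qDebug() << "requested pixel size:" << font.pixelSize()
         << "actual height:" << metrics.height()
         << "ascent:" << metrics.ascent();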

It happens because you set the size in pixels. You need to use setPointSize() instead.
From the Qt 5 docs:
void QFont::setPixelSize(int pixelSize)
Sets the font size to pixelSize pixels.
Using this function makes the font device dependent. Use setPointSize() or setPointSizeF() to set the size of the font in a device independent manner.
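As a minimal sketch of that device-independent route (reusing the fontDb and painter objects from the question's snippet):
QFont font = fontDb.font("Bitstream Vera Sans", "Normal", -1);
font.setPointSizeF(10.0);   // size in points; Qt maps points to pixels using the platform DPI
painter.setFont(font);
painter.drawText(10, 10, "ABCD abcd 01234567");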
Also, for more info, you can check this post. This difference happens because of different display densities on different OSs. I have experienced this problem myself on an OS X / Windows cross-platform project.
[Updated] After additional research, I've found that the current behavior of Qt fonts is just a bug. That's why the solution above doesn't work (it worked in Qt 4).
Different workarounds for this problem are described here. Good luck!

You can try this. It will generate an image with the text you want to write. I am using this code.
QPixmap photo;   // optional background image (empty here)
QFont qfont;
QPainter painter;
QString txt;
QImage img(x, y, QImage::Format_RGB32);   // x and y are the image width and height
img.fill(0xffffffff);                     // white background, filled before painting starts
painter.begin(&img);
painter.drawPixmap(0, 0, x, y, photo);    // draw the background pixmap, if any
qfont.setFamily("Sampige");
qfont.setPixelSize(28);
painter.setFont(qfont);
txt = QString::fromUtf8("ನಮಸ್ಕಾರ");
painter.drawText(1, 26, txt);
painter.end();
img.save("abcd.jpg");

Related

Twips, please give information how to handle this [duplicate]

I would like to get the actual screen dpi/ppi, not the dpi setting used for fonts, in C++.
I tried the following code:
Version 1 reports 72 dpi, which is wrong.
SetProcessDPIAware(); //true
HDC screen = GetDC(NULL);
double hSize = GetDeviceCaps(screen, HORZSIZE);   // physical width in millimetres
double vSize = GetDeviceCaps(screen, VERTSIZE);   // physical height in millimetres
double hRes = GetDeviceCaps(screen, HORZRES);     // width in pixels
double vRes = GetDeviceCaps(screen, VERTRES);     // height in pixels
double hPixelsPerInch = hRes / hSize * 25.4;
double vPixelsPerInch = vRes / vSize * 25.4;
ReleaseDC(NULL, screen);
return (hPixelsPerInch + vPixelsPerInch) * 0.5;
Version 2 reports 96 dpi, which is the Windows DPI setting for fonts, but not the actual screen dpi.
SetProcessDPIAware(); //true
HDC screen = GetDC(NULL);
double hPixelsPerInch = GetDeviceCaps(screen,LOGPIXELSX);
double vPixelsPerInch = GetDeviceCaps(screen,LOGPIXELSY);
ReleaseDC(NULL, screen);
return (hPixelsPerInch + vPixelsPerInch) * 0.5;
I'm honestly confused by the answers here.
Microsoft has a GetDpiForMonitor method:
https://msdn.microsoft.com/en-us/library/windows/desktop/dn280510(v=vs.85).aspx
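A minimal sketch of calling it (assumes Windows 8.1 or later and linking with Shcore.lib; the helper name is mine):
#include <windows.h>
#include <shellscalingapi.h>   // GetDpiForMonitor, MONITOR_DPI_TYPE

// Raw (physical) DPI of the monitor hosting hwnd, or 0 on failure.
UINT RawDpiForWindow(HWND hwnd)
{
    HMONITOR monitor = MonitorFromWindow(hwnd, MONITOR_DEFAULTTONEAREST);
    UINT dpiX = 0, dpiY = 0;
    if (SUCCEEDED(GetDpiForMonitor(monitor, MDT_RAW_DPI, &dpiX, &dpiY)))
        return dpiX;
    return 0;
}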
And monitors DO expose their physical dimensions to tools. You can read your monitor's width and height, in centimeters, using the HWiNFO64 tool. So if they're getting that data (DDI?), it stands to reason that you can access it yourself.
Even a different Stack Overflow post mentions using WmiMonitorBasicDisplayParams to get the data.
How to get monitor size
So the top post is flat-out, 100%, wrong.
What you're asking for is, unfortunately, not possible in the general case.
Windows doesn't know the physical screen size. Windows might know that your screen has 1024x768 pixels, but it doesn't know how big the screen actually is. You might pull the cable out of your old 13" screen and connect it to a 19" monitor without changing the resolution. The DPI would be different, but Windows won't notice that you changed monitors.
You can get the true physical dimensions and DPI for a printer (assuming the driver isn't lying), but not for a screen. At least not reliably.
UPDATED
As others have pointed out, there are standards for two-way communication between newer monitors and the OS (EDID) that might make this information available for some devices. But I haven't yet found a monitor that provides this information.
Even if EDID were universally available, it's still not solvable in the general case, as the display could be a video projector, where the DPI would depend on the zoom, the focus, the lens type, and the throw distance. A projector is extremely unlikely to know the throw distance, so there's no way for it to report the actual DPI.
The method below was found to produce an exact DPI value.
ID2D1Factory* m_pDirect2dFactory = nullptr;   // requires <d2d1.h> and linking with d2d1.lib
D2D1CreateFactory(D2D1_FACTORY_TYPE_SINGLE_THREADED, &m_pDirect2dFactory);
FLOAT dpiX, dpiY;
m_pDirect2dFactory->GetDesktopDpi(&dpiX, &dpiY);
I think what you're after is:
GetDeviceCaps(hdcScreen, LOGPIXELSX);
GetDeviceCaps(hdcScreen, LOGPIXELSY);

Mask missing from HICON on Win10 but not Win7

I am trying to use some system icons such as SIID_DOCNOASSOC and SIID_FOLDER and draw them.
I have the problem that while my code works as expected on Windows 7, on Windows 10 the retrieved images are missing their mask. I cannot figure out why (the ICONINFO.hbmMask field that I retrieve with GetIconInfo is non-null, indicating that there is indeed a mask).
My code is written in Xojo, which uses a dialect of VB, but that should hardly matter, as I got it working in Win 7, I'd think:
dim info as SHSTOCKICONINFO
info.cbSize = SHSTOCKICONINFO.Size
SHGetStockIconInfo (SIID_DOCNOASSOC, SHGSI_ICON, info)
dim iconHandle as Integer = info.hIcon
dim destDC as Integer = ... // initialized outside
DrawIconEx (destDC, 0, 0, iconHandle, 0, 0, 0, 0, DI_MASK)
The above code fetches the icon for a plain file and then draws its mask. While the mask is correct on Win 7, the mask is all black over the entire icon's area on Win 10.
Why would that happen?
Windows XP added support for 32-bit ARGB icons with alpha transparency. These icons still contain a black-and-white mask bitmap, but it is often not correct; it depends on the icon editor used and how the artist drew the image! They often look like the My Documents icon in this article.
Vista added support for PNG images in icons (often called "compressed" in icon editors); these contain no mask bitmap. It is not documented what GetIconInfo does to create the mask for these.
The days of playing with HICON masks are long gone; if you want to draw an icon you should let Windows do it for you without extracting the parts of the HICON. ImageList_DrawEx has some blending support if you need it.
If you absolutely need a mask for some reason then you should build it yourself when the icon contains alpha transparency. Pick some sort of threshold (25, 50, whatever) and, when you inspect the alpha values, treat every pixel above that threshold as opaque.
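A hedged sketch of that thresholding approach (the helper name, the DIB readback, and the slow SetPixel loop are my own illustration, not code from this answer):
#include <windows.h>
#include <vector>

// Builds a 1-bpp AND mask from the icon's 32-bpp colour image:
// black (0) where alpha says the pixel is opaque, white (1) where it is transparent.
HBITMAP BuildMaskFromAlpha(HICON hIcon, BYTE alphaThreshold /* e.g. 50 */)
{
    ICONINFO ii = {};
    if (!GetIconInfo(hIcon, &ii))
        return NULL;
    if (!ii.hbmColor)                         // monochrome icon: its own mask is fine
    {
        DeleteObject(ii.hbmMask);
        return NULL;
    }

    BITMAP bm = {};
    GetObject(ii.hbmColor, sizeof(bm), &bm);

    // Pull the colour bitmap out as top-down 32-bpp pixels.
    BITMAPINFO bmi = {};
    bmi.bmiHeader.biSize = sizeof(bmi.bmiHeader);
    bmi.bmiHeader.biWidth = bm.bmWidth;
    bmi.bmiHeader.biHeight = -bm.bmHeight;    // negative height = top-down rows
    bmi.bmiHeader.biPlanes = 1;
    bmi.bmiHeader.biBitCount = 32;
    bmi.bmiHeader.biCompression = BI_RGB;

    std::vector<DWORD> pixels((size_t)bm.bmWidth * bm.bmHeight);
    HDC screenDC = GetDC(NULL);
    GetDIBits(screenDC, ii.hbmColor, 0, bm.bmHeight, pixels.data(), &bmi, DIB_RGB_COLORS);
    ReleaseDC(NULL, screenDC);

    // Write the monochrome mask pixel by pixel (fine for a sketch, slow for production).
    HBITMAP hMask = CreateBitmap(bm.bmWidth, bm.bmHeight, 1, 1, NULL);
    HDC maskDC = CreateCompatibleDC(NULL);
    HGDIOBJ oldBmp = SelectObject(maskDC, hMask);
    for (LONG y = 0; y < bm.bmHeight; ++y)
        for (LONG x = 0; x < bm.bmWidth; ++x)
        {
            BYTE alpha = (BYTE)(pixels[(size_t)y * bm.bmWidth + x] >> 24);
            SetPixel(maskDC, x, y, alpha > alphaThreshold ? RGB(0, 0, 0) : RGB(255, 255, 255));
        }
    SelectObject(maskDC, oldBmp);
    DeleteDC(maskDC);

    DeleteObject(ii.hbmColor);
    DeleteObject(ii.hbmMask);
    return hMask;   // caller owns the returned bitmap
}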

Differences between GetDC() and BeginPaint()?

I am having trouble with some of my owner drawn listboxes on High DPI monitors on Windows 10 in a dialog box. The text is chopped off at the bottom. We saw the problem on Windows 7 and were able to fix it. It is not necessarily High DPI, but when the user sets a different text scaling. I solved the problem, so I thought (!), by using a CClientDC (wrapper around GetDC()) and calling GetTextMetrics() to determine the text height. Previously, our icons had always been taller than our text so it was not a problem. With larger DPI monitors we saw some customers reporting problems when they scaled the text.
Now we are getting new reports under Windows 10. The former problem is fine under Windows 7--but Windows 7 only scales to 100, 125, and 150 percent. Windows 10 (and maybe 8? -- but no customer reports) allows user defined scaling.
So, I tracked down the problem somewhat... I knew what the font height was when I called GetTextMetrics() during WM_MEASUREITEM. I went and put some code in to debug what GetTextMetrics() was during my WM_DRAWITEM. Well, they were different--20 pixels high during WM_MEASUREITEM, and 25 pixels high during WM_DRAWITEM. Obviously, that is a problem. I want the GetTextMetrics() to have the same results in both places.
My thought was that the only real difference I could think of was that during WM_MEASUREITEM I am calling GetDC() via the CClientDC constructor, and that during WM_DRAWITEM I am using an already constructed HDC (which probably came from a BeginPaint() call inside GDI32.dll or another system DLL).
I thought maybe the BeginPaint() does something like select the windows HFONT into the HDC...
So, inside my WM_MEASUREITEM after getting the DC, I select the font of the listbox into the HDC, and then I call GetTextMetrics(). Lo and behold, the numbers match now in WM_MEASUREITEM and WM_DRAWITEM.
However, I don't know if I just got lucky. It's all just guesswork at this point.
Does BeginPaint() select the window font into the DC whereas GetDC() does not? Does the default handler of WM_PAINT for an owner drawn LISTBOX or COMBOBOX do something like select the window font into the paint DC?
BOOL DpiAwareMeasureGraphItem(LPMEASUREITEMSTRUCT lpM, CWnd* pWnd)
{
    int iItemHeight = INTERG_BITMAP_HEIGHT + 4;
    if (pWnd)
    {
        CClientDC dc(pWnd);
        if (dc.GetSafeHdc())
        {
            CFont* pOldFont = dc.SelectObject(pWnd->GetFont()); // seems to fix it on Windows 10, but is it luck?
            TEXTMETRIC tm;
            memset(&tm, 0, sizeof(tm));
            dc.GetTextMetrics(&tm);
            LONG tmHeight = tm.tmHeight + 4; // pad
            iItemHeight = max(iItemHeight, tmHeight);
            dc.SelectObject(pOldFont);
        }
    }
    lpM->itemHeight = iItemHeight;
    return (TRUE);
}
Neither GetDC() nor BeginPaint() initialises the DC it returns with anything other than the default system font. But WM_DRAWITEM is different - it gives you an already-initialised DC to draw into.
The method you stumbled across is the right one. WM_MEASUREITEM doesn't supply a DC at all, so if you need one for size calculations you're responsible for obtaining it and setting it up with the appropriate font.
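The same idea in plain Win32 terms, as a hedged sketch of a dialog-procedure case (hDlg and the rest of the dialog proc are assumed, not taken from the question):
case WM_MEASUREITEM:
{
    MEASUREITEMSTRUCT* mis = (MEASUREITEMSTRUCT*)lParam;
    HWND hCtl = GetDlgItem(hDlg, (int)mis->CtlID);        // the owner-drawn listbox/combobox
    HDC hdc = GetDC(hCtl);
    HFONT hFont = (HFONT)SendMessage(hCtl, WM_GETFONT, 0, 0);
    HGDIOBJ oldFont = SelectObject(hdc, hFont);           // measure with the control's own font
    TEXTMETRIC tm = {};
    GetTextMetrics(hdc, &tm);
    mis->itemHeight = (UINT)(tm.tmHeight + 4);            // same padding as the MFC code above
    SelectObject(hdc, oldFont);
    ReleaseDC(hCtl, hdc);
    return TRUE;
}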

How to scale NSTextView to show rich text in actual size?

The rich text in an NSTextView displays on my screen (27" Mac) a lot smaller than the font size would imply, although it prints correctly, and is the correct size if pasted in to another app (e.g. OpenOffice). TextEdit shows the same behaviour.
The following lines in awakeFromNib fix this more or less exactly.
[myTextView scaleUnitSquareToSize:NSMakeSize(96.0/72, 96.0/72)];
myTextView.layoutManager.usesScreenFonts = NO;
So it looks as if the screen is using 96 points per inch. If I don't have the 2nd line, the text is slightly squashed up and monospaced text is mangled. Obviously I shouldn't hard-code the scale factor, but where can I find the factor to put there? From [NSScreen mainScreen].deviceDescription (a dictionary) I get NSDeviceResolution = {72, 72}, so it seems that's not what's being used.
I'm not sure this is wise, but you can get the "real" DPI from the display mode:
CGDirectDisplayID displayID = [window.screen.deviceDescription[@"NSScreenNumber"] unsignedIntValue];
CGSize size = CGDisplayScreenSize(displayID);
CGDisplayModeRef mode = CGDisplayCopyDisplayMode(displayID);
NSSize dpi;
dpi.width = CGDisplayModeGetWidth(mode) * 25.4 / size.width;
dpi.height = CGDisplayModeGetHeight(mode) * 25.4 / size.height;
CGDisplayModeRelease(mode);
[myTextView scaleUnitSquareToSize:NSMakeSize(dpi.width/72, dpi.height/72)];
myTextView.layoutManager.usesScreenFonts = NO;
Some display modes are letterboxed. These will presumably never be used as the general mode for the Mac GUI; they'd only be used by full-screen games. However, if you want your app to handle such a mode, it should account for the fact that the mode size does not correspond to the full screen size in one dimension, and the DPI calculation above will be off.
I don't think there's a direct way to figure that out. You have to examine all available display modes to see if there's one that's the same in relevant properties (size, pixel encoding, etc.) as the current mode but whose IOKit flags indicate that it's stretched (((CGDisplayModeGetIOFlags(mode) & kDisplayModeStretchedFlag) != 0)) while the current mode is not. In that case, you probably want to assume the pixels are square and pick the smaller of dpi.width and dpi.height to use for both.
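A rough sketch of that check (plain CoreGraphics/IOKit calls; the function name is mine, and it compares only sizes, not pixel encoding, for brevity):
#include <CoreGraphics/CoreGraphics.h>
#include <IOKit/graphics/IOGraphicsTypes.h>   // kDisplayModeStretchedFlag

// True if a mode with the same size as the current one, but flagged as
// stretched, exists while the current mode itself is not stretched.
static bool CurrentModeHasStretchedTwin(CGDirectDisplayID display)
{
    CGDisplayModeRef current = CGDisplayCopyDisplayMode(display);
    CFArrayRef modes = CGDisplayCopyAllDisplayModes(display, NULL);
    if (!current || !modes)
    {
        if (modes) CFRelease(modes);
        if (current) CGDisplayModeRelease(current);
        return false;
    }

    bool currentStretched =
        (CGDisplayModeGetIOFlags(current) & kDisplayModeStretchedFlag) != 0;
    bool found = false;

    for (CFIndex i = 0; i < CFArrayGetCount(modes) && !found; ++i)
    {
        CGDisplayModeRef m = (CGDisplayModeRef)CFArrayGetValueAtIndex(modes, i);
        bool sameSize = CGDisplayModeGetWidth(m) == CGDisplayModeGetWidth(current)
                     && CGDisplayModeGetHeight(m) == CGDisplayModeGetHeight(current);
        bool stretched = (CGDisplayModeGetIOFlags(m) & kDisplayModeStretchedFlag) != 0;
        found = sameSize && stretched && !currentStretched;
    }

    CFRelease(modes);
    CGDisplayModeRelease(current);
    return found;   // if true, assume square pixels and use the smaller of dpi.width/height
}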

Getting pixel colour not accurate

I'm currently using colour picking in my application.
This works on the PC, however I'm having trouble to get it working on a variety of devices.
This is probably due to the context being set up differently depending on the device. For example, as far as I'm aware the PC is set up with an 888 colour format, whereas a device might default to 565.
I was wondering if there's a way in OpenGL to get the current pixel/colour format, so that I can retrieve the colour data properly?
This is the function I'm using which works fine on the PC:
inline void ProcessColourPick(GLubyte *out, KDfloat32 x, KDfloat32 y)
{
    GLint viewport[4];
    GLubyte pixel[3];
    glGetIntegerv(GL_VIEWPORT, viewport);

    // Read the colour of the pixel at a specific point in the framebuffer
    glReadPixels(x, viewport[3] - y, 1, 1,
                 GL_RGB, GL_UNSIGNED_BYTE, (void *)pixel);

    out[0] = pixel[0];
    out[1] = pixel[1];
    out[2] = pixel[2];
}
Any ideas?
Yes, but it's a bit complicated.
Querying the bitdepth of the current framebuffer is fairly easy in ES 2.0 (note: this is also legal in Desktop GL, but this functionality was removed in GL 3.1 core. It's still accessible from a compatibility profile). You have to get the bitdepth of each color component:
GLint bitdepth;
glGetIntegerv(GL_x_BITS, &bitdepth);
Where x is one of RED, GREEN, BLUE, or ALPHA.
Once you have the bitdepth, you can test to see if it's 565 and use appropriate pixel transfer parameters and color values.
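For illustration, a hedged sketch of that query on an ES 2.0 context (the 5/6/5 branch is just an example):
GLint redBits = 0, greenBits = 0, blueBits = 0;
glGetIntegerv(GL_RED_BITS, &redBits);
glGetIntegerv(GL_GREEN_BITS, &greenBits);
glGetIntegerv(GL_BLUE_BITS, &blueBits);
if (redBits == 5 && greenBits == 6 && blueBits == 5)
{
    // 565 framebuffer: picked colours only survive readback at 5/6/5 precision,
    // so compare them against values quantised the same way.
}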
The format parameter for glReadPixels must be either GL_RGBA (always supported) or the GL_IMPLEMENTATION_COLOR_READ_FORMAT_OES one (different on different devices). It's an OpenGL ES restriction.
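A hedged sketch of honouring that restriction (ES 2.0 spells the enums without the _OES suffix; x, y and viewport are the ones from the question's ProcessColourPick):
// Ask which extra format/type pair this implementation supports for readback.
GLint readFormat = 0, readType = 0;
glGetIntegerv(GL_IMPLEMENTATION_COLOR_READ_FORMAT, &readFormat);
glGetIntegerv(GL_IMPLEMENTATION_COLOR_READ_TYPE, &readType);

// GL_RGBA / GL_UNSIGNED_BYTE is always legal, so it is a safe fallback:
GLubyte rgba[4];
glReadPixels(x, viewport[3] - y, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, rgba);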
