What scaling factor to use for mapping the font size on a high-resolution monitor? (Windows)

We have a requirement where our application needs to support high-resolution monitors. Currently, when the application comes up on a high-resolution monitor, the text it displays is too small. We use 12-point Arial by default.
Now, to make the text visible, I need to change the font size proportionally. I am finding it tough to come up with a formula that would give me the target font size given the monitor resolution.
Here is my understanding of the problem.
1) On Windows, by default, 96 pixels correspond to 1 logical inch. This means that when the monitor resolution increases, the screen size in logical inches also increases.
2) A 1-point font is 1/72 of a logical inch. Combined with the 96 pixels per logical inch, this means there are 96/72 pixels per point.
So a 12-point font occupies 12 * 96/72 = 16 pixels.
Now I need to know the scaling factor by which I should increase this pixel count so that the resulting font is properly visible. If I know the scaled pixel count, I can get the font size simply by dividing it by 96/72.
What is the suggested scaling factor that would ensure properly scaled fonts at all monitor resolutions?
Also, please correct me if my understanding is wrong.

There's an example on the MSDN page for the LOGFONT structure. Your understanding is correct; you need to scale the point size by the vertical pixels per logical inch (vertres) divided by 72:
lfHeight = -PointSize * GetDeviceCaps(hDC, LOGPIXELSY) / 72;
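Wrapped into a minimal C++ sketch (the face name and point size are placeholders; error handling omitted):

    #include <windows.h>

    // Create a font whose pixel height is scaled to the DC's vertical DPI.
    HFONT CreateScaledFont(HDC hDC, int pointSize)
    {
        LOGFONT lf = {0};
        // A negative height requests a character height (excluding internal leading) in pixels.
        lf.lfHeight = -MulDiv(pointSize, GetDeviceCaps(hDC, LOGPIXELSY), 72);
        lstrcpy(lf.lfFaceName, TEXT("Arial")); // placeholder face name
        return CreateFontIndirect(&lf);
    }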

If you set the resolution in Windows to match that of the physical monitor, no adjustment should be needed. Any well-written program will do the multiplication and division necessary to scale the font properly, and in the newest versions of Windows the OS will lie about the resolution and scale the fonts automatically.
If you wish to handle this outside of the Windows settings, simply multiply your font size by your actual DPI and divide by 96.
Edit: Beginning with Windows Vista, Windows will not report your actual configured DPI unless you write a DPI-aware program. Microsoft has some guidance on the subject. You might find that the default scaling that Microsoft provides for non-DPI-aware programs is good enough for your purposes.
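A sketch of that adjustment, assuming the value being scaled is a size that was designed for 96 DPI and that the actual DPI has already been obtained (e.g. from GetDeviceCaps(hDC, LOGPIXELSY) in a DPI-aware program):

    #include <windows.h>

    // Scale a size designed for 96 DPI to the monitor's actual DPI.
    int ScaleForDpi(int sizeAt96Dpi, int actualDpi)
    {
        return MulDiv(sizeAt96Dpi, actualDpi, 96);
    }

For example, a 16-pixel font height designed for 96 DPI becomes MulDiv(16, 120, 96) = 20 pixels at 120 DPI.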

Related

Windows System DPI: GetDeviceCaps(hdc, LOGPIXELSX/Y) vs GetDpiForSystem()

Recently, I read this excellent page about DPI on Win32:
DPI and device-independent pixels
However, I am confused about GetDeviceCaps(hdc, LOGPIXELSX/Y) vs GetDpiForSystem(). On systems where I tested, all three values always match.
Questions:
Is it possible for GetDeviceCaps(hdc, LOGPIXELSX) and GetDeviceCaps(hdc, LOGPIXELSY) to return different values? I assume that LOGPIXELS means DPI. (Please correct me if wrong!)
If the previous answer is yes, then is GetDeviceCaps(GetDC(NULL), LOGPIXELSX/Y) the same as GetDpiForSystem()?
When possible, I should be using GetDpiForWindow(hWnd), but I want to clarify my understanding about "system" DPI in my questions above.
As far as I can tell, GetDeviceCaps(hdc, LOGPIXELSX/Y) is not equivalent to GetDpiForSystem(). The documentation says:
LOGPIXELSX: Number of pixels per logical inch along the screen width. In a system with multiple display monitors, this value is the same for all monitors.
LOGPIXELSY: Number of pixels per logical inch along the screen height. In a system with multiple display monitors, this value is the same for all monitors.
GetDeviceCaps is not able to return a per-monitor DPI. For that, you should use GetDpiForWindow().
GetDpiForWindow() also returns a different value depending on the DPI_AWARENESS value.
GetDpiForSystem() returns the system DPI. The return value depends on the calling context: if the current thread has a DPI_AWARENESS value of DPI_AWARENESS_UNAWARE, the return value will be 96.
Is it possible for GetDeviceCaps(hdc, LOGPIXELSX) != GetDeviceCaps(hdc, LOGPIXELSY)? I assume that "LOGPIXELS" means DPI. (Please correct me if wrong!)
For monitors, I believe LOGPIXELSX == LOGPIXELSY even if your display has non-square pixels (which is extremely rare nowadays). There are still many printers out there that have different horizontal and vertical dot densities. In some cases, the printer driver may hide that, giving the illusion of square pixels. Among those that don't, you not only have to be careful to use the right value for the context, but you should be aware that some drivers forget to swap the horizontal and vertical values when you change the page orientation to landscape.
LOGPIXELSX and LOGPIXELSY refer to the number of pixels per logical inch, an idea that's been buried in recent versions of Windows. You used to be able to tell Windows that, when a program wants to display something that's 1-inch long, use my logical inch value rather than the actual DPI. (In the olden days of CRT monitors, it was usually impossible for Windows to discover the actual DPI anyway.)
Common values for logical inches were 96 and 120. If you had a really high-end monitor, you might've used 144. If you were into WYSIWYG applications or desktop publishing, you'd usually choose a value that was about 20% larger than an actual inch on your screen. I've heard various rationales for this, but I usually chose a higher value for easier reading.
When possible, I should be using GetDpiForWindow(hWnd)
That's fine. I use LOGPIXELSX and LOGPIXELSY because I'm old school and have always written high-DPI aware code.
but I want to clarify my understanding about "system" DPI in my questions above.
I believe the system DPI is the scaled DPI of the primary monitor. The scaling gives the programmer the same functionality as a logical inch, but it's presented to the user in a different way conceptually.
On systems where I tested, all three values always match.
Yes, it's very common for all of the methods to report the same DPI.
If your program is not high-DPI aware, you'll get 96 regardless of how you ask.
If it is aware, the system DPI is the DPI of the primary monitor. (Well, it's the possibly scaled native DPI of the monitor, but that's the same value you'll be told for the DPI of the monitor.)
That covers a lot of common cases. To truly and thoroughly test, you'd need a second monitor with a different native DPI than the primary.
A couple of points to keep in mind.
The GetDeviceCaps approach is specific to the device context you reference with the HDC parameter. Remember that you can have printer DCs, enhanced-metafile DCs, and memory DCs in addition to screen DCs. For screens, the return value will depend on the DPI awareness.
DPI awareness comes into play for screens (not printers). Your program's UI thread can be:
DPI unaware, in which case all methods will return 96 dots per inch and any actual difference will (or might) be handled by the system scaling things on the back end.
DPI aware, in which case most will return the system DPI. I believe the system DPI is the (scaled?) DPI of the primary monitor, but I'm not sure about that.
per-monitor DPI aware, in which case the GetDeviceCaps and GetDpiForWindow will return the actual DPI of the monitor that's referenced by the DC. I don't recall what it returns if the DC spans multiple monitors.
It might be the actual DPI if the monitors spanned have the same DPI, or it might be the system DPI, or it might be the DPI of one of the spanned monitors.
GetDpiForMonitor ignores DPI awareness altogether. I assume it remains for backward compatibility.
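A small C++ sketch of the three queries discussed above, assuming a Windows 10 (version 1607 or later) target where GetDpiForSystem and GetDpiForWindow exist; what each call returns depends on the thread's DPI awareness as described:

    #include <windows.h>
    #include <stdio.h>

    // Query DPI three ways and print the results for comparison.
    void DumpDpi(HWND hWnd)
    {
        HDC hdc = GetDC(NULL);
        int capsX = GetDeviceCaps(hdc, LOGPIXELSX);
        int capsY = GetDeviceCaps(hdc, LOGPIXELSY);
        ReleaseDC(NULL, hdc);

        UINT systemDpi = GetDpiForSystem();     // 96 if the thread is DPI unaware
        UINT windowDpi = GetDpiForWindow(hWnd); // per-monitor DPI when per-monitor aware

        printf("LOGPIXELSX=%d LOGPIXELSY=%d system=%u window=%u\n",
               capsX, capsY, systemDpi, windowDpi);
    }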

Why are fonts of the same size displayed with different heights in different programs?

I am currently working on a GUI on a Windows 7 64-bit PC. While comparing the visualisation of text in different programs, I noticed that there are differences in how large text is displayed on my monitor, given the same font style and size.
Does anyone have an idea where this comes from?
I reproduced this behaviour by typing text in Arial Regular 12 pt containing the letter T in each program and setting the view scaling to 100%. Afterwards I measured the height of the letter T in pixels with the help of a screenshot.
Programs I tested:
MS Word 2010: T is 12 pixels high
LibreOffice Writer 5.2.7.2 (x64): T is 12 pixels high
Scribus 1.4.6: T is 12 pixels high
GIMP 2.8.14: T is 9 pixels high
Java 8 Update 181 (which I use for my own GUI): T is 9 pixels high
pt (point) is a unit for physical sizes, typically 1/72 of an inch.
In order to transfer this to a size in pixels, you need to know how many pixels will be in one inch on your screen. This value is known as Pixels Per Inch (PPI), sometimes somewhat ambiguously called Dots Per Inch (DPI).
Note that this value will usually be different for an application UI and the documents you are working on.
From the values you provided, it looks like MS Word, LibreOffice and Scribus render the document at 96 PPI (or at least the documents you are working on do), whereas GIMP and Java assume 72 PPI.
It's not obvious whether you are referring to the size of text in the respective applications' UI or in documents opened in them, though, so I could be totally off.
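To make the arithmetic concrete (a worked example added here, using the conversion above and Arial's cap height of roughly 0.72 em, since the measured T is a cap height rather than a full em):

    pixel_em_size = point_size * PPI / 72
    12 pt at 96 PPI -> 12 * 96 / 72 = 16 px per em; T height ≈ 0.72 * 16 ≈ 12 px
    12 pt at 72 PPI -> 12 * 72 / 72 = 12 px per em; T height ≈ 0.72 * 12 ≈ 9 px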

Icon resolutions: Pixels vs DPI

When I try to do some research on making icons for Windows, and what size/resolution images I should leave in my .ico files before saving, there's too much weird information.
Some say put 16x16, 24x24, 32x32, 48x48 ... and so on in 96 DPI.
This is what irks me, and I feel it doesn't make any sense.
Isn't 1 pixel = 1 pixel?
Why do they insist on mixing DPI into this?
What is always true is that 1 pixel = 1 pixel. What does change is how big that pixel is on the various displays that have different screen densities. That is what DPI describes - the number of dots (pixels) per inch. But using DPI in the context of image size only makes sense when you use it in combination with inches (or centimeters). For instance, "create an image 10x10 inches at 300 DPI": from that statement you can calculate that the image has to be 3000x3000 pixels in size.
As far as Windows is concerned, what counts is the font scaling setting, which can be set from 100% to 200%.
So when you are creating your icons, make sure that you have at least the 1x and 2x dimensions. If the icon has to be 16x16 px at normal resolution, that means you would also create a 32x32 px icon. Other commonly used scalings are 125% and 150%, so it would be a good idea to provide icons for those sizes too.
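For example, starting from a 16x16 base icon, the common scale factors work out to (simple arithmetic, not an official Windows size list):

    100% -> 16x16
    125% -> 20x20
    150% -> 24x24
    200% -> 32x32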
You can freely ignore statements like "Make the icon x pixels wide and x pixels high in x DPI" because those people have no clue what they are talking about.

Is there an easy way to force Windows to calculate text extents using a fixed DPI value, instead of the current DPI setting?

I am wondering if there is an easy way to calculate the text extent of a string (similar to GetTextExtentPoint32), but that allows me to specify the DPI to use in the calculation. In other words, is there a function that does exactly what GetTextExtentPoint32 does, but allows me to pass the DPI as a parameter, or a way to "trick" GetTextExtentPoint32 into using a DPI that I can specify?
Before you ask "Why on earth do you want to do that?", I'll try to explain, but bear with me, the reasons behind this request are somewhat involved.
Ultimately, this is for a custom word-wrap algorithm that breaks a long string into smaller blocks of text that need to fit neatly on a Crystal Report with complex text layout requirements (it mimics the paper form used by police officers to file criminal complaints, so the state is in charge of the layout, not us, and it has to match the paper form almost exactly).
It's impossible for Crystal Reports to lay this text out properly without help (the text has to fit inside a small box on one page, followed by "standard-sized" continuation pages if the text overflows the small block), so I wrote code to break the text into multiple "blocks" that are stored in the reporting database and then rendered individually by the report.
Given the required dimensions (in logical inches) and the font information, the code first fits the text to the required width by inserting line breaks, then breaks it into correctly sized blocks based on the text height. The algorithm uses VB6's TextHeight and TextWidth functions to calculate extents, which return the same results that the GetTextExtentPoint32 function would (I checked).
This code works great when the display is set to 96 DPI, but breaks at 120 DPI: some lines end up with more words in them than they would have had at 96 DPI.
For example, "The quick brown fox jumps over the lazy dog" might break as follows:
At 96 DPI
The quick brown fox jumps over
the lazy dog
At 120 DPI
The quick brown fox jumps over the
lazy dog
This text is then further broken up by Crystal Reports, since the first line no longer fits in the corresponding text field on the report, so the actual report output looks like this:
The quick brown fox jumps over
the
lazy dog
At first, I thought I could compensate for this by scaling the results of TextHeight and TextWidth down by 25%, but apparently life isn't so simple: it seems numerous rounding errors creep in (and possibly other factors?), so that the text extent of any given string is never exactly 25% larger at 120 DPI compared to 96 DPI. I didn't expect it to scale perfectly, but it's not even close at times (in one test, the width at 120 DPI was only about 18% bigger than at 96 DPI).
This doesn't happen to any text on the report that is handled directly by Crystal Report: it seems to do a very good job of scaling everything so that the report is laid out exactly the same at 96 DPI and 120 DPI (and even 144 DPI). Even when printing the report, the text is printed exactly as it appears on the screen (i.e. it truly seems to be WYSIWYG).
Given all of this, since I know my code works at 96 DPI, I should be able to fix the problem by calculating all my text extents at 96 DPI, even if Windows is currently using a different DPI setting.
Put another way, I want the result of my FitTextToRect function to return the same output at any DPI setting, by forcing the text extents to be calculated using 96 DPI. This should all work out since I'm converting the extents back to inches and then comparing them against required width and height in inches. I think it just so happens that 96 DPI produces more accurate results than 120 DPI when converting back and forth between pixels and inches.
I've been poring over the Windows font and text functions, seeing if I could roll my own function to calculate text extents at a given DPI, looking at GetTextMetrics and other functions to see how easy or difficult this might be. If there is an easier way to accomplish this, I'd love to know before I start creating my own versions of existing Windows API functions!
GetTextMetrics accepts a DC. It uses the DPI settings from that DC (for example, you couldn't possibly use screen settings and expect data to come out formatted acceptably for a printer).
So all you need to do is supply a DC with the right DPI. I think you might be able to directly control the DPI of a metafile DC.
Metafiles are vector graphics so they don't even look like they have DPI.
You can control DPI with CreateDIBitmap, but then there's no way to get a matching DC. You could see if the DPI changes if you select that DIB into a memory DC (CreateCompatibleDC).
Or you could use GDI+, create a Bitmap with the desired DPI, use the Graphics constructor that operates on an Image, and then that Graphics will have the right DPI so GDI+ text measurement functions would then use your DPI of choice.
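A rough C++ sketch of that GDI+ approach, assuming GdiplusStartup has already been called elsewhere in the application (function and variable names here are illustrative; link with gdiplus.lib):

    #include <windows.h>
    #include <gdiplus.h>

    // Measure a string as if the display were running at 96 DPI.
    Gdiplus::SizeF MeasureTextAt96Dpi(const WCHAR* text, const WCHAR* faceName, float pointSize)
    {
        using namespace Gdiplus;
        Bitmap bmp(1, 1);                 // throwaway drawing surface
        bmp.SetResolution(96.0f, 96.0f);  // force the DPI used for measurement
        Graphics g(&bmp);                 // the Graphics inherits the bitmap's DPI

        Font font(faceName, pointSize, FontStyleRegular, UnitPoint);
        RectF layout(0.0f, 0.0f, 10000.0f, 10000.0f);
        RectF bounds;
        g.MeasureString(text, -1, &font, layout, &bounds);
        return SizeF(bounds.Width, bounds.Height);
    }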
I found a much simpler solution. It took me a while to convince myself that it really does make sense. The solution was so obvious I feel almost silly posting it here.
Ultimately, what I really want is for my FitTextToRect function to produce the same text layout at any DPI setting. It turns out, in order to make this happen, it's actually easier to measure everything in pixels. Since any other unit of measure will by definition take the current DPI setting into account, using anything other than pixels can throw things off.
The "trick" then is to force all the calculations to work out to the same result they would have had at 96 DPI. However, instead of calculating text extents first and then scaling them down, which adds significant error to the calculations, you can get more accurate results (i.e. the results will be equal or near equal at any DPI) if you temporarily scale the font size down before calculating any text extents. Note that that original font size is still used in the print preview and in the printed output.
This works because of the fact that Windows internally measures font size in device units, which for the screen, means pixels. The font's "point size" is converted to pixels by the application that let you select the font, using the current DPI setting:
font_height_in_pixels = (point_size * current_dpi / 72)
That is, Windows never deals directly with the font's point size: it's always dealing with pixels. As a result, it calculates text extents in terms of pixels as well.
This means you can work out a scaling factor based on the current DPI and font point size that will scale the font down to a new point size which always comes out to the same number of pixels at any DPI (I used 96 as my "baseline" DPI):
scaled_point_size = (point_size * 96 / current_dpi)
By effectively forcing the font to fit the same number of pixels at any DPI, this ensures that text extents will be the same at any DPI, and therefore the algorithm will lay the text out the same way at any DPI.
The only other thing I needed to do was ensure that the height and width parameters passed to the function, which are measured in inches, got converted to pixels correctly. I couldn't use VB6's existing inches-to-pixels conversion function, since it takes the current DPI into account (which would create inconsistent results, since the text height and width are "normalized" to 96 DPI), so instead I just multiplied the height and width by 96, which converts them to what their pixel measurements would be at 96 DPI.
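The original code is VB6, but the same idea expressed against the Win32 API might look roughly like this (a sketch under the assumptions above; the face name is just an example):

    #include <windows.h>

    // Measure text as it would lay out at 96 DPI, regardless of the current
    // DPI setting, by shrinking the point size before creating the font.
    SIZE MeasureAt96Dpi(HDC hdc, const TCHAR* text, int length, int pointSize)
    {
        int currentDpi = GetDeviceCaps(hdc, LOGPIXELSY);
        int scaledPointSize = MulDiv(pointSize, 96, currentDpi);

        LOGFONT lf = {0};
        // Comes out to roughly pointSize * 96 / 72 pixels at any DPI.
        lf.lfHeight = -MulDiv(scaledPointSize, currentDpi, 72);
        lstrcpy(lf.lfFaceName, TEXT("Arial")); // example face name
        HFONT font = CreateFontIndirect(&lf);

        HGDIOBJ oldFont = SelectObject(hdc, font);
        SIZE extent = {0};
        GetTextExtentPoint32(hdc, text, length, &extent);
        SelectObject(hdc, oldFont);
        DeleteObject(font);
        return extent;
    }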

Calculating pixel length of an image

May I know what the ways are to calculate the length of 1 pixel in centimeters? The images that I have are 640x480. I would like to compare 2 pixels at different places in the image and find the difference in distance. Thus I would need to find out the length of a pixel in centimeters.
Thank you.
A pixel is a relative unit of measure, it does not have an absolute size.
Edit. With regard to your edit: again, you can only calculate the distance between two pixels in an image in pixels, not in centimeters. As a simple example, think video projectors: if you project, say, a 3×3px image onto a wall, the distance between the leftmost and the rightmost pixels could be anything from a few millimeters to several meters. If you moved the projector closer to the wall or farther away from it, the pixel size would change, and whatever distance you had calculated earlier would become wrong.
Same goes for computer monitors and other devices (as Johannes Rössel has explained in his answer). There, the pixel size in centimeters depends on factors such as the physical resolution of the screen, the resolution of the graphical interface, and the zooming factor at which the image is displayed.
A pixel does not have a fixed physical size, by definition. It is simply the smallest addressable unit of picture, however large or small.
This is fully dependent on the screen resolution and screen size:
pixel width = width of monitor viewable area / number of horizontal pixels
pixel height = height of monitor viewable area / number of vertical pixels
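For example (illustrative numbers, not from the question): a monitor with a viewable width of 30.5 cm running at 1280x1024 has pixels roughly 30.5 cm / 1280 ≈ 0.024 cm wide.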
Actually, the answer depends on where exactly your real-world units are.
It comes down to dpi (dots per inch) which is the number of image pixels along a length of 2.54 cm. That's the resolution of an image or a target device (printer, screen, &c.).
Image files usually have a resolution embedded within them which specifies their real-world size. It doesn't alter their pixel dimensions, it just says how large they are if printed or how large a “100 %” view on a display would be.
Then there is the resolution of your screen, as others have mentioned, as well as the specified resolution your graphical interface uses (usually 96 dpi, sometimes 120)—and then it's all a matter of whether programs actually honor that setting ...
The OS will assume some DPI (usually 96 DPI on Windows); however, the screen's real DPI will depend on the physical size of the display and the resolution.
E.g. a 15" monitor should have a 12" wide viewable area, so depending on the horizontal resolution you will get a different horizontal DPI; assuming an 1152-pixel screen width, you will genuinely get 96 DPI.
