I am trying to find the default system font size using SystemParametersInfo() with SPI_GETNONCLIENTMETRICS.
On Vista, the LOGFONT structures inside the returned NONCLIENTMETRICS have the correct font height in lfHeight, but when I run the exact same app on XP, lfHeight (and lfWidth) are always zero.
Why is that so, and what is the correct way to retrieve the font size on both systems?
Are you setting the cbSize member of NONCLIENTMETRICS to sizeof(NONCLIENTMETRICS)?
According to MSDN, you'll need a runtime OS version check, and you must subtract the size of the iPaddedBorderWidth member when running under Windows XP.
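For example, a minimal sketch of that check (assuming you compile against a Vista-or-later SDK, where NONCLIENTMETRICS gained the iPaddedBorderWidth member; GetNonClientMetricsCompat is my name):

#include <windows.h>

// Sketch only: shrink cbSize on pre-Vista systems, where NONCLIENTMETRICS
// has no iPaddedBorderWidth member.
BOOL GetNonClientMetricsCompat(NONCLIENTMETRICS *pncm)
{
    OSVERSIONINFO osv = { sizeof(osv) };
    GetVersionEx(&osv);   // fine for the XP/Vista era in question

    ZeroMemory(pncm, sizeof(*pncm));
    pncm->cbSize = sizeof(NONCLIENTMETRICS);
    if (osv.dwMajorVersion < 6)   // XP (5.x) or earlier
        pncm->cbSize -= sizeof(pncm->iPaddedBorderWidth);

    return SystemParametersInfo(SPI_GETNONCLIENTMETRICS,
                                pncm->cbSize, pncm, 0);
}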
I don't know the 100% correct answer, but according to MSDN the value of zero has a special meaning for both lfHeight and lfWidth:
This is taken from the MSDN article "LOGFONT" (Windows GDI):
lfHeight: if zero, the font mapper uses a default height value when it searches for a match.
lfWidth: if zero, the aspect ratio of the device is matched against the digitization aspect ratio of the available fonts to find the closest match, determined by the absolute value of the difference.
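If lfHeight does come back non-zero, converting it to a point size is the usual MulDiv step. A sketch, assuming ncm is the NONCLIENTMETRICS you retrieved:

// A negative lfHeight is a character height in device units; 72 points
// per logical inch gives the conversion.
HDC hdc = GetDC(NULL);
int pointSize = MulDiv(-ncm.lfMessageFont.lfHeight, 72,
                       GetDeviceCaps(hdc, LOGPIXELSY));
ReleaseDC(NULL, hdc);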
The Problem
I am loading the classic serife.fon file from Microsoft Windows using FreeType.
Here is how I set the size:
FT_Set_Pixel_Sizes(face, 0, fontHeight);
I use 0 for the fontWidth so that it will be auto-calculated based on the height.
How do I find the correct value for fontHeight such that the resulting font will be exactly 9 pixels tall?
Notes
Using trial and error, I know that the correct value is 32 - but I don't understand why.
I am not sure how relevant this is for bitmap fonts, but according to the docs:
pixel_size = point_size * resolution / 72
Substituting in the values:
point_size = 32
resolution = 96 (from FT_Get_WinFNT_Header)
gives:
pixel_size = 42.6666666
This is a long way from our target height of 9!
The docs do go on to say:
pixel_size computed in the above formula does not directly relate to the size of characters on the screen. It simply is the size of the EM square if it was to be displayed. Each font designer is free to place its glyphs as it pleases him within the square.
But again, I am not sure if this is relevant for bitmap fonts.
.fon files are .exe files with an FNT payload, where the payload can be a vector or raster font. If this is a raster font (which is most likely), then the dfPixHeight value in the FNT header will tell you what size it's meant to be; FreeType 2 exposes this as the pixel_height field of the FT_WinFNT_Header.
(And of course, note that using any size other than "the actual raster-size of the FNT" is going to lead to hilarious headaches because bitmap scaling is the kind of madness that's so bad, OpenType instead went with "just embed as many bitmaps as you need, at however many sizes you need, because that's the only way your bitmaps are going to look good")
The FNT-specific FT2 documentation can be found over on https://www.freetype.org/freetype2/docs/reference/ft2-winfnt_fonts.html but you may need to read it in conjunction with https://jeffpar.github.io/kbarchive/kb/065/Q65123 (or https://web.archive.org/web/20120215123301/http://support.microsoft.com/kb/65123) to find any further mappings that you might need between names/fields as defined in the FNT spec and FT2's naming conventions.
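For example, a sketch of reading the native size from the header and feeding it back to FT_Set_Pixel_Sizes (error handling trimmed; face is assumed to already be loaded from the .fon; set_native_fnt_size is my name):

#include <ft2build.h>
#include FT_FREETYPE_H
#include FT_WINFONTS_H

FT_Error set_native_fnt_size(FT_Face face)
{
    FT_WinFNT_HeaderRec header;
    FT_Error err = FT_Get_WinFNT_Header(face, &header);
    if (err)
        return err;  /* not a Windows FNT face */

    /* pixel_height is the FNT header's dfPixHeight: the size the
       raster strike was actually drawn at */
    return FT_Set_Pixel_Sizes(face, 0, header.pixel_height);
}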
I'm maintaining an application using Borland C++ Builder 6 running on Windows 7.
The application is incorrectly drawing text using the font Courier New: each letter is being slightly cut off. The issue arises when calling GetTextMetrics, because it fills the TEXTMETRIC struct with differing tmAveCharWidth and tmMaxCharWidth values. The application then uses tmAveCharWidth to calculate character width, which is wrong because that value can be less than tmMaxCharWidth. I will be fixing that issue.
I'm curious why GetTextMetrics returns differing tmAveCharWidth and tmMaxCharWidth values for Courier New. My understanding was that Courier New is a monospaced font, so tmAveCharWidth and tmMaxCharWidth should be the same. I tested with other monospaced fonts and that assumption held.
This is the section of code with the issue:
hFont = CreateFontIndirect(&lpInstData->lf);
hDC = GetDC(hWnd);
hFontOld = SelectObject(hDC, hFont);
GetTextMetrics(hDC, &tm);
lpInstData->nCharHeight = tm.tmHeight;
lpInstData->nCharWidth = tm.tmAveCharWidth; // <-- should be using tmMaxCharWidth
Here are the values I see when I select size 12 Courier New:
[screenshot: the LOGFONT parameter passed to CreateFontIndirect]
[screenshot: the TEXTMETRIC structure returned from GetTextMetrics]
I found this was indeed ClearType at work (thanks Deanna). Turning off ClearType corrects the display issue without changing any code, although I still need to correct how the application works with ClearType.
I also found the issue was not present on Windows XP because ClearType is turned off by default, whereas in Windows 7 (and Vista) it is turned on by default.
I'm using accessibility with the AccessibleObjectFromPoint function, and I'd like it to work correctly on a per-monitor DPI environment. Unfortunately, I can't get it to work. I tried many things, and the situation for now is:
My app is marked as per-monitor-DPI-aware in the manifest. (True/PM)
I use GetCursorPos and then AccessibleObjectFromPoint.
How can the problem be reproduced:
Have two monitors, one with 100% DPI, the other with 125%.
Run Chrome on the 125% monitor.
Use AccessibleObjectFromPoint on one of the tab names, it won't work.
It works with some apps (DPI-aware, it seems, like explorer), but doesn't work with others. I tried several relevant functions, such as GetPhysicalCursorPos and PhysicalToLogicalPointForPerMonitorDPI, but nothing works.
It's worth noting that Microsoft's inspect.exe works as expected.
I've been struggling with this exact same problem for several weeks and can now tell you my findings. Unfortunately I can't give you more than a hint of code, because the project I am working on is proprietary.
The issue started at Windows 8.1. The problem did not exist on Windows 7 or Vista, because AccessibleObjectFromPoint always used raw physical coordinates, as documented here: https://msdn.microsoft.com/en-us/library/windows/desktop/dd317984(v=vs.85).aspx.
"Microsoft Active Accessibility does not use logical coordinates. The following methods and functions either return physical coordinates or take them as parameters." This has not been true since Windows 8.1.
AccessibleObjectFromPoint now uses a flawed calculation that cannot always find the correct window, for reasons similar to my question here: High DPI scaling, mouse hooks and WindowFromPoint.
My findings lead me to one conclusion: The API is broken. This does not mean it is not possible though.
Possible solutions that I have partially tested, and that seem to work, follow.
Prerequisites are that you:
1. Make your process per-monitor DPI aware, NOT USING THE MANIFEST (more on that later).
2. Determine the hWnd of the window you want to query (WindowFromPoint() variants).
3. Determine the monitor DPI of the queried hWnd.
4. Determine the DPI of your process.
5. Determine the DPI of the queried hWnd.
6. Determine the monitor origin and offset for the queried hWnd (MonitorFromWindow() and GetMonitorInfo()).
What comes next depends on your platform.
Windows 10.0.14393+
Write a function that finds the IAccessible (AccessibleObjectFromWindow() ) from the top level window, and then recursively call IAccessible::accHitTest until you reach the bottom-most IAccessible and perhaps ChildID data. Return that as if you would call AccessibleObjectFromPoint.
To call it successfully, you will need to scale the (x,y) co-ordinates into the scale system of the queried hWnd, using the DPIs and co-ordinates fetched in the list above. Watch out for systems where monitors are not the same size or if monitors are partially offset, or above and below.
And now for the important part for 10.0.14393: set your thread to the same DPI_AWARENESS_CONTEXT as the hWnd you are querying. Now call your new function. Now revert your thread to per-monitor DPI aware, and voila, it works, even if the window is not maximised. This is why you must not use the manifest.
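A sketch of that 10.0.14393+ path, under the assumptions above (FindAccessibleAtPoint is my name; pt must already be scaled into the queried window's coordinate system):

#include <windows.h>
#include <oleacc.h>
#pragma comment(lib, "oleacc.lib")
#pragma comment(lib, "oleaut32.lib")

HRESULT FindAccessibleAtPoint(HWND hWnd, POINT pt,
                              IAccessible **ppAcc, VARIANT *pvChild)
{
    // Match this thread's DPI context to the queried window's, so that
    // accHitTest interprets the coordinates in the scale it expects.
    DPI_AWARENESS_CONTEXT prevCtx =
        SetThreadDpiAwarenessContext(GetWindowDpiAwarenessContext(hWnd));

    IAccessible *pAcc = NULL;
    HRESULT hr = AccessibleObjectFromWindow(hWnd, OBJID_WINDOW,
                                            IID_IAccessible, (void **)&pAcc);
    while (SUCCEEDED(hr))
    {
        VARIANT v;
        VariantInit(&v);
        hr = pAcc->accHitTest(pt.x, pt.y, &v);
        if (FAILED(hr) || v.vt == VT_EMPTY)
            break;                          // point is outside this object

        if (v.vt == VT_DISPATCH)            // hit a child object: descend
        {
            IAccessible *pChild = NULL;
            hr = v.pdispVal->QueryInterface(IID_IAccessible, (void **)&pChild);
            v.pdispVal->Release();
            if (FAILED(hr))
                break;
            pAcc->Release();
            pAcc = pChild;
            continue;
        }

        // VT_I4: CHILDID_SELF or a simple element; this is the leaf.
        *ppAcc = pAcc;
        *pvChild = v;
        SetThreadDpiAwarenessContext(prevCtx);   // revert before returning
        return S_OK;
    }
    if (pAcc)
        pAcc->Release();
    SetThreadDpiAwarenessContext(prevCtx);
    return FAILED(hr) ? hr : E_FAIL;
}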
If you are on Windows 8.1 to 10.0.10586 you have a tougher task.
Instead of calling accHitTest as above, you have to recursively call AccessibleChildren and call IAccessible::accLocation on each child to determine whether your test point is within it. This is tricky and starts to get really messy when you get to e.g. combo boxes in products like Office, which is only system DPI aware.
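A sketch of that enumeration (HitTestByLocation is my name; simplified: it does not release the sibling VARIANTs it skips, and it ignores the extra scaling needed for system-DPI-aware apps):

#include <windows.h>
#include <oleacc.h>
#include <vector>
#pragma comment(lib, "oleacc.lib")

bool HitTestByLocation(IAccessible *pAcc, POINT pt,
                       IAccessible **ppHit, VARIANT *pvChild)
{
    LONG count = 0, fetched = 0;
    if (FAILED(pAcc->get_accChildCount(&count)) || count == 0)
        return false;

    std::vector<VARIANT> kids(count);
    if (FAILED(AccessibleChildren(pAcc, 0, count, kids.data(), &fetched)))
        return false;

    for (LONG i = 0; i < fetched; ++i)
    {
        LONG x = 0, y = 0, w = 0, h = 0;

        if (kids[i].vt == VT_I4)    // simple element: parent owns its rect
        {
            if (SUCCEEDED(pAcc->accLocation(&x, &y, &w, &h, kids[i])) &&
                pt.x >= x && pt.x < x + w && pt.y >= y && pt.y < y + h)
            {
                pAcc->AddRef();
                *ppHit = pAcc;
                *pvChild = kids[i];
                return true;
            }
        }
        else if (kids[i].vt == VT_DISPATCH)
        {
            IAccessible *pChild = NULL;
            if (FAILED(kids[i].pdispVal->QueryInterface(
                    IID_IAccessible, (void **)&pChild)))
                continue;

            VARIANT self;
            self.vt = VT_I4;
            self.lVal = CHILDID_SELF;
            if (SUCCEEDED(pChild->accLocation(&x, &y, &w, &h, self)) &&
                pt.x >= x && pt.x < x + w && pt.y >= y && pt.y < y + h)
            {
                if (!HitTestByLocation(pChild, pt, ppHit, pvChild))
                {
                    *ppHit = pChild;    // deepest hit is this child itself
                    *pvChild = self;
                }
                else
                {
                    pChild->Release();
                }
                return true;
            }
            pChild->Release();
        }
    }
    return false;
}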
That’s all I can give you for now.
To do it successfully across OS versions (mine has to work from Vista to Windows-Current), the only really safe bet is to write a wrapper DLL in C++ that can determine at runtime which OS it is on and change code path accordingly. The reason you want to do it in C++ is to avoid passing IAccessible objects across the .NET/unmanaged marshalling boundary. You can call IUnknown::Release on objects you don't need to return on the unmanaged side. You can do it all in .NET, but it will be slow.
P.S. Also watch out for Chrome returning infinite trees where parents are children of their parents; some sanity checks are required. Also, Chrome does not return accRole correctly, and will give you HTML tags instead of VT_I4.
Good luck
A fairly workable solution is as follows, in your IAccessible recursive function:
Use GetWindowRect to capture the physical right edge of the main window.
Use accChild.accLocation in a loop to capture the left and width of each object.
Add this simple test:
If l > rct2r.Right And l > arrIACC.x2 Then
arrIACC.x2 = l + w
End If
If the DPI is 100%, no object is further out than the physical right edge.
If the DPI is greater than 100%, the close button is offset by some x pixels.
Use the difference to rescale all of the width values you use:
arrIACC.w1 = CInt(((-rct2r.Left + arrIACC.w1) / arrIACC.x2) * rct2r.Right)
This solution is from an Excel plugin I have developed; I was testing the width of the quick access toolbar (QAT), and my result was within ±5 pixels regardless of DPI.
In Windows, the CreateFontIndirect() call can silently substitute a compatible font if the requested font is not available. The GetObject() call does not reflect this substitution; it returns the same LOGFONT passed in. How can I find what font was actually created? Alternatively, how can I force Windows to only return the exact font requested?
In Windows, the CreateFontIndirect() call can silently substitute a compatible font if the requested font is not available. The GetObject() call does not reflect this substitution; it returns the same LOGFONT passed in.
It's not CreateFontIndirect that's doing the substitution. The substitution happens when the font is selected into the DC. CreateFontIndirect just gives you a handle to an internal copy of the LOGFONT. That's why GetObject gives you the same LOGFONT back.
How can I find what font was actually created?
If you select the HFONT into the target DC, you can then ask the DC for the information about the font that was actually chosen as the best match to the LOGFONT.
The face name is available with GetTextFace.
You can get metrics with GetTextMetrics.
If the selected font is TrueType or OpenType, you can get additional metrics with GetOutlineTextMetrics.
That essentially tells you what font was actually created.
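A sketch of all three calls together (ReportActualFont is my name):

#include <windows.h>
#include <stdlib.h>

void ReportActualFont(HDC hdc, HFONT hFont)
{
    HFONT hOld = (HFONT)SelectObject(hdc, hFont);

    TCHAR face[LF_FACESIZE];
    GetTextFace(hdc, LF_FACESIZE, face);   // face name actually chosen

    TEXTMETRIC tm;
    GetTextMetrics(hdc, &tm);              // metrics of the mapped font

    // For TrueType/OpenType fonts only: returns 0 for raster fonts.
    UINT cb = GetOutlineTextMetrics(hdc, 0, NULL);
    if (cb != 0)
    {
        OUTLINETEXTMETRIC *potm = (OUTLINETEXTMETRIC *)malloc(cb);
        if (potm && GetOutlineTextMetrics(hdc, cb, potm))
        {
            // potm->otmpFamilyName etc. are offsets into this buffer
        }
        free(potm);
    }

    SelectObject(hdc, hOld);
}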
Aside:
When doing something like print preview, you can start with a LOGFONT, select it into the printer DC (or IC), grab the details of the actual font (printers often substitute fonts), and then create a new LOGFONT that's more representative of the actual font. Select that into the screen DC and, with appropriate size conversions, get a pretty good match of what the user will actually see.
To get the appropriate font on different language versions of the OS,
call EnumFontFamiliesEx with the desired font characteristics in the
LOGFONT structure, then retrieve the appropriate typeface name and
create the font using CreateFont or CreateFontIndirect.
While it's not a universal way to get the actual font name from an HFONT, you can check beforehand what CreateFontIndirect would (most likely) return.
Judging from how MSDN suggests this as a good solution for getting a font family from a set of attributes, it seems to mirror the way Windows internally performs the substitution.
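For instance, a sketch that uses this to check whether the requested family is installed at all, i.e. whether CreateFontIndirect would have to substitute (FontFamilyExists and FoundProc are my names):

#include <windows.h>

static int CALLBACK FoundProc(const LOGFONT *plf, const TEXTMETRIC *ptm,
                              DWORD fontType, LPARAM lParam)
{
    *(BOOL *)lParam = TRUE;   // at least one face matched the request
    return 0;                 // stop enumerating
}

BOOL FontFamilyExists(const LOGFONT *pRequested)
{
    LOGFONT lf = *pRequested;   // lfFaceName and lfCharSet do the filtering
    lf.lfPitchAndFamily = 0;    // must be zero for EnumFontFamiliesEx

    BOOL found = FALSE;
    HDC hdc = GetDC(NULL);
    EnumFontFamiliesEx(hdc, &lf, FoundProc, (LPARAM)&found, 0);
    ReleaseDC(NULL, hdc);
    return found;
}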
I have a cursor with a size of 128x128, but when I use LoadCursor to load and show it, it is only 32x32. Which API will load it at the correct size? It seems Windows resizes it. Thanks.
Windows XP does not include any system cursors that are larger than 32x32. (If larger cursors were included, they would be stretched down to 32x32 when the standard APIs load the cursors.)
For high-DPI systems, Windows XP has adjusted the SM_CXCURSOR and SM_CYCURSOR values to be 64x64 pixels. This size adjustment is to prevent the mouse pointer from virtually disappearing because it is too small to be effectively used. Although the other aspects of the system scale with DPI, the mouse pointer does not scale. Microsoft does not try to enforce a DPI-independent size for the mouse pointer.
The system also provides the SetSystemCursor API function that you can use to change the system cursor for specific categories. You can use this function to set a cursor of any size. However, you must call the function programmatically, and you can only use it to set a cursor for a specific category. You cannot use it to make all cursors on the system the same size.
http://support.microsoft.com/kb/307213
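A sketch of that call (big.cur is a placeholder path; note that the OCR_* category constants require OEMRESOURCE to be defined before windows.h, and that SetSystemCursor takes ownership of, and destroys, the handle you pass):

#define OEMRESOURCE           // for the OCR_* cursor categories
#include <windows.h>

// Replace the standard arrow with a custom-size cursor loaded from a file.
HCURSOR hCur = (HCURSOR)LoadImage(NULL, TEXT("big.cur"), IMAGE_CURSOR,
                                  0, 0, LR_LOADFROMFILE);
if (hCur)
    SetSystemCursor(hCur, OCR_NORMAL);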
Don't use LoadCursor, use LoadImage() instead.
SM_CXCURSOR by SM_CYCURSOR is the only cursor size the system can currently use.
Use GetSystemMetrics to find out those values.
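A sketch of the LoadImage call (IDC_BIGCURSOR is a placeholder resource ID):

// Load the cursor resource at its stored size instead of letting
// LoadCursor scale it to the system cursor size.
HCURSOR hCur = (HCURSOR)LoadImage(GetModuleHandle(NULL),
                                  MAKEINTRESOURCE(IDC_BIGCURSOR),
                                  IMAGE_CURSOR,
                                  0, 0,     // 0,0 = use the resource's size
                                  LR_DEFAULTCOLOR);
int cx = GetSystemMetrics(SM_CXCURSOR);    // size the system will display
int cy = GetSystemMetrics(SM_CYCURSOR);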