How to find the width of a fixed-pitch font [duplicate] - winapi

I'm maintaining an application using Borland C++ Builder 6 running on Windows 7.
The application is incorrectly drawing text in the Courier New font: each letter is being slightly cut off. The issue is that GetTextMetrics fills the TEXTMETRIC struct with differing tmAveCharWidth and tmMaxCharWidth values. The application then uses tmAveCharWidth to calculate the character width, which is wrong because that value can be less than tmMaxCharWidth. That part I will be fixing.
I'm curious why GetTextMetrics returns differing tmAveCharWidth and tmMaxCharWidth values for Courier New. My understanding was that Courier New is a monospaced font, so tmAveCharWidth and tmMaxCharWidth should be the same. I tested with other monospaced fonts, and for those the assumption holds.
This is the section of code with the issue:
hFont = CreateFontIndirect(&lpInstData->lf);
hDC = GetDC(hWnd);
hFontOld = (HFONT)SelectObject(hDC, hFont);
GetTextMetrics(hDC, &tm);
lpInstData->nCharHeight = tm.tmHeight;
lpInstData->nCharWidth = tm.tmAveCharWidth; // <-- should be using tmMaxCharWidth
Here is the data from a run where I selected 12-point Courier New (screenshots omitted): the LOGFONT parameter passed to CreateFontIndirect, and the TEXTMETRIC structure returned from GetTextMetrics.

I found this was indeed ClearType at work (thanks Deanna). Turning off ClearType corrects the display issue without changing any code, although I still need to correct how the application works with ClearType.
I also found the issue was not present on Windows XP because ClearType is turned off by default, whereas in Windows 7 (and Vista) it is turned on by default.
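For the record, here is a minimal sketch of the metric fix, reusing the variables from the snippet above. The cross-check against a measured glyph extent is my own precaution, not something the documentation prescribes:
// Sketch only: prefer tmMaxCharWidth over tmAveCharWidth, and sanity-check
// it against the measured extent of one wide glyph ('W' is an arbitrary choice).
SIZE sz = {0};
GetTextExtentPoint32(hDC, TEXT("W"), 1, &sz);
lpInstData->nCharWidth = max(tm.tmMaxCharWidth, sz.cx);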

Differences between GetDC() and BeginPaint()?

I am having trouble with some of my owner-drawn listboxes in a dialog box on high-DPI monitors under Windows 10: the text is chopped off at the bottom. We saw the problem on Windows 7 and were able to fix it. It is not high DPI as such, but occurs when the user sets a different text scaling. I solved the problem, or so I thought (!), by using a CClientDC (a wrapper around GetDC()) and calling GetTextMetrics() to determine the text height. Previously our icons had always been taller than our text, so it was not a problem. With larger-DPI monitors, we saw some customers reporting problems when they scaled the text.
Now we are getting new reports under Windows 10. The former problem is fine under Windows 7, but Windows 7 only scales to 100, 125, and 150 percent. Windows 10 (and maybe 8? No customer reports there) allows user-defined scaling.
So, I tracked down the problem somewhat. I knew what the font height was when I called GetTextMetrics() during WM_MEASUREITEM, so I put some code in to log what GetTextMetrics() returned during WM_DRAWITEM. They were different: 20 pixels high during WM_MEASUREITEM, and 25 pixels high during WM_DRAWITEM. Obviously that is a problem; I want GetTextMetrics() to return the same results in both places.
The only real difference I could think of was that during WM_MEASUREITEM I am calling GetDC() via the CClientDC constructor, whereas during WM_DRAWITEM I am using an already constructed HDC (which probably came from a BeginPaint() call inside a system DLL).
I thought maybe BeginPaint() does something like select the window's HFONT into the HDC...
So, inside my WM_MEASUREITEM after getting the DC, I select the font of the listbox into the HDC, and then I call GetTextMetrics(). Lo and behold, the numbers match now in WM_MEASUREITEM and WM_DRAWITEM.
However, I don't know if I just got lucky. It's all just guesswork at this point.
Does BeginPaint() select the window font into the DC whereas GetDC() does not? Does the default handler of WM_PAINT for an owner drawn LISTBOX or COMBOBOX do something like select the window font into the paint DC?
BOOL DpiAwareMeasureGraphItem(LPMEASUREITEMSTRUCT lpM, CWnd* pWnd)
{
    int iItemHeight = INTERG_BITMAP_HEIGHT + 4;
    if (pWnd)
    {
        CClientDC dc(pWnd);
        if (dc.GetSafeHdc())
        {
            CFont* pOldFont = dc.SelectObject(pWnd->GetFont()); // seems to fix it on Windows 10, but is it luck?
            TEXTMETRIC tm;
            memset(&tm, 0, sizeof(tm));
            dc.GetTextMetrics(&tm);
            LONG tmHeight = tm.tmHeight + 4; // pad
            iItemHeight = max(iItemHeight, tmHeight);
            dc.SelectObject(pOldFont);
        }
    }
    lpM->itemHeight = iItemHeight;
    return TRUE;
}
Neither GetDC() nor BeginPaint() initialises the DC it returns with anything other than the default system font. But WM_DRAWITEM is different: it gives you an already-initialised DC to draw into.
The method you stumbled across is the right one. WM_MEASUREITEM doesn't supply a DC at all, so if you need one for size calculations, you're responsible for obtaining it and setting it up with the appropriate font.
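For a non-MFC window procedure, the same idea might look like the sketch below. It assumes the control already has its font set; WM_GETFONT can return NULL early in dialog creation, in which case this falls back to a stock font:
case WM_MEASUREITEM:
{
    LPMEASUREITEMSTRUCT lpmis = (LPMEASUREITEMSTRUCT)lParam;
    HWND hCtl = GetDlgItem(hWnd, (int)lpmis->CtlID);
    HDC hdc = GetDC(hCtl);
    // Select the control's own font; a freshly fetched DC starts with the
    // default system font, which is what caused the mismatch.
    HFONT hFont = (HFONT)SendMessage(hCtl, WM_GETFONT, 0, 0);
    HFONT hOld = (HFONT)SelectObject(hdc, hFont ? hFont
                          : (HFONT)GetStockObject(DEFAULT_GUI_FONT));
    TEXTMETRIC tm = {0};
    GetTextMetrics(hdc, &tm);
    lpmis->itemHeight = tm.tmHeight + 4; // same padding as the MFC code above
    SelectObject(hdc, hOld);
    ReleaseDC(hCtl, hdc);
    return TRUE;
}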

Why does Core Text return Myriad Pro Semibold when requesting a bold version of Myriad Pro

I have the common Adobe Myriad Pro fonts installed. These include Myriad Pro Regular, Myriad Pro Bold and Myriad Pro Semibold. Assume that I have a CTFontRef baseFont that points to Myriad Pro Regular, and that the font size I desire is size. I run the following code:
CTFontRef boldFont = CTFontCreateCopyWithSymbolicTraits(baseFont, size, NULL, kCTFontBoldTrait, kCTFontBoldTrait);
The returned font is Myriad Pro Semibold, not Myriad Pro Bold.
Is there a way of coercing this to return Myriad Pro Bold instead, other than requesting the named style 'Bold'? I wanted to keep this code entirely generic without hard-wiring style names.
I have tried this in various permutations, including passing the bold trait as part of an attribute dictionary when I initially create my font, avoiding the two-step process described here, but it still returns the semibold font in preference to the normal bold. I've also poked around the fonts themselves a little. The full bold font has a weight of 700 in its OS/2 table, and the semibold font has a weight of 600. The PANOSE weights correspond with this. However, the macStyle fields in the head table of the semibold and bold fonts both have the bold flag set, so presumably this is what Core Text is using. But is there any way to make it more discriminating?
Based on a reading of the documentation, backed up by some knowledge of font handling in general but not Core Text specifically, I'd say it may be possible, but it's not straightforward.
The CTFontCreateCopyWithSymbolicTraits() documentation specifies that the symTraitValue and symTraitMask parameters have type CTFontSymbolicTraits. The CTFontDescriptor documentation defines the "Bold" value you are using as
kCTFontBoldTrait = (1 << 1)
So this is clearly a boolean trait. However, as you've seen, font weight is a spectrum, not a boolean trait, even though decades of "bold" buttons in word processor UIs have presented it as a boolean trait. CTFontCreateCopyWithSymbolicTraits() doesn't have the expressive power you need.
One other approach which might work is to try calling CTFontDescriptorCreateMatchingFontDescriptors(). You pass this function a CTFontDescriptorRef to an initial font, and a CFSetRef with attributes which must be present. This function returns an array of font descriptors, all of which match the attributes you requested.
So, you could pass it a CTFontDescriptorRef for Myriad Pro Regular, and maybe a CFSetRef saying you want bold, and then look through every font descriptor in the returned array to find the one with the heaviest weight.
I haven't written this code, and my ignorance of Core Text means I may be missing something, but that seems like a plausible approach.
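To make that concrete, here is an untested sketch of the approach. The variable names are mine, `size` comes from the question's context, and error handling and most CFRelease calls are trimmed:
#include <CoreText/CoreText.h>

// Untested sketch: enumerate all faces of the family, then pick the
// descriptor whose kCTFontWeightTrait is largest.
CFStringRef keys[] = { kCTFontFamilyNameAttribute };
CFTypeRef values[] = { CFSTR("Myriad Pro") };
CFDictionaryRef attrs = CFDictionaryCreate(kCFAllocatorDefault,
    (const void**)keys, (const void**)values, 1,
    &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
CTFontDescriptorRef base = CTFontDescriptorCreateWithAttributes(attrs);
CFSetRef mandatory = CFSetCreate(kCFAllocatorDefault,
    (const void**)keys, 1, &kCFTypeSetCallBacks);
CFArrayRef matches = CTFontDescriptorCreateMatchingFontDescriptors(base, mandatory);

CTFontDescriptorRef heaviest = NULL;
float maxWeight = -2.0f; // weights run from -1.0 to 1.0
for (CFIndex i = 0; matches && i < CFArrayGetCount(matches); ++i) {
    CTFontDescriptorRef d = (CTFontDescriptorRef)CFArrayGetValueAtIndex(matches, i);
    CFDictionaryRef traits = (CFDictionaryRef)
        CTFontDescriptorCopyAttribute(d, kCTFontTraitsAttribute);
    if (traits) {
        CFNumberRef n = (CFNumberRef)CFDictionaryGetValue(traits, kCTFontWeightTrait);
        float w = 0.0f;
        if (n && CFNumberGetValue(n, kCFNumberFloatType, &w) && w > maxWeight) {
            maxWeight = w;
            heaviest = d;
        }
        CFRelease(traits);
    }
}
if (heaviest) {
    CTFontRef boldFont = CTFontCreateWithFontDescriptor(heaviest, size, NULL);
    // ... use boldFont, then CFRelease it
}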
For the CTFontDescriptor you can specify the attribute kCTFontTraitsAttribute, which should be a CFDictionaryRef in which you can specify kCTFontWeightTrait. That takes a CFNumberRef representing a floating-point value between -1 and 1, giving you a spectrum of weights, with 1 being the boldest variant and 0 the regular/medium.
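Something along these lines, as an untested sketch; the 0.62 value is a guess at a "bold-ish" weight, not a documented constant:
// Untested sketch: request a heavy weight directly via the traits dictionary.
float weight = 0.62f; // assumed value; -1.0 is lightest, 1.0 is boldest
CFNumberRef weightNum = CFNumberCreate(kCFAllocatorDefault, kCFNumberFloatType, &weight);
CFStringRef tKeys[] = { kCTFontWeightTrait };
CFTypeRef tVals[] = { weightNum };
CFDictionaryRef traits = CFDictionaryCreate(kCFAllocatorDefault,
    (const void**)tKeys, (const void**)tVals, 1,
    &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
CFStringRef aKeys[] = { kCTFontFamilyNameAttribute, kCTFontTraitsAttribute };
CFTypeRef aVals[] = { CFSTR("Myriad Pro"), traits };
CFDictionaryRef attrs = CFDictionaryCreate(kCFAllocatorDefault,
    (const void**)aKeys, (const void**)aVals, 2,
    &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
CTFontDescriptorRef desc = CTFontDescriptorCreateWithAttributes(attrs);
CTFontRef font = CTFontCreateWithFontDescriptor(desc, size, NULL);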

Why doesn't FONTSIGNATURE reflect lfCharSet?

I'm enumerating Windows fonts like this:
LOGFONTW lf = {0};
lf.lfCharSet = DEFAULT_CHARSET;
lf.lfFaceName[0] = L'\0';
lf.lfPitchAndFamily = 0;
::EnumFontFamiliesEx(hdc, &lf,
                     reinterpret_cast<FONTENUMPROCW>(FontEnumCallback),
                     reinterpret_cast<LPARAM>(this), 0);
My callback function has this signature:
int CALLBACK FontEnumerator::FontEnumCallback(const ENUMLOGFONTEX* pelf,
                                              const NEWTEXTMETRICEX* pMetrics,
                                              DWORD font_type,
                                              LPARAM context);
For TrueType fonts, I typically get each face name multiple times. For example, for multiple calls, I'll get pelf->elfFullName and pelf->elfLogFont.lfFaceName set as "Arial". Looking more closely at the other fields, I see that each call is for a different script. For example, on the first call pelf->elfScript will be "Western" and pelf->elfLogFont.lfCharSet will be the numeric equivalent of ANSI_CHARSET. On the second call, I get "Hebrew" and HEBREW_CHARSET. Third call "Arabic" and ARABIC_CHARSET. And so on. So far, so good.
But the font signature (pMetrics->ntmFontSig) field for all versions of Arial is identical. In fact, the font signature claims that all of these versions of Arial support Latin-1, Hebrew, Arabic, and others.
I know the character sets of the strings I'm trying to draw, so I'm trying to instantiate an appropriate font based on the font signatures. Because the font signatures always match, I always end up selecting the "Western" font, even when displaying Hebrew or Arabic text. I'm using low level Uniscribe APIs, so I don't get the benefit of Windows font linking, and yet my code seems to work.
Does lfCharSet actually carry any meaning or is it a legacy artifact? Should I just set lfCharSet to DEFAULT_CHARSET and stop worrying about all the script variations of each face?
For my purposes, I only care about TrueType and OpenType fonts.
I think I found the answer. Fonts that get enumerated multiple times are "big" fonts. Big fonts are single fonts that include glyphs for multiple scripts or code pages.
The Unicode portion of the FONTSIGNATURE (fsUsb) represents all the Unicode subranges that the font can handle. This is independent of the character set. If you use the wide character APIs, you can use all the included glyphs in the font, regardless of which character set was specified when you create the font.
The code page portion of the FONTSIGNATURE (fsCsb) represents the code pages that the font can handle. I believe this is only significant when the font is not a "big" font. In that case, the fsUsb masks will be all zeros, and the fsCsb will specify the appropriate character set(s). In those cases, it's important to get the lfCharSet correct in the LOGFONT.
When instantiating a "big" font and using the wide character APIs, it apparently doesn't matter which lfCharSet you specify.
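To illustrate, here is roughly how the signature can be inspected inside the callback from the question. This is a sketch; FS_HEBREW is one of the real FS_* code-page bits from wingdi.h:
int CALLBACK FontEnumerator::FontEnumCallback(const ENUMLOGFONTEX* pelf,
                                              const NEWTEXTMETRICEX* pMetrics,
                                              DWORD font_type,
                                              LPARAM context)
{
    if (font_type & TRUETYPE_FONTTYPE) {
        const FONTSIGNATURE& sig = pMetrics->ntmFontSig;
        // fsUsb: four 32-bit masks of supported Unicode subranges.
        // Non-zero masks indicate a "big" font.
        bool bigFont = sig.fsUsb[0] || sig.fsUsb[1] || sig.fsUsb[2] || sig.fsUsb[3];
        // fsCsb: supported code pages; meaningful mainly for non-"big" fonts.
        bool hebrewCodePage = (sig.fsCsb[0] & FS_HEBREW) != 0;
        // ... record whatever you need from pelf and sig here
    }
    return 1; // non-zero continues the enumeration
}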

How to draw vertical text in Windows GUI?

I need to draw a column of vertical text (in Japanese, which is drawn top-to-bottom instead of left-to-right) in my native C++ Win32 GUI application. I've looked through MSDN and only found how to draw right-to-left text.
How do I output top-to-bottom text, other than by drawing each character separately?
The straight Win32 API has no way to draw unrotated vertical text with an arbitrary font except one character at a time.
You can do more complex text output with GDI+, but that probably isn't what you want either, since the text will run vertically but the characters will also be rotated.
Similarly, you can use CreateFont with an lfEscapement value of 900 or 2700 to get rotated text, but this will rotate everything. So that doesn't help either.
To do Japanese top-to-bottom drawing, you want the characters to be unrotated, but the placement of each character to advance in Y rather than in X. Windows has no API that does this for all fonts (you can do right-to-left and left-to-right, but not top-to-bottom).
In theory creating a font with an Orientation of 900 and an escapement of 2700 would do what you want, but it appears that if you set the escapement, then the orientation is ignored for most fonts. It's possible that for Japanese fonts, this will work differently. It's worth spending some time to play with. (see the addendum for more information on this)
I think your best bet is probably a loop drawing one character at a time with ExtTextOut, which gives you full control over the placement of each character.
If you use ETO_OPAQUE when drawing the first character in a column, and not for the others, then you will be permitted to kern the characters vertically if you need to.
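A sketch of that loop, assuming the desired font is already selected into hdc:
// Draw text top-to-bottom, one unrotated character per "line".
void DrawTopToBottom(HDC hdc, int x, int y, const wchar_t* text)
{
    TEXTMETRIC tm;
    GetTextMetrics(hdc, &tm);
    int step = tm.tmHeight + tm.tmExternalLeading; // vertical advance per glyph
    for (int i = 0; text[i] != L'\0'; ++i)
        ExtTextOutW(hdc, x, y + i * step, 0, NULL, &text[i], 1, NULL);
}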
Addendum
Roygbiv points to an interesting article which says that fonts whose names begin with an @ behave differently from other fonts when you create a font with an lfEscapement value of 2700. These special fonts produce upright characters while still advancing down the page. So while there is no way to do this for arbitrary fonts, you may be able to get it working using certain fonts.
Options for Displaying Text
Out of curiosity, I wrote a small console app to enumerate fonts and list their names. My Windows Server 2003 machine has no fonts with names beginning with @, but my Windows 7 machine has a few. All seem to be Chinese fonts, though; I see no Japanese fonts in the default Windows 7 Ultimate install.
The correct answer is:
There are three methods to do this:
Using the Edit or RichEdit controls to render your text
Using the Uniscribe API
Using the TextOut function with a font face name that begins with an at sign (@).
Here is an article that discusses some of these approaches.
Fortunately, with Win32 you do not need to write code to rotate characters. To display text vertically on Windows 2000 and Windows XP, enumerate the available fonts as usual and select a font whose face name begins with the at sign (@). Then create a LOGFONT structure, setting both the escapement and the orientation to 270 degrees. Calls to TextOut are the same as for horizontal text.
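An untested sketch of that recipe; "@MS Gothic" is just an example face name, so use whatever @-prefixed font your enumeration finds:
LOGFONTW lf = {0};
lf.lfHeight = -24;
lf.lfEscapement = 2700;  // 270 degrees: advance down the page
lf.lfOrientation = 2700; // keep the glyphs upright
wcscpy_s(lf.lfFaceName, L"@MS Gothic"); // assumed vertical font
HFONT hFont = CreateFontIndirectW(&lf);
HFONT hOld = (HFONT)SelectObject(hdc, hFont);
TextOutW(hdc, x, y, L"縦書き", 3); // same call as for horizontal text
SelectObject(hdc, hOld);
DeleteObject(hFont);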
In Win32, use the lfEscapement member of a LOGFONT structure to define the rotation of a font:
LOGFONT LogFont = {0};
LogFont.lfEscapement = 900; // 90 degrees rotated text
// ... many more initializations
HFONT newFont = CreateFontIndirect(&LogFont);
SelectObject(hdc, newFont);
char tx[255];
strcpy(tx, "vertical text");
TextOutA(hdc, x, y, tx, strlen(tx)); // draw the rotated text
For more information, see the online help for the LOGFONT structure and the CreateFontIndirect function.
HFONT gui_font = CreateFont(-MulDiv(9, GetDeviceCaps(GetDC(hWnd), LOGPIXELSY), 72),
                            0,
                            900, // here
                            0,
                            FW_THIN, 0, 0, 0,
                            DEFAULT_CHARSET,
                            OUT_DEFAULT_PRECIS,
                            CLIP_DEFAULT_PRECIS,
                            DEFAULT_QUALITY, FF_MODERN | FIXED_PITCH,
                            L"Segoe UI");
Using lfEscapement (and if necessary lfOrientation) is superior in many ways to making the rectangle minimally wide (for instance, the Dutch word 'wij' would end up with the 'i' and 'j' next to each other, because their combined width is less than that of the 'w'), or to inserting a newline after each character.
The method this library uses sounds slow, but if you do want it, source code appears to be provided:
http://www.ucancode.net/faq/CDC-DrawText-Drawing-Vertical-Text.htm
You may also find this discussion useful - http://www.eggheadcafe.com/forumarchives/win32programmergdi/Aug2005/post23542233.asp - apparently you need a vertical font (one beginning with @) and the API will take care of the rest.
As a quick hack type of answer, what happens if you use a standard control (CEdit for instance) and insert a new-line after every character typed?
Just an idea:
Did you try using DrawText or DrawTextEx using a very narrow rectangle that just fits the widest character?

SystemParametersInfo behaves differently on Vista and XP

I am trying to find the default system font size using SystemParametersInfo() with SPI_GETNONCLIENTMETRICS.
While on Vista the LOGFONT structures inside the returned NONCLIENTMETRICS actually have the correct font height in lfHeight, when I run the exact same app on XP, lfHeight (and lfWidth) are always zero.
Why is that so, and what is the correct way to retrieve the font size on both systems?
Are you setting the cbSize member of NONCLIENTMETRICS to sizeof(NONCLIENTMETRICS)?
According to MSDN, you'll need a runtime version check, and you must subtract the size of the iPaddedBorderWidth member when running under Windows XP.
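An untested sketch of the size adjustment MSDN describes, for a binary built against a Vista-or-later SDK that may run on XP:
NONCLIENTMETRICS ncm = {0};
ncm.cbSize = sizeof(NONCLIENTMETRICS);
#if (WINVER >= 0x0600)
// The Vista SDK added iPaddedBorderWidth to the struct; XP expects the
// older, smaller size, so shrink cbSize when running there.
OSVERSIONINFO ovi = { sizeof(ovi) };
GetVersionEx(&ovi);
if (ovi.dwMajorVersion < 6)
    ncm.cbSize -= sizeof(ncm.iPaddedBorderWidth);
#endif
if (SystemParametersInfo(SPI_GETNONCLIENTMETRICS, ncm.cbSize, &ncm, 0))
{
    // e.g. ncm.lfMessageFont.lfHeight now holds the message font height
}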
I don't know the 100% correct answer, but according to MSDN the value of zero has a special meaning for both lfHeight and lfWidth:
This is taken from the MSDN Windows GDI documentation for the LOGFONT structure:
lfHeight: if 0, the font mapper uses a default height value when it searches for a match.
lfWidth: if zero, the aspect ratio of the device is matched against the digitization aspect ratio of the available fonts to find the closest match, determined by the absolute value of the difference.
