Correct combining character positions with HarfBuzz - text-rendering

I am trying to render text with Harfbuzz and a signed distance field atlas.
The code is basically this:
void drawText(const std::wstring &str, Vec2 pos)
{
    // Init HarfBuzz
    hb_buffer_t *hbBuf = hb_buffer_create();
    hb_buffer_set_direction(hbBuf, HB_DIRECTION_LTR);
    hb_buffer_set_script(hbBuf, HB_SCRIPT_LATIN);
    hb_buffer_set_language(hbBuf, hb_language_from_string("en", 2));
    // Process string
    hb_buffer_add_utf32(hbBuf, reinterpret_cast<const uint32_t*>(str.c_str()), -1, 0, -1);
    hb_shape(font.hb, hbBuf, nullptr, 0);
    // Display string
    unsigned int nbGlyphs;
    hb_glyph_info_t *glyphInfos = hb_buffer_get_glyph_infos(hbBuf, &nbGlyphs);
    hb_glyph_position_t *glyphPos = hb_buffer_get_glyph_positions(hbBuf, &nbGlyphs);
    for(unsigned int i = 0; i < nbGlyphs; i++)
    {
        Vec2 drawPos = pos + Vec2(glyphPos[i].x_offset, glyphPos[i].y_offset) / 64.f;
        drawGlyph(glyphInfos[i].codepoint, drawPos);
        pos.x += glyphPos[i].x_advance / 64.f;
        pos.y += glyphPos[i].y_advance / 64.f;
    }
}
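For context on the /64.f: with an hb_font_t created from a FreeType face, offsets and advances come back in FreeType's 26.6 fixed-point format, i.e. units of 1/64 pixel. Here is a minimal, self-contained model of the pen logic above (GlyphPos is my stand-in for hb_glyph_position_t, not a HarfBuzz type):

```cpp
// 26.6 fixed point: 26 integer bits, 6 fractional bits, units of 1/64 pixel.
static float fromF26Dot6(int v) { return static_cast<float>(v) / 64.0f; }

struct Pen { float x = 0.0f, y = 0.0f; };

// Stand-in for the hb_glyph_position_t fields used above.
struct GlyphPos { int x_offset, y_offset, x_advance, y_advance; };

// A glyph is drawn at pen + offset; only the advance moves the pen.
static Pen advancePen(Pen pen, const GlyphPos &gp, float *drawX, float *drawY) {
    *drawX = pen.x + fromF26Dot6(gp.x_offset);
    *drawY = pen.y + fromF26Dot6(gp.y_offset);
    pen.x += fromF26Dot6(gp.x_advance);
    pen.y += fromF26Dot6(gp.y_advance);
    return pen;
}
```

Note that a combining mark typically has a zero advance and a negative x_offset that reaches back over the base glyph, so the offset must never be added into the running pen position.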
The text looks correctly shaped for an English phrase, but when I test it with diacritics, they look misplaced.
I am testing it with aâa aâ̈a bb̂b bb̂̈b bb̧b bb͜b bb︠︡b. The Unicode string does not contain precomposed characters. HarfBuzz substitutes the precomposed character â, which makes that one look right. Most of the other diacritics are off.
Text with diacritics on the left of where they should be
When I multiply x_offset by 0.5, the combining characters are placed better: the accents and the cedilla are at the right x position. But the accents do not stack and sit too low on the b, and the tie (U+035C, COMBINING DOUBLE BREVE BELOW) should join the last two b's instead of being centered on the second b.
I also tried U+FE20 and U+FE21 on the previous group of b's. In my tests, U+FE21 lands on the second b, but it looks like it should be on the third.
Test with glyphPos[i].x_offset * 0.5f, better but still wrong
I tried several fonts, but of those, only NotoSansDisplay-Regular.ttf had the combining characters. I did not manage to make a program display that string as expected on my Debian system (testing, with HarfBuzz 2.6.4-1).
On Windows, I got better results. Here is what I expect: the accents are stacked and the combining double breve below is in the right place; the cedilla is still off.
Text rendering closer to what I expect
Am I doing something wrong with HarfBuzz, or am I testing niche cases that HarfBuzz does not support yet?
EDIT:
The actual cause was not in the code shown above.
I loaded a font with FreeType FT_New_Face then created a hb_font_t with hb_ft_font_create.
For every string drawn, I called FT_Set_Pixel_Sizes but kept that hb_font_t.

You should try shaping the same text and font with hb-view / hb-shape. That would help you narrow down where the problem is. I'm making a wild guess that the problem is in how / whether you are accounting for glyph origin in your atlas.

Create a new hb_font_t with hb_ft_font_create every time the font size is changed with FT_Set_Pixel_Sizes. (Alternatively, if your HarfBuzz version provides it, calling hb_ft_font_changed on the existing hb_font_t tells HarfBuzz that the underlying FT_Face has changed.)


How to make a menu with this effect in unity3d?

Sorry, I'll try to explain better.
I want to make a sliding menu where you can select a character, and the character at the center should increase in size so you know it is the currently selected one. You can see this effect in Crossy Road's character-select screen.
Sorry, but I can't upload an image because I am new to the forum.
I think I might be able to help you without needing too much borrowed code. There are two possibilities here:
You have a perspective camera so the "selected" item can just be closer to the camera.
You have an orthographic camera so you will have to scale things.
For perspective:
List<GameObject> characters; // contains all characters
int selectedIndex = 0;       // index of the selected character
float spacing = 10;          // space between characters

void Update()
{
    if (Input.GetKeyDown(KeyCode.RightArrow))
    {
        selectedIndex++;
        ApplyChanges();
    }
    if (Input.GetKeyDown(KeyCode.LeftArrow))
    {
        selectedIndex--;
        ApplyChanges();
    }
}

void ApplyChanges()
{
    // Wrap the selected index into range, then space the characters.
    // Note: a plain % would go negative after decrementing past 0.
    selectedIndex = (selectedIndex % characters.Count + characters.Count) % characters.Count;
    SpaceCharacters();
}

void SpaceCharacters()
{
    for (int i = 0; i < characters.Count; ++i)
    {
        // Characters left of the selection get a negative offset and are
        // placed to the left; likewise to the right for positive offsets.
        int offset = i - selectedIndex;
        characters[i].transform.position = new Vector3(offset * spacing, 0, 0);
    }
    // Move the selected character closer to the camera.
    characters[selectedIndex].transform.position = new Vector3(0, 0, spacing);
}
For an orthographic camera you will need to set the selected character's transform.localScale to a larger vector.
This won't animate anything or look cool. This code will just snap your characters into position.
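One caveat on the index wrap-around with the modulo above: in C# as in C and C++, % truncates toward zero, so decrementing past 0 produces a negative index that will throw when used to index the list. A small C++ illustration of the usual fix (wrapIndex is my own helper name):

```cpp
// In C, C++, C# and Java, -1 % n is -1 (division truncates toward zero),
// so an index that can go negative needs the add-then-mod pattern.
static int wrapIndex(int i, int count) {
    return ((i % count) + count) % count;
}
```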
The solution I adopted was to attach objects to transparent buttons in a scroll rect, so as to manage 3D objects with the convenience of the scroll rect interface.
Here is the official documentation for using a scroll rect: http://docs.unity3d.com/Manual/script-ScrollRect.html
Maybe my assets can serve you ;)
https://www.assetstore.unity3d.com/en/#!/content/60233

setLineWidth works differently on different test machines

I am making a game in which I draw a Bézier curve like this:
final VertexBufferObjectManager vbom = engine.getVertexBufferObjectManager();
final HighPerformanceMeshVertexBufferObject pMeshVBOM = new HighPerformanceMeshVertexBufferObject(vbom, pBufferData, pBufferData.length, DrawType.DYNAMIC, true, Mesh.VERTEXBUFFEROBJECTATTRIBUTES_DEFAULT);
final HighPerformanceLineChainVertexBufferObject pLeftCurbLineChainVBOM = new HighPerformanceLineChainVertexBufferObject(vbom, triangleCount * 3, DrawType.DYNAMIC, true, leftCurb.VERTEXBUFFEROBJECTATTRIBUTES_DEFAULT);
final HighPerformanceLineChainVertexBufferObject pRightCurbLineChainVBOM = new HighPerformanceLineChainVertexBufferObject(vbom, triangleCount * 3, DrawType.DYNAMIC, true, rightCurb.VERTEXBUFFEROBJECTATTRIBUTES_DEFAULT);

leftCurb = new LineStrip(0, 0, 10f, triangleCount, pLeftCurbLineChainVBOM) {
    @Override
    protected void onManagedUpdate(final float pSecondsElapsed) {
        super.onManagedUpdate(pSecondsElapsed);
        drawByBezier(curveOffset);
    }

    void drawByBezier(float curveOffset) {
        for (int triangleIndex = 0; triangleIndex < triangleCount; triangleIndex++) {
            this.setX(triangleIndex, getBezierX(triangleIndex, -curveBottom, -curveControlPoint, -curveTop + curveOffset));
            this.setY(triangleIndex, triangleIndex * heightIncrement);
        }
    }
};
By changing the value of curveOffset I change the shape of the curve.
The 10f argument to LineStrip is the line width. When I test on a Galaxy S5 (Android 5), the line is drawn about 2 pixels wide, and with a lower value like 1.5f the drawn line is very thin. On the other hand, large values like 100f do nothing: the line stays at the same (small) width.
I tested this on a Galaxy S3 mini (Android 4.1.2) and the line width works there (performance is another matter, though); the line is drawn as I intended. How can I get that on the Galaxy S5? It looks like a device- or OS-specific problem (OpenGL version?), but is there any way to overcome it?
OpenGL ES implementations do not have to support drawing of wide lines. You can query the range of available line widths with:
float[] range = new float[2];
GLES20.glGetFloatv(GLES20.GL_ALIASED_LINE_WIDTH_RANGE, range, 0);
// range[0] is the minimum supported line width.
// range[1] is the maximum supported line width.
This gives you the range supported by the specific device you're running on. Compliant implementations can have a maximum as low as 1.0. This means that you cannot use wide lines if you want your code to run on all devices.
If you want something that has the appearance of wide lines, and will work on any device, you have to draw polygons. You can draw something that looks like a line as a thin quad that is oriented towards the viewer.
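As a sketch of that thin-quad idea (the names are mine, not from any library): offset both endpoints of the segment along its unit normal by half the desired width, which gives the four corners of a width-preserving quad.

```cpp
#include <cmath>

struct V2 { float x, y; };

// Expand segment (a, b) into a quad of the given width by offsetting both
// endpoints along the segment's unit normal. Corners are emitted in
// triangle-strip order: a+n, a-n, b+n, b-n.
static void lineToQuad(V2 a, V2 b, float width, V2 out[4]) {
    float dx = b.x - a.x, dy = b.y - a.y;
    float len = std::sqrt(dx * dx + dy * dy);
    float nx = -dy / len, ny = dx / len;       // unit normal
    float hx = nx * width * 0.5f, hy = ny * width * 0.5f;
    out[0] = { a.x + hx, a.y + hy };
    out[1] = { a.x - hx, a.y - hy };
    out[2] = { b.x + hx, b.y + hy };
    out[3] = { b.x - hx, b.y - hy };
}
```

For a line strip you would additionally want to join or miter consecutive quads, but this is the core of the technique.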

Unwanted padding in CATextLayer

I am having trouble drawing a character in a CATextLayer such that the layer is exactly the size of the character.
I use the code below to get the size of the glyph corresponding to the character in the string. For the moment I am neglecting diacritics, so I am assuming a one-to-one correspondence between characters and glyphs.
I also get plausible bounding box values for several characters of the Helvetica font at 128 pt:
Character |   x  |   y   | width | height
B         |  9.4 |   0.0 |  70.8 |  91.8
y         |  1.3 | -27.4 |  61.2 |  96.0
I am not sure where the origin of the coordinate system in which these values are expressed is located. I assume (0,0) is at the far left, sitting vertically on the font's baseline; that is why 'y' has a negative y value.
I am using this code to calculate the size of a capital B and resize its CATextLayer accordingly.
- (CATextLayer *)testTextLayer
{
    CATextLayer *l = [CATextLayer layer];
    l.string = @"B";
    NSUInteger len = [l.string length];
    l.fontSize = 128.f;
    CGColorRef blackColor = CGColorCreateGenericGray(0.f, 1.f);
    l.foregroundColor = blackColor;
    CGColorRelease(blackColor);

    // need to set CGFont explicitly to convert the font property to a CGFontRef
    CGFontRef layerFont = CGFontCreateWithFontName((CFStringRef)@"Helvetica");
    l.font = layerFont;

    // get the characters from the NSString
    UniChar *characters = (UniChar *)malloc(sizeof(UniChar) * len);
    CFStringGetCharacters((__bridge CFStringRef)l.string, CFRangeMake(0, len), characters);

    // get a CTFontRef from the CGFontRef
    CTFontRef coreTextFont = CTFontCreateWithGraphicsFont(layerFont, l.fontSize, NULL, NULL);
    CGFontRelease(layerFont);

    // allocate glyph and bounding box arrays for the result,
    // assuming each character maps to exactly one glyph (not true in general)
    CGGlyph *glyphs = (CGGlyph *)malloc(sizeof(CGGlyph) * len);
    CTFontGetGlyphsForCharacters(coreTextFont, characters, glyphs, len);

    // get bounding boxes for the glyphs
    CGRect *bb = (CGRect *)malloc(sizeof(CGRect) * len);
    CTFontGetBoundingRectsForGlyphs(coreTextFont, kCTFontDefaultOrientation, glyphs, bb, len);
    CFRelease(coreTextFont);

    l.position = CGPointMake(200.f, 100.f);
    l.bounds = bb[0];
    CGColorRef bgColor = CGColorCreateGenericRGB(0.f, .5f, .9f, 1.f);
    l.backgroundColor = bgColor;
    CGColorRelease(bgColor);

    free(characters);
    free(glyphs);
    free(bb);
    return l;
}
This is the result I am getting from the above code. The size seems correct, but there is some kind of padding around the character.
Now my questions:
Am I right about where the origin of the glyph's bounding box lies?
How can one draw the letter so that it fits neatly into the layer, without this padding? Or alternatively, how can one control this padding?
Maybe I am missing an obvious point here. Is there no way, after setting the size and font of the layer, to shrink-wrap the layer around the character in a defined way (meaning with optional padding, a bit like in CSS)?
How about creating a CGPath from a glyph with CTFontCreatePathForGlyph and then getting its bounding box with CGPathGetBoundingBox?
An alternative would be to create a CTRun somehow and use the CTRunGetImageBounds function which also returns a "tight" bounding box, but this probably requires more lines of code than the aforementioned approach and you'd need to have a graphics context.
I assume this has to do with the built-in space around letters. Their bounding box usually includes some amount of space that serves as a general spacing if you assemble glyphs in a line. These are then optimized with kerning tables.
And yes, extracting the actual bezier path is a very good method of getting a tight bounding box. I have been doing that for years, though I have no experience using CATextLayers.
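To make the geometry concrete: the tight bounds come back relative to the baseline origin (negative y for descenders, as in the 'y' row of the table above), so shrink-wrapping amounts to translating the outline by minus the box origin before sizing the layer to the box. A tiny language-neutral sketch (C++ here, with plain structs standing in for CGRect and CGPoint):

```cpp
// A glyph's tight bounding box is expressed relative to the baseline origin:
// x/y locate the box's lower-left corner, with y < 0 for descenders.
// To draw the glyph flush inside a layer of size (w, h), translate the
// outline by (-x, -y); the baseline then sits -y above the layer's bottom.
struct Box    { float x, y, w, h; };
struct Offset { float dx, dy; };

static Offset shrinkWrapTranslation(Box bb) {
    return { -bb.x, -bb.y };
}
```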

JLabel not displaying all the characters even after dynamically changing font size

I am trying to fit a sentence that changes often into a few JLabels. The widths of my 3 JLabels stay unchanged at all times. What I am doing is changing the font size so that all the characters fit without any of them falling outside the labels' display range. I call the code snippet below whenever the sentence changes.
Here is my code
String sentence = "Some long sentence";
int SentenceLength = sentence.length();
int FontSize = 0;
// sum of the widths of the three labels
int TotalLblLength = lbl_0ValueInWords.getWidth() + lbl_1ValueInWords.getWidth() + lbl_2ValueInWords.getWidth();
/* Decide the font size so that all the characters can be displayed
   without exceeding the (horizontal) display range of the 3 labels.
   Inconsolata -> monospace font
   font size == width of the glyph * 2 (something I observed, not sure
   if this is always true) */
FontSize = (TotalLblLength / SentenceLength) * 2;
// max font size is 20 - based on label height
FontSize = (FontSize > 20) ? 20 : FontSize;
lbl_0ValueInWords.setFont(new java.awt.Font("Inconsolata", 0, FontSize));
lbl_1ValueInWords.setFont(new java.awt.Font("Inconsolata", 0, FontSize));
lbl_2ValueInWords.setFont(new java.awt.Font("Inconsolata", 0, FontSize));
int CharCount_lbl0 = width_lbl0 / (FontSize / 2);
int CharCount_lbl1 = width_lbl1 / (FontSize / 2);
int CharCount_lbl2 = width_lbl2 / (FontSize / 2);
/* Set the text of each label:
   if the sentence has more characters than fit in the 1st label,
   the excess characters move to the 2nd label; same for the
   2nd and 3rd labels. */
if (SentenceLength > CharCount_lbl0) {
    lbl_0ValueInWords.setText(sentence.substring(0, CharCount_lbl0));
    if (SentenceLength > CharCount_lbl0 + CharCount_lbl1) {
        lbl_1ValueInWords.setText(sentence.substring(CharCount_lbl0, CharCount_lbl0 + CharCount_lbl1));
        lbl_2ValueInWords.setText(sentence.substring(CharCount_lbl0 + CharCount_lbl1, SentenceLength));
    } else {
        lbl_1ValueInWords.setText(sentence.substring(CharCount_lbl0, SentenceLength));
    }
} else {
    lbl_0ValueInWords.setText(sentence);
}
But even after resetting the font size, the last character sometimes goes out of the display range. I have removed the labels' margins, which might otherwise cause this. It happens for sentences of random lengths. I can probably work around it by reducing the label width used in the calculations.
Can anyone explain the reason? Could it be some defect in the font's symmetry?
There is no such thing as "font symmetry".
There are 2 types of fonts for what you are dealing with: monospace and non-monospace. Monospace fonts have exactly the same width for every single character you can type; the others do not.
On top of that, fonts are rendered differently across different OSes. Something on Windows will be around 10-20% longer on Mac because they space out the fonts differently.
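Rendering differences aside, the integer arithmetic in the question can clip the last character even with a perfectly monospaced font: both the font-size formula and the per-label character counts use flooring integer division. For example, if FontSize ends up odd (say 15), FontSize / 2 is 7, but if the observed size-to-width relationship holds, the real advance is 7.5 px, so the computed capacity over-counts and the tail gets cut off. Here is the same three-way splitting logic as a self-contained C++ sketch (splitAcross and the capacity list are my own stand-ins for the label code):

```cpp
#include <algorithm>
#include <cstddef>
#include <string>
#include <vector>

// Split a sentence across labels with fixed character capacities, in order.
// Each capacity comes from an integer division (labelWidth / charWidth),
// which floors; if charWidth itself was under-estimated by flooring
// FontSize / 2, these capacities over-count and the text overflows.
static std::vector<std::string> splitAcross(const std::string &s,
                                            const std::vector<int> &caps) {
    std::vector<std::string> parts;
    std::size_t pos = 0;
    for (int cap : caps) {
        std::size_t n = std::min<std::size_t>(cap, s.size() - pos);
        parts.push_back(s.substr(pos, n));
        pos += n;
        if (pos >= s.size()) break;
    }
    return parts;
}
```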
Whatever it is you are trying to do with JLabels, stop. You should not be using 3 JLabels to show 3 lines of text just because they don't fit. Scrap them and use a JTextArea. It has text wrap, you can set the font, remove the margin/border/padding, and make it non-editable. You can customize it very easily so it is indistinguishable from a JLabel, and it will save you a ton of work.
Pick the right tool for the right job.

Windows: Getting glyph outlines for substitution characters from other fonts

I need to render fonts into a 3D game world, so I use the GetGlyphOutline function to get the glyph shapes to render into a texture. However, I want to handle the case where characters are not present in the given font (as is often the case for Asian or other international text). Windows text rendering will automatically substitute fonts which have the needed characters, but GetGlyphOutline will not. How can I detect this case and get the outlines for the substituted glyphs? Mac OS X Core Text has a function to get a matching substitution font for a given font and string - is there anything similar on Windows?
Found out what I needed to know myself: the IMLangFontLink interface, especially the MapFont method, contains the needed functionality to find out which substitution fonts should be used on Windows.
I too have puzzled over GetGlyphOutline. I'm not sure if you were able to do the same, but I was able to get mixed-script text outlines by using TextOut() in combination with BeginPath(), EndPath() and GetPath().
For example, even with the Arial font, I am able to get the path of the Japanese text 「テスト」 (using C++, but can easily be done in C as well):
SelectObject(hdc, hArialFont);
BeginPath(hdc);
TextOut(hdc, 100, 100, L"\u30c6\u30b9\u30c8"); // auto font substitution
EndPath(hdc);

// get the number of points in the path
int pc = GetPath(hdc, NULL, NULL, 0);
if (pc > 0)
{
    std::vector<POINT> points(pc);
    std::vector<BYTE> types(pc); // PT_MOVETO, PT_LINETO, PT_BEZIERTO
    GetPath(hdc, &points[0], &types[0], pc);
    // it seems the first four points are the bounding rect;
    // subsequent points match up with their types
    for (int i = 4; i < pc; i++)
    {
        if (types[i] == PT_LINETO)
            LineTo(hdc, points[i].x, points[i].y); // etc.
    }
}
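One pitfall when walking the arrays GetPath() fills in: PT_BEZIERTO entries always come in groups of three (two control points plus the end point), and PT_CLOSEFIGURE is OR-ed onto the final point of a closed figure, so a robust walker should mask it off and consume Béziers three at a time. A self-contained sketch (plain structs stand in for POINT; the constants below are the values from wingdi.h):

```cpp
#include <cstddef>
#include <vector>

// GDI path vertex types (values from <wingdi.h>); PT_CLOSEFIGURE is a flag
// OR-ed onto the last point of a closed figure.
constexpr unsigned char kMoveTo = 0x06;      // PT_MOVETO
constexpr unsigned char kLineTo = 0x02;      // PT_LINETO
constexpr unsigned char kBezierTo = 0x04;    // PT_BEZIERTO
constexpr unsigned char kCloseFigure = 0x01; // PT_CLOSEFIGURE

struct Pt { int x, y; };

// Count the drawable segments in a GetPath()-style result, consuming
// Bézier points in triples rather than one by one.
static int countSegments(const std::vector<Pt> &pts,
                         const std::vector<unsigned char> &types) {
    int segments = 0;
    for (std::size_t i = 0; i < pts.size();) {
        unsigned char t = types[i] & ~kCloseFigure; // strip the close flag
        if (t == kMoveTo)      { i += 1; }
        else if (t == kLineTo) { i += 1; ++segments; }
        else                   { i += 3; ++segments; } // kBezierTo triple
    }
    return segments;
}
```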
