- From: fantasai <fantasai.lists@inkedblade.net>
- Date: Mon, 12 Jul 2010 08:38:35 -0700
- To: John Daggett <jdaggett@mozilla.com>
- CC: Yuzo Fujishima <yuzo@google.com>, www-style@w3.org, www-font <www-font@w3.org>
On 07/11/2010 09:51 PM, John Daggett wrote:
> Yuzo Fujishima wrote:
>
>> I believe it would help readers if the specification explicitly
>> states how characters must be normalized, how glyphs are chosen, and
>> what is left for UA's discretion.
>
> I agree. It's not a simple task, so if you have specific suggestions
> for wording that would be very helpful.

I'm not too clear on how this works, but here's a rough suggestion:

  * The UA must choose a single font for each grapheme cluster. [UAX10]

  * In the absence of fallback behavior, this must be the font chosen
    for the decomposed (NFD) representation's base character (if any,
    else the first character in the grapheme cluster). If this font
    does not contain all necessary glyphs for the grapheme cluster,
    then the fonts that have been assigned to the other characters in
    the grapheme cluster must be tried (in the order corresponding to
    the order of the characters in the grapheme cluster) until a font
    is found that contains all characters in the grapheme cluster.

  * If no font is found in the previous step, the next one in the
    fallback list is tried.

  * First composed, then decomposed representations are tried when
    searching for appropriate glyphs in the font. (Note: If composed
    and decomposed representations return visually different results,
    then there is a bug in the font, as these are supposed to be
    equivalent.)

Maybe you can figure out how to plug that in properly. :)

A related topic would be what happens when the base character is in
one element, the combining character in another, and the two are
assigned different fonts. Do we want different behavior for that than
for assigning different fonts through unicode-range?

~fantasai
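[Editorial note: the following is a rough, non-normative sketch of the
per-grapheme-cluster selection steps proposed above, written in Python
for illustration only. The Font type, its has_glyphs_for() coverage
check, the font_for_char lookup, and the fallback_fonts list are all
hypothetical stand-ins for whatever glyph-coverage and font-matching
machinery the UA actually provides.]

    import unicodedata
    from typing import Callable, Optional, Sequence


    def font_for_cluster(cluster: str,
                         font_for_char: Callable[[str], "Font"],
                         fallback_fonts: Sequence["Font"]) -> Optional["Font"]:
        """Pick a single font for one grapheme cluster.

        font_for_char(ch) returns the font that per-character matching
        (e.g. via unicode-range) assigned to ch individually.
        """
        # Decompose the cluster and find its base character; if there is
        # no non-combining character, fall back to the first character.
        nfd = unicodedata.normalize("NFD", cluster)
        base = next((ch for ch in nfd if unicodedata.combining(ch) == 0),
                    cluster[0])

        # Try the font assigned to the base character first, then the
        # fonts assigned to the other characters, in cluster order.
        candidates = [font_for_char(base)] + [font_for_char(ch) for ch in cluster]
        tried = []
        for font in candidates:
            if font in tried:
                continue
            tried.append(font)
            # Composed representation first, then the decomposed one.
            if font.has_glyphs_for(cluster) or font.has_glyphs_for(nfd):
                return font

        # No assigned font covers the whole cluster: walk the fallback list.
        for font in fallback_fonts:
            if font.has_glyphs_for(cluster) or font.has_glyphs_for(nfd):
                return font

        return None  # UA-specific last-resort behavior applies here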
Received on Monday, 12 July 2010 19:35:10 UTC