- From: John Daggett <jdaggett@mozilla.com>
- Date: Mon, 12 Jul 2010 20:06:02 -0700 (PDT)
- To: fantasai <fantasai.lists@inkedblade.net>
- Cc: Yuzo Fujishima <yuzo@google.com>, www-style@w3.org, www-font <www-font@w3.org>
fantasai wrote:
> * The UA must choose a single font for each grapheme cluster. [UAX10]

Did you mean to refer to UAX10? UAX10 is the Unicode Collation Algorithm and deals with how to compare two Unicode strings. Matching character streams to font character maps is a different problem: the goal is not to determine equivalence but to render a character sequence correctly.

> A related topic would be what happens when the base character is in
> one element, the combining character in another, and the two are
> assigned different fonts. Do we want different behavior for that
> than for assigning different fonts through unicode-range?

This starts to push into the realm of saving users from themselves. In general, I don't think we should burden implementations with complex error-handling requirements like this unless it's really a common occurrence that's hard for the author to work around. Codepoints defined in unicode-range descriptors act as a filter on the codepoints in the cmap of a font; I don't think we should blur that based on complex conditions.

Cheers,

John
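As a minimal sketch of the filtering behavior described above (the family and font names here are placeholders, not from the thread): the unicode-range descriptor restricts which codepoints this face may be used for, and within that range only codepoints actually present in the font's cmap are drawn from it; everything else falls through to the next font in the fallback list.

```css
/* Hypothetical example: this face is limited to Basic Latin plus
   Latin-1 Supplement. Codepoints outside U+0000-00FF, or inside the
   range but absent from the font's cmap, fall back to other fonts. */
@font-face {
  font-family: "BodyText";
  src: local("ExampleLatinFont");  /* assumed local font name */
  unicode-range: U+0000-00FF;
}

body {
  font-family: "BodyText", serif;
}
```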
Received on Tuesday, 13 July 2010 03:06:37 UTC