- From: John Hudson <tiro@tiro.com>
- Date: Sat, 26 Feb 2011 11:11:30 -0800
- To: Koji Ishii <kojiishi@gluesoft.co.jp>
- CC: Jonathan Kew <jonathan@jfkew.plus.com>, John Daggett <jdaggett@mozilla.com>, Asmus Freytag <asmusf@ix.netcom.com>, www-style@w3.org, public-i18n-cjk@w3.org, www-international@w3.org
Koji Ishii wrote:

> I don't think this will happen ever. When Unicode was introduced in 90's, there were similar discussions. Unicode has all characters coded uniquely, so why would we ever want font fallback? Every font can support every glyph defined in Unicode, so we won't need such system.

It was never anyone's intention that every font would support every character (not glyph) in Unicode; indeed, with today's popular font format this isn't even possible, because there is a 64k limit on the number of glyphs in a TrueType/OpenType font.

The few fonts like Arial Unicode were designed to provide a temporary glyph fallback mechanism for any character in a particular early version of Unicode, the goal being to have one font at which you could throw any text and be assured of getting something at least recognisable. The result was not intended to be typographically sophisticated, or even more than minimally acceptable.

Recently, there has been ISO movement on standardising a composite font format, i.e. a mechanism for defining virtual fonts from collections of individual fonts, specifying which component fonts should be used for which characters. There are limits to the typographic sophistication of such an approach that will require fairly careful use of the mechanism (layout features will not apply across component font boundaries). But it seems to me that something like this could provide a solution to the problems you have raised.

JH
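For comparison, the unicode-range descriptor on @font-face in the CSS3 Fonts draft already expresses this kind of character-to-component-font mapping at the stylesheet level. A minimal sketch of what a "virtual font" assembled from component fonts might look like there; the family and font names below are hypothetical:

    /* Two @font-face rules sharing one family name form a simple
       composite: each component font covers a character range. */
    @font-face {
      font-family: "CompositeExample";  /* hypothetical virtual family */
      src: local("LatinTextFont");      /* hypothetical component font */
      unicode-range: U+0000-00FF;       /* Basic Latin and Latin-1 */
    }
    @font-face {
      font-family: "CompositeExample";
      src: local("CJKTextFont");        /* hypothetical component font */
      unicode-range: U+4E00-9FFF;       /* CJK Unified Ideographs */
    }
    body { font-family: "CompositeExample", serif; }

The same caveat applies here as to the ISO mechanism: OpenType layout features would not apply across the boundary between component fonts.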
Received on Saturday, 26 February 2011 19:12:07 UTC