- From: Chris Lilley <chris@w3.org>
- Date: Fri, 14 Aug 2015 11:48:46 +0200
- To: Alan Stearns <stearns@adobe.com>, "public-houdini@w3.org" <public-houdini@w3.org>
- CC: John Daggett <jdaggett@mozilla.com>
Hello Alan, (apologies for the late reply, I have been on travel).

Wednesday, July 29, 2015, 3:02:42 AM, you wrote:

> Hey all,
> I initially thought that exposing raw font data would be the right first
> step for finding a way to correctly polyfill initial-letter, line grid,
> etc. Conversations I’ve had since the last Houdini meeting have moved me
> towards John Daggett’s point of view, that exposing the typographic
> *results* of line boxes is a better stepping-stone.

Better for all uses of the font metrics API, or better for the specific
ones that involve constructing new line boxes?

It doesn't seem to me that forcing users to indirectly intuit font metrics
from their effect on line boxes is, in general, a win. It seems that the
developer would end up making a bunch of temporary elements and fiddling
with them to get data that, in most cases, they would have preferred to
access directly (a rough sketch of that sort of fiddling is below).

On the other hand, maybe people more familiar with CSS would prefer to get
the data in terms of things they are familiar with (whilst typographers
would, equally, prefer the font metrics with a close 1:1 relationship to
the underlying OpenType data). But I haven't heard John's arguments. This
would be a very good topic for the Houdini f2f, with a whiteboard.

> Getting those results
> and making small changes is better than trying to reverse-engineer what
> the browser is doing with the raw font data.

If the developer is indeed recalculating something the browser has already
done, I agree. But I don't see an argument for making this the sole method.
If people want detailed linebox information - which I am sure they will -
we should provide that. And if people want font metric information, we
should provide that. It seems likely that any typographic polyfill would
want access to both types of information.

It also seems that metrics are just one set of properties a Font object
might expose, and other non-metric information could also be of interest.
Indeed, there was an example on this week's call: how can you know whether
the current font supports true (non-synthesized) small caps? Or, in
general, what features does the current font support?

For those situations where a more complex, conditional set of property
assignments is desired, which cannot readily be achieved with just fallback
via the cascade, giving stylesheet authors an API to directly query the
font capabilities and execute logic depending on that information seems
like a key extensibility win. In my mind, font metrics are just the first
step here towards a richer font-querying API.

> The box tree API currently talks about DeadFragmentInformation, which
> gives you a basic geometry result. My current thinking is that there
> should be a way to get more fine-grained typographic information within
> that basic geometry. It might look like:
>
> interface DeadFontMetrics {
>   readonly attribute double baseline;
>   readonly attribute double alphabeticBaseline;
>   readonly attribute double ideographicBaseline;
>   readonly attribute double ascent;
>   readonly attribute double descent;
>   readonly attribute double xHeight;
>   readonly attribute double capHeight;
> }

What would that expose for a fragment where different characters in the
text content had different metrics? It seems that the metrics info in
OpenType GPOS (as opposed to the simpler OS/2 table info for TrueType) can
be quite complex, with the BASE table allowing per-script, per-language and
even per-character baseline metrics [1].
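For comparison, the indirect approach I mentioned above looks roughly like
the sketch below (TypeScript; the function name and the probe trick are
purely illustrative, not any proposed API, and the numbers it yields are
approximations inflated by half-leading):

  // Illustrative sketch: approximate font metrics without a metrics API by
  // measuring a zero-height inline-block probe that sits on the text baseline.
  // Results include half-leading and say nothing about per-script baselines.
  function approximateMetrics(fontFamily: string, fontSizePx: number) {
    const container = document.createElement('div');
    container.style.cssText =
      `position:absolute; visibility:hidden; font:${fontSizePx}px ${fontFamily};`;

    // An empty inline-block takes its baseline from its bottom margin edge,
    // so a zero-height probe's top edge lands on the baseline of the line box.
    const probe = document.createElement('span');
    probe.style.cssText = 'display:inline-block; width:0; height:0;';

    container.appendChild(document.createTextNode('Hxfp\u4E16'));
    container.appendChild(probe);
    document.body.appendChild(container);

    const box = container.getBoundingClientRect();
    const baselineY = probe.getBoundingClientRect().top;
    document.body.removeChild(container);

    return {
      ascent: baselineY - box.top,     // roughly ascent plus half-leading
      descent: box.bottom - baselineY, // roughly descent plus half-leading
    };
  }

Note that this yields a single pair of numbers for the whole line, whatever
mix of scripts it contains - which is the same granularity question the
flat list of doubles above raises.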
> partial interface DeadFragmentInformation {
>   Promise<DeadFontMetrics> getFontMetrics();
> };
>
> Where the DeadFontMetrics attributes give you the relevant typographic
> position of the first in-flow line box in pixels from the top of the
> fragment.

[1] http://www.microsoft.com/typography/otspec/base.htm

--
Best regards,
 Chris Lilley
 Technical Director, W3C Interaction Domain