Re: [css3-fonts] font descriptor default values

Hi Michael,

> In particular, we have changed the behaviour of @font-face rules so
> that descriptors only take one value and there is no "all" value,
> which does make it easier to write rules that match appropriately.
> 
> We are also going to change our implementation of the "src" descriptor
> such that it follows the spec and is not used for character fallback,
> instead only the first loadable source will be loaded. Combining more
> than one font for character fallback will require multiple @font-face
> rules with the same font-family, which seemed awkward at first, but
> makes a lot of sense once you start using the unicode-range descriptor.

Great!
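
As a concrete sketch of the pattern you describe (the family name, face
names, and ranges below are purely illustrative):

  /* Latin range taken from one face */
  @font-face {
    font-family: BodyText;
    src: local("Helvetica Neue");
    unicode-range: U+0-2FF;
  }

  /* CJK range taken from another face; same family name */
  @font-face {
    font-family: BodyText;
    src: local("Hiragino Kaku Gothic Pro W3");
    unicode-range: U+3000-9FFF;
  }

As I understand the draft, for codepoints covered by overlapping ranges
the rule defined later takes precedence.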

> However, I'm still a bit unsure of how local() is supposed to work. In
> Prince we implement it very simply: it is a font family name, and we
> query the operating system for fonts with that family and pick the
> most appropriate one. 

Since a single @font-face rule defines a single *face* within a family,
local() is a way of selecting a single *face* within a font family,
addressed via a name unique to each face.  The easiest way to define
this would be to say that this name is the Postscript name, which is
intentionally constructed as a unique key independent of
platform/language.  But as others have noted, lookups based on
Postscript names are unfortunately not supported on all platforms
(e.g. Windows GDI).  Lookups based on the full name (TT/OT name record
id = 4) are possible on most platforms.  Under Windows, a local name
like "Arial Italic" can be used as the face name passed to
CreateFontIndirect().  Under Mac OS X, ATSFontFindFromName() can be
used.  With fontconfig on Linux, use the FC_FULLNAME property.
Naturally, each of these has its quirks.

The current draft requires support for looking up individual faces via
the fullname, with Postscript name lookup optional for those platforms
that support it.  As Thomas Phinney and others pointed out, the fullname
for .otf fonts differs across platforms: it's equivalent to the
Postscript name under Windows but takes the family + style name form
under Mac OS X (although Font Book actually displays the Postscript name
as the fullname, rather than the fullname stored in the underlying font
data!).
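
In practice this means an author wanting robust matching of an .otf face
will probably end up listing both name forms, something like this sketch
(the face names here are just for illustration):

  @font-face {
    font-family: MyFutura;
    src: local("Futura-Medium"),    /* Postscript name form */
         local("Futura Medium");    /* fullname, family + style form */
    font-weight: 500;
  }

Whichever form the platform's lookup recognizes, one of the two local()
entries should match.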

> For example X in the spec:
> 
> @font-face {
>    font-family: Arial;
>    src: local(Helvetica);
> }
> 
> If a document requests "bold italic 12pt Arial" we will check for fonts
> in the "Arial" font family, and choose the one which is the closest
> match to the requested properties. This involves using Fontconfig on
> Linux, and EnumFontFamiliesEx() on Windows, and some ATS / ATSUI
> functions on MacOS X, but it's all very easy: you put in a family name,
> you get out some font references, you get more information about those
> fonts to check their boldness, style, width/stretch, and that's it.

Hmmm, this is probably a bad example then, because it doesn't
distinguish between aliasing families and defining font faces that
reference locally available faces.

I don't think it makes sense to use local() for aliasing *families*;
that leads to all sorts of confusion when other font descriptors are
involved.  A much more straightforward syntax for this would be:

  @font-alias MyFont {
    font-family: Helvetica, Arial;
  }

When "MyFont" appeared in a font-family list, "Helvetica, Arial" would 
automatically be substituted.  
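
Usage would then be straightforward (hypothetical, assuming the proposed
syntax above were adopted):

  body {
    font-family: MyFont, sans-serif;
    /* behaves as if the author had written:
       font-family: Helvetica, Arial, sans-serif; */
  }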

Regards,

John Daggett
Mozilla Japan

Received on Tuesday, 3 March 2009 03:58:48 UTC