Re: [css3-fonts] font descriptor default values

Hi John,

Following on from our discussion last month we have been modifying 
Prince to implement the latest revision of CSS3 Fonts.

In particular, we have changed the behaviour of @font-face rules so 
that descriptors take only one value and there is no "all" value, which 
makes it easier to write rules that match appropriately.
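
For instance, each face now gets its own rule with explicit descriptor 
values, along these lines (a sketch; the family and file names are 
invented):

@font-face {
   font-family: MyHeading;
   font-weight: bold;
   font-style: italic;
   src: url(fonts/heading-bolditalic.otf);
}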

We are also going to change our implementation of the "src" descriptor 
so that it follows the spec and is not used for character fallback; 
instead, only the first loadable source will be loaded. Combining more 
than one font for character fallback will require multiple @font-face 
rules with the same font-family, which seemed awkward at first, but 
makes a lot of sense once you start using the unicode-range descriptor.
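
For instance, Latin text with Greek fallback could be expressed as two 
rules sharing one family name, where the second rule supplies glyphs 
only for the Greek range (a sketch; the family and file names are made 
up):

@font-face {
   font-family: BodyText;
   src: url(fonts/main.otf);
}

@font-face {
   font-family: BodyText;
   unicode-range: U+0370-03FF;
   src: url(fonts/greek.otf);
}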

However, I'm still a bit unsure of how local() is supposed to work. In 
Prince we implement it very simply: it is a font family name, and we 
query the operating system for fonts with that family and pick the most 
appropriate one. Take, for example, this rule from the spec:

@font-face {
   font-family: Arial;
   src: local(Helvetica);
}

If a document requests "bold italic 12pt Arial", the rule above maps it 
to local(Helvetica), so we check for fonts in the "Helvetica" font 
family and choose the one which is the closest match to the requested 
properties. This involves Fontconfig on Linux, EnumFontFamiliesEx() on 
Windows, and some ATS / ATSUI functions on Mac OS X, but it's all very 
easy: you put in a family name, you get back some font references, then 
query those references for their weight, style, and width/stretch, and 
that's it.
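
In other words, an ordinary style rule such as this one (purely 
illustrative) is what triggers that lookup:

h1 {
   font: bold italic 12pt Arial;
}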

The spec seems to hint at doing more than that, e.g. by matching 
"Futura-Medium" and "HoeflerText-Ornaments", which look like PostScript 
names rather than font family names; however, the mechanism is not 
fully explained.
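Presumably the intent is that a rule along these lines should match a 
specific face by its PostScript name rather than a family (a guess on 
my part; the family name here is made up):

@font-face {
   font-family: Ornaments;
   src: local(HoeflerText-Ornaments);
}

What is the use case and expected mechanism for this?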

Best regards,

Michael

-- 
Print XML with Prince!
http://www.princexml.com
