
Re: real vs. synthetic width glyphs

From: Florian Rivoal <florian@rivoal.net>
Date: Tue, 09 Jul 2013 20:51:43 +0200
To: www-style@w3.org
Message-ID: <op.wzyxchczf5de51@localhost.localdomain>
On Tue, 09 Jul 2013 20:09:25 +0200, Sylvain Galineau <galineau@adobe.com> wrote:

> By 'if using 1/n glyphs is not optimal' do you mean 'if *the author* does
> not want to use the 1/n glyphs provided by the font'? If so then yes, it's
> absolutely fine for them to be able to override the default behavior. It's
> also fine for UAs to do something interesting when no such glyphs are
> present in the specified font. I do not think any of this is really the
> issue though.
>
> The argument is about *requiring* interoperable behavior when the font
> does provide 1/n glyphs. My understanding of the resolution is that it
> does not actually do so.

My understanding of the resolution is the same as yours. As requiring the
use of special glyphs when they are all available leaves the door open for
the #12 use case, I think the only question left is Elika's "MI" use case.

In the example posted by John, I agree that MI is nicer in case (5) than
in (4), but MM is not. So this could indeed be a reason to let the UA be
smart. But should that be by default, or opt-in? Given that MI isn't the
main use case, and that for digits (which are), (4) is always better than
(5), I must admit that I find the opt-in solution tempting. But at least
now, I can see that there are situations where all variant glyphs are
available but using them isn't the best option.

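[For context, the behavior under discussion concerns combining digit runs into a single upright cell in vertical text, where the UA may draw on the font's width-variant glyphs or synthesize compression. A minimal sketch, assuming the 2013 draft syntax of CSS Writing Modes (the property was then drafted as `text-combine-horizontal`, later renamed `text-combine-upright`); the class name is illustrative only:]

```css
/* Hypothetical example: combine runs of up to two ASCII digits
   into one upright cell in vertical text (tate-chu-yoko).
   Whether the UA uses the font's real 1/n width-variant glyphs
   or synthetically compresses full-width glyphs is the question
   debated in this thread. */
.vertical-date {
  writing-mode: vertical-rl;
  text-combine-horizontal: digits 2; /* 2013 draft name */
  -webkit-text-combine: horizontal;  /* legacy prefixed form */
}
```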
  - Florian
Received on Tuesday, 9 July 2013 18:52:11 UTC
