Re: real vs. synthetic width glyphs



On 7/9/13 11:51 AM, "Florian Rivoal" <florian@rivoal.net> wrote:

>On Tue, 09 Jul 2013 20:09:25 +0200, Sylvain Galineau <galineau@adobe.com>
>wrote:
>
>> By 'if using 1/n glyphs is not optimal' do you mean 'if *the author* does
>> not want to use the 1/n glyphs provided by the font'? If so then yes, it's
>> absolutely fine for them to be able to override the default behavior. It's
>> also fine for UAs to do something interesting when no such glyphs are
>> present in the specified font. I do not think any of this is really the
>> issue though.
>>
>> The argument is about *requiring* interoperable behavior when the font
>> does provide 1/n glyphs. My understanding of the resolution is that it
>> does not actually do so.
>
>My understanding of the resolution is the same as yours. As requiring the
>use of special glyphs when they are all available leaves the door open for
>Koji's #12 use case, I think the only question left is Elika's "MI" use
>case.
>
>In the example posted by John, I agree that MI is nicer in case (5) than
>(4), but MM is not. So this could indeed be a reason to let the UA be
>smart. But should that be by default, or opt in? Given that MI isn't the
>main use case, and that for digits (which are), (4) is always better than
>(5), I must admit that I do find the opt-in solution tempting. But at
>least now, I can see that there are situations where all variant glyphs
>are available but using them isn't ideal.
>
>  - Florian

If the opt-in model is right for the main use case *and* it is consistent
with the rest of CSS, then why wouldn't it be required?
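
For concreteness, here is a rough sketch of what an author-level opt-in
could look like using properties and OpenType features that already exist.
The class names and the use of font-feature-settings as the opt-in hook
are my own illustrative assumptions, not anything the resolution actually
specifies:

  /* Vertical text with a tate-chu-yoko run, e.g. <span class="tcy">25</span>
     inside vertical-rl content. The question in this thread is whether the
     UA must use the font's 1/n width glyphs here when the font provides
     them, or may synthetically compress full-width glyphs instead. */
  .vertical { writing-mode: vertical-rl; }
  .tcy      { text-combine-upright: all; } /* 'text-combine-horizontal' in the 2013 draft */

  /* Hypothetical explicit opt-in to the font's width-variant glyphs via
     the existing low-level feature hook (hwid = half widths; twid and
     qwid exist for third and quarter widths). */
  .tcy.real-width-glyphs { font-feature-settings: "hwid" 1; }

Whether an opt-in of this kind belongs at such a low level, or as a value
on the tate-chu-yoko property itself, is a separate question from whether
the default behavior is interoperable; the sketch is only meant to make
the opt-in alternative concrete.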

Received on Tuesday, 9 July 2013 20:21:07 UTC