Re: [css-font-loading] load, check and testing for character coverage

Hi,

While I agree that check() shouldn't throw any errors, I am not sure why it
needs to return true.

Right now this method is called just `check()`, and it is genuinely confusing what it actually checks. To me, as a web developer, 'check font' has always meant 'check whether the font is available to use, and if not, load it from the network', and that is what I expect from this method. But it now turns out that this assumption is wrong. Here is how many web developers would treat this method:

```
var myFont = '16px MyFont';
var testText = '∂ƒß';

var hasFont = document.fonts.check(myFont);
var hasFontForGivenUnicodeRange = document.fonts.check(myFont, testText);
// just check against @font-face's unicode-range

if (!hasFont) {
  var face = new FontFace('MyFont', 'url(latin font)');
  face.load().then(() => document.fonts.add(face));
}

if (!hasFontForGivenUnicodeRange) {
  var face = new FontFace('MyFont', 'url(special chars font)');
  face.load().then(() => document.fonts.add(face));
}
```

So this algorithm is now useless. And yes, developers do expect `check()` to
return `false` when the browser has that font but the @font-face's
unicode-range doesn't cover the symbols in the test string.
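To make the expectation concrete, the coverage test developers have in mind could be sketched as: parse each @font-face's unicode-range descriptor into code point intervals and report `false` as soon as some code point of the test string falls outside all of them. This is only an illustrative sketch; the `parseUnicodeRange` and `coversText` helpers are mine, not part of any spec:

```
// Parse a CSS unicode-range value like "U+0000-00FF, U+2202" into
// an array of [start, end] code point intervals.
function parseUnicodeRange(value) {
  return value.split(',').map((token) => {
    const m = token.trim().match(/^U\+([0-9A-Fa-f?]+)(?:-([0-9A-Fa-f]+))?$/);
    if (!m) throw new Error('bad unicode-range token: ' + token);
    if (m[1].includes('?')) {
      // Wildcard form, e.g. U+4?? means U+0400-04FF.
      return [
        parseInt(m[1].replace(/\?/g, '0'), 16),
        parseInt(m[1].replace(/\?/g, 'F'), 16),
      ];
    }
    const start = parseInt(m[1], 16);
    return [start, m[2] ? parseInt(m[2], 16) : start];
  });
}

// True when every code point of `text` is inside at least one interval.
function coversText(rangeValue, text) {
  const intervals = parseUnicodeRange(rangeValue);
  return [...text].every((ch) => {
    const cp = ch.codePointAt(0);
    return intervals.some(([start, end]) => cp >= start && cp <= end);
  });
}
```

With this, `coversText('U+0000-00FF', '∂ƒß')` is `false` (∂ is U+2202, outside the Latin-1 range), which is exactly the `false` result the paragraph above expects from `check()`.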

I understand that this could be achieved in some way like this:
```
var face = new FontFace('MyFont', 'url(latin font)');

document.fonts.add(face);
document.fonts.load(myFont, testText); // second argument is optional
```

But this is a bit awkward, especially when you need to load a few different
files for the same font family.
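For reference, the multi-file case I mean could be sketched like this: register several FontFace entries for one family, each with its own unicode-range, then call `load()` with sample text so the browser only fetches the faces whose ranges intersect it. The file names and ranges below are placeholders, and `fontSet`/`FontFaceCtor` stand in for `document.fonts` and the `FontFace` constructor:

```
// Register two faces of the same family, split by unicode-range,
// then load against sample text. In a browser you would pass
// document.fonts and FontFace here.
function registerFamily(fontSet, FontFaceCtor) {
  const entries = [
    { src: 'url(latin-font.woff2)', unicodeRange: 'U+0000-00FF' },
    { src: 'url(special-chars.woff2)', unicodeRange: 'U+0100-FFFF' },
  ];
  for (const entry of entries) {
    fontSet.add(new FontFaceCtor('MyFont', entry.src, {
      unicodeRange: entry.unicodeRange,
    }));
  }
  // The browser is expected to fetch only the faces whose
  // unicode-range intersects the given text.
  return fontSet.load('16px MyFont', '∂ƒß');
}
```

This works, but it forces me to manage the family as a bundle of faces myself, which is the awkwardness described above.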

Thanks.
-- 
@nekrtemplar <https://twitter.com/nekrtemplar>

Received on Friday, 1 April 2016 07:42:22 UTC