- From: Phillips, Addison <addison@lab126.com>
- Date: Sun, 19 Jun 2011 10:46:28 -0700
- To: Anne van Kesteren <annevk@opera.com>
- CC: "public-i18n-core@w3.org" <public-i18n-core@w3.org>, "www-style@w3.org" <www-style@w3.org>, fantasai <fantasai.lists@inkedblade.net>
I fail to see what the size of the code units (or the choice of Unicode encoding form) has to do with it. Sequences of code points and their comparison are the issue, and it would not be a revolution to compare them in a normalized manner. Normalization of stylesheets and other documents may not make sense, but that doesn't address the problem of selection. See my recent emails and the I18N WG's work on the same.

Addison

Sent from my iPhone

On Jun 19, 2011, at 8:16 AM, "Anne van Kesteren" <annevk@opera.com> wrote:

> On Fri, 08 Apr 2011 02:11:20 +0200, fantasai
> <fantasai.lists@inkedblade.net> wrote:
>> So I guess the question is, what's the right way forward here?
>
> I think the right way forward is to not worry about Unicode Normalization
> as we previously decided for e.g. Selectors (which made it to PR; I do not
> get why we have the exact same discussion again for CSS Namespaces, which
> are not used by most authors). We could maybe encourage validators to say
> something about it (as Validator.nu already does, I believe), but at this
> point it is simply too late to change the entire platform around. For
> better or worse, we are stuck with 16-bit code units.
>
> (Sorry for jumping in so late. I was away.)
>
>
> --
> Anne van Kesteren
> http://annevankesteren.nl/
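[Editorial note: the distinction Addison draws — comparing sequences of code points under normalization rather than raw code units — can be sketched as follows. This is a hedged illustration added for clarity, not part of the original thread; the NFC form shown here is one of the normalization forms discussed in the I18N WG's work.]

```python
# Sketch: two strings that render identically as "é" but consist of
# different code point sequences, compared with and without Unicode
# normalization (NFC).
import unicodedata

precomposed = "\u00E9"   # "é" as a single precomposed code point (U+00E9)
decomposed = "e\u0301"   # "e" followed by COMBINING ACUTE ACCENT (U+0301)

# Comparing the raw sequences fails, even though both display the same.
print(precomposed == decomposed)  # False

# Normalizing both sides to NFC before comparing makes them match,
# which is the kind of normalized comparison the email refers to.
def nfc(s: str) -> str:
    return unicodedata.normalize("NFC", s)

print(nfc(precomposed) == nfc(decomposed))  # True
```

The point being made is that normalization can be applied at comparison time (e.g. when matching selectors against the document), without requiring that stylesheets or documents themselves be stored in normalized form.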
Received on Sunday, 19 June 2011 17:47:10 UTC