- From: Anne van Kesteren <annevk@opera.com>
- Date: Sat, 07 Feb 2009 22:50:54 +0100
- To: "Phillips, Addison" <addison@amazon.com>, "Aryeh Gregor" <Simetrical+w3c@gmail.com>
- Cc: "David Clarke" <w3@dragonthoughts.co.uk>, "Henri Sivonen" <hsivonen@iki.fi>, "www-archive@w3.org" <www-archive@w3.org>
On Sat, 07 Feb 2009 21:09:49 +0100, Phillips, Addison <addison@amazon.com>
wrote:
> ... from a vendor perspective. From a specification perspective, we are
> talking about what Selectors should do. But quite obviously Selectors is
> only one example of the normalization issue, a point which is key in
> I18N's official comments (under review within the WG currently).
My point is that we cannot consider Selectors standalone without knowing
what the overall solution will be like.
>> Changing just Selectors does not solve the problem. It merely
>> fragments equality checks in implementations, leading to more bugs and
>> inconsistencies. It also fragments the Web platform. If you want to
>> solve the problem you cannot just look at Selectors.
>
> Changing Selectors addresses the problem with Selectors. It doesn't
> "solve" the overall problem. But it is one element of the problem and
> one that is useful in discussing the problem.
That I can agree with.
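[To make the equality-check issue concrete, here is a small sketch, not from the thread itself, using Python's standard-library `unicodedata` module. It shows why raw code-point comparison and normalization-sensitive comparison disagree on canonically equivalent strings.]

```python
# Illustrative sketch: raw code-point equality vs. NFC-normalized
# equality for two canonically equivalent spellings of "é".
import unicodedata

precomposed = "\u00e9"   # é as one code point (U+00E9)
decomposed = "e\u0301"   # e followed by COMBINING ACUTE ACCENT (U+0301)

# Raw code-point comparison — what Selectors matching does today.
print(precomposed == decomposed)   # False

# Normalization-sensitive comparison — what requiring normalization
# at the matching level would imply.
def nfc(s: str) -> str:
    return unicodedata.normalize("NFC", s)

print(nfc(precomposed) == nfc(decomposed))   # True
```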
>> No it is not.
>
> Probably I didn't phrase that quite right. My point is not that
> "substring matching should be dog slow". My point is that there are
> relative levels of acceptable performance, and it is possible that
> substring matching (such as :contains), since it is used in a different
> way than DOM tree navigation, might have a different measure of
> "acceptable" in handling normalization.
I think that any regression here in terms of speed would cause a lot of
bad PR.
>> :contains is _highly_ sensitive to very fast equality
>> checks since CSS is live. (E.g. changes to the DOM require checking
>> whether Selectors still match or not.) In fact, it is precisely for
>> performance reasons that this feature has not yet been implemented in
>> rendering engines.
>
> I don't disagree, but would then note that normalization isn't the
> performance barrier, now is it?
It certainly complicates matters.
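[As an aside on the performance question: one mitigation implementations could use, sketched here hypothetically and not taken from the thread, is a quick "already normalized" check, since most real-world text is already in NFC. Python 3.8+ exposes this as `unicodedata.is_normalized`.]

```python
# Sketch of a fast-path normalized comparison: only run the full
# normalization pass when a string is not already in NFC.
import unicodedata

def normalized_equal(a: str, b: str) -> bool:
    """Compare two strings under NFC, normalizing only when needed."""
    if not unicodedata.is_normalized("NFC", a):
        a = unicodedata.normalize("NFC", a)
    if not unicodedata.is_normalized("NFC", b):
        b = unicodedata.normalize("NFC", b)
    return a == b

print(normalized_equal("caf\u00e9", "cafe\u0301"))   # True
print(normalized_equal("cafe", "caf\u00e9"))         # False
```

Whether such a fast path keeps live selector matching fast enough is exactly the empirical question under discussion.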
>> The same goes for many DOM operations.
>
> We need to resolve what the proper behavior should be ("requirements").
> Implementations then have to meet the requirements and may find diverse
> ways to achieve them. Acceptable performance is a requirement.
> Normalization-related behavior may be one as well. I see declarations
> that normalization "makes things too slow" without any empirical
> evidence (on either side), and the effort seems to be to reject the
> proposed requirements solely on the basis of performance.
No. The sheer complexity, along with compatibility and interoperability
issues, are also reasons I have given in the various threads. Frankly,
those worry me the most, given that, like you, I haven't seen the actual
performance implications yet.
> If the requirements are the right ones, then it is up to implementers to
> create implementations that meet them (including performance
> requirements). Balancing performance with other needs is the key here.
> And, again, I don't believe I18N is saying that normalization must take
> place solely at the selectors level. It is that it needs to take place
> at the right level. This, in turn, may result in specifications (such as
> Selectors) having conformance requirements related to normalization.
Ok.
--
Anne van Kesteren
<http://annevankesteren.nl/>
<http://www.opera.com/>
Received on Saturday, 7 February 2009 21:52:26 UTC