
Re: Select a parrent node with s CSS selector?

From: Andrew Fedoniouk <news@terrainformatica.com>
Date: Tue, 24 Apr 2007 16:52:29 -0700
Message-ID: <003801c786cd$3356c250$f502000a@internal.toppro.net>
To: "Allan Sandfeld Jensen" <kde@carewolf.com>, <www-style@w3.org>


----- Original Message ----- 
From: "Allan Sandfeld Jensen" <kde@carewolf.com>
To: <www-style@w3.org>
Sent: Tuesday, April 24, 2007 9:13 AM
Subject: Re: Select a parrent node with s CSS selector?


>
> On Tuesday 24 April 2007 04:42, Andrew Fedoniouk wrote:
>>
>> This is not about complexity of implementation but about computational
>> complexity of algorithm needed to calculate such a selector:
>>
>> See: http://en.wikipedia.org/wiki/Computational_complexity_theory
>>
>> For example proposed selector
>>
>> * < *:link  /* element that has at least one link inside */
>>
>> requires deep scan of all children of any given element to verify the
>> selector.
>>
>> The formula O( n * n ) means that a document with twice as many
>> elements will require four times as much time to match selectors.
>> (In practice it can be a lot less - this is just the worst case - but
>> still.)
>>
>> Again it is possible to compute this but selectors that have exponential
>> nature are *highly* non-desirable.
>>
> O(n*n) is quadratic not exponential,

Thanks, Allan, quadratic indeed.
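To make the quadratic behaviour concrete, here is a toy sketch (a hypothetical
`Node` class and a `:link`-as-`a`-tag simplification, not code from any real
engine) of naively evaluating the proposed `* < *:link` selector by re-scanning
every element's subtree:

```python
# Hypothetical toy DOM: each Node has a tag and a list of children.
class Node:
    def __init__(self, tag, children=()):
        self.tag = tag
        self.children = list(children)

def subtree_has_link(node):
    """Deep scan: does any descendant match `:link` (simplified: tag 'a')?"""
    return any(c.tag == 'a' or subtree_has_link(c) for c in node.children)

def naive_match_parent_of_link(root):
    """Naive `* < *:link`: re-scan every element's whole subtree.
    On a chain of depth n this does O(n * n) work overall."""
    matched, stack = [], [root]
    while stack:
        n = stack.pop()
        if subtree_has_link(n):
            matched.append(n)
        stack.extend(n.children)
    return matched

# Worst case: a chain div > div > ... > a; every div matches.
chain = Node('a')
for _ in range(10):
    chain = Node('div', [chain])
print(len(naive_match_parent_of_link(chain)))  # prints 10
```

Each of the n elements triggers a scan of its own subtree, which on a deep
chain is itself O(n), hence the quadratic worst case.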

> ... and many CSS implementations are worst
> case quadratic anyway.

CSS implementations or the nature of CSS selectors?

So far none of the proposed CSS selectors is purely quadratic in nature.

There are bad ones, like the bunch of :only-of-type, :first-of-type, etc.
pseudos and the indirect adjacent combinator A ~ B.
Using them may potentially kill the design without any warning to the
author.
But there are no quadratic algorithms, even in the worst case.
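For instance, A ~ B can be matched in a single left-to-right pass over each
sibling list by remembering whether an A has already been seen. A toy sketch
(sibling lists reduced to tag names, not a real engine):

```python
def match_indirect_adjacent(siblings, a_tag, b_tag):
    """Match `A ~ B` over one sibling list in O(n):
    once an A has been seen, every later B matches."""
    seen_a, matched = False, []
    for tag in siblings:
        if seen_a and tag == b_tag:
            matched.append(tag)
        if tag == a_tag:
            seen_a = True
    return matched

print(match_indirect_adjacent(['p', 'h1', 'p', 'p'], 'h1', 'p'))
# prints ['p', 'p']
```

So the combinator is linear per sibling list; the cost is in the bookkeeping,
not in the asymptotics.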

As a side note, an interesting thing:
the "evil" pseudo-class "6.6.6 Content pseudo-class" in CSS3 carries the note:
  Editor's note: need to add here a strong warning about performance issues
  related to this pseudo.
But, say, A ~ B is considered less evil, for some reason.

>
> And no, it can still be done in linear time O(n) using a standard NFA to 
> DFA
> conversion (consider the CSS selectors a collection of NFA rules).

It would be extremely nice if it were always possible to reduce O(n*n) to
O(n).
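In the `* < *:link` case the reduction is straightforward: one bottom-up pass
lets each element reuse its children's already-computed answers instead of
re-scanning subtrees. A toy sketch (hypothetical `Node` class, `:link`
simplified to tag `a`):

```python
class Node:
    def __init__(self, tag, children=()):
        self.tag = tag
        self.children = list(children)

def match_parent_of_link_linear(root):
    """One post-order pass: each element derives 'contains a link' from
    its children's answers, so the whole tree is matched in O(n)."""
    matched = []
    def visit(node):
        has_link = False
        for c in node.children:
            child_has = visit(c)
            if c.tag == 'a' or child_has:
                has_link = True
        if has_link:
            matched.append(node)
        return has_link
    visit(root)
    return matched

# Same worst-case chain as before, now handled in a single pass.
chain = Node('a')
for _ in range(10):
    chain = Node('div', [chain])
print(len(match_parent_of_link_linear(chain)))  # prints 10
```

The catch is exactly the one raised next: a bottom-up pass needs the subtree
to be complete, which collides with incremental rendering.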

> The problem is that it can't be done conclusively for all cases until the 
> entire
> document has been parsed. This either makes the page keep changing style
> while loading and parsing, or delays the display until end of page-load. 
> CSS
> has been designed to be applied incrementally (except for a few odd CSS3
> selectors).

Exactly.

There are a handful of problematic pseudo-classes
(:last-of-type, :only-of-type and :last-child, together with their
:not() negations) that require look-ahead on the
DOM tree (and so conflict with incremental rendering).
But these operate only within the same sibling level.
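The conflict with incremental rendering can be sketched in a few lines (a toy
model of a streaming parser, not any real implementation): :last-child on the
current last sibling is undecidable until the parent element has been closed,
because another sibling may still arrive and revoke the match.

```python
def last_child_matches(children_so_far, parent_closed):
    """During incremental parsing, return the sibling that matches
    :last-child, or None while the answer is still unknowable."""
    if not children_so_far:
        return None
    # Until the parent closes, a new sibling may still appear,
    # so no :last-child match can be committed yet.
    return children_so_far[-1] if parent_closed else None

# Mid-parse: <ul><li>...<li>... (parent still open) -> unknown
print(last_child_matches(['li#1', 'li#2'], parent_closed=False))  # prints None
# After </ul>: the match is final.
print(last_child_matches(['li#1', 'li#2'], parent_closed=True))   # prints li#2
```

This is why an engine must either delay styling or restyle already-rendered
siblings when such pseudo-classes are in play.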

Andrew.


>
> That said there have been suggestions on this list that could be applied
> better incrementally and achieve many of the same effects, and there is 
> also
> the idea to just allow the web authors shoot themselves in the foot if 
> they
> use the parent selector on BODY (worst-case scenario).
>
>
> `Allan
>
> 
Received on Wednesday, 25 April 2007 00:03:56 GMT
