
Asymptotic computational complexity bounds for DOM methods

From: Maciej Stachowiak <mjs@apple.com>
Date: Sun, 26 Feb 2006 15:07:13 -0800
Message-Id: <06EB48D8-7D8F-421B-96F2-D85542CAC3AC@apple.com>
To: "Web APIs WG (public)" <public-webapi@w3.org>



We should consider placing computational complexity bounds on DOM  
methods in the future. For example, it would be perfectly legal to  
make getElementById O(N) in the number of elements in the document,  
or even O(N^2). But in practice, unless it is O(1) your  
implementation will not be interoperable with web content, as there  
will be web apps that lock the user's machine for hours or even  
days.
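As a sketch of why the O(1) bound is achievable (hypothetical code, not
any real engine's implementation), an implementation can maintain a side
table mapping id to element alongside the tree, turning the lookup into
an average-case O(1) dictionary probe instead of an O(N) tree walk:

```python
# Hypothetical sketch: an id -> element side index makes getElementById
# O(1) on average, versus an O(N) walk over the whole tree.

class Element:
    def __init__(self, tag, elem_id=None):
        self.tag = tag
        self.id = elem_id
        self.children = []

class Document:
    def __init__(self, root):
        self.root = root
        self._by_id = {}   # side index; a real engine must also update
        self._index(root)  # it on every mutation and handle duplicate ids

    def _index(self, node):
        if node.id is not None:
            self._by_id[node.id] = node
        for child in node.children:
            self._index(child)

    def get_element_by_id_slow(self, elem_id):
        # What a naive implementation does: O(N) depth-first walk.
        stack = [self.root]
        while stack:
            node = stack.pop()
            if node.id == elem_id:
                return node
            stack.extend(node.children)
        return None

    def get_element_by_id(self, elem_id):
        # O(1) average-case hash lookup against the side index.
        return self._by_id.get(elem_id)
```

Both paths return the same node; the index only pays off because pages
call getElementById far more often than they mutate ids.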

These don't necessarily have to be the tightest possible bounds. For  
example, requiring Node.nextSibling to be O(1) might limit  
implementations too much and an upper bound of O(log N) might be  
sufficient. Sometimes there are also information-theoretic limits to  
consider. For instance, it is impossible to make both  
Node.insertBefore and Node.childNodes.item O(1) in the same  
implementation (a constant-time insert cannot renumber every later  
sibling, yet item(i) must reflect the shifted positions), although  
you could make either one O(1) and the other O(N), or (I think) both  
O(log N).
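The two ends of that trade-off can be sketched with hypothetical child
list structures (illustrative only, not DOM code): an array gives O(1)
item but O(N) insertBefore, while a doubly linked list gives O(1)
insertBefore (given the reference node) but O(N) item:

```python
# Array-backed children: item(i) is O(1) indexing, but insertBefore
# must locate the reference and shift every later sibling, so O(N).
class ArrayChildren:
    def __init__(self):
        self._items = []

    def item(self, i):
        return self._items[i]          # O(1)

    def insert_before(self, new, ref):
        # ref=None appends, mirroring DOM insertBefore's null case.
        idx = self._items.index(ref) if ref is not None else len(self._items)
        self._items.insert(idx, new)   # O(N) shift

# Doubly linked children: insertBefore is O(1) pointer surgery given
# the reference node, but item(i) must walk i links, so O(N).
class LinkedNode:
    def __init__(self, value):
        self.value = value
        self.prev = None
        self.next = None

class LinkedChildren:
    def __init__(self):
        self.first = None

    def item(self, i):
        node = self.first              # O(N) walk
        for _ in range(i):
            node = node.next
        return node

    def insert_before(self, new, ref):
        # O(1): splice in front of `ref` by updating a fixed number
        # of links (ref must be an existing child in this sketch).
        new.next = ref
        new.prev = ref.prev
        if ref.prev is not None:
            ref.prev.next = new
        else:
            self.first = new
        ref.prev = new
```

A balanced tree indexed by subtree size (an order-statistic tree) is
the usual way to get both operations in O(log N).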

So maybe this is not a useful exercise for all methods, but for some  
where the performance issues are non-obvious, it may be helpful.  
There are also issues with how such bounds could be tested.

Regards,
Maciej
Received on Sunday, 26 February 2006 23:07:14 GMT
