
Re: [selectors-api] Summary of Feature Requests for v2

From: Jonas Sicking <jonas@sicking.cc>
Date: Thu, 24 Sep 2009 13:44:58 -0700
Message-ID: <63df84f0909241344m1b867d70lcf42a580483c7886@mail.gmail.com>
To: John Resig <jresig@mozilla.com>
Cc: Lachlan Hunt <lachlan.hunt@lachy.id.au>, public-webapps <public-webapps@w3.org>
On Thu, Sep 24, 2009 at 8:59 AM, John Resig <jresig@mozilla.com> wrote:
> Another alternative
> would be to implement the merge/sort/unique method and have it return a
> NodeList (which would, then, have qSA).
>
> For example:
>     document.createNodeList([ ... some elements ... ]).querySelectorAll("em, strong");
>
> createNodeList would create a NodeList holding the DOM nodes in document
> order (with duplicates removed). Since it's a proper NodeList we could then
> use qSA to find the elements that we want.
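The createNodeList semantics described above (document order, duplicates removed) could be sketched as follows. This is a minimal simulation, not the proposed API: numbers stand in for nodes, since a real implementation would order elements with Node.compareDocumentPosition rather than numeric comparison.

```javascript
// Sketch of the proposed createNodeList: sort the input into
// "document order" and drop duplicates. Numbers stand in for
// DOM nodes; a real implementation would compare nodes using
// Node.compareDocumentPosition instead of numeric order.
function createNodeList(nodes) {
  const sorted = [...nodes].sort((a, b) => a - b);
  return sorted.filter((node, i) => i === 0 || node !== sorted[i - 1]);
}

console.log(createNodeList([5, 1, 3, 1, 5])); // → [ 1, 3, 5 ]
```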

My concern with this API is that it forces the implementation to
always sort the input array, even when it is already sorted, and then
merge the individual results from querySelectorAll. It would be faster
to simply run the query on each node and then merge the sorted
results.
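The alternative sketched above — run the query per root, then combine the already-ordered result lists — amounts to a k-way merge with duplicate removal. A minimal sketch, again with numbers standing in for nodes in document-position order (each inner list simulates one root's querySelectorAll result, which is already sorted):

```javascript
// Merge k sorted lists of "nodes" (numbers standing in for
// document-position order) into one sorted, duplicate-free list.
// This simulates merging per-root querySelectorAll results;
// duplicates arise when the roots' subtrees overlap.
function mergeSortedUnique(lists) {
  const result = [];
  const indices = lists.map(() => 0);
  for (;;) {
    // Pick the list whose next element comes earliest in document order.
    let best = -1;
    for (let i = 0; i < lists.length; i++) {
      if (indices[i] < lists[i].length &&
          (best === -1 || lists[i][indices[i]] < lists[best][indices[best]])) {
        best = i;
      }
    }
    if (best === -1) break; // all lists exhausted
    const node = lists[best][indices[best]++];
    if (result.length === 0 || result[result.length - 1] !== node) {
      result.push(node); // skip duplicates from overlapping subtrees
    }
  }
  return result;
}

// e.g. results from three roots whose subtrees overlap:
console.log(mergeSortedUnique([[1, 4, 7], [2, 4, 9], [3, 7]]));
// → [ 1, 2, 3, 4, 7, 9 ]
```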

> If this is how it's implemented it actually becomes really useful to have
> the NodeList-based element filtering.
>
>     document.createNodeList([ ... some elements ... ]).filterSelector("em, strong")
>
> (Since this would be much faster than using Array.filter or some other
> method.)

Are you sure that Array.filter would result in a significant perf hit?
With recent JIT improvements, it would be great if we didn't have to
make everything a native method, but could instead rely on JavaScript
to do part of the work without taking too much of a perf hit.
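The script-level alternative alluded to here might look like the sketch below, using a `matches`-style selector test per element (today this is `Element.matches`; at the time of this thread it existed only as vendor-prefixed `matchesSelector`). Plain objects with a stubbed, tag-name-only `matches` stand in for DOM elements so the sketch runs outside a browser:

```javascript
// Filter an array of element-like objects by selector using plain
// Array.prototype.filter instead of a native filterSelector method.
// Real code would call Element.matches(selector); the stub below
// matches on tag name only, just to keep the sketch self-contained.
function makeElement(tagName) {
  return {
    tagName,
    // crude stand-in for Element.matches: comma-separated tag list
    matches(selector) {
      return selector.split(',')
        .map(s => s.trim().toUpperCase())
        .includes(this.tagName);
    },
  };
}

function filterSelector(elements, selector) {
  return elements.filter(el => el.matches(selector));
}

const nodes = ['EM', 'DIV', 'STRONG', 'P'].map(makeElement);
const hits = filterSelector(nodes, 'em, strong').map(el => el.tagName);
console.log(hits); // → [ 'EM', 'STRONG' ]
```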

/ Jonas
Received on Thursday, 24 September 2009 20:46:04 GMT