- From: Jonas Sicking <jonas@sicking.cc>
- Date: Thu, 24 Sep 2009 13:44:58 -0700
- To: John Resig <jresig@mozilla.com>
- Cc: Lachlan Hunt <lachlan.hunt@lachy.id.au>, public-webapps <public-webapps@w3.org>
On Thu, Sep 24, 2009 at 8:59 AM, John Resig <jresig@mozilla.com> wrote:
> Another alternative would be to implement the merge/sort/unique method
> and have it return a NodeList (which would, then, have qSA).
>
> For example:
> document.createNodeList([ ... some elements ... ]).querySelectorAll("em, strong");
>
> createNodeList would create a NodeList holding the DOM nodes in document
> order (with duplicates removed). Since it's a proper NodeList we could
> then use qSA to find the elements that we want.

My concern with this API is that it forces the implementation to always
sort the array, even if it is already sorted, and then do a merge sort on
the individual results from querySelectorAll. It would be faster to simply
run the query on each node and then merge-sort the results.

> If this is how it's implemented it actually becomes really useful to
> have the NodeList-based element filtering.
>
> document.createNodeList([ ... some elements ... ]).filterSelector("em, strong")
>
> (Since this would be much faster than using Array.filter or some other
> method.)

Are you sure that Array.filter would result in a significant perf hit?
With all the recent JIT work it would be great if we didn't have to rely
on making everything a native method, but could instead rely on
JavaScript to do part of the work without taking too much of a perf hit.

/ Jonas
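
For illustration, a rough sketch of the "run the query on each node, then
merge" approach described above, written in plain DOM JavaScript. The names
queryEach and documentOrderMerge are invented here and are not part of any
proposed API; a real implementation would likely do a proper k-way merge of
the already-sorted partial results instead of a full sort.

// Sketch only: query each root separately, then merge the partial
// results into document order and drop duplicates.
function queryEach(roots, selector) {
  var results = [];
  for (var i = 0; i < roots.length; i++) {
    // querySelectorAll already returns its matches in document order,
    // so each partial result is a sorted run.
    results.push(roots[i].querySelectorAll(selector));
  }
  return documentOrderMerge(results);
}

function documentOrderMerge(lists) {
  // Flatten, sort by document position, then drop duplicates.
  var all = [];
  for (var i = 0; i < lists.length; i++) {
    for (var j = 0; j < lists[i].length; j++) {
      all.push(lists[i][j]);
    }
  }
  all.sort(function (a, b) {
    if (a === b) return 0;
    // DOCUMENT_POSITION_FOLLOWING is set when b comes after a.
    return (a.compareDocumentPosition(b) & Node.DOCUMENT_POSITION_FOLLOWING) ? -1 : 1;
  });
  // Identical nodes end up adjacent after sorting, so a simple
  // adjacent-duplicate filter is enough.
  return all.filter(function (node, index) {
    return index === 0 || node !== all[index - 1];
  });
}

Whether a pure-JS merge like this is fast enough, or whether it needs to be
a native method, is exactly the Array.filter/JIT question raised above.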
Received on Thursday, 24 September 2009 20:46:04 UTC