Re: [selector-profiles] confusion

From: Brian Kardell <bkardell@gmail.com>
Date: Wed, 10 Jul 2013 12:18:03 -0400
Message-ID: <CADC=+jfw23psb4VXxEcuDod1MtUV6vfNDGNXvqW1bOymu7denQ@mail.gmail.com>
To: Boris Zbarsky <bzbarsky@mit.edu>
Cc: "www-style@w3.org" <www-style@w3.org>
I am going to answer these in reverse order, as I think that makes the most
sense for my case...


> So the worst-case scenario in some sense with a "slow" selector for which
> finding the right set of affected nodes is too slow is that every DOM
> mutation just marks the whole tree as needing recomputation and then you do
> that recomputation async.  Now the bad part is, doing recomputation on the
> whole tree can be pretty slow even if you only have a few thousand
> elements, depending on how many _selectors_ you have around.  It can quite
> easily take tens of milliseconds, and it's not too super-hard to find cases
> where it takes order of a second in current browsers.

> Is your suggestion that this worst-case behavior is not actually all that
> bad and we should just live with it until we figure out a way to not hit
> it, basically?

In a nutshell, yes, I am suggesting that the worst-case behavior is not that
bad, and that there are probably some simple things we can do to avoid the
*really* worst cases too.  If a handful of subject selectors, for example,
takes 10-20ms, I honestly think that is a tradeoff many designers could live
with... And yes, then it becomes a fun game of which browser can come up
with the best strategy to be faster than the next.

> Define "done"?

It's debatable - I don't think that is the critical sticking point here...
Just: delayed until after CSS has finished its hot/heavy mutation work.

> Let's talk for a second about how DOM mutations are handled in current
> CSS implementations.  I'll focus on Gecko and WebKit, since those are what
> I'm familiar with, though I suspect Servo will end up with something quite
> a bit different here.

> When a DOM mutation happens, a CSS implementation will make a note to
> recompute styles on some part of the DOM tree.  Then later, when it
> processes style changes (typically off an async event of some sort; in
> Gecko it happens from a layout flush or requestAnimationFrame tick,
> basically) it will do the recomputation.  Two things are in tension here:
> on the one hand, you want to mark as little of the DOM tree as possible as
> needing recomputation, and on the other hand you want your marking to be as
> fast as possible because it happens on every DOM mutation.
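The mark-then-flush pattern described above can be sketched with a toy model (plain objects standing in for DOM nodes; all names are mine, not Gecko's or WebKit's):

```javascript
// Toy tree node: children plus a dirty flag meaning "styles need
// recomputation".
function makeNode(name, children = []) {
  return { name, children, dirty: false };
}

// Fast path, run on every DOM mutation: mark the smallest subtree we can
// cheaply prove is affected.  This must be fast precisely because it runs
// on every mutation.
function markDirty(node) {
  node.dirty = true;
}

// Slow path, run later off an async tick (a layout flush or
// requestAnimationFrame tick in real engines): walk the tree and
// recompute only what was marked.
function flushStyles(root, recomputed = []) {
  if (root.dirty) {
    recomputed.push(root.name); // stand-in for actual selector matching
    root.dirty = false;
  }
  for (const child of root.children) flushStyles(child, recomputed);
  return recomputed;
}
```

The tension Boris describes lives in `markDirty`: the more precisely it identifies affected nodes, the less `flushStyles` has to do, but the more it costs per mutation.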

Yes, in fact, it was your explanation of how some of those things work, and
the discussions about it, that led to our developing a strategy for Hitch...
We catch mutation data and mark nodes in a throttle; past some point we just
say 'do the whole body'.  We also do simple filtering on the rules that need
to be tested, and I think that helps.  It seems acceptably fast for a number
of use cases in CSS - and yes, it comes with the warning that these
selectors are not optimized and don't calculate until after 'native CSS'...
I'm just wondering whether something similar might be a better definition
for a "profile", one that is at least usable in a sheet without other
hackery (which will always be slower anyway), rather than simply not
supporting these selectors at all.
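The strategy described - batch mutations, fall back to whole-body recomputation past a threshold, and pre-filter rules - might look something like this sketch.  The threshold and all names are illustrative assumptions, not Hitch's actual code:

```javascript
// Assumed tuning knob: past this many distinct mutated nodes, precise
// tracking costs more than it saves.
const THRESHOLD = 100;

let pending = new Set(); // mutated nodes collected during the throttle
let wholeBody = false;   // degraded mode: recompute everything

function onMutation(node) {
  if (wholeBody) return; // already giving up on precision this cycle
  pending.add(node);
  if (pending.size > THRESHOLD) {
    pending.clear();
    wholeBody = true;    // "just do the whole body"
  }
}

// Cheap pre-filter on rules: only test rules whose rightmost compound
// could possibly match something in the mutated batch (here, by tag name).
function filterRules(rules, tagNames) {
  return rules.filter(rule => tagNames.has(rule.rightmostTag));
}
```

The whole-body fallback bounds the bookkeeping cost per mutation, and the rule filter keeps the eventual recomputation from testing selectors that cannot possibly have been affected.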
Received on Wednesday, 10 July 2013 16:18:35 UTC
