- From: Boris Zbarsky <bzbarsky@MIT.EDU>
- Date: Wed, 10 Jul 2013 11:16:34 -0400
- To: Brian Kardell <bkardell@gmail.com>
- CC: "www-style@w3.org" <www-style@w3.org>
On 7/10/13 10:58 AM, Brian Kardell wrote:
> When DOM changes are complete.

Define "complete"?

> A large number of pages have at most a few thousand elements - even a
> full body scan one time after the DOM is done

Define "done"?

Let's talk for a second about how DOM mutations are handled in current CSS implementations. I'll focus on Gecko and WebKit, since those are what I'm familiar with, though I suspect Servo will end up with something quite a bit different here.

When a DOM mutation happens, a CSS implementation will make a note to recompute styles on some part of the DOM tree. Then later, when it processes style changes (typically off an async event of some sort; in Gecko it happens from a layout flush or a requestAnimationFrame tick, basically), it will do the recomputation.

Two things are in tension here: on the one hand, you want to mark as little of the DOM tree as possible as needing recomputation, and on the other hand you want your marking to be as fast as possible, because it happens on every DOM mutation.

So the worst-case scenario, in some sense, with a "slow" selector for which finding the right set of affected nodes is too slow is that every DOM mutation just marks the whole tree as needing recomputation, and then you do that recomputation async.

Now the bad part is, doing recomputation on the whole tree can be pretty slow even if you only have a few thousand elements, depending on how many _selectors_ you have around. It can quite easily take tens of milliseconds, and it's not too hard to find cases where it takes on the order of a second in current browsers.

Is your suggestion that this worst-case behavior is not actually all that bad, and that we should just live with it until we figure out a way to not hit it, basically?

-Boris
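[Editor's note: the following is a minimal sketch of the pattern Boris describes above (cheap dirty-marking on every mutation, with the actual style recomputation deferred to an async flush), not Gecko's or WebKit's actual code. All names here (StyleSet, StyledElement, noteMutation, flushRestyles) are hypothetical.]

```ts
// Sketch only: illustrates "mark dirty now, recompute async later".
// Not real engine code; names and structure are invented for clarity.

interface StyledElement {
  children: StyledElement[];
  needsRestyle: boolean;
}

class StyleSet {
  private restyleScheduled = false;
  private restyleRoots = new Set<StyledElement>();

  // Runs on every DOM mutation, so it must stay cheap: record which
  // subtree is dirty and make sure a flush is scheduled, nothing more.
  noteMutation(el: StyledElement): void {
    el.needsRestyle = true;
    this.restyleRoots.add(el);
    if (!this.restyleScheduled) {
      this.restyleScheduled = true;
      // In a real engine this hangs off a layout flush or a
      // requestAnimationFrame tick; setTimeout stands in for that here.
      setTimeout(() => this.flushRestyles(), 0);
    }
  }

  // The "slow selector" worst case from the email: give up on precise
  // invalidation, dirty the whole tree, and pay the full cost async.
  noteMutationWorstCase(documentRoot: StyledElement): void {
    this.noteMutation(documentRoot);
  }

  private flushRestyles(): void {
    this.restyleScheduled = false;
    for (const root of this.restyleRoots) {
      this.recomputeStyles(root);
    }
    this.restyleRoots.clear();
  }

  // Cost scales roughly with (elements in the dirty subtree) x (selectors),
  // which is why whole-tree restyles can take tens of milliseconds or worse.
  private recomputeStyles(el: StyledElement): void {
    el.needsRestyle = false;
    // ... match selectors against `el` and compute its style here ...
    for (const child of el.children) {
      this.recomputeStyles(child);
    }
  }
}
```

The point of the sketch is only that the per-mutation work is the cheap part; the async flush is where the tens-of-milliseconds (or worse) cost shows up when the whole tree has been marked dirty.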
Received on Wednesday, 10 July 2013 15:17:09 UTC