Re: Positioned Layout proposal

On 10/21/10 12:59 PM, Shelby Moore wrote:
> Ditto what I wrote above. That D^N complexity assumes the domain is the
> set of all random possibilities.

Yes, and my point is that in this case that set is huge.  Ridiculously so.

You can reduce its size by hundreds of orders of magnitude via 
optimizations and it remains huge.

You're assuming that when all the optimizing is done, what remains will 
be tractable.  I'm not convinced that assumption is well-founded (but I 
am of course willing to be convinced by either proof or demonstration!).

>> But in our case N is order of 1e2--1e5 and D is infinite in theory.  In
>> practice, D is order of 2^{30} in Gecko, say.
>
> Typo? Don't you mean in Gecko D^N is on order of 2^30?

No, I meant what I said.  D is order of 2^30: the set of 30-bit signed 
integers.  Those are the possible values of CSS lengths in Gecko.

D^N is therefore order of 2^{3000} to 2^{3000000}.
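A quick back-of-the-envelope check of that arithmetic (illustrative only; the constants are the ones stated above, not measurements):

```python
# Size of the naive search space D^N for the two endpoints of N given above.
# D ~ 2^30 (30-bit CSS lengths in Gecko), N ~ 1e2 to 1e5 constrained lengths.
D_BITS = 30

for n in (100, 100_000):
    # |D^N| = (2^30)^N = 2^(30*N); report the exponent, not the number itself.
    print(f"N = {n:>6}: |D^N| ~ 2^{D_BITS * n}")
```

Running this prints exponents of 3000 and 3000000, matching the range quoted above.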

>> Where is this estimate coming from?  What are we measuring, even?  flops
>> are operations/second; what does it mean to measure problem complexity
>> in flops?
>
> I was just making some wild guesstimate of complexity and typical use
> case in my head, and extrapolating to ops/sec.

Ops/sec to do what in how much time?  It's easy to have low ops/sec if 
you give yourself lots of time for the task.

Look, the relevant constraint here is that incremental layout of pages 
with tens of thousands of boxes and complicated constraints on sizing 
needs to take on the order of tens to hundreds of ms on modern hardware; 
that's what current implementations manage.
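To make that constraint concrete, here is a rough per-box operation budget it implies. All three constants below are my assumptions for illustration, not figures from this thread:

```python
# Rough per-box budget implied by the layout-time constraint above.
OPS_PER_SEC = 1e9      # assumed throughput for layout-style integer work
BUDGET_MS   = 50       # assumed incremental-layout budget (tens of ms)
BOXES       = 50_000   # "tens of thousands of boxes"

total_ops = OPS_PER_SEC * BUDGET_MS / 1000  # ops available in the budget
per_box   = total_ops / BOXES
print(f"~{per_box:,.0f} ops per box")
```

Any algorithm whose per-box cost grows with D (let alone D^N) blows through a budget of roughly a thousand operations per box immediately.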

> We would really need to dig more into this.  Afaics, you are raising
> issues which are for the next stage things I wrote I want to contemplate.
> So I will have to put that off for now, and come back to it later.

OK.  As long as you don't start demanding that people rewrite CSS in 
terms of stuff that's not sanely implementable, I don't much mind.  ;)

-Boris

Received on Thursday, 21 October 2010 17:13:37 UTC