Re: Positioned Layout proposal

> On 10/21/10 12:59 PM, Shelby Moore wrote:
>> Ditto what I wrote above. That D^N complexity assumes the domain is set
>> of
>> all random possibilities.
>
> Yes, and my point is that in this case that set is huge.  Ridiculously so.

Afaics, you were more correct when you wrote before that it is infinite--
not just huge or ridiculous. Infinity is an interesting concept, because
afaik we can only model it in terms of asymptotes on functions; we
otherwise have no mathematical model for it. Somewhere I wrote about this;
I need to dig it up. I want to say more about this, because it is a sort
of fallacy of science, but I'd rather not digress here...

> You can reduce its size hundreds of orders of magnitude via optimizations
> and it remains huge.

We can't reduce infinity. So that right there tells me there is some
fallacy in your way of thinking about the issue. We know for a fact the
page has a solution, because the designer had some mutual information in
mind.

> You're assuming that when all the optimizing is done what will remain is
> tractable.  I'm not convinced this assumption is founded (but of course
> willing to be convinced by either proof or demonstration!).

The key is that in life we are always dealing with infinite possibilities,
but they only occur on the long tail, because if they occurred every day,
we wouldn't exist.

So the domain we have to consider is not random. It is just a matter of
defining our domain. Those are the details I need to dig into later.
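
To make this concrete, here is a toy sketch (a hypothetical example of
mine, not CSS itself) of how exploiting structure collapses a search
space that looks intractable when treated as random:

    # Toy: find widths w1, w2 satisfying w1 + w2 == total.
    # Treated as a random domain, that is a search over D^2 candidates
    # (D = 2^30 possible lengths, as in Gecko) -- intractable.
    # Treated symbolically, the constraint determines the answer directly.
    total = 1000
    w1 = total // 2        # any policy for the one free variable
    w2 = total - w1        # the constraint fixes the rest
    assert w1 + w2 == total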

>>> But in our case N is order of 1e2--1e5 and D is infinite in theory.  In
>>> practice, D is order of 2^{30} in Gecko, say.
>>
>> Typo? Don't you mean in Gecko D^N is on order of 2^30?
>
> No, I meant what I said.  D is order of 2^30: the set of 30-bit signed
> integers.  Those are the possible values of CSS lengths in Gecko.

Yeah, I realized that and corrected myself:

http://lists.w3.org/Archives/Public/www-style/2010Oct/0494.html

Btw, I had another typo: I obviously meant (2^{30})^5 = 2^{150}. You
should also note that I am nearly blind and have no vision in one eye
(apologies for the large quantity of typos).

Good point on how you quantify the entropy.
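
For concreteness, a quick back-of-envelope of the magnitudes in play,
using Boris's figures (D = 2^{30} possible lengths, N boxes); log2 of the
state count is the entropy in bits:

    import math

    D = 2 ** 30                      # possible CSS length values in Gecko
    for N in (5, 100, 100000):       # box counts discussed in this thread
        bits = N * math.log2(D)      # log2(D^N) = N * log2(D)
        print("N=%6d: D^N = 2^%d" % (N, int(bits)))
    # N=     5: D^N = 2^150
    # N=   100: D^N = 2^3000
    # N=100000: D^N = 2^3000000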

> D^N is therefore order of 2^{3000} to 2^{3000000}.
>
>>> Where is this estimate coming from?  What are we measuring, even?
>>> Flops are operations/second; what does it mean to measure problem
>>> complexity in flops?
>>
>> I was just making some wild guesstimate of complexity and typical use
>> case in my head, and extrapolating to ops/sec.
>
> Ops/sec to do what in how much time?  It's easy to have low ops/sec if
> you give yourself lots of time for the task.
>
> Look, the relevant constraint here is that incremental layout of pages
> with tens of thousands of boxes and complicated constraints on the
> sizing needs to take order of tens to thousands of ms on modern
> hardware; that's what current implementations manage.

Yup. But the big limitation right now in the mobile
least-common-denominator is the battery, not the CPU.

And battery energy density is increasing fairly rapidly; with
nanotechnology, I expect it to roughly pace Moore's Law, and it will take
us years to build a more generalized CSS anyway.

And on top of all that, I actually expect the more generalized model to be
faster than current CSS, because we will automatically discover
optimizations that are obscured to us by our current
proliferation-of-special-cases way of thinking about and implementing the
relationships in code.
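
As a sanity check on Boris's timing constraint, a rough ops budget (the
throughput figure is my assumption, not from this thread):

    boxes = 10000              # "tens of thousands of boxes"
    budget_s = 0.05            # ~50 ms, within "tens to thousands of ms"
    ops_per_sec = 1e9          # assumed order-of-magnitude CPU throughput
    print(budget_s * ops_per_sec / boxes)   # ~5000 ops per box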

>> We would really need to dig more into this.  Afaics, you are raising
>> issues which belong to the next stage of things I wrote I want to
>> contemplate.  So I will have to put that off for now, and come back to
>> it later.
>
> OK.  As long as you don't start demanding that people rewrite CSS in
> terms of stuff that's not sanely implementable, I don't much mind.  ;)

Oh, I expect it to be much more sane; otherwise I wouldn't be expending my
time on it. :)
