
Re: Positioned Layout proposal

From: Shelby Moore <shelby@coolpage.com>
Date: Thu, 21 Oct 2010 12:59:54 -0400
Message-ID: <1345bcfc074b7473d8ff29c57fb6db93.squirrel@sm.webmail.pair.com>
To: "Boris Zbarsky" <bzbarsky@MIT.EDU>
Cc: "www-style list" <www-style@w3.org>
> On 10/21/10 12:32 PM, Shelby Moore wrote:
>> Good point. I am glad we can get some initial thinking about
>> quantification.
>> The key may be how multi-level hierarchy is handled.  I was supposing it
>> is necessary that the hierarchy places some restrictions on
>> combinations,
>> as is the case with the CSS box model.
> It's reasonably common to have thousands to tens of thousands of
> siblings in the CSS box model.

But they probably have a repeating pattern, and I am supposing such
repeating patterns will present an opportunity for optimization.

It is very unlikely to have maximum entropy (randomization) amongst 10,000
elements in a page. I doubt anything could be comprehended from such a page,
because by definition there is no compressible (Shannon) structure in the
case of maximum entropy.
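To make that concrete, here is a rough sketch (Python, purely illustrative,
names mine) of how the Shannon entropy of a page's sibling structure
collapses when the siblings follow a repeating template:

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy, in bits per symbol, of a sequence of symbols."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# 10,000 siblings that all repeat one row template: zero entropy,
# so the structure is fully compressible/optimizable.
repeating = ["row"] * 10_000

# 10,000 siblings each with a unique structure: maximum entropy,
# log2(10000) ~= 13.29 bits per symbol, nothing to exploit.
unique = list(range(10_000))

print(shannon_entropy(repeating) == 0.0)      # True
print(round(shannon_entropy(unique), 2))      # 13.29
```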

What CSS currently does is model some of the most popular repeating
patterns, e.g. table columns and rows.

I am supposing we can generalize the search for patterns and optimize them.
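A hypothetical sketch of the kind of generalization I mean (the names and
the signature-hashing idea are mine, not a proposal for the spec): derive a
constraint signature per sibling, solve each unique signature once, and
reuse the result across repeats:

```python
def layout_siblings(siblings, solve):
    """Solve layout constraints once per unique constraint signature.

    `siblings` is a list of hashable constraint signatures and `solve`
    is an (expensive) constraint solver. With heavy repetition this does
    O(unique signatures) solves instead of O(n).
    """
    cache = {}
    results = []
    for sig in siblings:
        if sig not in cache:
            cache[sig] = solve(sig)
        results.append(cache[sig])
    return results

calls = []
def fake_solve(sig):
    calls.append(sig)
    return len(sig)  # stand-in for a computed layout

# 10,000 table rows sharing 2 templates -> only 2 solver calls.
rows = ["odd-row", "even-row"] * 5_000
layout_siblings(rows, fake_solve)
print(len(calls))  # 2
```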

>>> And these are constraints on variables with, typically, positive reals
>>> or reals as domains, correct?
>> I have not yet contemplated how large the per table entry data structure
>> is
> The question is not of data structure size, but of algorithmic
> complexity.  Most constraint satisfaction algorithms I have been able to
> find seem to assume finite domains for the variables, and have
> complexity at least D^N where N is the number of variables and D is the
> domain size, at first read.  Again, maybe I'm just totally
> misunderstanding them....

Ditto what I wrote above. That D^N complexity assumes the domain is the set
of all random possibilities.
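A toy illustration of the point (my own sketch, with a made-up monotone
constraint standing in for real layout constraints): a naive finite-domain
enumerator touches D^N tuples, but if a repeating pattern ties the N
variables to a couple of template variables, the exponent collapses:

```python
from itertools import product

def brute_force_solutions(n_vars, domain, constraint):
    """Enumerate all D^N assignments and keep those satisfying `constraint`."""
    return [a for a in product(domain, repeat=n_vars) if constraint(a)]

domain = range(4)  # D = 4
# Toy constraint: non-decreasing values, e.g. monotone column widths.
constraint = lambda a: all(x <= y for x, y in zip(a, a[1:]))

# Naive search over N = 6 independent variables scans D^N = 4096 tuples.
sols = brute_force_solutions(6, domain, constraint)

# If a repeating pattern reduces the 6 variables to 2 template
# variables, the scan drops from 4^6 = 4096 to 4^2 = 16 tuples.
sols_tied = brute_force_solutions(2, domain, constraint)

print(len(sols), len(sols_tied))  # 84 10
```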

> But in our case N is order of 1e2--1e5 and D is infinite in theory.  In
> practice, D is order of 2^{30} in Gecko, say.

Typo? Don't you mean that in Gecko D^N is on the order of 2^30?

>> but if it is less than 1024 bytes, then n = 1024 is less than 1 GB of
>> virtual memory (hopefully physical memory)
> Per page, yes?

Yes, but virtual memory, and that is before any such optimization for
repeating patterns. And 1024 was just pulled out of the air; it might be
1/10 or 10 times that (but I lean towards 1/10).

>> and let's just ballpark guesstimate on the order of less than Gflops.
> Where is this estimate coming from?  What are we measuring, even?  flops
> are operations/second; what does it mean to measure problem complexity
> in flops?

I was just making a wild guesstimate of the complexity and the typical use
case in my head, and extrapolating to ops/sec.

We would really need to dig more into this. Afaics, you are raising issues
that belong to the next stage of things I wrote I want to contemplate. So I
will have to put that off for now and come back to it later.
Received on Thursday, 21 October 2010 17:00:22 UTC
