- From: Boris Zbarsky <bzbarsky@MIT.EDU>
- Date: Thu, 21 Oct 2010 12:43:38 -0400
- To: shelby@coolpage.com
- CC: www-style list <www-style@w3.org>
On 10/21/10 12:32 PM, Shelby Moore wrote:
> Good point. I am glad we can get some initial thinking about quantification.
>
> The key may be how multi-level hierarchy is handled. I was supposing it
> is necessary that the hierarchy places some restrictions on combinations,
> as is the case with the CSS box model.
It's reasonably common to have thousands to tens of thousands of
siblings in the CSS box model.
>> And these are constraints on variables with, typically, positive reals
>> or reals as domains, correct?
>
> I have not yet contemplated how large the per table entry data structure
> is
The question is not one of data structure size, but of algorithmic
complexity. Most constraint satisfaction algorithms I have been able to
find seem, at first read, to assume finite domains for the variables
and to have complexity at least D^N, where N is the number of variables
and D is the domain size. Again, maybe I'm just totally
misunderstanding them....
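For what it's worth, here is a minimal sketch (Python, purely
illustrative; the variable names and the toy constraint are invented,
and nothing like this runs in Gecko) of the kind of naive finite-domain
backtracking search those descriptions suggest; the nested choice over
every value of every variable's domain is where a D^N worst case comes
from.

    # Naive finite-domain backtracking search: with no cleverness it can
    # end up trying every combination of values, i.e. up to D^N
    # assignments for N variables whose domains each hold D values.
    def backtrack(variables, domains, constraints, assignment=None):
        if assignment is None:
            assignment = {}
        if len(assignment) == len(variables):
            return assignment                  # every variable has a value
        var = next(v for v in variables if v not in assignment)
        for value in domains[var]:             # D choices at this level
            assignment[var] = value
            if all(ok(assignment) for ok in constraints):
                result = backtrack(variables, domains, constraints, assignment)
                if result is not None:
                    return result
            del assignment[var]
        return None                            # no consistent assignment

    # Toy, layout-flavored example: two widths that must sum to 100.
    variables = ["width_a", "width_b"]
    domains = {v: range(101) for v in variables}   # D = 101 here; ~2^30 above
    constraints = [lambda a: ("width_a" not in a or "width_b" not in a
                              or a["width_a"] + a["width_b"] == 100)]
    print(backtrack(variables, domains, constraints))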
But in our case N is on the order of 1e2 to 1e5, and D is infinite in
theory. In practice, D is on the order of 2^30 in Gecko, say.
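Just to illustrate the scale those numbers imply (a back-of-the-envelope
figure, not a claim about any real solver): even at the small end,
D^N = (2^30)^100 = 2^3000, which is roughly 10^903 candidate assignments.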
> but if it is less than 1024 bytes, then n = 1024 is less than 1 GB of
> virtual memory (hopefully physical memory)
Per page, yes?
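(For scale, taking the quoted figures at face value: 1024 entries at
1024 bytes each is 2^20 bytes = 1 MiB, which per page is well under the
1 GB mentioned.)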
> and let's just ballpark guesstimate on the order of less than Gflops.
Where is this estimate coming from? What are we measuring, even? Flops
are operations per second; what does it mean to measure problem
complexity in flops?
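(As a back-of-the-envelope illustration of the distinction: a problem
requiring 10^12 operations takes about 1000 seconds on a 1 Gflop/s
machine; the 10^12 is the complexity, and the flop rate only converts
it into time.)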
-Boris