Re: Coverage analysis

Hi Bryan, all,

Sorry for taking so long to answer. Due to a major screw-up on my end, 
a lot of the emails in these conversations were being filtered to very 
much the wrong corner of my email client :( I've fixed that and am 
catching up on the feedback now.

On 13/02/2013 16:43, SULLIVAN, BRYAN L wrote:
> Some requests for clarification:
>
> 1.How was the “tests” column determined? Is this current tests or
> estimated tests needed?

It is the current number of tests. It was obtained by running every 
single test file in the repository and listening to the callbacks that 
testharness.js triggers, in order to count the tests each file declares.
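
In essence, the counting boils down to something like the sketch below. 
The add_completion_callback() hook is real testharness.js API; the 
postMessage plumbing and the enclosing runner page are a simplification 
for illustration, not the actual script:

    // Injected alongside testharness.js into each test file (sketch only).
    add_completion_callback(function(tests, status) {
      // "tests" is the array of individual tests the file declared,
      // so its length is the per-file test count.
      window.parent.postMessage({
        file: location.pathname,
        count: tests.length
      }, "*");
    });

A wrapper page then loads every file in an iframe and sums up the 
reported counts.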

> 2.If the former, do you intend to add a column for “estimated tests needed”?

My plan with the coverage document was to provide the input information 
we need in order to figure out goals such as this one, yes. Whether such 
a column belongs in the document itself is a matter of where we think 
that information should best live.

> 3.Why the huge “Tests” # for “parsing”?

It has a lot of tests :) Those were written for html5lib. In fact, 
according to James, if I understand correctly, I've undercounted those 
by a factor of *three*, because that particular sub-testsuite runs three 
different batches (I'm guessing the same parsing tests but in different 
contexts) depending on the query string.

> 4.I’ve heard that there are ~11K tests in the HTML WG suite, but there
> are only 1500 approved test files in the repository. Is there a mapping
> somewhere between these?

One file can contain more than one test, and some files are only there 
for support purposes and contain no tests at all. I'm not sure what kind 
of mapping you're looking for, but I can tell you how many tests there 
are in any given file.
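
For example, a single testharness.js file can declare several tests, 
each of which shows up in the count. A made-up example (not an actual 
file from the repository):

    test(function() {
      assert_true(document.createElement("div") instanceof HTMLDivElement);
    }, "createElement() returns an HTMLDivElement for 'div'");

    test(function() {
      assert_equals(typeof document.querySelector, "function");
    }, "querySelector() is exposed on Document");

That file counts as two tests, whereas a pure support file (say, a 
child document loaded into an iframe by another test) counts as zero.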

That said, please note that we have a goldmine of tests in the 
"submitted" directory of html-testsuite. Reviewing it all is a fair bit 
of work, but it should really help increase our testing coverage: a lot 
of the gaps that show up in the coverage report are in fact already 
covered there (assuming the submitted tests are good).

> 5.In the testing meeting, I heard (it’s in the minutes) of “10000”
> features in HTML5 and estimates of $100-$200/test with a rollup to
> $1-$2M in effort to develop them. These numbers don’t seem to match up.
> If there are already ~11K tests, how many tests are expected across all
> 10000 features of HTML5? The resulting rolled up cost would seem to be
> much more than $2M unless “features” was meant rather than “tests”, re
> the cost per each.

To be honest, I don't really know how to estimate this part properly (I 
don't really know how to count features). I just count the JavaScript 
thingies and let the grown-ups like Tobie deal with the money :)

-- 
Robin Berjon - http://berjon.com/ - @robinberjon
