- From: Tobie Langel <tobie@fb.com>
- Date: Mon, 11 Feb 2013 17:13:24 +0000
- To: Robin Berjon <robin@w3.org>, "'public-html-testsuite@w3.org'" <public-html-testsuite@w3.org>
- CC: public-test-infra <public-test-infra@w3.org>
On 2/11/13 4:47 PM, "Robin Berjon" <robin@w3.org> wrote:

> Hi all,
>
> a couple of weeks ago we had a meeting about testing. One of the things
> that came out of it was that it would be helpful to get a feel for the
> coverage level that we have for specs, and for larger specs to have that
> coverage per section, along with other measures to contrast the number
> of tests with.
>
> I've now done this analysis for the HTML and Canvas specs (I would have
> done Microdata too, but it doesn't seem to have approved tests yet).
>
> You can see it here, but be warned that you might not understand it
> without reading the notes below:
>
>   http://w3c-test.org/html-testsuite/master/tools/coverage/
>
> I'm copying public-test-infra; in case anyone wants to do the same for
> other specs I'd be happy to collaborate. If people think it would be
> useful to provide such data on a regular basis, we can certainly
> automate it. Note that for this purpose having the data in one big repo
> would help.

Thanks for doing this. This is great. I absolutely think we should be
doing this for all specs.

With better visuals, finer tuning of the weight of each metric (maybe
even per-spec tuning?), and data on the actual number of tests written
for each section, this could give us a fantastic overview of testing
coverage at W3C, with the ability to dig into specifics when needed.

Care to share the script(s) and discuss how best to move this forward?

--tobie
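(For readers curious what "tuning the weight of each metric" might look like in practice, here is a minimal sketch. The metric names, weights, and `coverage_score` function are all invented for illustration; they are not taken from the actual coverage tool at w3c-test.org.)

```python
# Hypothetical sketch: combining several per-section measures into a
# single weighted coverage score. Per-spec tuning would just mean
# swapping in a different `weights` dict for each spec.

def coverage_score(section, weights):
    """Weighted average of a section's raw metric values.

    `section` maps metric names (e.g. prose size, number of normative
    assertions, number of approved tests) to raw values; `weights`
    gives each metric's relative importance.
    """
    total_weight = sum(weights.values())
    return sum(weights[m] * section.get(m, 0) for m in weights) / total_weight

# Example: a section with 1200 words, 8 normative assertions, 5 tests.
section = {"words": 1200, "assertions": 8, "tests": 5}
weights = {"words": 0.001, "assertions": 1.0, "tests": 2.0}
print(round(coverage_score(section, weights), 3))
```

Contrasting the score against the actual test count per section, as the message suggests, would then highlight sections whose weight of normative content outstrips their tests.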
Received on Monday, 11 February 2013 17:13:55 UTC