- From: Robin Berjon <robin@w3.org>
- Date: Mon, 28 Apr 2014 15:12:47 +0200
- To: James Graham <james@hoppipolla.co.uk>, public-test-infra@w3.org, "public-web-and-tv@w3.org" <public-web-and-tv@w3.org>
On 22/04/2014 18:56, James Graham wrote:
> On 22/04/14 17:22, Giuseppe Pascale wrote:
>> 2. TTWF website points to this post
>> (http://testthewebforward.org/blog/2013/02/20/testing-the-open-web-platform.html)
>> under "Want to learn more about our plans?". Is that post still 100%
>> accurate? If not, it would be good to get an updated post/page about
>> what TTWF is and what the plans around testing are.
>
> I agree an update would be useful. We have come a long way in the last
> year, and achieved many of the goals that Tobie set out.

Does someone care to jump on this one? I would, but it would have to wait at least a week.

> For companies looking to contribute to web-platform-tests, putting
> effort into review is just as valuable as putting effort into writing
> tests.

I'm quoting the above paragraph in the hope that people read it again :)

> If people are going to submit hundreds of tests, it is worth
> knowing that there's a rule that internal review in a public location
> may be carried forward to the test repository.

Not only that, but the only person who is not allowed to review a given test is its author. Absolutely anyone else, including someone from the same company, can review.

> It's very hard to automatically invalidate tests when the spec changes.

The only way to do that would be to integrate the tests more closely with the spec itself. That would certainly be an interesting idea, and it's something I'd like to explore at some point, but it's not something we're anywhere near.

> In practice the way we expect to deal with these things is to have
> implementations actually run the tests and see what breaks when they are
> updated to match the new spec.

I would like to strengthen this point because it is, IMHO, not just a very good strategy but actually the only fully viable one (at least without major expense) for a test suite of the size and scope of what we have (let alone what we'd like to have).
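The "run the tests and see what breaks" loop can be sketched as below. This is a hedged illustration, not real WPT tooling: `run_suite` is a stand-in for a vendor's own harness, and the "overnight change" is simulated by redefining it. The point is only the mechanics of diffing today's results against yesterday's to surface new failures for inspection.

```shell
# Minimal sketch of the regression-detection loop: run the suite, compare
# against the previous run, and flag lines that newly turned into failures.
# run_suite is a hypothetical stand-in for a real browser test harness.
set -eu
results_dir=$(mktemp -d)

run_suite() {   # stand-in: yesterday, everything passed
  printf 'test-a PASS\ntest-b PASS\n'
}
run_suite > "$results_dir/yesterday.txt"

# Simulate a spec/test update breaking test-b overnight:
run_suite() { printf 'test-a PASS\ntest-b FAIL\n'; }
run_suite > "$results_dir/today.txt"

# comm -13 keeps lines unique to today's (sorted) results; grep keeps failures.
new_failures=$(comm -13 "$results_dir/yesterday.txt" "$results_dir/today.txt" | grep FAIL || true)
echo "New failures: $new_failures"
```

A new failure here is ambiguous by design: it may be an implementation regression or a test invalidated by a spec change, and a human triages which. That ambiguity is exactly why regular runs with an incentive to inspect failures work as a quality loop for both sides.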
We are working to get vendors to run the test suite on as regular a basis as possible (daily, in CI, etc.). This is a great way of detecting broken tests, since there is a clear incentive to inspect failures. It effectively creates a virtuous feedback loop that improves the quality of both the tests and the implementations over time.

>> 5. IIRC not all WGs are using the process/tools from TTWF. Is this
>> documented somewhere? Will these other groups continue with their tools
>> for the time being, or is there any plan to merge the various efforts
>> at some point?
>
> CSS, at least, currently uses different repositories. I think there is a
> considerable advantage to everyone sharing the same infrastructure, but
> at the moment there are no concrete plans to merge the repositories.

That's not actually true. There is a strong incentive for other groups that are producing specifications for the Web platform to merge with WPT, and some of us regularly harass the laggards about this. The plan is to have everyone there.

At this point the majority of Web platform test suites are in fact in WPT. The biggest glaring omission (and it is a big one) is CSS. During the Extensible Web Summit a couple of weeks back there were discussions around this, and the current plan is that a few changes will be made to CSS's repo to reshape the directory layout (I don't know the details); after that is done, the CSS tests will be incorporated as a "css" submodule in WPT. That won't reuse the submission/review infrastructure, but it will reuse everything to do with running tests and reporting results, and will present a unified view of testing. I don't know the status (Rebecca or Peter, maybe?), but once that's done we really won't be missing much.

--
Robin Berjon - http://berjon.com/ - @robinberjon
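As a postscript on the submodule plan mentioned above, the wiring is ordinary `git submodule` mechanics. The sketch below builds two throwaway local repositories as stand-ins (the real CSS and WPT repositories are not these paths), then mounts one inside the other as a `css` submodule, which is the shape Robin describes.

```shell
# Illustrative only: local stand-in repos, not the real csswg-test / WPT.
set -eu
work=$(mktemp -d)
# Wrapper so the demo has a commit identity and permits local-path submodules
# (newer git blocks file-protocol submodules without protocol.file.allow).
git() {
  command git -c user.name=demo -c user.email=demo@example.com \
              -c protocol.file.allow=always "$@"
}

# Stand-in for the reshaped CSS test repository:
git init -q "$work/csswg-test"
git -C "$work/csswg-test" commit -q --allow-empty -m "css tests"

# Stand-in for web-platform-tests, gaining a css/ submodule:
git init -q "$work/web-platform-tests"
git -C "$work/web-platform-tests" commit -q --allow-empty -m "wpt"
( cd "$work/web-platform-tests" && git submodule --quiet add "$work/csswg-test" css )

cat "$work/web-platform-tests/.gitmodules"
```

With this layout the superproject pins a specific revision of the CSS tests, so the unified runner sees one tree while the CSS tests keep their own history and review flow, which matches the "shared running/reporting, separate submission/review" split described above.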
Received on Monday, 28 April 2014 13:12:58 UTC