Re: WebTV Help for Getting Engaged in W3C Test Effort

Hi Giuseppe,

On 23/04/2014 14:07 , Giuseppe Pascale wrote:
> Not sure why, maybe I wasn't clear. All I was asking is a piece of info
> saying "when I developed this test, I was looking at version X of the
> spec", or, when someone checks it later on, "last time I checked, this
> test was valid for version Y of the spec". It wouldn't be too much work
> IMO.

Experience shows that this is unrealistic. Over the past few years I 
have more than once found myself on the receiving end of such metadata 
(at least back when most test suites tried to capture some), writing 
tools to process it. And it is almost systematically wrong.

The reason for that is simple, and is a well-known problem with 
non-vernacular metadata in any system: any information contained in the 
document that does not affect the way the document operates at runtime 
will drift to become wrong over time.

It's a very simple process. When you first create a test, you *might* 
get the metadata right. (Even then it's a big, big "might" because most 
people will copy from an existing file, and through that get wrong 
metadata.) But when it's updated what's your incentive to update the 
metadata? What points you to remember to update it? Pretty much nothing. 
If it's wrong, what will cause you to notice? Absolutely nothing since 
it has no effect on the test.

So far, in the pool of existing contributors and reviewers, we have 
people who benefit greatly from a working test suite, but to my 
knowledge no one who would benefit from up-to-date metadata. Without 
that, I see no reason to expect it to happen.

This can of course change. If there are people who would benefit from 
metadata I would strongly encourage them to contribute. IMHO the best 
way to do that would be to have an external service that would pull in 
the list of files (from the published manifest) and allow people 
interested in metadata to maintain it there, through a nice and simple 
Web interface. That system could easily poll for updates and queue up 
required verification by the community in charge of metadata. That would 
avoid interfering directly with version control (making changes that 
impact only metadata adds noise) and the review queue (where most of the 
existing reviewers would not be interested in validating metadata changes).
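To make the poll-and-queue idea concrete, here is a minimal sketch of the 
core logic such a service might use. Everything in it is an assumption for 
illustration: the manifest is modeled as a simple mapping of test path to 
content hash, which is not necessarily the real published format.

```python
# Hypothetical sketch: diff two snapshots of a published test manifest
# (assumed here to map test path -> content hash) and queue changed
# tests for metadata re-verification. The manifest format is invented
# for illustration, not the actual one.

def diff_manifests(old, new):
    """Return the added, changed, and removed test paths between snapshots."""
    added = sorted(set(new) - set(old))
    removed = sorted(set(old) - set(new))
    changed = sorted(p for p in set(old) & set(new) if old[p] != new[p])
    return added, changed, removed

def build_review_queue(old, new):
    """New or changed tests need their metadata re-checked by the
    community in charge of metadata; removed tests are retired."""
    added, changed, removed = diff_manifests(old, new)
    return {"verify": sorted(added + changed), "retire": removed}

# Example: one test changed, one added since the last poll.
old = {"dom/ranges/Range-attributes.html": "a1", "html/sem/forms.html": "b2"}
new = {"dom/ranges/Range-attributes.html": "a1", "html/sem/forms.html": "b3",
       "webvtt/parsing.html": "c4"}
queue = build_review_queue(old, new)
# queue["verify"] == ["html/sem/forms.html", "webvtt/parsing.html"]
```

The point of keeping this outside version control is exactly what the 
diff shows: only the metadata service needs to care about these changes, 
so nothing lands in the test repo or its review queue.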

I believe everything is in place for the system described above to be 
implemented relatively easily. I am fully confident that if there is a 
community that genuinely requires testing metadata they could bash 
together such a tool in under a month. And we're happy to help answer 
questions and provide hooks (e.g. GitHub update hooks) where needed.

This is a volunteer and so far largely unfunded project. It is also by a 
wide margin the best thing available for Web testing today. Its shape 
and functionality match what current contributors are interested in; 
if there are new interests not so far catered to, the solution is 
simple: just bring in new contributors interested in this!

-- 
Robin Berjon - http://berjon.com/ - @robinberjon

Received on Monday, 28 April 2014 12:59:04 UTC