
Testing getUserMedia

From: Dominique Hazael-Massieux <dom@w3.org>
Date: Thu, 31 May 2012 12:35:22 +0200
Message-ID: <1338460522.11737.33.camel@cumulustier>
To: public-media-capture@w3.org

As getUserMedia matures and gets deployed, the need to ensure
interoperability across implementations increases; the only reliable way
to ensure that interoperability is to have tests.

While formally we only need to demonstrate interoperability during
Candidate Recommendation, I think it's worthwhile to start creating
tests now, even if that means that some of these tests will have to be
modified to keep up with changes in the spec.

This message tries to serve as a general intro to how we do testing at
W3C.

At a high level, a test case for a JavaScript API is an HTML file that
exercises a specific aspect of the API and determines whether the API
behaves as specified when run in the browser under test.

W3C groups working in this space use a common framework to develop test
cases that facilitate automating the run of these test cases, as well as
the collection of results from browsers running them. That test harness
is described at:

Unless there is a strong reason not to, I think we too should adopt that
harness for the development of our test cases.
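To make this more concrete, here is a sketch of what a getUserMedia test
case built on that harness could look like. This is an illustration, not
a submitted test: the script paths assume the harness is served under
/resources/ as on the common W3C test server, and the exact shape of the
success callback's argument depends on the current spec draft.

```
<!DOCTYPE html>
<html>
<head>
<title>getUserMedia: video-only request invokes a callback</title>
<!-- Paths are an assumption based on the common W3C test server layout;
     adjust them for a local checkout. -->
<script src="/resources/testharness.js"></script>
<script src="/resources/testharnessreport.js"></script>
</head>
<body>
<script>
var t = async_test(
  "navigator.getUserMedia({video: true}) invokes one of its callbacks");
t.step(function () {
  assert_true("getUserMedia" in navigator,
              "navigator.getUserMedia should be present");
  navigator.getUserMedia({video: true},
    t.step_func(function (stream) {
      // Success path: the success callback should receive a stream.
      assert_true(stream !== null,
                  "success callback should receive a stream");
      t.done();
    }),
    t.step_func(function (error) {
      // The user (or test runner) may deny permission; that is still a
      // spec-conformant code path, so this sketch only checks the shape.
      assert_true(error !== null,
                  "error callback should receive an error");
      t.done();
    }));
});
</script>
</body>
</html>
```

One caveat worth keeping in mind: getUserMedia tests are hard to fully
automate since a conforming browser will show a permission prompt, which
is why the error path above is treated as a passing outcome.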

Process-wise, I think we should also follow the way of other groups:
* have someone in the group designated as the test facilitator, who
ensures that test cases get submitted, reviewed, and approved

* test cases should be submitted either by email or, better, by
uploading them to a dedicated mercurial repository; I've created
https://dvcs.w3.org/hg/media-capture/file/tip to that end, to which
anyone in DAP and WebRTC should have read-write access

* test cases are first put into the "submitted" directory; they'll get
moved to "approved" once the group has had a chance to review and
approve them

* we can also accept contributions of test cases from non-group
participants; I can explain more about the logistics of this when needed

We probably need to define how we want to review and approve test cases;
different groups have had different approaches. But that's probably
easier done once we have found a test facilitator for the spec :)

I've started creating test cases which I hope can also serve as useful
starting points for other contributors; I'll give more details about
this in a separate mail.

Received on Thursday, 31 May 2012 10:36:13 UTC
