MWI Test Harness for CSS

One way to get more people involved in CSS conformance testing
and to encourage implementors to send in implementation reports
would be to create a system that makes it easy for anyone to
submit pass/fail data. Lucky for us, the Mobile Web Initiative
has already created such a system. :)

There are some improvements I'd like to see before we start
using it to generate real implementation reports, however. The
first major one is to get the pass/fail buttons out of the
test file so they don't interfere with the test.

HP has volunteered to make improvements to the test harness,
and they asked me for a wishlist. Here's what I sent:

--------------------------------------------------------------

I promised you a test suite harness wishlist you could discuss.

But first, here are the links I sent over the telecon:

   CSS Test Suite wiki:
     http://csswg.inkedblade.net/test/css2.1
   Microsoft's tests:
     http://samples.msdn.microsoft.com/csstestpages/
   Mobile Web Initiative harness:
     http://www.w3.org/2007/03/mth/harness
   Mobile Web Initiative harness source code:
     http://dev.w3.org/cvsweb/2007/mobile-test-harness/

One of the very neat things about the MWI test harness is that it associates
the results with a user agent (UA) string. This means there's no need for users to
log in or to select their UA. They just load a test and click Pass/Fail/Can't
Tell.
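
To make that concrete, here's a rough sketch of a CGI endpoint that
reads the User-Agent header straight out of the request and stores it
alongside the verdict. This is just an illustration (in Python), not
the actual harness code: the results.csv file and the test/result
parameter names are invented.

#!/usr/bin/env python3
# Illustrative only, not the real MWI harness: record one pass/fail
# submission keyed on the submitting browser's User-Agent string, so
# testers never have to log in or pick their UA from a list.
import csv, os
from datetime import datetime, timezone
from urllib.parse import parse_qs

def record_result(storage_path="results.csv"):
    query = parse_qs(os.environ.get("QUERY_STRING", ""))
    test_id = query.get("test", [""])[0]     # e.g. "margin-collapse-001"
    result = query.get("result", [""])[0]    # "pass", "fail", "cannot-tell"
    ua = os.environ.get("HTTP_USER_AGENT", "unknown")
    with open(storage_path, "a", newline="") as out:
        csv.writer(out).writerow(
            [datetime.now(timezone.utc).isoformat(), ua, test_id, result])
    print("Content-Type: text/plain")
    print()
    print("Recorded %s for %s" % (result, test_id))

if __name__ == "__main__":
    record_result()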

Very briefly, the improvements I'd like to see are (numbered approximately
by priority):

New Harnesses:

   1. Harness using an <iframe> to contain the test, with the pass/fail
      buttons on the containing page rather than inside the test file.
      (This format is good for desktop browsers; there's a rough sketch
      of such a wrapper at the end of this section.)

   2. Harness using links targeted at a new window to open the test, with
      the pass/fail buttons on the page containing the link rather than
      inside the test file. (This format is necessary for print.)

   In both cases the test itself should be referenced as a link, not fed through
   the CGI script. (This avoids tampering with the HTTP headers that normally
   get served up with the tests.)

   8. It would be nice if these harnesses could include some meta information
      about the test in addition to the buttons. E.g. the test ID (filename
      before extension), test title, any requirements documented in the test
      ("Warning: Must install Ahem font." etc.). (I can extract this information
      for you into a flat-file database during the test suite build process
      so the harness doesn't have to do any analysis of the tests themselves.)
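
Here's the sort of wrapper page I have in mind for items 1 and 8.
Again this is just an illustrative Python sketch, not anything from
the real harness: the test is referenced by a plain URL as the
<iframe> src (so the HTTP headers it's served with are left alone),
the buttons and a link that opens the test in a new window live in
the containing page, and the title and requirement flags come from a
flat file. The testinfo.data format, the example.org base URL and the
"submit" action are all made up.

import csv

def load_metadata(flatfile="testinfo.data"):
    """Read an invented tab-separated flat file: test-id, title, flags."""
    info = {}
    with open(flatfile, newline="") as f:
        for test_id, title, flags in csv.reader(f, delimiter="\t"):
            info[test_id] = (title, flags)
    return info

def wrapper_page(test_id, test_base="http://example.org/css2.1/html4/"):
    # Build the containing page: metadata + buttons outside the test,
    # the test itself loaded by plain URL in an <iframe>.
    title, flags = load_metadata().get(test_id, (test_id, ""))
    test_url = f"{test_base}{test_id}.htm"
    warning = ("<p>Warning: Must install Ahem font.</p>"
               if "ahem" in flags else "")
    return f"""<!DOCTYPE html>
<html><head><title>{test_id}: {title}</title></head>
<body>
<h1>{test_id}: {title}</h1>
{warning}
<form action="submit" method="get">
  <input type="hidden" name="test" value="{test_id}">
  <button name="result" value="pass">Pass</button>
  <button name="result" value="fail">Fail</button>
  <button name="result" value="cannot-tell">Can't Tell</button>
</form>
<p><a href="{test_url}" target="_blank">Open the test in a new window</a>
(for the print-oriented variant in item 2).</p>
<iframe src="{test_url}" width="100%" height="500"></iframe>
</body></html>"""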

Better Reporting:

   3. Ability to consolidate results for various user agent strings under
      one category name. E.g. consolidate results for all UA strings that
      represent Opera 9.25 Beta 1 regardless of OS and localization. (See
      the sketch after this list for one way this could work.)

   4. Pass/fail scores for the whole test suite

   5. Ability to report consolidated pass/fail scores for named groups of
      tests. (To create e.g. a summary of what features are supported and to
      what level.)

   6. Interface for generating reports based on various parameters. URLs to
      these reports should be short and clean so they can be passed around
      in blogs/IM/email, etc.

   9. Prettier reports. :)
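
For item 3, the consolidation could be as simple as a hand-maintained
mapping from UA-string patterns to product/version labels, with scores
then summed per label. The Python below is purely illustrative; the
patterns and the (ua, test, result) record layout are my assumptions,
not how the harness actually stores its data.

import re
from collections import defaultdict

# Hypothetical patterns; a real table would be curated by hand.
UA_CATEGORIES = [
    (re.compile(r"Opera/(\d+\.\d+)"), "Opera {0}"),
    (re.compile(r"Firefox/(\d+\.\d+)"), "Firefox {0}"),
    (re.compile(r"MSIE (\d+\.\d+)"), "Internet Explorer {0}"),
    (re.compile(r"Version/(\d+\.\d+).*Safari/"), "Safari {0}"),
]

def categorize(ua_string):
    for pattern, label in UA_CATEGORIES:
        match = pattern.search(ua_string)
        if match:
            return label.format(match.group(1))
    return "Other"

def consolidate(rows):
    """rows: iterable of (ua_string, test_id, result) tuples."""
    scores = defaultdict(lambda: defaultdict(lambda: {"pass": 0, "fail": 0}))
    for ua, test_id, result in rows:
        if result in ("pass", "fail"):
            scores[categorize(ua)][test_id][result] += 1
    return scores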

Data cleanup:

   7. Ability to delete all data for a given test (so that when a test is
      changed we can invalidate the results for that test). This process
      could be triggered manually on the command line rather than via CGI,
      which avoids the need for a login system.
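
A sketch of what that command-line cleanup could look like, using the
same invented results.csv layout as in the earlier sketch (again, this
is illustrative, not real harness code):

#!/usr/bin/env python3
# Illustrative only: drop every stored result for one test, e.g. after
# the test file has been corrected. Run by hand on the server, never
# exposed through CGI, so no login system is needed.
# Usage: invalidate-test.py margin-collapse-001
import csv, sys

def invalidate(test_id, storage_path="results.csv"):
    with open(storage_path, newline="") as f:
        # Records are [timestamp, ua, test-id, result]; keep the rest.
        rows = [row for row in csv.reader(f) if row and row[2] != test_id]
    with open(storage_path, "w", newline="") as f:
        csv.writer(f).writerows(rows)

if __name__ == "__main__":
    invalidate(sys.argv[1])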

~fantasai

Received on Monday, 21 April 2008 18:07:11 UTC