Implementation Reports for CSS2.1

I was looking through the links to the CSS 2.1 Spec as well as the Test Suite, and I realized that the current test suite results[1] may cause confusion in the web developer community. That particular result summary (a snapshot of the test suite from March 23, 2011) reflects what was presented to the Director in order to move the spec to Proposed Recommendation; it does not reflect accurate pass/fail rates of the different web browser engines against that test suite. I've posted Internet Explorer's Implementation Report[2] in order to try to make Interoperability more transparent.

The current results contain many inconsistencies that may prove confusing to web developers who are trying to figure out which features have been implemented by which vendors. The important distinction here is that the snapshot demonstrates that the spec is clear enough to be implemented: by looking at those test suite results, you can see that at least two vendors pass every test in the test suite, which means it is Implementable. However, those results are not meant to show Interoperability, which is what most people would expect to find there.
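
To make that distinction concrete, here is a small Python sketch (the test names, data shape, and values below are made up for illustration; they are not the actual harness data):

    # Hypothetical per-test results, keyed by test name and then by engine.
    results = {
        "margin-collapse-001": {"IE": "pass", "Firefox": "pass", "Chrome": "fail"},
        "float-replaced-002":  {"IE": "pass", "Firefox": "fail", "Chrome": "pass"},
    }

    def implementable(results):
        # Exit criterion: every test passes in at least two implementations.
        return all(sum(r == "pass" for r in per_engine.values()) >= 2
                   for per_engine in results.values())

    def interoperable(results):
        # What developers usually want to know: every test passes everywhere.
        return all(all(r == "pass" for r in per_engine.values())
                   for per_engine in results.values())

    print(implementable(results))  # True  - each test passes in two engines
    print(interoperable(results))  # False - neither test passes in every engine

The snapshot answers the first question; most readers will assume it answers the second.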

I am wondering how long it might be before we can publish the next snapshot, one that includes the currently approved tests along with accurate results showing how each implementation does against each test; doing so would make Interoperability easier for the developer community to understand. Additionally, web developers expect, and should be able to find, a chart showing how each browser actually does against the current test suite snapshot, rather than just a snapshot of results from the test harness that does not include official reports from the vendors.

The Test Suite summary page[3] has been modified to point to the test harness site[4] (which is definitely helpful), but the harness includes tests that have not yet been reviewed; until that stabilizes, developers will not know where to go to find out about interoperability at a high level (for example, sorting the result set to see which CSS constructs have _not_ been implemented by all major browsers, rather than which _have_ been implemented by two).

Web developers will look at the Implementation Report sent to the Director and think they are looking at a chart that is meant to describe Interoperability, but that is not the case. For example, IE actually passes 98.78% of the tests in this snapshot, but on the summary it looks as though IE only passes 89.9% of them (not due to failures, but because passing results were removed when tests changed between the previous implementation report and the March snapshot). In other cases there are "can't tell" results across all vendors, which could be resolved if vendors submitted their own official Implementation Reports to the W3C.

To that end, I've submitted the Implementation Report[2] for IE9 (version 9.0.8112.16421, RTM) against the 18762 tests in the March 23, 2011 version of the Test Suite (including all formats).

Pass: 18528 - 98.78%
Fail: 228 - 1.21%
Invalid: 2
N/A: 4
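
As a quick sanity check of those percentages (my reading, not an official breakdown: the Invalid and N/A tests appear to be excluded from the denominator):

    passed, failed, invalid, na = 18528, 228, 2, 4
    total = passed + failed + invalid + na      # 18762 tests in the snapshot
    applicable = total - invalid - na           # 18756 valid, applicable tests

    print(round(passed / applicable * 100, 2))  # 98.78
    print(round(failed / applicable * 100, 2))  # 1.22 (1.21 above if truncated rather than rounded)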

-John Jansen
Senior Test Lead
Microsoft

[1] http://www.w3.org/Style/CSS/Test/CSS2.1/20110323/reports/results.html
[2] http://lists.w3.org/Archives/Public/www-archive/2011Jun/0018.html 
[3] http://www.w3.org/Style/CSS/Test/ 
[4] http://test.csswg.org/harness/review/CSS21_DEV/ 

Received on Monday, 13 June 2011 02:57:33 UTC