- From: Ian Hickson <ian@hixie.ch>
- Date: Wed, 19 Jan 2005 14:53:33 +0000 (UTC)
- To: www-qa@w3.org
"1.2 Good Practice B" suggests that an ICS form be provided with yes/no questions: "1. Create a list, table or form listing all features (capabilities) and indicating if it is mandatory or not. 2. Provide space for the implementer to check: Yes, No, Not Applicable". However, this is unrealistic. For example, take CSS user agents. How is an implementor to determine if he has implemented margin collapsing correctly? All that can really be said is that the user agent passes a certain set of tests. For any even mildly complicated specification it will always be possible to show that a user agent is in some way non-compliant, it's just a matter of finding a suitable test. Therefore I would suggest changing this section to instead suggest leaving space for the implementor to list the URIs to (publically available) tests that the implementor has used to verify interoperability and compliance, listing which tests the implementor determined passed in the user agent and which failed (if any). Specification authors may wish to provide a list of URIs to the tests that form part of the specification's formal test suite (as used to check for interoperability as per the CR exit criteria), although naturally such a test suite can never be complete enough to really be used to claim conformance so implementors would be expected to also provide links to other tests that they used. (The existing suggestions could be kept for the rare specs in which a test suite is inappropriate, such as the two examples the spec currently gives: the QA spec guidelines and the WCAG. However, this applies to very few specifications and so should not IMHO be the primary suggestion in the document.) -- Ian Hickson U+1047E )\._.,--....,'``. fL http://ln.hixie.ch/ U+263A /, _.. \ _\ ;`._ ,. Things that are impossible just take longer. `._.-(,_..'--(,_..'`-.;.'
Received on Wednesday, 19 January 2005 14:53:34 UTC