Background
- Modularity of specifications is a Good Thing
- Specs can be re-used (eg, XSLT and XForms use XPath as their expression
language)
- There are issues (for spec authors)
- Component specs may advance at different rates
- Need to be explicit about versions of referenced specs
- Clearly state what is required and what is optional
- Excessive optionality can reduce interoperability
- Can we also create modular, reusable test suites?
The Complexity Continuum
- Simple: one spec in a single document (no external references)
- Develop tests for what's specified in the document
- More complex: one spec in multiple documents (no external
references)
- Develop tests for what's specified in the documents
- Beware...
- Advance the docs together where inter-dependencies exist
- Don't simultaneously support/promote different versions of one
doc
- Create an umbrella spec that explicitly states the versions of the
  component specs
- Complex: your spec references an "external" spec
(created by another group)
- You still need an umbrella spec
- Can you assume that the included functionality is tested (or out of
scope), or must you test it?
- If you must test it, should you test By Reference, or test
  By Value?
- "Pass my tests and pass their tests too"
- Assumes there is an explicit, versioned test suite you can point
to
- Or (more likely) incorporate external tests into your test suite
- Hopefully the tests were developed with re-use in mind
Developing tests for re-use
- Create a test suite and not simply a random collection of tests
- Test, package, document, version, deal with errata
- State explicitly what versions of what specs the test suite addresses
- Indicate what portion of the spec each test addresses
- Do these tests apply to me? (I may not have implemented optional features)
- Provide data indicating how thoroughly the tests cover the spec
- How much confidence should implementors have if they "pass"?
- Do I need to augment these tests with others?
- Explain how to execute the tests and how to create a test harness
- Provide appropriate metadata
- Define expected results
- Report results in a consistent and useful format
- How should I interpret the results? Did I pass?
- If I didn't, what went wrong?
- Explain how to challenge the validity of a test or submit a bug report
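The metadata points above (which spec and version a test addresses, whether it applies to an implementation that skipped optional features, what result is expected) can be sketched as data. This is a minimal, hypothetical shape, not any actual harness format; the `TestCase` record and `applicable` filter are illustrative names only.

```java
import java.util.List;

public class TestMetadata {
    // Hypothetical per-test metadata: the spec and version the test
    // addresses, the section it covers, whether the feature under test
    // is optional, and the expected result.
    record TestCase(String spec, String specVersion, String section,
                    boolean optionalFeature, String expected) {}

    // "Do these tests apply to me?" -- keep only the tests relevant to an
    // implementation, dropping optional-feature tests it chose not to support.
    static List<TestCase> applicable(List<TestCase> suite,
                                     boolean implementsOptional) {
        return suite.stream()
                .filter(t -> implementsOptional || !t.optionalFeature())
                .toList();
    }

    public static void main(String[] args) {
        List<TestCase> suite = List.of(
            new TestCase("XPath", "1.0", "3.4", false, "true"),
            new TestCase("XPath", "1.0", "B.1", true, "42"));
        // An implementation without the optional features runs 1 of 2 tests.
        System.out.println(applicable(suite, false).size());
    }
}
```

Carrying this kind of record with every test is what makes a suite reusable: a downstream group can filter, repackage, and report coverage against specific spec sections instead of treating the suite as an opaque pass/fail blob.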
Multiple specifications and the Java Platform
- J2SE Platform (umbrella) Spec references multiple standalone specs
- J2SE 6.0 spec has just been published at http://jcp.org/en/jsr/detail?id=270
- Some of these (not all) have an independent existence - and their own test
suites
- XML Digital Signature
- JDBC (database connectivity)
- JAXB (XML binding)
- JAX-RPC (SOAP)
- JAXP (XML parsing)
- Several of these in turn reference other specs (often from W3C)
- Subsetting and supersetting are strongly discouraged to preserve interoperability
- Must (usually) implement all component specs
- Must (usually) implement the same version of all component specs
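The no-subsetting, no-version-skew rule above amounts to a simple conformance check against the umbrella spec's manifest. A minimal sketch, assuming a hypothetical manifest mapping each component spec to its pinned version (the names and versions below are illustrative):

```java
import java.util.Map;

public class UmbrellaCheck {
    // An umbrella spec pins component specs to exact versions. An
    // implementation conforms only if every required component is present
    // (no subsetting) at exactly the pinned version (no skew).
    static boolean conforms(Map<String, String> required,
                            Map<String, String> provided) {
        return required.entrySet().stream()
                .allMatch(e -> e.getValue().equals(provided.get(e.getKey())));
    }

    public static void main(String[] args) {
        Map<String, String> umbrella = Map.of("JAXP", "1.4", "JDBC", "4.0");
        System.out.println(
            conforms(umbrella, Map.of("JAXP", "1.4", "JDBC", "4.0"))); // true
        System.out.println(
            conforms(umbrella, Map.of("JAXP", "1.3", "JDBC", "4.0"))); // false
    }
}
```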
Multiple test suites and the Java Platform
- All Sun-developed test suites use the same test harness
- Some differences nevertheless exist
- Different build mechanisms, test metadata formats
- Different execution environments (J2EE, J2SE, J2ME)
- Currently we test By Value
- Incorporating tests is (relatively) simple, since we develop to (relatively)
common standards
- Considering testing By Reference due to rapidly increasing size
of the platform test suites
- Need to improve the user experience when multiple test suites are to
  be executed
- We also incorporate W3C tests (eg, for XML Core, Schema)
- We'd really like to use By Reference for W3C specs
- Plug in any parser you want, so long as it passes version x.y of
the W3C parser test suite, located <here>
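The "plug in any parser" idea is already how JAXP works: the factory resolves the parser implementation at run time (via system property, service-provider file, or the platform default), so callers never name a parser class. A By Reference policy would add only the requirement that whatever gets plugged in has passed the referenced W3C suite. A minimal sketch using the standard JAXP API:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class PluggableParser {
    // Parse a document with whatever parser JAXP resolves at run time
    // and return the name of its root element.
    static String rootOf(String xml) throws Exception {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        factory.setNamespaceAware(true);
        Document doc = factory.newDocumentBuilder().parse(
                new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        return doc.getDocumentElement().getTagName();
    }

    public static void main(String[] args) throws Exception {
        // The implementation class is chosen by the JAXP lookup procedure,
        // not by this code -- any conforming parser can be substituted
        // without recompiling callers.
        System.out.println(rootOf("<spec version='1.0'/>"));
    }
}
```

Because the substitution point is already in the API, testing By Reference needs no extra plumbing; it is purely a statement about which test suite the substituted parser must have passed.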