
Re: Defining "public interfaces" in specifications

From: Alex Rousskov <rousskov@measurement-factory.com>
Date: Sun, 27 Oct 2002 10:36:20 -0700 (MST)
To: Eric van der Vlist <vdv@dyomedea.com>
cc: www-qa@w3.org
Message-ID: <Pine.BSF.4.44.0210271017260.31166-100000@measurement-factory.com>

On 27 Oct 2002, Eric van der Vlist wrote:

> That's because the specs do not provide "public interfaces" which
> they are committed to maintain over versions or explicitly
> deprecate. If each specification had to supply a list of definitions
> which will be maintained over versions or slowly deprecated, other
> specs would know what they can safely refer to and could afford to
> rely on the latest release.

The spec itself is such a public interface. It is an implied
assumption behind virtually every spec that future versions of the
spec will be "backward compatible" if it makes sense to support such
compatibility, or "explicitly deprecated" if it does not.

> That was the situation with libraries before we started defining
> public interfaces. The current situation isn't perfect, moving
> between different versions of libraries isn't totally seamless, but
> IMO that's much better and a step forward.

The current situation with libraries is pretty much equivalent to the
current situation with specs. A public interface of a library is
simply an implementation of a [usually public] spec.
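As a hypothetical sketch of that analogy (the function names are mine, not from the discussion), a library's public interface maintained across versions looks much like a spec: stable entry points are kept backward compatible, and removed ones are explicitly deprecated first:

```python
import warnings

__version__ = "2.0.0"

def parse(text):
    """Stable public API: maintained across versions."""
    return text.split()

def old_parse(text):
    """Kept for backward compatibility; explicitly deprecated in 2.0."""
    warnings.warn("old_parse() is deprecated; use parse() instead",
                  DeprecationWarning, stacklevel=2)
    return parse(text)
```

Callers relying on `old_parse()` keep working for now but are told, in a machine-visible way, that the interface is going away, which is exactly the "maintain or explicitly deprecate" contract being discussed.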

> >
> > In practically all US government specs, you will see other documents
> > included by reference, by always qualified by the phrase "of the exact issue
> > specified".  That is how it should be.
> But this is creating specs which are modular only by name and
> cascading updates such as we have now with XML 1.1 which requires
> updating 80% of the W3C X* specs to become effective.

I probably do not understand your notion of "modularity". What you
propose does not improve modularity (as I understand that term)
compared to the current best practice. It just makes it
administratively more difficult to change certain specs, that's all.

IMO, it should be up to spec authors whether to retain backward
compatibility or not. If another spec is cited, it should be named
exactly, including its version number if applicable (which, BTW, is
still not 100% precise because of the errata issues).
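A minimal sketch of what "naming the cited spec exactly" means in practice (the names and the `REQUIRED_SPEC` constant are illustrative, not from the message): a consumer pins the exact issue it was written against, rather than accepting whatever the latest version happens to be:

```python
# Pin the normative reference to an exact issue, the way a spec would
# cite "XML 1.0" rather than "the latest XML".
REQUIRED_SPEC = ("XML", "1.0")  # exact name and version

def check_reference(name, version):
    """Accept only the exact issue this document was written against."""
    if (name, version) != REQUIRED_SPEC:
        raise ValueError(
            f"this spec cites {REQUIRED_SPEC!r}, got {(name, version)!r}")
    return True
```

Under this policy, an incompatible successor such as ("XML", "1.1") is rejected rather than silently substituted, so any upgrade is a deliberate editorial decision.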

If a spec introduces a concept that does not really belong to that
spec because it has a much broader scope, the authors should consider
extracting that concept into its own [smaller] spec. Similarly, if a
needed concept is found in another spec, it can be put into its own
spec if desired. I think such extraction and factorization events have
happened several times with URLs and other basic Web objects.

> as we have now with XML 1.1 which requires updating 80% of the W3C
> X* specs to become effective.

If XML 1.1 is backward compatible with XML 1.0, there should be no
reason to update 80% of X* specs. If it is not backward compatible, a
careful rewrite would be required anyway. Note that "backward
compatible" status may depend on a particular context/way in which an
X* spec is using XML 1.0.


                            | HTTP performance - Web Polygraph benchmark
www.measurement-factory.com | HTTP compliance+ - Co-Advisor test suite
                            | all of the above - PolyBox appliance
Received on Sunday, 27 October 2002 12:36:41 UTC
