Re: DeltaV Draft

 > From: "Preston L. Bannister" <preston@home.com>

 > > We've already got a complex spec, so I try to leave out anything
 > > that doesn't directly contribute to interoperability.

 > The part about a complex spec is beginning to bother me :).

That is why the protocol identifies a simple "core versioning" 
subset.  If you are not a versioning expert, you should focus on
the core.  
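
Concretely, the core boils down to a short request cycle: put a
resource under version control, check it out, modify it, and check
it back in.  Here is a rough sketch of that cycle as a client might
drive it (the host, path, and Python harness are just illustrative
assumptions, not anything normative from the draft):

  # Rough sketch only: the host, path, and use of Python's http.client
  # are assumptions for illustration, not taken from the DeltaV draft.
  import http.client

  def send(conn, method, path, body=None):
      # Issue one request and drain the response so the connection
      # can be reused for the next request.
      conn.request(method, path, body=body)
      resp = conn.getresponse()
      resp.read()
      return resp.status

  conn = http.client.HTTPConnection("example.com")

  send(conn, "VERSION-CONTROL", "/docs/spec.txt")   # place under version control
  send(conn, "CHECKOUT", "/docs/spec.txt")          # make the resource writable
  send(conn, "PUT", "/docs/spec.txt", b"new text")  # modify the checked-out resource
  send(conn, "CHECKIN", "/docs/spec.txt")           # record the change as a new version

The additional features are layered on top of that cycle, so a
simple client can ignore them.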

 > Is there anything like a reference implementation?

The Versioning Protocol is intended to enter the standards track
at "Proposed Standard" level.  The point of a draft standard is
not to imply a particular implementation (reference or otherwise),
but rather to allow the interoperation of very
different implementations.  In some ways, a "reference implementation"
is to be avoided, since it encourages clients to count on 
functionality provided by that implementation, rather than
restricting themselves to the functionality required by the
protocol specification.

In particular, here is the relevant section of RFC 2026:

 4.1.1  Proposed Standard

 A Proposed Standard specification is generally stable, has resolved
 known design choices, is believed to be well-understood, has received
 significant community review, and appears to enjoy enough community
 interest to be considered valuable.  However, further experience
 might result in a change or even retraction of the specification
 before it advances.

I believe this is an accurate characterization of the current state
of the versioning protocol.

 Usually, neither implementation nor operational experience is
 required for the designation of a specification as a Proposed
 Standard.  However, such experience is highly desirable, and will
 usually represent a strong argument in favor of a Proposed Standard
 designation.

My personal experience is that companies are reluctant to commit
resources to a versioning protocol until it has reached Proposed
Standard level.  A notable exception is the open source Subversion
effort, with Greg Stein's contribution to the DeltaV design (thanks,
Greg!).

 The IESG may require implementation and/or operational experience
 prior to granting Proposed Standard status to a specification that
 materially affects the core Internet protocols or that specifies
 behavior that may have significant operational impact on the
 Internet.

Since the versioning protocol operates at a level well above the core
Internet protocols, this section should not apply.

 A Proposed Standard should have no known technical omissions with
 respect to the requirements placed upon it.  However, the IESG may
 waive this requirement in order to allow a specification to advance
 to the Proposed Standard state when it is considered to be useful and
 necessary (and timely) even with known technical omissions.

There are no known technical omissions in the versioning protocol,
so this section should be satisfied.

 Implementors should treat Proposed Standards as immature
 specifications.  It is desirable to implement them in order to gain
 experience and to validate, test, and clarify the specification.
 However, since the content of Proposed Standards may be changed if
 problems are found or better solutions are identified, deploying
 implementations of such standards into a disruption-sensitive
 environment is not recommended.

I believe this is also an accurate characterization of the
versioning protocol at its current level of maturity.

 > The traffic in this list reminds me of any number of design
 > discussions in which I've been in the past.  Each participant
 > has an internal model of how the product (once built) may
 > function, and everyone's internal model is different.

That is intentional.  There are a variety of versioning models
out there (with implementations).  We are designing a protocol
that will allow a client to interoperate with any of
these models.

 > Once implementations exist you discover where your internal
 > models differ.  You also discover which bits are important, 
 > omitted, or downright boneheaded.

All of these models are implemented (many in large commercial
versioning repositories, some as open source).  The
members of the design team are largely implementors of these
versioning systems.  The point of the protocol is to provide
a way for a client to interoperate with a variety of these
versioning systems, not to make up a new versioning model.
This avoids the downright boneheaded problem (those folks
either went out of business or their systems fell out of use).
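
To make the interoperation point concrete: the intent is that a
client first asks a server which versioning features it supports,
and then adapts, rather than assuming any one vendor's model.  A
rough sketch of that discovery step (the host and the exact feature
token are illustrative assumptions about how a server might
advertise support):

  import http.client

  conn = http.client.HTTPConnection("example.com")
  conn.request("OPTIONS", "/docs/")
  resp = conn.getresponse()
  resp.read()

  # The DAV response header lists the compliance classes the server
  # claims; the token below is an assumed example.
  advertised = {t.strip() for t in resp.getheader("DAV", "").split(",")}

  if "version-control" in advertised:
      print("server claims the core versioning feature")
  else:
      print("fall back to plain WebDAV behavior")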

 > I'm guessing there are people doing implementations, though
 > given the rate of change in the specification doubtless
 > no one is current with the spec. 
 > So where is the sanity check?

The sanity check is the existing versioning repositories
(many of them with their own proprietary web access protocol).
The DeltaV protocol is a "unification" of that combined
experience.

Cheers,
Geoff

Received on Saturday, 13 January 2001 21:00:05 UTC