RE: Perspective on signing XML 2

   From: dee3@us.ibm.com
   Date: Fri, 23 Apr 1999 00:17:05 -0400

   [ . . . ]

   ### There is no requirement for the W3C to produce reference code.
   There is no requirement for the W3C to certify any implementation
   of anything as correct.  In fact, based on the assumption that the
   W3C's lawyers are not idiots, I doubt that the W3C ever has or
   ever will certify any particular software implementation of
   anything non-trivial as perfect and defect free. 

You're right, Donald.  The W3C produces only "sample implementation"
code (e.g., jigsaw, libwww, amaya).  This is a far cry from reference
code which defines a standard in the way the standard meter stick
housed in Paris (?) defines the meter or (to a large extent) the X
Consortium's code defined X Windows.  The W3C sample implementations
are extremely valuable reference materials because they reliably
instantiate W3C Recommendations and provide a solid foundation for
debugging, debate and comparison -- but they're not for "compliance
testing".

   ### Whether reference code is produced or not and whether or not
   bio-metric systems like Pen Ops are easily accommodated, I believe
   the mandatory to implement algorithms for interoperability will be
   cryptographic.  And they will be defined in terms of the crypto
   algorithm, not any particular software provider's API.  The
   mandatory to implement algorithm(s) will include something like
   "DSA" not something like "Foobar CryptoAPI with algorithm selector
   = 7".  Of course people should be able to select and do whatever
   sort of proprietary and/or non-interoperable stuff they want, but
   open standards bodies are not normally in the business of mandating
   particular products.

Yes!!!  "MUST implement" does NOT mean "MUST NOT implement anything
else".  I believe we can accommodate "signing" schemes based on
biometrics (nearly all of which appear to be proprietary) but it will
not be possible to produce an acceptable standard for either the W3C
or the IETF which either makes such schemes part of what's mandatory
or contains a load of special-purpose provisions to accommodate the
ones we happen to know about today (e.g., PenOP).
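To make the distinction concrete, a signature element along these lines
would name the mandatory algorithm abstractly rather than through some
vendor's API (this is purely an illustrative sketch -- the element and
attribute names are invented, not taken from any draft):

```xml
<!-- Hypothetical sketch only; names invented for illustration. -->
<Signature>
  <!-- The algorithm is identified as "DSA", not as
       "Foobar CryptoAPI with algorithm selector = 7". -->
  <SignatureMethod Algorithm="DSA"/>
  <SignatureValue>...base64 signature value...</SignatureValue>
</Signature>
```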

For proprietary and as-yet-unknown signing schemes (ones that produce
signature "blobs" beyond the scope of the spec) we'll need something
like what I'd suggested earlier -- pointer(s) to resource(s) from
which the application can get help with semantics, plug-in code to do
signature decoding, decryption, whatever.  There's an obvious risk in
using such pointers and resources because the semantics are entirely
application-dependent, and this is something we also need to state in
the spec.
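As a sketch of the sort of pointer mechanism I mean -- again, the
element names and URL here are invented for illustration, not proposed
syntax:

```xml
<!-- Hypothetical sketch: a proprietary signature "blob" plus a pointer
     to a resource from which an application can get help (semantics,
     plug-in decoding code, etc.).  The meaning of the pointed-to
     resource is entirely application-dependent -- that's the risk. -->
<Signature>
  <SignatureMethod Algorithm="urn:example:penop"/>
  <HelperResource href="http://signer.example.com/penop-plugin"/>
  <SignatureValue>...opaque biometric blob...</SignatureValue>
</Signature>
```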

   [ . . . ]

   ### If the only goal is to guarantee correctness in implementing
   the standard and exportability and there are no other requirements,
   we could probably achieve this by standardizing on null signatures.

... and in passing I'll just note that I think we'll still need to
include null signatures as part of the spec as "MUST" implement.
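A null signature would be trivially small -- something like the
following invented sketch -- which is exactly why it's useful as a
mandatory-to-implement baseline: it exercises the whole framework with
no cryptography and no export problems.

```xml
<!-- Hypothetical sketch: a "null" signature carrying no crypto at all. -->
<Signature>
  <SignatureMethod Algorithm="null"/>
  <SignatureValue/>
</Signature>
```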


--Bede

Received on Friday, 23 April 1999 16:22:21 UTC