Re: Poll on Exclusive Canonicalization

Phillip Hallam-Baker wrote:
> 
> > Right. As soon as you do the simplest thing, like wrapping one level
> > of element around a signature, or around an element with a signature as
> > a descendant, or around an element signed by a detached signature, or
> > removing one level of element around a signature, or around an element
> > with a signature or XML signed by a detached signature as a
> > descendant, interoperability breaks, if namespaces are involved.
> >
> > > Rather than mess with the 'mandatory' status we should
> > > recognize that the
> > > specification cannot mandate support for a specific canonicalization
> > > algorithm and it is entirely wrong and unjustified to do so.
> >
> > Why? Wouldn't your argument apply equally well to the DER
> > canonicalization of ASN.1? It's really a serialization of ASN.1 just
> > like "CanonicalXML" is a serialization of XML.
> 
> Actually, no, the DER canonicalization of ASN.1 is not part
> of the ASN.1 spec; it is 'specified' in a ten-line side note in the
> X.509v1 spec.

That is even more incorrect than Donald's earlier comment about
"colours". But true enough, I suppose, if you are stuck in 1988
and relying on tools that use X.208/X.209. It is X.690 that
specifies the Distinguished Encoding Rules of ASN.1; X.509 is
merely one ASN.1-based specification.

> DER canonicalization of ASN.1 is probably the single biggest reason
> for the complexity of X.509 implementations. It is based on a premise
> that is entirely wrong: X.509 certificates are not broken apart for
> inclusion in the directory in any working commercial implementation.

I would disagree. The design of the ASN.1 notation used in
X.509 is probably the biggest contributor to its complexity.
In particular, the ASN.1 used to specify the DNs used in
X.509 is overly general and complex when compared to the 
actual needs of many users. It allows for anything, when
only a subset would do.

> DER canonicalization is for losers, and so is the idea it embodies. It
> does not work, never has, and never will. It adds 6 months to the
> development of X.509 implementations and does nothing but subtract
> value.

ASN.1 DER actually works every time an X.509 certificate
is issued or used. Last time I looked, X.509 seemed to be
at the center of just about all of the e-commerce on the
web today.

And general-purpose toolkits that have no problem at all
converting arbitrary ASN.1 encodings into the DER canonical
form have been in use for some time. C14N does not seem
capable of doing that to an arbitrary XML document.
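
Just to make concrete what that kind of conversion amounts to, here
is a rough sketch in Python using the pyasn1 package (a present-day
library I am using purely as an illustration; the function name is
mine and not taken from any particular toolkit):

   # Sketch only: pyasn1 is just one library capable of this.
   from pyasn1.type import univ
   from pyasn1.codec.ber import decoder as ber_decoder
   from pyasn1.codec.der import encoder as der_encoder

   def normalize_to_der(encoding: bytes, spec) -> bytes:
       """Accept any BER encoding (DER included, since DER is a subset
       of BER) and return the one canonical DER encoding of the value."""
       value, rest = ber_decoder.decode(encoding, asn1Spec=spec)
       assert not rest, "trailing octets after the encoded value"
       return der_encoder.encode(value)

   # Example: a BER BOOLEAN TRUE with a non-canonical content octet
   # (0x01) comes back out in the canonical DER form (content 0xFF).
   ber_true = bytes([0x01, 0x01, 0x01])
   print(normalize_to_der(ber_true, univ.Boolean()).hex())  # '0101ff'

That decode-then-reencode step is all a toolkit needs in order to
bring an arbitrary legal encoding into the canonical form before
hashing it.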

Of course, there are some poorly written ASN.1 tools
still in use that apparently do a bad job. But I 
suppose that is true for XML tools as well.
 
> The purpose of canonicalization in XML is entirely different: it allows
> the XML Signature engine to sit at a higher level in the XML stack
> and process the DOM parse tree rather than the raw bytestream.

Hmmm. In ASN.1 the term canonicalization has to do with
ensuring that there is a single way in which a value of any
given data type is encoded. From this thread, it does appear
that you are correct: no such guarantee holds for XML, and
one may not even be possible. This is probably not a big deal
in general, as it is possible to carry an arbitrary piece
of XML in a CMS payload.
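
To make the ASN.1 sense of the word concrete, here is a minimal
sketch in plain Python (no ASN.1 library, the octets are spelled out
by hand): BER permits several encodings of one and the same BOOLEAN
value, while DER permits exactly one.

   # BOOLEAN TRUE: BER accepts any non-zero content octet ...
   ber_true_1 = bytes([0x01, 0x01, 0x01])  # tag 0x01, length 1, content 0x01
   ber_true_2 = bytes([0x01, 0x01, 0xFF])  # tag 0x01, length 1, content 0xFF

   # ... while DER requires the content octet to be 0xFF, so there is
   # only one canonical encoding of TRUE.
   der_true = bytes([0x01, 0x01, 0xFF])

   def decode_boolean(encoding: bytes) -> bool:
       """Decode a primitive BER/DER BOOLEAN: non-zero content is TRUE."""
       assert encoding[0] == 0x01 and encoding[1] == 0x01
       return encoding[2] != 0x00

   # Both BER encodings carry the same abstract value ...
   assert decode_boolean(ber_true_1) == decode_boolean(ber_true_2) == True

   # ... but only one of them is the canonical DER form.
   assert ber_true_2 == der_true and ber_true_1 != der_true

Guaranteeing that kind of one-to-one mapping between values and byte
strings is what canonicalization means on the ASN.1 side of the fence.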
 
> > Was it a mistake to standardize DER?
> 
> Yes, total, complete and colossal. It was an utter failure.

Well, at least it managed to get SSL and the like
going. That's a little better than nothing I suppose.

> X.509 (the spec in which DER is specified) would work much
> better without it.

Actually, this is true, but perhaps not for the same
reasons you might think. When the ASN.1 defined in X.509
certificates is written using the X.680 open type notation
(as is the case for the ASN.1 found in ISO 15782, ECheck,
SET, and in some of the H-series standards), it is possible
for tools to perform signature operations using BER or even 
the highly compressed Packed Encoding Rules (PER) defined in
X.691.

The open type representation of a value is simply the value
of an ASN.1 type in its encoded form. This essentially means
that every value of every ASN.1 type has a form that looks
the same to an ASN.1 tool - its open type representation.
These values all look alike, much like simple octet string
values, and are quite easy for tools to handle.

When a value is specified using open type notation, tools never
need to decode (though they may choose to do so if directed) 
the actual value, but can merely look at the leading tag and 
length octets and use these to skip over the rest of the string.

This works equally well using PER, CER, DER or BER. So for a
certificate, saying

   Certificate ::= SIGNED { EncodedX509CertBody }

   EncodedX509CertBody ::= 
         TYPE-IDENTIFIER.&Type( CertificateInfo )

allows well-designed ASN.1 tools to treat a value of type
EncodedX509CertBody as an opaque string - that is, as a value
of type CertificateInfo in its encoded form.

Of course this is precisely the representation a tool would
want the value in to verify or compute a signature. A very
simple situation regardless of the underlying complexity
hidden inside of the payload to be signed.
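
To show how little work that skip actually is, here is a sketch in
plain Python (the name and the simplifications are mine: it assumes a
single-octet tag and a definite-length encoding, which is all DER
permits anyway) that treats whatever follows as an opaque string:

   def skip_open_type(buf: bytes, offset: int = 0):
       """Return the complete encoding of the element starting at
       `offset` as an opaque byte string, plus the offset just past it,
       by reading only the leading tag and length octets."""
       start = offset
       offset += 1                   # one identifier (tag) octet
       first_len = buf[offset]
       offset += 1
       if first_len < 0x80:          # short form: length in one octet
           length = first_len
       else:                         # long form: next N octets hold length
           num_octets = first_len & 0x7F
           length = int.from_bytes(buf[offset:offset + num_octets], "big")
           offset += num_octets
       end = offset + length
       return buf[start:end], end    # the value, untouched and undecoded

   # Example: a SEQUENCE holding one INTEGER, handled as an opaque blob.
   encoded = bytes([0x30, 0x03, 0x02, 0x01, 0x05])
   blob, next_offset = skip_open_type(encoded)
   assert blob == encoded and next_offset == len(encoded)

That opaque blob is exactly the byte string a tool would feed to the
hash when verifying or computing a signature.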

But if you're stuck in 1988 with X.208, there is no such
capability, and the tools tend to be quite ignorant, flaky,
ineffective, and difficult to use. That is not at all true of
modern ASN.1 tools when they are used with well-written ASN.1.

> > Should interoperability of ASN.1 encoded signatures
> > have been entirely an application/protocol responsibility so that for
> > each protocol you would have had to figure out a
> > canonicalization/serialization?
> 
> The only data structure to which DER encoding is generally applied
> is X.509. I am not aware of any X.509 implementation that relies on
> the DER encoding being correct - which is just as well since most
of the implementations are actually faulty. Very few certificates
> actually have the SET OF constructions sorted into order.

It is up to the sender to get the encoding correct.
But since DER is merely a subset of BER, most well-behaved
web tools will give the sender the benefit of the doubt
and accept BER encodings as well as DER. So long as the
signature is correct, it doesn't really matter, and it tends
to make the web work a little more smoothly if folks try to
get along.
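
For the curious, the DER rule Phillip alludes to is simply that the
encodings of the components of a SET OF must appear in ascending
order, compared as octet strings. A strict receiver could check it
with a few lines of Python (a sketch under names of my own choosing,
assuming the component encodings have already been split out of the
SET OF):

   def der_set_of_in_order(component_encodings):
       """Check X.690's DER rule for SET OF: component encodings must
       appear in ascending order, compared as octet strings (shorter
       encodings are treated as padded at the end with zero octets)."""
       width = max((len(e) for e in component_encodings), default=0)
       padded = [e.ljust(width, b"\x00") for e in component_encodings]
       return all(a <= b for a, b in zip(padded, padded[1:]))

   # Two INTEGER components out of order: legal BER, but not the
   # canonical DER ordering Phillip says is so often missing.
   components = [bytes([0x02, 0x01, 0x07]), bytes([0x02, 0x01, 0x03])]
   assert not der_set_of_in_order(components)
   assert der_set_of_in_order(sorted(components))

A lenient receiver simply skips that check and verifies the signature
over the octets as they arrived.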
 
> > If canonicalization is required, as it clearly is for SignedInfo for
> > XML applications, if there is no required C14N, there is no
> > interoperability.
> 
> There will be no interoperability in any case since what is specified
> now DOES NOT WORK.
> 
> Case in point: we have been doing interop tests with another vendor,
> and the failure of the C14N spec to provide for interoperability is
> the holdup.
> 
> > In that case, I would question whether we have
> > something that would qualify as an IETF standard anymore. Certainly
> > you no longer have anything where you can expect implementations to
> > interoperate.
> 
> The interoperability of XML Signature is irrelevant; the interoperability
> of systems built on top of XML Signature is the only relevant issue.
> 
> No SOAP-like system is going to be able to specify the current C14N,
> ergo they will specify something else. Ergo whatever MUST you include
> in the specification has no effect on the specifications that matter.
> 
>                 Phill

Phil Griffin

Received on Wednesday, 20 June 2001 13:09:12 UTC