
Re: X509SerialNumber schema

From: Donald E. Eastlake 3rd <dee3@torque.pothole.com>
Date: Wed, 14 Mar 2001 14:52:12 -0500
Message-Id: <200103141952.OAA0000035578@torque.pothole.com>
To: Ken Goldman <kgold@watson.ibm.com>
cc: w3c-ietf-xmldsig@w3.org, <lde008@dma.isg.mot.com>
Hi Ken,

From:  Ken Goldman <kgold@watson.ibm.com>
Date:  Mon, 12 Mar 2001 16:50:14 -0500
Message-Id:  <200103122150.QAA25022@alpha.watson.ibm.com>
To:  w3c-ietf-xmldsig@w3.org
In-reply-to:  <4.3.2.7.2.20010312142432.02f72f08@rpcp.mit.edu> (reagle@w3.org)
CC:  "Donald Eastlake" <dee3@torque.pothole.com>, <lde008@dma.isg.mot.com>,
            tgindin@us.ibm.com, David.Solo@citicorp.com
References:   <4.3.2.7.2.20010312142432.02f72f08@rpcp.mit.edu>

>> Date: Mon, 12 Mar 2001 14:43:57 -0500
>> From: "Joseph M. Reagle Jr." <reagle@w3.org>
>> 
>> I'm not an X509 expert (or even novice) but I can speak to the XML issues 
>> and the spec.
>> 
>> At 15:30 3/9/2001 -0500, Ken Goldman wrote:
>> 
>> >2
>> >
>> >Assuming that (1) is right - If I have an X509SerialNumber from a
>> >certificate that is a long string of bits (Tom Gindin mentioned back in
>> >July that some certificates use a hash value of 160 bits) doesn't the
>> >binary to decimal conversion become computationally painful?

Basically, as far as I know, X.509v3 defines CertificateSerialNumber as
INTEGER.  Why should we be different?  The computational effort
involved in binary to decimal conversion seems insignificant compared
with any crypto computation.
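As an illustration (my own sketch, not part of the original thread): even a 160-bit serial number converts between its big-endian DER INTEGER bytes and decimal text in a single arbitrary-precision step.

```python
import hashlib

# Hypothetical 160-bit serial number, e.g. a SHA-1 hash used as a serial
# (the practice discussed in the thread); the input string is illustrative only.
serial_bytes = hashlib.sha1(b"example certificate data").digest()  # 20 bytes

serial = int.from_bytes(serial_bytes, byteorder="big")  # binary -> natural number
decimal_text = str(serial)                              # decimal text for the XML element

# Verifier side: decimal text back to the original bytes.
round_trip = int(decimal_text).to_bytes(20, byteorder="big")
assert round_trip == serial_bytes
```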

It is true that the unbounded size of this field in X.509 has led to
"abuses" where entire DER ASN.1 structures are encoded into this field
treated as a binary quantity.  But it isn't our fault that instead of
just numbering their certificates 1, 2, 3, ... as was presumably the
original concept, some CAs seem to want to encode lots of private
extension information and the kitchen sink into this field or use a
hash or whatever.

>> >It would seem like this might often be required in verification to
>> >match the XML to the ASN.1 in the certificate.
>> 
>> The natural language part of xmldsig relies upon the [LDAP-DN] reference 
>> (RFC2253) to say that the content should be a string.
>> [snip]
>
>I don't think RFC2253 is relevant, as it talks about the distinguished
>name, not the serial number.
>
>> >3
>> >
>> >Assuming that (1) is right - I recently received a document claiming to be
>> >signed using DSIG, which included the following XML fragment:
>> >
>> >         <X509SerialNumber>39F497CA</X509SerialNumber>
>> >
>> >Is this valid XML DSIG?  Was it valid at one time in the past?
>>
>> It does not comply with the most recent schema definition of course. It does 
>> comply with the June version
>>         http://www.w3.org/TR/2000/WD-xmldsig-core-20000601/
>> which was then changed to type integer in the July version:
>>         http://www.w3.org/TR/2000/WD-xmldsig-core-20000711/

We need feedback from implementors.  For interoperability, we either
need to give the encoding for this field in an attribute or something,
which seems unnecessarily complex, stick with mandating integer, or
change to mandating hex.  But a change at this late date certainly
needs more than a request from one person.
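For concreteness (my own sketch, not from the thread): the hex fragment quoted above, 39F497CA, would read as follows under the integer schema.

```python
hex_text = "39F497CA"                   # the value from the quoted fragment
serial = int(hex_text, 16)              # interpret as a hex-encoded natural number
print(serial)                           # 972330954, the integer-schema form
assert format(serial, "X") == hex_text  # and back to hex, losslessly
```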

Note that the example in the spec gives a serial number of "12345678"
which looks vaguely decimal to me (although I suppose it could be any
radix from 9 up...).

>> This change apparently happened because of discussion in Pittsburgh:
>>         http://lists.w3.org/Archives/Public/w3c-ietf-xmldsig/2000JulSep/0254.html
>
>I didn't see anything in that discussion. I found
>
>	http://lists.w3.org/Archives/Public/w3c-ietf-xmldsig/2000JulSep/0029.html
>
>which was a very brief "it's an integer in ASN.1, so we'll make it an
>integer."  Tom Gindin mentioned the 160 bit serial number problem, but
>the discussion died there.

Seems like an excellent reason to me to use integer and I certainly
don't see a 160 bit serial number as a problem with that decision.

>It's true that it's an integer in ASN.1, but that integer is actually
>an unbounded number of binary bits, not an XML decimal integer.  It's
>the same word, but with an entirely different meaning.

No, it's a different encoding of exactly the same meaning.  A natural
number.

>David Solo mentioned CryptoBinary (binary - base64 encoded) as an alternative.
>Either that or (binary - hex) encoding seems reasonable to me.

Is any existing implementation actually using CryptoBinary for this
now?  If not, I can't see why we would change to that encoding.
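For comparison (again my own sketch, not from the thread): the same serial number under a CryptoBinary-style base64 encoding versus the current integer encoding.

```python
import base64

serial = 972330954  # the 39F497CA serial from the earlier fragment, as a natural number
raw = serial.to_bytes((serial.bit_length() + 7) // 8, byteorder="big")

b64_form = base64.b64encode(raw).decode()  # CryptoBinary-style: base64 of the raw bytes
int_form = str(serial)                     # current schema: decimal integer text

assert b64_form == "OfSXyg=="
assert int_form == "972330954"
```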

>Integer, with the difficult conversion from binary, doesn't seem as
>good a choice.
>
>Is there any possibility of revisiting this?

Sure, if there is a clear working group consensus to change.

>> >4
>> >
>> >Assuming all of the above, why was integer chosen?  It would seem like
>> >binary, hex or base64 encoded, would be computationally easier when
>> >handling X509SerialNumber's with many bits.

If computational ease were the only criterion, why would you support
clunky ASN.1/DER-encoded certificates derived from the old ISO X.500
Directory effort?  And if shortness of the encoding were the
criterion, why would you be using XML?  Given that you are
standardizing in an area which already indicates that verbosity and
complexity are not the controlling criteria, I don't see what's wrong
with believing the X.509v3 spec and using integer...

>> I don't recall, perhaps Don or Tom could speak to this.
>
>-- 
>Ken Goldman   kgold@watson.ibm.com   914-784-7646

Thanks,
Donald
===================================================================
 Donald E. Eastlake 3rd                    dee3@torque.pothole.com
 155 Beaver Street                          lde008@dma.isg.mot.com
 Milford, MA 01757 USA     +1 508-634-2066(h)   +1 508-261-5434(w)
Received on Wednesday, 14 March 2001 14:52:35 GMT
