Re: X509SerialNumber schema

> Date: Wed, 14 Mar 2001 14:52:12 -0500
> From: "Donald E. Eastlake 3rd" <dee3@torque.pothole.com>
> 
> [moved to top] We need feedback from implementors.  For
> interoperability, we either need to give the encoding for this field
> in an attribute or something, which seems unnecessarily complex,
> stick with mandating integer, or change to mandating hex.  But a
> change at this late date certainly needs more than a request from
> one person.

I agree.  Any implementors out there want to contribute an opinion?

> Basically, as far as I know, X.509v3 defines CertificateSerialNumber as
> INTEGER.  Why should we be different?  The computational effort
> involved in binary to decimal conversion seems insignificant compared
> with any crypto computation.

We "should be different" because there are two different meanings of
the word integer.

In X.509, integer means an array of binary ones and zeros of any
length.  In XML schema, it's a text string using the characters 0-9
to represent a base 10 number.

I'm not sure the computation is insignificant.  2**160 is a very big
number, about 10**48, meaning a lot of Big Integer divides by 10.

I wouldn't even mind the computation, if the result was something
worthwhile.  For the crypto, it has to be done.  But base 10 isn't
enough better than base 16 to be worth the cost.
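
To put a rough number on it, here is a quick Java sketch (the 160-bit
value below is made up, not from any real certificate): the hex string
falls out one character per nibble, while the decimal string amounts
to one big-integer division by 10 per output digit, roughly 48 of them
for a 160-bit serial.  BigInteger.toString(10) does essentially the
same work internally; the loop is only there to make the cost visible.

    import java.math.BigInteger;

    public class SerialEncodingCost {
        public static void main(String[] args) {
            // Hypothetical 160-bit serial number (e.g. a hash used as a serial).
            BigInteger serial = new BigInteger(
                    "ab54a98ceb1f0ad2ab54a98ceb1f0ad2ab54a98c", 16);

            // Hex: each 4-bit nibble maps directly to one character, no arithmetic.
            String hex = serial.toString(16);

            // Decimal: conceptually a repeated divide-by-10, one big-integer
            // division per output digit (about 48 digits for a 160-bit value).
            StringBuilder decimal = new StringBuilder();
            BigInteger n = serial;
            while (n.signum() > 0) {
                BigInteger[] qr = n.divideAndRemainder(BigInteger.TEN);
                decimal.insert(0, qr[1].intValue());   // remainder = next low-order digit
                n = qr[0];
            }

            System.out.println("hex     (" + hex.length() + " chars): " + hex);
            System.out.println("decimal (" + decimal.length() + " chars): " + decimal);
        }
    }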

> It is true that the unbounded size of this field in X.509 has led to
> "abuses" where entire DER ASN.1 structures are encoded into this field
> treated as a binary quantity.  But it isn't our fault that instead of
> just numbering their certificates 1, 2, 3, ... as was presumably the
> original concept, some CA's seem to want to encode lots of private
> extension information and the kitchen sink into this field or use a
> hash or whatever.

Perhaps it's not "our fault" that CA's use large bit streams as
serial numbers, but we cannot ignore it either.

> [snip]
> Note that the example in the spec gives a serial number of "12345678"
> which looks vaguely decimal to me (although I suppose it could be any
> radix from 9 up...).

I think that XML schema is clear that it's a base 10 number.
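
And that matters for interoperability: the same eight characters name
two different numbers depending on the radix a consumer assumes.  A
trivial illustration, using the example value from the spec:

    public class RadixAmbiguity {
        public static void main(String[] args) {
            // The spec's example value, read as decimal and as hex.
            long asDecimal = Long.parseLong("12345678", 10);   // 12345678
            long asHex     = Long.parseLong("12345678", 16);   // 305419896
            System.out.println(asDecimal + " vs " + asHex);
        }
    }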
 
> >> This change apparently happened because of discussion in Pittsburgh:
> >>         http://lists.w3.org/Archives/Public/w3c-ietf-xmldsig/2000JulSep/0254.html
> >
> >I didn't see anything in that discussion. I found
> >
> >	http://lists.w3.org/Archives/Public/w3c-ietf-xmldsig/2000JulSep/0029.html
> >
> >which was a very brief "it's an integer in ASN.1, so we'll make it an
> >integer."  Tom Gindin mentioned the 160 bit serial number problem, but
> >the discussion died there.
> 
> Seems like an excellent reason to me to use integer and I certainly
> don't see a 160 bit serial number as a problem with that decision.
> 
> >It's true that it's an integer in ASN.1, but that integer is actually
> >an unbounded number of binary bits, not an XML decimal integer.  It's
> >the same word, but with an entirely different meaning.
> 
> No, it's a different encoding of exactly the same meaning.  A natural
> number.

True, base64-encoded binary, hex-encoded binary, and integer are three
different ways XML schema provides to encode a numeric value.
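
For a concrete (made-up) serial number, the three lexical forms would
look roughly like this -- base64 of the big-endian octets (CryptoBinary
style), hex, and decimal text:

    import java.math.BigInteger;
    import java.util.Base64;

    public class SerialLexicalForms {
        public static void main(String[] args) {
            // Hypothetical 128-bit serial number; top bit clear and no leading
            // zero octets, so toByteArray() returns exactly the magnitude octets.
            BigInteger serial =
                    new BigInteger("4a5b6c7d8e9f00112233445566778899", 16);

            String asBase64  = Base64.getEncoder().encodeToString(serial.toByteArray());
            String asHex     = serial.toString(16);
            String asDecimal = serial.toString(10);

            System.out.println("base64Binary: " + asBase64);
            System.out.println("hexBinary   : " + asHex);
            System.out.println("integer     : " + asDecimal);
        }
    }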

I'm becoming convinced that DSIG picked the wrong encoding, adding
computational effort with no corresponding advantage.

> >David Solo mentioned CryptoBinary (binary - base64 encoded) as an alternative.
> >Either that or (binary - hex) encoded seem reasonable to me.  
> 
> Is any existing implementation actually using CryptoBinary for this
> now?  If not, I can't see why we would change to that encoding.

There is at least one implementation I know of (because I received a
document it produced) that worked off the mid-2000 draft (where the
type was string) and interpreted the serial number as hex-encoded
binary.

> >Integer, with the difficult conversion from binary, doesn't seem as
> >good a choice.
> >
> >Is there any possibility of revisiting this?
> 
> Sure, if there is a clear working group consensus to change.
> 
> >> >4
> >> >
> >> >Assuming all of the above, why was integer chosen?  It would seem like
> >> >binary, hex or base64 encoded, would be computationally easier when
> >> >handling X509SerialNumber's with many bits.
> 
> If computational ease was the only criterion, why would you support
> klunky ASN.1/DER encoded certificates derived from the old ISO X.500
> Directory effort?  And if shortness of the encoding were the
> criterion, why would you be using XML?  Given that you are
> standardizing in an area which already indicates that verbosity and
> complexity are not the controlling criterion, I don't see what's wrong
> with believing the X.509v3 spec and using integer...

It's just that the choice of integer over binary-base64 or binary-hex
seemed, from the DSIG archives, fairly arbitrary.  I don't think anyone
other than Tom thought of the computation involved.

I'm not trying to reduce verbosity, just to avoid a computation that
seems to have no benefit.  In fact, I'd vote for hex over base64:
readability over compactness.

-- 
Ken Goldman   kgold@watson.ibm.com   914-784-7646
