Putting ObjectReferences First

I first raised this as a solution to what I considered to be a problem after
looking at the canonicalized versions of things.  Since then there have been
some off-line conversations to solidify what the problems are and what the
solutions are.  What follows is my best attempt at summarizing these
conversations:

There are two major issues about hash functions that need to be looked at:

1.  Is there enough data being hashed?
2.  Is the random data being hashed in the correct location?

For item #1, the answer is yes for all of the hash functions we are
currently dealing with.  The trick is to make sure that you fill the block
length of the hash computation (for SHA1 this is 512 bits) at least once.
Since our signatures are not going to be short, given the name space
canonicalization, this is not a problem.
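As a quick illustrative check (my own sketch, not part of the original
discussion), Python's hashlib exposes the compression-function block size,
which for SHA1 works out to the 512 bits mentioned above:

```python
import hashlib

# SHA1 processes input in fixed-size blocks; hashlib reports the block
# size in bytes.  64 bytes * 8 = 512 bits per compression-function block.
h = hashlib.sha1()
print(h.block_size * 8)  # → 512
```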

For item #2, the problem is a bit tougher to describe.  The question is how
much of the constant data is adding to the hash, and how much is just
consuming time.  Suppose you hash known text A, random text B, and known
text C with SHA1.  The hashing of A can be done once and the state of the
hash function saved out for all future texts B.  Nothing similar can be done
with the known text C, since hashing B causes an unpredictable change in the
internal state of the hash.  This means that one really wants to minimize
the length of A and maximize the length of C for known text that is part of
the hash computation.
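The save-the-state trick above can be sketched with hashlib's copy(), which
clones the internal state of a hash object.  The texts A, B, and C are from
the discussion; using copy() as the mechanism is my own illustration:

```python
import hashlib

A = b"known header text"      # constant prefix, identical in every message
C = b"known trailer text"     # constant suffix, also known in advance

# Hash the known prefix A once and keep the resulting state around.
prefix_state = hashlib.sha1(A)

def digest(middle: bytes) -> bytes:
    h = prefix_state.copy()   # resume from the saved state after A: A's
    h.update(middle)          # cost is paid only once.  The random middle B
    h.update(C)               # scrambles the state, so C must still be
    return h.digest()         # hashed from scratch on every message.

# Matches hashing A + B + C in one pass:
B = b"random middle"
assert digest(B) == hashlib.sha1(A + B + C).digest()
```

This is why a long known prefix adds little to the hash beyond a one-time
cost, while a long known suffix must be re-hashed for every message.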

Two ways to address this are to either 1) put a random nonce at the front of
the signature or 2) move the data that is already in the signature, and is
random, toward the front.  The problem with the nonce is that the nonce
value must be transmitted as part of the signature and thus increases the
size of all signed documents.  The problem with moving the data is that some
people believe that one pass processing is still possible and this would
break it.  (My personal belief is that one pass processing cannot be done
anyway, so this is not an issue.)
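A minimal sketch of option 1 (my own illustration; the document and nonce
length here are hypothetical): with a fresh random nonce hashed first, there
is no constant prefix whose hash state could be precomputed, but the nonce
has to travel with the signature so a verifier can redo the hash.

```python
import hashlib
import os

document = b"<SignedInfo>...</SignedInfo>"  # hypothetical signed content

# Signer side: a random nonce at the front means the precomputable known
# prefix A is empty.  The nonce becomes part of the transmitted signature,
# which is the size cost described above.
nonce = os.urandom(20)
signed_digest = hashlib.sha1(nonce + document).digest()

# Verifier side: recompute the digest using the transmitted nonce.
assert hashlib.sha1(nonce + document).digest() == signed_digest
```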

jim

Received on Thursday, 28 October 1999 17:49:10 UTC