- From: Bob Smart <Bob.Smart@cmis.CSIRO.AU>
- Date: Wed, 09 Jun 1999 10:45:44 +1000
- To: w3c-ietf-xmldsig@w3.org
> some way to carry the digest algorithm used and any required
> associated parameters. Otherwise, the approach might not easily
> survive change

Well, for canonicalization, change is impossible. Perhaps that is why it
makes us nervous. Since URIs aren't that long, a safe choice here would be
to hexify the URI itself rather than a hash of it.

The alternative to canonicalization is to provide a mechanism for
transferring the bits that were signed. This can't be impossible, since the
signature itself is just a string of bits and we have to transfer that
exactly. However, apart from being very ugly, it leaves a new question:
"Given two strings of bits representing XML, can we determine whether they
are the same document?" While it is easy to show that this problem is no
harder than canonicalization, it is not obviously easier. It does, however,
have the merit that the issue of handling change is now external to the
digital signature process.

So the no-canonicalization fallback position would be something like this.
The signature block consists of:

  - a sequence of pairs, each of which is
      an exact representation of the bits that were signed,
      OR a hash and a URI for that exact representation (*)

  - a sequence of pairs, each of which is
      a public key,
      a signature (over the corresponding exact representation)

[i.e. you can have multiple signatures, but you have to allow for the
signatures being applied to different representations.]

(*) If the hash is right for the currently available text, or for a
standard canonicalization thereof, then you don't need to fetch from the
URI. We could add an extra option, "OR a hash and a standard
canonicalization specification", in which case we have an
optional-canonicalization scheme. (A rough sketch of the structure is
below.)

Bob
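P.S. Here is a rough sketch, in Python, of the structure above, just to
make the discussion concrete. All of the names, the SHA-1 choice, and the
canonicalize() hook are mine for illustration only, not anything the group
has agreed on; signature verification itself is left out.

    import hashlib
    import urllib.request
    from dataclasses import dataclass
    from typing import Callable, Optional

    @dataclass
    class SignedItem:
        # First sequence: either the exact bits that were signed,
        # or a (hash, URI) pair standing in for them.
        exact_rep: Optional[bytes] = None  # exact representation, carried inline
        digest: Optional[bytes] = None     # hash of the exact representation
        uri: Optional[str] = None          # where the exact representation lives

    @dataclass
    class KeySignaturePair:
        # Second sequence: a public key and a signature made over
        # the corresponding exact representation.
        public_key: bytes
        signature: bytes

    def resolve_exact_rep(item: SignedItem,
                          local_text: bytes,
                          canonicalize: Callable[[bytes], bytes]) -> bytes:
        # Return the exact bits that were signed.
        # Per (*): if the stored hash matches the currently available text,
        # or a standard canonicalization of it, we never touch the URI.
        if item.exact_rep is not None:
            return item.exact_rep
        for candidate in (local_text, canonicalize(local_text)):
            if hashlib.sha1(candidate).digest() == item.digest:
                return candidate
        # Last resort: fetch the exact representation that was actually signed.
        with urllib.request.urlopen(item.uri) as resp:
            return resp.read()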
Received on Tuesday, 8 June 1999 20:46:23 UTC