- From: <david.solo@citicorp.com>
- Date: Fri, 19 Nov 1999 08:46:33 -0500
- To: jboyer@uwi.com, w3c-ietf-xmldsig@w3.org
- Message-Id: <H0000cc404c3ea16@MHS>
John,

Actually, both are true. The point is, if I sign paragraph X (a bunch of bytes), then that's what I've signed, whether the paragraph is a standalone object, retrieved via the net, or extracted from a larger document. The thing I believe everyone (except perhaps you) agreed to yesterday is that while target and transforms can be relied upon to tell the core code how to obtain paragraph X (your point 1), a signature is not automatically invalid if paragraph X is obtained a different way (your point 2) [i.e. performing the specified transforms is not semantically required for signature validation]. The only assertion made by the signature is that that exact collection of bytes, paragraph X, was signed. The fact that paragraph X was extracted from document Y is in no way cryptographically assured by the XML signature unless I include object references both to paragraph X and to document Y (and perform additional external validation).

Dave

> -----Original Message-----
> From: jboyer [mailto:jboyer@uwi.com]
> Sent: Thursday, November 18, 1999 6:27 PM
> To: w3c-ietf-xmldsig
> Cc: jboyer
> Subject: The XML-DSig Non-standard, or Location/Transforms as 'hints'
>
> One of the main points that has caused much of the recent debate over
> signing location and transforms is that some of us believe that
>
> 1) the ObjectReference's Location and Transforms will tell core code how
> to obtain the bucket of bits digested in DigestValue,
>
> while others of us believe that
>
> 2) the ObjectReference's Location and Transforms are a hint that 'may'
> help the application find the bits that the core code will need to do
> the validation.
>
> I'm having difficulty buying into this latter point of view because I
> think that far too much work is being pushed off to the application,
> which to me means that most signatures will not validate outside of
> their application domains.
> I don't see the point in having a 'standard' if the result is that
> applications don't interoperate.
>
> From an API point of view, proponents of the first idea seem to want to
> call CreateSignature() or VerifySignature() and give a pointer to a
> Signature element. Proponents of the second idea seem to want the same
> thing, except that they must first set up an application-specific
> callback function that CreateSignature() and VerifySignature() can use
> to help dig up the required bits. Therein lies the rub. Callbacks are a
> wonderful way to solve problems if you don't care about globally secure
> resources, application interoperability, and so forth. The first idea is
> in many of our minds because we associate 'standard' with
> interoperability.
>
> When the signer creates a signature, we are saying that Location and
> Transforms provide 'hints' that indicate how the signer created the
> bucket of bits. Presumably, when the signer signed, the Location and
> Transforms described precisely what happened. So we are basically saying
> that the verifier can treat these as hints rather than precise steps.
> Thus, the meaning of these *signed* bits has changed without breaking
> the signature. I agree that it will work in any single application
> context, but it has an unappealing engineering aesthetic.
>
> Finally, when proponents of the second idea say that Transforms are
> 'hints', does this mean that we will be making each application
> responsible for resolving the Transforms too? In other words, going back
> to the idea of the callback function, must the callback function resolve
> the Location, or must it resolve the Location and the Transforms, giving
> to core code the exact set of bits that should match the DigestValue
> once the DigestMethod is applied?
>
> John Boyer
> Software Development Manager
> UWI.Com -- The Internet Forms Company
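[Archive note: the crux of the thread — that a digest check binds only the exact bytes, not the route by which the verifier obtained them — can be sketched in a few lines. This is a hypothetical Python illustration using SHA-1 (the digest algorithm in the XML-DSig drafts of that era); the function and variable names are made up for the sketch and are not the spec's API.]

```python
# Sketch: a digest check asserts "these exact bytes were signed",
# nothing more. How the verifier came by the bytes (followed
# Location/Transforms, fetched them over the net, or got them from an
# application callback) is invisible to the check itself.
import hashlib

def digest_matches(candidate_bytes: bytes, expected_digest: bytes) -> bool:
    """True iff SHA-1 of candidate_bytes equals the signed digest value."""
    return hashlib.sha1(candidate_bytes).digest() == expected_digest

# The signer digested "paragraph X", extracted from document Y.
paragraph_x = b"<para>Some signed content</para>"
expected = hashlib.sha1(paragraph_x).digest()

# Verifier A follows the stated Location/Transforms and extracts
# paragraph X from document Y; verifier B fetched the same bytes as a
# standalone object. Both checks succeed identically:
assert digest_matches(paragraph_x, expected)
assert digest_matches(b"<para>Some signed content</para>", expected)

# Any change to the bytes breaks the check -- but a passing check says
# nothing about whether the bytes really came from document Y:
assert not digest_matches(b"<para>Tampered content</para>", expected)
```

This is why Dave's message notes that "paragraph X was extracted from document Y" is not cryptographically assured unless Y is itself covered by an additional object reference.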
Received on Friday, 19 November 1999 08:47:40 UTC