Re: IETF Signed-XML BOF Notes

It should be noted that the purpose of an IETF BoF is to determine if there
is sufficient community interest and a reasonable starting point for the
formation of an IETF Working Group.  Thus detailed technical discussion of
proposals really isn't the point, unless it is to show that no approach is
likely to enable advancement toward the advertised goal of the working
group that is proposed to be set up.

While creation of an IETF working group is never guaranteed, the BoF
clearly showed enough interest and support.  At this point, really, the
only question is what the workshop outcome is going to be.

See additional comments below, preceded by ###.

Thanks,
Donald

Donald E. Eastlake, 3rd
17 Skyline Drive, Hawthorne, NY 10532 USA
dee3@us.ibm.com   tel: 1-914-784-7913, fax: 1-914-784-3833

home: 65 Shindegan Hill Road, RR#1, Carmel, NY 10512 USA
dee3@torque.pothole.com   tel: 1-914-276-2668



"Joseph M. Reagle Jr. (W3C)" <reagle@w3.org> on 04/02/99 02:42:49 PM

To:   "Signed-XML Workshop" <w3c-xml-sig-ws@w3.org>
cc:    (bcc: Donald Eastlake/Hawthorne/IBM)
Subject:  IETF Signed-XML BOF Notes



Folks,

Don Eastlake has provided a location for the draft BOF notes; you can find
them at:
        ftp://ftp.pothole.com/pub/xml-dsig/IETF44minutes.txt

My brief summary:

1. DOM-HASH. Do we need it, why not just sign surface strings? Abstract
models and semantics can be confusing, but it permits compound documents.
There may be a crypto fault in the "hashes on hashes" algorithm. [1]

### The main point on which there was no dissent was that
canonicalization is necessary, for the reasons cited in the
minutes.  There was criticism by ekr (Eric Rescorla) that for
digital signing the recursive nature of the DOM HASH proposal
is not needed (as it would be for efficient tree comparison)
and is slower than just feeding a similarly defined ordered
byte stream for the entire structure to be signed into a
single hash function.
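
### Purely for illustration, a minimal sketch of the two approaches
being contrasted (modern Python; the Node class, serialization, and
tag layout are my own assumptions, not taken from DOM HASH or any
other proposal).  The recursive variant gives every subtree its own
digest and folds child digests into the parent's hash, which helps
tree comparison; the flat variant hashes one ordered byte stream for
the whole structure.

    import hashlib
    from dataclasses import dataclass, field

    @dataclass
    class Node:                      # hypothetical element node
        name: str
        children: list = field(default_factory=list)

    def recursive_hash(node: Node) -> bytes:
        # DOM-HASH-like idea: the parent's hash is computed over its
        # name plus the digests of its children (hash of hashes).
        h = hashlib.sha1(node.name.encode())
        for child in node.children:
            h.update(recursive_hash(child))
        return h.digest()

    def flat_hash(node: Node) -> bytes:
        # The alternative ekr described: serialize the whole structure
        # into one ordered byte stream and hash it once.
        def serialize(n: Node) -> bytes:
            return (b"<" + n.name.encode() + b">"
                    + b"".join(serialize(c) for c in n.children)
                    + b"</" + n.name.encode() + b">")
        return hashlib.sha1(serialize(node)).digest()

    tree = Node("doc", [Node("a"), Node("b", [Node("c")])])
    print(recursive_hash(tree).hex())
    print(flat_hash(tree).hex())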

### On the hashes of hashes question, I don't claim to be an
expert in this area.  But the modern definition of a strong
hash function is one for which it is computationally
infeasible to find an input string that will hash to a given
hash value AND computationally infeasible to find two input
strings that will hash to the same value.  The "Fingerprint"
function used in the reference evidently does not meet this
modern definition, because the reference states that it is easy
to come up with two inputs that have the same hash if you know
the "magic number".  (The modern definition assumes you know all
of the interior details of the hash function.)

### Furthermore, PKCS#7, as I recall, normally uses hashes
of hashes.  That is, it hashes the data to be signed as well
as any authenticated attributes and then signs the hash of
those hashes.  Taking the hash of hashes is a common
modern technique, even in systems that have received much
close examination, so I tend to believe that it is
well accepted for modern hash functions like SHA-1.
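
### A minimal illustrative sketch of that PKCS#7-style pattern
(Python; the byte concatenation here is a stand-in for the real DER
encoding of authenticated attributes, and the names are mine):

    import hashlib

    def to_be_signed(content: bytes, auth_attrs: bytes) -> bytes:
        # Hash the content first ...
        content_hash = hashlib.sha1(content).digest()
        # ... place that digest among the authenticated attributes
        # (simplified; really a DER-encoded attribute set) ...
        attrs_with_digest = auth_attrs + content_hash
        # ... and the value that actually gets signed is a hash over
        # that structure: a hash of a hash.
        return hashlib.sha1(attrs_with_digest).digest()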

2. Brown Draft: XML and binary, alg-independent, composite docs, adding and
removing parts, internal and external items (XLink), endorsement, multiple
recipients (shared keyed hash), co-signature. Concern about crypto
weaknesses. Again, why not just use S/MIME chunks?

### Although ekr said he had concerns about crypto weakness,
the only specific one he gave was an incorrect
criticism: he said that giving the hash function identifier
before the block to be signed is not sufficient for one-pass
processing, because HMAC requires not just the function but
also the key as a prefix.  But that is beside the point, since
the data is not signed directly; its hash is signed.  So
there is no flaw here (and if there were, it would be purely
an efficiency flaw, not a crypto flaw).
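
### To make that concrete, a sketch (my own illustration, not the
Brown draft's actual format) of why one-pass processing still works:
the data is hashed as it streams by, and the keyed hash is applied
only to the resulting fixed-size digest, so the key need not be
available as a prefix to the data itself.

    import hashlib
    import hmac

    def one_pass_then_keyed_hash(chunks, key: bytes) -> bytes:
        # Single pass over the (possibly streaming) data; only the
        # running hash state is kept, never the whole message.
        h = hashlib.sha1()
        for chunk in chunks:
            h.update(chunk)
        # The keyed hash (HMAC) covers the fixed-size digest.
        return hmac.new(key, h.digest(), hashlib.sha1).digest()

    tag = one_pass_then_keyed_hash([b"part one ", b"part two"],
                                   key=b"shared-secret")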

### On "why not just use S/MIME chunks?"  I think an XML
signature standard that isn't XML encoded misses the point.

3. IETF or W3C: Some folks are more comfortable with the IETF process and
security expertise, others feel the key is coordination with other XML
activities. Consensus from the W3C workshop needs to be reported back to the
IETF on W3C's plans.

### It seems clear that there would be wider participation
in an IETF working group.  The BoF had 157 attendees who
signed the attendance list.  The last time I counted, the
workshop had just over thirty participants, including W3C staff.
That's fewer than the number of people at the BoF who said they
would actively work on an IETF XML DSIG standards effort.
The IETF is a more open process and is less expensive to
participate in.

__

[1] Dan Connolly later provided me with the reference that led him to raise
this concern:

"Alas, the method is not sound. Recall that the probabilistic guarantee
is valid only if the strings being fingerprinted are independent of the
magic number. But fingerprints themselves are dependent on the
magic number, so the probabilistic guarantee is invalid whenever
fingerprints are fingerprinted. The Vesta group was soon debugging
an unexpected collision."
excerpted from
http://www.research.digital.com/SRC/m3sources/html/fingerprint/src/Fingerprint.i3.html
Copyright (C) 1994, Digital Equipment Corp.



___________________________________________________________
Joseph Reagle Jr.  W3C:     http://www.w3.org/People/Reagle/
Policy Analyst     Personal:  http://web.mit.edu/reagle/www/
                   mailto:reagle@w3.org
