- From: Dave Kristol <dmk@allegra.att.com>
- Date: Wed, 22 Mar 95 09:31:42 EST
- To: http-wg%cuckoo.hpl.hp.com@hplb.hpl.hp.com
Earlier I raised a concern about MD5 and the case (upper or lower) used for the hex output. There's a more fundamental problem. I believe MD5 defines how to get a 128-bit digest of a message. We have to define how such a string of bits gets converted to ASCII characters: byte (and nibble) order, prefixes (0x?) or suffixes, and case. "It's obvious" is an unacceptable answer when you're defining interoperability standards.

The case problem (is it 789ABCDEF or 789abcdef?) is important because the digests of A1 and A2 themselves get digested in <digest>. Clearly a digest of 789ABCDEF is different from a digest of 789abcdef.

Here are some proposed words:

1.3 MD5 Digest

For the purposes of this Standard, an MD5 digest of 128 bits is represented as 32 ASCII printable characters. The bits in the 128-bit digest are converted from most significant to least significant bit, four bits at a time, to their ASCII representation as follows. Each four bits is represented by its familiar hexadecimal notation from the characters 0123456789abcdef. That is, binary 0000 is represented by the character '0', 0001 by '1', and so on up to the representation of 1111 as 'f'.

Dave Kristol
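[A minimal sketch of the conversion proposed above, assuming Python's hashlib only to obtain the 128-bit digest; the rendering itself follows the most-significant-nibble-first, lowercase rule in the proposed wording.]

    import hashlib

    HEX_CHARS = "0123456789abcdef"

    def md5_hex(message: bytes) -> str:
        """Render a 128-bit MD5 digest as 32 ASCII characters,
        most significant nibble of each byte first, lowercase."""
        digest = hashlib.md5(message).digest()  # 16 bytes = 128 bits
        out = []
        for byte in digest:
            out.append(HEX_CHARS[byte >> 4])    # high (most significant) nibble first
            out.append(HEX_CHARS[byte & 0x0F])  # then low nibble
        return "".join(out)

    # Sanity check: matches the conventional lowercase hex rendering.
    assert md5_hex(b"") == hashlib.md5(b"").hexdigest()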
Received on Wednesday, 22 March 1995 06:48:04 UTC