
can first hex octet omit leading zeros?

From: Morris Matsa <mmatsa@us.ibm.com>
Date: Thu, 30 Nov 2000 17:52:02 -0500
To: www-xml-schema-comments@w3.org
Message-ID: <OFBBDE8325.6A41756E-ON852569A7.007ACA13@somers.hqregion.ibm.com>

The spec (http://www.w3.org/TR/2000/CR-xmlschema-2-20001024/#dt-encoding)
says:
"encoding is the encoded form of the lexical space"

Thus, the encoding defines the lexical space (not the value space) of a
binary number.  In particular, the hex encoding is defined (same source)
by:
If the value of encoding is hex, then each binary octet is encoded as a
character tuple, consisting of two hexadecimal digits ([0-9a-fA-F])
representing the octet code.

Thus, to express the octet with binary value 00001111 (decimal 15) in a
space defined by a schema as a hex encoding, the only lexical value allowed
is "0F" (two hexadecimal digits).  "F", which drops the leading zero of the
octet, is not allowed.
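To make the constraint concrete, here is a small sketch (my own, not from the
spec) of the lexical check the quoted wording implies: every octet must appear
as exactly two hexadecimal digits, so a valid literal always has an even
number of digits.

```python
import re

# Lexical check matching the quoted rule: each binary octet is encoded as
# a tuple of exactly two hexadecimal digits, so the literal as a whole is
# zero or more two-digit groups.
HEX_LEXICAL = re.compile(r'([0-9a-fA-F]{2})*')

def is_valid_hex_lexical(literal: str) -> bool:
    return HEX_LEXICAL.fullmatch(literal) is not None

print(is_valid_hex_lexical("0F"))  # True: one octet, two digits
print(is_valid_hex_lexical("F"))   # False: leading zero omitted
```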

For the schema that I'm working on, I have certain XML entries that are
numbers expressed in hex (e.g. 12, 3F, 2AB).  Am I right that I cannot
use the binary type from XML Schema to represent general hexadecimal data
like this?  (2AB is not allowed, since the lexical space constrains it to
02AB.)
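One possible workaround (my own sketch, not anything the spec suggests) is to
left-pad an odd-length hex number with a single zero before writing it out, so
that it falls into a whole number of octets:

```python
HEX_DIGITS = set('0123456789abcdefABCDEF')

def to_hex_binary_lexical(hex_number: str) -> str:
    """Left-pad an arbitrary hex number (e.g. '2AB') with one '0' when
    its digit count is odd, yielding a whole number of octets."""
    if not hex_number or any(c not in HEX_DIGITS for c in hex_number):
        raise ValueError("not a hexadecimal number: %r" % hex_number)
    return hex_number if len(hex_number) % 2 == 0 else '0' + hex_number

print(to_hex_binary_lexical("2AB"))  # prints 02AB
print(to_hex_binary_lexical("3F"))   # prints 3F (already two digits per octet)
```

Of course this changes the stored form, so a consumer that wants the original
number back has to know to strip the padding.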

I think that this decision is fine, and I'm still using the hex encoding
for other elements.  But I do not think it is obvious, and I suggest
explaining it somewhere (the primer?).

Morris
Received on Thursday, 30 November 2000 17:53:21 GMT
