RE: Feedback on precision of Decimal type.

Actually, now that I have read the spec a little more carefully, I realize
that I misinterpreted what the Decimal type is supposed to represent.  When
I first read it, I thought it was supposed to represent a decimal integer.
After looking at it again, I realize it is really supposed to represent a
floating point value.

My concern was with having a 19-digit integer as the base type for all
integers.  Since Decimal is really a floating point value, this is not as
much of a concern to me.
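
For reference, the kind of 32-bit-safe subset I mentioned in my original
note (quoted below) would look something like this.  It is only a rough
sketch; the type name int32 is my own, I am assuming the xsd prefix is
bound to the schema namespace in the usual way, and I may not have the
restriction syntax exactly right:

    <xsd:simpleType name="int32">
      <xsd:restriction base="xsd:integer">
        <!-- constrain the value space to the signed 32-bit range -->
        <xsd:minInclusive value="-2147483648"/>
        <xsd:maxInclusive value="2147483647"/>
      </xsd:restriction>
    </xsd:simpleType>

That keeps every legal value inside a signed 32-bit word, but as I said in
the original note, it only helps when I am the one writing the schema.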

Integers would be represented by ints and longs, and your definitions for
these seem to be fine for embedded applications.
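
For example, if I am reading the built-in type definitions correctly,
declarations along these lines would cover the integer cases I care about
(again just a sketch; the element names are made up for illustration):

    <!-- int is limited to -2147483648 .. 2147483647, so it always fits
         in a signed 32-bit word (at most 10 decimal digits) -->
    <xsd:element name="sampleCount" type="xsd:int"/>

    <!-- long allows values up to 9223372036854775807, so it needs a
         64-bit word; a schema would only use it when the extra range
         is actually required -->
    <xsd:element name="byteTotal" type="xsd:long"/>

With that split, an embedded parser only has to deal with 64-bit values
when a schema explicitly asks for them.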

Thanks for taking the time to read my email.  Sorry about the confusion on
my part.

						---- Charles


> -----Original Message-----
> From: Ashok Malhotra [mailto:petsa@us.ibm.com]
> Sent: Monday, January 08, 2001 10:38 AM
> To: Gordon, Charles
> Cc: www-xml-schema-comments@w3.org
> Subject: Re: Feedback on precision of Decimal type.
> 
> 
> 
> Thank you for this note.  This is precisely the kind of usage feedback
> we are looking for during the CR phase.  I'm forwarding your note to the
> schema-comments mailing list.  Since the length of decimal digits is a
> priority feedback issue, we will probably discuss it at next week's
> meeting.
> 
> 
> All the best, Ashok
> 
> 
> "Gordon, Charles" <CGordon@netsilicon.com> on 01/04/2001 03:51:11 PM
> 
> To:   Ashok Malhotra/Watson/IBM@IBMUS, "'Paul.V.Biron@kp.org'"
>       <Paul.V.Biron@kp.org>
> cc:
> Subject:  Feedback on precision of Decimal type.
> 
> 
> 
> Dear Editors,
> 
> I am starting a project using XML schemas and have therefore been
> reading your specs.  My application is to use XML to transfer data to
> and from embedded devices.  One thing that concerns me in your spec is
> the requirement that the Decimal type have a precision of 18 digits.
> This implies that it is represented by a 64-bit word in memory.  There
> was a note in that section requesting feedback on that decision, so here
> is some from me.
> 
> I am developing code on an ARM7 processor using the Green Hills tool
> set.  This compiler does not support 64-bit integers at all, not even as
> longs.  I am sure that many other tool sets for 32- and 16-bit
> processors will also have this limitation.  For cost reasons, most
> embedded applications only use 16- or 32-bit processors.  There are some
> that even use 8-bit processors.  These systems are not going to be able
> to support 18-digit numbers efficiently.  Most of the compiler sets for
> the 16-bit processors won't even have built-in support for them.  The
> tool set I'm using for the ARM7 (which is a 32-bit processor) doesn't
> support 18-digit numbers (and I'm sure it's not the only one).
> 
> I am aware that I could create my own integer type that subsets the
> Decimal type and constrains the range of allowable values so that it can
> be represented in 32-bit words.  However, this only helps me if I am
> writing the schema.  There will be many applications where an embedded
> device may need to support an existing schema.  In this case, the author
> may have simply used the standard XML Decimal and Integer types.  The
> embedded system will be forced to support 64-bit integers even though
> the compiler tools do not.  This creates extra work for the programmer
> and reduces the performance of the application.
> 
> I suspect that 32-bit words will be adequate for most applications.  I
> suggest that you make the Decimal type have a precision of 9 digits.
> This is supportable with 32-bit words.  You can define another type,
> Long for example, that supports 18 digits.  This would solve the
> problem, since most developers would only specify a Long type if their
> application actually needs it.
> 
>                               ---- Charles
> 
> 
> 
> 
