- From: <bugzilla@wiggum.w3.org>
- Date: Fri, 06 Jan 2006 13:35:03 +0000
- To: www-xml-schema-comments@w3.org
- Cc:
http://www.w3.org/Bugs/Public/show_bug.cgi?id=2627

cmsmcq@w3.org changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
             Status|NEW                         |ASSIGNED

------- Additional Comments From cmsmcq@w3.org  2006-01-06 13:35 -------
An alternative proposal, intended to make clear that the reference to
decimals and integers being the numbers "generally used in describing
datatypes" means that decimals and integers are used in the algorithms
of appendices D and E.

Change the first bullet item from

    A number (without precision) is an ordinary mathematical number;
    see Numerical Values (§D.1) for a discussion of "ordinary" versus
    "precision-carrying" numbers. The numbers generally used in
    describing datatypes are decimal numbers and integers.

to

    A number (without precision) is an ordinary mathematical number;
    1, 1.0, and 1.000000000000 are the same number. The numbers
    generally used in the algorithms below are decimal numbers and
    integers.

But I could also live with the suggestion in the description of the issue.
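As an illustrative aside (not part of the bug comment or the XML Schema specification), the distinction the proposed wording relies on can be sketched with Python's decimal module, which happens to model both views: compared as ordinary mathematical numbers, 1, 1.0, and 1.000000000000 are the same number, while a precision-carrying representation still distinguishes them by their stored exponents.

```python
# Minimal sketch, using Python's decimal module purely as an analogy
# for "ordinary" versus "precision-carrying" numbers (appendix D.1).
from decimal import Decimal

a = Decimal("1")
b = Decimal("1.0")
c = Decimal("1.000000000000")

# As ordinary mathematical numbers, all three are equal.
assert a == b == c

# As precision-carrying values, the representations differ:
# the exponent records how many trailing decimal places were written.
print(a.as_tuple().exponent)  # 0
print(b.as_tuple().exponent)  # -1
print(c.as_tuple().exponent)  # -12
```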
Received on Friday, 6 January 2006 13:35:07 UTC