
Re: IDL: number types

From: Boris Zbarsky <bzbarsky@MIT.EDU>
Date: Wed, 20 Mar 2013 13:18:44 -0400
Message-ID: <5149EF74.5050704@mit.edu>
To: Marcos Caceres <w3c@marcosc.com>
CC: Yehuda Katz <wycats@gmail.com>, Anne van Kesteren <annevk@annevk.nl>, public-script-coord@w3.org
On 3/20/13 12:28 PM, Marcos Caceres wrote:
> What I am trying to understand is what types are being used by authors today and why? For instance, why does the XHR spec really need to use an "unsigned short" for a set of constants that go from 0-4 (instead of an octet)? And so on.

Ah.  So the answer to that is likely to be "for no particularly good 
reason" in many cases.  Or at least for no particularly good reason that 
matters to JS.

> The argument (complaint?) being made, as I understand it, is that the current numeric types in WebIDL don't have equivalents in ES itself

True.

Right now ES the language effectively has concepts of doubles, 32-bit 
signed and unsigned ints, and 16-bit unsigned ints (though this last is 
mostly present in the form of "sequence of 16-bit unsigned ints" and in 
String.fromCharCode).

That said, there are lots of places where people use JS strings or 
arrays as effectively "sequence of some sort of 8-bit integer" in 
practice, and just store them in 16-bit or 32-bit integers with the high 
bits 0...
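
[For concreteness, a small sketch of where those integer concepts surface in the language itself; everything below is standard ES coercion behavior, not anything WebIDL-specific:]

```javascript
// ToInt32 / ToUint32: the bitwise operators coerce doubles to
// 32-bit signed and unsigned integers respectively.
const toInt32  = (x) => x | 0;    // 32-bit signed
const toUint32 = (x) => x >>> 0;  // 32-bit unsigned

console.log(toInt32(0x80000000));  // -2147483648 (wraps around to signed)
console.log(toUint32(-1));         // 4294967295

// ToUint16 shows up via String.fromCharCode, which takes each
// argument modulo 2^16:
console.log(String.fromCharCode(0x10041)); // "A" (0x10041 % 0x10000 === 0x41)

// And strings themselves act as the "sequence of 16-bit unsigned
// ints" case, read back with charCodeAt:
console.log("A".charCodeAt(0)); // 65
```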

> (and apparently, this leads to confusion in certain cases

I would love more information on this.

-Boris
Received on Wednesday, 20 March 2013 17:19:13 UTC
