[whatwg] Endianness of typed arrays

Personally, I think TypedArrays are too closely tied to WebGL -- as they
become part of JS proper rather than a WebGL-only feature (for example,
IE10 will support TypedArrays but not WebGL), they are going to be used
for other things.  For example, reading/writing binary file formats or
exchanging wire-protocol packets -- in either of these cases, you need a
specific endianness, so the existing typed arrays aren't useful and you
need to use DataView (which has its own limitations, such as no array
views).
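To make the contrast concrete, here is a minimal sketch of reading a
16-bit little-endian field from a packet.  DataView lets you state the
byte order per read, while a Uint16Array view silently uses whatever
the host CPU's byte order happens to be:

```javascript
// A two-byte buffer holding the little-endian value 0x0201.
const packet = new Uint8Array([0x01, 0x02]).buffer;

// DataView: byte order is explicit on every access
// (second argument true = little-endian).
const view = new DataView(packet);
const lengthLE = view.getUint16(0, true); // always 0x0201

// Uint16Array: the result depends on the host CPU --
// 0x0201 on a little-endian machine, 0x0102 on a big-endian one.
const nativeRead = new Uint16Array(packet)[0];
```

The DataView read is portable; the typed-array read is not, which is
exactly the problem for file formats and wire protocols.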

Even in the case of WebGL textures/etc, to do it right you need to detect
the byte order and select the resource from the server that already has
that byte order when you fetch it via XHR or WS.  It would be easier to
know that I am fetching a little-endian texture from the server, and then
when I pass it to something that needs big-endian data it gets converted
at that point (or perhaps when I create a Uint16Array from a
Uint16LEArray).
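For reference, the byte-order detection step that developers are stuck
with today looks roughly like this -- write a known 16-bit value through
one view and inspect the first byte through another:

```javascript
// Sketch of host byte-order detection using overlapping views.
const littleEndian = (() => {
  const buf = new ArrayBuffer(2);
  new Uint16Array(buf)[0] = 0x0102; // write through a 16-bit view
  // If the low-order byte (0x02) comes first, the host is little-endian.
  return new Uint8Array(buf)[0] === 0x02;
})();
```

Only after running something like this can the page pick the matching
resource URL, which is the extra round of bookkeeping the message above
is complaining about.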

Leaving it unspecified and requiring developers to detect the byte order
themselves so they can fetch the proper resources means they are more
likely to just assume the native byte order is the 99% case and not worry
about the rest.

If you were designing a mechanism for JS to manipulate binary data from
scratch, you would not design it this way.  However, we are likely stuck
with what we have, so we can only talk about enhancements -- perhaps adding
byte-order-specific views would be sufficient, or maybe extending DataView
to support array views (with strides, perhaps) would be better.
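As a sketch of what the DataView enhancement might feel like, here is a
hypothetical helper (readUint16Array is my own name, not part of any
spec) that reads an array of 16-bit values with an explicit byte order
and an optional byte stride, built on today's DataView:

```javascript
// Hypothetical array-style read over DataView: explicit endianness,
// optional byte stride for interleaved data.  Illustration only.
function readUint16Array(buffer, byteOffset, count, littleEndian, byteStride = 2) {
  const view = new DataView(buffer, byteOffset);
  const out = new Uint16Array(count);
  for (let i = 0; i < count; i++) {
    // Each element is read at its strided offset with a fixed byte order.
    out[i] = view.getUint16(i * byteStride, littleEndian);
  }
  return out;
}
```

Something along these lines, specified natively (and without the
per-element call overhead), is what "extending DataView to support array
views" would amount to.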

-- 
John A. Tamplin
Software Engineer (GWT), Google

Received on Wednesday, 28 March 2012 08:01:39 UTC