From: Aryeh Gregor <Simetrical+w3c@gmail.com>
Date: Mon, 15 Nov 2010 19:46:58 -0500
On Mon, Nov 15, 2010 at 1:19 AM, Boris Zbarsky <bzbarsky at mit.edu> wrote:
> 2) Casting an array of integers to an array of bytes will give you
>    different results on different hardware. For example, the
>    integer 0xffff0080 when viewed as imagedata bytes is either
>    rgba(255, 255, 0, 0.5) (half-opaque yellow) or rgba(128, 0, 255, 1)
>    (fully opaque purplish blue), depending on your endianness.

That's evil. Isn't JavaScript meant to conceal machine details like endianness? Couldn't we mandate that the conversion here must be little-endian? Granted that it'd be slower on ARM and such, but "slower" is way better than "causes the program to break". If the performance is really needed, provide extra methods that convert in big-endian fashion. Then those writing programs targeted at ARM, or those who are willing to write different algorithms for big- and little-endian, can use those instead.

Or has this already become such a big and general problem that fixing it is basically hopeless, and we're just resigned to everyone's scripts breaking on ARM because they were only tested on x86?
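For anyone following along, the cast in question looks roughly like this with the typed-array API (variable names are mine, just for illustration):

  // One 4-byte buffer, viewed both as a 32-bit integer and as raw bytes.
  var buf = new ArrayBuffer(4);
  var asInt = new Uint32Array(buf);
  var asBytes = new Uint8Array(buf);

  asInt[0] = 0xffff0080;

  // On a little-endian machine these read back as 128, 0, 255, 255,
  // i.e. rgba(128, 0, 255, 1) if fed straight into imagedata; on a
  // big-endian machine they read back as 255, 255, 0, 128, i.e.
  // rgba(255, 255, 0, 0.5).
  alert([asBytes[0], asBytes[1], asBytes[2], asBytes[3]].join(", "));

Same buffer, same write; which colour you end up with is decided purely by the hardware, which is exactly the kind of thing scripts shouldn't be able to observe.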
Received on Monday, 15 November 2010 16:46:58 UTC