- From: Mark Callow <callow_mark@hicorp.co.jp>
- Date: Wed, 28 Mar 2012 18:32:18 +0900
On 28/03/2012 18:13, Boris Zbarsky wrote:
>
> What one could do is to store the array buffer bytes always as little
> endian, and then if sending to the GPU byte-swap as needed based on
> the API call being used (and hence the exact types the GPU actually
> expects).
>
> So basically, make all JS-visible state always be little-endian, and
> deal in the one place where you actually need native endianness.
>
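For reference, a minimal sketch of what that "swap only at the GL boundary"
step might look like (TypeScript; the helper name and structure are assumptions
for illustration, not part of any existing implementation):

    // Sketch of the "store little-endian, swap at the GL boundary" idea.
    // Helper name and structure are assumptions, not a real API.

    const hostIsLittleEndian: boolean = (() => {
      const probe = new DataView(new ArrayBuffer(2));
      probe.setUint16(0, 0x0102, /* littleEndian */ true);
      return new Uint16Array(probe.buffer)[0] === 0x0102;
    })();

    // Swap the bytes of every element, given the element size implied by the
    // GL call (4 for GL_FLOAT / GL_UNSIGNED_INT, 2 for GL_SHORT, 1 for GL_BYTE).
    function toHostOrder(bytes: Uint8Array, elemSize: number): Uint8Array {
      if (hostIsLittleEndian || elemSize === 1) return bytes;  // nothing to swap
      const out = bytes.slice();                // copy; JS-visible bytes stay LE
      for (let i = 0; i + elemSize <= out.length; i += elemSize) {
        for (let j = 0; j < elemSize >> 1; j++) {
          const t = out[i + j];
          out[i + j] = out[i + elemSize - 1 - j];
          out[i + elemSize - 1 - j] = t;
        }
      }
      return out;
    }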
Then, if you are on a big-endian system, an app will not be able to read
& write ints, floats, etc. directly through the Int32Array, Float32Array,
etc. views. "Typed" in TypedArrays will no longer have any meaning.
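To make the concern concrete (a sketch under the proposed scheme, not from
the original mail): the app would be pushed back onto DataView, whose
accessors take an explicit byte-order flag, instead of the typed views:

    // DataView takes an explicit byte-order flag, so it still works,
    // but the convenient typed views do not.
    const buf = new ArrayBuffer(4);
    const view = new DataView(buf);

    view.setFloat32(0, 1.5, /* littleEndian */ true);  // always stores LE bytes
    const ok = view.getFloat32(0, true);                // 1.5 on any host

    // A Float32Array, by contrast, reads the buffer in the platform's own
    // byte order, so on a big-endian host the same bytes come back scrambled.
    const f32 = new Float32Array(buf);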
BTW, if the CPU & GPU differ in endianness, it is the responsibility of
the OpenGL driver to handle it. When you tell GL you are passing it,
e.g., GL_FLOAT data, the values are expected to be in CPU byte order.
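For comparison, the usual upload path, where the typed array stays in CPU
byte order and the driver deals with any mismatch (sketch; assumes an
existing context gl and a previously looked-up attribute location posLoc):

    // Assumptions for the sketch: a live WebGL context and attribute location.
    declare const gl: WebGLRenderingContext;
    declare const posLoc: number;

    // The Float32Array is in CPU byte order; any CPU/GPU endianness
    // difference is handled inside the driver, not by the application.
    const positions = new Float32Array([0, 0, 0, 1, 0, 0, 0, 1, 0]);

    const vbo = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, vbo);
    gl.bufferData(gl.ARRAY_BUFFER, positions, gl.STATIC_DRAW);

    gl.vertexAttribPointer(posLoc, 3, gl.FLOAT, false, 0, 0);  // "these are GL_FLOATs"
    gl.enableVertexAttribArray(posLoc);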
Regards
-Mark
Received on Wednesday, 28 March 2012 02:32:18 UTC