[whatwg] Endianness of typed arrays

vertexAttribPointer lets you specify to WebGL the layout and type of
the data in the buffer object. The API follows OpenGL {,ES} for
familiarity and reflects its heritage as a C API that avoids the use
of structures. But it works.
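
For illustration, a minimal sketch of describing an interleaved layout
to WebGL (the gl context, attribute locations, and the particular
position/texcoord layout here are assumptions, not from the thread):

    // Hypothetical layout: 3 position floats then 2 texcoord floats,
    // 20 bytes per vertex, both attributes in the same buffer.
    var stride = 5 * Float32Array.BYTES_PER_ELEMENT;  // 20 bytes
    gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
    gl.vertexAttribPointer(posLoc, 3, gl.FLOAT, false, stride, 0);
    gl.vertexAttribPointer(texLoc, 2, gl.FLOAT, false, stride, 12);
    gl.enableVertexAttribArray(posLoc);
    gl.enableVertexAttribArray(texLoc);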

OpenGL {,ES} developers typically load data from a serialized form and
perform endianness conversion during deserialization. The serialized
form is what would be loaded into an ArrayBuffer via XHR; it is then
deserialized into one or more additional ArrayBuffers.
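
A minimal sketch of that pattern, assuming little-endian data on the
wire, a hypothetical mesh.bin resource, and an existing gl context;
DataView accessors take an explicit byte-order flag, so the conversion
behaves the same on any host CPU:

    var xhr = new XMLHttpRequest();
    xhr.open("GET", "mesh.bin", true);
    xhr.responseType = "arraybuffer";
    xhr.onload = function () {
      var src = new DataView(xhr.response);
      // Deserialize into a second buffer holding host-order floats.
      var verts = new Float32Array(src.byteLength / 4);
      for (var i = 0; i < verts.length; i++) {
        verts[i] = src.getFloat32(i * 4, true);  // true = little-endian
      }
      gl.bufferData(gl.ARRAY_BUFFER, verts, gl.STATIC_DRAW);
    };
    xhr.send();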

Regards

    -Mark

On 28/03/2012 18:42, Boris Zbarsky wrote:
> On 3/28/12 2:22 AM, Jonas Sicking wrote:
>> Except that if the data was written in 32-bit units you do a different
>> byte swapping than if it was written in 16-bit units.
>
> Hmm.  I just read the WebGL spec more carefully and discovered that
> bufferData() actually takes a byte array whose structure is opaque,
> which would mean that in that case it does become very difficult to
> figure out what the swapping pattern should be.  :(
>
>> The typed-array spec was specifically designed for use cases like
>> sending buffers containing data in patterns like "32-bit data, 16-bit
>> data, 16-bit data, 32-bit data, 16-bit data, 16-bit data...".
>
> With all due respect, if it were really designed for those use cases
> it would allow declaring a "type" made of "32-bit data, 16-bit data,
> 16-bit data" and then creating an array of such types....  I
> understand we might get such APIs eventually, and then WebGL may end
> up usable again on big-endian platforms if developers use those APIs
> to fill the array buffer.  But as things stand, I think you're right:
> creating a browser implementation on big-endian hardware that has
> working WebGL and works correctly with existing code that gets data
> using XHR arraybuffer is impossible as far as I can see.
>
> -Boris
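
To make the unit-size point concrete, a small sketch with made-up byte
values, showing that swapping the same four bytes as one 32-bit unit or
as two 16-bit units yields different results:

    var buf = new ArrayBuffer(4);
    new Uint8Array(buf).set([0x11, 0x22, 0x33, 0x44]);
    var dv = new DataView(buf);
    // As one 32-bit unit, the two byte orders are a full reversal:
    dv.getUint32(0, false);  // 0x11223344 (big-endian read)
    dv.getUint32(0, true);   // 0x44332211 (little-endian read)
    // As two 16-bit units, only each byte pair is reversed:
    dv.getUint16(0, true);   // 0x2211
    dv.getUint16(2, true);   // 0x4433
    // So a blind byte swap of an opaque buffer cannot be correct for
    // both interpretations at once.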

Received on Wednesday, 28 March 2012 03:14:16 UTC