- From: Boris Zbarsky <bzbarsky@MIT.EDU>
- Date: Wed, 28 Mar 2012 02:42:11 -0700
On 3/28/12 2:22 AM, Jonas Sicking wrote:
> Except if the data was written in 32bit units you do a different byte
> swapping than if the data was written as 16bit units.

Hmm. I just read the webgl spec more carefully and discovered that
bufferData() actually takes a byte array whose structure is opaque, which
would mean that in that case it does become very difficult to figure out
what the swapping pattern should be. :(

> The typed-array spec was specifically designed for use cases like
> sending buffers containing data in patterns like "32bit data, 16bit
> data, 16bit data, 32bit data, 16bit data, 16bit data...".

With all due respect, if it were really designed for those use cases it
would allow declaring a "type" made of "32-bit data, 16-bit data, 16-bit
data" and then creating an array of such types....

I understand we might get such APIs eventually, and then WebGL may end up
usable again on big-endian platforms if developers use those APIs to fill
the array buffer.

But as things stand, I think you're right: creating a browser
implementation on big-endian hardware that has working webgl and works
correctly with existing code that gets data using XHR arraybuffer is
impossible as far as I can see.

-Boris
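A minimal sketch of the interleaved layout under discussion, assuming a hypothetical record of one 32-bit float followed by two 16-bit unsigned integers (the field names and values are illustrative, not from the thread). It shows why overlapping typed-array views write in the host's byte order while DataView writes with an explicit byte order, and why the buffer is opaque bytes by the time bufferData() receives it:

```typescript
// Hypothetical per-record layout: one 32-bit float, then two 16-bit uints.
const RECORD_BYTES = 4 + 2 + 2;
const records = 4;
const buffer = new ArrayBuffer(records * RECORD_BYTES);

// Approach 1: overlapping typed-array views. These read and write in the
// *host's* byte order, so the raw bytes in `buffer` differ between
// little-endian and big-endian machines even though the JS is identical.
const f32 = new Float32Array(buffer);
const u16 = new Uint16Array(buffer);
for (let i = 0; i < records; i++) {
  f32[i * 2] = i * 0.5;     // byte offset i*8:     32-bit field
  u16[i * 4 + 2] = i;       // byte offset i*8 + 4: first 16-bit field
  u16[i * 4 + 3] = i + 1;   // byte offset i*8 + 6: second 16-bit field
}

// Approach 2: DataView with an explicit endianness flag (true = little-endian)
// produces the same bytes on every host.
const view = new DataView(buffer);
for (let i = 0; i < records; i++) {
  const base = i * RECORD_BYTES;
  view.setFloat32(base, i * 0.5, true);
  view.setUint16(base + 4, i, true);
  view.setUint16(base + 6, i + 1, true);
}

// Either way, once the buffer is passed to
// gl.bufferData(gl.ARRAY_BUFFER, buffer, gl.STATIC_DRAW), the implementation
// sees only opaque bytes: it cannot tell which spans were 32-bit and which
// were 16-bit, so it cannot byte-swap correctly for a big-endian host.
```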
Received on Wednesday, 28 March 2012 02:42:11 UTC