- From: Florian Bösch <pyalot@gmail.com>
- Date: Mon, 14 Jan 2013 13:20:58 +0100
- To: Webapps WG <public-webapps@w3.org>
- Message-ID: <CAOK8ODgBy-9H36qZ6pkQUPagWukK=S3wMV8=KVKHnZvkfoKMSg@mail.gmail.com>
Having a texture in WebGL and wanting to encode it into a typed array as a PNG, I have found that the only way to do it is the following convoluted method:

1) Create a canvas and set it to the desired size
2) Create a 2D context
3) Create an imageData object
4) Create a WebGL framebuffer object
5) Attach the texture as a color target to the framebuffer
6) Read the pixels back into the canvas 2D context's imageData.data member
7) ctx.putImageData into the canvas
8) Call the canvas's toDataURL('image/png')
9) Snip off the mime/encoding header
10) Implement base64 decoding in JS and decode to a Uint8Array

This process does not strike me as a particularly good way to do it, because it is both entirely synchronous and inefficient. There are two parts that could be improved a lot. The first is how to get the bytes out of WebGL, but that is not what I want to discuss here. The second is how to encode image bytes into an image format, and this is what I'd like to present a suggestion for.

Steps 1, 2, 3, 7, 8, 9 and 10 could be immensely simplified, made asynchronous and made efficient by the following straightforward function:

encodePNG(inputDataBufferView, width, height, targetBufferView, onDone, [onProgress])

The onDone callback would be called with the size written to the targetBufferView.
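For reference, here is the whole round trip above sketched as code. This is only a sketch: the helper name textureToPNGBytes, the temporary Uint8Array used for readPixels, and the use of atob instead of a hand-rolled base64 decoder are illustration choices, not part of any spec.

```js
// Sketch of the current path, assuming `gl` is a WebGLRenderingContext and
// `texture` is an RGBA, UNSIGNED_BYTE texture of size `width` x `height`.
function textureToPNGBytes(gl, texture, width, height) {
  // 1-3) Canvas, 2D context and ImageData of the desired size
  var canvas = document.createElement('canvas');
  canvas.width = width;
  canvas.height = height;
  var ctx = canvas.getContext('2d');
  var imageData = ctx.createImageData(width, height);

  // 4-5) Framebuffer with the texture attached as color target
  var fb = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                          gl.TEXTURE_2D, texture, 0);

  // 6) Read the pixels back, then copy them into the ImageData
  var pixels = new Uint8Array(width * height * 4);
  gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, pixels);
  gl.bindFramebuffer(gl.FRAMEBUFFER, null);
  imageData.data.set(pixels);

  // 7-9) Put the pixels into the canvas, encode to a data URL, strip the header
  ctx.putImageData(imageData, 0, 0);
  var dataURL = canvas.toDataURL('image/png');
  var base64 = dataURL.slice('data:image/png;base64,'.length);

  // 10) Base64-decode into a Uint8Array
  var binary = atob(base64);
  var bytes = new Uint8Array(binary.length);
  for (var i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i);
  }
  return bytes;
}
```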
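And a sketch of how the proposed function might be used. The output buffer size and the onProgress argument are assumptions here; only the signature and the onDone byte count are part of the proposal.

```js
// `pixels`, `width` and `height` as in the sketch above.
// Sizing the target buffer is a guess, since the encoded PNG size is not
// known ahead of time.
var target = new Uint8Array(width * height * 4 + 1024);

encodePNG(pixels, width, height, target, function onDone(bytesWritten) {
  // target.subarray(0, bytesWritten) now holds the encoded PNG
  var png = target.subarray(0, bytesWritten);
}, function onProgress(fraction) {
  // optional progress reporting
});
```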
Received on Monday, 14 January 2013 12:21:26 UTC