IndexedDB, Blobs and partial Blobs - Large Files

This is related to [1] (Use case) and [2] (Reduced test case)

This is about retrieving a large file as partial data (chunks) and storing 
it incrementally in IndexedDB.

Rather than maintaining a growing Blob in memory, the real use case is, 
for each chunk, retrieving the Blob from IndexedDB, doing 
new Blob([blob, chunk]) and storing it again in IndexedDB, which degrades 
performance even further (see the sketch below).

Of course you could wait until you have the complete Blob and store it 
then, but that is not very realistic: if you are streaming a 500 MB file 
and an error occurs, you would most likely prefer to resume the download 
rather than restart from the beginning.
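For illustration, resuming could look something like the following sketch, 
which uses the size of the Blob already stored as the offset for an HTTP 
Range request; the names and the appendChunk helper are the hypothetical 
ones from the previous sketch:

// Ask the server only for the bytes we do not already have.
function resumeDownload(db, key, url) {
  var tx = db.transaction("files", "readonly");
  var getReq = tx.objectStore("files").get(key);
  getReq.onsuccess = function () {
    var stored = getReq.result || new Blob([]);
    var xhr = new XMLHttpRequest();
    xhr.open("GET", url);
    xhr.setRequestHeader("Range", "bytes=" + stored.size + "-");
    xhr.responseType = "blob";
    xhr.onload = function () {
      // The received data still has to go through the costly
      // Blob concatenation pattern to be persisted.
      appendChunk(db, key, xhr.response);
    };
    xhr.send();
  };
}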

This does not seem efficient at all. Has the possibility of appending data 
directly to a value stored in IndexedDB ever been discussed?

Regards

Aymeric

[1] https://bugzilla.mozilla.org/show_bug.cgi?id=944918
[2] https://bugzilla.mozilla.org/show_bug.cgi?id=945281

-- 
Peersm : http://www.peersm.com
node-Tor : https://www.github.com/Ayms/node-Tor
GitHub : https://www.github.com/Ayms

Received on Monday, 2 December 2013 17:27:22 UTC