Re: Issue with updating/updateend

To insist on this one: I spent quite a lot of time on it and it is 
still not working perfectly. Maybe other implementations don't hit the 
problem because they don't use such small chunks and/or their chunks 
are never delayed, so the event chaining never stops.

//on each chunk received do:
append_buffer.push(chunk);

//handle1
if (!source.updating && append_buffer.length === 1) {
    source.appendBuffer(append_buffer.shift());
}
if (first_chunk) {
    source.addEventListener('updateend', function() {
        //handle2
        if (append_buffer.length) {
            source.appendBuffer(append_buffer.shift());
        }
    });
}

This should work but it does not with Chrome: append_buffer reaches a 
size of 0 while the last chunk is being appended, a new chunk arrives, 
updateend fires --> handle1 and handle2 can both execute and wrongly 
append the same chunk.

This is not supposed to be possible, but it is what happens, maybe 
something related to concurrent access.
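
For illustration only, one way to funnel every append through a single 
code path looks like this (a sketch reusing the source/append_buffer/ 
first_chunk variables from above; the 'appending' flag and the drain 
function are additions for the sake of the example):

var appending = false; //set before appendBuffer, cleared on 'updateend'

//single entry point: the only place that calls appendBuffer
function drain() {
    if (!appending && !source.updating && append_buffer.length) {
        appending = true;
        source.appendBuffer(append_buffer.shift());
    }
}

//on each chunk received do:
append_buffer.push(chunk);
if (first_chunk) {
    source.addEventListener('updateend', function() {
        appending = false;
        drain();
    });
}
drain();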

A workaround is to keep the event chaining alive by appending chunks of 
size 0 from a timeout. It works most of the time, but sometimes 
appending a chunk of size 0 fails too, for unknown reasons; on Chrome, 
chrome://media-internals only says 'decode error'.
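
Roughly something like this (a sketch only; the 100 ms delay and the 
empty Uint8Array are illustrative values, and as said above the 
zero-size append itself sometimes fails):

//keep the event chaining alive while waiting for delayed chunks
function keep_alive() {
    setTimeout(function() {
        if (!source.updating && !append_buffer.length) {
            //append an empty chunk so 'updateend' keeps firing
            source.appendBuffer(new Uint8Array(0));
        }
        keep_alive();
    }, 100);
}
keep_alive();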

Spec issue or Chrome issue, I don't know. I still don't get the 
rationale for 'updating' and why appendBuffer does not queue the chunks 
by itself.

Regards

Aymeric

On 02/04/2014 22:46, Aymeric Vitte wrote:
> The usual code is something like:
>
> if (!source.updating) {
>     source.appendBuffer(append_buffer.shift());
> }
> if (first_chunk) {
>     source.addEventListener('updateend', function() {
>         if (append_buffer.length) {
>             source.appendBuffer(append_buffer.shift());
>         }
>     });
> }
>
> The use case is: chunks of 498 B and a bandwidth rate of 1 Mbps, and 
> this does not work at all, at least with Chrome; it might be a Chrome 
> issue and/or a spec issue.
>
> Because between two 'updateend' events the 'updating' property can 
> become false, so you can append a chunk at the wrong place; if you 
> remove the first part of the code (or replace it by if 
> (first_chunk) {source.append...}) then the buffer chaining can stop 
> if for some reason the chunks are delayed.
>
> With streams the problem will disappear; without streams there is a 
> workaround, but as I mentioned in a previous post I don't find this 
> behavior normal.
>
> Regards
>
> Aymeric
>

-- 
Peersm : http://www.peersm.com
node-Tor : https://www.github.com/Ayms/node-Tor
GitHub : https://www.github.com/Ayms

Received on Thursday, 17 April 2014 17:46:56 UTC