
Re: Issue with updating/updateend

From: Aymeric Vitte <vitteaymeric@gmail.com>
Date: Thu, 17 Apr 2014 23:10:22 +0200
Message-ID: <5350433E.2070806@gmail.com>
To: Aaron Colwell <acolwell@google.com>
CC: "public-html-media@w3.org" <public-html-media@w3.org>
What I mean here is that this API simply does not work, and cannot, unless 
I am proven incorrect. Please answer the question: I still don't get the 
rationale for 'updating' and why appendBuffer does not queue the chunks by 
itself. This is the first time I have seen a boolean used alongside events; 
it looks very approximate, and what is the use of such a boolean in a 
streams or promises context? And no, I am not going to file a bug: just 
take the YouTube player, delay the chunks so the event loop breaks, and you 
will see the issue. You can continue ignoring, eluding, or disregarding it, 
but that will not solve it.
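
To make the request concrete, here is a minimal sketch of the queueing that, in my view, appendBuffer should do internally: appends are serialized on 'updateend' alone, with a single in-flight flag instead of polling 'updating'. The wrapper and its names are hypothetical, not from the MSE spec; `sourceBuffer` is only assumed to expose appendBuffer() and fire 'updateend':

```javascript
// Hypothetical sketch: serialize appendBuffer calls on 'updateend' alone.
// `sourceBuffer` is assumed to provide appendBuffer(chunk) and to fire an
// 'updateend' event when each append completes (as in MSE).
function makeAppender(sourceBuffer) {
  var queue = [];   // chunks waiting to be appended
  var busy = false; // true while an append is in flight

  sourceBuffer.addEventListener('updateend', function () {
    if (queue.length) {
      sourceBuffer.appendBuffer(queue.shift()); // keep the chain going
    } else {
      busy = false; // chain stopped; the next push() restarts it
    }
  });

  // call this once per received chunk
  return function push(chunk) {
    if (busy) {
      queue.push(chunk); // an append is in flight; just enqueue
    } else {
      busy = true;       // restart the chain ourselves
      sourceBuffer.appendBuffer(chunk);
    }
  };
}
```

With such a wrapper there is a single code path for appending, so the handle1/handle2 race described below cannot occur by construction.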


On 17/04/2014 20:16, Aaron Colwell wrote:
> This is not a Chrome support channel. Please file a bug at 
> http://crbug.com with a complete minimal repro attached and I can take 
> a look.
>
> Aaron
>
>
> On Thu, Apr 17, 2014 at 10:46 AM, Aymeric Vitte 
> <vitteaymeric@gmail.com <mailto:vitteaymeric@gmail.com>> wrote:
>
>     Insisting on this one: I spent quite a lot of time on it, and it
>     is still not working perfectly. Maybe other implementations don't
>     have the problem because they do not use such a small chunk size
>     and/or their chunks are never delayed, so the event chaining
>     never stops.
>
>     // on each chunk received, do:
>     append_buffer.push(chunk);
>
>     // handle1
>     if (!source.updating && append_buffer.length === 1) {
>         source.appendBuffer(append_buffer.shift());
>     }
>     if (first_chunk) {
>         source.addEventListener('updateend', function() {
>             // handle2
>             if (append_buffer.length) {
>                 source.appendBuffer(append_buffer.shift());
>             }
>         });
>     }
>
>     This should work, but it does not with Chrome: append_buffer
>     reaches a size of 0, the last chunk is being appended, a new
>     chunk arrives, 'updateend' fires --> handle1 and handle2 can
>     execute at the same time and wrongly append the same chunk.
>
>     This is not supposed to be possible, but it is what happens;
>     it may be related to concurrent access.
>
>     A workaround is to keep the event chaining alive by appending
>     chunks of size 0 on a timeout. It works most of the time, but
>     sometimes appending a zero-size chunk fails too, for unknown
>     reasons; on Chrome, chrome://media-internals only says 'decode
>     error'.
>
>     Spec issue or Chrome issue, I don't know; I still don't get the
>     rationale for 'updating', or why appendBuffer does not queue the
>     chunks by itself.
>
>     Regards
>
>     Aymeric
>
>         On 02/04/2014 22:46, Aymeric Vitte wrote:
>
>         The usual code is something like:
>
>         if (!source.updating) {
>             source.appendBuffer(append_buffer.shift());
>         }
>         if (first_chunk) {
>             source.addEventListener('updateend', function() {
>                 if (append_buffer.length) {
>                     source.appendBuffer(append_buffer.shift());
>                 }
>             });
>         }
>
>         The use case is chunks of 498 B at a bandwidth of 1 Mbps,
>         and this does not work at all, at least with Chrome; it
>         might be a Chrome issue and/or a spec issue.
>
>         Between two 'updateend' events, the 'updating' property can
>         become false, so you can append a chunk at the wrong place.
>         If you remove the first part of the code (or replace it by
>         if (first_chunk) {source.append...}), then the buffer
>         chaining can stop if for some reason the chunks are delayed.
>
>         With streams the problem will disappear; without streams
>         there is a workaround, but as I mentioned in a previous
>         post, I don't find this behavior normal.
>
>         Regards
>
>         Aymeric
>
>
>     -- 
>     Peersm : http://www.peersm.com
>     node-Tor : https://www.github.com/Ayms/node-Tor
>     GitHub : https://www.github.com/Ayms
>
>
>
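
PS: the keep-alive workaround described in the quoted message could be sketched as below. This is a hypothetical illustration only; the `keepChainAlive` name, the timer value, and the use of an empty Uint8Array as the zero-size chunk are my assumptions, not anything from the spec:

```javascript
// Hypothetical sketch of the workaround: if no new append has started
// within delayMs of the last 'updateend', append a zero-size chunk so
// the 'updateend' chaining never stops. All names are illustrative.
function keepChainAlive(sourceBuffer, delayMs) {
  var timer = null;
  sourceBuffer.addEventListener('updateend', function () {
    clearTimeout(timer);
    timer = setTimeout(function () {
      // only append the empty keep-alive chunk if nothing is in flight
      if (!sourceBuffer.updating) {
        sourceBuffer.appendBuffer(new Uint8Array(0));
      }
    }, delayMs);
  });
}
```

As noted above, even this sometimes fails on Chrome with a 'decode error', so it is a stopgap, not a fix.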

-- 
Peersm : http://www.peersm.com
node-Tor : https://www.github.com/Ayms/node-Tor
GitHub : https://www.github.com/Ayms
Received on Thursday, 17 April 2014 21:10:59 UTC
