- From: Aaron Colwell <acolwell@google.com>
- Date: Thu, 17 Apr 2014 11:16:36 -0700
- To: Aymeric Vitte <vitteaymeric@gmail.com>
- Cc: "public-html-media@w3.org" <public-html-media@w3.org>
- Message-ID: <CAA0c1bCNu0JtgGVsHC4mw88fAxOMsApiT2C4Rruip9E7-9A6Vw@mail.gmail.com>
This is not a Chrome support channel. Please file a bug at
http://crbug.com with a complete minimal repro attached and I can take
a look.
Aaron
On Thu, Apr 17, 2014 at 10:46 AM, Aymeric Vitte <vitteaymeric@gmail.com> wrote:
> Insisting on this one: I have spent quite a lot of time on it and it is
> still not working reliably. Maybe other implementations don't hit the
> problem because they don't use such a small chunk size, and/or their
> chunks are never delayed, so the chain of events never stops.
>
> // on each chunk received do:
> append_buffer.push(chunk);
>
> // handle1
> if (!source.updating && append_buffer.length === 1) {
>   source.appendBuffer(append_buffer.shift());
> }
>
> if (first_chunk) {
>   source.addEventListener('updateend', function () {
>     // handle2
>     if (append_buffer.length) {
>       source.appendBuffer(append_buffer.shift());
>     }
>   });
> }
>
> This should work, but it does not with Chrome: append_buffer reaches a
> size of 0, the last chunk is being appended, a new chunk arrives,
> updateend fires --> handle1 and handle2 can execute at the same time
> and wrongly append the same chunk.
>
> This is not supposed to be possible, but it is what is happening; maybe
> it is related to concurrent access.
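One way to sketch the fix is to give appendBuffer() a single entry point instead of the two competing handlers above. The FakeSourceBuffer below is an assumed stand-in (a real SourceBuffer only exists in a browser) that mimics the `updating` flag and the asynchronous `updateend` event, so the pattern can be exercised on its own:

```javascript
// Assumption: FakeSourceBuffer stands in for a real SourceBuffer so the
// queueing pattern can run outside a browser. The point of the sketch is
// that only pump() ever calls appendBuffer(), so handle1/handle2 cannot
// race each other.
class FakeSourceBuffer {
  constructor() {
    this.updating = false;
    this.appended = [];       // record of appended chunks, for inspection
    this.onupdateend = null;
  }
  appendBuffer(chunk) {
    if (this.updating) throw new Error('InvalidStateError: still updating');
    this.updating = true;
    this.appended.push(chunk);
    setImmediate(() => {      // complete the append asynchronously
      this.updating = false;
      if (this.onupdateend) this.onupdateend();
    });
  }
}

function makeAppender(sourceBuffer) {
  const queue = [];
  function pump() {           // the single entry point to appendBuffer()
    if (!sourceBuffer.updating && queue.length) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  }
  sourceBuffer.onupdateend = pump;  // drain the queue after each append
  return function onChunk(chunk) {  // call this for every received chunk
    queue.push(chunk);
    pump();
  };
}
```

With this shape, a chunk arriving at the same instant as updateend still goes through pump(), which checks `updating` once and shifts each queued chunk exactly once.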
>
> A workaround is to keep the event chain alive by appending zero-size
> chunks on a timeout. It works most of the time, but sometimes appending
> a zero-size chunk fails too, for unknown reasons; on Chrome,
> chrome://media-internals only says 'decode error'.
>
> Specs issue or Chrome issue, I don't know; I still don't get the
> rationale for 'updating', or why appendBuffer does not queue the chunks
> by itself.
>
> Regards
>
> Aymeric
>
> Le 02/04/2014 22:46, Aymeric Vitte a écrit :
>
>> The usual code is something like:
>>
>> if (!source.updating) {
>>   source.appendBuffer(append_buffer.shift());
>> }
>>
>> if (first_chunk) {
>>   source.addEventListener('updateend', function () {
>>     if (append_buffer.length) {
>>       source.appendBuffer(append_buffer.shift());
>>     }
>>   });
>> }
>>
>> The use case is: chunks of 498 B at a bandwidth of 1 Mbps, and this
>> does not work at all, at least with Chrome; it might be a Chrome issue
>> and/or a spec issue.
>>
>> Because between two 'updateend' events the 'updating' property can
>> become false, you can append a chunk at the wrong place; and if you
>> remove the first part of the code (or replace it by if (first_chunk)
>> { source.append... }), then the buffer chaining can stop if for some
>> reason the chunks are delayed.
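The check-then-act on 'updating' can also be avoided entirely by chaining each append on a promise, which is one way to get the "appendBuffer queues by itself" behavior the post asks for. This is a sketch under the same assumption as before: StubSourceBuffer is a stand-in for a real SourceBuffer so it runs outside a browser:

```javascript
// Sketch: an appender that queues by itself. The code never reads
// 'updating' at all, so it cannot observe the flag flipping between two
// 'updateend' events. StubSourceBuffer is an assumed stand-in for a real
// SourceBuffer.
class StubSourceBuffer {
  constructor() {
    this.updating = false;
    this.appended = [];       // record of appended chunks, for inspection
    this.onupdateend = null;
  }
  appendBuffer(chunk) {
    if (this.updating) throw new Error('InvalidStateError');
    this.updating = true;
    this.appended.push(chunk);
    setImmediate(() => {      // complete the append asynchronously
      this.updating = false;
      if (this.onupdateend) this.onupdateend();
    });
  }
}

function queuedAppender(sb) {
  let tail = Promise.resolve();   // chain of pending appends
  return function append(chunk) {
    tail = tail.then(() => new Promise(resolve => {
      sb.onupdateend = resolve;   // this append is done at 'updateend'
      sb.appendBuffer(chunk);
    }));
    return tail;                  // resolves once this chunk is appended
  };
}
```

Each call to append() waits for the previous one to finish, so delayed chunks simply extend the chain instead of stopping it.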
>>
>> With streams the problem will disappear; without streams there is a
>> workaround, but as I mentioned in a previous post I don't find this
>> behavior normal.
>>
>> Regards
>>
>> Aymeric
>>
>>
> --
> Peersm : http://www.peersm.com
> node-Tor : https://www.github.com/Ayms/node-Tor
> GitHub : https://www.github.com/Ayms
>
Received on Thursday, 17 April 2014 18:17:05 UTC