Re: Issue with updating/updateend

On 17/04/2014 23:31, Aaron Colwell wrote:
> On Thu, Apr 17, 2014 at 2:10 PM, Aymeric Vitte
> <vitteaymeric@gmail.com> wrote:
>
>     What I mean here is that this API just does not work and cannot,
>     unless I am proven wrong. Please answer: "I still don't get the
>     rationale for 'updating' and why appendBuffer does not queue the
>     chunks by itself". It is the first time I see a boolean used in
>     place of events, and it looks very approximate. What is the use
>     of this boolean in a streams or promises context?
>
>
> This would force the UA to have to arbitrarily buffer an unlimited 
> amount of data.

Could you explain this, please? The updating boolean cannot stop the 
event loop, so data keep coming, get buffered, appended, and apparently 
discarded at some point in time.

> The updating boolean is a form of backpressure.

Same question: how can this boolean be used for backpressure?

> SourceBuffer.appendBuffer() should always throw an exception if 
> updating is set to true.

Why can it not queue the chunk instead of throwing?
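
To make the question concrete, here is a rough sketch of the behaviour 
I would expect from appendBuffer itself, i.e. queueing internally 
instead of throwing (a hypothetical monkey-patch for illustration only; 
the _queue property is mine, not part of MSE):

(function() {
    var nativeAppend = SourceBuffer.prototype.appendBuffer;

    SourceBuffer.prototype.appendBuffer = function(chunk) {
        var sb = this;
        if (!sb._queue) {
            sb._queue = [];
            // drain one pending chunk per updateend
            sb.addEventListener('updateend', function() {
                if (sb._queue.length) {
                    nativeAppend.call(sb, sb._queue.shift());
                }
            });
        }
        if (sb.updating || sb._queue.length) {
            // busy: queue the chunk instead of throwing
            sb._queue.push(chunk);
        } else {
            nativeAppend.call(sb, chunk);
        }
    };
}());

This ignores error handling and quota; it is only meant to show what 
"queueing instead of throwing" would mean for the caller.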

> Stream and Promises were not even available when the MSE 
> standardization process was started. The Stream API spec got rebooted 
> recently

Yes, and I am part of the "reboot"; backpressure is still somewhat of 
an open issue, and I have given my thoughts on the subject [1]. It 
cannot be solved with a boolean, of course.
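
For what it's worth, here is roughly how backpressure can be expressed 
with a promise instead of a boolean; appendAsync is a name I am making 
up, it is not part of MSE or of the Streams reboot:

// Hypothetical helper: resolves when the SourceBuffer has consumed the
// chunk, so the producer knows when it may send the next one.
function appendAsync(sourceBuffer, chunk) {
    return new Promise(function(resolve, reject) {
        function onEnd() {
            cleanup();
            resolve();
        }
        function onError(e) {
            cleanup();
            reject(e);
        }
        function cleanup() {
            sourceBuffer.removeEventListener('updateend', onEnd);
            sourceBuffer.removeEventListener('error', onError);
        }
        sourceBuffer.addEventListener('updateend', onEnd);
        sourceBuffer.addEventListener('error', onError);
        sourceBuffer.appendBuffer(chunk);
    });
}

// The producer chains appends and is naturally throttled:
// chain = chain.then(function() {
//     return appendAsync(sourceBuffer, chunk);
// });

The point is that the signal to continue travels back to the producer, 
which a boolean the producer has to poll does not give you.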

> so I don't think one can make a claim that converting to streams would 
> "just work". Unprefixed MSE implementations have been shipping for 
> quite some time and I don't think it makes sense at this point to 
> convert everything to Promises now. That would just create unnecessary 
> churn and pain for large existing deployments like YouTube, Netflix, 
> and others.

Maybe; streams would certainly help. Then again, maybe I am missing a 
fundamental use of updating in this API, cf. the questions above.

>
>     And no, I am not going to file a bug; just take the YouTube
>     player, delay the chunks so the event chaining breaks, and you
>     will see the issue. You can continue ignoring, eluding, or
>     dismissing it; that will not solve it.
>
>
> I don't particularly care for this tone.

Me neither for yours.

> It doesn't make me want to help

I don't need help, I just need this API to work correctly.

> you especially if you are unwilling to provide a simple repro case 
> that helps isolate the problem you claim is occurring. Saying "just 
> take the youtube player" is not sufficient given that it is a massive 
> piece of JavaScript

It's a simple piece of JS.

> and it isn't clear to me how to "delay the chunks" in the same way you 
> are doing it

I am not doing it, the network is; delaying chunks means you don't 
receive data in time and the event chaining breaks.

> . If the problem is as fundamental as you claim, it should be trivial 
> for you to create some simple JavaScript to reproduce the problem. 
> Please try to be part of the solution and not simply complain.

I am not complaining; I am taking the time to report a potential 
problem and trying to solve it. In light of your answers I still don't 
know whether we are facing a spec issue or a Chrome issue, and I still 
don't get the use of the updating boolean.
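
The kind of defensive workaround I have in mind looks roughly like this 
(a sketch only; the busy flag and maybeAppend are my own names, not 
from the spec), so that only one code path ever calls appendBuffer:

var queue = [];
var busy = false;

function maybeAppend() {
    // single append path: nothing else calls appendBuffer directly
    if (busy || source.updating || !queue.length) {
        return;
    }
    busy = true;
    source.appendBuffer(queue.shift());
}

source.addEventListener('updateend', function() {
    busy = false;
    maybeAppend();
});

// on each chunk received:
// queue.push(chunk);
// maybeAppend();

Even if updating reads false while an append is still in flight, the 
busy flag prevents the double append described below.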

To avoid any misunderstanding: I would not be replying here this late 
in my time zone if I did not think there could be an issue.

Regards

Aymeric

[1] https://github.com/whatwg/streams/issues/13

>
> Aaron
>
>
>
>     On 17/04/2014 20:16, Aaron Colwell wrote:
>>     This is not a Chrome support channel. Please file a bug at
>>     http://crbug.com with a complete minimal repro attached and I can
>>     take a look.
>>
>>     Aaron
>>
>>
>>     On Thu, Apr 17, 2014 at 10:46 AM, Aymeric Vitte
>>     <vitteaymeric@gmail.com> wrote:
>>
>>         Insisting on this one: I spent quite a lot of time on this
>>         and it's still not working perfectly. Maybe other
>>         implementations don't have the problem because they are not
>>         using such a small chunk size, and/or their chunks are never
>>         delayed so the event chaining never stops.
>>
>>         // on each chunk received do:
>>         append_buffer.push(chunk);
>>
>>         // handle1
>>         if (!source.updating && append_buffer.length === 1) {
>>             source.appendBuffer(append_buffer.shift());
>>         }
>>         if (first_chunk) {
>>             source.addEventListener('updateend', function() {
>>                 // handle2
>>                 if (append_buffer.length) {
>>                     source.appendBuffer(append_buffer.shift());
>>                 }
>>             });
>>         }
>>
>>         This should work, but it does not with Chrome: append_buffer
>>         reaches a size of 0 while the last chunk is being appended, a
>>         new chunk comes in, updateend fires --> handle1 and handle2
>>         can both execute and wrongly append the same chunk.
>>
>>         It's not supposed to be possible, but this is what is
>>         happening; maybe it is related to concurrent access.
>>
>>         A workaround is to maintain the event chaining by appending
>>         chunks of size 0 using a timeout. It works most of the time,
>>         but sometimes appending a chunk of size 0 fails too, for
>>         unknown reasons; on Chrome, chrome://media-internals only
>>         says 'decode error'.
>>
>>         Spec issue or Chrome issue, I don't know; I still don't get
>>         the rationale for 'updating' and why appendBuffer does not
>>         queue the chunks by itself.
>>
>>         Regards
>>
>>         Aymeric
>>
>>         On 02/04/2014 22:46, Aymeric Vitte wrote:
>>
>>             The usual code is something like:
>>
>>             if (!source.updating) {
>>                 source.appendBuffer(append_buffer.shift());
>>             }
>>             if (first_chunk) {
>>                 source.addEventListener('updateend', function() {
>>                     if (append_buffer.length) {
>>                         source.appendBuffer(append_buffer.shift());
>>                     }
>>                 });
>>             }
>>
>>             The use case is chunks of 498 B at a bandwidth of
>>             1 Mbps, and this does not work at all, at least with
>>             Chrome; it might be a Chrome issue and/or a spec issue.
>>
>>             Because between two 'updateend' events the 'updating'
>>             property can become false, you can append a chunk at the
>>             wrong place; and if you remove the first part of the code
>>             (or replace it with if (first_chunk) {source.append...})
>>             then the buffer chaining can stop if for some reason the
>>             chunks are delayed.
>>
>>             With streams the problem would disappear; without
>>             streams there is a workaround, but as I mentioned in a
>>             previous post I don't find this behavior normal.
>>
>>             Regards
>>
>>             Aymeric
>>
>>
>>         -- 
>>         Peersm : http://www.peersm.com
>>         node-Tor : https://www.github.com/Ayms/node-Tor
>>         GitHub : https://www.github.com/Ayms
>>
>>
>>
>
>     -- 
>     Peersm : http://www.peersm.com
>     node-Tor : https://www.github.com/Ayms/node-Tor
>     GitHub : https://www.github.com/Ayms
>
>

-- 
Peersm : http://www.peersm.com
node-Tor : https://www.github.com/Ayms/node-Tor
GitHub : https://www.github.com/Ayms

Received on Thursday, 17 April 2014 22:59:53 UTC