Re: MediaStream: multiple consumer case

On 3 April 2013 11:33, Anne van Kesteren <annevk@annevk.nl> wrote:
> On Wed, Apr 3, 2013 at 7:21 PM, Martin Thomson <martin.thomson@gmail.com> wrote:
>> The uses you describe for XMLHttpRequest are different to media
>> streams.  We want media to be "lost" if it isn't played out.  If we
>> buffered, it would cease to be real-time.
>
> That makes sense, but there still is some duplication or reading from
> the same shared buffer going on. I'd be interested in hearing the
> details of that or a pointer to where it's explained.

That's implicit in the fact that, in most of our use cases, we're
dealing with real-time data.  I don't know of a place where that is
written down.  There might be something in the audio processing APIs.

As we often say in this group, a MediaStream is a control surface for
a construct that isn't directly observable, except when it is rendered
(via <video>/<audio>, frame capture, or recording).  The MediaStreams
we have defined are inherently real-time, which means that if you
aren't observing them, the data just goes away.
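To make "observing" concrete, here is a rough TypeScript sketch of one
stream feeding two consumers at once.  It assumes the promise-based
navigator.mediaDevices.getUserMedia and MediaRecorder APIs; the
selector and the data handler are placeholders, not anything from the
spec:

  async function observeOneStreamTwice(): Promise<void> {
    // Acquire a live, real-time MediaStream (camera + microphone).
    const stream = await navigator.mediaDevices.getUserMedia({
      video: true,
      audio: true,
    });

    // Consumer 1: render the stream in a <video> element.
    const video = document.querySelector('video')!;
    video.srcObject = stream;
    await video.play();

    // Consumer 2: record the same stream in parallel.
    const recorder = new MediaRecorder(stream);
    recorder.ondataavailable = (e) => { /* hand e.data chunks off */ };
    recorder.start();

    // With no consumer attached at all, the stream still "flows":
    // frames are dropped, not buffered, because the source is
    // real time.
  }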

This is different from a static source, which we can choose not to
draw from, knowing that its state won't change when we come back.
Playing out from a file is one example of a static source.  The
MediaStream that might be generated from a paused file source would
still flow, but it would be empty (silent or black) during the pause.
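For contrast, a minimal sketch of that file-backed case, assuming an
element-capture API along the lines of HTMLMediaElement.captureStream()
(support varies, and the file name is just a placeholder):

  async function captureFromFile(): Promise<void> {
    const fileVideo = document.createElement('video');
    fileVideo.src = 'movie.webm';  // the static source: a file
    await fileVideo.play();

    // captureStream() is not in the baseline TypeScript DOM typings,
    // hence the cast.
    const derived: MediaStream = (fileVideo as any).captureStream();

    // Pausing the element does not pause the derived MediaStream:
    // it keeps flowing in real time, but carries only black frames
    // and silence until playback resumes.
    fileVideo.pause();
  }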

Received on Wednesday, 3 April 2013 19:58:06 UTC