Re: MediaStream: multiple consumer case

From: Martin Thomson <martin.thomson@gmail.com>
Date: Wed, 3 Apr 2013 12:57:35 -0700
Message-ID: <CABkgnnVAH_=ohCYALz80nCaq40doJBoZvKy_XxwLwcsB_i-J3g@mail.gmail.com>
To: Anne van Kesteren <annevk@annevk.nl>
Cc: "public-media-capture@w3.org" <public-media-capture@w3.org>

On 3 April 2013 11:33, Anne van Kesteren <annevk@annevk.nl> wrote:
> On Wed, Apr 3, 2013 at 7:21 PM, Martin Thomson <martin.thomson@gmail.com> wrote:
>> The uses you describe for XMLHttpRequest are different to media
>> streams.  We want media to be "lost" if it isn't played out.  If we
>> buffered, it would cease to be real-time.
> That makes sense, but there still is some duplication or reading from
> the same shared buffer going. I'd be interested in hearing the details
> of that or a pointer to where it's explained.

That's implicit in the fact that, in most of our use cases, we're
dealing with real-time data.  I don't know of a place where that is
written down.  There might be something in the audio processing APIs.

As we say often in this group, a MediaStream is a control surface for
a construct that isn't directly observable, except when it is rendered
(using <video>/<audio>, through frame capture or recording).  The
MediaStreams we have defined are inherently real time, which means
that if you aren't observing them, the data just goes away.
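To make that concrete, here's a toy model (hypothetical names, not the actual MediaStream API) of the difference between a real-time source with and without an observer: frames produced while no sink is attached are simply discarded.

```javascript
// Toy model of a real-time source: each frame is delivered to the
// attached sink if there is one, and dropped otherwise -- there is
// no buffering, so unobserved data just goes away.
class RealTimeSource {
  constructor() {
    this.sink = null;   // the current consumer, if any
    this.dropped = 0;   // frames lost while unobserved
  }
  push(frame) {
    if (this.sink) {
      this.sink(frame);
    } else {
      this.dropped++;   // nobody is watching: frame is lost
    }
  }
}

const rt = new RealTimeSource();
rt.push('f1');                 // no sink attached yet: 'f1' is lost

const seen = [];
rt.sink = (f) => seen.push(f); // start observing
rt.push('f2');                 // 'f2' is delivered to the sink
```

After this runs, `seen` holds only `'f2'` and `dropped` is 1: the frame produced before anyone was observing is gone for good, which is the real-time behavior described above.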

This is different from a static source, which we can choose not to
draw from, knowing that its state won't change when we come back.
Playing out from a file is one example of a static source.  The
MediaStream that might be generated from a paused file source would
still flow, but it would be empty (silent or black) during the pause.
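A toy sketch of that last point (hypothetical names, not the real API): the stream fed by a pausable file source keeps flowing while paused, delivering empty frames rather than stopping, and the file's position only advances while unpaused.

```javascript
// Toy model of a pausable file source feeding a stream: while
// paused it still produces frames, but they are empty (silence
// or black), and the read position does not advance.
class PausableFileSource {
  constructor(frames) {
    this.frames = frames;
    this.pos = 0;
    this.paused = false;
  }
  next() {
    if (this.paused) return { empty: true };           // silent/black frame
    if (this.pos >= this.frames.length) return null;   // end of file
    return { empty: false, data: this.frames[this.pos++] };
  }
}

const src = new PausableFileSource(['a', 'b', 'c']);
const out = [];
out.push(src.next());  // real frame 'a'
src.paused = true;
out.push(src.next());  // empty frame: the stream still flows
src.paused = false;
out.push(src.next());  // resumes with 'b'; nothing was skipped
```

The key property is that the pause produces output (empty frames) rather than stalling the stream, and unpausing resumes exactly where the file left off.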
Received on Wednesday, 3 April 2013 19:58:06 UTC
