Re: Coordination between MediaStream and MediaSource

Thanks for the note!

We have been discussing the relationship between MediaSource and 
MediaStream off and on since they were born - most commonly in the 
context of the scenarios that would become possible if a MediaSource 
were able to generate a MediaStream, or if a MediaStream could provide 
data in the form expected by a MediaSource.

The latter seems to be a natural fit for the Recording interface 
(https://dvcs.w3.org/hg/dap/raw-file/tip/media-stream-capture/MediaRecorder.html), 
since that API is already expected to produce encoded data a chunk at a 
time. The former seems at the moment to gravitate towards making it 
possible to generate a MediaStream from a <video> tag that duplicates 
the <video> tag's input; that would cover both the MediaSource case and 
a number of other scenarios - streaming from remote URLs and sourcing 
video from a file, for example.
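For the recording direction, a rough sketch of the wiring might look like 
the following. This assumes the MediaRecorder interface from the linked 
draft together with Media Source Extensions; the container/codec strings, 
the 250 ms timeslice, and the function names here are illustrative 
assumptions, not settled API:

```javascript
// Pure helper: return the first container/codec string the recorder
// claims to support, or null if none match.
function pickSupportedType(candidates, isSupported) {
  for (const type of candidates) {
    if (isSupported(type)) return type;
  }
  return null;
}

// Sketch: record an incoming MediaStream and feed each encoded chunk
// into a MediaSource driving a <video> element.
function pipeStreamToMediaSource(stream, videoEl, candidates) {
  const type = pickSupportedType(
      candidates, t => MediaRecorder.isTypeSupported(t));
  if (!type) throw new Error('no supported recording type');

  const mediaSource = new MediaSource();
  videoEl.src = URL.createObjectURL(mediaSource);

  mediaSource.addEventListener('sourceopen', () => {
    const sourceBuffer = mediaSource.addSourceBuffer(type);
    const recorder = new MediaRecorder(stream, { mimeType: type });
    recorder.ondataavailable = async (event) => {
      // Each chunk arrives as a Blob of encoded data; hand its bytes
      // to the SourceBuffer when it is ready for another append.
      const bytes = await event.data.arrayBuffer();
      if (!sourceBuffer.updating) sourceBuffer.appendBuffer(bytes);
    };
    recorder.start(250); // emit a chunk roughly every 250 ms
  });
}
```

A real implementation would also need to queue chunks that arrive while 
the SourceBuffer is still updating, but the chunk-at-a-time shape is the 
point here.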

See 
https://www.w3.org/wiki/images/archive/2/2d/20140519124714%21Martin-slides.pdf 
slide 7 for the proposal.
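In the other direction, the proposal amounts to something like the sketch 
below. The method name captureStream() is an assumption on my part, chosen 
to match the direction of the slides; nothing is settled yet:

```javascript
// Sketch only: derive a MediaStream that mirrors whatever a media
// element is currently playing - whether its source is a MediaSource,
// a remote URL, or a local file. captureStream() is a hypothetical
// name here, per the proposal, not shipped API.
function captureFromElement(mediaEl) {
  if (typeof mediaEl.captureStream !== 'function') {
    throw new Error('captureStream() not available in this user agent');
  }
  return mediaEl.captureStream();
}
```

The resulting MediaStream could then be handed to anything that consumes 
streams today - another <video> element, or a PeerConnection for sending 
to a remote host - which is what makes this one primitive cover so many 
of the scenarios above.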

On 06/06/2014 10:38 PM, Ian Hickson wrote:
> Hi,
>
> The MediaStream interface is used to represent streams of media data,
> typically (but not necessarily) of audio and/or video content.
>
> The MediaSource object represents a source of media data for an
> HTMLMediaElement.
>
> http://dev.w3.org/2011/webrtc/editor/getusermedia.html#stream-api
> https://dvcs.w3.org/hg/html-media/raw-file/tip/media-source/media-source.html#mediasource
>
> While these sources have quite different properties -- one is populated by
> the user agent from a user-defined source, e.g. a local camera, a stream
> from a remote computer, or a static video file; the other is populated by
> the script itself, providing data directly in the form of binary data --
> they nonetheless share a lot in common:
>
>   - they are both sources for use by <video> elements
>
>   - they both provide APIs for the management of tracks, merging them into
>     a single object for presentation
>
> From the HTML spec's perspective, my intent is to treat MediaStream and
> MediaSource objects identically, allowing both where either is allowed.
>
> However, I would like to encourage the working groups to coordinate their
> efforts so that these APIs are intuitive to authors, even in situations
> where the author uses both. For example, it seems like it would make sense
> to allow an audio source from a local microphone to be merged with video
> data from an ArrayBuffer, for output in a single <video> element. Or for
> WebRTC to take data generated from mixing ArrayBuffers and send it to a
> remote host as a stream.
>
> Please let me know if there's anything I can do to help with this.
>
> Cheers,

Received on Monday, 9 June 2014 07:47:35 UTC