Coordination between MediaStream and MediaSource

Hi,

The MediaStream interface is used to represent streams of media data, 
typically (but not necessarily) of audio and/or video content.

The MediaSource object represents a source of media data for an 
HTMLMediaElement.

http://dev.w3.org/2011/webrtc/editor/getusermedia.html#stream-api
https://dvcs.w3.org/hg/html-media/raw-file/tip/media-source/media-source.html#mediasource

While these sources have quite different properties -- one is populated by 
the user agent from a user-selected source, e.g. a local camera, a stream 
from a remote computer, or a static video file; the other is populated by 
the script itself, which supplies the media data directly as binary 
buffers -- they nonetheless have a lot in common:

 - they are both sources for use by <video> elements

 - they both provide APIs for the management of tracks, merging them into 
   a single object for presentation
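
To make the divergence in the second point concrete, here is a rough 
side-by-side sketch (names taken from the two drafts linked above; the 
helper functions are my own, purely for illustration):

```javascript
// MediaStream: tracks are first-class MediaStreamTrack objects that the
// script can enumerate, add, and remove directly on the stream.
function countStreamTracks(stream) {
  return stream.getAudioTracks().length + stream.getVideoTracks().length;
}

// MediaSource: tracks arrive implicitly from the appended byte streams;
// the script manages SourceBuffer objects rather than tracks themselves.
function countSourceBuffers(mediaSource) {
  return mediaSource.sourceBuffers.length;
}
```

An author juggling both ends up with two quite different mental models 
for what is conceptually the same operation.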

From the HTML spec's perspective, my intent is to treat MediaStream and 
MediaSource objects identically, allowing both where either is allowed. 

However, I would like to encourage the working groups to coordinate their 
efforts so that these APIs are intuitive to authors, even in situations 
where the author uses both. For example, it seems like it would make sense 
to allow an audio source from a local microphone to be merged with video 
data from an ArrayBuffer, for output in a single <video> element. Or for 
WebRTC to take data generated from mixing ArrayBuffers and send it to a 
remote host as a stream.
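
As a sketch of the first scenario -- and I stress this is speculative, 
since it assumes a capture API on media elements (captureStream() below) 
that is in neither draft -- the shape of the author code might be:

```javascript
// Copy tracks from a track list into an existing MediaStream.
function addTracksInto(stream, tracks) {
  for (const track of tracks) stream.addTrack(track);
  return stream;
}

// Merge local microphone audio with script-supplied video delivered as
// ArrayBuffers through a MediaSource, yielding one MediaStream that can
// feed a single <video> element or be sent over a peer connection.
async function mixMicWithArrayBuffers(videoBuffers, mimeType) {
  // Script-supplied side: push the buffers through a MediaSource attached
  // to a hidden <video> element.
  const mediaSource = new MediaSource();
  const hidden = document.createElement('video');
  hidden.muted = true;
  hidden.src = URL.createObjectURL(mediaSource);
  await new Promise(resolve =>
    mediaSource.addEventListener('sourceopen', resolve, { once: true }));
  const buffer = mediaSource.addSourceBuffer(mimeType);
  for (const chunk of videoBuffers) {
    buffer.appendBuffer(chunk);
    await new Promise(resolve =>
      buffer.addEventListener('updateend', resolve, { once: true }));
  }
  await hidden.play();

  // User-agent side: local microphone audio via getUserMedia.
  const mic = await navigator.mediaDevices.getUserMedia({ audio: true });

  // Merge: capture the decoded video as a MediaStream (the assumed API),
  // then add the microphone's audio tracks to it.
  return addTracksInto(hidden.captureStream(), mic.getAudioTracks());
}
```

If the two APIs are coordinated, something of roughly this shape should 
fall out naturally rather than requiring heroics from authors.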

Please let me know if there's anything I can do to help with this.

Cheers,
-- 
Ian Hickson               U+1047E                )\._.,--....,'``.    fL
http://ln.hixie.ch/       U+263A                /,   _.. \   _\  ;`._ ,.
Things that are impossible just take longer.   `._.-(,_..'--(,_..'`-.;.'

Received on Friday, 6 June 2014 20:39:19 UTC