Re: Media Source draft proposal

On Thu, Apr 19, 2012 at 4:45 PM, Robert O'Callahan <robert@ocallahan.org> wrote:

> On Fri, Apr 20, 2012 at 6:58 AM, Maciej Stachowiak <mjs@apple.com> wrote:
>
>> It seems to me that this spec has some conceptual overlap with WebRTC,
>> and WebAudio, which both involve some direct manipulation and streaming of
>> media data.
>>
>> WebRTC: http://dev.w3.org/2011/webrtc/editor/webrtc.html
>> Web Audio API:
>> https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html
>>
>
>
> I actually think those are fairly well separated from this proposal. This
> proposal is all about manipulating the data that goes into a decoder;
> WebRTC and the Audio WG are all about manipulating decoded data. The latter
> two need to be carefully coordinated, but this doesn't.
>

Here is how I tend to think about these three APIs:
MediaStream  : Floor control & stream routing mechanisms.
Media Source : Dynamic presentation creation.
WebAudio : Presentation audio post-processing framework.

If you squint the right way, it does look like you could make the Media
Source API a new type of MediaStream object, but most of the floor control
functionality, like muting and enabling, would be useless or redundant. I
also believe it makes more sense to tie Media Source to HTMLMediaElement,
because the most likely use cases, like adaptive streaming and ad insertion,
require standard playback functionality, whereas MediaStream essentially
assumes a linear stream and doesn't have mechanisms for supporting seeking,
buffering state transitions, or progress events.

I have always believed that Media Source could be used with MediaStream and
WebAudio through a mechanism like the stream attribute in Rob's spec
(https://dvcs.w3.org/hg/audio/raw-file/tip/streams/StreamProcessing.html#media-elements)
and/or the createMediaElementSource() method in the WebAudio spec
(https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html#dfn-createMediaElementSource).
This would provide the benefit of playback control and would not burden
these two APIs with knowing whether the presentation is being generated by
JavaScript or fetched from a simple resource URL.
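To make the division of labor concrete, here is a rough sketch of how the two could compose, written against the shapes these APIs eventually standardized on (which differ from the 2012 draft interfaces under discussion). The codec string and the fetchNextSegment() helper are placeholders, and this is browser-only code:

```javascript
// Sketch only: assumes the standardized MediaSource / Web Audio APIs,
// not the 2012 draft. fetchNextSegment() is a hypothetical helper that
// supplies encoded bytes (e.g. for adaptive streaming or ad insertion).
const audio = document.createElement('audio');
const mediaSource = new MediaSource();

// Media Source controls what goes *into* the decoder...
audio.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', async () => {
  // Placeholder MIME/codec string; real code would pick a supported type.
  const sb = mediaSource.addSourceBuffer('audio/webm; codecs="opus"');
  sb.appendBuffer(await fetchNextSegment());
});

// ...while createMediaElementSource() taps the *decoded* output of the
// media element and routes it through Web Audio for post-processing.
const ctx = new AudioContext();
const node = ctx.createMediaElementSource(audio);
node.connect(ctx.destination);
```

In this arrangement the media element keeps its usual playback responsibilities (seeking, buffering state, progress events), and neither the Web Audio graph nor a downstream MediaStream needs to know the presentation was assembled by JavaScript.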

I agree with Rob that WebRTC and the Audio WG should coordinate their
efforts. I'd prefer it if there were only one way to get the stream data,
decoded or otherwise, out of the HTMLMediaElement. I definitely plan to
keep an eye on the progress of those two specs, but I don't think Media
Source really overlaps with their goals, so close coordination may not be
necessary.

Aaron

Received on Monday, 23 April 2012 18:02:09 UTC