On Mon, Apr 23, 2012 at 1:56 PM, Maciej Stachowiak <mjs@apple.com> wrote:
>
> On Apr 23, 2012, at 11:00 AM, Aaron Colwell wrote:
> >
> > I agree with Rob that WebRTC & the Audio-WG should coordinate their
> > efforts. I'd prefer it if there were only one way to get the stream data,
> > decoded or otherwise, out of the HTMLMediaElement.
>
> I think the same principle applies to getting data *into* the
> HTMLMediaElement. And ideally, the ways of getting data in and out should
> align well with each other. That's the reason I think MediaSource should be
> part of this conversation.
>
Fair enough. I'm open to exploring this. I think we should consider
refactoring MediaStream a bit so that the WebRTC-specific methods for
muting and stream enabling live in some form of subclass.
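To make the idea concrete, here is a rough sketch of the split I have in mind. All names and method signatures here are purely illustrative assumptions, not from any spec draft: a generic base stream that MediaSource and the Audio-WG APIs could consume, with the WebRTC-only controls pushed down into a subclass.

```typescript
// Hypothetical sketch only -- class and method names are illustrative,
// not taken from any published spec.

// Generic stream: just identifies a flow of media data. This is the
// part that MediaSource / Web Audio consumers would depend on.
class GenericMediaStream {
  constructor(public readonly label: string) {}
}

// WebRTC-specific subclass: adds the muting & enabling controls that
// only make sense in a peer-connection context.
class RTCMediaStream extends GenericMediaStream {
  private muted = false;
  private enabled = true;

  mute(): void { this.muted = true; }
  unmute(): void { this.muted = false; }
  get isMuted(): boolean { return this.muted; }

  setEnabled(on: boolean): void { this.enabled = on; }
  get isEnabled(): boolean { return this.enabled; }
}
```

The point of the split is that non-WebRTC consumers would only ever see the base type, so the WebRTC-specific surface can evolve without touching them.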
How do we proceed? I'm a W3C newbie, so I'm not sure what the process is.
Aaron