- From: Stefan Håkansson LK <stefan.lk.hakansson@ericsson.com>
- Date: Tue, 26 Mar 2013 10:38:11 +0100
- To: public-media-capture@w3.org
On 3/25/13 10:55 PM, Martin Thomson wrote:
> I think that it's fair to say that the current state of the
> MediaStream[Track] states is pretty dire. At least from a usability
> perspective.
>
> Let's take a quick inventory:
>
> MediaStreamTrack:
>     attribute boolean enabled;
>     readonly attribute MediaStreamTrackState readyState;
>     attribute EventHandler onstarted;
>     attribute EventHandler onmute;
>     attribute EventHandler onunmute;
>     attribute EventHandler onended;
>
> MediaStream:
>     attribute boolean ended;
>     attribute EventHandler onended;
>
> On the track we have two state variables, one of them writeable, and
> four events. At a bare minimum, this could be a single onstatechange
> event.
>
> For the stream, I really don't know why I would want to set ended =
> true on a stream. The explanation doesn't cast any light on the
> matter either. Let's pretend that this is read-only for the moment.
>
>
> Rendering
>
> I believe that the gUM document needs to be very clear about the
> rendering logic for MediaStream instances. This is a little too
> amorphous in the current document. I propose a section or sub-section
> entitled "Rendering MediaStreams" that has an explanation along the
> lines of the following.
>
> When a particular MediaStream instance is attached to a sink that
> consumes only one source (not a mixing sink, like <audio> probably
> will be), the output will be selected from the set of tracks in the
> stream that:
>
>   - are "live"; that is, in a readyState of "muted" or "unmuted" (not
>     "new" or "ended")
>   - have enabled = true
>
> ...of course, media will only render (other than silence/black) if the
> selected track's readyState is "unmuted". I'm assuming here that
> track selection does not examine the muted/unmuted state.
>
> ...but only if the stream itself is not ended. Note that the stream
> can be un-ended by adding an un-ended track.
>
> A mixing sink (such as <audio>) renders multiple tracks
> simultaneously, combining all candidate tracks.

(Commenting on the entire "Rendering" section)

I agree that we need to describe in more detail how MediaStreams interact with media (audio and video) elements. Jim took a stab at it last year, but it is time for an update. Audio and video also differ in that a media element can render only one video track, but mixes all audio tracks.

However, I want to point out that the "resource fetch algorithm" of the HTML5 Candidate Recommendation (http://www.w3.org/TR/html5/embedded-content-0.html#concept-media-load-resource) already describes this in quite some detail. I think we might get by with referring to that and clarifying how certain things apply to MediaStreams.
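For reference, here is a rough, non-normative sketch (in TypeScript) of the single-source selection rule as Martin describes it above. The type and function names are illustrative only, and the states follow the 2013 draft readyState values quoted in the inventory, not any final specification.

    // Draft states as quoted above; illustrative types, not the actual API.
    type DraftReadyState = "new" | "muted" | "unmuted" | "ended";

    interface DraftTrack {
      enabled: boolean;
      readyState: DraftReadyState;
    }

    interface DraftStream {
      ended: boolean;
      tracks: DraftTrack[];
    }

    // Returns the track a non-mixing sink would render, or null if nothing
    // should be rendered (stream ended, or no eligible track).
    function selectTrackForSink(stream: DraftStream): DraftTrack | null {
      if (stream.ended) {
        return null;
      }
      // Candidates: "live" tracks (readyState "muted" or "unmuted",
      // i.e. not "new" or "ended") that also have enabled === true.
      const candidates = stream.tracks.filter(
        (t) =>
          t.enabled &&
          (t.readyState === "muted" || t.readyState === "unmuted")
      );
      // Selection does not examine muted/unmuted; that only decides whether
      // the chosen track renders media or silence/black.
      return candidates[0] ?? null;
    }

    // Whether the selected track actually produces media rather than
    // silence/black: only when its readyState is "unmuted".
    function rendersMedia(track: DraftTrack | null): boolean {
      return track !== null && track.readyState === "unmuted";
    }

A mixing sink would instead render all candidate tracks returned by the filter step, combined, rather than picking a single one.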
Received on Tuesday, 26 March 2013 09:38:37 UTC