MediaStreams and Media elements

It's time to revise section 8 of the spec, which deals with how to pass a MediaStream to an HTML5 <audio> or <video> element (see http://dev.w3.org/2011/webrtc/editor/getusermedia.html#mediastreams-as-media-elements). One question is how to handle the readyState and networkState attributes of the media element. The HTML5 spec has a media element load algorithm which first resolves the URI of the src and then attempts to fetch the source. The current gUM spec says that when the algorithm reaches the fetch phase, if the resource is a MediaStream, the algorithm should terminate and set readyState to HAVE_ENOUGH_DATA. I think this is correct in the case of a MediaStream that is streaming data, but:
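To make the question concrete, here is a minimal sketch of the attachment path under discussion. It is a sketch only: it assumes an unprefixed navigator.getUserMedia() and the URL.createObjectURL() bridge for attaching a stream to a media element; shipping browsers may require vendor prefixes.

    navigator.getUserMedia({ audio: true, video: true }, function (stream) {
      var video = document.querySelector('video');
      // Assigning the stream's object URL as src kicks off the HTML5
      // media element load algorithm.
      video.src = URL.createObjectURL(stream);
      video.addEventListener('loadedmetadata', function () {
        // Per the current gUM text, fetch short-circuits for a MediaStream:
        console.log(video.readyState === video.HAVE_ENOUGH_DATA); // expected: true
        // Question 1 below: should the spec also require this?
        console.log(video.networkState === video.NETWORK_IDLE);
      });
    }, function (error) {
      console.error('getUserMedia failed:', error);
    });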


1. The spec should also say that networkState gets set to NETWORK_IDLE.

2. Does it matter if the Tracks in the MediaStream are muted or disabled? My guess is that it doesn't - the output will just be silence or black frames - but we should clarify this. (By the way, the current spec says that the output of a muted Track is silence or black frames, but doesn't say what the output is for a disabled Track. Shouldn't it be the same? See the first sketch after this list.)

3. What happens if the MediaStream that is fetched has ended = true? Should we silently continue to use the dead stream and let HTML5 figure out what to do, or should we raise an error? In the latter case, the HTML5 spec defines a MediaError, MEDIA_ERR_ABORTED, which we might be able to use. It is defined as "The fetching process for the media resource <http://www.w3.org/TR/2012/CR-html5-20121217/embedded-content-0.html#media-resource> was aborted by the user agent at the user's request." Isn't that sort of what happens when a local MediaStream is ended? (See the second sketch after this list.)

4. Do we want to say anything about remote MediaStreams? In the case of a local MediaStream, NETWORK_IDLE makes sense for the networkState, because there is no network traffic. But for a remote stream the NETWORK_LOADING state might be relevant. On the other hand, the Media Capture spec seems to deal implicitly with local streams (created by gUM). If we want to explicitly allow remote streams, we have to explain how they are created, etc. I suppose we could say that streams can be remote, but that the method of creating such a stream is outside the scope of this spec. But then we'd at least have to say how the UA determines whether a MediaStream is local or remote.

5. What do we say if a MediaStream with no Tracks is passed to a media element (i.e., in the fetch phase of the algorithm)? Do we treat this as if the media element had fetched unplayable data? There is a MEDIA_ERR_SRC_NOT_SUPPORTED that we could use in this case. Or is it another MEDIA_ERR_ABORTED? Note that the fetch algorithm checks for the presence of audio and video tracks at a certain point, and any Tracks added after that won't be detected (until load() is called again). The second sketch after this list exercises this case as well.
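As promised above, here is a hedged sketch for question 2: it disables every track before attaching the stream, so one can observe whether playback is black frames and silence (the behavior I would expect if disabled Tracks render like muted ones). Again this assumes unprefixed navigator.getUserMedia().

    navigator.getUserMedia({ audio: true, video: true }, function (stream) {
      // Disable (as opposed to UA-muting) every track before attachment.
      stream.getAudioTracks().concat(stream.getVideoTracks()).forEach(function (track) {
        track.enabled = false;
      });
      var video = document.createElement('video');
      video.autoplay = true;
      video.src = URL.createObjectURL(stream);
      document.body.appendChild(video);
      // Expected, if disabled behaves like muted: black frames and silence,
      // with readyState still HAVE_ENOUGH_DATA.
    }, function (error) {
      console.error('getUserMedia failed:', error);
    });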
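And a second sketch, for the two failure cases in questions 3 and 5. It assumes the UA would report them through the element's standard error event and a MediaError code (1 = MEDIA_ERR_ABORTED, 4 = MEDIA_ERR_SRC_NOT_SUPPORTED) - whether it should is exactly the open question. It also assumes the MediaStream(tracks) constructor and MediaStreamTrack.stop() from recent drafts (older drafts had stream.stop() instead).

    function attach(stream, label) {
      var video = document.createElement('video');
      video.addEventListener('error', function () {
        console.log(label, '-> MediaError code:', video.error && video.error.code);
      });
      video.src = URL.createObjectURL(stream); // starts the load algorithm
      return video;
    }

    // Question 5: a MediaStream with no Tracks at all.
    attach(new MediaStream([]), 'empty stream');

    // Question 3: a stream whose Tracks have all ended (ended === true).
    navigator.getUserMedia({ video: true }, function (stream) {
      stream.getVideoTracks().forEach(function (track) {
        track.stop();
      });
      attach(stream, 'ended stream');
    }, function (error) {
      console.error('getUserMedia failed:', error);
    });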

I also have questions about how direct assignment should work, but I will send them in a separate email.


- Jim
