Re: MediaStreams and Media elements

On 2013-05-28 21:35, Jim Barnett wrote:
> It’s time to revise section 8 of the spec, which deals with how to pass
> a MediaStream to an HTML5 <audio> or <video> element (see
> http://dev.w3.org/2011/webrtc/editor/getusermedia.html#mediastreams-as-media-elements
> )  One question is how to deal with readyState and networkState
> attributes of the media element.  The HTML5 spec has a media element
> load algorithm which first resolves the URI of the src, and then
> attempts to fetch the source.   The current gUM spec says that when the
> algorithm reaches the fetch phase, if the resource is a MediaStream, the
> algorithm should terminate and set readyState to HAVE_ENOUGH_DATA.  I
> think that this is correct in the case of a MediaStream that is
> streaming data, but:
>
> 1.The spec should also say that networkState gets set to NETWORK_IDLE.

Sounds reasonable.
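To make the proposed outcome concrete, here is a small plain-JavaScript 
sketch of the two attribute values in question. The constant values are 
the ones HTMLMediaElement actually defines; the helper function is only 
an illustration of the proposed fetch-phase behaviour, not spec text.

```javascript
// Constants as defined on HTMLMediaElement in the HTML5 spec.
const HAVE_ENOUGH_DATA = 4; // HTMLMediaElement.HAVE_ENOUGH_DATA
const NETWORK_IDLE = 1;     // HTMLMediaElement.NETWORK_IDLE

// Hypothetical helper: what the fetch phase would settle on when the
// resource turns out to be a (streaming, local) MediaStream.
function fetchPhaseResult(resourceIsMediaStream) {
  if (resourceIsMediaStream) {
    // Terminate the fetch algorithm and set both attributes at once.
    return { readyState: HAVE_ENOUGH_DATA, networkState: NETWORK_IDLE };
  }
  return null; // otherwise the normal fetch algorithm continues
}
```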

>
> 2.Does it matter if the Tracks in the MediaStream are muted or
> disabled?  My guess is that it doesn’t – the output will just be silence
> or black frames, but we should clarify this.  (By the way, the current
> spec says that the output of a muted Track is silence or black frames,
> but doesn’t say what the output is for a disabled Track.  Shouldn’t it
> be the same?)

I agree on both points. And my understanding is that (and now we're off 
into webrtc land, sorry) a track that is "disabled" at the sending side 
of a PeerConnection would appear "muted" at the receiving end: muted 
corresponds to a source that is not delivering data, and in this case 
the PeerConnection is the source.
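As a sketch of the point about output (an illustration of what I think 
the spec should say, not settled text): a muted track and a disabled 
track would render the same way. MediaStreamTrack does have real 
`muted`, `enabled` and `kind` attributes, but this function is 
hypothetical.

```javascript
// Hypothetical model of a track's rendered output: muted and disabled
// tracks both produce silence (audio) or black frames (video).
function trackOutput(track) {
  if (track.muted || !track.enabled) {
    return track.kind === 'audio' ? 'silence' : 'black frames';
  }
  return 'live media';
}
```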

>
> 3.What happens if the MediaStream that is fetched has ended = true?
>   Should we silently continue to use the dead stream and let HTML5
> figure out what to do, or should we raise an error?  In the latter case,
> the HTML5 spec defines a MediaError, MEDIA_ERR_ABORTED, which we might
> be able to use.  It is defined as “The fetching process for the media
> resource
> <http://www.w3.org/TR/2012/CR-html5-20121217/embedded-content-0.html#media-resource>
> was aborted by the user agent at the user's request.”  Isn’t that sort
> of what happens when a local MediaStream is ended?

I agree with Rob, "ended" should fire. That of course brings us to the 
question: what should happen if we add a track that has not ended to 
that MediaStream? I think this is discussed in a separate thread.
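A minimal sketch of that preference (an assumption about where the 
discussion lands, not agreed spec text): an already-ended stream would 
make the element fire "ended" rather than report MEDIA_ERR_ABORTED.

```javascript
// Hypothetical mapping from the fetched stream's state to the event the
// media element would fire; 'ended' and 'loadeddata' are real media
// element events, but this decision rule is only the proposal above.
function eventForFetchedStream(stream) {
  return stream.ended ? 'ended' : 'loadeddata';
}
```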

>
> 4.Do we want to say anything about remote MediaStreams?  In the case of
> a local MediaStream, NETWORK_IDLE makes sense for the networkState,
> because there is no network traffic.  But for a remote stream the
> NETWORK_LOADING state might be relevant.  On the other hand, the  Media
> Capture spec seems implicitly to deal with local streams (created by
> gUM).  If we want to explicitly allow remote streams, we have to explain
> how they are created, etc.   I suppose we could  say that streams can be
> remote, but the method of creating such a stream is outside the scope of
> this spec.  But then we’d at least have to say how the UA determines if
> a MediaStream is local or remote.

Again, I agree with Rob. If the connection goes down (so that a remote 
MediaStream stops), the application will learn of it from PeerConnection 
events and can deal with it there.

>
> 5.What do we say if a MediaStream with no Tracks is passed to a media
> element (i.e., in the fetch phase of the algorithm)?  Do we treat this
> as if the media element had fetched unplayable data? There is a
> MEDIA_ERR_SRC_NOT_SUPPORTED that we could use in this case.  Or is
> it another MEDIA_ERR_ABORTED?  The fetch algorithm checks for the
> presence of audio and video tracks at a certain point, and any Tracks
> added after that won’t be detected  (until load() is called again.)

The way I read the HTML document, the resource is checked for audio and 
video tracks. If there are none, nothing really happens: there is 
nothing to play, but no errors either.

But the way I read the resource fetch algorithm, new tracks would be 
detected whenever they are added:

a. It is said that the media element has an AudioTrackList and a 
VideoTrackList, each with length zero or more
b. Those TrackList objects are said to represent a _dynamic_ list of tracks
c. It is also said that "The networking task source tasks to process the 
data as it is being fetched must, _when appropriate_, include the 
relevant substeps ...", and two of the steps include the creation of new 
Audio/VideoTracks (and firing of addtrack events).


So I think you could attach an empty MediaStream to a media element; 
then, as tracks are added to the MediaStream, they would be detected by 
the media element.
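That dynamic-list reading can be sketched with a tiny stand-in model 
(MediaStreamModel is hypothetical; the real interfaces are MediaStream 
and the element's Audio/VideoTrackList with its addtrack event):

```javascript
// Minimal model of the dynamic-track-list reading: an empty stream can
// be attached first, and tracks added later still get surfaced to the
// media element, analogous to 'addtrack' events.
class MediaStreamModel {
  constructor() { this.tracks = []; this.listeners = []; }
  onAddTrack(fn) { this.listeners.push(fn); }
  addTrack(track) {
    this.tracks.push(track);
    this.listeners.forEach(fn => fn(track)); // fires like addtrack
  }
}

const stream = new MediaStreamModel();     // attach while still empty
const seen = [];
stream.onAddTrack(t => seen.push(t.kind)); // the media element's listener
stream.addTrack({ kind: 'video' });        // added after attachment...
// ...and still detected: seen is now ['video']
```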
>
> I also have questions about how direct assignment should work, but I
> will send them in a separate email.
>
> -Jim
>

Received on Wednesday, 29 May 2013 13:08:48 UTC