
RE: MediaStreams and Media elements

From: Jim Barnett <Jim.Barnett@genesyslab.com>
Date: Wed, 29 May 2013 14:01:21 +0000
To: Stefan Håkansson LK <stefan.lk.hakansson@ericsson.com>, "public-media-capture@w3.org" <public-media-capture@w3.org>
Message-ID: <57A15FAF9E58F841B2B1651FFE16D2810422C4@GENSJZMBX02.msg.int.genesyslab.com>
Stefan,
  My comments in-line, set off by '>>'

-----Original Message-----
From: Stefan Håkansson LK [mailto:stefan.lk.hakansson@ericsson.com] 
Sent: Wednesday, May 29, 2013 9:08 AM
To: public-media-capture@w3.org
Subject: Re: MediaStreams and Media elements

On 2013-05-28 21:35, Jim Barnett wrote:
> It's time to revise section 8 of the spec, which deals with how to 
> pass a MediaStream to an HTML5 <audio> or <video> element (see 
> http://dev.w3.org/2011/webrtc/editor/getusermedia.html#mediastreams-as
> -media-elements
> )  One question is how to deal with readyState and networkState 
> attributes of the media element.  The HTML5 spec has a media element 
> load algorithm which first resolves the URI of the src, and then
> attempts to fetch the source.   The current gUM spec says that when the
> algorithm reaches the fetch phase, if the resource is a MediaStream, 
> the algorithm should terminate and set readyState to HAVE_ENOUGH_DATA.  
> I think that this is correct in the case of a MediaStream that is 
> streaming data, but:
>
> 1.The spec should also say that networkState gets set to NETWORK_IDLE.

Sounds reasonable.
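>> For concreteness, the proposed outcome for a live local MediaStream could be modeled like this (a plain-JavaScript sketch; the constant values are the ones HTML5 defines on media elements, but the helper function itself is illustrative, not spec text):

```javascript
// HTMLMediaElement constants (values as defined in HTML5):
const HAVE_ENOUGH_DATA = 4; // readyState once the fetch phase terminates
const NETWORK_IDLE = 1;     // proposed networkState for a local stream

// Illustrative helper (not spec text): the states the element would
// report once the fetch algorithm short-circuits on a live MediaStream.
function statesForLiveLocalStream() {
  return { readyState: HAVE_ENOUGH_DATA, networkState: NETWORK_IDLE };
}
```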

>
> 2.Does it matter if the Tracks in the MediaStream are muted or 
> disabled?  My guess is that it doesn't - the output will just be 
> silence or black frames, but we should clarify this.  (By the way, the 
> current spec says that the output of a muted Track is silence or black 
> frames, but doesn't say what the output is for a disabled Track.  
> Shouldn't it be the same?)

I agree to both points. And my understanding is that (and now we're off into webrtc land, sorry) a track that is "disabled" at the sending side of a PeerConnection would be "muted" at the receiving end (muted corresponds to a source that is not delivering data - and in this case the PeerConnection is the source).
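>> The muted/disabled symmetry we are agreeing on could be captured in a small model (illustrative JavaScript; the track shape here is an assumed stand-in, not the real MediaStreamTrack interface):

```javascript
// Model of the proposed clarification: a disabled track renders the
// same output as a muted one (track fields assumed for illustration).
function trackOutput(track) {
  // track: { kind: 'audio' | 'video', muted: boolean, enabled: boolean }
  if (track.muted || !track.enabled) {
    return track.kind === 'audio' ? 'silence' : 'black frames';
  }
  return 'live media';
}
```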

>
> 3.What happens if the MediaStream that is fetched has ended = true?
>   Should we silently continue to use the dead stream and let HTML5 
> figure out what to do, or should we raise an error?  In the latter 
> case, the HTML5 spec defines a MediaError code, MEDIA_ERR_ABORTED, which
> we might be able to use.  It is defined as "The fetching process for
> the media resource
> <http://www.w3.org/TR/2012/CR-html5-20121217/embedded-content-0.html#media-resource>
> was aborted by the user agent at the user's request."
> Isn't that sort of what happens when a local MediaStream is ended?

I agree with Rob: "ended" should fire. That of course brings us to the
question: what should happen if we add a track that has not ended to that
MediaStream? I think this is discussed in a separate thread.

>> Is there an 'ended' event for media elements?  I may have missed it.  There is an 'ended' attribute that we should set.  The media element's behavior should be just as if it had reached the end of a file that it was playing.   Maybe that's all we need to say.  
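>> (For what it's worth, HTML5 does define an 'ended' event on media elements, fired when playback stops because the end of the resource was reached.)  The end-of-file treatment could then be sketched as follows (illustrative model only; the element fields are assumed stand-ins, not the real HTMLMediaElement interface):

```javascript
// Sketch: treating an ended MediaStream as if the element had reached
// the end of a file (field names assumed for illustration).
function onFetchEndedStream(element) {
  element.ended = true;              // the 'ended' attribute mentioned above
  element.firedEvents.push('ended'); // and the HTML5 'ended' event
  return element;
}
```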

>
> 4.Do we want to say anything about remote MediaStreams?  In the case of
> a local MediaStream, NETWORK_IDLE makes sense for the networkState,
> because there is no network traffic.  But for a remote stream the
> NETWORK_LOADING state might be relevant.  On the other hand, the  Media
> Capture spec seems implicitly to deal with local streams (created by
> gUM).  If we want to explicitly allow remote streams, we have to explain
> how they are created, etc.   I suppose we could  say that streams can be
> remote, but the method of creating such a stream is outside the scope of
> this spec.  But then we'd at least have to say how the UA determines if
> a MediaStream is local or remote.

Again, I agree with Rob. If the connection goes down (so that a remote
MediaStream stops) the application will get to know from PeerConnection 
events, and deal with it there.
>> So nothing needs to be signaled at the media element level?  It will be as if the connection dropped while the element was playing streaming media.  
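>> If the local/remote distinction were surfaced at all, the networkState choice outlined in point 4 might look like this (purely illustrative; the 'isLocal' flag is an assumed stand-in, and how a UA would actually make that determination is exactly the open question):

```javascript
// HTMLMediaElement networkState constants (values as defined in HTML5):
const NETWORK_IDLE = 1;    // no network use once loaded (local stream)
const NETWORK_LOADING = 2; // actively fetching over the network (remote)

// Assumed 'isLocal' flag for illustration; the spec would need to
// define how the UA determines whether a MediaStream is local.
function networkStateForStream(stream) {
  return stream.isLocal ? NETWORK_IDLE : NETWORK_LOADING;
}
```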

>
> 5.What do we say if a MediaStream with no Tracks is passed to a media
> element (i.e., in the fetch phase of the algorithm)?  Do we treat this
> as if the media element had fetched unplayable data? There is a
> MEDIA_ERR_SRC_NOT_SUPPORTED error code that we could use in this case.  Or is
> it another MEDIA_ERR_ABORTED?  The fetch algorithm checks for the
> presence of audio and video tracks at a certain point, and any Tracks
> added after that won't be detected  (until load() is called again.)

The way I read the HTML document, the resource is checked for audio and
video tracks. If there are none, nothing really happens. Nothing to 
play, but no errors.

But the way I read the resource fetch algorithm, new tracks would be 
detected at any time they are added:

a. It is said that the media element has an AudioTrackList and a 
VideoTrackList, each with length zero or more
b. Those TrackList objects are said to represent a _dynamic_ list of tracks
c. It is also said that "The networking task source tasks to process the 
data as it is being fetched must, _when appropriate_, include the 
relevant substeps ...", and two of the steps include the creation of new 
Audio/VideoTracks (and firing of addtrack events).


So I think you could attach an empty MediaStream to a media element,
then, as tracks are added to the MediaStream, they would be detected by
the media element.

>> If we're lucky.  The problem I have is that the prose you quote above is _inside_ the resource fetch algorithm.  And that algorithm can terminate in a number of places for a wide variety of reasons.  It would be equally plausible for the UA to assume that the empty MediaStream was unplayable data and terminate.  The whole load algorithm is so sprawling and disjointed that it is hard to say what it would do with a MediaStream (which doesn't need to be fetched at all.)  It also makes it hard to modify it cleanly.  Perhaps the best thing to do is to say:
1.  If the MediaStream contains one or more Tracks, treat it as a fully downloaded file (that's what the current prose does.)
2.  Otherwise (i.e., if the MediaStream is empty) wait (for a platform-specific interval) for one or more Tracks to be added, then treat it as a fully downloaded file. 
Both these clauses will cause the load algorithm to terminate, and Tracks added after the algorithm terminates will not  be detected until the next time load() is called.  (This shouldn't be a problem for developers, as long as they understand that this is what will happen.)
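>> The two clauses above could be modeled like this (illustrative JavaScript; the stream object and the 'waitForTrack' helper are assumed stand-ins for the real algorithm, not spec text):

```javascript
// Model of the two proposed clauses (not spec text).
// 'waitForTrack' stands in for the platform-specific wait in clause 2.
function fetchPhaseResult(stream, waitForTrack) {
  if (stream.getTracks().length > 0) {
    return 'fully-downloaded-file'; // clause 1: non-empty stream
  }
  // clause 2: empty stream - wait for a Track to be added
  return waitForTrack() ? 'fully-downloaded-file' : 'no-tracks';
}
```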

 
>
> I also have questions about how direct assignment should work, but I
> will send them in a separate email.
>
> -Jim
>
Received on Wednesday, 29 May 2013 14:01:48 UTC
