- From: Cullen Jennings (fluffy) <fluffy@cisco.com>
- Date: Tue, 7 May 2013 16:56:36 +0000
- To: Stefan Håkansson LK <stefan.lk.hakansson@ericsson.com>
- CC: "public-media-capture@w3.org" <public-media-capture@w3.org>
This looks good from a GUM point of view, but when the track is used in the context of WebRTC, particularly a track that has a remote source, I think we probably need to say a bit more. I'd like to get at what causes the difference between "black" packets flowing across the web vs. no packets flowing across the web.

On Apr 18, 2013, at 2:38 AM, Stefan Håkansson LK <stefan.lk.hakansson@ericsson.com> wrote:

> I think, with the changes Martin proposed done, that this looks good.
>
> Stefan
>
> On 2013-04-17 09:19, Adam Bergkvist wrote:
>> On 2013-04-16 18:40, Martin Thomson wrote:
>>> I like this. The text on enabled/muted is clear enough that I believe
>>> it will avoid confusion altogether.
>>>
>>> On 16 April 2013 05:01, Adam Bergkvist <adam.bergkvist@ericsson.com> wrote:
>>>> *** MediaStreamTrack Lifecycle and Media Flow ***
>>>>
>>>> The MediaStreamTrack interface lets the script control a single flow of
>>>> media. The live state indicates that the track source is active and the
>>>> track renders media.
>>>
>>> I don't know what "live state" refers to here. Maybe this is more
>>> obvious with highlighting on the "live" keyword, but it isn't clear
>>> from context that you are talking about readyState.
>>>
>>> You probably do need to include some more thorough description about
>>> the actual lifecycle (new/live/ended). The title mentions it, but the
>>> above doesn't make it clear that you are talking about lifecycle. We
>>> probably need something like:
>>> --
>>>
>>>> The MediaStreamTrack interface lets the script control a single flow of
>>>> media.
>>>
>>> A MediaStreamTrack has three stages in its lifecycle. A track begins
>>> as "new" prior to being connected to an active source.
>>>
>>> Once connected, the 'started' event fires and the track becomes
>>> "live". In the "live" state, the track is active and media is
>>> available for rendering at a sink (e.g., an HTML <video> or <audio>
>>> element).
>>>
>>> <insert text on muting and rendering>
>>>
>>> A track becomes "ended" and the 'ended' event fires when the source of
>>> a track is disconnected or exhausted.
>>
>> Looks good to me.
>>
>> My first idea was to say less about the lifecycle (and change the title)
>> to not introduce redundant information. But on closer look it's
>> obvious that the MediaStreamTrack lifecycle is underspecified; most of
>> the information is in the non-normative Event Summary section. So let's
>> have it in this section.
>>
>> /Adam
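For reference, here is a minimal sketch of the lifecycle Martin's proposed text describes, from a script's point of view. It assumes the draft-era "new" readyState and 'started' event discussed in this thread (which may not match what any given browser ships); the mute/unmute and ended events follow the spec, but the getUserMedia usage and log messages are purely illustrative.

    // Sketch only: observes the new -> live -> ended lifecycle described above.
    function watchTrack(track: MediaStreamTrack): void {
      // Per the draft text quoted above this would be "new" before the
      // track is connected to an active source; shipped implementations
      // typically report "live" or "ended" here.
      console.log('initial readyState:', track.readyState);

      // Draft-era 'started' event: the track is connected to an active
      // source and transitions from "new" to "live".
      track.addEventListener('started', () => {
        console.log('track is live; media is available for rendering at a sink');
      });

      // Muting is source-driven and distinct from the script-controlled
      // track.enabled flag; a muted or disabled live track renders
      // silence/black rather than ending.
      track.addEventListener('mute', () => {
        console.log('source temporarily unable to provide media');
      });
      track.addEventListener('unmute', () => {
        console.log('source providing media again');
      });

      // 'ended' fires when the source is disconnected or exhausted.
      track.addEventListener('ended', () => {
        console.log('track ended, readyState:', track.readyState); // "ended"
      });
    }

    // Illustrative usage: watch the first video track from getUserMedia.
    navigator.mediaDevices.getUserMedia({ video: true })
      .then((stream) => watchTrack(stream.getVideoTracks()[0]));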
Received on Tuesday, 7 May 2013 16:57:05 UTC