
Re: CONCLUSION on "onaddstream" and beyond

From: Adam Bergkvist <adam.bergkvist@ericsson.com>
Date: Wed, 30 Nov 2011 17:27:38 +0100
Message-ID: <4ED6597A.6010807@ericsson.com>
To: Stefan Håkansson LK <stefan.lk.hakansson@ericsson.com>
CC: "public-webrtc@w3.org" <public-webrtc@w3.org>
On 11/30/2011 10:51 AM, Stefan Håkansson LK wrote:
> The conclusion of this discussion seems to be:
>
> 1. MediaStream objects should be created more or less immediately as a
> result of "getUserMedia" or "addStream" operations (in the latter case
> a MediaStream object on the remote side should be created as a result
> of the signaling).
>
> 2. There is a need to, on a MediaStreamTrack level, notify the
> application when live data is available (which can be a bit later than
> the creation of the MediaStream object).
>
> The task is now on the Editors to change the drafts along these lines.

A way of realizing the above is to expose the muted property on 
MediaStreamTrack. The concept of muting a track already exists in the 
spec in the PeerConnection case, where the A-side disables a track; it 
will then be muted (not disabled) on the B-side. It is, however, not 
possible to detect the muted state with the current spec. A muted track 
will produce blackness or silence when consumed.

* getUserMedia
When the NavigatorUserMediaSuccessCallback fires, information from the 
user's selections will be used to create the tracks in the resulting 
MediaStream's track list. No tracks are muted.

* addstream
When the addstream event listener fires, information from the signaling 
will be used to construct the track list. The tracks are all muted. Each 
track will be unmuted once data arrives on the underlying RTP stream. 
We can simply remove the readyState from MediaStream to avoid the 
problem with a LIVE stream that only has muted tracks and let the tracks 
deal with their own states.

The exposed MediaStreamTrack.muted property (along with its 
corresponding event handlers) can also be used to, e.g., show an image 
if the other side (of a PeerConnection) disables a track that is being 
transmitted to you. Another use case is to turn off image processing in 
an augmented reality application.

* Changes to MediaStream/MediaStreamTrack interfaces

  [Constructor (in MediaStreamTrackList trackList)]
  interface MediaStream {
      readonly attribute DOMString            label;
      readonly attribute MediaStreamTrackList tracks;
      MediaStreamRecorder record ();
-    const unsigned short LIVE = 1;
-    const unsigned short ENDED = 2;
-    readonly attribute unsigned short       readyState;
+    readonly attribute boolean              ended;
               attribute Function?            onended;
  };

  interface MediaStreamTrack {
      readonly attribute DOMString kind;
      readonly attribute DOMString label;
               attribute boolean   enabled;

+    readonly attribute boolean   muted;
+
+             attribute Function? onmute;
+             attribute Function? onunmute;
  };
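With readyState gone from MediaStream, code that compared against LIVE 
or ENDED would instead read the new boolean. A minimal sketch of 
consuming the proposed ended/onended pair; the stub below is 
hypothetical (a real MediaStream would come from getUserMedia or an 
addstream event) and only models the two proposed members:

```javascript
// Hypothetical stand-in for a MediaStream; models only the proposed
// "ended" attribute and "onended" callback from the IDL above.
function makeStreamStub() {
    return {
        ended: false,   // replaces readyState == MediaStream.ENDED
        onended: null,
        // test helper: simulate all underlying sources finishing
        end: function () {
            this.ended = true;
            if (this.onended) this.onended({ target: this });
        }
    };
}

var stream = makeStreamStub();

// Before: if (stream.readyState == stream.ENDED) ...
// After: a plain boolean check
console.log(stream.ended); // false while the stream is active

stream.onended = function (evt) {
    console.log("stream ended:", evt.target.ended);
};

stream.end(); // prints "stream ended: true"
```

The same boolean-plus-callback pattern would apply per track once the 
tracks deal with their own states.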

* Examples

navigator.webkitGetUserMedia({}, function (stream) {
     selfView.src = webkitURL.createObjectURL(stream);

     // all tracks are unmuted when the success callback is fired
     for (var i = 0; i < stream.tracks.length; i++)
         console.log(stream.tracks[i].muted); // "false"
});

// addstream event handler example 1.
peerConn.onaddstream = function (evt) {
     // all tracks are muted when the addstream event listener is fired
     // (gets unmuted when data arrives)
     for (var i = 0; i < evt.stream.tracks.length; i++)
         console.log(evt.stream.tracks[i].muted); // "true"

     // delay playback until all tracks have been unmuted (data has
     // arrived on all tracks)
     var tracks = evt.stream.tracks;
     var numberOfMutedTracks = tracks.length;

     for (var i = 0; i < tracks.length; i++) {
         tracks[i].onunmute = function (evt) {
             evt.target.onunmute = null;
             if (--numberOfMutedTracks == 0)
                 remoteView.src = webkitURL.createObjectURL(evt.stream);
         };
     }
};

// addstream event handler example 2.
peerConn.onaddstream = function (evt) {
     // the MediaStreamTrack.muted property can also be used to display
     // a pause screen in a video element if the sender has disabled the
     // video track (appears as muted on your side)

     // will show blackness until data arrives (muted tracks)..
     remoteView.src = webkitURL.createObjectURL(evt.stream);
     // ..showing pause screen in the meantime
     showPauseScreen(remoteView);

     var tracks = evt.stream.tracks;
     var videoTrack;
     for (var i = 0; i < tracks.length; i++) {
         if (tracks[i].kind == "video") {
             videoTrack = tracks[i];
             break;
         }
     }

     if (videoTrack) {
         videoTrack.onmute = function () {
             showPauseScreen(remoteView);
         };

         videoTrack.onunmute = function () {
             showVideo(remoteView);
         };
     }
};

/Adam
Received on Wednesday, 30 November 2011 16:32:10 UTC
