From: Stefan Håkansson LK <stefan.lk.hakansson@ericsson.com>
Date: Thu, 1 Dec 2011 13:51:06 +0100
To: Adam Bergkvist <adam.bergkvist@ericsson.com>
CC: "public-webrtc@w3.org" <public-webrtc@w3.org>
On 11/30/2011 05:27 PM, Adam Bergkvist wrote:
> On 11/30/2011 10:51 AM, Stefan Håkansson LK wrote:
>> The conclusion of this discussion seems to be:
>>
>> 1. MediaStream objects should be created more or less immediately as
>> result of "getUserMedia" or "addStream" operations (in the latter case a
>> MediaStream object on the remote side should be created as a result of
>> the signaling).
>>
>> 2. There is a need to, on a MediaStreamTrack level, notify the
>> application when live data is available (which can be a bit later than
>> the creation of the MediaStream object).
>>
>> The task is now on the Editors to change the drafts along these lines.
>
> A way of realizing the above is to expose the muted property on
> MediaStreamTrack. The concept of muting a track already exists in the
> spec in the PeerConnection case where the A-side disables a track; it
> will then be muted (not disabled) on the B-side. It is, however, not
> possible to detect the muted state with the current spec. A muted track
> will produce blackness or silence when consumed.
>
> * getUserMedia
> When the NavigatorUserMediaSuccessCallback fires, information from the
> user's selections will be used to create the tracks in the resulting
> MediaStream's track list. No tracks are muted.
>
> * addstream
> When the addstream event listener fires, information from the signaling
> will be used to construct the track list. The tracks are all muted. Each
> track will be unmuted once data arrives on the underlying RTP stream.
> We can simply remove the readyState from MediaStream to avoid the
> problem with a LIVE stream that only has muted tracks and let the tracks
> deal with their own states.
IMO, the above gives the functionality we're after. Looks like a good
start to me!
>
> The exposed MediaStreamTrack.muted property (along with its
> corresponding event listeners) can also be used to, e.g., show an image
> if the other side (of a PeerConnection) disables a track that is being
> transmitted to you. Another use case is to turn off image processing in
> an augmented reality case.
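For the augmented reality case, a rough sketch could look something like
this (stopImageProcessing/startImageProcessing are made-up helpers, and
videoTrack is assumed to be picked out of the received stream as in
example 2 below):

videoTrack.onmute = function () {
    // sender disabled the track; incoming frames are blackness, so
    // there is no point running image analysis (hypothetical helper)
    stopImageProcessing(videoTrack);
};

videoTrack.onunmute = function () {
    // real frames are flowing again; resume the analysis
    startImageProcessing(videoTrack);
};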
>
> * Changes to MediaStreamTrack/MediaStreamTrack interfaces
"MediaStream/MediaStreamTrack" I guess!
>
> [Constructor (in MediaStreamTrackList trackList)]
> interface MediaStream {
>     readonly attribute DOMString label;
>     readonly attribute MediaStreamTrackList tracks;
>     MediaStreamRecorder record ();
> -   const unsigned short LIVE = 1;
> -   const unsigned short ENDED = 2;
> -   readonly attribute unsigned short readyState;
> +   readonly attribute boolean ended;
>     attribute Function? onended;
> };
Should we align closer to the Media element by having separate
attributes for audioTracks and videoTracks? This would make it simpler
to detect if there is any audio/video ("ms.videoTracks.length").
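E.g. (videoTracks below is just the suggested attribute, not something in
the current draft, and showAvatar is a made-up helper):

if (ms.videoTracks.length === 0) {
    // no video in the stream; fall back to e.g. an avatar
    showAvatar();
}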
>
> interface MediaStreamTrack {
>     readonly attribute DOMString kind;
>     readonly attribute DOMString label;
>     attribute boolean enabled;
>
> +   readonly attribute boolean muted;
> +
> +   attribute Function? onmute;
> +   attribute Function? onunmute;
> };
Should we separate AudioMediaStreamTrack and VideoMediaStreamTrack
definitions as discussed in conjunction with DTMF?
And should we add
>
> * Examples
>
> navigator.webkitGetUserMedia({}, function (stream) {
>     selfView.src = webkitURL.createObjectURL(stream);
>
>     // all tracks are unmuted when the success callback is fired
>     for (var i = 0; i < stream.tracks.length; i++)
>         console.log(stream.tracks[i].muted); // "false"
> });
>
> // addstream event handler example 1.
> peerConn.onaddstream = function (evt) {
>     // all tracks are muted when the addstream event listener is fired
>     // (they get unmuted when data arrives)
>     for (var i = 0; i < evt.stream.tracks.length; i++)
>         console.log(evt.stream.tracks[i].muted); // "true"
>
>     // delay playback until all tracks have been unmuted (data has
>     // arrived on all tracks)
>     var tracks = evt.stream.tracks;
>     var numberOfMutedTracks = tracks.length;
>
>     for (var i = 0; i < tracks.length; i++) {
>         tracks[i].onunmute = function (e) {
>             e.target.onunmute = null;
>             if (--numberOfMutedTracks == 0)
>                 remoteView.src = webkitURL.createObjectURL(evt.stream);
>         };
>     }
> };
>
> // addstream event handler example 2.
> peerConn.onaddstream = function (evt) {
>     // the MediaStreamTrack.muted property can also be used to display
>     // a pause screen in a video element if the sender has disabled the
>     // video track (appears as muted on your side)
>
>     // will show blackness until data arrives (muted tracks)..
>     remoteView.src = webkitURL.createObjectURL(evt.stream);
>     // ..showing pause screen in the meantime
>     showPauseScreen(remoteView);
>
>     var tracks = evt.stream.tracks;
>     var videoTrack;
>     for (var i = 0; i < tracks.length; i++) {
>         if (tracks[i].kind == "video") {
>             videoTrack = tracks[i];
>             break;
>         }
>     }
>
>     if (videoTrack) {
>         videoTrack.onmute = function () {
>             showPauseScreen(remoteView);
>         };
>
>         videoTrack.onunmute = function () {
>             showVideo(remoteView);
>         };
>     }
> };
>
> /Adam
Received on Thursday, 1 December 2011 12:51:43 UTC