[whatwg] Peer-to-peer use case (was Peer-to-peer communication, video conferencing, <device>, and related topics)

Some feedback below. (Stuff where I agree and there is no question has been left out.)

 
> On Mon, 31 Jan 2011, Stefan Håkansson LK wrote this use case:
> >

We've since produced an updated use case doc: <http://www.ietf.org/id/draft-holmberg-rtcweb-ucreqs-01.txt>

...

> > The web author developing the application has decided to display a
> > self-view as well as the video from the remote side in rather small
> > windows, but the user can change the display size during the session.
> > The application also supports if a participant (for a longer or shorter
> > time) would like to stop sending audio (but keep video) or video (keep
> > audio) to the other peer ("mute").
...
> 
> All of this except selectively muting audio vs video is currently 
> possible in the proposed API.
> 
> The simplest way to make selective muting possible too would be to change
> how the pause/resume thing works in GeneratedStream, so that instead of
> pause() and resume(), we have individual controls for audio and video.
> Something like:
> 
>    void muteAudio();
>    void resumeAudio();
>    readonly attribute boolean audioMuted;
>    void muteVideo();
>    void resumeVideo();
>    readonly attribute boolean videoMuted;
> 
> Alternatively, we could just have mutable attributes:
> 
>    attribute boolean audioEnabled;
>    attribute boolean videoEnabled;
> 
> Any opinions on this?
We're looking into this and will produce more elaborate input on it.
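
To make that a bit more concrete, the attribute-based alternative would map to application code along these lines (a sketch only; the stream shape and names follow the proposal above, and nothing here is settled API):

    interface GeneratedStreamLike {
      audioEnabled: boolean;
      videoEnabled: boolean;
    }

    // Mute or unmute audio and video independently using the two
    // mutable attributes from the second alternative above.
    function setMuted(stream: GeneratedStreamLike,
                      opts: { audio?: boolean; video?: boolean }): void {
      if (opts.audio !== undefined) stream.audioEnabled = !opts.audio;
      if (opts.video !== undefined) stream.videoEnabled = !opts.video;
    }

    // e.g. stop sending audio but keep sending video ("mute"):
    // setMuted(selfStream, { audio: true });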

...

> > !The web application must be able to    !If the video is going to be displayed !
> > !define the media format to be used for !in a large window, use higher bit-    !
> > !the streams sent to a peer.            !rate/resolution. Should media settings!
> > !                                       !be allowed to be changed during a     !
> > !                                       !session (at e.g. window resize)?      !
> 
> Shouldn't this be automatic and renegotiated dynamically via SDP offer/answer?
Yes, this should be (re)negotiated via SDP, but what is unclear is how the SDP is populated based on the application's preferences.
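
To illustrate what I mean by "populated based on the application's preferences": if the application wants a low bitrate because the video is shown in a small window, that presumably ends up as something like a bandwidth line in the video m-section of the offer. A purely illustrative sketch (the constant and the way the blob is assembled are my assumptions; only the SDP lines themselves are standard):

    // Illustrative only: how a "small window => low bitrate" preference
    // might surface in the SDP offer the UA generates.
    const preferredMaxVideoKbps = 256;   // assumed cap for a small window
    const videoSection = [
      "m=video 49170 RTP/AVP 96",        // example m-line
      `b=AS:${preferredMaxVideoKbps}`,   // application-specific bandwidth cap (kbps)
      "a=rtpmap:96 VP8/90000",           // example payload mapping
    ].join("\r\n");

The open question is what API the application uses to express that preference so that the UA puts it into the SDP.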

...

> > !Streams being transmitted must be      !Do not starve other traffic (e.g. on  !
> > !subject to rate control                !ADSL link)                            !
> 
> Not sure whether this requires anything special. Could you elaborate?
What I am after is that the RTP/UDP streams sent from one UA to the other must have some rate adaptation implemented. HTTP uses TCP transport, and TCP reduces the send rate when a packet does not arrive, so that flows share the available throughput in a fair way when there is a bottleneck. For UDP there is no such mechanism, so unless something is added in the RTP implementation these streams could starve other traffic. I don't think it should be visible in the API though; it is a requirement on the implementation in the UA.
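
As a rough illustration (this is not API, just the kind of AIMD/TFRC-style behaviour the UA's RTP stack could run, driven by loss reported in RTCP receiver reports; all constants are made up):

    // Back off multiplicatively when RTCP reports loss, probe additively
    // otherwise, within some bounds.
    function adaptSendRate(currentKbps: number, lossFraction: number): number {
      const MIN_KBPS = 32;
      const MAX_KBPS = 2000;
      if (lossFraction > 0.02) {
        return Math.max(MIN_KBPS, currentKbps * 0.8);  // congestion: decrease
      }
      return Math.min(MAX_KBPS, currentKbps + 10);     // no loss: slow increase
    }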

...

 
> > !Synchronization between audio and video!                                      !
> > !must be supported                      !                                      !
> 
> If there's one stream, that's automatic, no?
One audiovisual stream is actually transmitted as two RTP streams (one audio, one video). And synchronization at playout is not automatic; it is something you do based on RTP timestamps and RTCP sender reports. But again, this is a requirement on the implementation in the UA, not on the API.
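
For illustration, the mapping is roughly this: the RTCP sender report for each stream pairs an RTP timestamp with an NTP wallclock time, which lets the audio and video streams be placed on a common clock for playout (field names below are mine, and timestamp wraparound is ignored):

    interface SenderReport {
      ntpSeconds: number;     // wallclock time carried in the RTCP SR
      rtpTimestamp: number;   // RTP timestamp carried in the same SR
      clockRate: number;      // e.g. 90000 for video, 48000 for audio
    }

    // Convert an RTP timestamp to wallclock time using the latest SR for
    // that stream; doing this for both streams gives a common timeline
    // on which audio and video playout can be aligned.
    function rtpToWallclock(rtpTimestamp: number, sr: SenderReport): number {
      return sr.ntpSeconds + (rtpTimestamp - sr.rtpTimestamp) / sr.clockRate;
    }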

...

> > !The web application must be made aware !To be able to inform user and take    !
> > !of when streams from a peer are no     !action (one of the peers still has    !
> > !longer received                        !connection with the server)           !
> > --------------------------------------------------------------------------------
> > !The browser must detect when no streams!                                      !
> > !are received from a peer               !                                      !
> 
> These aren't really yet supported in the API, but I intend for us to add
> this kind of thing at the same time as we add similar metrics to <video>
> and <audio>. To do this, though, it would really help to have a better
> idea what the requirements are. What information should be available?
> "Packets received per second" (and "sent", maybe) seems like an obvious
> one, but what other information can we collect?
I think more studies are required to answer this one.
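
Purely as a strawman for what "this kind of thing" could cover (none of these names are proposed API, just the sort of data that seems collectable from RTP/RTCP):

    // Hypothetical per-stream metrics snapshot.
    interface StreamStats {
      timestamp: number;         // when the sample was taken
      packetsSent: number;
      packetsReceived: number;
      bytesReceived: number;
      packetsLost: number;       // from RTCP receiver reports
      jitterMs: number;          // interarrival jitter
      roundTripTimeMs: number;   // from RTCP
    }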

//Stefan

Received on Tuesday, 22 March 2011 03:01:33 UTC