- From: Stefan Hakansson LK <stefan.lk.hakansson@ericsson.com>
- Date: Fri, 20 Jul 2012 15:00:05 +0200
- To: public-webrtc@w3.org
On 07/20/2012 12:49 PM, Harald Alvestrand wrote:
> On 07/02/2012 05:56 PM, Stefan Hakansson LK wrote:
>> On 06/28/2012 11:53 AM, Harald Alvestrand wrote:
>>> I'm on holiday, but wanted to respond to this briefly....
>>>
>>> On 06/25/2012 09:22 AM, Stefan Hakansson LK wrote:
>>>> On 06/21/2012 02:24 PM, Harald Alvestrand wrote:
>>>>> As usual, apologies for the formatting; errors in the WebIDL are a
>>>>> certainty, not a possibility.
>>>>>
>>>>> I've tried to insert the level of indirection that allows us to
>>>>> return a set of objects for (say) a media stream with FEC that uses
>>>>> multiple SSRCs, rather than just a single object.
>>>>
>>>> I think the proposed stats to track look like a good start, but an
>>>> additional one that could be of interest is the actual
>>>> interface/network used. The app gets to know (sort of) what interfaces
>>>> are available, since the candidates are handled by the app. But
>>>> knowing which interface was actually in use could be very helpful.
>>
>> I got no comment on the above part (which was the most important!). Any
>> views?
>
> Yes, I ignored that, since specific stats aren't defined in the proposal
> yet.
> I was kind of assuming that the local/remote IP address of the interface
> in use was available as a "statistic", which gives the same result (I
> think).

Yes, perhaps that is all that is needed.

>>>> Regarding the API, an alternative approach could be to tie this more
>>>> to the tracks of the MediaStreams, e.g. add something to the
>>>> MediaStreamTrack interface:
>>>>
>>>> void getInfo(Function handler);
>>>>
>>>> The info delivered would depend on the source of the track. For tracks
>>>> in local MediaStreams (i.e. generated locally and not delivered by a
>>>> PeerConnection), the info would be things like sampling rate and
>>>> channels (mono, 5.1) for audio; resolution and frame rate for video.
>>>> If the MediaStream has traveled over a PeerConnection, the info
>>>> listed in your document (packets received/lost, ssrc, codec, ...)
>>>> would be appended.
>>>>
>>>> This would associate the stats with a MediaStreamTrack, getting rid of
>>>> the referencing problems (when MediaStreams are added or removed) that
>>>> arise if local/remoteStreams is used. In addition, if the corresponding
>>>> info is requested for data channels, it could easily be added to those.
>>>>
>>>> One drawback is that only the receiving app would get the stats
>>>> associated with the PeerConnection transport.
>>>
>>> I think this is a fatal flaw - it means that the sender can't tell if
>>> the recipient is suffering packet loss, for instance. Given that the
>>> information is already delivered to the sender node by means of RTCP,
>>> it seems strange not to give access to it locally. I don't see a reason
>>> to accept this drawback.
>>
>> The sender's app could see it if the receiver's app sends the info over
>> (e.g. using the data channel).
>
> Only if they're the same app, or have standardized another interface for
> that purpose. We already have RTCP.
>
>>>> But as said, the main receiver of stats is the service provider, and
>>>> reporting per MediaStreamTrack allows the service provider to
>>>> correlate stats (perhaps using the MediaStream label). And in most
>>>> cases it is the same application at both ends, and the receiving end
>>>> could share the stats with the sending end (perhaps using the data
>>>> channel) if desired.
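To make the track-level idea above concrete, here is a rough sketch of how an app might use a getInfo() call on a MediaStreamTrack. This is only an illustration of the proposal in this thread: getInfo() is not a standardized API, and the field names in the info dictionary are assumptions, not anything agreed.

    // Sketch only: getInfo() is the proposal from this thread, not a
    // standardized API, and the TrackInfo fields are illustrative assumptions.
    interface TrackInfo {
      // For locally generated tracks:
      sampleRate?: number;       // audio, e.g. 48000
      channels?: number;         // audio, e.g. 1 (mono), 6 (5.1)
      width?: number;            // video resolution
      height?: number;
      frameRate?: number;
      // Appended when the track arrived over a PeerConnection:
      ssrc?: number;
      codec?: string;
      packetsReceived?: number;
      packetsLost?: number;
    }

    interface TrackWithInfo {
      getInfo(handler: (info: TrackInfo) => void): void;
    }

    // Stub standing in for a received MediaStreamTrack with the proposed
    // extension; a real implementation would report live values.
    const remoteAudioTrack: TrackWithInfo = {
      getInfo(handler) {
        handler({ ssrc: 12345, codec: "opus", packetsReceived: 1000, packetsLost: 7 });
      },
    };

    remoteAudioTrack.getInfo((info) => {
      const received = info.packetsReceived ?? 0;
      const lost = info.packetsLost ?? 0;
      const total = received + lost;
      if (total > 0) {
        console.log(`ssrc ${info.ssrc}: ${(100 * lost / total).toFixed(1)}% loss`);
      }
    });

Reporting per track like this gives a stable handle for correlating stats, which is the point made above about avoiding referencing problems when streams are added or removed.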
>>> This won't work in the case where the app is written to communicate
>>> with something that is not itself (such as a media gateway). I don't
>>> see a reason to accept this limitation.
>>
>> That depends on what ways the media GW has to communicate with the app.
>>
>> Anyway, I certainly have nothing against exposing stats to the sender
>> app as well, but I think we should do something about the API in that
>> case. I don't like the idea that "removeStream" would create empty
>> entries in the local/remoteStream arrays.
>
> Then let's remove "removeStream" altogether, and have the streams
> remain, but in the Closed state.

That is one way to do it. Would it enable freeing up resources? I'm
thinking about a situation where audio and video are streamed to a remote
peer, and the data connection is used as well. If the sending app
determines that audio or video will not be used for a while, resources
like codecs should be released to other (web and native) apps, but the
data connection should remain.

>>>> If different apps are used, they could still agree on a stats
>>>> exchange (just as they would have to agree on a bunch of other things
>>>> to get things working) if desired.
>>>>
>>>> Stefan
>>>>
>>>>> Comments welcome!
>>>>>
>>>>> Harald
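For the case where both ends are the same app (or have agreed on a format), the "share the receiver's stats over the data channel" idea discussed above could look roughly like the sketch below. The message format and the getInfo() callback are assumptions for illustration; only send() and onmessage reflect the general shape of a data channel.

    // Sketch of the receiver reporting its receive-side stats to the sender
    // over the data channel, as discussed in the thread. The message format
    // and the getInfo() callback are illustrative assumptions.
    interface ReceiverStats {
      ssrc?: number;
      packetsReceived?: number;
      packetsLost?: number;
    }

    interface StatsChannel {                       // minimal data-channel shape
      send(data: string): void;
      onmessage: ((ev: { data: string }) => void) | null;
    }

    // Receiver side: poll the proposed per-track info and forward it.
    function reportStats(channel: StatsChannel,
                         getInfo: (cb: (s: ReceiverStats) => void) => void,
                         intervalMs = 5000): void {
      setInterval(() => {
        getInfo((stats) => channel.send(JSON.stringify({ type: "recv-stats", stats })));
      }, intervalMs);
    }

    // Sender side: learn about loss the receiver is seeing.
    function watchStats(channel: StatsChannel): void {
      channel.onmessage = (ev) => {
        const msg = JSON.parse(ev.data);
        if (msg.type === "recv-stats") {
          console.log("remote reports", msg.stats.packetsLost, "packets lost");
        }
      };
    }

As noted in the thread, this only works when the two ends are the same app or have agreed on such an exchange; RTCP already carries the same receive-side reports back to the sender, which is the argument for exposing them in the sender's local stats as well.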
Received on Friday, 20 July 2012 13:00:36 UTC