- From: Bernard Aboba <Bernard.Aboba@microsoft.com>
- Date: Mon, 1 Dec 2014 17:14:49 +0000
- To: Martin Thomson <martin.thomson@gmail.com>, "public-webrtc@w3.org" <public-webrtc@w3.org>
I would agree that we should not specify how to handle inbound media not yet assigned to an MST. However, my impression from TPAC was that there was interest in generating an event indicating that unsignaled media had been received, providing an error indication and basic information on the unhandled RTP stream (such as the SSRC and perhaps the mid RTP header extension). The event would only be fired once on receipt of an unhandled RTP stream. While the application might act on the event in order to fix the signaling, there should be no expectation that this would enable the previously received media to be rendered.

Consider the following:

    partial interface RTCPeerConnection {
        attribute EventHandler? onunhandledrtp;
    };

RTCRtpUnhandledEvent

The unhandledrtp event of the RTCPeerConnection object uses the RTCRtpUnhandledEvent interface. Firing an unhandledrtp event named e with an RTCRtpUnhandled stream means that an event with the name e, which does not bubble (except where otherwise stated) and is not cancelable (except where otherwise stated), and which uses the RTCRtpUnhandledEvent interface with the stream attribute set to an RTCRtpUnhandled object, MUST be created and dispatched at the given target.

    dictionary RTCRtpUnhandledEventInit : EventInit {
        RTCRtpUnhandled? stream;
    };

    [Constructor(DOMString type, RTCRtpUnhandledEventInit eventInitDict)]
    interface RTCRtpUnhandledEvent : Event {
        readonly attribute RTCRtpUnhandled stream;
    };

Attributes

stream of type RTCRtpUnhandled, readonly
    The stream attribute is the RTCRtpUnhandled object with the characteristics of the RTP stream that caused the event.

Dictionary RTCRtpUnhandledEventInit Members

stream of type RTCRtpUnhandled, nullable
    The characteristics of the RTP stream that caused the event.

    dictionary RTCRtpUnhandled {
        unsigned long ssrc;
        payloadtype   payloadType;
        DOMString     muxId;
    };

-----Original Message-----
From: Martin Thomson [mailto:martin.thomson@gmail.com]
Sent: Monday, December 1, 2014 8:44 AM
To: public-webrtc@w3.org
Subject: What to do with unsignaled media

https://github.com/w3c/webrtc-pc/issues/39

From the issue:

In #29, @stefhak noted that we don't really specify how we handle inbound media that isn't yet assigned to a MediaStreamTrack through signaling. This can happen with renegotiation+bundling at the offerer. We don't specify whether to keep the data, or what to do with it.

I tend to think that this falls into the space where you have resource and quality trade-offs that aren't really safe to specify for the general case. A machine with lots of RAM can store several jitter buffers' worth of audio and several I-frames, but there are limits to how much it can store. There are also limits on the processing capacity available to handle all that media properly. In the same way that we don't specify jitter buffer size and depth, I'm going to recommend closing this issue with at most some editorial additions.
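For illustration, here is a minimal TypeScript sketch of how a page might consume the event proposed above. The onunhandledrtp handler name and the RTCRtpUnhandled fields are taken from the WebIDL in the message; the type declarations, the watchForUnsignaledMedia helper, and its fixSignaling callback are assumptions made for the sketch, since no shipping API defines this event.

    // Sketch only: the unhandledrtp event was a proposal on this thread and is
    // not part of any shipping API. The declarations restate the WebIDL above.

    interface RTCRtpUnhandled {
      ssrc: number;        // SSRC of the RTP stream that had no matching receiver
      payloadType: number; // RTP payload type observed on the stream
      muxId: string;       // value of the MID RTP header extension, if present
    }

    interface RTCRtpUnhandledEvent extends Event {
      readonly stream: RTCRtpUnhandled;
    }

    // Assumed wiring; per the proposal the event fires at most once per
    // unhandled RTP stream. The application can only use it to repair the
    // signaling; media received before the repair is not expected to be
    // renderable afterwards.
    function watchForUnsignaledMedia(
        pc: RTCPeerConnection,
        fixSignaling: (info: RTCRtpUnhandled) => void): void {
      (pc as any).onunhandledrtp = (e: RTCRtpUnhandledEvent) => {
        console.warn(`Unhandled RTP stream: ssrc=${e.stream.ssrc}, ` +
                     `payloadType=${e.stream.payloadType}, mid=${e.stream.muxId}`);
        fixSignaling(e.stream); // e.g. renegotiate so the stream gets a track
      };
    }

Consistent with the proposal, the handler treats the event purely as an error indication and a prompt to fix the signaling, not as a way to recover the already-received media.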
Received on Monday, 1 December 2014 17:15:22 UTC