- From: Bernard Aboba <Bernard.Aboba@microsoft.com>
- Date: Fri, 4 Mar 2016 18:14:38 +0000
- To: "public-ortc@w3.org" <public-ortc@w3.org>, Robin Raymond <robin@hookflash.com>, Philipp Hancke <philipp.hancke@googlemail.com>
- Message-ID: <BLUPR03MB149F82CA01F69FE1A4AF52DECBE0@BLUPR03MB149.namprd03.prod.outlook.com>
Here is a post describing how one might use RED and FEC in ORTC to set things up similarly to the way they are done in recent builds of Chrome. As noted below, there are still some open questions in this area.

First, some background. Robin has published a March 1, 2016 update to the ORTC API spec on the ortc.org site: http://ortc.org/wp-content/uploads/2016/03/ortc.html A lot of issues (41!) were addressed in this revision, so it was somewhat overdue. The new Editor's draft includes quite a few updates relating to RTX/RED/FEC. Since we are expecting implementation activity over the coming months, it is good to get discussion going so we can make sure we understand how it works and find issues before they become interoperability problems.

Many people who have looked at how RTX/RED/FEC works in ORTC ask "why are they included both as codecs and in encoding parameters?" At various points we have discussed removing them as codecs. But then the question arose of how the capabilities of RTX/RED/FEC could be provided. Also, including RTX/RED/FEC as codecs enables ORTC to be used with simple capabilities-exchange signaling where only payload types are available. An example might be a 1-1 audio/video scenario where RTX/RED/FEC is turned on, but encodings[0].ssrc, encodings[0].fec.ssrc and encodings[0].rtx.ssrc might not be set. So given those concerns, RTX/RED/FEC have been kept as codecs, with codec capabilities and settings filled in within the latest Editor's draft.

Codec capability parameters for H.264 are covered in Section 9.3.3.3, rtx in Section 9.3.3.4, red in Section 9.3.3.5, ulpfec in Section 9.3.3.6 and flexfec in Section 9.3.3.7. For rtx we only have the rtxTime capability, and ulpfec has no capability parameters. However, flexfec has several capabilities, and for red we have payloadTypes. Codec parameters are specified for H.264 (Section 9.7.2.3), rtx (Section 9.7.2.4), red (Section 9.7.2.5), ulpfec (Section 9.7.2.6) and flexfec (Section 9.7.2.7).

In addition to being specified as codecs, there are some additional places where information is provided on the setup for RED and FEC. fecMechanisms is described in Section 9.1.1. This section now has defined values: "red", "red+ulpfec" and "flexfec". This was done to make it possible to distinguish the use of red to encapsulate FEC ("red+ulpfec") from the use of standalone FEC (e.g. flexfec) and from the use of red without FEC (classic RFC 2198 redundant audio coding). Within Section 9.8 (RTCRtpEncodingParameters) we have encodings[].fec and encodings[].rtx, with each of these having some additional clarifications and parameters.

Taking all of this into account, let us see if we can explain how the ORTC API might be used to set up communications similar to what we see in the latest builds of Chrome. In looking at this, I have a few (unanswered) questions, so hopefully others can fill things in. At the bottom of this message is a Chrome Canary Version 51 offer that involves red (PT 116) and ulpfec (PT 117). RTX (PT 98) is used to retransmit RED.
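Before walking through how to enable this in ORTC, here is a minimal sketch of how an application might check whether the receiver advertises the relevant codecs and FEC mechanisms. The dictionary member names (codecs[].name, fecMechanisms) follow the sections cited above, but treat this as an illustration of the idea rather than a definitive use of the draft API:

    // Hypothetical sketch: checking receiver support for red/ulpfec.
    var caps = RTCRtpReceiver.getCapabilities("video");

    var hasRed = caps.codecs.some(function (c) { return c.name === "red"; });
    var hasUlpfec = caps.codecs.some(function (c) { return c.name === "ulpfec"; });
    var hasRedUlpfec = caps.fecMechanisms.indexOf("red+ulpfec") !== -1;

    if (hasRed && hasUlpfec && hasRedUlpfec) {
      // Reasonable to configure encodings[].fec.mechanism = "red+ulpfec" below.
    }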
So to enable this in ORTC, we would have the following codec settings:

    // "vp8" codec
    codecs[3].name = "vp8";
    codecs[3].payloadType = 100;

    // "vp9" codec
    codecs[4].name = "vp9";
    codecs[4].payloadType = 101;

    // "ulpfec" codec
    codecs[5].name = "ulpfec";
    codecs[5].payloadType = 117;

    // "red" codec
    codecs[6].name = "red";
    codecs[6].payloadType = 116;

    // "rtx" codec
    codecs[7].name = "rtx";
    codecs[7].payloadType = 96;

Then we would set up the encoding parameters for VP8:

    parameters.encodings[0].ssrc = 1821827098;
    parameters.encodings[0].codecPayloadType = 100;
    parameters.encodings[0].fec.mechanism = "red+ulpfec";
    parameters.encodings[0].fec.ssrc = X; // What is the SSRC to be used for the red+ulpfec stream?
    parameters.encodings[0].rtx.payloadType = 96; // retransmission of VP8 (PT 100)
    parameters.encodings[0].rtx.ssrc = 4154144015; // SSRC for the RTX stream

And VP9:

    parameters.encodings[1].ssrc = 1821827098; // same SSRC as for VP8
    parameters.encodings[1].codecPayloadType = 101;
    parameters.encodings[1].fec.mechanism = "red+ulpfec";
    parameters.encodings[1].fec.ssrc = Y; // What is the SSRC to be used for the red+ulpfec stream? Is this the same as for VP8?
    parameters.encodings[1].rtx.payloadType = 97; // retransmission of VP9 (PT 101)
    parameters.encodings[1].rtx.ssrc = 4154144015; // SSRC for the VP9 RTX stream (same as the VP8 RTX stream?)

We will also need encoding parameters for red+ulpfec, since that is also being retransmitted:

    parameters.encodings[2].ssrc = X; // Does this need to be set twice?
    parameters.encodings[2].codecPayloadType = 116;
    parameters.encodings[2].rtx.payloadType = 98; // retransmission of red+ulpfec (PT 116)
    parameters.encodings[2].rtx.ssrc = 4154144015; // SSRC for the RTX of the red+ulpfec stream. Is this the same as for RTX of the VP8/VP9 streams?
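To show where these dictionaries end up, here is a minimal sketch of applying them to a video sender. It assumes videoTrack and dtlsTransport were created elsewhere, and that codecs and encodings are the arrays populated above; it is not intended to settle the SSRC questions raised there:

    // Hypothetical sketch: applying the filled-in parameters to an ORTC sender.
    var sender = new RTCRtpSender(videoTrack, dtlsTransport);

    var parameters = {
      muxId: "",
      codecs: codecs,         // includes the vp8/vp9/ulpfec/red/rtx entries above
      headerExtensions: [],
      encodings: encodings,   // encodings[0], [1] and [2] as filled in above
      rtcp: { cname: "", reducedSize: false, mux: true }
    };

    // send() configures the sender with these parameters and starts sending.
    sender.send(parameters);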
=================================================
Chrome Canary Version 51 from createOffer()
===================================================
v=0
o=- 1284756290424282220 2 IN IP4 127.0.0.1
s=-
t=0 0
a=group:BUNDLE audio video
a=msid-semantic: WMS PI6XhSRaxHTZTGZbzkvBOmuisqxsYh8gSyp4
m=audio 9 UDP/TLS/RTP/SAVPF 111 103 104 9 0 8 126
c=IN IP4 0.0.0.0
a=rtcp:9 IN IP4 0.0.0.0
a=ice-ufrag:g4KrfxeCdmZ/TxnO
a=ice-pwd:Zjlw70VLzATBGeC/dSCvNs4J
a=fingerprint:sha-256 E1:5C:21:01:DC:59:5E:AF:04:13:9A:5D:AC:3E:05:8E:3D:6E:23:97:84:CD:96:C6:6E:31:52:92:65:14:CB:25
a=setup:actpass
a=mid:audio
a=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level
a=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
a=sendrecv
a=rtcp-mux
a=rtpmap:111 opus/48000/2
a=rtcp-fb:111 transport-cc
a=fmtp:111 minptime=10; useinbandfec=1
a=rtpmap:103 ISAC/16000
a=rtpmap:104 ISAC/32000
a=rtpmap:9 G722/8000
a=rtpmap:0 PCMU/8000
a=rtpmap:8 PCMA/8000
a=rtpmap:126 telephone-event/8000
a=maxptime:60
a=ssrc:2568666259 cname:BfnS+qIA8PlwzvOD
a=ssrc:2568666259 msid:PI6XhSRaxHTZTGZbzkvBOmuisqxsYh8gSyp4 943bfdac-019f-4b78-b229-bb20a382c201
a=ssrc:2568666259 mslabel:PI6XhSRaxHTZTGZbzkvBOmuisqxsYh8gSyp4
a=ssrc:2568666259 label:943bfdac-019f-4b78-b229-bb20a382c201
m=video 9 UDP/TLS/RTP/SAVPF 101 100 116 117 96 97 98
c=IN IP4 0.0.0.0
a=rtcp:9 IN IP4 0.0.0.0
a=ice-ufrag:g4KrfxeCdmZ/TxnO
a=ice-pwd:Zjlw70VLzATBGeC/dSCvNs4J
a=fingerprint:sha-256 E1:5C:21:01:DC:59:5E:AF:04:13:9A:5D:AC:3E:05:8E:3D:6E:23:97:84:CD:96:C6:6E:31:52:92:65:14:CB:25
a=setup:actpass
a=mid:video
a=extmap:2 urn:ietf:params:rtp-hdrext:toffset
a=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
a=extmap:4 urn:3gpp:video-orientation
a=sendrecv
a=rtcp-mux
a=rtpmap:100 VP8/90000
a=rtcp-fb:100 ccm fir
a=rtcp-fb:100 nack
a=rtcp-fb:100 nack pli
a=rtcp-fb:100 goog-remb
a=rtcp-fb:100 transport-cc
a=rtpmap:101 VP9/90000
a=rtcp-fb:101 ccm fir
a=rtcp-fb:101 nack
a=rtcp-fb:101 nack pli
a=rtcp-fb:101 goog-remb
a=rtcp-fb:101 transport-cc
a=rtpmap:116 red/90000
a=rtpmap:117 ulpfec/90000
a=rtpmap:96 rtx/90000
a=fmtp:96 apt=100
a=rtpmap:97 rtx/90000
a=fmtp:97 apt=101
a=rtpmap:98 rtx/90000
a=fmtp:98 apt=116
a=ssrc-group:FID 1821827098 4154144015
a=ssrc:1821827098 cname:BfnS+qIA8PlwzvOD
a=ssrc:1821827098 msid:PI6XhSRaxHTZTGZbzkvBOmuisqxsYh8gSyp4 9ddea351-03fc-4b6f-8ff3-8cde0533df4f
a=ssrc:1821827098 mslabel:PI6XhSRaxHTZTGZbzkvBOmuisqxsYh8gSyp4
a=ssrc:1821827098 label:9ddea351-03fc-4b6f-8ff3-8cde0533df4f
a=ssrc:4154144015 cname:BfnS+qIA8PlwzvOD
a=ssrc:4154144015 msid:PI6XhSRaxHTZTGZbzkvBOmuisqxsYh8gSyp4 9ddea351-03fc-4b6f-8ff3-8cde0533df4f
a=ssrc:4154144015 mslabel:PI6XhSRaxHTZTGZbzkvBOmuisqxsYh8gSyp4
a=ssrc:4154144015 label:9ddea351-03fc-4b6f-8ff3-8cde0533df4f
Received on Friday, 4 March 2016 18:15:12 UTC