- From: Peter Thatcher <pthatcher@google.com>
- Date: Mon, 05 Mar 2018 22:53:39 +0000
- To: Sergio Garcia Murillo <sergio.garcia.murillo@gmail.com>
- Cc: public-webrtc@w3.org
Received on Monday, 5 March 2018 22:54:14 UTC
However the JS/wasm wants :). It could use one frame == one QUIC stream, or one RTP packet == one QUIC stream, or even one big RTP packet == one frame == one QUIC stream. I prefer the first, but the API should allow any of these.

On Mon, Mar 5, 2018 at 2:50 PM Sergio Garcia Murillo <sergio.garcia.murillo@gmail.com> wrote:

> On 05/03/2018 23:22, Peter Thatcher wrote:
> > If you want to send media over QUIC or do your own crypto between
> > encode and network (perhaps over some low-level RTP transport), then
> > you need access to media after it's encoded and before it's decoded.
>
> Peter, one side question: how do you envision that media over QUIC
> should be used? Do you plan to encapsulate RTP over QUIC, or just send
> each full frame over a single QUIC stream? Just trying to imagine what
> API surface the encoders/decoders should expose.
>
> Best regards
> Sergio
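The first two framing choices above could be sketched roughly as below. This is purely illustrative: `QuicStream`, `sendFrameAsStream`, and `sendFrameAsPacketStreams` are invented names for this sketch, not part of any real QUIC or WebRTC API.

```typescript
// Hypothetical stand-in for an outgoing unidirectional QUIC stream;
// real QUIC APIs differ. Invented for this sketch.
interface QuicStream {
  id: number;
  data: Uint8Array;
  finished: boolean; // stream closed with FIN after the write
}

let nextStreamId = 0;

// "one frame == one QUIC stream": each encoded frame opens its own
// stream, is written whole, and the stream is finished.
function sendFrameAsStream(encodedFrame: Uint8Array): QuicStream {
  return { id: nextStreamId++, data: encodedFrame, finished: true };
}

// "one RTP packet == one QUIC stream": packetize the frame into
// MTU-sized chunks first, then open one stream per packet.
function sendFrameAsPacketStreams(
  encodedFrame: Uint8Array,
  mtu: number
): QuicStream[] {
  const streams: QuicStream[] = [];
  for (let off = 0; off < encodedFrame.length; off += mtu) {
    streams.push({
      id: nextStreamId++,
      data: encodedFrame.subarray(off, off + mtu),
      finished: true,
    });
  }
  return streams;
}
```

The point of keeping the mapping out of the platform is that either function above is trivial to write in JS/wasm once the app has access to the encoded frame bytes.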