Re: Encoders (Re: Getting rid of SDP)

On 06/03/2018 8:28, Harald Alvestrand wrote:
> On 03/06/2018 12:10 AM, Peter Thatcher wrote:
>> On Mon, Mar 5, 2018 at 3:06 PM Sergio Garcia Murillo wrote:
>>     More flexibility, more work, more bug surface.. ;)
>>     Anyway, I am not particularly against having access to RTP
>>     packets from the encoders,
>> Encoders do not emit RTP packets.  They emit encoded video frames.  
>> Those are then packetized into RTP packets.  I'm suggesting the 
>> JS/wasm have access to the encoded frame before the packetization.
> Actually, encoders usually take raw framebuffers (4:2:0, 4:4:4 or 
> other formats) + metadata and emit video frames + metadata. It may be 
> crucially important to get a handle on what the metadata looks like, 
> in order to make sure we are able to transport not just the bytes of 
> the frame, but the metadata too.
> Metadata includes things like timing information (carried through the 
> encoding process), interframe dependencies (an output from the 
> encoding process) and preferences for encoding choices (input to the 
> encoding process).
> We need to make sure we're not imagining things to be simpler than 
> they actually are.

True, but the good thing is that we don't have to reinvent the wheel: 
there are multiple encoder/decoder API abstractions that we can use as 
input for our design. Even libwebrtc has an API+metadata design 
internally.. ;)
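To make the shape of such an abstraction concrete, here is a minimal sketch of what an encoder's output could look like if exposed to JS/wasm before packetization, per the thread: encoded bytes plus the metadata Harald lists (timing carried through encoding, interframe dependencies, key-frame status). All names here (EncodedFrameMetadata, isDecodable, etc.) are hypothetical illustrations, not any existing API.

```typescript
// Hypothetical shape of an encoder's output: encoded frame + metadata.
// RTP packetization would happen in a separate, later stage.

interface EncodedFrameMetadata {
  captureTimestampUs: number;   // timing info carried through encoding
  dependsOnFrameIds: number[];  // interframe dependencies (encoder output)
  isKeyFrame: boolean;
}

interface EncodedFrame {
  frameId: number;
  data: Uint8Array;             // the encoded bitstream bytes
  metadata: EncodedFrameMetadata;
}

// One thing the metadata enables downstream: a frame is decodable only
// once every frame it depends on has been received.
function isDecodable(frame: EncodedFrame, received: Set<number>): boolean {
  return frame.metadata.dependsOnFrameIds.every((id) => received.has(id));
}
```

The point of the sketch is that the dependency and timing metadata must travel with the bytes; a transport that only carries `data` loses information the receiver needs.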

Best regards

Received on Tuesday, 6 March 2018 09:43:20 UTC