[webrtc-pc] How do applications know a DataChannel's buffer capacity, so they can avoid filling it?

taylor-b has just created a new issue for https://github.com/w3c/webrtc-pc:

== How do applications know a DataChannel's buffer capacity, so they can avoid filling it? ==
A well-behaved application that sends a large amount of data will monitor the `bufferedAmount` attribute and set `bufferedAmountLowThreshold` (waiting for the `bufferedamountlow` event before sending more), to ensure it doesn't fill the DataChannel's buffer and cause the channel to be closed.

But how does the application know what to set `bufferedAmountLowThreshold` to? If the spec doesn't provide any guarantees about an implementation's buffer size, the application developer is forced to research how large different browsers' buffers are, pick a value low enough to work in all of them, and hope those buffers never get smaller.
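For context, here's a minimal sketch of the pattern described above; the high/low water marks are arbitrary guesses on the application's part, which is exactly the problem:

```js
// Backpressure with today's API: the water marks are guesses, since the spec
// says nothing about how large an implementation's send buffer actually is.
const HIGH_WATER_MARK = 1 * 1024 * 1024; // 1 MiB -- hoped to be below every browser's limit
const LOW_WATER_MARK = 256 * 1024;

function sendAll(channel, chunks) {
  channel.bufferedAmountLowThreshold = LOW_WATER_MARK;
  let i = 0;
  const pump = () => {
    // Send until we run out of data or hit our (guessed) high water mark.
    while (i < chunks.length && channel.bufferedAmount < HIGH_WATER_MARK) {
      channel.send(chunks[i++]);
    }
    if (i < chunks.length) {
      // Resume once the implementation has drained below the threshold.
      channel.addEventListener('bufferedamountlow', pump, { once: true });
    }
  };
  pump();
}
```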

So I'd suggest one of the following:

* Provide a `bufferSize` attribute (pretty straightforward; see the sketch after this list).
* Guarantee something about the buffer size, for example "it MUST be N MB or larger".
* Get rid of the "data channel closes when the buffer fills" behavior, and just throw/return an error instead, like a non-blocking socket. This is a more radical change, but is anyone really attached to the current behavior? It seems to cause a fair amount of confusion; see the discussion here: https://groups.google.com/forum/#!topic/discuss-webrtc/apmq9VLY9lc
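To illustrate the first option, here's a sketch assuming a hypothetical `bufferSize` attribute on RTCDataChannel (proposed above, not in the spec today); the application could derive its thresholds from the reported capacity instead of hardcoding guesses:

```js
// Hypothetical: `channel.bufferSize` reports the implementation's send-buffer
// capacity in bytes. The attribute does not exist yet; this is the proposal.
function configureBackpressure(channel) {
  const capacity = channel.bufferSize;             // proposed attribute
  channel.bufferedAmountLowThreshold = capacity / 4;
  return {
    // The sender checks this before each send() instead of a hardcoded limit.
    canSend: () => channel.bufferedAmount < capacity / 2,
  };
}
```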

Please view or discuss this issue at https://github.com/w3c/webrtc-pc/issues/1148 using your GitHub account

Received on Saturday, 29 April 2017 18:24:36 UTC