Re: Proposal for API for Data

On 04/09/2012 11:31 PM, Cullen Jennings wrote:
> On Mar 15, 2012, at 10:13 AM, Harald Alvestrand wrote:
>> On 03/15/2012 02:34 PM, Cullen Jennings wrote:
>>> I had a proposal for the Data API for a while and have circulated it privately to some folks, but I had not sent it to the list because I wanted to keep the focus on the PeerConnection / JSEP stuff for a while. However, I think I probably need to send this out. I think at the high level there is general agreement that we need to be able to send and receive data much like WebSockets do, we need to be able to create and destroy channels, and we need to be able to mark a channel as reliable or non-reliable delivery. The API I'm proposing to do this is very simple.
>>> interface DataMediaStreamTrack : MediaStreamTrack
>>> {
>>>       attribute boolean reliable;
>>> }
>>> DataMediaStreamTrack implements WebSocket;
>>> This ends up getting the functions to send and receive data, as well as report errors, from WebSocket. Any library that takes a WebSocket can take a DataMediaStreamTrack. It unifies creation of a new stream to be the same as MediaStreams. So in the same way that a new video track was created, you can add a data track or remove one. It is labeled in the same way as the other tracks, and so on. It seems like it works well for data sources that are from the JS app, and it also can be used with data sources, such as a sensor or game controller, that may be more controlled by the browser, in the same way a microphone or camera is more controlled from the browser. That will allow for better real-time performance of devices that produce a data stream instead of a media stream.
>>> I'd like to discuss on one of the calls the pros / cons of such an interface, but right now I am heads down on the JSEP changes.
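The interoperability claim above ("any library that takes a WebSocket can take a DataMediaStreamTrack") can be sketched with duck typing. This is illustrative only: DataMediaStreamTrack does not exist, so a hypothetical stand-in exposing the relevant WebSocket surface (send / onmessage) is mocked here.

```javascript
// Hypothetical stand-in for the proposed DataMediaStreamTrack.
class FakeDataTrack {
  constructor() {
    this.reliable = true;   // the one attribute the proposal adds
    this.onmessage = null;
    this.sent = [];
  }
  send(data) { this.sent.push(data); }
  // test helper: simulate data arriving from the network
  _receive(data) {
    if (this.onmessage) this.onmessage({ data });
  }
}

// A library function written against the WebSocket surface...
function attachEcho(socketLike) {
  socketLike.onmessage = (ev) => socketLike.send('echo: ' + ev.data);
}

// ...works unchanged on the track: that is the interoperability claim.
const track = new FakeDataTrack();
attachEcho(track);
track._receive('hello');
console.log(track.sent[0]); // 'echo: hello'
```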
>> At one point I argued something similar....
>> the biggest worry I have with this approach is that it inherits a lot of stuff that doesn't seem to make sense, for instance:
> I think my answer to all of this is more or less "same thing as media". Something in MediaStream that does not work for data probably will not work for DTMF either.
DTMF is specified as a function you call on a media stream. What's the 
> My primary issue here is that I'd rather not have things be greatly different if they don't need to be. If there is something in MediaStreams that does not work for Data, then I am worried that it will be a problem for other media types that get done in the future that are not Audio or Video.
> I do think that, given that the primary goal of the Data interface is to reduce latency, we need to have a model that works when the data is generated directly by a device and not from the JS app. Of course we also need to support JS app generation.
>> - "mute" functions from the MediaStreamTrack: What does it mean to "mute" a data channel?
> same thing as with media - it is largely an indication that the received data will be ignored
Will the data be acked or signalled back as "lost"? How does the sender 
know which data was received and which was not? You've worked long 
enough with "graceful" vs "disruptive" close to know that the difference 
in system design between the two cases is huge.

For media, we're usually able to deal with this being unclear because it 
doesn't matter very much to the sender at which exact instant he got 
muted. For data, it's very far from clear to me that we can design 
systems that tolerate this.
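To make the ambiguity concrete, here is a small sketch of a reliable channel whose receiver "mutes" by discarding data. All names are hypothetical mocks; the point is that transport-level acks and application-level delivery diverge, so the sender cannot tell which data was actually consumed.

```javascript
// Hypothetical mock of a reliable data channel with a muted receiver.
class MockReliableChannel {
  constructor() {
    this.muted = false;
    this.acked = [];     // what the sender believes was delivered
    this.delivered = []; // what the receiving application actually saw
  }
  send(data) {
    // A reliable transport acks receipt regardless of muting...
    this.acked.push(data);
    // ...but a muted receiver discards the data at the application layer.
    if (!this.muted) this.delivered.push(data);
  }
}

const ch = new MockReliableChannel();
ch.send('a');
ch.muted = true;
ch.send('b');              // silently discarded by the receiver
console.log(ch.acked);     // ['a', 'b'] -- the sender's view
console.log(ch.delivered); // ['a']      -- what was actually consumed
```

From the sender's side, 'b' looks delivered; from the application's side, it never arrived. A protocol built on top of the channel has no way to reconcile the two views.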
>> - track manipulations: what does it mean to add a DataMediaStreamTrack to two MediaStreams?
> I don't think we have full agreement on what this means with media, but again, I would propose the same thing. I think this means both streams would get a copy of the data, yet only one copy of the data would be sent over the network.
So the "ondata()" handler will be called twice, once for each 
MediaStream? That's implementable. Is there a situation for which it 
makes sense?
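The "called twice" behaviour is implementable as simple fan-out over a handler list, as this hypothetical sketch shows (one copy arrives from the network, each stream's consumer gets a copy):

```javascript
// Hypothetical: one data track shared by two MediaStreams, with a
// handler list rather than a single onmessage slot.
class FakeSharedTrack {
  constructor() { this.handlers = []; }
  addHandler(fn) { this.handlers.push(fn); }
  _receive(data) {                            // one copy off the network...
    for (const fn of this.handlers) fn(data); // ...fanned out to each stream
  }
}

const shared = new FakeSharedTrack();
const seenByStreamA = [];
const seenByStreamB = [];
shared.addHandler((d) => seenByStreamA.push(d));
shared.addHandler((d) => seenByStreamB.push(d));
shared._receive('packet-1');
console.log(seenByStreamA, seenByStreamB); // [ 'packet-1' ] [ 'packet-1' ]
```

Whether any application actually wants two consumers of the same data bytes is the open question, not whether a browser could implement it.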
>> Perhaps nothing, since there still can be only one handler (or handler list) for the onmessage() function anyway?
>> - "url" and "extensions" attributes from the WebSocket API: Do they have meaning? Are "extensions" names from the same namespace as for WebSockets?
> I may be confused but I think the URLs only show up in the constructor and thus would not apply over to the DataMediaStreamTrack.
I'm reading this spec:

It shows:

interface WebSocket : EventTarget {
   readonly attribute DOMString url; 
<----------------------------------------------- Here

   // ready state
   const unsigned short CONNECTING = 0;
   const unsigned short OPEN = 1;
   const unsigned short CLOSING = 2;
   const unsigned short CLOSED = 3;
   readonly attribute unsigned short readyState;
   readonly attribute unsigned long bufferedAmount;

   // networking
   [TreatNonCallableAsNull] attribute Function? onopen;
   [TreatNonCallableAsNull] attribute Function? onerror;
   [TreatNonCallableAsNull] attribute Function? onclose;
   readonly attribute DOMString extensions;
   readonly attribute DOMString protocol;
   void close([Clamp] optional unsigned short code, optional DOMString reason);

   // messaging
   [TreatNonCallableAsNull] attribute Function? onmessage;
            attribute DOMString binaryType;
   void send(DOMString data);
   void send(ArrayBuffer data);
   void send(Blob data);
};

There are lots of interesting fields here. If we inherit from that 
specification, each and every field must have a defined value, even if 
it is "always NULL", and each and every function must do something 
sensible, even if it is "throw an UNIMPLEMENTED" exception.

That's the cost of using inheritance.
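The cost is easy to see in code. This is a hypothetical sketch (all names invented) of what a data track that claims "implements WebSocket" is forced to do with the inherited members that have no natural meaning for it:

```javascript
// Hypothetical: a data track forced to define every inherited
// WebSocket member, meaningful or not.
class DataTrackAsWebSocket {
  constructor() {
    this.url = '';        // no URL was ever involved -- what value goes here?
    this.extensions = ''; // WebSocket extension namespace -- does it apply?
    this.protocol = '';
    this.bufferedAmount = 0;
    this.readyState = 1;  // OPEN, presumably
  }
  send(data) { /* forward to the peer connection */ }
  close(code, reason) {
    // No WebSocket close handshake exists here; throw? no-op? undefined?
    throw new Error('not implemented for data tracks');
  }
}

const t = new DataTrackAsWebSocket();
console.log(t.url === '');  // defined, but meaningless
let threw = false;
try { t.close(); } catch (e) { threw = true; }
console.log(threw); // true
```

Every such member needs a specified answer, and "always empty string" or "always throws" are answers that every consumer of the interface then has to defend against.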


Received on Tuesday, 10 April 2012 09:29:50 UTC