- From: tim panton <thp@westhawk.co.uk>
- Date: Mon, 22 Jul 2013 17:01:30 +0100
- To: cowwoc <cowwoc@bbs.darktech.org>
- Cc: "public-webrtc@w3.org" <public-webrtc@w3.org>
On 22 Jul 2013, at 16:37, cowwoc <cowwoc@bbs.darktech.org> wrote:

> On 22/07/2013 4:10 AM, tim panton wrote:
>>> Tim,
>>>
>>> Let's take a step back.
>>>
>>> I think we both agree that the low-level API needs to be driven by the capabilities exposed by the signaling layer (not by high-level use-cases). I think we both agree that the high-level API needs to be driven by typical Web Developer use-cases. So what are we disagreeing on here?
>> I think we disagree on quite a bit. I dislike the 'low level' description. What we need is an object-orientated API that exposes a coherent set of capabilities. The webAudio API is a good example of how that can be done.
> I don't get the difference between what you're saying and what I wrote. We're talking about a low-level API that exposes capabilities and is implemented on top of the signaling layer.

I'm not talking about a low-level API any more than webAudio or the DOM are low level. Exposing a coherent set of objects that represent the underlying capabilities is not the same thing as low level. Take a look at https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html#dfn-OscillatorNode for an example of what I mean.

By contrast, the CU-RTC API's ICE abstraction _is_ low level - any API that requires javascript to do bit manipulation has gone astray, in my view.

I am _DEFINITELY_ not talking about anything to do with any signalling layers. Signalling belongs in javascript, not in the browser. I fought long and hard to avoid signalling being baked into the browser, and I have zero interest in any proposal that even hints in that direction.

Tim.
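For reference, a minimal sketch of the object-per-capability style the OscillatorNode link points at, assuming only a browser that implements the Web Audio API; the node types and parameter values are illustrative:

```javascript
// Web Audio style: each capability is an object (a node) that is created,
// configured through properties, and wired into a graph of other objects.
const ctx = new AudioContext();

const osc = ctx.createOscillator();      // OscillatorNode: a source capability
osc.type = 'sine';
osc.frequency.value = 440;               // an AudioParam, no bit manipulation

const gain = ctx.createGain();           // GainNode: a processing capability
gain.gain.value = 0.5;

osc.connect(gain);                       // compose objects explicitly
gain.connect(ctx.destination);

osc.start();                             // play a 440 Hz tone for one second
osc.stop(ctx.currentTime + 1);
```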
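And a rough sketch of the division of labour described in the last paragraph, with signalling kept in the application's javascript; the promise-based RTCPeerConnection shape, the wss://example.org/signal endpoint, and the JSON message format are assumptions for illustration, and only the offering side is shown:

```javascript
// Division of labour: the browser object handles media and ICE, while the
// application's own javascript carries signalling over any transport it
// chooses (a hypothetical WebSocket endpoint here). Offering side only.
const pc = new RTCPeerConnection();
const signalling = new WebSocket('wss://example.org/signal');

pc.onicecandidate = (e) => {
  if (e.candidate) signalling.send(JSON.stringify({ candidate: e.candidate }));
};

signalling.onmessage = async ({ data }) => {
  const msg = JSON.parse(data);
  if (msg.sdp) await pc.setRemoteDescription(msg.sdp);
  if (msg.candidate) await pc.addIceCandidate(msg.candidate);
};

async function call() {
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  signalling.send(JSON.stringify({ sdp: pc.localDescription }));
}
```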
Received on Monday, 22 July 2013 16:01:58 UTC