
Re: On the subject of complexity

From: Tim Panton <thp@westhawk.co.uk>
Date: Sun, 2 Sep 2012 10:23:34 -0700
Cc: public-webrtc@w3.org
Message-Id: <F2AF7A2F-9316-417E-823B-9F8C7362C9E4@westhawk.co.uk>
To: Anant Narayanan <anant@mozilla.com>

On 30 Aug 2012, at 14:17, Anant Narayanan wrote:

> On 8/30/12 9:54 AM, Martin Thomson wrote:
>> The problem with a black box is that there is nothing I can do to fix
>> it.  All my experience with browsers suggests that it takes a long
>> time before you can be certain that the bug has gone away for good.
>> Bugs introduced in 2001 are only now on a small enough portion of the
>> web that they can be effectively ignored (if you are feeling
>> aggressive).
> I think this statement is misleading because it implies that the very same problem does not exist with third party JS libraries.
> Sure, if you discover a bug in the JS implementation of ICE, you might be able to fix it, but I can safely assert that over 90% of the users of that library won't even know of the existence of such a bug, let alone know how to fix it, or worse, know that they need to update their includes because a critical fix was released.
> I trust browser update mechanisms far more than the popular systems used by web developers today to manage their JS library dependencies.

I'm confused by this argument. What kind of bug are we thinking of here? 

If the discussion is about peer-to-peer communication, i.e. both ends are browsers, all that is necessary is that
both ends have the same expectations of 'nICE'.

The essence of the MS proposal is that the JS implements just as much of an ICE-like state machine (call it nICE)
as is needed to satisfy its needs. In most cases the two ends will have been served the same JS from the same site - so one can assume a compatible set of bugs
(unless the browsers mis-implement the low-level API) - presumably someone will have tested it when it was released.
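To make the idea concrete, here is a minimal sketch of what such an app-level "nICE" state machine might look like. All names here (NiceAgent, addRemoteCandidate, the candidate shape) are hypothetical illustrations, not any proposed API; a real implementation would drive the browser's low-level transport API and perform actual STUN connectivity checks.

```javascript
// Hypothetical sketch of an app-supplied, ICE-like state machine ("nICE").
// Not a real API: names and the candidate format are invented for illustration.
class NiceAgent {
  constructor() {
    // new -> gathering -> checking -> connected | failed
    this.state = "new";
    this.localCandidates = [];
    this.remoteCandidates = [];
    this.selectedPair = null;
  }

  // In a real agent this would query the transport API for local addresses.
  gatherCandidates(candidates) {
    this.state = "gathering";
    this.localCandidates.push(...candidates);
    this.state = "checking";
  }

  // Called as remote candidates trickle in via the signalling channel.
  addRemoteCandidate(candidate) {
    this.remoteCandidates.push(candidate);
    this._runChecks();
  }

  // Stand-in for real STUN connectivity checks: here a pair "succeeds"
  // when both candidates share a transport protocol.
  _runChecks() {
    for (const local of this.localCandidates) {
      for (const remote of this.remoteCandidates) {
        if (local.protocol === remote.protocol) {
          this.selectedPair = { local, remote };
          this.state = "connected";
          return;
        }
      }
    }
    if (this.remoteCandidates.length > 0) this.state = "failed";
  }
}
```

The point of the sketch is the one made above: both ends run whatever subset of ICE the served JS implements, so as long as the two peers got the same script, their bugs are compatible by construction.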

There is more of an issue about interop with legacy code, where the JS ICE implementation might have bugs,
but it might also contain code that works around known bugs in legacy endpoints. If ICE is baked into the browser, then a coder who wants
to interop with a buggy (or questionable) implementation will be forced to hope that all of the browser vendors take pity on her.

As I said in a previous message, all legacy interop will require explicit tweaking/testing
in order to verify that it works against a given legacy endpoint.
There is (IMHO) a better chance that legacy interop will occur the more constrained the API is.
(SDP as it is currently defined for rtcweb is essentially unconstrained - possibly even Turing-complete ;-)
Anyone who doubts this should take a look at the continuing viability of SIPit.)
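As an illustration of that "unconstrained" point: when the API surface is an SDP blob, any behaviour the API doesn't expose ends up being implemented as ad-hoc rewriting of the SDP text itself. The toy function below (entirely hypothetical, not from any spec or proposal) reorders payload types on an m=audio line to prefer one codec - one of countless string-munging transforms SDP permits.

```javascript
// Toy illustration of SDP munging: move one RTP payload type to the
// front of the m=audio line so it is preferred in negotiation.
// Hypothetical helper - apps do this kind of rewrite precisely because
// SDP-as-API places no constraints on what they may change.
function preferPayloadType(sdp, pt) {
  return sdp
    .split("\r\n")
    .map((line) => {
      if (!line.startsWith("m=audio")) return line;
      // m=<media> <port> <proto> <fmt> <fmt> ...
      const parts = line.split(" ");
      const header = parts.slice(0, 3);
      const fmts = parts.slice(3).filter((p) => p !== String(pt));
      return [...header, String(pt), ...fmts].join(" ");
    })
    .join("\r\n");
}
```

Nothing stops a page from applying arbitrary edits like this, which is why testing against each legacy endpoint, rather than relying on the spec, is what actually establishes interop.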


> ...
>> This requires relatively little code to build as it turns out.  But
>> it's all javascript.
> I'm a big proponent of moving as many things out of the browser and into JavaScript as possible (heck, I've made several statements in the past supporting the ability to encode and decode media entirely from JS, encouraged by projects like "Broadway", the pure JS H.264 decoder).
> However, following numerous discussions with folks both inside and outside of Mozilla, I have come to conclude that in this particular case of network setup and teardown, doing so would be a mistake and a disservice to the web community at large, simply because of the nature of a network API. Equating an ICE implementation to a library like jQuery, or even Broadway, is naive at best.
> -Anant
Received on Sunday, 2 September 2012 17:24:06 UTC
