
Re: Thread-Safe DOM // was Re: do not deprecate synchronous XMLHttpRequest

From: Marc Fawzi <marc.fawzi@gmail.com>
Date: Wed, 11 Feb 2015 18:45:17 -0800
Message-ID: <CACioZiubEw3CfCM-cc+bBLtg62cbfB9Vkm08EibYjO-1YiLr-Q@mail.gmail.com>
To: Boris Zbarsky <bzbarsky@mit.edu>
Cc: public-webapps <public-webapps@w3.org>

This "backward compatibility" stuff is making me think that the web is
built upon the axiom that we will never start over, and that we must keep
piling new features and principles on top of the old ones.

This has worked so far, miraculously and not without overhead, but I can
only assume it comes at the cost of growing complexity in the browser
codebase. I'm sure you have to maintain a ton of code dealing with old
features and old ideas...

How long can this be sustained? Forever? At what point does the business
of retaining backward compatibility become a huge nightmare?

On Wed, Feb 11, 2015 at 12:33 PM, Boris Zbarsky <bzbarsky@mit.edu> wrote:

> On 2/11/15 3:04 PM, Brendan Eich wrote:
>> If you want multi-threaded DOM access, then again based on all that I
>> know about the three open source browser engines in the field, I do not
>> see any implementor taking the huge bug-risk and opportunity-cost and
>> (mainly) performance-regression hit of adding barriers and other
>> synchronization devices all over their DOM code. Only the Servo project,
>> which is all about safety with maximal hardware parallelism, might get
>> to the promised land you seek (even that's not clear yet).
> A good start is defining terms.  What do we mean by "multi-threaded DOM
> access"?
> If we mean "concurrent access to the same DOM objects from both a window
> and a worker, or multiple workers", then I think that's a no-go in Servo as
> well, and not worth trying to design for: it would introduce a lot of spec
> and implementation complexity that I don't think is warranted by the use
> cases I've seen.
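[Editorial note: the alternative to concurrent DOM access that the platform already uses is ownership plus message passing — a worker never touches the DOM, it only sends messages to the thread that owns it. A minimal sketch of that model in Rust (Servo's implementation language); `DomRequest` and `run_worker` are illustrative names, not any real browser API.]

```rust
use std::sync::mpsc;
use std::thread;

// Hypothetical message type a worker might send to the thread that
// owns the DOM; the names are illustrative, not a real browser API.
enum DomRequest {
    SetTitle(String),
    Done,
}

// The "worker" runs on its own thread and owns no DOM state at all;
// it can only describe the mutation it wants and send it over a channel.
fn run_worker() -> String {
    let (tx, rx) = mpsc::channel();

    let worker = thread::spawn(move || {
        tx.send(DomRequest::SetTitle("hello".to_string())).unwrap();
        tx.send(DomRequest::Done).unwrap();
    });

    // The "main" thread is the sole owner of the (simulated) DOM
    // state, so no locks or write barriers are needed anywhere.
    let mut title = String::new();
    for msg in rx {
        match msg {
            DomRequest::SetTitle(t) => title = t,
            DomRequest::Done => break,
        }
    }
    worker.join().unwrap();
    title
}

fn main() {
    assert_eq!(run_worker(), "hello");
}
```

Because only one thread ever holds the state, this sidesteps exactly the barrier/synchronization cost Brendan describes above — at the price of expressing every mutation as a message.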
> If we mean the much more modest "have a DOM implementation available in
> workers" then that might be viable.  Even _that_ is pretty hard to do in
> Gecko, at least, because there is various global state (caches of various
> sorts) that the DOM uses that would need to either move into TLS or become
> threadsafe in some form or something...  Again, various specs (mostly DOM
> and HTML) would need to be gone over very carefully to make sure they're
> not making assumptions about the availability of such global shared state.
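[Editorial note: "move into TLS" above means giving each thread its own private copy of caches that are global today. A minimal sketch of that pattern in Rust using `thread_local!`; `SELECTOR_CACHE` is a made-up stand-in for the kind of global cache Boris mentions, not a Gecko internal.]

```rust
use std::cell::RefCell;
use std::collections::HashMap;
use std::thread;

thread_local! {
    // Each thread gets its own independent cache: no locks needed,
    // and no cross-thread sharing is possible.
    static SELECTOR_CACHE: RefCell<HashMap<String, usize>> =
        RefCell::new(HashMap::new());
}

fn cache_insert(key: &str, val: usize) {
    SELECTOR_CACHE.with(|c| c.borrow_mut().insert(key.to_string(), val));
}

fn cache_get(key: &str) -> Option<usize> {
    SELECTOR_CACHE.with(|c| c.borrow().get(key).copied())
}

fn main() {
    cache_insert("div", 1);

    // A second thread sees an empty cache: the state was moved into
    // TLS, so nothing leaks across threads.
    let other = thread::spawn(|| cache_get("div"));
    assert_eq!(other.join().unwrap(), None);

    // The original thread still sees its own entry.
    assert_eq!(cache_get("div"), Some(1));
}
```

The trade-off is duplicated memory and cold caches per thread — part of why "have a DOM implementation available in workers" is viable but costly to retrofit.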
>> We should add lighter-weight workers and immutable data structures
> I should note that even some things that could be immutable might involve
> a shared cache in current implementations (e.g. to speed up sequential
> indexed access into a child list implemented as a linked list)...
> Obviously that sort of thing can be changed, but your bigger point that
> there is a lot of risk to doing that in existing implementations remains.
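[Editorial note: a sketch of the optimization Boris describes — a child list backed by a linked list, with a one-entry (index, node) cache so sequential `item(0)`, `item(1)`, ... calls cost O(n) total rather than O(n²). All names are illustrative, not Gecko internals; the point is that the cache is hidden mutable state attached to otherwise-immutable data.]

```rust
// Arena-backed singly linked list standing in for a child-node list.
struct Node {
    value: u32,
    next: Option<usize>,
}

struct ChildList {
    nodes: Vec<Node>,
    head: Option<usize>,
    // One-entry cache of (last index, last node): this is the shared
    // mutable state that breaks naive cross-thread immutability.
    cache: Option<(usize, usize)>,
}

impl ChildList {
    fn from_values(vals: &[u32]) -> ChildList {
        let nodes: Vec<Node> = vals
            .iter()
            .enumerate()
            .map(|(i, &v)| Node {
                value: v,
                next: if i + 1 < vals.len() { Some(i + 1) } else { None },
            })
            .collect();
        ChildList {
            head: if nodes.is_empty() { None } else { Some(0) },
            nodes,
            cache: None,
        }
    }

    // item(i) resumes the walk from the cached position when the cache
    // points at or before the requested index, instead of re-walking
    // the list from the head every time.
    fn item(&mut self, index: usize) -> Option<u32> {
        let (mut i, mut cur) = match self.cache {
            Some((ci, cn)) if ci <= index => (ci, Some(cn)),
            _ => (0, self.head),
        };
        while i < index {
            cur = self.nodes[cur?].next;
            i += 1;
        }
        let n = cur?;
        self.cache = Some((index, n));
        Some(self.nodes[n].value)
    }
}

fn main() {
    let mut list = ChildList::from_values(&[10, 20, 30]);
    assert_eq!(list.item(0), Some(10));
    assert_eq!(list.item(1), Some(20)); // resumed from the cache
    assert_eq!(list.item(3), None);
}
```

Shared across threads, that cache field would need locking, per-thread copies, or removal — which is the "lot of risk" in existing implementations.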
> -Boris
Received on Thursday, 12 February 2015 02:46:25 UTC
