W3C home > Mailing lists > Public > public-html@w3.org > July 2008

RE: Workers

From: Justin James <j_james@mindspring.com>
Date: Tue, 22 Jul 2008 00:49:24 -0400
To: "'Andrew Fedoniouk'" <news@terrainformatica.com>
Cc: "'Ian Hickson'" <ian@hixie.ch>, <public-html@w3.org>
Message-ID: <00ea01c8ebb6$50b67850$f22368f0$@com>



> -----Original Message-----
> From: public-html-request@w3.org [mailto:public-html-request@w3.org] On
> Behalf Of Andrew Fedoniouk
> Sent: Monday, July 21, 2008 5:16 PM
> To: Justin James
> Cc: 'Ian Hickson'; public-html@w3.org
> Subject: Re: Workers
> The problem with data URLs is also that they are almost unusable in a
> static HTML document. For dynamically generated HTML they are probably
> fine, but an attribute value that spans three pages is not meant to be
> read by a human.
> 
> I understand the motivation for having the Worker script placed in a
> separate document/file.
> It appears that this is the only way to avoid execution of the script in
> the main thread; it has to live in a separate document. Not perfect, of
> course, as it brings some uncertainty into the algorithm: when will it
> really start, and will it start at all?
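For illustration, the data: URL approach Andrew describes could be sketched as below. The tiny doubler script and the variable names are my own invention; the point is that even a trivial script balloons once URL-encoded into an attribute value, which is the readability problem he raises:

```javascript
// A minimal sketch, assuming a worker script embedded as a data: URL.
// This two-line script is trivial; a realistic one could run to several
// kilobytes, producing the unreadable multi-page attribute value
// described above.
var script = "onmessage = function (e) { postMessage(e.data * 2); };";
var dataUrl = "data:text/javascript," + encodeURIComponent(script);
// In a supporting UA the worker would then be spawned from this URL,
// e.g. via the createWorker() API mentioned later in this thread.
```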

I can see why this makes sense, but I find it hard to believe that browser
vendors cannot build a threading system into their JavaScript interpreters
that does this sensibly, without needing to use a separate document. They
can download files in a separate thread; I am not sure why they can't have a
multithreaded JavaScript system too.

> UAs usually have some cap on how many simultaneous connections the HTTP
> client can use.
> And indeed, "A single-user client SHOULD NOT maintain more than 2
> connections with any server or proxy." [1]
> In the case of Workers, the UA shall use some intrinsic limit too. I
> think it is enough for createWorker() to throw an exception if the
> allowed maximum is reached.

Yeah, I forgot about the 2-connection cap as well; I am curious whether that
would be waived for workers or not.
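The throw-on-limit behavior Andrew proposes could look something like the sketch below. The cap of 4 and all names besides createWorker() (which is the API name he quotes) are invented for illustration, not taken from the draft:

```javascript
// Hypothetical sketch of a UA-enforced worker cap. The limit of 4 and
// the activeWorkers bookkeeping are illustrative only.
var MAX_WORKERS = 4;
var activeWorkers = 0;

function createWorker(url) {
  if (activeWorkers >= MAX_WORKERS) {
    // The exception Andrew suggests the draft should specify.
    throw new Error("Worker limit reached");
  }
  activeWorkers++;
  return { scriptUrl: url }; // stand-in for a real worker object
}
```

A page script would then wrap createWorker() in try/catch and degrade gracefully when the limit is hit, rather than the call silently queuing or failing.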

A few more thoughts on the topic that need to be better enunciated in the
draft:

* In some cases, particularly workers created during onLoad, there will be
such severe contention for the download queue that running the script from
the remote URL will take longer than running a self-stored script.

* What happens with regard to caching? Let's say, for example, that a
mouseover operation spawns a worker, and the user is madly waving their mouse
about. Does the browser re-download the script each time? If so, you are
going to see a TON of lag. Or does it cache the script? Can the developer
request that the item be cached or override the cache? If so, what mechanism
is available for that?

* What happens if, for whatever reason, the script contents must be
generated by the server and GET is inadequate? Is there a method to do a
full POST to get the script from a server?

* What is considered a "failure" to download the script? Will the download
system follow redirects, for example?
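On the caching question above: if the spec stays silent, authors will likely end up caching workers themselves rather than spawning one per event. A sketch of that workaround, with every name invented for illustration:

```javascript
// Hypothetical author-side cache: reuse one worker per script URL so a
// rapid-fire event (e.g. mouseover) neither re-downloads the script nor
// spawns a fresh worker each time.
var workerCache = {};

function getCachedWorker(url, spawn) {
  if (!workerCache[url]) {
    workerCache[url] = spawn(url); // spawn would call the real worker API
  }
  return workerCache[url];
}
```

A mouseover handler would then call getCachedWorker with the script URL and post a message to the returned worker, instead of creating a new one per event.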
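On the POST question: one conceivable workaround, if the draft only fetches worker scripts via GET, would be to POST for the script text and then hand the response to the worker machinery as a data: URL. A sketch under those assumptions; postForScript is a hypothetical stand-in for an XMLHttpRequest POST returning the response text:

```javascript
// Hypothetical: fetch worker source with a POST, then wrap it as a
// data: URL so the script itself never needs to be GET-able.
// postForScript is an invented stand-in for an XMLHttpRequest POST.
function workerUrlFromPost(postForScript, endpoint, body) {
  var source = postForScript(endpoint, body);
  return "data:text/javascript," + encodeURIComponent(source);
}
```

Of course, this inherits every drawback of data: URLs discussed earlier in the thread, which is why a first-class answer in the draft would be preferable.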

J.Ja
Received on Tuesday, 22 July 2008 04:50:24 UTC
