[Bug 14194] Request: specification for script preloading

http://www.w3.org/Bugs/Public/show_bug.cgi?id=14194

--- Comment #7 from Kyle Simpson <w3c@getify.myspamkiller.com> 2011-09-21 18:48:52 UTC ---
(In reply to comment #4)
> > This preloads my image and (I think) keeps it in memory.
> 
> It keeps the compressed image data in memory (or on disk, of course; see
> below).  Decompression can happen synchronously when you use the image; this
> can lead to significant pauses (order of seconds) when the image is actually
> used.
> 
> Are you OK with only preloading compressed representations of scripts and
> having a noticeable decompression or fetch delay on use, like for images?

Given the choice between no preloading and preloading as you suggest, I'd
certainly pick the latter.

But that type of delay, especially on mobile (limited CPU, etc.), can largely
defeat the whole purpose of preloading, which is to reduce (as much as
possible) the delays (downloading, parsing, compiling, etc.) when you want to
synchronously execute a script.

Are you suggesting that this delay would still be incurred synchronously upon
an attempt to execute the preloaded script?
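
For context, here's a rough sketch of the flow I have in mind (the `preload`
flag and `onpreload` handler are from this proposal and are not final; treat
the exact names and the URL as illustrative):

    // Fetch (and ideally parse/compile) the script now, without executing it.
    var script = document.createElement("script");
    script.preload = true;           // proposed flag: download but defer execution
    script.onpreload = function () {
      // the script is fully fetched (and hopefully compiled) at this point
    };
    script.src = "big-library.js";   // illustrative URL

    // Much later, when the script is actually needed:
    function runWhenNeeded() {
      // Inserting the element is the proposed trigger for synchronous
      // execution. Any decompression/parse work still pending here is
      // exactly the delay I'm worried about.
      document.head.appendChild(script);
    }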

Also, I've looked at a lot of waterfall diagrams across a whole bunch of
different browsers, devices, and connections, and I've never seen anything on
the order of seconds for the time a browser spends "decompressing" a gzip'd
script. Even on large JS files (300k+), the most I've ever seen the
"processing" stage of a script take is a couple hundred ms.

Can you elaborate on any scenario in which gzip decompression would take
seconds to perform (even on low-CPU mobile devices)?
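
For what it's worth, this is the kind of measurement I mean. A minimal
Node.js sketch (the file path is hypothetical) that times raw gzip
decompression of a large script; note it measures only the gunzip step, not
parsing or compiling:

    // Time raw gzip decompression of a large script file.
    // "big-library.js" stands in for any ~300k script you want to test.
    var fs = require("fs");
    var zlib = require("zlib");

    var source = fs.readFileSync("big-library.js");
    var compressed = zlib.gzipSync(source);

    var start = Date.now();
    var decompressed = zlib.gunzipSync(compressed);
    console.log(decompressed.length + " bytes gunzipped in " +
                (Date.now() - start) + " ms");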


> > Is that memory ever cleaned up if I don't use the image?
> 
> I can't speak to other UAs, but in Gecko it's not.  Yet.  We're considering
> changing that.  Of course the OS can also cause it to be swapped to disk, in
> which case it has to be read from there too via a page fault.
> 
> Again, is that acceptable for script preloading?

I think that, strictly speaking for the purposes of this proposal, it's
"allowable", but it's certainly far, far short of ideal. It would probably
negate most (but not all) of the benefits.

It seems like keeping preloaded scripts compressed in memory misses a
significant opportunity: the browser could perform parsing and compilation
(and even static analysis, like type inference) on a preloaded script while
it sits idle, waiting to be executed sometime in the future.
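
Authors can already approximate that today by fetching the source and
compiling it ahead of time; whether the engine actually compiles eagerly is
up to the implementation, but a rough sketch of the idea (URL and names are
illustrative) looks like:

    // Fetch the script text now, compile it now, execute it later.
    var compiled = null;

    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/js/big-library.js", true);
    xhr.onload = function () {
      // new Function() forces a parse (and possibly a compile) up front;
      // nothing actually runs yet.
      compiled = new Function(xhr.responseText);
    };
    xhr.send();

    // Sometime later, when the code is actually needed:
    function executeNow() {
      if (compiled) compiled(); // execution only; parsing was paid for earlier
    }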

Moreover, what if the JS is served uncompressed in the first place? A lot of
JS on the wider web is still sent without gzip. Are you planning to compress
such scripts before keeping them in memory, thereby "charging" the client for
both the compression and the later decompression?

If not, are you saying you're OK with storing those uncompressed scripts in
memory as-is, but not the ones that were downloaded via gzip? That seems like
a strange distinction to me.


Received on Wednesday, 21 September 2011 18:48:58 UTC