[whatwg] Proposal for separating script downloads and execution

>> I don't understand why the preloading specifically would imply different
>> HTTP caching semantics than normal dynamic script loading?
>
> It doesn't have to.  It's just that if preloading is easy to trigger by
> accident and authors don't notice when they accidentally preload lots of
> stuff then we may have a problem if we don't coalesce identical-object
> (whatever that means) loads.
>
> Normal script loading doesn't have the "don't notice" issue much,
> because a typical script running is noticeable.

I'm curious whether we could apply some "limit" to the number of scripts that 
will be simultaneously preloaded, say 100 scripts. A limit high enough that 
almost all normal usages of this feature would never hit it, and yet small 
enough to prevent the run-away memory usage you're concerned about.

The way I'd see that limit working: if more scripts are requested to preload 
than the 100 limit, all the rest are simply held in a loading queue, waiting 
for the script elements to either be added (and thus execute some from the 
preload queue) or be abandoned/aborted (GC'd)... either of which frees up 
slots in the preload queue, letting the browser preload some more.
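Just to make that mechanism concrete, here's a rough sketch of how a browser 
might throttle preloads internally once the cap is hit. This is purely 
illustrative, not a proposed API; the names (requestPreload, fetchScript, 
releasePreloadSlot) and the structure are all made up:

    // Hypothetical UA-internal throttling of preloads over the cap.
    var PRELOAD_LIMIT = 100;     // or 500, or tuned by the browser
    var activePreloads = 0;
    var pendingQueue = [];       // requests over the limit wait here

    function requestPreload(url, onLoaded) {
        if (activePreloads < PRELOAD_LIMIT) {
            activePreloads++;
            fetchScript(url, onLoaded);           // made-up internal fetch
        } else {
            pendingQueue.push({ url: url, onLoaded: onLoaded });
        }
    }

    // Called when a preloaded script is finally executed (its element was
    // added) or when its preload is abandoned/aborted (GC'd). Either way a
    // slot frees up and the next waiting request starts loading.
    function releasePreloadSlot() {
        activePreloads--;
        if (pendingQueue.length > 0) {
            var next = pendingQueue.shift();
            activePreloads++;
            fetchScript(next.url, next.onLoaded);
        }
    }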

This would work conceptually very much like the simultaneous connection 
limits work right now... from the API perspective, there'd be no difference, 
but the browser would just throttle and delay any loads that go over this 
internal limit. In fact, a browser could probably be free to adjust this 
limit a bit depending on conditions like the amount of available memory on 
the computer/device, etc. I don't see any reason an author would need to 
know what the limits are, or control them, so long as the limit is never so 
low as to prevent the normal use-cases from operating as expected.
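For instance (the numbers and the function name here are made up, just to 
illustrate the kind of tuning I mean), a browser could derive the cap from 
device conditions rather than hard-coding it:

    // Purely illustrative: pick the preload cap based on available memory.
    function choosePreloadLimit(availableMemoryMB) {
        if (availableMemoryMB < 256)  return 50;
        if (availableMemoryMB < 1024) return 100;
        return 500;
    }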

To be clear, I'm not saying that no site would ever need to load more than 
100 scripts. I know there are sites out there that do. But I'm saying that I 
don't know of any sites that would need to preload that many scripts at once. 
Script loaders could quite easily begin executing the preload queue as soon 
as that localized part of the dependency graph is fulfilled, which would 
naturally keep the queue draining as more scripts are preloaded. It would be 
an extreme condition in which there truly was a dependency graph that 
required more than 100 dependencies in the cycle before the execution cursor 
could advance.
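To sketch what I mean by a loader keeping the queue drained (again, 
preload()/execute() here are imaginary loader internals, not a proposed API, 
and I'm assuming the preload notification fires asynchronously):

    // Load one "group" of the dependency graph at a time: preload the group,
    // then execute it as soon as the whole group has arrived, freeing those
    // preload slots before the next group is requested.
    function loadGroup(urls, done) {
        var remaining = urls.length;
        var handles = [];
        urls.forEach(function (url) {
            handles.push(preload(url, onPreloaded));  // imaginary loader call
        });
        function onPreloaded() {
            if (--remaining === 0) {
                handles.forEach(function (h) { h.execute(); });
                done();
            }
        }
    }

    // A -> B -> C dependency chain, each group only a handful of scripts,
    // so the preload queue never gets anywhere near the limit.
    loadGroup(["a1.js", "a2.js"], function () {
        loadGroup(["b1.js"], function () {
            loadGroup(["c1.js", "c2.js"], function () { /* app is ready */ });
        });
    });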

If 100 still seems too low, make it 500. Somewhere an order of magnitude or 
two lower than the run-away 10,000-script case... that seems like it could 
mitigate the browser vendors' fears in this area. Thoughts?



--Kyle

Received on Wednesday, 23 February 2011 05:33:59 UTC