- From: Getify <getify@gmail.com>
- Date: Sun, 17 Oct 2010 23:31:48 -0500
- To: "Adam Barth" <w3c@adambarth.com>, "public html" <public-html@w3.org>
> Can you provide examples of sites that break? It's easier to reason
> about concrete broken sites than about these things in the abstract.

I've provided several examples earlier in this thread of sites that will break either with the Mozilla change (ordering) or the WebKit change (not fetching invalid-MIME-type scripts). Briefly:

http://zappos.com
http://blog.getify.com
http://flensed.com

For these sites, the WebKit breakage will occur because LABjs relies on the ability (in IE/WebKit) to fetch a script into cache without executing it, using what I call the "preloading" trick. It works by injecting a script node with `type=script/cache`, which loads the script into cache but does not execute it. The load still fires the `onload` event, so LABjs knows when the script is safely in cache, and can then, at the proper point in the execution sequence, execute it immediately by re-injecting the same source URL with the correct `type=text/javascript`.

I understand this sounds exactly like <link rel=prefetch>, and on the surface it's true that LABjs could be changed to use that trick instead of the invalid-MIME-type trick. However, this option is bad because:

1) As I said, there doesn't seem to be a valid feature-test to detect in which version WebKit stops supporting the fetch-but-not-execute behavior, so backwards compatibility and existing sites/content will be vulnerable.

2) The "preloading" trick, whether it uses `script/cache` or `prefetch`, is far from optimal for the loading use-case in question. It has only been a hack fallback for IE/WebKit because, up until now, there hasn't been another option. And it's very susceptible to breaking if even one of the scripts being loaded via "preloading" fails to send proper cache headers.
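For illustration, the "preloading" trick described above can be sketched roughly like this (a minimal sketch, not the actual LABjs source; `preloadScript` and `executeScript` are hypothetical names):

```javascript
// Sketch of the "preloading" trick: fetch into cache with an invalid
// MIME type, then re-inject with a valid type to execute from cache.
// Function names here are illustrative, not LABjs API.

function preloadScript(src, onCached) {
  var el = document.createElement("script");
  // Invalid MIME type: IE/WebKit fetch the file into cache
  // but do not execute it.
  el.type = "script/cache";
  el.onload = function () {
    onCached(src); // the script is now in cache, safe to execute later
  };
  el.src = src;
  document.head.appendChild(el);
}

function executeScript(src) {
  // Re-inject the same URL with a valid type; the browser serves it
  // from cache (if the cache headers allow) and executes it now.
  var el = document.createElement("script");
  el.type = "text/javascript";
  el.src = src;
  document.head.appendChild(el);
}
```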
Some studies/experts in this area have suggested that as much as 70% of scripts on the web are not sent with proper cache headers. Even if that estimate is twice reality, 35% of scripts would completely break with this trick. Thus, I do not feel that "preloading", regardless of the technique used, is a valid spec'd long-term solution to the use-case.

What we need, specifically, is a way to load a list of scripts (each only once) in parallel, while being able to specify that their execution order matters (dependencies), so that order is enforced as insertion order. HTML script tags already provide this behavior; we're just asking that it also be available, opt-in, to script-inserted script nodes, specifically by setting `async` to false.

> Isn't this what defer does? I guess you're saying it's not
> performance-oriented?

`defer` tells the browser to wait until the DOM is finished before executing the scripts. Yes, it does eventually execute them in order, but pinning scripts' execution to the DOM is where it fails to meet the desired performance-optimization mark. It's also unclear whether `defer` scripts delay *download* until after the DOM is ready (some tests in some browsers suggest so) or just execution, which would be even less "performance-oriented". Moreover, it's unclear whether `defer` scripts injected long after the DOM is ready would still maintain order, etc. These questions, and others like them, are why I feel that `async=false` on a dynamically-inserted script is a better switch than `defer` for the behavior we're looking for.

> I believe tonyg said that he thought the proposed change would slow
> down a bunch of real-world web sites that use script-inserted scripts
> to achieve parallel loading in existing browsers. Generally, folks
> aren't going to be that excited about slowing down web sites.
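To make the request concrete, here is a sketch of what ordered parallel loading would look like under the proposal (assuming the spec change lands; the function name is illustrative):

```javascript
// Sketch of the proposed behavior: script-inserted scripts would
// default to async=true (unordered, fast); setting async=false
// opts in to insertion-order execution while still downloading
// in parallel. Function name is illustrative, not a real API.

function loadScriptsInOrder(srcs) {
  srcs.forEach(function (src) {
    var el = document.createElement("script");
    el.async = false; // opt in: execute in insertion order
    el.src = src;
    document.head.appendChild(el);
  });
}

// In a browser, e.g.:
// loadScriptsInOrder(["jquery.js", "plugin.js", "app.js"]);
```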
Here's why I assert that the change in question will NOT slow down existing sites in IE/WebKit: I'm arguing that it makes sense, compat-wise, for browsers (and the spec) to implement `true` as the default value of `async`, but *only* on script-inserted scripts. Of course, that makes the default the opposite of how it defaults to `false` on parser-inserted scripts, which might seem odd at first glance, but I think it is logically defensible upon examination.

Not only would that default protect all existing content in WebKit/IE that relies on unordered (faster) execution of script-inserted nodes (since the default for those scripts would still be `true`, and thus unordered), but it would also provide an effective feature-test for script loaders: check not only for the `async` property on a script node, but also for its default value being `true` (as opposed to the `false` default it has now).

    var el = document.createElement("script");
    var ft = ("async" in el && el.async === true);

--Kyle
Received on Monday, 18 October 2010 04:32:25 UTC