
Re: Executing script-inserted external scripts in insertion order

From: Henri Sivonen <hsivonen@iki.fi>
Date: Tue, 12 Oct 2010 01:18:27 -0700 (PDT)
To: Getify <getify@gmail.com>, hallvord@opera.com
Cc: public html <public-html@w3.org>
Message-ID: <962503334.48826.1286871507532.JavaMail.root@cm-mail03.mozilla.org>
Getify wrote:
> I agree that avoiding UA sniffing or inferences and getting to a point
> where we can feature-detect on this behavior is the best ultimate goal.
> However, I think the stretch of optimism to suggest that quickly Opera,
> Webkit/Chrome, *and* IE will all fall in line is a bit much -- in fact,
> I think it may be quite awhile before we get convergent behavior.

I don't expect all browsers to align quickly. That's exactly why there's a need for capability sniffing.

> Moreover, the older browsers will be around for a long time with the
> divergent behavior. As such, I think browser inferences will be the only
> practical solution for LABjs (and related projects) to pragmatically
> serve all those browsers.

I fully expect LABjs to UA sniff past browsers. My point is that, ideally, for future browsers LABjs should be able to capability sniff so that a LABjs update isn't needed every time an additional browser implements the spec. And more to the point, a situation where a JS library used by many sites needs to be updated in order for an additional browser to implement the spec is a way to make it less likely that browsers implement the spec.

> I don't think that introducing a new property/attribute ("ordered") into
> the mix will ensure any sooner convergence of behavior.


> While looking for "ordered" may provide a shorter-term feature-test, the
> same inference and forking behavior will have to be maintained for quite
> awhile until we have full convergence (and older divergent browsers are
> practically insignificant), and thus the feature-test helpfulness will
> be diminished.

So if IE and WebKit stayed on the old script/cache code path even if they made script-inserted external scripts without the async attribute execute in insertion order, what carrot would you be giving IE and WebKit to motivate them to change (other than that it would be the Right Thing to implement the spec)?

> We could just as easily say that feature-test for `async` (and
> specifically its default value, either true or false) is the
> feature-test for ordered vs. non-ordered behavior.

I hadn't considered this.

> This feature-test will fail for awhile, but once we eventually have the
> convergence (and thus inference forking is moot), that feature-test will
> be just as reliable as a feature-test on "ordered".

What reason is there to believe that the presence of the async DOM property allows you to infer anything about the default behavior for script-inserted external scripts? 
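For concreteness, the feature test being proposed in the quoted text could be sketched roughly like this (a hedged sketch; whether the default value of `async` actually predicts ordering behavior is exactly the open question here, and the function name is illustrative):

```javascript
// Sketch of the proposed feature test: inspect the default value of the
// async DOM property on a freshly created script element. Per the
// suggestion quoted above, async defaulting to false would signal
// insertion-order execution by default; true would signal unordered
// ("execute when available") by default.
function defaultExecutionIsOrdered(doc) {
  var s = doc.createElement('script');
  if (!('async' in s)) {
    return null; // async property unsupported: the test tells us nothing
  }
  return s.async === false;
}
```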

> >>> * The time after the future. We should minimize the cruft
> >>> accumulated as a side effect of getting from the present to the
> >>> future so that it doesn't haunt us after we've gotten to the future.
> Agreed. I'd say `ordered` might qualify as future cruft if `async` will
> do.

Yeah, that's why I mentioned cruft.

> >>> <jgraham> Doesn't it make more sense to make preserving order the
> >>> default
> It makes more sense to a web performance optimization person like me.
> It's the more ideal of the behaviors as compared to the non-preserving
> order by default in IE/Webkit/Chrome. In fact, I always wanted and
> assumed that the most proper (performance-savvy) behavior would be to
> have order-preservation by default,

Why would that be the performance-savvy default? It seems to me IE/WebKit behavior is more performance-savvy when there are at least two non-interacting scripts being downloaded in parallel (e.g. a site functionality script and an advertising script).

> However, I do not know if there are really other compelling use cases
> for the opposite to be default. For instance, if jQuery is injecting
> scripts and is bothered by some side-effect of them having order
> preservation, is

Now you are talking about a different thing: Script-inserted inline script maintaining insertion order relative to script-inserted external scripts.

> there REALLY a reason that jQuery couldn't just add the `async` property
> (true) to the injected scripts, and thus opt-in to the behavior desired?

But jQuery as deployed doesn't do that. If you were to propose a solution that'd require existing jQuery installations to be updated around the Web, someone else might at least as reasonably suggest that existing LABjs installations be updated around the Web.

I'm trying to find a solution that satisfies the constraints placed by existing jQuery and the constraints placed by existing LABjs. AFAICT, the jQuery constraint is about how script-inserted inline scripts execute relative to script-inserted external scripts but the LABjs constraint is about how script-inserted external scripts execute relative to each other. Thus, it seems to me that it's possible to satisfy both constraints at the same time. (By making script-inserted inline scripts not maintain order relative to script-inserted external scripts and making script-inserted external scripts with async=false maintain order among themselves.)
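The second half of that proposal would let a loader request in-order execution explicitly, roughly like this (a sketch only, with placeholder URLs, assuming the async=false behavior described in the paragraph above):

```javascript
// Sketch: insert external scripts so that they execute in insertion
// order by setting async = false on each one (the behavior under
// discussion). doc is the page's document; URLs are placeholders.
function loadInOrder(doc, urls) {
  for (var i = 0; i < urls.length; i++) {
    var s = doc.createElement('script');
    s.src = urls[i];
    s.async = false; // opt out of "execute as soon as available"
    doc.head.appendChild(s);
  }
}
```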

> Since injected scripts must always be controlled by some script logic
> that is doing the injecting, it would seem either side of the fence
> could change to start setting the property's value to the opposite of
> its default, and thus achieve what they need.
> So in that respect, I'm not sure there's a compelling argument for
> either way to be the default.

I think the default shouldn't depend on what makes sense but on what works for existing content. That is, the default isn't for the benefit of script authors. The default is for the benefit of browser users who browse to legacy content.

> FWIW, I won't live or die by whichever one turns out to be the default
> behavior, as long as the other behavior is still available to me by
> changing the `async` (or some other, I suppose) attribute.

Indeed. The features provided for future script-writing should be opt-in where they wouldn't work with existing content as defaults.

> >>> More seriously, it really doesn't make sense to make
> >>> script-inserted external scripts and parser-inserted scripts to
> >>> maintain order by default (probably not at all). Since WebKit and IE
> >>> taken together have non-trivial market share, Web authors can't rely
> >>> on the stronger ordering provided by Opera and Firefox 3.6.
> I don't particularly agree that we should bend spec to the larger market
> share.

My point is that if existing content can't rely on a characteristic, the spec doesn't need to provide that characteristic as a default.

> Those browsers aren't enjoying "better performance" on this topic
> because they chose the better behavior -- in fact, the opposite -- but
> as a side-effect of them supporting another non-standard behavior (the
> "text/cache" thing) they are able to keep pace performance-wise while
> still having the full `async` capability (without the
> property/attribute itself) if the script loader simply doesn't go to
> extra effort to maintain the order.

Maybe they aren't enjoying better performance in the case where a library like LABjs emulates in-order execution. However, they might be enjoying better performance in other (potentially more common) cases. I don't have measurement data, but on the face of it, it seems reasonable to assume IE and WebKit at least aren't putting themselves at a performance disadvantage when the execution order doesn't matter.

(BTW, both LABjs and RequireJS use "script/cache"--not "text/cache".)

> As I've said, from my perspective, the previous behavior for FF (along
> with `async=true/false` on injected scripts being respected to control
> ordering) is the more ideal and defendable behavior. I think that's
> what should be spec'd, and we should try to convince all the browsers
> why and to come in line with it.

I disagree. I think the old Gecko behavior had three characteristics that deviated from the current spec draft, so the old Gecko behavior shouldn't be considered as one either-or thing. I think two of those characteristics were clearly undesirable and the third is potentially desirable.

 1) Blocking script-inserted inline scripts when there's a pending external non-async script. This characteristic is so undesirable that it's what prompted the change in Gecko. Even if we ignored compatibility with existing content and considered only design aesthetics, I'd argue that this behavior is undesirable on design aesthetic/logic grounds, because it made a normally synchronous operation become asynchronous depending on what has happened before the operation. Absent compat forcing the contrary behavior (as is possibly the case with the window-level load event), I think operations should be either predictably synchronous or predictably asynchronous.

 2) Blocking parser-inserted scripts when there's a pending script-inserted external non-async script. This characteristic is undesirable, because it put Gecko at a performance disadvantage compared to IE/WebKit and, AFAICT, the stronger guarantee that Gecko used to provide was useless to sites, because there was no way to make use of it and still have the site work with IE or WebKit, too, since the behavior is (or am I wrong?) impossible to emulate by adding more scripting.

 3) Blocking a script-inserted external non-async script on a pending script-inserted external non-async script. As I understood from your blog, this is the only one of the three characteristics that LABjs relies on. Correct? Thus, compatibility with existing content makes it desirable. However, it's slightly undesirable in the case where there are independent legacy (i.e. legacy in the sense that the author didn't know about async and didn't set async=true) scripts loading in parallel, e.g. a site functionality script, an advertising script and an analytics script.

Considering the slight undesirability of characteristic #3 in the case of independent legacy scripts, I think it's not safe to expect that IE or WebKit would implement characteristic #3 (so that LABjs could rely on it in IE and WebKit in the future) simply because the WG specs it. In the case of WebKit, there's evidence of prioritizing performance over spec correctness (see http://weblogs.mozillazine.org/bz/archives/020267.html ) in the sense of knowingly not fixing a correctness bug when the fix would regress performance. That's why I expect you'll need to be able to show a performance carrot for characteristic #3 (e.g. showing how it would result in much better perf in the LABjs case). Of course, I'd love to see WebKit (and IE) developers prove me wrong by implementing characteristic #3 even without someone coming up with a very convincing perf carrot. ;-)

> >> >> Whether or not we added Henri's proposed opt-in feature, it seems
> >> >> like HTML5 should specify the IE/WebKit behavior for unknown type
> >> >> attribute, unless there is some compelling reason to go with the
> >> >> Gecko behavior.
> >> >
> >> > There sort of is... The gecko behavior doesn't involve making a
> >> > network request for data you then plan to do nothing with, right?
> >> > Whereas the IE/Webkit behavior involves doing just that, as far as
> >> > I can see.
> >>
> >> Is it common to use unknown script types on external scripts for
> >> purposes other than preloading the script (as these libraries seem
> >> to do)? I guess there might be pages with unconditional includes of
> >> vbscript, but I don't know how typical this is.
> Actually, there is another interesting use case somewhat unrelated to
> performance-oriented script loaders for why the "text/cache"
> fetching/caching behavior is helpful. Some templating techniques,
> including jQuery's templates as recently being released and made more
> popular, use a script node in the HTML with a fake type to store
> content such as HTML templates, which can later be fetched (by `id`)
> and used by JavaScript for presentation purposes.

These are parser-inserted inline scripts with unexecutable types, right?
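The pattern being described works roughly like this (a sketch; the type value, id, and `{name}` placeholder syntax are illustrative, not any particular library's). Given markup such as `<script id="row-tmpl" type="text/x-template">Hello, {name}!</script>`, the element is inert because the type is unexecutable, and script later reads its content back:

```javascript
// Sketch of the fake-type template pattern: a parser-inserted script
// element with an unexecutable type is never run, so its text can be
// fetched by id and used as a template.
function renderTemplate(doc, id, data) {
  var source = doc.getElementById(id).text; // inert script content
  return source.replace(/\{(\w+)\}/g, function (match, key) {
    return key in data ? data[key] : match;
  });
}
```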

> If the standard were to be extended that fake mime-types (or even,
> exactly "text/cache") should download the contents into the script
> element (and the cache) but not try to automatically parse them, then
> users of such templating tools could load templates from external
> resources instead of inlining them into the markup. This of course has
> benefits for caching (performance), etc.
> To be clear, for that use case to be served, not only would the
> resource need to be fetched by the "text/cache" node, but also made
> available to script via the `text` property of the node.

There's no chance that such a feature would be supported cross-origin without CORS, because it would cause cross-origin information leaks that aren't already present and unremovable for compatibility. (If loading scripts cross-origin without CORS could be killed without breaking existing content, browsers would probably have killed it already to enhance security.) With CORS for cross-origin or without CORS for same-origin, XHR is already further along being supported than what you propose.

> But keep in mind, the "preloading" use case in LABjs has some problems
> of its own (double-loads of resources if cache headers were sent out
> wrong, etc), and is *really* just a hack to get around that
> IE/Webkit/Chrome don't support the more desired behavior from
> FF/Opera. I wouldn't argue strongly for standardization of that
> behavior (unless from the templating use case perspective), and
> certainly not to the exclusion of getting a straightforward fix for
> "ordering" of injected scripts that avoids the hackiness of this
> solution.

It seems to me that supporting existing content is the only reasonable reason to spec fetching scripts that have type="script/cache".
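For context, the preloading hack in question works roughly like this (a sketch; it relies on the non-standard IE/WebKit behavior of fetching, but not executing, external scripts with an unknown type):

```javascript
// Sketch of the "script/cache" preload hack: first fetch the script with
// a fake type (downloaded but never executed in IE/WebKit at the time),
// then, when it is this script's turn to run, re-insert it with the real
// (default) type so it executes from the HTTP cache. If cache headers
// are wrong, this double-loads -- one of the problems noted above.
function preload(doc, url) {
  var s = doc.createElement('script');
  s.type = 'script/cache'; // unknown type: fetched, not executed
  s.src = url;
  doc.head.appendChild(s);
}
function execute(doc, url) {
  var s = doc.createElement('script');
  s.src = url; // ideally served from the cache this time
  doc.head.appendChild(s);
}
```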

Hallvord R. M. Steen wrote:
> We've arrived at the current implementation through years of trial,
> error and broken sites (some important like aol.com and CNet) so we're
> naturally worried about making even small changes here.

Do you remember why aol.com or CNet broke and why they did not break in IE or WebKit?

> It does not make much sense to me to implement one attribute to opt out
> of source order execution ('async') and another one to opt IN because
> the default behaviour isn't default after all.. ;)


> > "Please leave your sense of logic at the door, thank." :-)
> Not sure about that - script execution is now getting very complicated
> from an author point of view, and ease of authoring is one of our main
> concerns..

Authors can't really rely on defaults that make stronger guarantees (without UA sniffing) when IE and WebKit (as deployed) don't implement those guarantees.

> In other words, if web authors don't follow his advice on improving
> performance correctly they won't benefit from improved performance.. ;)

Web authors might not follow advice correctly. Film at 11. :-(

> > It doesn't make sense to make script-inserted inline script maintain
> > order relative to script-inserted external scripts by default. The
> > reason is that it's bad when a typically synchronous operation
> > becomes asynchronous depending on what has happened before. In the
> > typical case, a script-inserted inline script is evaluated
> > synchronously. However, if order is maintained as in Firefox 3.6, the
> > operation becomes asynchronous if there happens to be a pending
> > external script.
> It's not ideal, but over the years we (Opera) have concluded that this
> is required for web compat reasons,

How can it be required for Web compat when IE and WebKit don't do it? I could buy an explanation that enough sites put IE on a different code path from everyone else, but if that were the explanation, how could WebKit have gotten away with cloning IE rather than Gecko?

> and we (the WG) have already solved this problem by spec'ing the async
> attribute. For example jQuery's global eval method can simply set async
> on the created script, and it won't be affected by pending external
> DOM-inserted scripts after all. This would fix your Twitter widget bug
> (585620), right?

That would be a fix for newly-authored content. It wouldn't be a fix for supporting existing content.

To me, it makes no sense to argue Web compat as a reason to implement a behavior in one sentence and then in the next sentence suggest site changes as a solution for addressing compat issues arising from that behavior.

> We could specify that the async attribute reflects how the browser will
> execute the script? For example, document.createElement('script').async
> evaluates to false in WebKit (not yet supported in IE AFAIK - not
> tested in IE9 though).
> var tmp = document.createElement('script');
> tmp.src = 'foo';
> if (tmp.async == true) {
>   // loading of script-inserted external scripts is unordered by default
> }

async defaults to false in WebKit nightlies, so if they ship a release, the test you suggest will de facto become unreliable.

> How long did it take you from releasing the relevant beta build to the
> Twitter widget bug report?

Roughly one month from the start of the beta exposure.

The HTML5 parser was enabled by default on 2010-05-03. The first beta it shipped in was released on 2010-07-06. The bug was filed on 2010-08-09.

Henri Sivonen
Received on Tuesday, 12 October 2010 08:19:05 UTC
