
Re: cloneNode() and script execution

From: Jonas Sicking <jonas@sicking.cc>
Date: Tue, 20 Jan 2009 01:41:11 -0800
Message-ID: <63df84f0901200141w621e7441y449b35dbd8e74f40@mail.gmail.com>
To: "Maciej Stachowiak" <mjs@apple.com>
Cc: "Hallvord R. M. Steen" <hallvord@opera.com>, "Boris Zbarsky" <bzbarsky@mit.edu>, "Preston L. Bannister" <preston@bannister.us>, public-html@w3.org

On Mon, Jan 19, 2009 at 6:44 PM, Maciej Stachowiak <mjs@apple.com> wrote:
>
>
> On Jan 18, 2009, at 10:26 PM, Hallvord R. M. Steen wrote:
>
>>
>>> If you change the src attribute on a script node, should the new script
>>> be executed? Yes - otherwise, why bother?
>>
>> Not current practice according to my tests (except for Safari, which
>> happily re-executes many more scripts than the rest of us - perhaps Safari
>> just hasn't come across those compat problems yet? Pure luck? ;-) )
>
> We haven't run into issues with this as far as I know but I am willing to
> believe they exist, and we are not wedded to the current behavior.
>
> One thing I am curious about is what exactly the Trident behavior is here,
> since apparently Gecko's behavior is an attempt to partly emulate it.

We weren't really shooting to emulate anyone. I did some initial
testing against IE7 back when I wrote a lot of this code and concluded
that their behavior with regard to script tags was so awkward that I
really didn't want to emulate it. So instead I opted for what made the
most sense to me and hoped it wouldn't generate any compatibility
issues. So far none have appeared.

The only case I can think of off the top of my head where I feel we're
doing something illogical solely to emulate someone else is setting
.innerHTML: during the innerHTML setter, any script loading is
disabled. I believe this is something everyone does, and at this point
it is required for web compatibility. (And if everyone disagrees and
thinks this makes perfect sense, good for them.)
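To illustrate the rule described above, here is a minimal sketch in plain JavaScript - toy objects, not real DOM code, with invented names like parseViaInnerHTML - of how an "already started" flag can model it: scripts created by the innerHTML parser are pre-flagged and therefore never execute, while a script element created through the DOM API has the flag unset and runs on insertion.

```javascript
// Toy model of the behavior above: the innerHTML fragment parser marks
// every script it creates as "already started", so inserting it into the
// document executes nothing; a DOM-created script still runs.
function makeScript(text) {
  return { text, alreadyStarted: false, ran: false };
}

// Hypothetical stand-in for the innerHTML setter's parser.
function parseViaInnerHTML(text) {
  const s = makeScript(text);
  s.alreadyStarted = true; // parser-inserted scripts are disabled
  return s;
}

// Hypothetical stand-in for inserting a script element into a document.
function insertIntoDocument(script) {
  if (!script.alreadyStarted) {
    script.alreadyStarted = true;
    script.ran = true;
  }
}

const fromInnerHTML = parseViaInnerHTML("doSomething()");
insertIntoDocument(fromInnerHTML); // stays inert
const fromDOM = makeScript("doSomething()");
insertIntoDocument(fromDOM); // executes
```

This is only a model of the flag semantics; real engines implement the rule inside the parser and the script element itself.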

On Mon, Jan 19, 2009 at 5:54 PM, Hallvord R. M. Steen
<hallvord@opera.com> wrote:
> On Tue, 20 Jan 2009 03:29:05 +0900, Jonas Sicking <jonas@sicking.cc> wrote:
>
>>> So the proposal is that if the <script> has no @src and a shallow clone
>>> is
>>> done the clone should be allowed to execute if someone subsequently adds
>>> kids or an @src to it, right?
>
> Yes, basically. The reasoning being that when you don't in fact copy the
> script that has run, a "was executed" flag on the new empty node sort of
> makes no sense.

I agree that on the surface it makes little sense to copy the 'was
executed' flag in this case. However, I don't think the use case is
strong enough to warrant extra exceptions and extra code. It's IMHO
much simpler to be consistent and always copy the 'was executed' flag.
Would you also apply the same logic to a deep clone where the script
node had no children, for example because the children had been
removed after the node executed?

Put another way, the argument against always executing a clone was
that someone is likely to do it by accident. The same argument could
be applied here. If someone does a shallow cloneNode on <script>, it's
quite possible that they are doing it as part of a custom tree clone
that walks a tree and does shallow clones recursively except in a few
places to modify the resulting tree.

Basically I think that it's somewhat strange to interpret the 'deep'
argument as anything but what the DOM Core spec says, that it should
clone the children as well as the node. If we do want to make
exceptions to this rule I think we should have a good reason for it.
And given the weak use cases that I can think of for cloning <script>
nodes at all, I don't think we have a good reason here.
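The consistent rule argued for above - always copy the 'was executed' flag, whether the clone is shallow or deep - can be sketched with a small toy model in plain JavaScript (not real DOM code; the names are invented for illustration):

```javascript
// Toy model: cloneNode copies the 'was executed' flag unconditionally,
// so any clone of a script that already ran stays inert, even if
// content is added to it afterwards.
function makeScript(text) {
  return { text, wasExecuted: false };
}

// A script runs on insertion only if its flag is still unset.
function executeOnInsert(script) {
  if (!script.wasExecuted) {
    script.wasExecuted = true;
    return true; // the script ran
  }
  return false; // inert: flag was already set
}

// 'deep' controls whether children (here, the text) are copied;
// the flag is copied either way.
function cloneScript(script, deep) {
  return { text: deep ? script.text : "", wasExecuted: script.wasExecuted };
}

const original = makeScript("doSomething()");
executeOnInsert(original); // runs, sets the flag

const shallow = cloneScript(original, false);
shallow.text = "doSomethingElse()"; // adding content later...
executeOnInsert(shallow); // ...still does not run

const deep = cloneScript(original, true);
executeOnInsert(deep); // also inert
```

The point of the model is the single rule: one copy step for the flag, no special case for shallow clones or for scripts whose children were removed.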

/ Jonas
Received on Tuesday, 20 January 2009 09:41:52 GMT
