Re: Do we need the restrictions on the <base> element?

Boris Zbarsky schreef:
>> Also, in your case of image swapping, there is network overhead for 
>> retrieving the new image.
>
> No, there isn't.  Once both of the images being swapped between are 
> loaded (during pageload), the only overhead is painting the image -- 
> both images are already stored in decoded form in platform bitmaps, 
> usually in the graphics card's memory.  So the painting is quite fast.
>
>> This far outweighs the insignificant extra time it takes to perform 
>> the O(log n) operation for resolving the baseURI.
>
> Do you have actual data to back this up?  Or just feelings?  I have 
> profiles showing that the baseURI issue is a significant component of 
> image swapping performance in Gecko.

Of course an O(log n) operation is going to be slower than an O(1) 
operation. However, is the impact significant enough to ever be noticed 
by the user? I mean, in what scenarios do images swap? They swap when 
the user clicks a button in a picture viewer, or when the user hovers 
over something. Both of these are the result of user interaction, and 
the computer is so much faster than the user that a difference here is 
not going to be noticed, even if the script swaps, say, 100 images. The 
other case I can imagine is an animation, for which it also doesn’t make 
sense to go faster than, say, 60 frames per second at the most. Again, 
the time that this recursive lookup takes is a fraction of one 60th of a 
second.
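
For instance, a typical hover swap amounts to something like this (the 
element id and image file names are made up):

  // A typical hover-based image swap. Both images are preloaded, so
  // the swap only re-resolves the URL and repaints.
  var button = document.getElementById('menuButton');

  var preload = new Image();         // preload the hover state so no
  preload.src = 'button-hover.png';  // network request happens on swap

  button.onmouseover = function() { button.src = 'button-hover.png'; };
  button.onmouseout  = function() { button.src = 'button.png'; };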

By the way, blitting a bitmap still seems a much slower operation than 
calling a function recursively a dozen or so times.

As for test data, the only way I can test this is in JavaScript. I’m 
not sure whether any performance data from there is interesting to you. 
But I do not think that looping a baseURI lookup a few hundred thousand 
times and looking at the performance difference is an accurate 
reflection of reality.
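
(For reference, the kind of loop I mean is something like the 
following; the loop count is arbitrary:)

  // Naive micro-benchmark: read baseURI in a tight loop and measure
  // the elapsed time with Date.
  var el = document.getElementsByTagName('img')[0];
  var start = new Date().getTime();
  for (var i = 0; i < 300000; i++) {
    var uri = el.baseURI;
  }
  alert('300000 lookups: ' + (new Date().getTime() - start) + ' ms');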

In fact, contrary to what I thought earlier, this already seems to be 
implemented in Firefox. So if this change from O(1) to O(log n) were 
such a big problem, then why is it already implemented? I can only 
conclude that it is not a problem.

>> * Nowhere is it specified that this should actually happen. *
>
> If you're willing to accept hysteresis in your DOM, sure.  But 
> generally, it's accepted that as much as possible the rendering of the 
> document should reflect the state of the DOM, not how the DOM got into 
> that state.  If you abandon this principle, all sorts of 
> "optimizations" become possible.  ;)
>
> And yes, I know that <html:base> already violates this principle. This 
> is for compat reasons as much as anything else, harking back to the 
> days of Netscape and its stack-based non-DOM handling of HTML.

Well, if you think it’s acceptable for html:base :). It’s a pragmatic 
solution, I admit, but if that’s the only thing that works for browser 
vendors then it’s better than no solution, and I don’t think it goes 
directly against any specification. And HTML5 can take on the role of 
specifying the desired behaviour for the HTML DOM.

By the way, are there really any sites that depend on the current 
behaviour where the images do not change when <base>’s href attribute 
is changed? It seems like an unlikely operation in the first place.
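
That is, I doubt many pages do something like this (URL made up):

  // Change the document base after load, then re-set an image's URL.
  // Whether the image then resolves against the new base is exactly
  // the behaviour under discussion.
  var base = document.getElementsByTagName('base')[0];
  base.href = 'http://example.com/images/';

  var img = document.images[0];
  img.src = img.getAttribute('src');  // force the URL to be re-resolved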

>> The current implementation of Firefox does not resolve baseURI 
>> dynamically. Instead, it seems baseURI is a constant which is set on 
>> every element while parsing.
>
> That's certainly not the case for XML nodes in Firefox.  For HTML, 
> this is true.

Oh, you’re right. I must have confused my XHTML and HTML test cases somehow.

>> because 1. the current method has an initial O(n) operation
>
> It doesn't, since the base URI is stored on the document.

No, this initial cost is also there when xml:base is used. So it can’t 
be that the base URI is only stored on the document.

>> Finally, if you really want to you can optimise the whole baseURI 
>> implementation by making the getter check for a global flag which 
>> indicates whether "xml:base" is used anywhere.
>
> Yep, so that any page that uses it gets an immediate performance hit 
> (just like mutation events).  But is that desirable?  Again, authors 
> aren't happy when DOM methods start being 2-3 times slower just 
> because someone stuck a single attribute somewhere in the document.

Well, that is assuming that you let the images re-evaluate their URLs.

But either way, if they’re using that functionality, then that’s what 
happens. You optimise where you can, especially when it comes to 
avoiding performance regressions in existing documents. I’d say that’s 
a good thing.

Unless you have a proposal for how to implement xml:base-like 
functionality without any performance impact?
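
To illustrate the flag idea in JavaScript (a rough sketch; in reality 
this would of course live inside the browser, and resolveURI() and the 
flag name are made up):

  var XML_NS = 'http://www.w3.org/XML/1998/namespace';
  var xmlBaseUsed = false;  // flipped to true the first time an
                            // xml:base attribute enters the document

  function getBaseURI(node) {
    if (!xmlBaseUsed)
      return node.ownerDocument.baseURI;  // fast O(1) path

    if (node.nodeType == 9)  // Document node, recursion ends here
      return node.baseURI;

    // Slow path: recurse towards the root, resolving any xml:base
    // attributes encountered along the way.
    var inherited = getBaseURI(node.parentNode || node.ownerDocument);
    var attr = node.getAttributeNS
             ? node.getAttributeNS(XML_NS, 'base') : null;
    return attr ? resolveURI(inherited, attr)  // resolveURI() made up
                : inherited;
  }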

>> 3. Not support dynamic operations of any kind on base URIs. This is 
>> the current behaviour of Firefox.
>
> That statement is false, certainly for XHTML, given that Firefox does 
> support xml:base for it.

You’re partially right, I confused my test cases. In XHTML, when the 
xml:base attribute is changed, links etc. are indeed properly updated 
when hovered or activated, and the baseURI property returns the new 
value as well. However, re-setting an image’s URL does not result in the 
new baseURI being used (unlike Internet Explorer and Opera), and 
modifying the <base>’s href doesn’t have any effect either (also unlike 
Internet Explorer and Opera). This was probably the source of my 
confusion.
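
For reference, the test amounts to something like this (the ids and 
URL are made up):

  // Change xml:base on an ancestor, then see which base URI is used.
  var XML_NS = 'http://www.w3.org/XML/1998/namespace';
  var container = document.getElementById('container');
  container.setAttributeNS(XML_NS, 'xml:base',
                           'http://example.com/other/');

  alert(container.baseURI);  // reflects the new value in Firefox

  // But re-setting the src does not make the image load from the new
  // base URI in Firefox (unlike in Internet Explorer and Opera).
  var img = document.getElementById('picture');
  img.src = img.getAttribute('src');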


~Grauw

-- 
Ushiko-san! Kimi wa doushite, Ushiko-san nan da!!
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Laurens Holst, student, university of Utrecht, the Netherlands.
Website: www.grauw.nl. Backbase employee; www.backbase.com.
