- From: Joseph Kesselman <keshlam@us.ibm.com>
- Date: Tue, 20 Mar 2001 15:06:35 -0500
- To: www-dom@w3.org
> I am not sure I agree with that. As soon as an application uses a class
> factory to create subclasses at runtime, I do not necessarily know the
> classes that I need to react on.

The same problem exists with Visitor, though. A Visitor requires a callback method for every different response that accept() might want to issue. This means that every time you add a new response you must update all your Visitors to be able to handle it -- or do runtime type matching on the Visitor to determine whether or not it can support this case, and figure out what to do if it doesn't. To me that's much the same coding problem, merely redistributed somewhat.

> The main advantage, as I see it, of the visitor pattern is to use
> actual object type at runtime

Note that object _type_ may not be the right thing to test -- see past discussion of why RTTI is not reliable/portable for distinguishing node types. Note too that if you start adding your own nodeTypes to the DOM, you've stepped outside the space defined by the spec and all bets are off. So this set of constants is, in fact, well defined at compile time for a compliant DOM.

Visitor does allow you to use the "actual object role", admittedly. If which callback you were making depended not only on the nodeType but also on the namespace and localname, subclassing the node and defining the Visitor pattern's accept() method appropriately might indeed yield faster code than testing all of these each time. But see the first point above; you now need a Visitor which knows about additional callbacks, which means you aren't interoperable with Visitors which know only about the basic DOM, which in turn seems to me to defeat much of the purpose of abstract Visitors in the first place. I can see doing this as an implementation-specific optimization; I'm not sure I see it as something general enough that it belongs in the standard DOM spec.

[Re Visitor versus Traverse-and-dispatch:]

> I am not sure I understand you here. Do you refer to dispatch in node
> classes with children to be pre- or postorder traversal?

Traversal abstracts which group of nodes should be examined, and in what order, which we agree is half the role of Visitor. My point is that by having the loop that does the traversal also do the dispatching, you keep the ability to do very different dispatching off the same object tree -- whereas if you build the dispatch into the objects via accept(), you have to do a significant amount of kluging if the distribution of responses desired for this particular task doesn't match the distribution hard-coded into the nodes.

> The main use case I am interested in is DOM-based applications where the
> DOM classes are subclassed heavily to provide application-specific
> functionality. In this case, I believe, the case for visitors is very
> strong.

I think I really need something specific to look at, to compare with the Traverse-and-dispatch case; the examples I've come up with so far seem pretty evenly matched between the two.

> This is aggravated in situations where the DOM engine in my application
> does not know about the classes it might encounter at runtime, because
> they are generated by a class factory, and they may even be supplied by
> plugin.

That seems to be in the Embedded DOM space rather than Core DOM behavior. But your mileage may vary.

______________________________________
Joe Kesselman / IBM Research
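[Editor's note: the two shapes being contrasted in this thread can be sketched roughly as below. This is an illustrative example, not code from the discussion; the class and interface names (DispatchDemo, DomVisitor, render) are invented for the sketch. The switch dispatches on the compile-time nodeType constants that the post argues are well defined for a compliant DOM, while the hypothetical Visitor interface shows why adding a response category forces every Visitor to change.]

```java
import org.w3c.dom.*;
import javax.xml.parsers.DocumentBuilderFactory;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

// Hypothetical Visitor interface, NOT part of the DOM spec: one callback per
// response category. Adding a new category means adding a method here, which
// breaks every existing implementation -- the problem described above.
interface DomVisitor {
    void visitElement(Element e);
    void visitText(Text t);
}

public class DispatchDemo {

    // Traverse-and-dispatch: the loop owns both the walk (preorder here) and
    // the dispatch, switching on the nodeType constants. A different task can
    // run a different loop over the same tree with a different switch.
    static void traverse(Node n, StringBuilder out) {
        switch (n.getNodeType()) {
            case Node.ELEMENT_NODE:
                out.append('<').append(n.getNodeName()).append('>');
                break;
            case Node.TEXT_NODE:
                out.append(n.getNodeValue());
                break;
            default:
                break; // comments, PIs, etc. ignored by this particular task
        }
        for (Node c = n.getFirstChild(); c != null; c = c.getNextSibling()) {
            traverse(c, out);
        }
    }

    // Convenience wrapper: parse a small XML string and render open tags
    // and text content via the dispatch loop above.
    static String render(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(
                        xml.getBytes(StandardCharsets.UTF_8)));
        StringBuilder out = new StringBuilder();
        traverse(doc.getDocumentElement(), out);
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(render("<root>hi<child>there</child></root>"));
    }
}
```

Note that the dispatch loop never needs the nodes to cooperate: it works on any compliant DOM, regardless of which implementation classes a factory hands back, whereas the accept()/DomVisitor pairing requires every node subclass and every Visitor to agree on one fixed callback set.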
Received on Tuesday, 20 March 2001 15:06:43 UTC