The more I read about BOS, ilink storage strategies, multi-pass XML parsing
etc., the more I see a tension between client-side and server-side
implementation of XML processing.
I mean, an XML document - as authored - might have ilinks spread
willy-nilly, but that does not mean it has to be served to the client in that
form. Equally, AF processing could be organised for maximum authoring
convenience in the "raw" XML document and then SPAMM'ed to attributes before
the client sees it.
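As a rough sketch of what that server-side "SPAMMing" might look like: the
server materialises the attributes an architectural form merely implies, so
the client sees only explicit attributes. The element names and the AF_MAP
table below are purely illustrative assumptions (a real system would derive
them from the DTD/meta-DTD, not a hard-coded dict):

```python
import xml.etree.ElementTree as ET

# Hypothetical architectural-form mapping: which client-meaningful
# attributes each raw element name implies. Illustrative only - in
# practice this would come from the DTD / meta-DTD.
AF_MAP = {
    "chap": {"AF": "section", "level": "1"},
    "sec":  {"AF": "section", "level": "2"},
    "p":    {"AF": "para"},
}

def cook(raw_xml: str) -> str:
    """Materialise implied AF attributes so the client need not
    do any architectural processing of its own."""
    root = ET.fromstring(raw_xml)
    for elem in root.iter():
        for name, value in AF_MAP.get(elem.tag, {}).items():
            # An attribute the author set explicitly wins over the default.
            elem.attrib.setdefault(name, value)
    return ET.tostring(root, encoding="unicode")

raw = "<chap><sec><p>Hello</p></sec></chap>"
print(cook(raw))
```

The client then needs nothing beyond a plain XML parser: every attribute it
might care about is already spelled out in the instance.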
The disadvantages of server-side processing have already been discussed on this
list. The advantages in terms of simplicity for XML browser tools, however,
would surely be significant.
What I am trying to get at is that XML docs as seen by the client can either
be "raw" XML or "cooked" XML that simplifies (and speeds up!) client-side
implementation. Cooked in terms of AFs. Cooked in terms of BOS. Cooked in
terms of ...
The transition from raw to cooked - if done at the server side - does not
complicate the client. Simple client - more complex servers. Having said
that, BOS (I like DJD's term "document working set") or multiple
parsing passes over XML docs will be, IMO, kinda tough on the client end.
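To make the "cooked BOS" idea concrete: the server could pre-assemble the
document working set by splicing each out-of-line link target inline, so the
client does one parse of one document instead of chasing links itself. A
minimal sketch, assuming a hypothetical `ilink` element with an `href`
attribute and a toy in-memory document store (none of which is a standard):

```python
import xml.etree.ElementTree as ET

# Toy document store standing in for the server's file system.
# Element names (doc, ilink, href) are illustrative assumptions.
STORE = {
    "glossary.xml": "<doc><term>BOS</term></doc>",
}

def assemble_working_set(xml_text: str) -> str:
    """Resolve each out-of-line link into inline content so the
    client receives one self-contained document: one parse, no
    BOS assembly on the client end."""
    root = ET.fromstring(xml_text)
    for parent in root.iter():
        # Snapshot the children so we can splice while enumerating.
        for i, child in enumerate(list(parent)):
            if child.tag == "ilink":
                target = ET.fromstring(STORE[child.get("href")])
                parent[i] = target  # replace the link with its target
    return ET.tostring(root, encoding="unicode")

main = '<doc><p>See <ilink href="glossary.xml"/></p></doc>'
print(assemble_working_set(main))
```

The cost lands where the cycles are: the server burns time assembling the
working set once, and every client is spared that work.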
Analogies abound. The first that springs to mind is the difference between
a networked database application that downloads records looking for matches
and one that sends SQL to the server and retrieves matching records only.
Whether or not cooked XML is an option seems to me to depend on what
clients are advised to do with XML docs over and above browsing. Are they
intended to be thrown away NC style or harvested once for local re-use? For
throwaway docs, cooked is fine. For harvesting, cooked is, perhaps, far from ideal.
Sean Mc Grath
(Member of the XML-KISS initiative).