- From: Mark Nottingham <mnot@mnot.net>
- Date: Mon, 25 Nov 2002 14:08:15 -0800
- To: "Tim Bray" <tbray@textuality.com>, "Paul Grosso" <pgrosso@arbortext.com>
- Cc: <www-tag@w3.org>
> I think the Web Services community ought
> to be real nervous about flying in the face of an IETF BCP.

Why? It's a Best Current Practice, and it is explicitly scoped to the use of XML within *IETF* protocols. It's just that community's best current thinking (words taken directly from RFC 2026) on a particular topic, and it may (dare I say probably will) change.

Granted, if there were a desire to make it possible to use Web services as the basis of IETF protocols, there might be cause for concern; however, that isn't a requirement I've heard for a while, and I imagine the IETF isn't terribly inclined to start using them willy-nilly either; Web services have a different audience.

> - On the other hand, I think it's entirely reasonable that SOAP agents
> be forbidden from being required or expected or even allowed to charge
> off fetching DTDs or external parameter entities at run time, for
> obvious performance and security reasons.

This gets to the heart of the question, I think: can specific applications of XML restrict the use of syntactic mechanisms that XML itself defines? The answer seems to be "yes," and I don't think it's supportable to be selective about this, at least at this granularity.

> - That granted, forbidding an internal subset seems kind of dumb.
> Speaking as an XML processor implementor, the extra code required is
> hardly detectable and the performance gain not significant.
> Furthermore, every XML processor in the world just silently does the
> internal subset and it's going to cost *extra work* for SOAP
> implementations to check that they haven't. I.e., you can't use an
> ordinary off-the-shelf non-validating XML processor.

Perhaps the WG has a good reason for this prohibition; have they been asked?
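To make the "extra work" concrete, here is a minimal sketch of how an implementation might refuse the internal subset while still using a stock non-validating parser; Python's xml.sax and its lexical-handler hook are assumptions chosen for illustration, not anything the SOAP spec prescribes. The parser reports the DOCTYPE declaration as an event, and the handler aborts before any DTD processing happens:

```python
# Illustrative sketch only: reject any DOCTYPE (and hence any internal
# subset) using an ordinary non-validating SAX parser. The function and
# handler names are assumptions, not taken from any spec.
import io
import xml.sax
from xml.sax.handler import property_lexical_handler

class RejectDoctype:
    """Lexical handler whose only job is to abort on <!DOCTYPE ...>."""
    def startDTD(self, name, public_id, system_id):
        raise xml.sax.SAXException("DOCTYPE not permitted in SOAP message")
    # The remaining lexical events are of no interest here.
    def endDTD(self): pass
    def comment(self, content): pass
    def startCDATA(self): pass
    def endCDATA(self): pass

def parse_soap_envelope(data: bytes) -> None:
    parser = xml.sax.make_parser()
    # Hook the lexical handler so the DOCTYPE is seen (and refused)
    # before the parser does any DTD work at all.
    parser.setProperty(property_lexical_handler, RejectDoctype())
    parser.parse(io.BytesIO(data))

# A message carrying an internal subset is refused up front:
try:
    parse_soap_envelope(b'<?xml version="1.0"?>'
                        b'<!DOCTYPE e [<!ENTITY x "boom">]><e>&x;</e>')
except xml.sax.SAXException as err:
    print("rejected:", err)
```

The point stands either way: this is code a SOAP stack must add on top of an off-the-shelf processor, since the processor itself would happily consume the internal subset.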
Received on Monday, 25 November 2002 17:12:23 UTC