- From: Champion, Mike <Mike.Champion@SoftwareAG-USA.com>
- Date: Mon, 8 Jul 2002 18:03:59 -0600
- To: "'xml-dist-app@w3.org'" <xml-dist-app@w3.org>
> -----Original Message-----
> From: Paul Prescod [mailto:paul@prescod.net]
> Sent: Monday, July 08, 2002 6:53 PM
> To: gtn@rbii.com; 'xml-dist-app@w3.org'
> Subject: Re: FW: LC Comments: Web Method Feature
>
> We are not trying to stop people from
> solving problems. We are trying to encourage them to solve
> them *in the most interoperable way*.

Understood. Still, one sometimes gets the impression that REST advocates see interoperability as the most important criterion, one that must trump all others.

Here's a real-world example: some geographically dispersed organization needs to regularly exchange huge files between far-flung sites. They *could* do this over the Web, but there is enough intrinsic unreliability in the lower levels of the networks that HTTP or FTP requires a lot of operator intervention or inefficient retries to get the job done. So they use a proprietary message queuing system that takes care of things more efficiently in terms of both human time and network bandwidth.

Between organizations, I can fully agree that interoperability almost always trumps efficiency, but this is a hard sell for EAI, B2B, etc. installations with a less open network. After all, given all the work these folks do to make sure that only authorized people and trusted programs can communicate over their systems, it's not all that much more trouble to ensure that compatible software is deployed.

More generally, if one treats HTTP or FTP as the highest-level protocol, then one is saying that the application code (or a human operator) is supposed to take care of the details of confirmations and retries. It's nice that proper use of REST principles guarantees that GETs are safe and PUTs idempotent, but that's still something the application layer has to deal with (see the retry sketch at the end of this message). The appeal of SOAP (or proprietary) messaging is that this grunt work can be shoved down into the *infrastructure*.

Also, maybe I'm missing something: if MQSeries or some similar system supports read, write, update, and delete of arbitrary chunks of data, why isn't this RESTful? If an application uses some SOAP headers to ask for a reliable transport (which could be guaranteed with hardware, HTTPR or whatever, or a proprietary protocol) rather than insisting that reliability be the responsibility of the application, why is this a Bad Thing? (A sketch of what such a header might look like also appears at the end.) OK, it's not widely interoperable with systems that don't understand those headers, but what if the user doesn't care?

Could we establish a modus vivendi: "Use REST principles when you care about interoperability over the Web; use whatever works when you don't"? Or "use REST principles to get scalability on an unknown infrastructure; use the capabilities you already have if you paid the big bucks to get scalability"? If so, SOAP offers more flexibility than REST needs, but no more than some of these other use cases require.
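To make that "grunt work" concrete, here is a minimal sketch (in Python, purely illustrative; the host, path, status handling, and retry policy are all invented for this example) of the retry loop an application ends up owning when idempotent PUT is its only reliability guarantee:

    import http.client
    import time

    def reliable_put(host, path, body, retries=5, backoff=2.0):
        # PUT is idempotent, so replaying it after an ambiguous failure
        # (timeout, connection reset) cannot corrupt the resource -- but
        # the *application*, not the network stack, owns this loop.
        for attempt in range(1, retries + 1):
            conn = http.client.HTTPConnection(host, timeout=30)
            try:
                conn.request("PUT", path, body,
                             {"Content-Type": "application/octet-stream"})
                status = conn.getresponse().status
                if status in (200, 201, 204):
                    return status                 # success: stop retrying
            except (OSError, http.client.HTTPException):
                pass                              # glitch: fall through, retry
            finally:
                conn.close()
            time.sleep(backoff * attempt)         # crude linear backoff
        raise RuntimeError("PUT failed after %d attempts" % retries)

Every application that wants reliable transfer over bare HTTP ends up reinventing some variant of this loop; a messaging infrastructure would implement it once, below the application.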
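And here is the kind of thing I mean by "some SOAP headers to ask for a reliable transport". To be clear: the rel: namespace, the Reliability header, and the endpoints below are all invented for illustration, and no standard defines them; only the SOAP 1.1 envelope syntax itself is real. The point is just that the request for reliability rides in the envelope, where the infrastructure can honor it, instead of living in application retry loops:

    import http.client

    # A *hypothetical* SOAP 1.1 envelope asking the messaging layer, via
    # a made-up rel:Reliability header, to take responsibility for
    # delivery. The rel: namespace, element names, and endpoints are all
    # invented for illustration.
    SOAP_ENVELOPE = """<?xml version="1.0"?>
    <SOAP-ENV:Envelope
        xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
        xmlns:rel="http://example.org/2002/reliability">
      <SOAP-ENV:Header>
        <rel:Reliability SOAP-ENV:mustUnderstand="1">
          <rel:Delivery>exactly-once</rel:Delivery>
          <rel:AckTo>http://example.org/sender/acks</rel:AckTo>
        </rel:Reliability>
      </SOAP-ENV:Header>
      <SOAP-ENV:Body>
        <transferChunk xmlns="http://example.org/2002/filexfer">
          <fileId>big-dataset-042</fileId>
          <sequence>17</sequence>
        </transferChunk>
      </SOAP-ENV:Body>
    </SOAP-ENV:Envelope>
    """

    # Post it once; retries, acks, and duplicate elimination are now the
    # infrastructure's problem, as signalled by mustUnderstand="1".
    conn = http.client.HTTPConnection("example.org")
    conn.request("POST", "/messaging", SOAP_ENVELOPE, {
        "Content-Type": "text/xml; charset=utf-8",
        "SOAPAction": '"urn:example:transferChunk"',
    })
    print(conn.getresponse().status)
    conn.close()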
Received on Monday, 8 July 2002 20:04:32 UTC