- From: Mark Baker <distobj@acm.org>
- Date: Thu, 18 Apr 2002 23:50:37 -0400
- To: Don Box <dbox@microsoft.com>
- Cc: "Hutchison, Nigel" <Nigel.Hutchison@softwareag.com>, jones@research.att.com, moore@cs.utk.edu, www-tag@w3.org, dorchard@bea.com
On Thu, Apr 18, 2002 at 06:28:44PM -0700, Don Box wrote:
> It is happening again in this decade, since like it or not, SOAP seems
> to be where networked applications are headed.

SOAP has been actively promoted since Sept 1999, about 32 months ago. Over a comparable span, June 1993 to Jan 1996, the Web grew from 130 to about 100,000 sites [1]. Though I can't be bothered to count them, XMethods.net [2] lists perhaps a couple of hundred Web services. I wonder when Web services proponents will ask themselves why they're not seeing the same kind of growth that the Web saw? It couldn't be for lack of marketing $$$! 8-)

IMO (and to keep this on topic 8-), it's because if you have an HTTP URI, you know what methods you can invoke on it (GET being the method supported by all HTTP URIs, http://api.google.com/search/beta2 notwithstanding). Whereas if you have a URI for a SOAP endpoint, you don't (at least if you're using SOAP as most people do). Even if you look up some WSDL to find out what the methods are, you still don't know what it means to invoke them; they're just opaque strings. The meanings of HTTP's methods, by contrast, are specified a priori in RFC 2616; every HTTP client and server has built-in "knowledge" of what they mean.

[1] http://www.funet.fi/index/FUNET/history/internet/en/kasvu.html#www
[2] http://www.xmethods.net

MB
--
Mark Baker, Chief Science Officer, Planetfred, Inc.
Ottawa, Ontario, CANADA.   mbaker@planetfred.com
http://www.markbaker.ca   http://www.planetfred.com
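[Editor's sketch of the contrast described above, for readers unfamiliar with the wire formats. The SOAP operation name ("doGoogleSearch") and the request paths are illustrative assumptions, not taken from the post; the point is only that HTTP's method occupies a standardized slot while SOAP's real "method" is an opaque token inside the payload.]

```python
# A plain HTTP request: the method, GET, sits in the request line and is
# defined a priori by RFC 2616, so any generic client, cache, or proxy
# knows what invoking it means.
http_request = (
    "GET /search?q=REST HTTP/1.1\r\n"
    "Host: example.org\r\n"
    "\r\n"
)

# A typical SOAP request: on the wire the HTTP method is always POST;
# the operation actually being invoked ("doGoogleSearch" -- a
# hypothetical name here) is an opaque string buried in the XML body,
# whose meaning is private to this one service.
soap_body = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <doGoogleSearch xmlns="urn:GoogleSearch">
      <q>REST</q>
    </doGoogleSearch>
  </soap:Body>
</soap:Envelope>"""

soap_request = (
    "POST /search/beta2 HTTP/1.1\r\n"
    "Host: api.google.com\r\n"
    "Content-Type: text/xml\r\n"
    f"Content-Length: {len(soap_body)}\r\n"
    "\r\n"
) + soap_body

# A generic tool can act on the HTTP method alone...
assert http_request.split(" ", 1)[0] == "GET"
# ...but for SOAP the request line says only POST; the real operation
# name is just a token inside the payload, with no shared semantics.
assert soap_request.split(" ", 1)[0] == "POST"
assert "doGoogleSearch" in soap_body
```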
Received on Thursday, 18 April 2002 23:44:21 UTC