
Re: potential users of web services

From: Mark Baker <distobj@acm.org>
Date: Fri, 5 Apr 2002 15:03:22 -0500
To: Andrew Layman <andrewl@microsoft.com>
Cc: www-ws@w3.org
Message-ID: <20020405150322.H8199@www.markbaker.ca>
On Fri, Apr 05, 2002 at 09:10:46AM -0800, Andrew Layman wrote:
> The term Web service was created to contrast with two earlier
> technologies.  On the one hand, it identifies a distinction from "Web
> site" in that a Web site serves pages, typically in HTML, for display in
> a browser to a human, while a "Web service" offers a computation
> directly to another computer, with no special expectation that the
> computation will be used in a browser or for display to a human. Web
> services are not computer-to-human but computer-to-computer.

Well, if it's the HTML that you're concerned about, why not return some
XML or RDF via HTTP GET?  That's machine-processable.  And any piece of
software can invoke HTTP GET on a URI, no human required.

What about this?  http://www.xmlhack.com/rss10.php

It's an RSS feed for xmlhack.com.  No "getXmlhackRss()", just
"GET /rss10.php".  It's also not easily human-parseable.
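[To make the point concrete: any HTTP client can GET that URI and
process the result as data.  A minimal sketch in Python -- parsing an
inline RSS 1.0 snippet rather than fetching the (now long-dead) URL, so
it's self-contained; the item data here is made up for illustration:]

```python
# Any software agent could "invoke" the service with a plain HTTP GET --
# e.g. urllib.request.urlopen("http://www.xmlhack.com/rss10.php") --
# and then process the returned XML.  We parse an inline RSS 1.0
# snippet instead so this runs offline (the headline is invented).
import xml.etree.ElementTree as ET

RSS = """<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns="http://purl.org/rss/1.0/">
  <channel rdf:about="http://www.xmlhack.com/">
    <title>xmlhack</title>
    <link>http://www.xmlhack.com/</link>
  </channel>
  <item rdf:about="http://www.xmlhack.com/read.php?item=1">
    <title>Example headline</title>
    <link>http://www.xmlhack.com/read.php?item=1</link>
  </item>
</rdf:RDF>"""

NS = {"rss": "http://purl.org/rss/1.0/"}
root = ET.fromstring(RSS)
# Extract item titles: no browser, no human, no getXmlhackRss() stub.
titles = [item.findtext("rss:title", namespaces=NS)
          for item in root.findall("rss:item", NS)]
print(titles)  # -> ['Example headline']
```

[The consumer needs nothing beyond HTTP GET and an XML parser -- no
service-specific method call.]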

I don't know why that's any less a Web service than getStockQuote().

MB
-- 
Mark Baker, Chief Science Officer, Planetfred, Inc.
Ottawa, Ontario, CANADA.      mbaker@planetfred.com
http://www.markbaker.ca   http://www.planetfred.com
Received on Friday, 5 April 2002 14:57:36 GMT
