
RE: potential users of web services

From: Anne Thomas Manes <anne@manes.net>
Date: Fri, 5 Apr 2002 17:13:44 -0500
To: "Mark Baker" <distobj@acm.org>, "Andrew Layman" <andrewl@microsoft.com>
Cc: <www-ws@w3.org>

I would say that an RSS feed is a web service. I think the key point is that
the request and/or response are machine-processable -- regardless of the
method of invocation.
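[A minimal sketch of the point being made, not part of the original thread:
a plain HTTP GET of an RSS feed returns XML that software can process
directly, with no RPC-style "getXmlhackRss()" method. The inline RSS 1.0
fragment below is an invented stand-in for the real feed's response.]

```python
import xml.etree.ElementTree as ET

# In practice the document would come straight from an HTTP GET, e.g.:
#   urllib.request.urlopen("http://www.xmlhack.com/rss10.php").read()
# Here a small made-up RSS 1.0 fragment stands in for that response.
rss = """<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
                  xmlns="http://purl.org/rss/1.0/">
  <channel rdf:about="http://www.xmlhack.com/">
    <title>xmlhack</title>
  </channel>
  <item rdf:about="http://www.xmlhack.com/read.php?item=1">
    <title>Example headline</title>
    <link>http://www.xmlhack.com/read.php?item=1</link>
  </item>
</rdf:RDF>"""

ns = {"rss": "http://purl.org/rss/1.0/"}
root = ET.fromstring(rss)

# The response is machine-processable: pull each item's title and link
# out of the XML without any human (or browser) in the loop.
headlines = [(item.findtext("rss:title", namespaces=ns),
              item.findtext("rss:link", namespaces=ns))
             for item in root.findall("rss:item", ns)]
print(headlines)
```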


> -----Original Message-----
> From: www-ws-request@w3.org [mailto:www-ws-request@w3.org]On Behalf Of
> Mark Baker
> Sent: Friday, April 05, 2002 3:03 PM
> To: Andrew Layman
> Cc: www-ws@w3.org
> Subject: Re: potential users of web services
> On Fri, Apr 05, 2002 at 09:10:46AM -0800, Andrew Layman wrote:
> > The term Web service was created to contrast with two earlier
> > technologies.  On the one hand, it identifies a distinction from "Web
> > site" in that a Web site serves pages, typically in HTML, for display in
> > a browser to a human, while a "Web service" offers a computation
> > directly to another computer, with no special expectation that the
> > computation will be used in a browser or for display to a human. Web
> > services are not computer-to-human but computer-to-computer.
> Well, if it's the HTML that you're concerned about, why not return some
> XML or RDF via HTTP GET?  That's machine processable.  And any piece of
> software can invoke HTTP GET on a URI, no human required.
> What about this?  http://www.xmlhack.com/rss10.php
> It's an RSS feed for xmlhack.com.  No "getXmlhackRss()", just
> "GET /rss10.php".  It's also not easily human parseable.
> I don't know why that's any less a Web service than getStockQuote().
> MB
> --
> Mark Baker, Chief Science Officer, Planetfred, Inc.
> Ottawa, Ontario, CANADA.      mbaker@planetfred.com
> http://www.markbaker.ca   http://www.planetfred.com
Received on Friday, 5 April 2002 17:13:22 UTC