
RE: Issue 5; GET vs GetLastTradePrice

From: David Orchard <dorchard@bea.com>
Date: Thu, 2 Jan 2003 09:38:27 -0800
To: "'Champion, Mike'" <Mike.Champion@SoftwareAG-USA.com>, <www-ws-arch@w3.org>
Message-ID: <008b01c2b285$c2fc14d0$9d0ba8c0@beasys.com>


Interesting POV.  I offer my limited analysis.  The web is optimized for
ad-hoc retrieval of resources, specifically around the use of GET.  It's the
default method given only a URI.  Therefore people can easily post (:-) URIs
on billboards, send them via email, etc.

In the web context, it makes huge sense to be able to pass around URLs.  And
the default method of GET is essential to this success.  I want to re-state
my position, which is that the web was successful because of the combination
of URIs and the default use of GET.  And when people use POST on web sites,
they typically use it interchangeably with GET.  I've proven this point in
the past.  Therefore the web architecture may be designed around REST, but
in actuality it's almost 100% deployed in a simple one-method invoke style,
and this is the case that the web is brilliantly optimized for.  Almost
nobody deploys GET/PUT/POST/DELETE methods on the same URI.  99.9% of sites
treat POST/GET as interchangeable.
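
To make the interchangeability point concrete, here's a minimal sketch (the
path and form fields are made up for illustration): a typical HTML form sends
the same name/value pairs whether the browser issues a GET or a POST - only
where the encoded bytes travel differs.

```python
# Illustrative only: the /quote path and form fields are invented.
from urllib.parse import urlencode

params = {"symbol": "BEAS", "exchange": "NASDAQ"}  # hypothetical form data
payload = urlencode(params)

# As a GET, the data rides in the URI itself:
get_request_line = f"GET /quote?{payload} HTTP/1.1"

# As a POST, the identical encoded bytes ride in the entity body:
post_body = payload

print(get_request_line)
print(post_body)
```

Same data, same encoding (application/x-www-form-urlencoded); the site's
handler typically doesn't care which method delivered it.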

The question, from a Web services perspective, is whether this same
optimization makes sense.  Do we have programs that already have prebuilt
"method"s, or even a default method, where we simply have to punch in new
addresses?  I don't think so.

IMO, ad-hoc retrieval of representations doesn't make as much sense in a
machine-to-machine world as it does in a hypermedia world.

The architectural middle ground that I believe we are moving towards is a
model that is optimized for program creation - hence the necessity of WSDL -
while allowing for ad-hoc invocation - hence the support for GET.  The
optimization is different, but the features are supported.  I also believe
that the default method for web services will be POST.
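
A sketch of the two invocation styles, using the GetLastTradePrice example
from this thread (the endpoint path and the http://example.com namespace are
illustrative, not any real service): with ad-hoc GET the "method" is just the
default verb plus a URI anyone can pass around, while with the WSDL/POST
style the operation name travels inside the XML body of the message.

```python
# Illustrative only: URIs and namespaces below are invented.
import xml.etree.ElementTree as ET

# Ad-hoc retrieval: GET plus a URI that can be handed around freely.
get_uri = "http://example.com/stockquote?symbol=DIS"

# Program-to-program invocation: the operation name is carried inside an
# XML body delivered by POST, as described by a WSDL definition.
soap_envelope = """\
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <m:GetLastTradePrice xmlns:m="http://example.com/stockquote">
      <m:tickerSymbol>DIS</m:tickerSymbol>
    </m:GetLastTradePrice>
  </soap:Body>
</soap:Envelope>
"""

body = ET.fromstring(soap_envelope)[0]          # the soap:Body element
operation = body[0].tag.split("}")[1]           # strip the XML namespace
print(operation)  # the method name lives in the message, not the URI
```

In the first style a generic client needs nothing but the URI; in the second,
tooling generated from the WSDL knows how to build and interpret the body.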

One of the areas where we (Web services) still have work to do is figuring
out how to deploy Web services across trust boundaries.  The use of tcp
ports and port firewalls made sense when it was really hard to create and
deploy applications.  Think of how long it took to get the protocols over
the various internet ports figured out.  But now we want to scale this to
gajillions of applications while still wanting the firewall admin to do a
reasonable job of setting up the right policy.

To me, XML is the thing that breaks the previous model of dealing with
internet-level application deployment.  10 years ago, it was fairly hard to
create document formats and protocols.  XML made it orders of magnitude
easier to create formats.  SOAP makes it an order of magnitude easier to
create XML-based protocols.  People that make URIs and GET the focus of app
development, IMO, tend to ignore the centralized registry of ports and the
description of the HTTP protocol, as well as the commensurate programming
that the HTTP developers had to do.

If there is any argument that would convince me that GET is better than
GetLastTradePrice, it would be the firewall argument.  But having spent some
time discussing the realities of deployment of the various options with
MarkB, I remain unconvinced that the firewall admin has any greater insight
into, or control over, the application of security policy under one approach
or the other.

Which makes me believe that we still have a hole in the web services
architecture concerning what needs to be in SOAP messages and WSDL
definitions in order to allow a firewall admin to do their job better.
SOAPAction was one attempt at this.  WS-Routing Action was another.  There
are various other approaches.  But I don't think we've taken a systematic
and thorough look at the use case of a web service being developed and
deployed by developers and security people.
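
As a hypothetical sketch of what the SOAPAction-style approach could give a
firewall admin (the header value and allow-list below are invented, not any
real policy): if every SOAP POST carried a reliable operation hint in a
header, a filter could allow or deny at the operation level without parsing
the XML body.

```python
# Illustrative only: the action URI and policy are made up.
ALLOWED_ACTIONS = {"http://example.com/GetLastTradePrice"}  # assumed policy

def soap_action_allowed(raw_headers: str) -> bool:
    """Return True if the request's SOAPAction header is on the allow-list."""
    for line in raw_headers.splitlines():
        name, _, value = line.partition(":")
        if name.strip().lower() == "soapaction":
            return value.strip().strip('"') in ALLOWED_ACTIONS
    return False  # no SOAPAction header: deny by default

request_headers = (
    "POST /StockQuote HTTP/1.1\r\n"
    "Host: example.com\r\n"
    'SOAPAction: "http://example.com/GetLastTradePrice"\r\n'
)
print(soap_action_allowed(request_headers))
```

Of course, SOAPAction is only a hint supplied by the client, which is exactly
why it was never a complete answer to the firewall problem.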


> -----Original Message-----
> From: www-ws-arch-request@w3.org [mailto:www-ws-arch-request@w3.org]On
> Behalf Of Champion, Mike
> Sent: Thursday, January 02, 2003 8:41 AM
> To: www-ws-arch@w3.org
> Subject: RE: Issue 5; GET vs GetLastTradePrice
> > -----Original Message-----
> > From: Walden Mathews [mailto:waldenm@optonline.net]
> > Sent: Thursday, January 02, 2003 10:37 AM
> > To: Newcomer, Eric
> > Cc: www-ws-arch@w3.org
> > Subject: Re: Issue 5; GET vs GetLastTradePrice
> >
> > I'd like to get clearer on what that middle ground is.  Last
> > summer I got involved in a project that
> > had already decided to use XML in a "document" mode as
> > opposed to a "RPC" mode, but the
> > distinction was only skin deep, at least according to my
> > analysis.
> I sometimes suspect that too.  As someone said in one of these recent
> threads, it would be hard to distinguish an instance of one
> from the other
> using a protocol analyzer. The distinctions do seem more at the design
> pattern level-- do you CONCEIVE of the message as a method
> invocation with
> arguments that gets directly mapped onto a procedure call of
> some sort, or
> do you conceive of it as a business document to be acted on by some
> intermediate software that interprets the data and indirectly
> invokes the
> back-end software.
> I'm  seeing this more and more as an engineering question of
> finding the
> optimal degree of coupling in a particular web application
> than as some huge
> meta-question of competing paradigms. The tighter coupling of
> direct mapping
> of method invocations (synchronous or asynchronous) on one
> system to another
> via SOAP messages makes sense in stable, well-managed systems where
> performance considerations are paramount; the looser coupling
> of business
> document exchange makes sense in more dynamic, haphazardly
> managed systems
> where ease of discovery by new "customers" is more important than
> performance for existing customers.
Received on Thursday, 2 January 2003 12:41:45 UTC
