
Re: Issue 5; GET vs GetLastTradePrice

From: Walden Mathews <waldenm@optonline.net>
Date: Thu, 02 Jan 2003 10:37:29 -0500
To: "Newcomer, Eric" <Eric.Newcomer@iona.com>
Cc: www-ws-arch@w3.org
Message-id: <002401c2b274$dc5b0be0$1702a8c0@WorkGroup>


> Yes, I was trying to draw a distinction between the case where
> applications share semantic information totally out of band, and the case
> in which applications rely totally on automatic discovery mechanisms to
> share semantics.  I think there's a middle ground that we sort of avoid
> when we polarize the discussion as "REST vs RPC/SOAP" or "specific
> interfaces vs generic" or whatever.

I'd like to get clearer on what that middle ground is.  Last summer I got
involved in a project that had already decided to use XML in a "document"
mode as opposed to an "RPC" mode, but the distinction was only skin deep,
at least according to my analysis.  The approach was still leading these
developers in the direction of inventing a ton of new protocol, when very
little new protocol was actually needed.  In effect, they were doing the
work that the RPC-framework tools do, so they were getting the worst of
both worlds.  I wonder if the WWW architecture will provide the guidance
they would need for avoiding that pitfall.
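The contrast in the subject line ("GET vs GetLastTradePrice") can be sketched
concretely.  Here is a minimal, hypothetical illustration in Python: the
RPC-style service invents a new operation name and carries it in a POSTed
envelope, while the generic style reuses HTTP's GET and puts all the
application-specific meaning into the resource identifier.  The URIs, element
names, and ticker symbol below are invented for illustration, not taken from
any real service.

```python
# Hypothetical contrast: specific interface (new RPC operation) vs the
# generic HTTP interface.  All names and URIs here are made up.

def rpc_request(symbol: str) -> str:
    """RPC style: a service-specific operation name (GetLastTradePrice)
    lives in the payload; POST carries no application semantics itself."""
    body = (
        '<GetLastTradePrice xmlns="urn:example:quotes">'
        f'<symbol>{symbol}</symbol>'
        '</GetLastTradePrice>'
    )
    return (
        "POST /quote-service HTTP/1.1\r\n"
        "Content-Type: text/xml\r\n"
        f"Content-Length: {len(body)}\r\n"
        "\r\n"
        f"{body}"
    )

def rest_request(symbol: str) -> str:
    """Generic style: the only method is GET; the application-specific
    part is the resource URI, so no new protocol is invented."""
    return f"GET /quotes/{symbol}/last-trade-price HTTP/1.1\r\n\r\n"

if __name__ == "__main__":
    print(rpc_request("SUNW"))
    print(rest_request("SUNW"))
```

In the first form, every new capability means a new operation name that both
sides must learn out of band; in the second, the set of methods is fixed and
shared, and only the resource space grows.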

> What I was trying to say is that XML tools can help reduce the custom
> coding effort, not eliminate it altogether.

Oh, I didn't realize this was about tools.  Would it be feasible to have
the discussion sans the mention of tools?  Are tools really central to the
problem of getting two applications to understand each other, or are they
an optimization to apply deeper solutions to that problem?

Does the inclusion/exclusion of tools affect your view of what that
"middle ground" is?

Received on Thursday, 2 January 2003 10:37:36 UTC
