W3C home > Mailing lists > Public > www-ws-arch@w3.org > January 2003

Re: Myth of loose coupling

From: Jacobs,David B. <djacobs@mitre.org>
Date: Mon, 6 Jan 2003 22:33:30 -1000
Message-ID: <0ee101c2b627$76583ff0$6501a8c0@jacobs>
To: "Ugo Corda" <UCorda@SeeBeyond.com>, <www-ws-arch@w3.org>

----- Original Message -----
From: "Ugo Corda" <UCorda@SeeBeyond.com>
But from the point of view of the client machine (the browser) there is no
change at all. It is still the same old HTML syntax/schema. On the other
hand, if we look at the "total" interface (HTML + human interpreting the
page rendition), the change might indeed be significant. In the extreme
case, the information might have been rearranged on the page so much that a
person used to the old layout might need to call Amazon to figure out how to
deal with the new interface (tight coupling).


Yes, but there is quite a bit of leeway that Amazon has before that becomes
an issue.

To go off topic a little, I was wondering if someone could explain to me why
web services haven't focused on making existing web sites machine
processable.  I mean, SOAP is a really cool RPC technology, but programming
SOAP applications is quite different from building web sites.  Similarly, while
REST is philosophically closer to web sites, it too does not really follow how
web sites typically function.

When you visit a web site, it typically returns an HTML page.  That page
typically contains up to three types of information:
* Links to where you can go next
* Forms you can fill out and submit
* Results/information for you to consume

Why not, then, just have an additional output type for your web site (as WAP
is for many sites) that emits a markup language explicitly labeling the links
you can follow, the forms you can fill out, and an encapsulated XML document
for the consumable results?

For example, something like the following (please ignore the specifics and
focus on the flavor; I'm sure there are many better machine-processable markup
languages, and this one was made up off the cuff):

<mtml>
  <choices>
    <link type="http://ecommerce.org/#checkout"
          href="http://company.com/checkout"/>
    <link type="http://ecommerce.org/#emptyBasket"
          href="http://company.com/emptyBasket"/>
    <form method="GET" type="http://ecommerce.org/#addItem"
          href="http://company.com/addItem">
      <input name="title" type="http://books.org/#title"
             value="Grapes of Wrath"/>
    </form>
    <form method="GET" type="http://ecommerce.org/#search"
          href="http://company.com/search">
      <input name="keyword" type="http://books.org/#search"/>
    </form>
  </choices>
  <body>
    ...Application-specific XML goes here
  </body>
</mtml>
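To make the flavor concrete, here is a rough sketch (in Python; the format,
element names, and type URIs are all my made-up example above, not anything
standardized) of how a machine agent might consume such a document: parse it
once and index the advertised actions by their semantic type URI, so the agent
keys off meaning rather than layout.

```python
# Hypothetical sketch: index an "mtml" response's choices by type URI.
# Everything here (mtml, the type URIs) is the made-up example above.
import xml.etree.ElementTree as ET

MTML = """<mtml>
  <choices>
    <link type="http://ecommerce.org/#checkout" href="http://company.com/checkout"/>
    <link type="http://ecommerce.org/#emptyBasket" href="http://company.com/emptyBasket"/>
    <form method="GET" type="http://ecommerce.org/#addItem" href="http://company.com/addItem">
      <input name="title" type="http://books.org/#title" value="Grapes of Wrath"/>
    </form>
  </choices>
  <body>...</body>
</mtml>"""

def index_choices(document):
    """Map each advertised type URI to its <link> or <form> element."""
    root = ET.fromstring(document)
    return {el.get("type"): el for el in root.find("choices")}

choices = index_choices(MTML)
# The agent looks actions up by type URI, not by position on the page,
# so the site can rearrange everything else without breaking the client.
checkout = choices["http://ecommerce.org/#checkout"]
print(checkout.get("href"))  # http://company.com/checkout
```

The point of the lookup-by-URI is that it is the machine analogue of a person
scanning a redesigned page for the "checkout" button: presentation can change
freely as long as the labeled choices survive.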

I would think existing web application builders would find this much easier to
wrap their heads around, and it would in fact make it much easier to support
the web site for humans and machine agents at the same time.  It also solves
the problem of workflow coordination for web services, because the web service
tells you when you can execute different methods (just as it does for people).
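The workflow-coordination point can be sketched in a few lines (again with my
hypothetical type URIs): the agent never hard-codes a sequence of calls, it
simply checks which actions the last response advertised, just as a person
scans the page for what is clickable right now.

```python
# Hypothetical sketch: drive workflow from the choices the server sent,
# rather than from a hard-coded call sequence.  Type URIs are made up.
def can(choices, action):
    """True if the last response advertised this action."""
    return ("http://ecommerce.org/#" + action) in choices

# A response that lists only search and addItem implicitly tells the
# agent that checkout is not yet available (e.g. the basket is empty):
empty_basket_choices = {
    "http://ecommerce.org/#search": None,
    "http://ecommerce.org/#addItem": None,
}
print(can(empty_basket_choices, "addItem"))   # True
print(can(empty_basket_choices, "checkout"))  # False
```

The server changing which choices it emits is all the "coordination protocol"
the client ever needs to see.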

Am I missing something fundamental?

David
Received on Tuesday, 7 January 2003 03:35:26 GMT
