
RE: Article: Fat protocols slow Web services

From: Champion, Mike <Mike.Champion@SoftwareAG-USA.com>
Date: Thu, 10 Jan 2002 09:22:06 -0500
Message-ID: <9A4FC925410C024792B85198DF1E97E4022E3D4C@usmsg03.sagus.com>
To: xml-dist-app@w3.org


> -----Original Message-----
> From: Francis Norton [mailto:francis@redrice.com]
> Sent: Thursday, January 10, 2002 8:47 AM
> To: Champion, Mike
> Cc: xml-dist-app@w3.org
> Subject: Re: Article: Fat protocols slow Web services
> 
> 
> All looks so pretty much like Web Services with private UDDI, to me.

Yes! My point -- a real question, not a rhetorical one -- is whether the
HTTP/XML/RPC approach works anywhere near as well in real applications over the wild
internet as it does over a tame intranet. The points that have been raised
in this thread about latency, reliability, "burstiness" etc. can be managed
in an intranet by investments in hardware and competent system
administration, but not when you have potentially millions of remote
customers with all sorts of devices, transports, and intermediaries to deal
with.

The larger issue is whether web services (over the wild world web) can, as a
general rule, continue to build on the synchronous/RPC paradigm that has
scaled reasonably well from single machines to local area networks to
enterprise networks.  There seem to be two points of view: Either the
internet infrastructure will evolve fast enough so that the RPC paradigm
continues to scale up and the underlying complexity is hidden from the
application programmer, or it won't and a more loosely coupled, asynchronous
model of web services delivery will be something that web services
developers have to deal with.  SOAP can definitely handle either model: RPC
is one use case, and there could be use cases for various asynchronous
coordination protocols as well.  In other words, is making SOAP
web services work well over the internet going to be the job of those
long-suffering network administrators that the original article was
addressed to, or of the Visual Basic programmers that Kurt Cagle mentioned?
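To make the "SOAP can handle either model" point concrete, here is a minimal
sketch (the service name, namespace, and host are made up for illustration;
this is not from the article) of an RPC-style call framed as a synchronous
HTTP POST.  Note that nothing in the envelope itself ties it to HTTP -- an
asynchronous binding would carry the same XML payload.

```python
# Hypothetical service and namespace, for illustration only: a SOAP 1.1
# RPC call framed as the synchronous HTTP POST an RPC-style client sends.

SOAP_ENVELOPE = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <getQuote xmlns="urn:example:quotes">
      <symbol>SAG</symbol>
    </getQuote>
  </soap:Body>
</soap:Envelope>"""

def frame_as_http_post(envelope: str) -> str:
    """Wrap the envelope in the HTTP request line and headers."""
    body = envelope.encode("utf-8")
    return (
        "POST /quoteService HTTP/1.1\r\n"
        "Host: example.com\r\n"
        "Content-Type: text/xml; charset=utf-8\r\n"
        'SOAPAction: "urn:example:quotes#getQuote"\r\n'
        f"Content-Length: {len(body)}\r\n"
        "\r\n"
    ) + envelope

request = frame_as_http_post(SOAP_ENVELOPE)
```

The transport framing (request line, SOAPAction header) is all that makes
this "RPC over HTTP"; the envelope is transport-neutral.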

Another non-rhetorical question: Do the Microsoft .NET tools that those
Visual Basic programmers will be migrating to really assume an underlying
synchronous transport, or can they transparently handle, for example, an
underlying SMTP transport? I would guess that they can, but that you may have
to write code rather than relying on the GUI and the wizards to do it
auto-magically.
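For illustration (in Python rather than VB/.NET, with made-up addresses and a
hypothetical correlation header -- I can't speak to what the .NET wizards
generate), carrying a SOAP envelope over SMTP is just a matter of framing it
as a mail message; what the developer then has to deal with is correlating
requests and replies at the application level, since there is no HTTP
response to wait on.

```python
from email.message import EmailMessage

# The same kind of transport-neutral SOAP payload, this time framed as an
# asynchronous mail message instead of a synchronous HTTP request.
envelope = (
    '<?xml version="1.0"?>'
    '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
    '<soap:Body><ping xmlns="urn:example:demo"/></soap:Body>'
    '</soap:Envelope>'
)

msg = EmailMessage()
msg["From"] = "client@example.com"
msg["To"] = "service@example.com"
msg["Subject"] = "SOAP request"
msg["X-Correlation-ID"] = "req-0001"  # hypothetical app-level correlation header
msg.set_content(envelope, subtype="xml")  # Content-Type: text/xml

# Actual delivery omitted in this sketch; it would be something like:
# smtplib.SMTP("mailhost").send_message(msg)
```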
Received on Thursday, 10 January 2002 09:22:16 GMT
