
RE: Can someone recap the differences between @serviceGroup vs. definitions-targetNamespace ?

From: Savas Parastatidis <Savas.Parastatidis@newcastle.ac.uk>
Date: Fri, 25 Jul 2003 01:08:31 +0100
Message-ID: <BC28A9E979C56C44BCBC2DED313A447001EC3449@bond.ncl.ac.uk>
To: <fred.carter@amberpoint.com>
Cc: <www-ws-desc@w3.org>


> I agree completely.  While the serviceGroup is "more general," it fails
> completely in providing the ability to express that some collection of
> endpoints "manipulates" (works with, shares state with, etc.) some
> underlying thing.  Having the ability to find an interface but being
> unable to figure out which one to use seems to me to considerably
> lessen the power of the standard.
> Among the points of Web services is to create interoperable systems
> which can be assembled regardless of implementation technology.  Such
> assembly requires the ability to determine interfaces and points of
> access, *and* an ability to determine which of the alternatives to
> work with.  The ability to indicate this as part of a service's
> definition seems critical.
> The general notion, separately, of serviceGroups is also powerful.  If
> there were a means (perhaps an extensible collection of @purpose
> attributes, at least one of which is something like /underlying
> entity/), then that mechanism would work as well.  But losing the
> ability to link endpoints (or collections thereof) to some named
> underlying thing seems like a loss to me.

If we see a Web Service as a set of actions that represent the
functionality an organisation wishes to expose to the world and those
actions are performed on a consumer's request via a message, why would
that organisation wish to expose any kind of information about the
underlying infrastructure/resources used? Why would a service interface
be closely associated with a specific resource?

When I consume a service I am interested in the results of the
operations rather than the underlying resources used to achieve those
results. Interfaces that are coupled with specific resources internal
to an organisation will result in tightly coupled distributed
applications. Granted, interoperability will still be possible because
of XML, WSDL, and SOAP, but the architecture will encourage
light-weight, tightly integrated components instead of the loosely
coupled, coarse-grained components that I believe services should be.
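To make the coupling concern concrete, here is a hypothetical sketch (the element, attribute, and resource names are purely illustrative, not taken from any WSDL draft) contrasting an interface that names a specific internal resource with one that exposes only a coarse-grained action:

```xml
<!-- Tightly coupled: the consumer must know about an internal resource -->
<interface name="OrderBook42Interface">
  <operation name="addOrderToBook42"/>
</interface>

<!-- Loosely coupled: the resource stays behind the service boundary -->
<interface name="OrderingService">
  <!-- which order book handles the request is an internal concern -->
  <operation name="submitOrder"/>
</interface>
```

In the first case, a change to the internal resource breaks every consumer; in the second, only the organisation's internals need to change.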

Also, for any kind of association between service interfaces or between
a service interface and a resource we have RDF.
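As an illustration of that point, such an association could live entirely outside the WSDL document, for example as RDF in Turtle syntax (the vocabulary term and URIs below are hypothetical):

```turtle
@prefix ex: <http://example.org/ws#> .

# Hypothetical assertions linking two service endpoints to a shared
# underlying entity, kept separate from the interface descriptions.
<http://example.org/services/ordering>
    ex:manipulates <http://example.org/entities/orderBook> .

<http://example.org/services/invoicing>
    ex:manipulates <http://example.org/entities/orderBook> .
```

Anyone who cares about the shared underlying entity can consume the RDF; everyone else sees only the interfaces.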

Just my 2c.

Received on Thursday, 24 July 2003 20:08:37 UTC

This archive was generated by hypermail 2.4.0 : Friday, 17 January 2020 23:06:32 UTC