
Re: Genericity, strong typing, SOAP-RPC, GRASP, and defining the Web

From: Jeff Bone <jbone@jump.net>
Date: Tue, 26 Mar 2002 19:16:09 -0600
Message-ID: <3CA11D59.410B4861@jump.net>
To: Miles Sabin <miles@mistral.co.uk>
CC: www-tag@w3.org

Miles Sabin wrote:

> Jeff Bone wrote,
> > However this ignores the fact that there's "power
> > in large values" and that going to smaller, fine-grained interfaces
> > with more specific types produces tighter coupling and non-linear
> > growth of compositional complexity.
>
> I don't think it's anything like that clear cut.
>
> Types don't merely have complexity costs, they have complexity
> benefits as well.

Note that we're talking about compositional or architectural complexity,
not complexity in general.  I would suggest it's very clear cut that
having an open-ended and arbitrarily large set of types introduces
compositional or architectural complexity;  it's the genericity of e.g.
HTTP across all HTTP-typed resources that enables caching proxies and
other interesting intermediaries to be usefully plugged in ad hoc and post
facto.
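
The payoff of that genericity can be sketched in a few lines.  Here is a
minimal, hypothetical model (the names Origin, CachingProxy, and get are
illustrative, not any real API): because every resource supports the same
generic operation, a cache can be interposed in front of *any* origin
without knowing anything about the types of the resources behind it.

```python
class Origin:
    """Stand-in for an origin server: one generic operation, get."""
    def __init__(self, resources):
        self.resources = resources
        self.hits = 0          # counts how often the origin is touched

    def get(self, uri):
        self.hits += 1
        return self.resources[uri]

class CachingProxy:
    """Plugs in front of any get-typed origin, ad hoc and post facto."""
    def __init__(self, upstream):
        self.upstream = upstream
        self.cache = {}

    def get(self, uri):
        # The proxy needs no per-resource-type knowledge: the one
        # generic verb is the whole contract.
        if uri not in self.cache:
            self.cache[uri] = self.upstream.get(uri)
        return self.cache[uri]

origin = Origin({"/a": "alpha", "/b": "beta"})
proxy = CachingProxy(origin)
assert proxy.get("/a") == "alpha"
assert proxy.get("/a") == "alpha"   # second request served from cache
assert origin.hits == 1             # origin touched only once for /a
```

Had each resource exposed its own fine-grained interface, the proxy
would need code for every one of them; with one generic interface it
needs none.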

> A type is a constraint: it sets bounds on the range
> of acceptable values of any instance of that type, and as such
> _reduces_ complexity.

Without a doubt.  In the context of our conversation, I'm proposing that
HTTP defines *1* single type (or rather a family of related types with
perhaps the type "resources that support GET" being one of the most
basic.)  The question isn't whether types are good --- surely they are ---
it's whether strong typing is good (assumption:  yes, but that doesn't
imply fine-grained types) and whether constraining the *number* of types
in a given system has complexity benefits (assumption:  again, yes.)
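
In structural-typing terms (a hedged sketch; Gettable, fetch, and the
resource classes are hypothetical names, not anything defined by HTTP),
"one type" means the uniform interface is a single protocol that every
resource satisfies, rather than one interface per service:

```python
from typing import Protocol

class Gettable(Protocol):
    """The one basic type: resources that support GET."""
    def get(self) -> bytes: ...

class StockQuote:
    def get(self) -> bytes:
        return b"ACME 42.00"

class WeatherReport:
    def get(self) -> bytes:
        return b"sunny"

def fetch(resource: Gettable) -> bytes:
    # Generic client code: depends only on the one shared type,
    # never on StockQuote or WeatherReport specifically.
    return resource.get()

assert fetch(StockQuote()) == b"ACME 42.00"
assert fetch(WeatherReport()) == b"sunny"
```

The system stays strongly typed; it simply refuses to multiply the
*number* of interface types as services are added.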

> In many cases we have a simple trade off: packing information into
> values vs. packing information into types. In some cases the trade off
> will favour weak types and rich data; in others it will favour strong
> types and thin data. But in neither case have we magic'd complexity
> away. Ultimately we'd just be moving the same stuff around ... squeeze
> the balloon in one place and it'll bulge out in another.

This is of course the common assumption.  I'm not so sure, but I'm not
really ready to make that argument.  As my perennial anecdotal evidence
that system complexity is vastly reduced and power gained by highly
constrained type systems / universal generic interfaces, consider the
overall simplicity and "functionality to lines-of-code" qualities
obtained in Plan 9 vis-a-vis UNIX by pushing the "everything is a file,"
"everything in the same namespace," and "text streams everywhere" concepts
to their logical conclusions.
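
The arithmetic behind that anecdote: with one thin shared interface,
N generic tools compose with M kinds of source using N + M pieces of
code rather than N * M.  A minimal sketch (count_lines is a hypothetical
stand-in for a generic tool like wc):

```python
import io

def count_lines(stream):
    """Generic tool: works on anything that yields lines of text."""
    return sum(1 for _ in stream)

# Two very different "resources" behind the same thin interface:
in_memory = io.StringIO("one\ntwo\nthree\n")     # a file-like object
generated = (line for line in ["a\n", "b\n"])    # a synthesized stream

assert count_lines(in_memory) == 3
assert count_lines(generated) == 2
```

The tool never learns what is behind the stream, which is exactly the
"everything is a file" discipline pushed to its conclusion.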

> Pragmatically that might be a win, because hopefully we'd have moved
> problems closer to the most appropriate place to solve them. But
> pragmatism is really all we have here, not the grounds for
> choosing an overarching philosophical system.

Absolutely!

> In particular, I'd say that where there's a non-linear growth of
> complexity in a large strongly typed system, there'd be a
> corresponding growth of complexity in the equivalent large weakly
> typed system.

UNIX.  Plan 9...

jb
Received on Tuesday, 26 March 2002 20:29:25 GMT
