RE: Genericity, strong typing, SOAP-RPC, GRASP, and defining the Web

Jeff Bone wrote,
> Note that we're talking about compositional or architectural 
> complexity, not complexity in general.

I think that's a sleight of hand. Architectures aren't merely 
abstractions; they exist only insofar as they're embodied in
concrete implementations. It's often the case that architectural 
simplicity comes at the cost of implementation complexity, in which
case measuring complexity by focusing solely on its architectural
aspect is cheating.

> I would suggest it's very clear cut that having an open-ended and 
> arbitrarily large set of types introduces compositional or 
> architectural complexity;

That's arguable. What matters is the relationships between those
types, not just how many there are. In OO software I've seen 
incredibly baroque class structures with a handful of types, and
incredibly simple class structures with hundreds.
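To make that concrete, here's a toy sketch (names and the density
metric are my own, just for illustration): model each class structure
as a dependency graph and measure how densely coupled it is. Five
mutually entangled types score worse than a hundred types that each
depend only on a common base.

```python
def coupling_density(deps):
    """Fraction of possible directed dependency edges actually present.

    deps maps each type name to the set of types it depends on.
    1.0 means every type depends on every other type; near 0 means
    the structure is almost entirely decoupled.
    """
    n = len(deps)
    if n < 2:
        return 0.0
    edges = sum(len(targets) for targets in deps.values())
    return edges / (n * (n - 1))

# Five types, each depending on the other four: baroque.
baroque = {t: {u for u in "ABCDE" if u != t} for t in "ABCDE"}

# A hundred types, each depending only on one shared base: simple.
simple = {"Base": set()}
simple.update({f"T{i}": {"Base"} for i in range(99)})

print(coupling_density(baroque))  # 1.0 -- maximally entangled
print(coupling_density(simple))   # 0.01 -- many types, little coupling
```

The type count goes up twentyfold while the coupling density drops by
two orders of magnitude, which is the point: counting types tells you
very little about architectural complexity.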

> > A type is a constraint: it sets bounds on the range of acceptable 
> > values of any instance of that type, and as such _reduces_ 
> > complexity.
> Without a doubt.  In the context of our conversation, I'm proposing 
> that HTTP defines *1* single type (or rather a family of related 
> types with perhaps the type "resources that support GET" being one 
> of the most basic.)  The question isn't whether types are good --- 
> surely they are --- it's whether strong typing is good (assumption:  
> yes, but that doesn't imply fine-grained types) and whether 
> constraining the *number* of types in a given system has complexity 
> benefits (assumption:  again, yes.)

I agree up to a point. But this only applies where HTTP is used 
solely for its original purpose: hypertext transfer. Add media types 
with type-specific processing at the receiver, or content negotiation, 
or layered application semantics encoded into URI conventions, and the
genericity evaporates.
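A minimal sketch of what I mean by type-specific processing at the
receiver (the handler table and function names are hypothetical, not
from any particular client): the moment the response's media type
selects different processing logic, the "single" GET operation fans
out into one behavior per media type.

```python
import json

# Hypothetical receiver-side dispatch table: each media type gets its
# own processing logic, so GET is only uniform on the wire, not in the
# application that consumes the representation.
HANDLERS = {
    "text/plain": lambda body: body,
    "application/json": lambda body: json.loads(body),
}

def process_representation(media_type, body):
    """Process the body of a GET response according to its media type."""
    handler = HANDLERS.get(media_type)
    if handler is None:
        raise ValueError("no handler for media type: " + media_type)
    return handler(body)

print(process_representation("application/json", '{"n": 1}'))  # {'n': 1}
print(process_representation("text/plain", "hello"))           # hello
```

Each entry in that table is, in effect, a distinct type the receiver
must know about; the uniform interface conceals rather than removes
that variety.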



Received on Tuesday, 26 March 2002 21:14:50 UTC