
Re: Agent-mediated access (was Re: Criticism of Kidcode...)

From: Brian Behlendorf <brian@organic.com>
Date: Mon, 19 Jun 1995 23:55:36 -0700 (PDT)
To: Peter Deutsch <peterd@bunyip.com>
Cc: rating@junction.net, www-talk@www10.w3.org, uri@bunyip.com
Message-Id: <Pine.3.89.9506192330.e28468-0100000@eat.organic.com>
On Mon, 19 Jun 1995, Peter Deutsch wrote:
> } I've outlined many of these thoughts in a short paper at
> } http://www.organic.com/Staff/brian/community-filters.html
> 
> Got the paper, and the only problem I have with it is
> the assumption that architecturally we want everything
> going through HTTP proxy servers, as these form a natural
> bottleneck and leave the user with a browser that can
> still effectively see the entire world. 

It's not necessarily a bottleneck.  The proxy servers can be very local - 
in the case of the elementary school teacher who wanted to take her 
students to see pages at NASA, the proxy would be at the school itself, 
on their side of the internet link.  For speed this is the optimal 
setup anyway, particularly if the class is visiting many of the same 
pages.  

If the teacher chose to subscribe to, say, the PTA's list of acceptable 
URLs, then the proxy connections would not have to go through the PTA's 
proxy server - the school's proxy server would just have to fetch the 
PTA's access control list and keep it updated.
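In today's terms, that subscription model might be sketched like this (the PTA's URL and the one-URL-per-line list format are hypothetical - any periodically refetched list would do):

```python
# Sketch of a school proxy subscribing to a third-party allow-list (ACL).
# The source URL and file format here are assumptions for illustration.
from urllib.request import urlopen

ACL_SOURCE = "http://pta.example.org/acceptable-urls.txt"  # hypothetical

def fetch_acl(source=ACL_SOURCE):
    """Fetch the subscribed list of acceptable URL prefixes."""
    with urlopen(source) as resp:
        return {line.strip() for line in resp.read().decode().splitlines()
                if line.strip()}

def allowed(url, acl):
    """The proxy serves a request only if the URL matches the ACL."""
    return any(url.startswith(prefix) for prefix in acl)
```

The school's proxy would call fetch_acl() on a schedule to keep the list current, and check allowed() on every request.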

Current browsers do indeed allow you to selectively turn the proxy settings
on and off.  Two solutions: the school's lab machines could be set up behind
a firewall that allows *only* the proxy server to make outbound connections,
so students have to go through the proxy server to get to the outside world;
or, you bury the proxy setting one layer beyond the obvious (i.e., as an
environment variable at launch time).  I prefer the former. 
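The firewall policy in the first solution is simple enough to state as a toy model (hostnames are hypothetical):

```python
# Toy model of the egress rule described above: only the proxy host
# may open outbound connections, so lab machines must go through it.
PROXY_HOST = "proxy.school.example"  # hypothetical hostname

def outbound_permitted(source_host):
    """Firewall policy: drop outbound traffic unless it originates
    from the designated proxy server."""
    return source_host == PROXY_HOST
```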

> From our
> perspective, it also is suboptimal since it requires users
> to continue viewing the net in terms of access protocols
> ("http://" indeed). 

Well, proxy servers can support a wide variety of protocols, not just
http-accessible URLs.  Just about every internet-related resource can be
identified by a URL, and can thus be filtered by proxy. 
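This is the key point: because every scheme shares the URL syntax, one filter handles them all.  A minimal sketch (the deny-list host is hypothetical):

```python
# Sketch: filtering applies uniformly across protocols because every
# resource is named by a URL - the proxy needs only the host from the
# URL, not protocol-specific logic.
from urllib.parse import urlsplit

BLOCKED_HOSTS = {"badsite.example"}  # hypothetical deny-list

def blocked(url):
    """Works identically for http, ftp, gopher, or news URLs."""
    return urlsplit(url).hostname in BLOCKED_HOSTS
```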

> I want users selecting items based
> upon names like "Stock Quoter" or "Book Search". Let the
> object figure out how to find the server.

Well, which "user"?  The end users (the children) would never have to know 
anything about URLs; in fact the teacher, if they subscribe to a 
third-party filtering service, would never have to know either.  Just one 
person in the chain - the most trusted point - would have to know.  At 
some point the name-to-URL mapping takes place, and until URNs are 
widely deployed there's not much choice.
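That trusted point amounts to maintaining a small catalog; a sketch, with names and URLs invented for illustration:

```python
# Sketch: only the most trusted point in the chain (e.g. the filtering
# service's maintainer) keeps the name-to-URL mapping; everyone else
# selects resources by name.  Entries here are hypothetical.
CATALOG = {
    "Stock Quoter": "http://quotes.example.com/",
    "Book Search": "http://books.example.com/search",
}

def resolve(name):
    """Map a human-readable name to a URL - a stand-in for URN
    resolution until URNs are widely deployed."""
    return CATALOG[name]
```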

> I'm happy to use servers to supply a URA to the client for
> execution, but want the executing code to be as close to
> the user as possible. Otherwise we can expect scaling
> problems with proxies being swamped by demand, and
> security problems since users are still essentially armed
> with a generalized browser and can potentially see the
> entire net if your filtering fails.

Right, I agree with all this - I'm envisioning proxy servers that are 
very close to the community they serve.  Sufficiently close proxies 
address all your concerns, I think.

When we talk about "security" in this regard, we need to be careful.  
Nothing we design will deter a determined individual from seeing what 
they want to see.  A 12-year-old armed with Dad's credit card could set 
up a Netcom account and away he goes.  To me, this is acceptable - a 
determined 12-year-old will sneak a peek at Playboys on a rack in a 
bookstore too.  We can only try to ensure that circumvention doesn't 
become trivial.

I think the URC/URN mechanism is indeed the best long-term solution - and you
will not find a more ardent supporter of SOAPs than I - but many people are
clamoring for a solution *NOW*, and URC/URN mechanisms have a lot of
infrastructure to build, as well as requiring a fundamental change in both
the tools and in how the people who build and use the web envision the
dataspace.  A filtering proxy, capable of accepting/subscribing to ACLs from
other sites, could be quickly implemented and doesn't represent a break with
the existing WWW philosophy.  If URCs or URAs can be used to contain and
describe the ACLs as they are passed around, fantastic. 

	Brian

--=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=--
brian@organic.com  brian@hyperreal.com  http://www.[hyperreal,organic].com/
Received on Tuesday, 20 June 1995 02:55:36 GMT
