- From: Stephen D. Williams <sdw@lig.net>
- Date: Tue, 20 Jun 1995 13:54:07 -0400 (EDT)
- To: nazgul@utopia.com (Kee Hinckley)
- Cc: lilley@afs.mcc.ac.uk, m.koster@nexor.co.uk, nsb@nsb.fv.com, rating@junction.net, www-talk@www10.w3.org, uri@bunyip.com
> At 3:41 PM 6/19/95, lilley wrote:
> >Those influential (and monied) groups who care about such things can then
> >finance the proxies and their URC resolvers to implement whatever type
> >of cens^H^H^H^Hfiltering is desired.
>
> A proxy based system really doesn't scale, money or no-money.

I think you're wrong: all ISPs, companies, and Internet sites with more than a
few users should have proxies for http, ftp, and nntp (news servers are the
most common proxy, of course). A single global proxy wouldn't work, of course,
but neither would a small set of central news servers. Local proxies
(aggregating gateways) reduce load on upstream sites, which helps the whole
net scale better.

That is part of why I say that user-configured filtering at an ISP-level proxy
is a perfect way to introduce ratings/filtering/selecting.

I'm pressed for time now, so I'm slow in implementing the following, but I'll
put out the design in case anyone has time on their hands.

(This assumes you've read my previous postings to rating@junction.net; email
me for a repeat. Note that I spent a fair amount of time researching possible
meta-data formats. There are several competing 'future standards' currently.)

- Create a form that lets a user select filtering settings for primary and
  secondary users (children). Also allow selection of meta information to be
  categorized or searched on; this could be used to highlight desirable new
  information (from www-announce, site-wide .meta files, comprehensive
  ratings servers, etc.).

- I'm planning on creating a CGI program that compares its arguments with a
  database of sites, users, and cached information. To begin with, it would
  match based on IP address; later, it would prompt for user information at
  the beginning of a session.

- Either through configuration or through modifications (I'm going to use
  either cern_httpd or spinner), map all URLs to /cgi-bin/filter/*.

- For each document, access a local .meta file, then site/.meta, then
  site/directory/.meta, then URL.meta, in reverse order.

- Read the document, scanning for <meta> tags, and create the cached URL.meta
  file.

- Compare the result to the user's settings, then either return the document
  or return a page showing why it was unavailable.

To use it, a user only has to point their client at this proxy server. (A
rough sketch of the per-request logic follows after my sig.)

> Kee Hinckley       Utopia Inc. - Cyberspace Architects      617/721-6100
> nazgul@utopia.com                                 http://www.utopia.com/
>
> I'm not sure which upsets me more: that people are so unwilling to accept
> responsibility for their own actions, or that they are so eager to regulate
> everyone else's.

sdw
--
Stephen D. Williams  25Feb1965 VW,OH (FBI ID)  sdw@lig.net  http://www.lig.net/sdw
Consultant, Vienna, VA  Mar95-  703-918-1491W  43392 Wayside Cir., Ashburn, VA 22011
OO/Unix/Comm/NN  ICBM/GPS: 39 02 37N, 77 29 16W home, 38 54 04N, 77 15 56W
Pres.: Concinnous Consulting, Inc.; SDW Systems; Local Internet Gateway Co.; 28May95
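For anyone who wants a concrete starting point, here is a minimal sketch of
that per-request check, written in Python. It is an illustration only: the
'key: value' .meta syntax, the numeric rating scheme, the cache location, and
every name in it are assumptions rather than part of the design above, and the
least-specific-first merge (so the more specific file wins) is just one
reading of "in reverse order".

    #!/usr/bin/env python3
    # Hypothetical sketch of the filter's per-request logic.  The .meta
    # syntax ("key: value" lines), the numeric rating scheme, and every
    # name below are assumptions for illustration only.
    import os, re, urllib.request

    META_CACHE = "/var/cache/filter"   # assumed location for cached URL.meta files

    def read_meta(path):
        """Parse a .meta file of 'key: value' lines into a dict (assumed format)."""
        meta = {}
        if os.path.exists(path):
            with open(path) as f:
                for line in f:
                    if ":" in line:
                        key, value = line.split(":", 1)
                        meta[key.strip().lower()] = value.strip()
        return meta

    def scan_document(body):
        """Collect name/content pairs from <meta> tags in the fetched document."""
        pairs = re.findall(r'<meta\s+name="([^"]+)"\s+content="([^"]+)"', body, re.I)
        return {name.lower(): content for name, content in pairs}

    def collect_meta(site, directory, url_key, body):
        """Merge metadata least-specific first, so more specific entries win:
        local default, site/.meta, site/directory/.meta, cached URL.meta,
        then the document's own <meta> tags."""
        merged = {}
        for path in ("default.meta",                      # local default .meta
                     os.path.join(site, ".meta"),
                     os.path.join(site, directory, ".meta"),
                     os.path.join(META_CACHE, url_key + ".meta")):
            merged.update(read_meta(path))
        merged.update(scan_document(body))
        return merged

    def allowed(meta, user_settings):
        """Compare document ratings to the user's limits (assumed numeric scheme,
        e.g. 'violence: 3' in the metadata vs. {'violence': 2} for the user)."""
        for key, limit in user_settings.items():
            if key in meta and meta[key].isdigit() and int(meta[key]) > limit:
                return False, key
        return True, None

    def filter_request(url, user_settings):
        """Fetch a document, build its metadata, cache URL.meta, and decide."""
        body = urllib.request.urlopen(url).read().decode("latin-1", "replace")
        parts = url.split("/")
        site, directory = parts[2], "/".join(parts[3:-1])
        url_key = re.sub(r"\W", "_", url)         # flatten the URL into a file name
        meta = collect_meta(site, directory, url_key, body)
        os.makedirs(META_CACHE, exist_ok=True)
        with open(os.path.join(META_CACHE, url_key + ".meta"), "w") as f:
            for key, value in meta.items():       # write the cached URL.meta file
                f.write(f"{key}: {value}\n")
        ok, why = allowed(meta, user_settings)
        if ok:
            return body
        return f"<html><body>Blocked: '{why}' exceeds your filter setting.</body></html>"

In a real deployment something like this would sit behind the /cgi-bin/filter/*
mapping on the proxy, with user_settings looked up from the per-user database
by IP address or, later, by session.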
Received on Tuesday, 20 June 1995 13:08:57 UTC