Sixdegrees and trust

All this talk about collaborative filtering, abuse, and trust brings to mind the Sixdegrees site at http://www.sixdegrees.com/

The site lets users enter lists of their friends, acquaintances, etc., and maintains all the information in a database. The system then knows who your friends are, who your friends-of-friends are, and so on.

The applications they've set up (movie reviews, discussion boards) don't strike me as more than kind of nifty. But the database of personal connections is potentially tremendously useful: you can find out how you're connected to anyone. That seems like a great mechanism for filtering out spam, abuse, etc., in any open communication medium (like annotations, and ratings on annotations).
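To make the idea concrete, here is a minimal sketch of how such filtering might work. Everything here is hypothetical (the graph representation, the function names, and the hop threshold are my own assumptions, not anything Sixdegrees actually exposes): given a friend graph, a breadth-first search finds how many hops separate a reader from an annotation's author, and annotations from authors beyond some distance are dropped.

```python
from collections import deque

def degrees_of_separation(graph, me, author, max_depth=6):
    """Breadth-first search over a friend graph (dict mapping a name to a
    set of friends); returns the hop count from `me` to `author`, or
    None if they are not connected within `max_depth` hops."""
    if me == author:
        return 0
    seen = {me}
    frontier = deque([(me, 0)])
    while frontier:
        person, depth = frontier.popleft()
        if depth >= max_depth:
            continue
        for friend in graph.get(person, set()):
            if friend == author:
                return depth + 1
            if friend not in seen:
                seen.add(friend)
                frontier.append((friend, depth + 1))
    return None

def trusted(graph, me, author, threshold=3):
    """Accept an annotation only if its author is within `threshold`
    hops of the reader in the connection database."""
    hops = degrees_of_separation(graph, me, author)
    return hops is not None and hops <= threshold
```

So a friend-of-a-friend-of-a-friend would pass a three-hop threshold, while a total stranger (or a spammer not in the database at all) would be filtered out. The threshold is the knob: tighter means less noise but fewer voices.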

Anyhow. Definitely worth a look if you haven't seen it already, and if you're interested in social networks as part of a solution for filtering annotations.

				- Misha



At 11:23 AM 12/9/98 -0500, dlaliberte@gte.com wrote:
>Jakob Hummes writes:
> > Do you really think that ISPs will switch your annotations service off,
> > just because Web page maintainers do not like your service?  At least, I
> > would suggest that other ISPs will quickly step in and offer such a service
> > to you as a competitive advantage...  If consumers want it, they'll get
> > it.
>
>Yes, I did mention that an ISP-sponsored annotation service might be
>viable.  But the risk of litigation might scare off such ventures.
>Moreover, how much do the consumers really want annotations?  At what
>additional cost?
>
>Some things seem like they should be interesting, but they aren't, like
>annotations.  Other things seem like they would not be interesting, but
>they are, like chat.  Apparently, we are the odd ones.
>
> > Still all the other challenges (especially uninteresting and misleading
> > annotations) still exist.  This can be overcome by filtering operations
> > (cooperative filtering and defined interest groups come to mind).
>
>Automatic filtering will continue to be an interesting technological
>challenge for a long time since it is essentially an artificial
>intelligence problem.
>
>Enabling *people* to do the filtering is a far easier technological
>problem, but it is also a social problem: why would they bother?
>Certainly some people will do reviews for you, but they have to be
>getting something for it, or it has to be extremely easy.  Furthermore,
>whose filtering are you going to trust?  There are quite a few pieces
>missing from the web-of-trust puzzle, but I think it will eventually
>happen.
>
>--
>Daniel LaLiberte
> dlaliberte@gte.com  (was: liberte@ncsa.uiuc.edu)
> liberte@hypernews.org
>

Received on Wednesday, 9 December 1998 12:02:08 UTC