
Message-ID: <337A8BFD.453DF77B@dlcwest.com>
Date: Wed, 14 May 1997 22:07:25 -0600
From: Doug Sheppard <sirilyan@dlcwest.com>
To: Paul Prescod <papresco@calum.csclub.uwaterloo.ca>
CC: wlkngowl@unix.asb.com, Bruno Kestemont <bkest@ulb.ac.be>, www-html@w3.org
Subject: Re: structured metadata using A

Paul Prescod wrote:
> > Most people looking at a bogus web page will quickly
> > realize it and continue down the search list.
> 
> Hits that are infrequently followed in response to a particular search
> should be moved down the list. That might also clean up the problem of
> hundreds of copies of the same document or documents in the same set.
> All but the most important of the hits would drop to the bottom of the
> list.

All but the least-viewed documents, which is not the same as the "most
important" ones.  If the 50th hit for a certain keyword set is the one
the user is looking for, but the first ten hits are all bogus, the
click-throughs on those first ten hits will still register with the
engine.  This could create a feedback loop: the user, assuming the top
hits are the "most important", visits them first, and doesn't discover
they're bogus until they've already helped reinforce their ranking.
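That feedback loop can be sketched with a toy simulation (the numbers
and the user model here are hypothetical, not anything a real engine
does): documents that happen to start at the top keep accumulating
clicks, so a ranking based purely on click counts never surfaces the
relevant hit.

```python
# Toy simulation of the click-through feedback loop described above.
# All parameters are hypothetical; this is a sketch, not a real engine.

def rank_by_clicks(clicks):
    """Order document ids by accumulated click count, descending.

    Ties keep dictionary (insertion) order, so the initial ranking
    is simply documents 0, 1, 2, ... before any clicks arrive.
    """
    return sorted(clicks, key=lambda doc: clicks[doc], reverse=True)

# Documents 0-9 are bogus; document 49 is the one the user wants.
relevant = {49}
clicks = {doc: 0 for doc in range(50)}

for _ in range(100):  # 100 users run the same search
    ranking = rank_by_clicks(clicks)
    # Each user checks the top ten hits in order, clicking each one,
    # and stops as soon as they find the relevant document.
    for doc in ranking[:10]:
        clicks[doc] += 1
        if doc in relevant:
            break

# The bogus hits absorb every click; document 49 never rises.
print(49 in rank_by_clicks(clicks)[:10])  # → False
```

Because the ten bogus hits start on top and every click strengthens
them, the relevant document never gets a click at all in this model.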

Given that the current state of the art is "first 50 words or
meta-description", I think we'd need to have better annotation and
metadata systems in place - something that provides a structured view
of a document, perhaps? - before we could start ranking search results
by click-through popularity.
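(For reference, the "meta-description" mentioned above is the META
element some engines already index in place of the first 50 words; the
content value below is just a placeholder:

    <meta name="description" content="A short summary of this page.">

It's free-text, though, which is exactly why it offers no structured
view of the document.)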