- From: Matt Jensen <mattj@newsblip.com>
- Date: Mon, 6 Nov 2000 13:04:11 -0500 (EST)
- To: Gordon Joly <gordo@dircon.co.uk>
- cc: Craig Pugsley <craig.pugsley@mimesweeper.com>, "'www-rdf-interest@w3.org'" <www-rdf-interest@w3.org>, "'semantic-web@w3.org'" <semantic-web@w3.org>
On Mon, 6 Nov 2000, Gordon Joly wrote:

> And Great store was put in LISP and Prolog to build AI systems.

To me, one of the most intriguing aspects of TBL's description of a Semantic Web is the parallel with the *flaws* of the WWW. Many people were working on hypermedia systems in the '80s, and a significant reason they stalled while the WWW took off is that they insisted on ensuring consistency, bidirectional links, etc., and Tim was willing to let go of that. The result is >1 billion WWW pages, and probably >10 billion links. A small percentage of the pages are broken, but on the whole the WWW provides tremendous value.

Similarly, I view most of what has been done in AI as focused on consistency, correctness, etc., which (so far) has limited the successes it can claim. If you're looking for a Semantic Web that can give you "truth", we've got a long wait. If you're looking for something that improves search results through related concepts and simple inferences, in a few years you should be able to get something that's useful, but not perfect.

-Matt Jensen
 NewsBlip
 Seattle
Received on Monday, 6 November 2000 13:08:55 UTC