- From: Leo Obrst <lobrst@mitre.org>
- Date: Thu, 14 Feb 2002 17:58:05 -0500
- To: Deborah McGuinness <dlm@ksl.stanford.edu>
- CC: Ludger van Elst <elst@dfki.uni-kl.de>, www-webont-wg@w3.org
One of the problems in general is that these definitions are pretty meaningless to the unenlightened masses. I know: in many venues, I've had to tamp/dumb down (this is not really pejorative: why expect anyone outside our community to know the details?) the definition/exposition of what an ontology is and why it is useful. This is as true in business as it is in government. A paraphrase of my usual spiel (with parenthetical comments bracketed) at the ontology-naive level:

---

What's an Ontology?

An ontology defines the common words and concepts (meanings) used to describe and represent an area of knowledge. Ontologies are used by people, databases, and applications that need to share domain information (a domain is just a specific subject area or area of knowledge, like medicine, tool manufacturing, real estate, automobile repair, financial management, etc.). Ontologies include computer-usable definitions of basic concepts in the domain and the relationships among them. They encode knowledge in a domain and also knowledge that spans domains. So, they make that knowledge reusable.

An ontology includes the following kinds of concepts:

· Classes (general things) in the many domains of interest
· Instances (particular things)
· The relationships among those things
· The properties (and property values) of those things
· The functions of and processes involving those things
· Constraints on and rules involving those things

[I usually give an example here of an ontology which has the above items, in an English quasi-logical form; a rough sketch of such an example appears below.]

Ontologies are usually expressed in a logic-based language, so that fine, accurate, consistent, sound, and meaningful distinctions can be made among the classes, instances, properties, attributes, and relations. Some ontology tools can perform automated reasoning using the ontologies, and thus provide advanced services to intelligent applications such as: conceptual/semantic search and retrieval (non-keyword based), software agents, decision support, speech and natural language understanding, knowledge management, intelligent databases, and electronic commerce.

One way to look at ontologies is as metadata schemas (metadata is just data about data, mostly about its content; a schema is just a blueprint for particular data), that is, a way of structuring and representing the semantics (meaning) of metadata elements. What is normally known as an ontology can range from the simple notion of a Taxonomy (knowledge with minimal hierarchic or parent/child structure), to a Thesaurus (words and synonyms), to a Conceptual Model (with more complex knowledge), to a Logical Theory (with very rich, complex, consistent, meaningful knowledge).

[I introduce the notion of 'metadata', which many audiences have some familiarity with, and relate it in simple terms to 'database schema', which they may also have some knowledge of. I also sketch what I call the "Ontology Spectrum", a way of relating notions such as 'taxonomy', 'thesaurus', 'conceptual model', and 'logical theory' in an ascending way, so that naive audiences can relate what they know to the bigger picture of what 'ontologies' are all about.]

Ontologies figure prominently in the emerging "Semantic Web" as a way of representing the semantics of documents and enabling the semantics to be used by web applications and intelligent agents. Ontologies can prove very useful for a community as a way of structuring and defining the meaning of the metadata terms that are currently being collected and standardized.
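For instance, a tiny ontology for the automobile-repair domain might contain statements like the following (a minimal, hypothetical sketch; the class, instance, relation, and property names are invented purely for illustration):

    Class/subclass:   Every Sedan is an Automobile.
    Instance:         myCar is a Sedan.
    Relation:         myCar hasPart engine-17.
    Property/value:   The color of myCar is red.
    Process:          oilChange-42 is a RepairProcess performed on myCar.
    Rule/constraint:  Every Automobile has at least one Engine as a part.

Given statements like these, an automated reasoner can draw simple but useful conclusions, e.g., that myCar is an Automobile and therefore must have some Engine among its parts.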
Using ontologies, tomorrow's applications can be "intelligent", in the sense that they can more accurately work at the human conceptual level.

---

Hope this helps some,
Leo

Deborah McGuinness wrote:
>
> I also typically refer to Gruber's definition when I introduce ontologies -
> I think citing his work is important.
> When I introduce ontologies to people unfamiliar with our field, I also
> think it is useful to mention that ontology has been around in the
> philosophical literature for a long time and that our definition departs from
> theirs.
> I also typically point to collections of work on ontologies, e.g., the FOIS
> books.
>
> I wrote a paper on Ontologies Come of Age [1], which could be one of the
> things pointed to if you like, and of course it points to much previous work.
>
> [1]
> http://www.ksl.stanford.edu/people/dlm/papers/ontologies-come-of-age-abstract.html
>
> d
>
> Ludger van Elst wrote:
>
> > Hi Webont-Members,
> >
> > > "what is an ontology?" stuff in requirements abstract/intro
> > > From: Dan Connolly <connolly@w3.org>
> > ...
> > > let's
> > > see if there's some text to grab... yes:
> > >
> > >   Put simply, an ontology is just a set of
> > >   standard vocabulary terms along with some
> > >   formal definitions of the terms.
> > >
> > > Lightly edited:
> > >
> > >   An ontology is a vocabulary of terms along
> > >   with some formal definitions of the terms.
> >
> > I am a little bit surprised that - though nearly all papers about
> > ontologies refer to Tom Gruber's "shared conceptualization" definition -
> > all proposals in this group only capture the "conceptualization" aspect
> > but don't mention the "sharing" aspect.
> > In the requirements document there is a paragraph titled "3.1 Shared
> > Ontologies", which would - accepting Tom's definition - expand to "3.1
> > Shared Shared Conceptualizations".
> > In my opinion, a good definition of the term ontology should cover both
> > aspects, sharing and conceptualizing. Otherwise, we should consequently
> > only talk about conceptualizations (e.g., "A conceptualization is a
> > vocabulary of terms along with some formal definitions of the terms."),
> > not ontologies.
> >
> > What do you think about this topic?
> >
> > Best regards,
> > Ludger
> >
> > ______________________________________________________________________
> > Ludger van Elst
> > Deutsches Forschungszentrum für Künstliche Intelligenz GmbH
> > Erwin-Schrödinger-Straße Geb. 57/377, D-67608 Kaiserslautern, Germany
> > Tel.  : 0631 205-3474
> > E-mail: elst@dfki.uni-kl.de
> > WWW   : http://www.dfki.uni-kl.de/~elst/
> > ______________________________________________________________________
>
> --
> Deborah L. McGuinness
> Knowledge Systems Laboratory
> Gates Computer Science Building, 2A Room 241
> Stanford University, Stanford, CA 94305-9020
> email: dlm@ksl.stanford.edu
> URL: http://ksl.stanford.edu/people/dlm
> (voice) 650 723 9770  (stanford fax) 650 725 5850  (computer fax) 801 705 0941

--
_____________________________________________
Dr. Leo Obrst                The MITRE Corporation
mailto:lobrst@mitre.org      Intelligent Information Management/Exploitation
Voice: 703-883-6770          7515 Colshire Drive, M/S W640
Fax: 703-883-1379            McLean, VA 22102-7508, USA
Received on Thursday, 14 February 2002 17:59:02 UTC