- From: <paola.dimaio@gmail.com>
- Date: Thu, 23 Oct 2008 21:48:28 -0700
- To: "रविंदर ठाकुर (ravinder thakur)" <ravinderthakur@gmail.com>
- Cc: "Hugh Glaser" <hg@ecs.soton.ac.uk>, "Andreas Langegger" <al@jku.at>, "Semantic Web" <semantic-web@w3.org>, "semantic_web@googlegroups.com" <semantic_web@googlegroups.com>
Well, well, an interesting dilemma (a discussion possibly pertaining to another list?) /digress/

> if ontology is finite and well defined then it won't be complete, and if it's
> not well defined and not finite, well then I would say it's not an ontology.

I think, Ravinder, that it's worth distinguishing 'Ontology' from 'ontology' (sorry folks, I don't remember who said this, was it Tom, or Michael?), whereby the first is the metaphysical representation of 'what there is', with no particular limitation in scope - everything which exists is part of it - which, however, in absolute terms is imponderable; and the second refers to a subset of what there is, aka a domain ontology, limited by the domain boundary, which supports limited reasoning within a certain scope.

We only know what we think there is, based on our cognitive and perceptual abilities, and what we can prove exists, to some extent, based on some notion of logic and scientific method.

When I go to the shop to buy milk, I have no assurance that they have not run out until I get there. However, I get dressed, take the umbrella if it's raining, and walk to the shop nonetheless, based on a hypothesis, a likely assumption that the shop has the milk that I require. After a couple of times I don't find it, I try to understand the pattern - say, at what time does the milk tend to run out - and think of a solution, i.e. go earlier, or ask the shop to keep some aside for me, etc. The powerful human mind looks for a cause of the problem and for a solution implicitly. It builds a mental picture of the situation, leading to an adjustment in behaviour to avoid the problem - a very simple, very normal, very implicit capability of the mind.

Ontology is the schema of things that the human mind adopts as a reference for reasoning to take place; human knowledge representation is highly dynamic, capable of adjusting and expanding.

To build machines capable of replicating even this simple intelligent behaviour requires simulating the realm of assumptions, axioms and possibilities that only the human mind can handle intrinsically. So ontology engineering came about, in the attempt to support intelligent system behaviours (build me an intelligent butler, please).

The interesting part is that ontology engineering is exploring ways of developing 'learning ontologies', that is, ontologies that can evolve, adapt and expand their boundaries. It will be really exciting when different domain ontologies are able to interact with each other.

/digress ends/
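To make the 'domain ontology' half of that distinction concrete, here is a minimal sketch in Python using rdflib, built around the milk example. The ex: namespace and the Shop, Milk and sells names are made up purely for illustration; they are not from any existing vocabulary, and this is only one possible way to model the scenario.

    from rdflib import Graph, Namespace, RDF, RDFS

    # A made-up namespace for this illustration only
    EX = Namespace("http://example.org/shop#")

    g = Graph()
    g.bind("ex", EX)

    # A tiny domain ontology: its boundary is "shops and the products they sell"
    g.add((EX.Shop, RDF.type, RDFS.Class))
    g.add((EX.Product, RDF.type, RDFS.Class))
    g.add((EX.sells, RDF.type, RDF.Property))
    g.add((EX.sells, RDFS.domain, EX.Shop))
    g.add((EX.sells, RDFS.range, EX.Product))

    # Facts within that boundary (my working assumption that the shop has milk)
    g.add((EX.CornerShop, RDF.type, EX.Shop))
    g.add((EX.Milk, RDF.type, EX.Product))
    g.add((EX.CornerShop, EX.sells, EX.Milk))

    # Limited reasoning within the scope of the domain: which shops sell milk?
    for row in g.query(
        "SELECT ?shop WHERE { ?shop ex:sells ex:Milk }",
        initNs={"ex": EX},
    ):
        print(row.shop)

Anything outside that boundary (whether it is raining, whether the milk has in fact run out today) simply isn't expressible here, which is exactly the limitation, and the usefulness, of a domain ontology as opposed to Ontology with a capital O.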
Received on Friday, 24 October 2008 04:49:05 UTC