- From: Azamat <abdoul@cytanet.com.cy>
- Date: Fri, 7 Apr 2006 13:01:53 +0300
- To: "Danny Ayers" <danny.ayers@gmail.com>
- Cc: <semantic-web@w3.org>, "ONTAC-WG General Discussion" <ontac-forum@colab.cim3.net>, "Hans Teijgeler" <hans.teijgeler@quicknet.nl>, <seanb@cs.man.ac.uk>, "Paul Prueitt (ontologystream)" <psp@ontologystream.com>
Danny, Please see my comments below. With respects, Azamat Abdoullaev

----- Original Message -----
From: "Danny Ayers" <danny.ayers@gmail.com>
To: "ONTAC-WG General Discussion" <ontac-forum@colab.cim3.net>
Cc: "Hans Teijgeler" <hans.teijgeler@quicknet.nl>; <semantic-web@w3.org>; "John F. Sowa" <sowa@bestweb.net>; <seanb@cs.man.ac.uk>; "Peter F. Patel-Schneider" <pfps@research.bell-labs.com>; "Frank Manola" <fmanola@acm.org>; "Paul Prueitt (ontologystream)" <psp@ontologystream.com>
Sent: Friday, April 07, 2006 12:49 AM
Subject: Re: [ontac-forum] Re: owl:Class and owl:Thing

Hey Azamat, I have every intention of reading your post in its entirety, but in lieu of that, what Hans said: you must be a hell of a typist!

Asha: Sorry about that. To meet your and Hans's requests, I passed the content through the editor; try to read it now. <Asha/>

There is a question I'm curious about. I believe your main point is that without a firm foundation in reality modelling, ontological activity will be lacking. The reality modelling you describe includes physics and other conceptual frameworks that are, er, grounded in reality. But what of reasoning about things which are not conveniently addressed through models of reality? I have a concept of a poem, and on reading a given poem I may have (highly subjective) interpretations of what the poet intends when he talks of a "host of golden daffodils". Me, I wonder whether he means the big cultured things, or the little wild ones. I suspect the latter, and as I've visited the geographic areas which inspired said poet I assume I should remember whether there were little wild ones there or not. But I honestly can't remember anything much but a leaky tent. No angels.

Asha: Reality exists at several basic levels: natural (physical, chemical, biological), psychological, cultural and computational (virtual).
So your conceptual world, with its diverse variety of mental entities (thoughts, ideas, feelings and images) and their associations, is a part of the [mental] world. <Asha/>

My point is this: is knowledge representation just about things that can be related to reality, or is it about what the humble human considers as knowledge? (I bet there's 3000 years of literature that could be pointed to here, but I'd rather ask here & now ;-)

Asha: It is crucially important to differentiate two kinds of representational languages and technologies: World Representation and knowledge representation; that is, the World Representation and Reasoning language [the subject matter of UFO] on the one hand, and mere knowledge representation and reasoning languages [the subject matter of formal logic, i.e., the SW languages and the upper ontologies using formal logical representations] on the other. The UFO is all about constructing a general framework as a unifying theoretical system, and thence a universal language, by compounding the classical models and theories about the nature and pattern of reality within a single standard account. For there is a global Master Ontology dealing with the world, its things, beings, and relationships, and there is a plurality of domain ontologies dealing with the specific regions, parts, domains, or realms of reality. Computing ontology is all about the representation of the world, its entity states, changes, and relationships, in machine-processed forms. In other words, there is Ontology in the primary (intensional) sense and ontologies in the secondary (extensional) senses. And the basic meaning consists in being a fundamental account of reality and realities and their associative orderings. Thus the Global Master Ontology concerns the entity and relation types in the world in the first place.
Only secondarily does it study how the realities [world things and relationships] relate (map, project) to the concepts and associations in the mind, to the coded representations and structures in machines, and to the words and sentences in natural languages. There is a seminal article written by Barry Smith, ''Beyond Concepts: Ontology as Reality Representation'', which I recommend reading. <Asha/>

In pragmatic terms it certainly does seem to be most immediately productive to use software to deal with reality-based problems, and in general we seem fairly well equipped to express these in a mathematical form that fits with the machines. But isn't it implicit in the upper ontology approach that it will exclude conceptual structures that may not fit with any consensus view of reality? The domain-independence of languages like RDF/OWL could be applied to many realities, without any need to commit to any particular model. Isn't that an advantage?

Asha: I was amazed how quickly you understood that the SW languages are just exploiting an old legacy set of predicables: ''class'', ''property'', ''sameness'', ''difference'', and ''inheritance''. A guiding formula of SW: UFO (Global Master Ontology of Reality) U ULO (Upper Level Ontologies, UML, SUO, GFO/GOL, BWW, DOLCE, etc.) U UFL (Unifying Framework Logic, FMF and SW Languages, RDF/OWL, etc.) <Asha/>

Ok, capture this post as "whim" instantiation-of "fleeting thought"... Cheers, Danny.

ASHA: The class/thing distinction makes all the difference here, and you will hardly get any explicit account from the OWL authors. For it is a central issue in all current activities of building top ontologies (SUO, USECS, ONTAC, etc.) and SW languages (RDFS, OWL, OWL 1.1, etc.), and it touches the sorest spot in the whole logical enterprise of OWL ontology passing as an ontological undertaking, 'breaking all implicit and explicit assumptions of computing science'.
The status, validity, and expressivity of any general representational language and technology are chiefly determined by the way it treats the things in the world and their basic properties. There are usually three main choices widely practiced; namely, you can define 'Thing' either as an individual, or as a class of individuals, or as the universal class, viz., the class of all classes. Or, in terms of quantities: as a fixed value (constant), an individual variable, or a class variable. The narrow view of a thing as [an individual entity with a specific identity] has a long history ('a primary substance', 'a bare individual', etc.) and was supported by such modern logicians and ontologists as Quine, for whom 'to be is to be the value of a bound variable'. In the OWL domain, the extension of the construct owl:Thing has only individual things, being void of other essential meaningful dimensions. In the biological classificatory system, this corresponds to the level of species, whose members share a set of essential features and are bound by a membership relationship between an individual and its class. Note that you can subject a collection of individuals, say, the totality of human beings, to further divisions and subdivisions, such as man and woman; White or Black or Yellow or Red; the aged or the young; the poor or the rich; the working class or the professional class; underworld, lower class, middle class or higher class; etc. Yet these are not (genetically) essential classifications. You are still in the domain of individuals; even infinitely increasing the number of individuals does not allow you to create a new class or species or kind. Therefore we speak of two types of difference: in kind or in degree.
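The contrast between the narrow view ('Thing' as the class of individuals) and mere subdivision of a collection can be sketched in Turtle; a minimal illustration, where the ex: namespace and names such as ex:Socrates are hypothetical:

```turtle
@prefix rdf:  <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix owl:  <http://www.w3.org/2002/07/owl#> .
@prefix ex:   <http://example.org/> .

# 'Thing' read narrowly: the extension of owl:Thing holds only individuals.
ex:Socrates rdf:type owl:Thing .

# Subdividing a collection of individuals does not create a new kind:
# ex:Man subdivides ex:Human, but its members remain individuals,
# bound to their class only by the membership (rdf:type) relationship.
ex:Human rdfs:subClassOf owl:Thing .
ex:Man   rdfs:subClassOf ex:Human .
ex:Socrates rdf:type ex:Man .
```

However deep such subdivision goes, every member stays an individual; the difference introduced is one of degree, not of kind.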
But a more fundamental, profound position is to consider Thing (or Entity) as the class of classes (the set of subsets) at least; at best, as the class of all classes (the universal set of all sets), hierarchically ordered by inclusion (containment) relationships (or whole-part relationships). For, as a class variable, Thing will have lower classes and subclasses as its values as well; it is the type of variable whose values are themselves variables (like a metasyntactic variable 'foobar', where "the value of f(foo, bar) is the sum of foo and bar"). Returning to our sheep: the OWL semantic language. To be blunt, as a general ontological language it is fundamentally defective, and it would be a technological catastrophe to use it as 'Ontology Infrastructure for the Semantic Web' [1], for several evident reasons. First, the polar terms of the OWL vocabulary are individuals, classes, and properties, which are, above all, mathematical and logical abstract terms without real content and substance, i.e., without reference to reality. To be an ontology, its basic construct, the class Thing, should be equal to the class of all entity classes. Of these, the most fundamental are the class of Substance (Object), the class of State (Quantity and Quality), the class of Process (Change or Action), and the class of Relationship. Each one of these Entity classes is organized as a hierarchy of subordinate classes (kinds and types), where the particular levels are occupied by such individual things (or instances, particulars, and concrete entities) as objects, specific states, unique events, and specific connections. Crucially, 'definition', 'class', 'property' and 'statement' (see Topics) should be filled with real content and meaning. Even if you have an idiosyncratic set of ontological commitments as your pivotal environmental and cognitive universals, they must still be ontological classes rather than empty logical entities.
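The proposed top level of four fundamental entity classes could itself be written down in OWL; a hedged sketch under the stated assumptions (the ex: names are illustrative, not drawn from any published ontology):

```turtle
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix owl:  <http://www.w3.org/2002/07/owl#> .
@prefix ex:   <http://example.org/> .

# Thing read as the class of all entity classes, with the four proposed
# fundamental subclasses, each the root of a hierarchy of kinds and types.
ex:Thing        owl:equivalentClass owl:Thing .
ex:Substance    rdfs:subClassOf ex:Thing .   # objects
ex:State        rdfs:subClassOf ex:Thing .   # quantities and qualities
ex:Process      rdfs:subClassOf ex:Thing .   # changes and actions
ex:Relationship rdfs:subClassOf ex:Thing .   # connections
```

Note that in OWL DL such a scheme cannot literally make Thing a class of classes; only in OWL Full can a class appear as the member of another class.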
Second, consider the construct owl:Property, with its two basic types: owl:ObjectProperty (mapping individuals to individuals) and owl:DatatypeProperty (mapping individuals to datatype values). In fact, there are monadic and dyadic properties; essential and accidental; atomic, transient, complex, or emergent; particular and general; etc. But most important is to tell the formal properties (attributes) from the ontological properties, which are generally classified as:
1. the property of being a substance (object): substantial properties;
2. the property of being a state (quantity or quality): quantitative and qualitative properties;
3. the property of being a process (change, action, operation): dynamic, functional, operational properties;
4. the property of being a relationship: relational properties per se.
Thus, in the OWL domain, owl:Property is badly narrowed to the property of being a formal (functional) relationship, direct or inverse, without explicitly identifying the nature and type of relations between things, such as spatial, temporal, causal, whole/part, syntactic, semantic, pragmatic, etc. Moreover, dealing with only two main types of property, owl:ObjectProperty and owl:DatatypeProperty, existing as disjoint constructs, discards any hope of measuring commensurability between magnitudes (entity variables) and multitudes (numbers, numerical values). There are many other defects and contradictions, particularly in its (subsumption) logic, which may take more time and patience, so I had better stop for now. Moral: contrary to the OWL people's feelings and hopes, it is not an ontology but a sort of formal language involving a functional, formal logic, and it should properly be renamed FoLWL or LWL, the Logical Web Language.
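The two OWL property constructs criticized above look like this in Turtle; a minimal sketch, where the ex: property names are hypothetical:

```turtle
@prefix owl:  <http://www.w3.org/2002/07/owl#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix xsd:  <http://www.w3.org/2001/XMLSchema#> .
@prefix ex:   <http://example.org/> .

# owl:ObjectProperty: relates individuals to individuals.
ex:partOf a owl:ObjectProperty ;
    rdfs:domain owl:Thing ;
    rdfs:range  owl:Thing .

# owl:DatatypeProperty: relates individuals to datatype values.
ex:massInKg a owl:DatatypeProperty ;
    rdfs:domain owl:Thing ;
    rdfs:range  xsd:decimal .

# The two kinds are disjoint (in OWL DL), so a single property cannot
# relate an individual both to another individual and to a number —
# the disjointness the paragraph above objects to.
```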
Accordingly, the semantic web turns into the formal semantic web, which is a poor abstraction of the real (semiotic) Web [as it has recently turned out], asking for a firm conceptual foundation: an n-relational ontology of things and its complement, ontological semiotics. Or else put away, for a long time, your lofty hopes about real-life knowledge applications and web-based intelligent systems capable of representing and reasoning about the world, and have instead a 'wonderweb' blowing off billions and billions of public funds. It seems something must be done to stop this fast-going and widely spreading pandemic of nescience. Hans, about your specific problem: you are on the right track. On the ontological abstract level, a pump is a specific class (species) of Thing [> substance > physical substance > artefact > device > mechanism > mechanical device] marked by a specific [functional property] of moving fluid or gas [substance] by suction or pressure [process]. This is all about its intensional meaning, its primary definition, while its extension is made up of all the types of pumps, differentiated by the type of working substance used, ways of operation, construction, etc.: gas pump, oil pump, water pump, lift pump, hydraulic pump, hand pump, foot pump; you may continue such a division ad infinitum. In the actual world of particular things, a pump is an individual existing as a concrete physical object, a unique instance of a class of physical devices. All the confusion comes from the replacement of the fundamental ontological category of Thing or Entity with an empty logical category, owl:Class. And please don't throw the 'things' out like the child with the bathwater; rather, discard the empty 'classes', the bath itself [with its dirty water].
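Hans's pump case can be sketched in OWL Full Turtle, where Pump is both a class and an individual; a hedged sketch, with all ex: names illustrative (ex:ClassOfInanimatePhysicalObject echoes the class Hans mentions below):

```turtle
@prefix rdf:  <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix ex:   <http://example.org/> .

# Pump as a class: the intensional definition places it in a hierarchy.
ex:Pump rdfs:subClassOf ex:MechanicalDevice ;
    rdfs:comment "Device that moves fluid or gas by suction or pressure." .

# Extension: kinds of pump, divided by working substance and operation.
ex:WaterPump rdfs:subClassOf ex:Pump .
ex:HandPump  rdfs:subClassOf ex:Pump .

# In OWL Full, ex:Pump may also be an individual, e.g. a member of a
# class-of-classes (Hans's ClassOfInanimatePhysicalObject).
ex:Pump rdf:type ex:ClassOfInanimatePhysicalObject .

# A concrete pump in the actual world is an instance of the class.
ex:pump42 rdf:type ex:WaterPump .
```

This dual use of ex:Pump (as subject of rdf:type and as a class in rdfs:subClassOf) is legal in OWL Full but falls outside OWL DL.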
> with all respects,
>
> Azamat Abdoullaev

On 3/31/06, Azamat <abdoul@cytanet.com.cy> wrote:
> Hans inquired:
> ''Is it possible that owl:Individual, that once existed [1], was meant to be
> the class of REAL individuals in a REAL world?''
> Hans decided:
> ''I have thrown out the owl:Thing. Much easier to read for humans.''
>
> ----- Original Message -----
> From: "Hans Teijgeler" <hans.teijgeler@quicknet.nl>
> To: "'Dave Reynolds'" <der@hplb.hpl.hp.com>
> Cc: "'SW-forum'" <semantic-web@w3.org>
> Sent: Friday, March 31, 2006 10:27 AM
> Subject: RE: owl:Class and owl:Thing
>
> > The class Pump is such a case where it is both an owl:Class and an
> > individual, as a member of the class ClassOfInanimatePhysicalObject. Yet
> > it has not been declared as owl:Thing. I understand from you that that is
> > OK.
> >
> > Is it possible that owl:Individual, that once existed [1], was meant to be
> > the class of REAL individuals in a REAL world?
> >
> > Regards,
> > Hans
> >
> > [1] http://wonderweb.semanticweb.org/deliverables/documents/D1.pdf
> >
> > =========================================================================
> >
> > -----Original Message-----
> > From: Dave Reynolds [mailto:der@hplb.hpl.hp.com]
> > Sent: Thursday, March 30, 2006 23:58
> > To: Hans Teijgeler
> > Cc: SW-forum
> > Subject: Re: owl:Class and owl:Thing
> >
> > Hans Teijgeler wrote:
> >
> >> In OWL-Full it is possible to have a class that also is an individual
> >> in the context of a class-of-class. We have that a lot. Now my
> >> question is whether or not I shall call the same object an owl:Class
> >> when it is in the role of class, and call it an owl:Thing when it is
> >> in the role of individual. If not, what shall prevail? Or must I
> >> declare it twice?
> >
> > You don't *need* to declare it at all in OWL/full.
> >
> > If you use a resource in the role of a class then it can be inferred
> > to be a class.
> > For example, if you use it as the object of an rdf:type statement or
> > in an rdfs:subClassOf statement then it can be inferred to be an
> > rdfs:Class. In OWL/full rdfs:Class and owl:Class have the same extension.
> >
> > Similarly it can be inferred to be an owl:Thing (for trivial reasons in
> > OWL/full) and probably some subclass of owl:Thing based on the
> > domain/range of whatever properties you apply to it.
> >
> > However, it may be useful for human readers of your ontology if you
> > document its dual nature by declaring both its types explicitly along
> > with appropriate rdfs:comments.
> >
> > Dave
>
> _________________________________________________________________
> Message Archives: http://colab.cim3.net/forum/ontac-forum/
> To Post: mailto:ontac-forum@colab.cim3.net
> Subscribe/Unsubscribe/Config: http://colab.cim3.net/mailman/listinfo/ontac-forum/
> Shared Files: http://colab.cim3.net/file/work/SICoP/ontac/
> Community Wiki: http://colab.cim3.net/cgi-bin/wiki.pl?SICoP/OntologyTaxonomyCoordinatingWG

--
http://dannyayers.com
Received on Friday, 7 April 2006 10:06:55 UTC