Re: Advancing translational research with the Semantic Web

On May 20, 2007, at 11:49 PM, Alan Rector wrote:

>
> Chris
>
>
> On 18 May 2007, at 18:10, Chris Mungall wrote:
>
>
>>
>> I'm afraid I'm unclear how to state the OWL n-ary relation pattern 
>> (http://www.w3.org/TR/swbp-n-aryRelations) where I really need it.  
>> In all the examples given, the "lifted"[*] n-ary relation was  
>> never truly a relation in the first place and always better  
>> modeled as a class. It's kind of cheating. What if my n-ary  
>> relation is transitive or if the 3rd argument is a temporal  
>> interval over which the relation holds?
>>
>> I think the former is doable with property role chains. Updating  
>> the n-ary relations note with this - and all the other omitted  
>> details, such as how to re-represent domain/range, functional  
>> properties, n-ary relations in restrictions etc - would take a lot  
>> of work and would make it utterly terrifying to the naive user.
>>
>> Nevertheless the results are clunky and will need special tool  
>> support[**] to avoid going insane.
>
> I'd love to see DLR or similar means worked into future versions of  
> OWL or other standards, although I am not the one to comment on the  
> logical/complexity issues.   I certainly agree that re-expressing
> relations as properties carries a modest penalty by being more  
> verbose, but it is manageable.
>
> To take the example in question for some relation R, let's take
> temperature as the example.  I shall use the subrelations
> "has_feature" / "has_state" to minimise arguments over what is, and
> is not, a "quality" - an issue not germane to this discussion. Also
> I will use "has_state" as the property name so we don't have both a
> property "has_value" and a keyword VALUE.
>
> In the binary relation form, in simplified Manchester syntax in OWL
> 1.0, we have:
>
> Organism has_feature SOME (Temperature_Feature THAT
>         has_temporal_extent VALUE temporal_extent_1 AND
>         has_state SOME (has_magnitude VALUE 37 AND has_units VALUE  
> degrees_C))
>
> where temporal_extent_1 is an individual which has the facts
> 	has_start_time VALUE n AND has_end_time VALUE m,
> and where has_magnitude is a functional datatype property and
> has_units is a functional property.

Here Temperature_Feature is a "history" (sensu Hayes) or a
time-slice. Do I have this correct?

This sort of thing can always be made to work if the relevant  
concessions are made in the upper ontology. For example, in the above  
I never talk of qualities-as-continuants, but only through their  
histories. To my mind this complicates things a lot - unless you  
fully embrace the 4D view of the world.

What about relations such as part_of and location? For example, a
protein that is in the cytoplasm at a certain time:

Protein THAT has_feature SOME (Location_Feature THAT
	has_temporal_extent VALUE temporal_extent_1 AND
	has_location SOME cytoplasm)

Would this be a fair extrapolation?

Would the following be accurate for a 4D representation of the same  
thing?

Protein THAT has_history SOME (History THAT
	has_temporal_extent VALUE temporal_extent_1 AND
	has_location(4d) SOME (History THAT history_of SOME cytoplasm))
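
Incidentally, coming back to my point about property role chains: in
the lifted form I think the location relation can at least be made to
propagate over the partonomy with a role chain. A rough sketch - part_of
and cell are assumed names that don't appear above, and the keyword
spelling follows the current OWL 1.1 Manchester syntax drafts:

	ObjectProperty: has_location
		SubPropertyChain: has_location o part_of

	Class: cytoplasm
		SubClassOf: part_of SOME cell

With these two axioms, has_feature SOME (Location_Feature THAT
has_location SOME cytoplasm) should be subsumed by the same expression
with cell in place of cytoplasm, leaving the has_temporal_extent part
untouched. The harder case - chaining through another time-indexed
feature, where the two temporal extents would have to agree - still
looks out of reach to me, which is closer to what I meant about the
relation being transitive.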

> where n,m are date-time expressions, for simplicity let us assume  
> integers representing milliseconds since some reference point.

Fair enough. A lot of the time you wouldn't have an ordinal scale but
rather a partial ordering, but this doesn't affect the design pattern.
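
Just to check I'm reading the individual correctly: I take it
temporal_extent_1 gets asserted along these lines, where the two
integer literals are arbitrary millisecond values standing in for n
and m:

	Individual: temporal_extent_1
		Facts: has_start_time 0,
		       has_end_time 3600000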

> In OWL 1.1 we can do quite a bit better - although again there is
> a need for improved tools to make it easier.
>
> *	An organism has a given temperature at some point in an interval
>
> anOrganism -->
> 	has_feature SOME (Temperature_feature THAT
> 	has_time_point SOME (has_coordinate SOME int[>=n, <m]) AND
> 		has_state...
>
> *	An organism has a given temperature throughout an interval.
> (This has to be expressed as "Any temperature feature of the
> individual anOrganism in the time interval has the given state".)
>
> Temperature_feature THAT
> 	is_had_by VALUE anOrganism AND
>         has_time_point SOME (has_coordinate SOME int[>=n, <m]) -->
>             has_state...
>
> where 	is_time_point_of: inverse of has_time_point
> 		has_time_point: functional
> 		Axiom: 	(Feature THAT has_time_point SOME Time_point) has_state Max 1 State.
> 		has_coordinate is used here with int since I am assuming it is
> 		measured in "ticks since basepoint", but could equally well be a float.
>
>
>
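
Before moving on - to make sure I follow the OWL 1.1 machinery, I have
tried spelling the above out in something closer to concrete Manchester
syntax. Timed_feature is just a name I've introduced for the left-hand
side of your axiom (frame-style Manchester syntax seems to want a named
class rather than a general class inclusion), keyword casing follows
your examples rather than any particular parser, and where you write
int[>=n, <m] I take it the concrete form would use actual integer
literals, e.g. integer[>= 0, < 3600000]:

	ObjectProperty: has_time_point
		Characteristics: Functional
		InverseOf: is_time_point_of

	DataProperty: has_coordinate

	Class: Timed_feature
		EquivalentTo: Feature AND (has_time_point SOME Time_point)
		SubClassOf: has_state MAX 1 State

Is that the intended reading - i.e. the at-most-one-state constraint
hangs off a class expression rather than off has_state itself?
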
>> Nevertheless the results are clunky and will need special tool  
>> support[**] to avoid going insane. In general I am wary of design  
>> pattern type things - they are usually a sign that the language  
>> lacks the constructs required to express things unambiguously and  
>> concisely.
>
> Separate "unambiguously" and "concisely".  Whether or not there is  
> something ambiguous about a design pattern depends on the case.  In  
> this case I think there is no ambiguity.  "Concisely" is a matter  
> for tools and layered "higher level languages".
>
> The history of computing is the history of "design patterns" at one  
> level that eventually get built into "higher level languages" at  
> the next level of abstraction up.

I think I have a less optimistic view of progress in computer  
science. For example, many of the paradigmatic GoF design patterns  
are there to make up for deficiencies in the OO languages that  
*succeeded* more expressive and abstract functional languages.

>   No one would argue against layering more convenient languages on
> top of OWL (or its successors).  The patterns are a first step
> towards this end, just as they were in the early days of  
> programming languages.  Neither would anyone argue against more  
> expressive languages.
>
> But I would argue that building on known, tested, and proven  
> semantics and computational methods is preferable to inventing new  
> ones.  I'd rather spend my time on improving tooling for something  
> well-understood, standardised, and supported by a community of  
> specialists than on trying to invent something new on my own that  
> was likely to be none of these things.   I'll invent when I have to  
> - when I am convinced that the best available methods do not meet  
> mission critical needs.  But I take a lot of convincing, and even  
> if convinced I will build out from the well understood foundations  
> wherever possible, with just enough extra invention to do what is  
> required.

I don't think I would disagree here.

> I speak from experience.  I've done both.
>
> Regards
>
> Alan
>
> -----------------------
> Alan Rector
> Professor of Medical Informatics
> School of Computer Science
> University of Manchester
> Manchester M13 9PL, UK
> TEL +44 (0) 161 275 6149/6188
> FAX +44 (0) 161 275 6204
> www.cs.man.ac.uk/mig
> www.clinical-esciences.org
> www.co-ode.org
>
>
>
>

Received on Monday, 21 May 2007 18:51:12 UTC