a need for validating an .owl across software

Hi,

I've been using several OWL editors lately (Protégé 3 and 4, SWOOP, POWL).
They often interpret the same .owl file differently. This, of course, is not
a good situation, because it undermines a fundamental idea of the Semantic
Web: a common, cross-domain interpretation of information.

I've been thinking about practical methods for solving this problem.

- an acid test -
W3C could post an .owl, some data, and the expected reasoning results.
Software could then be validated by checking that it reaches the same
conclusions, so software makers have a benchmark to check against.
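
To make this concrete, here is a rough sketch of what such a benchmark
check could look like, in Python with rdflib. The file names are made up,
the reasoner is treated as a black box that writes its conclusions to a
file, and blank-node subtleties are ignored for simplicity.

    from rdflib import Graph

    # Hypothetical benchmark files: the published expected conclusions,
    # and the output my reasoner produced from the posted .owl and data.
    expected = Graph().parse("w3c-expected-conclusions.nt", format="nt")
    actual = Graph().parse("my-reasoner-output.nt", format="nt")

    # The software passes when every published conclusion was derived.
    missing = [t for t in expected if t not in actual]
    if missing:
        print(f"FAIL: {len(missing)} expected conclusion(s) not derived")
    else:
        print("PASS: all benchmark conclusions derived")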

- a checksum -
A very specific method for serializing the core data of the .owl should
result in something that uniquely identifies the inner logic of the .owl.
That result can be checksummed and is thus available for comparison across
.owl software.
A use case: a site hosts an .owl for download, and its description includes
a number. After downloading the .owl and loading it into a piece of
software, that number can be checked against the number the software itself
calculates.
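
As a rough sketch of what I mean, again in Python with rdflib: sort the
N-Triples form of the graph and hash it. The file name is made up, and it
assumes the .owl contains no blank nodes (those would need extra
canonicalization first, e.g. rdflib.compare.to_canonical_graph).

    import hashlib
    from rdflib import Graph

    g = Graph().parse("ontology.owl", format="xml")  # hypothetical file

    # N-Triples is line oriented, so sorting the lines gives a stable
    # form of the triple set that no longer depends on how the editor
    # happened to order or indent its RDF/XML.
    nt = g.serialize(format="nt")  # a str in recent rdflib versions
    lines = sorted(line for line in nt.splitlines() if line.strip())

    digest = hashlib.sha256("\n".join(lines).encode("utf-8")).hexdigest()
    print(digest)  # the number a site could publish next to the download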

- a round trip mechanism -
W3C could provide a specific load-save regime: start with one particular
editor, load the .owl, change it, save it. Then open the .owl in another
editor and repeat the same steps, and so on. The final step is to load the
.owl in the first editor again and check that everything is still intact.
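
That final check could be automated with a sketch like this (rdflib again;
file names are made up, and it assumes the regime applies and then reverts
the same change, so the final file should be equivalent to the first save):

    from rdflib import Graph
    from rdflib.compare import isomorphic

    original = Graph().parse("ontology.owl", format="xml")
    after = Graph().parse("ontology-after-roundtrip.owl", format="xml")

    # isomorphic() compares the triple sets, matching blank nodes
    # structurally rather than by their (unstable) labels.
    if isomorphic(original, after):
        print("round trip OK")
    else:
        print("round trip CHANGED the ontology")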

- note on rdf validators -
The online RDF validators at the moment 'simply' check the syntax, but
considering how RDF is applied, this is not enough. They should validate
the inner logic as well, or at least be able to compare two 'logics'
derived from the same .owl by two different .owl editors.
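
Such a comparison is quite feasible; here is a sketch using rdflib's
graph_diff (the editor output file names are made up):

    from rdflib import Graph
    from rdflib.compare import graph_diff

    # Hypothetical outputs: the same .owl as saved by two editors.
    g1 = Graph().parse("editor-a-output.owl", format="xml")
    g2 = Graph().parse("editor-b-output.owl", format="xml")

    in_both, only_a, only_b = graph_diff(g1, g2)
    print(f"{len(only_a)} triple(s) only in editor A, "
          f"{len(only_b)} only in editor B")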

I think the W3C should take the initiative in solving this problem. The
application of the Semantic Web in internet software would be helped
enormously if people could trust a complex entity like an .owl.
Because if users cannot, the only way to fix things is to go under the
hood and manually correct the problems. Considering the complexity of
.owl files, this will hinder their uptake considerably. And we all know
the problems that still plague us as a result of web browsers not
interpreting HTML consistently. We should learn from this: give people
benchmarks so they can communicate, recognize, and understand the
performance of individual .owl software.

Kind regards,
Aliza Lila
(an enthusiastic software developer)

Received on Tuesday, 19 December 2006 14:31:44 UTC