- From: Dan Brickley <danbri@danbri.org>
- Date: Tue, 23 Jun 2009 09:13:06 +0200
- To: hepp@ebusiness-unibw.org
- CC: Yves Raimond <yves.raimond@gmail.com>, martin.hepp@ebusiness-unibw.org, Michael Hausenblas <michael.hausenblas@deri.org>, Richard Cyganiak <richard@cyganiak.de>, "Hepp, Martin" <mhepp@computer.org>, Hugh Glaser <hg@ecs.soton.ac.uk>, mark.birbeck@webbackplane.com, "Booth, David (HP Software - Boston)" <dbooth@hp.com>, "public-lod@w3.org" <public-lod@w3.org>
On 22/6/09 23:16, Martin Hepp (UniBW) wrote:
>
>
> Yves Raimond wrote:
>>> Ontology modularization is a pretty difficult task, and people use
>>> various heuristics for deciding what to put in the subset being served
>>> for an element. There is no guarantee that the fragment you get
>>> contains everything that you need.
> There is no safe way of importing only parts of an ontology, unless you
> know that its modularization is 100% reliable.
> Serving fragments of likely relevant parts of an ontology for reducing
> the network overhead is not the same as proper modularization of the
> ontology.

Can you give a concrete example of the danger described here? ie. the pair of a complete ("safe") ontology file and a non-safe subset, and an explanation of the problems caused.

I can understand "there is no guarantee that the fragment you get contains everything you need", and I also remind everyone that dereferencing is a privilege not a right: sometimes the network won't give you what you want, when you want it. But I've yet to hear of anyone who has suffered due to term-oriented ontology fragment downloads. I guess medical ontologies would be the natural place for horror stories?

cheers,

Dan
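[Editor's note: the kind of comparison Dan asks for could be checked mechanically. The sketch below, which is not from the original thread, dereferences a single term, fetches the full ontology file, and diffs the statements about that term; the URLs, the term name, and the use of rdflib are assumptions for illustration only.]

```python
# A minimal sketch, assuming rdflib is installed and that the ontology
# publisher serves a per-term fragment at the term URI and the complete
# ontology at a separate URL. All URIs here are hypothetical placeholders.
from rdflib import Graph, URIRef

term = URIRef("http://example.org/ontology#Patient")      # hypothetical term

fragment = Graph().parse(str(term))                        # what dereferencing the term returns
full = Graph().parse("http://example.org/ontology.owl")    # the complete ("safe") ontology file

in_fragment = set(fragment.triples((term, None, None)))
in_full = set(full.triples((term, None, None)))

# Statements about the term that the served fragment omits -- e.g. a
# disjointness or restriction axiom a reasoner would need.
for s, p, o in in_full - in_fragment:
    print(p, o)
```

(A fuller check would also diff triples where the term appears as the object, e.g. rdfs:domain or rdfs:subClassOf axioms stated on other terms.)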
Received on Tuesday, 23 June 2009 07:13:46 UTC