Re: minimal requirements for Arch document

Axel Polleres wrote:
> 
> Sandro Hawke wrote:
> 
>> I'm trying to think in minimalist, practical terms about Arch.  What do
>> we really need to say?  I see a few things:
>>
>>     1.  What is required of all systems which take RIF documents as
>>         input?   ("Minimal Requirements for all RIF Systems").
> 

p.s.: I just see that I read this wrongly; anyway, that seems to ask about 
what I asked later on regarding "system conformance", right?

> I earlier suggested for this requirement:
> 
> "BLD should define a minimal dialect and other dialects MUST agree
>  with the semantics of BLD on the part of the abstract model they share
>  with BLD"
> 
> (Michael later suggested correcting this to logic-based dialects only...
> though I thought this was covered by "on the part of the abstract model 
> they share with BLD")
> 
> 
>>         Until we have a RIF Core, there's not very much to say here.
>>         Maybe if we end up with BLD and PRD but not Core, we'll say it
>>         has to conform to at least one of the two of them.
> 
> 
> Well, that would make PRD also kind of a "core" then.
> 
>>         One thing we do need to mandate here is forward compatibility;
>>         how must a system behave when given a RIF document which does
>>         not conform to the syntax of a dialect it implements?  This
>>         section could get long if we go with a powerful fallback
>>         mechanism.
> 
> 
> Indeed, that will need some work.
> There are several fallback options we discussed (soundness-preserving
> ignoring of rule parts which don't conform, soundness-preserving
> ignoring of whole rules which don't conform, rejecting the whole
> ruleset), which we still need to define precisely and put in a
> preference order.
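> 
> Just to make the ordering concrete, a rough sketch in Python (rules are
> modelled here simply as lists of construct names; all names are made up,
> this is not a proposal for the concrete mechanism):
> 
>     def drop_parts(ruleset, supported):
>         # soundness-preserving ignoring of rule parts which don't conform
>         return [[c for c in rule if c in supported] for rule in ruleset]
> 
>     def drop_rules(ruleset, supported):
>         # soundness-preserving ignoring of whole rules which don't conform
>         return [rule for rule in ruleset if all(c in supported for c in rule)]
> 
>     def reject(ruleset, supported):
>         # last resort: reject the whole ruleset
>         raise ValueError("ruleset does not conform to an implemented dialect")
> 
>     # the preference order we would have to agree on:
>     FALLBACK_ORDER = [drop_parts, drop_rules, reject]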
> 
> 
>>         I think the BLD document needs a conformance clause.  Or maybe
>>         that goes in the BLD Test Cases document (as it did with OWL).
> 
> 
> Syntactic conformance and semantic conformance in the sense of the OWL 
> document should be doable.
> 
> However, in what sense would we define conformance of *engines*, or do 
> we want to define this at all?  (This seems non-trivial to me, and maybe 
> not necessary.)
> 
>>     2.  What does one need to do to define a proper RIF dialect?
>>         ("Publishing a New RIF Dialect")
>>
>>         I suggest that the basic rules are:
>>
>>             * No Language Conflict: every dialect MUST give the same
>>               semantics as each prior dialect does to any document
>>               which has a defined meaning in both dialects.
> 
> 
> Michael wrote here:
> "This may be too strict. An extension dialect should be allowed to make 
> more inferences, but should not invalidate existing inferences of the 
> subdialects."
> 
> This reminds me of what Jos called "loose and strict language layering" 
> [1], which I try to restate here in a slightly simplified/adapted form 
> (although this notion might only be useful for logic dialects):
> 
> Let D1, D2 be dialects with semantics S1, S2 such that D2 syntactically 
> extends D1. Dialect D2 is strictly layered on top of D1 if, for any 
> ruleset r1 in D1 and every condition c1 in D1, c1 is entailed by r1 wrt. 
> semantics S1 if and only if c1 is entailed by r1 wrt. semantics S2.
> D2 is loosely layered on top of D1 if, for any ruleset r1 in D1, all 
> conditions entailed wrt. S1 are also entailed wrt. S2.
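> 
> In symbols (just a compact restatement of the above, writing \models for
> entailment; a sketch, not meant as the official definition):
> 
>     strict:  \forall r_1 \in D_1,\ \forall c_1 \in D_1:\quad
>              r_1 \models_{S_1} c_1 \;\Longleftrightarrow\; r_1 \models_{S_2} c_1
> 
>     loose:   \forall r_1 \in D_1,\ \forall c_1 \in D_1:\quad
>              r_1 \models_{S_1} c_1 \;\Longrightarrow\; r_1 \models_{S_2} c_1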
> 
> 
> We could decide for either solution, but we need to pick one, it seems.
> 
> 1. Jos de Bruijn and Stijn Heymans. A semantic framework for language 
> layering in WSML. In Proceedings of the First International Conference 
> on Web Reasoning and Rule Systems (RR2007), pages 103-117, Innsbruck, 
> Austria, June 7-8 2007. Springer.
> 
>>             * Maximize Overlap: every dialect SHOULD reuse as much of
>>               the syntax as possible from prior dialects. 
> 
> 
> Yes, although this is indeed more of a principle than something we can 
> formally check; I guess we can only state it by examples.
> 
>>         I think we should try defining some dialects using these
>>         principles, coordinating loosely as we like, but eventually I
>>         think we need to figure out how to open the process to 3rd
>>         parties.  That's going to involve some careful work around
>>         defining "prior dialect".
> 
> 
> Indeed, it wasn't clear what "prior" means. If such a thing should be 
> kept clean, there would need to be an authority (in W3C?) which would
> approve dialects, assess whether the principles have been followed, and 
> compare among existing, i.e. prior, dialects. Without such a body, that 
> honestly seems as unlikely as enforcing reuse of ontologies and RDF 
> vocabularies. But such a thing is not in the charter of the working 
> group; at best we can probably only define guidelines and principles here.
> 
>>     3.  What does one need to do to define a RIF extension?
>>
>>         As I see it, an extensions is a "delta" between dialects where
>>         one dialect is a superset of the other.
>>
>>              NewDialect = OldDialect + Extension
> 
> 
> As mentioned earlier, a dialect should also be allowed to restrict an 
> old one syntactically, or build upon a restricted subset. (For example, 
> function-free normal logic programs are an extension of a restriction 
> of BLD.)
> 
>>         What is challenging about extensions is that we want them to
>>         be orthogonal; we want users to be able to combine extensions
>>         which were developed independently.  For this example, I'll
>>         assume Lists end up in an extension, instead of in BLD.  
> 
> 
> If we allow restrictions as well as extensions, we don't run into this 
> problem: we could leave lists, slots, etc. in BLD without forbidding 
> restricted dialects, as long as they comply with RIF on the syntactic 
> intersection (a rough sketch of what I mean follows below).
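> 
> To illustrate what "comply on the syntactic intersection" could mean, a
> rough Python sketch (all names made up, illustration only -- a real check
> would of course be over the dialects' actual syntax and semantics):
> 
>     from typing import Callable, Set
> 
>     class Dialect:
>         # a dialect reduced to its set of syntactic constructs and an
>         # entailment relation entails(document, condition) -> bool
>         def __init__(self, constructs: Set[str],
>                      entails: Callable[[str, str], bool]):
>             self.constructs = constructs
>             self.entails = entails
> 
>     def uses_only(text: str, constructs: Set[str]) -> bool:
>         # stand-in for a real syntactic check against the shared fragment
>         return all(tok in constructs for tok in text.split())
> 
>     def agree_on_intersection(d1: Dialect, d2: Dialect,
>                               docs, conditions) -> bool:
>         # spot-check that both dialects give the same answers on inputs
>         # built only from their shared (intersected) constructs
>         shared = d1.constructs & d2.constructs
>         return all(d1.entails(doc, c) == d2.entails(doc, c)
>                    for doc in docs for c in conditions
>                    if uses_only(doc, shared) and uses_only(c, shared))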
> 
>> I don't
>>         want to do that, but it makes a good example:
>>  
>>              BLD_with_NAF = BLD + NAF_Extension
>>
>>                     The BLD_with_NAF dialect should be fully specified
>>                     by the NAF_Extension spec read in combination with
>>                     BLD.
>>  
>>              BLD_with_Lists = BLD + Lists_Extension
>>
>>                     The BLD_with_Lists dialect should be fully specified
>>                     by the Lists_Extension spec read in combination with
>>                     BLD.
>>  
>>              BLD_with_Lists_and_NAF = BLD + NAF_Extension + 
>> Lists_Extension
>>
>>                     We would like the semantics here to be fully
>>                     determined by the two extension specs and BLD.  This
>>                     is the challenge.  How can the documents be written
>>                     such that this is the case?
>>
>>         At the syntactic level this is clear enough, if you think of the
>>         sets of strings/documents conforming to the syntax.  An
>>         extension provides a set of strings, and "+" above is set-union.
>>         The question is how do we address this at the semantic level?
>>         Is there a way to address it across all approaches to defining
>>         semantics, or is this easy to do for model-theoretic semantics
>>         and impossible for procedural semantics?  (My sense is that it's
>>         trivial for proof-theoretic semantics; I'm unclear on the
>>         others.  I think Michael Kifer has in mind how to do this with
>>         MT semantics on BLD, but I don't understand that part yet.)
>>
>>         It may well be that some extensions are incompatible (such as
>>         NAF and classical negation?), in which case the combination
>>         procedure should fail, I would hope.
>>
>> Note that I see no need to mention abstract syntaxes or presentation
>> syntaxes.  For these purposes, all we care about is the XML.  (I'm not
>> thrilled about it, but I can't find a compelling reason to address
>> more than XML in this material.  At least, not yet.)
> 
> 
> I only thought that each dialect should be free to define its own 
> syntax, as long as it has an XML syntax and a two-way mapping
> (bijective, or semantically equivalent for the dialect, to be decided?)
> for the dialect.
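> 
> What I would minimally expect of such a two-way mapping, again just a
> sketch (to_xml, from_xml and equivalent are placeholders for the
> dialect's concrete translations and its chosen notion of equivalence):
> 
>     def round_trips(document, to_xml, from_xml, equivalent):
>         # translating into the RIF XML syntax and back should preserve
>         # the document up to the dialect's notion of equivalence
>         return equivalent(from_xml(to_xml(document)), document)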
> 
> 
>> Oh yeah, and external data and data models.  I keep forgetting about
>> that.  Or is that in an extension?  :-) [the charter puts it in Phase 2,
>> remember.]
>>
>>      -- Sandro
>>
>>
> 
> 


-- 
Dr. Axel Polleres
email: axel@polleres.net  url: http://www.polleres.net/

Received on Friday, 12 October 2007 15:22:25 UTC