How core is Core?

The discussion on list datatypes and extensibility reminded me of a
general issue I wanted to raise ...

It seems to me that the Core as we are currently defining it is not
going to be easily implementable by several rule languages of interest
to us (due to function symbols and general equality). Yet we currently
talk about all dialects being extensions of Core and informally talk
about all translators needing to "implement" Core.

There are several solutions to this, ranging from changing Core, through
tweaking the extension mechanism, to careful wording of our compliance 
statement. Now is not the time to do the latter, but if we are going to 
take any of the other options now might be the time to at least consider them.

** The issue

I'm going to use terminology in a way I think is consistent with our
discussions on this topic from last year. Specifically, a translator
"implements" RIF dialect D if it can translate any ruleset in D to its
native language, preserving the ruleset's semantics. We used "conforms" for
the reverse direction, where the translator might only use a subset of D
in order to interchange its rules.

The current Core is Horn with function symbols, equality and a set of
builtins. Any LP dialect is going to have no problems with that.

To implement this on a production rule engine seems to me a little
tricky because it requires general unification between Uniterms (for
example, matching p(?x f(?y)) against p(a ?z) has to bind ?z to the
non-ground term f(?y)), whereas I had understood that many PR engines
don't support unification [*].

If this is correct then it suggests that the "all dialects extend Core" 
design principle is going to be a problem for the future PR dialect.

** Options

It seems to me we have several options.

(1) Ignore it. So some vendors might not be able to implement RIF Core 
as a pure translator; tough. We might phrase our compliance statement to 
emphasise conforming over implementing.

Seems to me that doesn't do much for interoperability, but perhaps 
interoperability over Core isn't interesting and this only really needs 
to be worried about in phase 2.

(2) We take general function symbols out of Core, dropping back to
function-free Horn plus builtin predicates. Perhaps phase 1 should then
be Core plus a first extension which puts the function symbols back in.
That would be a test of the extensibility mechanism and would allow us 
both to have a really core Core and to continue delivering a 
Horn-with-function-symbols dialect in phase 1.

(3) We define the notion of a dialect profile. This has been mentioned 
before but it seems to me worth raising now because it affects the 
extensibility discussion.

In this case I'm thinking of a profile as a purely syntactic 
restriction on a dialect. The ruleset metadata should be able to carry 
the intended-profile information. I think we should define at least one 
profile which is more PR-compatible. Leaving restrictions completely 
open doesn't do much for interop either, so predefining one (or more) 
whilst not precluding others seems like the right balance.

Comments?
Dave


[*] Is that right? I'm not a PR vendor, and Jena isn't going to be able
to implement RIF anyway, so I don't have any axe to grind here other than
wanting RIF to be successful.

I realize that a typical PR engine implementation could define a general
Java data structure to represent recursive Uniterms, write a unification
algorithm over it, and include that as a new library component. However, 
that seems to violate our "implementable just by means of translators" 
requirement.
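
Purely to make concrete the kind of library component I have in mind, here
is a rough sketch (the names and shape are entirely mine, not from any
draft): a recursive term structure plus the textbook unification algorithm
over it, minus the occurs check.

// Illustration only -- not from any RIF draft. A recursive term is either
// a variable or a function symbol applied to argument terms; constants
// are zero-argument applications.

import java.util.*;

abstract class Term { }

class Var extends Term {
    final String name;
    Var(String name) { this.name = name; }
    public String toString() { return "?" + name; }
}

class Apply extends Term {
    final String functor;
    final List<Term> args;
    Apply(String functor, Term... args) {
        this.functor = functor;
        this.args = Arrays.asList(args);
    }
    public String toString() { return functor + args; }
}

class Unify {
    // Returns a binding of variables to terms, or null if the two terms
    // cannot be made equal. Textbook algorithm, no occurs check for brevity.
    static Map<Var, Term> unify(Term s, Term t, Map<Var, Term> bindings) {
        s = resolve(s, bindings);
        t = resolve(t, bindings);
        if (s == t) return bindings;
        if (s instanceof Var) { bindings.put((Var) s, t); return bindings; }
        if (t instanceof Var) { bindings.put((Var) t, s); return bindings; }
        Apply a = (Apply) s, b = (Apply) t;
        if (!a.functor.equals(b.functor) || a.args.size() != b.args.size())
            return null;
        for (int i = 0; i < a.args.size(); i++) {
            bindings = unify(a.args.get(i), b.args.get(i), bindings);
            if (bindings == null) return null;
        }
        return bindings;
    }

    // Follow any chain of variable bindings to the representative term.
    static Term resolve(Term t, Map<Var, Term> bindings) {
        while (t instanceof Var && bindings.containsKey(t))
            t = bindings.get(t);
        return t;
    }

    public static void main(String[] argv) {
        // Unify p(?x f(?y)) with p(a ?z): binds ?x = a and ?z = f(?y).
        Var x = new Var("x"), y = new Var("y"), z = new Var("z");
        Term lhs = new Apply("p", x, new Apply("f", y));
        Term rhs = new Apply("p", new Apply("a"), z);
        System.out.println(Unify.unify(lhs, rhs, new HashMap<Var, Term>()));
    }
}

Even this toy version has to chase binding chains and recurse through
nested arguments; that is exactly the machinery a translator-only
implementation is not supposed to need.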

The fact that none of the PR folks seem concerned with the current Core
design is a source of surprise to me and suggests I might be getting my
facts wrong.

-- 
Hewlett-Packard Limited
Registered Office: Cain Road, Bracknell, Berks RG12 1HN
Registered No: 690597 England
