Re: Comments on * DRAFT * Rules Working Group Charter $Revision: 1.60 $

Dieter,
  I think there is a lot of confusion here.  Let me address three points:
  1 - what the intent of the WG being defined should (IMHO) be
  2 - the relationship with OWL, which I think you mistakenly undervalue
  3 - the issue of NAF (and SNAF), which is so important to making this stuff 
work on the Web

1 - INTENT OF THE WG

  I think the real problem Sandro and the W3C have been facing is NOT 
about OWL (I return to this later) but primarily about whether we are 
looking at a Rules language envisioned as a language for USING rules 
on the Web, or a Rules language envisioned as a language for 
EXCHANGING rules on the Web.  As Sandro got more feedback from 
industry (some of it on this list), it became clear that there was more 
of a need for the latter among the business rules community than for 
the former.  Where most of us in research (thee and me included) have 
been focused on rules as reasoning, we have been less focused on the 
exchange of rules.  The former wants a language which is 
computationally efficient and usable; the latter needs a very 
expressive language.  In fact, the business rules community has 
rejected most of the LP approaches used in reasoning in the past (and 
papers at this workshop said so in no uncertain terms) because their 
needs were not addressed by what we were producing.  The current 
charter claims FOL is the right choice for this more expressive 
language - something which you question (and I do too, although not 
for the same reasons you state).  So one way to think of this 
WG is as a "RULES EXCHANGE" WG and the stuff we do and care about 
would be to define a subset of the language that is computationally 
efficient (much as OWL-DL is a subset of OWL that is computationally 
more tractable - OWL Full is the real ontology exchange language, OWL-DL is a 
subset for certain kinds of reasoners that need computational bounds).

2 - Relationship with OWL

    I think you also underestimate the importance of OWL vis-a-vis rules.  I 
have sent several use cases to various of the email lists discussing 
this (too often in parallel); here's the one I consider real-world 
and most compelling.  Consider a user organization which has and uses 
a large OWL ontology - for example the NCI which has an ontology with 
over 45,000 classes in the current version (with about 8-10 people 
whose job responsibilities include the curation and extension of this 
ontology).  Suppose a researcher wants to check whether some data 
corresponds with this ontology.  The user would likely want to use a 
rules-type engine since they want to do data reasoning (and I think 
you and I both agree this would be the preferred approach to use). 
However, it would be crazy to think the scientist would be willing to 
recreate the ontology in a rules form, it would be hard to imagine 
him/her getting it right without some sort of tool to do so, and it 
would be tremendously unrealistic to expect that the NCI would be 
willing to double (at least) its personnel investment to separately 
maintain and curate a corresponding rules base.   A program which 
could turn a maximal subset of OWL (Full if possible, DL or Lite if not) 
into a rules language that could be used for the data analysis would 
clearly be a big win.
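  To make this concrete, here is a tiny sketch of the kind of translation I 
have in mind.  It is only an illustration, not a proposal: it uses Python 
with rdflib, handles just named-class rdfs:subClassOf axioms, and the file 
name and the emitted rule syntax are placeholders I made up.

# Sketch only: map named-class rdfs:subClassOf axioms to Horn-style rules.
# "ontology.owl" and the printed rule syntax are illustrative assumptions.
from rdflib import Graph, URIRef
from rdflib.namespace import RDFS

def local_name(uri):
    # Crude local-name extraction, just to keep the printed rules readable.
    s = str(uri)
    return s.rsplit("#", 1)[-1].rsplit("/", 1)[-1]

g = Graph()
g.parse("ontology.owl", format="xml")  # hypothetical ontology file

for sub, _, sup in g.triples((None, RDFS.subClassOf, None)):
    # Only named classes translate directly; OWL restrictions show up as
    # blank nodes here, and they are exactly the part that needs more care.
    if isinstance(sub, URIRef) and isinstance(sup, URIRef):
        print(f"{local_name(sup)}(X) :- {local_name(sub)}(X).")

A researcher could run something like this over an ontology export and feed 
the output straight to a rule engine; the cases it skips (restrictions, 
equivalences, and so on) are what the "maximal subset" question is really 
about.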
  (I note that being able to go the other way, from Rules to OWL, 
would also be nice, but since it is likely the rules language would 
be more expressive, I think this direction remains a research 
challenge for now, and I wouldn't expect it of the WG).
  In fact, I'd like to note that at the workshop several of the papers 
by the rules vendors mentioned or implied that OWL compatibility was 
important to them, and it would be foolish for the W3C not to listen 
(in fact, as an AC member I would certainly object to any Rules 
charter that ignored issues of OWL compatibility - having spent a lot 
of my organization's resources on helping to get OWL to happen, I am 
totally unwilling to reinvest in a new way to say the same things 
where it wasn't required) - so the rules interchange language being 
able to encode a large subset of OWL is important (at the same time 
that the rules reasoner sublanguage would cover as large a subset of 
OWL as possible - also important to me).  As currently drafted, the 
Rules interchange language would cover a large subset (if not all) of 
OWL Full (even if it couldn't be proven consistent), and that strikes 
me as a good thing.

3 - Addressing NAF
  Here's the thing I cannot understand in what you propose - NAF 
requires a closed world to reason with respect to (that is, we must 
say "if X is not true in Y, then Z" - there must be a Y there).  The 
Web clearly cannot be considered such a closed world (even if it 
could, its current size is such that nothing invented to date could 
possibly contain all the information in it - even for the current 
semantic web this is foolish to consider).   What is easily imagined 
is that a mechanism can be derived that expresses what particular 
world some rule base is closed with respect to.  For example, if I 
use SPARQL to query some triple store (which itself may be thought of 
as an open world since it is linked to other stores and changes over 
time, etc.) for the response to some query, then that response can be 
made into a graph that I can consider closed (and which I can name - 
since the date/time of the query could be appended, or other such 
mechanism).  Similarly, if I apply the rules to some database, that 
database could be named.  This was called SNAF at the workshop (and 
has a number of other names in the research community, but we'll go 
with SNAF for now).
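  To show what I mean, here is a toy sketch (Python with rdflib; the data, 
the query, and the naming scheme are all made up for illustration) of 
negation evaluated only against a named, frozen query answer rather than 
against the open Web:

from datetime import datetime, timezone
from rdflib import Graph, Namespace

EX = Namespace("http://example.org/")

# Stand-in for an open-world triple store that is linked elsewhere
# and changes over time.
store = Graph()
store.add((EX.alice, EX.worksFor, EX.nci))

# 1. Ask the store a question...
answers = frozenset(
    row.who
    for row in store.query(
        "SELECT ?who WHERE { ?who <http://example.org/worksFor> "
        "<http://example.org/nci> }"
    )
)

# 2. ...and give the answer a name, with the date/time appended as
# suggested above, so the closed scope itself is identifiable.
scope_name = f"{EX}nci-staff-snapshot@{datetime.now(timezone.utc).isoformat()}"

def not_in(individual, scope):
    # NAF, but only with respect to this named, closed snapshot.
    return individual not in scope

print(scope_name)
print(not_in(EX.bob, answers))    # True: bob is absent from *this* scope
print(not_in(EX.alice, answers))  # False

The point is simply that the "world" the negation is closed over is a named, 
finite thing I constructed, not the Web itself.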
  In principle, this puts almost no constraints on the use of the Web 
Rules language - I can imagine a lot of designs in which a header is 
used to designate the type of entity the rules are expected to be 
applied to (database, RDF graph, web document, etc.) and then the 
rules could be expressed using a NAF mechanism with respect to that 
-- this means that, on seeing a rule set intended for some 
particular application, I would know what it was meant to be applied to.  These 
things could be very specific (here is my set of rules with respect 
to some particular website) or very general (this set of rules is 
assumed to work for any RDF graph).  As best I can tell, cases like 
this last one would be virtually indistinguishable from NAF in 
practice, but would be critically different with respect to the Web, 
since developers' intent as to what the closed world is expected to 
be could be encoded.
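  One (purely illustrative) way such a header might look - the field names 
and scope kinds here are invented just to show the shape of the idea, not to 
propose syntax:

from dataclasses import dataclass
from enum import Enum

class ScopeKind(Enum):
    # The kinds of "world" a rule set declares itself closed with respect to.
    RDF_GRAPH = "rdf-graph"
    DATABASE = "database"
    WEB_DOCUMENT = "web-document"
    ANY_RDF_GRAPH = "any-rdf-graph"   # the very general case discussed above

@dataclass
class ScopedRuleSet:
    scope_kind: ScopeKind
    scope_name: str   # e.g. a named graph URI or a dated query snapshot
    rules: list       # rule text in whatever concrete syntax the WG picks

example = ScopedRuleSet(
    scope_kind=ScopeKind.RDF_GRAPH,
    scope_name="http://example.org/nci-staff-snapshot@2005-08-22T00:00:00Z",
    rules=["unaffiliated(X) :- person(X), not worksFor(X, nci)."],
)
print(example.scope_kind.value, example.scope_name)

A reasoner seeing ANY_RDF_GRAPH would know the author intends the rules to 
apply to whatever single graph it is handed - the case I said above is 
nearly indistinguishable from plain NAF in practice.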
  There's nothing pejorative with respect to rules in thinking that 
the Web should not be considered a closed world - it just seems like an 
obvious thing to do given the size and dynamicity of the Web.
  (btw, I think the discussion of NAF/SNAF in the proposed charter 
needs work, and that ruling NAF out of scope makes it sound like 
limited versions would be disallowed - which I agree with you would 
likely be a mistake.)

SUMMARY

  So - if you look carefully, I'm agreeing with most of your main 
points, but recasting them somewhat:
  i. The rules language you have been promoting should, it seems to me, be 
a subset of the more general rules exchange one.  I'm open to that 
happening within the WG (the way OWL DL was created within the OWL 
Full language) or not depending on the members and the chair and 
their desires.  If the WG doesn't do it, I would fully expect to see 
several member submissions, and maybe a de facto standard, by the 
time the WG was at CR (and maybe the WG would consider making the 
existence of such a language a CR exit criterion).
  ii. Your gratuitous OWL bashing aside, it is pretty clear to anyone 
who is talking to large companies that there need to be both OWL-like 
vocabulary definitions and rule-like "data handlers" that coexist to 
the maximum extent possible.  Different parts of the organization are 
likely to be developing the two - and the more incompatible they are, 
the more the situation looks like the problem we're trying to solve - 
lack of interoperability among different parts of the information 
space.  The proposed charter leaves a lot of space for this to happen 
- again, this may be within the WG or without - but I do think the WG 
is responsible for defining it precisely based on the W3C process 
rules, which essentially require newer languages to explain 
compatibility (or not) with earlier languages - with a high bar for 
incompatibility.
  iii. The distinction between NAF and SNAF is, conceptually, not nearly as 
complicated as people seem to think (the devil is, of 
course, in the details).  In both cases the intent is to allow a form 
of closed world reasoning in the open world of the web.  Rules for 
the Web cannot possibly be identical to non-Web rules or we wouldn't 
need anything new -- much as the Web is to traditional hypertext 
systems, or OWL is to traditional AI KR languages, the Web rules 
language will need to extend what has been done in the past in 
(preferably) small and subtle ways to make it work with the Web.  I 
think SNAF v. NAF is one of those pieces of magic that will help make 
this work.


  Hope all this helps
  -JH

-- 
Professor James Hendler			  Director
Joint Institute for Knowledge Discovery	 	  301-405-2696
UMIACS, Univ of Maryland			  301-314-9734 (Fax)
College Park, MD 20742			  http://www.cs.umd.edu/users/~hendler
