Re: Extending SPIN with meta-templates

On Nov 11, 2014 1:12 AM, "Holger Knublauch" <holger@topquadrant.com> wrote:
>
>
> On 11/11/14, 9:34 AM, Eric Prud'hommeaux wrote:
>>
>>
>> If I understand it correctly, this will take care of at least one
>> level of contextual constraints, potentially more. This is quite
>> important to most of my use cases.
>
>
> Good, we may be getting somewhere!
>
>
>> Can it handle multiple levels of
>> constraints <http://w3.org/brief/NDEz>:
>>
>> Data:
>>    <P>
>>      core:cureIndication [
>>        core:hasIntervention :KidneyTransplant ;
>>        core:hasOutcomeAssessment [
>>          core:hasResultValue :ImprovedToNormal
>>        ]
>>      ] .
>>
>> ShExC Schema:
>>    :PatientShape {
>>      core:cureIndication {
>>        core:hasIntervention (:KidneyTransplant :HeartTransplant),
>>        core:hasOutcomeAssessment {
>>          core:hasResultValue (:ImprovedToNormal)
>>        }+
>>      }
>>    }
>>
>> OWL/ICV Schema:
>>    :TransplantProcedure owl:oneOf (:KidneyTransplant :HeartTransplant) .
>>    xplant:PatientShape
>>      rdfs:subClassOf [
>>        owl:onProperty core:cureIndication ;
>>        owl:allValuesFrom xplant:GraftSurvivalAssessment
>>      ] .
>>    xplant:GraftSurvivalAssessment
>>      rdfs:subClassOf [
>>        owl:intersectionOf (
>>          [ owl:onProperty core:hasIntervention ;
>>            owl:allValuesFrom :TransplantProcedure ]
>>          [ owl:onProperty core:hasOutcomeAssessment ;
>>            owl:allValuesFrom [
>>              owl:intersectionOf (
>>                core:FunctionOutcomeAssessment
>>                [ owl:onProperty core:hasResultValue ;
>>                  owl:hasValue :ImprovedToNormal ]
>>              )
>>            ]
>>          ]
>>        )
>>      ] .
>
>
> The draft of the SPIN extension would only cover one level of context.
> While I am sure further levels could be added, I believe that if we go
> down this route then we are rather re-inventing a light-weight version
> of SPARQL, and I wonder where this ceases to make sense. Even the OWL
> example above already looks too complex (and similar to what the SPIN
> RDF triple notation would look like). At some stage these object models
> stop being useful, and we could just as well open up to any SPARQL.

Wouldn't it be easier for users and implementors alike if there were a
consistent syntax for representing constraints at any depth? The ShExC
above compiles trivially to SPARQL, so nesting doesn't add any real
implementation complexity.
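
To make that concrete, here is a rough, hand-written sketch (prefixes as in
the examples above; not the output of any particular compiler) of the kind
of violation-finding SPARQL query the nested shape could compile to:

   # Report patients whose cure indication lacks an in-set intervention
   # or an outcome assessment with the required result value.
   SELECT ?patient ?indication WHERE {
     ?patient core:cureIndication ?indication .
     FILTER NOT EXISTS {
       ?indication core:hasIntervention ?intervention .
       FILTER (?intervention IN (:KidneyTransplant, :HeartTransplant))
       ?indication core:hasOutcomeAssessment ?assessment .
       ?assessment core:hasResultValue :ImprovedToNormal .
     }
   }

Each additional level of nesting in the shape just becomes another nested
graph pattern here, which is why depth isn't a problem for the compiler.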

The constraint in the example above (patients in the positive outcome
report) is typical; imposing an arbitrary depth limit will just confuse
users.

> If our main goal is to create a high-level syntax that uses objects
> instead of SPARQL strings, then we have various alternatives, such as
> using something similar to ShExC as one possible input notation, which
> then gets compiled into SPIN/SPARQL for execution.
>
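
A sketch of what that compilation target might look like with the existing
SPIN vocabulary (illustrative only; the class and property names come from
the examples above, and sp:text is used rather than the full SPIN RDF
triple form):

   # The ASK matches (flagging a violation for ?this) when a cure
   # indication fails the nested requirements.
   xplant:PatientShape
     spin:constraint [
       a sp:Ask ;
       sp:text """
         ASK WHERE {
           ?this core:cureIndication ?indication .
           FILTER NOT EXISTS {
             ?indication core:hasIntervention ?intervention .
             FILTER (?intervention IN (:KidneyTransplant, :HeartTransplant))
             ?indication core:hasOutcomeAssessment ?assessment .
             ?assessment core:hasResultValue :ImprovedToNormal .
           }
         }
       """
     ] .
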
> I can imagine one use case of the high-level "object" notation may be
> graphical editors for constraints. Such graphical editors would typically
> be hard-coded against certain patterns. For things like dependencies that
> are two levels deep, I believe those tools quickly become too complicated.

Why would the user interface be any easier or harder to use depending on
whether it's enforcing integrity constraints nested 2, 3, or 10 levels
deep? If you mean the creator of the constraint, they'll only have a
harder time if the shapes syntax imposes some arbitrary limit.

> Where to draw the line? Hard to say without seeing more real-world use
> cases where simple one-step contexts are not sufficient yet falling back
> to SPARQL is not acceptable. I believe that space may be very small and
> I'd rather keep the language and engine simple before expecting too much
> from an adoption standpoint.

I agree with the goal of simplicity. I think that if you experiment with
this a bit, you'll come to really like it and find that it simplifies
things for your users.

> HTH
> Holger
>
>

Received on Tuesday, 11 November 2014 08:26:15 UTC