Re: Implementation feasibility

On 3/23/2015 15:34, Jose Emilio Labra Gayo wrote:
> What I am against is that the SHACL high-level language depends on a 
> full SPARQL engine and that one cannot implement it without that 
> engine. So if we add a language construct to define macros, then that 
> language construct should not be tied to SPARQL.

You keep repeating the same incorrect claim. Of course you can implement 
the high-level language without a full SPARQL engine. Please explain, 
based on my draft, why you believe you cannot.
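To make this concrete, a core constraint such as a regex pattern check can be evaluated directly over the triples of a data graph with an ordinary loop. The following is a minimal sketch in Python; the tuple-based triple representation, the example data, and the check_pattern helper are illustrative assumptions, not anything from the draft:

```python
import re

# Triples as (subject, predicate, object) tuples; illustrative data only.
triples = [
    ("ex:alice", "ex:email", "alice@example.org"),
    ("ex:bob",   "ex:email", "not-an-email"),
]

def check_pattern(triples, predicate, regex):
    """Return subjects whose values for `predicate` violate `regex`.

    A direct evaluation loop over the triples -- no SPARQL engine involved.
    """
    pattern = re.compile(regex)
    return [s for s, p, o in triples
            if p == predicate and not pattern.search(o)]

violations = check_pattern(triples, "ex:email", r"^[^@]+@[^@]+$")
```

The same direct-evaluation approach extends to the other core constraint types, which is the point: a processor limited to the core profile never has to parse or execute SPARQL.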

And the language construct for defining macros is not part of the Core 
profile, so if you stay within that profile you don't even need to look 
at templates.

Furthermore, the language construct for defining macros is already 
independent of SPARQL, so I have no idea what you are unhappy about.

http://w3c.github.io/data-shapes/shacl/#templates

>
> What I propose is to make the SHACL high-level language expressive 
> enough to cover most of the use cases without having to use the 
> extensibility mechanism, since relying on that mechanism would mean 
> that shapes defined with one SHACL processor would be incompatible 
> with shapes defined with another.

To cover most of the use cases, this high-level language would need to 
include

  * 2.6 Complex Constraints
    <https://www.w3.org/2014/data-shapes/wiki/Requirements#Complex_Constraints>

      o 2.6.1 Expressivity: Patterns
        <https://www.w3.org/2014/data-shapes/wiki/Requirements#Expressivity:_Patterns>
      o 2.6.2 Expressivity: Non-Existence of Patterns
        <https://www.w3.org/2014/data-shapes/wiki/Requirements#Expressivity:_Non-Existence_of_Patterns>
      o 2.6.3 Expressivity: String Operations
        <https://www.w3.org/2014/data-shapes/wiki/Requirements#Expressivity:_String_Operations>
      o 2.6.4 Expressivity: Language Tags
        <https://www.w3.org/2014/data-shapes/wiki/Requirements#Expressivity:_Language_Tags>
      o 2.6.5 Expressivity: Mathematical Operations
        <https://www.w3.org/2014/data-shapes/wiki/Requirements#Expressivity:_Mathematical_Operations>
      o 2.6.6 Expressivity: Literal Value Comparison
        <https://www.w3.org/2014/data-shapes/wiki/Requirements#Expressivity:_Literal_Value_Comparison>
      o 2.6.7 Expressivity: Logical Operators
        <https://www.w3.org/2014/data-shapes/wiki/Requirements#Expressivity:_Logical_Operators>
      o 2.6.8 Expressivity: Transitive Traversal of Properties
        <https://www.w3.org/2014/data-shapes/wiki/Requirements#Expressivity:_Transitive_Traversal_of_Properties>
      o 2.6.9 Expressivity: Aggregations
        <https://www.w3.org/2014/data-shapes/wiki/Requirements#Expressivity:_Aggregations>
      o 2.6.10 Expressivity: Named Graphs
        <https://www.w3.org/2014/data-shapes/wiki/Requirements#Expressivity:_Named_Graphs>
      o 2.6.11 Expressivity: Closed Shapes
        <https://www.w3.org/2014/data-shapes/wiki/Requirements#Expressivity:_Closed_Shapes>


The resulting high-level language would become almost like SPARQL, yet 
not be SPARQL. This is exactly what the RIF people did: if no agreement 
can be found, let's introduce yet another intermediate language and then 
define mappings from that language into the languages that people 
actually use in practice. The difference is that SPARQL is the 
established and official standard for exactly those scenarios. Users 
like SPARQL and it is widely supported and optimized by all kinds of 
tools. I remember Iovka stating that her team used Jena to implement 
ShEx. Well then, why not just call Jena's built-in SPARQL engine, go 
with the mainstream, and have a far more powerful engine?

Why the obsession with avoiding SPARQL at all costs?

Regards,
Holger

Received on Monday, 23 March 2015 06:17:26 UTC