RE: Relationship between filter properties and hydra:supportedProperty (was Re: Filters as views (ISSUE-45))

On Monday, February 15, 2016 12:02 PM, Pierre-Antoine Champin wrote:
> Hi all, I've been following this thread and did only find time to
> reply now, so here are a few thoughts;

Same here... I'm still quite busy, so I'm only able to respond every now and then. But I'm tinkering with a few new designs.

 
> 2/ in a similar line of thought, I am not sure there is a need to
> distinguish direct mappings and indirect mappings. Even if the
> representation of a person only exposes the schema:birthDate property,
> it is not semantically incorrect to assume that this person also has a
> foaf:age (even if implicit). If a client "understands" the notions of
> schema:birthDate and foaf:age, then it should also "understand" the
> relations between the two, and having a filter on the implicit
> property foaf:age should not be much different, for that client, than
> on the explicit property schema:birthDate.

Sure, the question, however, is whether the server understands that. If the server doesn't somehow advertise that it also supports foaf:age, the client can't assume that it can use it. From a usability point of view, I would find it very surprising if the server allowed filtering by foaf:age but then didn't return it in the result.
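
To illustrate what "advertising" could look like: a server might expose the indirect mapping explicitly through a hydra:search template whose mapping ties a variable to foaf:age. This is only a rough sketch using the existing Hydra Core vocabulary; the /people URL and the "age" variable are made up for the example:

  {
    "@context": {
      "hydra": "http://www.w3.org/ns/hydra/core#",
      "foaf": "http://xmlns.com/foaf/0.1/"
    },
    "@id": "/people",
    "@type": "hydra:Collection",
    "hydra:search": {
      "@type": "hydra:IriTemplate",
      "hydra:template": "/people{?age}",
      "hydra:mapping": [
        {
          "@type": "hydra:IriTemplateMapping",
          "hydra:variable": "age",
          "hydra:property": { "@id": "foaf:age" },
          "hydra:required": false
        }
      ]
    }
  }

With a description like that the client doesn't have to guess: the implicit property is advertised just like an explicit one would be.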


> 3/ we have been focusing on the object (value) of the properties (and
> how they should be compared to the variables in the templated link),
> but not on the subjects of those properties. I see distinct cases in
> the examples we have been working on:
> 
> 3.1/ views on collections usually consider that *members* of the
> collections to be the subjects of the filtering properties;

Right. Everything we have been discussing so far has been in the context of collections.


[...]
> 4/ I can't help but think there is already a language capturing the
> full range of examples we have been giving, and that is SPARQL...
> So there could be something like:
> "template": "/laptops{?minPrice,maxPrice,color,lang,page}",
> "sparqlMapping": """
>   SELECT * {
>   <> hydra:member [
>     schema:price ?p ;
>     schema:color ?c ;
>     ?prop ?lit ;
>   ]
>   FILTER (?minPrice <= ?p && ?p <= ?maxPrice && CONTAINS(?c, ?color)
>     && (!isLiteral(?lit) || LANG(?lit)=?lang))
> }
> LIMIT 10 OFFSET 10*(?page-1)
> """

That works fine if you have a triple store as a backend; not so much otherwise. IMO this borders on leaking implementation details, which we should avoid. I would prefer a generic, higher-level solution.
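
To give a rough idea of the direction I mean (purely hypothetical, the "filters:" terms below do not exist in Hydra or anywhere else; I'm just sketching the kind of declarative, backend-agnostic description I have in mind instead of an embedded SPARQL query):

  {
    "@context": {
      "hydra": "http://www.w3.org/ns/hydra/core#",
      "schema": "http://schema.org/",
      "filters": "http://example.com/filter-vocab#"
    },
    "@type": "hydra:IriTemplate",
    "hydra:template": "/laptops{?minPrice,maxPrice,color}",
    "hydra:mapping": [
      {
        "@type": "hydra:IriTemplateMapping",
        "hydra:variable": "minPrice",
        "hydra:property": { "@id": "schema:price" },
        "filters:comparison": { "@id": "filters:GreaterThanOrEqual" }
      },
      {
        "@type": "hydra:IriTemplateMapping",
        "hydra:variable": "maxPrice",
        "hydra:property": { "@id": "schema:price" },
        "filters:comparison": { "@id": "filters:LessThanOrEqual" }
      },
      {
        "@type": "hydra:IriTemplateMapping",
        "hydra:variable": "color",
        "hydra:property": { "@id": "schema:color" },
        "filters:comparison": { "@id": "filters:Contains" }
      }
    ]
  }

A client would only need to understand a small, fixed set of comparison terms rather than parse arbitrary SPARQL, and the server remains free to translate that into whatever its backend speaks.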


> I'm not at all saying that we should go down that path, especially
> because it would be hugely complex for the client to parse an
> arbitrary SPARQL query to understand the semantics of a filter. What

+1


> I'm pointing out is that this complexity is linked to the
> expressiveness of the filter. If we want that kind of expressiveness,
> we have to face that complexity (and then we'd rather reuse an
> existing language such as SPARQL rather than inventing a new one). If
> we want a less complex language (which I think we do), then we have to
> chose what part we leave aside...

Agreed. The more I think about this, the more I feel it should be combined with operations (I know Ruben isn't a big fan of going down that path, though).


--
Markus Lanthaler
@markuslanthaler

Received on Tuesday, 23 February 2016 21:42:02 UTC