- From: Peter F. Patel-Schneider <pfpschneider@gmail.com>
- Date: Fri, 16 Oct 2015 09:21:17 -0700
- To: kcoyle@kcoyle.net, public-data-shapes-wg@w3.org
The "defaults" that I am proposing are indeed no minimum/maximum cardinality processing is done when there is no minimum/maximum cardinality constraints. peter On 10/08/2015 10:17 AM, Karen Coyle wrote: > > > On 10/8/15 5:53 AM, Peter F. Patel-Schneider wrote: > >> ISSUE-91 >> >> As I indicated last week, in >> https://lists.w3.org/Archives/Public/public-data-shapes-wg/2015Oct/0004.html, >> I vote +1 for resolving ISSUE-91 in a way that does not require default values >> for cardinalities, i.e., min 0 and max unbounded. I vote -1 for other ways of >> resolving ISSUE-91. > > Well, what you say there sure sounds like defaults. I think the non-default > option is that no cardinality processing is done when there are no cardinality > constraints. That's fine. But there does need to be a way to express > cardinality constraints that includes min 0 and max unbounded. > > Note that as regard defaults, we already have a default of "open shapes" that > cannot be expressed explicitly, which I believe needs to at least be made > clear in the documentation. > > kc > > >> ISSUE-92 >> >> The proposals for resolving ISSUE-92 suggest that repeated constraints on the >> same property be considered as "additive". I do not feel that there is much >> evidence to support the need for this reading. In particular, the example in >> https://lists.w3.org/Archives/Public/public-data-shapes-wg/2015Sep/0107.html >> does not need an additive reading as there is no overlap between the two sets >> of possible values - all that is needed are qualified cardinality constraints >> and a cleanup on constraints to generalize node-based constraints. A proposal >> for this cleanup is described in ISSUE-98. I vote for this approach and >> against the other approaches. > > The purpose of this is user-facing, not engineering-necessitated. Yes, > qualified cardinality constraints can be used, but at a human cost, since 1) > it's not a concept that most folks are aware of or comfortable with 2) it > appears to require quite a bit more coding. > > The idea, as I understood it, was to make SHACL easier to grasp and thus more > likely to be used. > >> >> ISSUE-93 >> >> I agree that there is a conflation of good style with requirements on SHACL >> shape graphs, particularly for rdfs:label and rdfs:comment. I also agree that >> good style should not be diluting the requirements for SHACL. >> >> Separately, I agree that the requirements for SHACL shape graphs and SHACL >> engines are smushed together and that the requirements for SHACL engines >> should be refactored. I think that the way to do this is to define SHACL >> operations like validation and then describe what implementations of these >> operations must or should or may do. Validation reports are part of SHACL >> validation, so it is not possible to just define what is required for data to >> satisfy a shape as suggested in >> https://lists.w3.org/Archives/Public/public-data-shapes-wg/2015Sep/0232.html > > > +1 > > kc > >> >> ISSUE-94 >> >> I do not feel that there is any need to completely remove the RDF syntax >> definition from the current SHACL document. I do agree that there should be >> more care taken to discuss SHACL constructs without appearing to require them >> being RDF. I also agree that the semantics of the constructs should be >> written without depending on the RDF representation of SHACL constructs. >> >> ISSUE-96 >> >> I feel that SHACL validation results already contain adequate information to >> identify the construct in question. 
>> Adding more information only complicates an already-complex system. I vote
>> -1 for such additions.
>>
>> peter
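
A minimal Turtle sketch of the two cardinality constructs debated under
ISSUE-91 and ISSUE-92 may help readers of the archive. It uses term names
from the SHACL vocabulary as eventually published (sh:path, sh:minCount,
sh:class, sh:qualifiedValueShape, sh:qualifiedMinCount), which differ in
places from the draft vocabulary this thread is discussing, and the ex:
terms are hypothetical.

  @prefix sh: <http://www.w3.org/ns/shacl#> .
  @prefix ex: <http://example.org/ns#> .

  ex:IssueShape
      a sh:NodeShape ;
      sh:targetClass ex:Issue ;

      # ISSUE-91: the "defaults" spelled out explicitly. A minimum of
      # zero with no stated maximum constrains nothing, which is why
      # no default values are needed when both are simply omitted.
      sh:property [
          sh:path ex:assignedTo ;
          sh:minCount 0
      ] ;

      # ISSUE-92: a qualified cardinality constraint expresses
      # "at least one value matching this shape" without reading two
      # constraints on the same property additively.
      sh:property [
          sh:path ex:assignedTo ;
          sh:qualifiedValueShape [ sh:class ex:Manager ] ;
          sh:qualifiedMinCount 1
      ] .

Whether the qualified form is harder for non-experts to write than two
sh:property constraints on the same path read additively is the usability
trade-off Karen raises above.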
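
On ISSUE-96, the point that results already identify the construct in
question can be illustrated with the result vocabulary of the published
SHACL specification, which postdates this thread, so the property names are
an assumption relative to the draft under discussion and ex:AssignedToShape
is a hypothetical property shape.

  @prefix sh: <http://www.w3.org/ns/shacl#> .
  @prefix ex: <http://example.org/ns#> .

  [ a sh:ValidationReport ;
    sh:conforms false ;
    sh:result [
        a sh:ValidationResult ;
        sh:focusNode ex:issue1 ;                 # the node that failed
        sh:resultPath ex:assignedTo ;            # the property checked
        sh:sourceShape ex:AssignedToShape ;      # hypothetical property shape
        sh:sourceConstraintComponent sh:MinCountConstraintComponent ;
        sh:resultSeverity sh:Violation ;
        sh:resultMessage "Less than 1 value for ex:assignedTo" ]
  ] .

Here sh:sourceShape and sh:sourceConstraintComponent point back to the shape
and the constraint component that produced the result.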
Received on Friday, 16 October 2015 16:21:49 UTC