- From: Kingsley Idehen <kidehen@openlinksw.com>
- Date: Mon, 6 Dec 2021 10:37:33 -0500
- To: public-rdf-star@w3.org
- Message-ID: <843216ad-c6ca-43a6-8810-71fe4ed0b0b4@openlinksw.com>
On 12/6/21 5:58 AM, Lassila, Ora wrote:
> I think David makes some valid points here.

Hi Ora,

> A couple of observations:
>
> 1) The "dependence on semantics" is an interesting issue. For sure I have built applications that very much depended on RDF's semantics (and used a reasoner). To not do so would mean to build everything from scratch, which is not what I prefer to do. So even if I technically may "control all data" in my application, it is *much* easier for me to rely on RDF than not.

Are you asserting that the use of reasoning is the sole indicator of RDF semantics? I don't want to jump to conclusions about your assertion, hence my question.

> 2) Related to #1, the RDF vs. LPG question, even if you reduce it to the question of "different graph representations", has to take into account the fact that there is a lot of "machinery" that comes with RDF that you would end up building yourself if you used LPGs. This does not necessarily reflect negatively on LPGs, since it may suit you just fine to build whatever mechanisms and machinery you need. I see Neptune customer use cases where one or the other approach makes more sense.

Personally, the issue of concern to me about RDF-Star and SPARQL-Star has little to do with LPGs being good or bad, but rather: why are we twisting RDF into a pretzel as a strategy for interoperability with LPGs?

I don't think there's a single pure LPG player that cares an iota about such endeavors. Why? Because they aren't about dealing with a worldview that matches what RDF is about, not even close.

> If I do want mechanisms like those that RDF offers (including reasoning and well-defined semantics), I prefer to take what RDF gives me rather than rolling my own. I (and many other folks) went through a lot of effort and pain to get RDF where it is now.

Yes, which is why today we can scribble down stuff such as the following:

## Turtle Start ##

@prefix event:   <http://purl.org/NET/c4dm/event.owl#> .
@prefix schema:  <http://schema.org/> .
@prefix rdf:     <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix owl:     <http://www.w3.org/2002/07/owl#> .
@prefix rdfs:    <http://www.w3.org/2000/01/rdf-schema#> .
@prefix foaf:    <http://xmlns.com/foaf/0.1/> .
@prefix skos:    <http://www.w3.org/2004/02/skos/core#> .
@prefix xsd:     <http://www.w3.org/2001/XMLSchema#> .
@prefix dbpedia: <http://dbpedia.org/resource/> .
@prefix :        <#> .

## Modeling Option 1 using events, since everything observed is an event.

[ a :InPersonMeetingEvent ;
  schema:name "Alice meets Bob" ;
  skos:prefLabel "Alice knows Bob" ;
  schema:dateCreated "2020-01-01"^^xsd:date ;
  :knows :alice, :bob
] schema:publisher <https://www.nytimes.com/#this> .

## Modeling Option 2, notetaking metaphor.

[ a :Note ;
  schema:dateCreated "2020-01-01"^^xsd:date ;
  :item [ a :Person ; schema:name "Alice" ; :knows :bob ] ;
  schema:author [ a schema:Person ; schema:worksFor <https://www.nytimes.com/#this> ] ;
  schema:publisher <https://www.nytimes.com/#this>
] .

## Modeling Option 3, notetaking metaphor.

[ a :Note ;
  schema:dateCreated "2020-01-01"^^xsd:date ;
  schema:about [ a :Person ; schema:name "Alice" ; :knows :bob ]
] schema:author [ a schema:Person ; schema:worksFor <https://www.nytimes.com/#this> ] ;
  schema:publisher <https://www.nytimes.com/#this> .

## Marriage ##

[ a :MarriageEvent ;
  schema:name "Alice and Bob Marriage Event"@en ;
  :hasStartDate "2020-01-01"^^xsd:date ;
  :hasBride :alice ;
  :hasGroom :bob
] schema:author [ a schema:Person ; schema:worksFor <https://www.nytimes.com/#this> ] ;
  schema:publisher <https://www.nytimes.com/#this> .
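## For comparison only: a sketch of how the "Alice knows Bob" claim from Modeling Option 1
## could instead be annotated using RDF-star's quoted-triple syntax. Note that the quoted
## triple is annotated but not asserted unless it is also stated on its own.

<< :alice :knows :bob >> schema:publisher <https://www.nytimes.com/#this> .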
## Richard Burton and Elizabeth Taylor Marriages.

[ a :Note ;
  schema:about [ a :Marriage ;
                 :startYear "1964"^^xsd:gYear ;
                 :endYear "1974"^^xsd:gYear ;
                 :hasBride dbpedia:Elizabeth_Taylor ;
                 :hasGroom dbpedia:Richard_Burton ]
] schema:publisher <https://wikipedia.org/#this> .

## Turtle End ##

>
> 3) The question on how to nest RDF-star statements to represent the right semantics (in my marriage use case) is important.

But Marriage isn't a problem solved by reification. A Marriage is an Event, and Events have clearly defined characteristics in the world, expressed for eons in natural language. As you know, natural language is also informed by logic.

> How you nest (that is, in which order) makes a difference, because whoever is going to query that representation has to know it. In general, I stand by my characterization of "awkward", since I want to take into account the "user experience" of querying.

Formal language can be perceived as awkward, hence the prevalence of slang. You can't turn formal language into slang. Naturally, you can make a variety of translations of slang back into formal language, but you don't redefine formal language for the sake of slang -- as the basis for interoperability.

> The ability to query a representation is dependent on whether there is a likelihood that you manage to write a correct query. This in addition to semantic correctness: if the query does not give you the answers you need, what good is the query?

No good at all, but I don't see how that justifies either RDF-Star or SPARQL-Star. Query solutions are very dependent on the modeling used to define the data they operate on. LPG use cases are simply modeling challenges that can be handled in RDF without reification.

My big concern, above all else, is that an LPG surface graph is being eye-balled through a single RDF prism, i.e., reification. How about a reality where those LPG examples have nothing whatsoever to do with reification, i.e., they are basically graphs that are fundamentally devoid of any semantics whatsoever?

## Turtle Start ##

# I can add an owl:InverseFunctionalProperty to the mix if identity reconciliation
# becomes a concern, i.e., using the right tool for that job.

@prefix : <#> .

[ a :Relationship ;
  :hasLabel "Some Labeled to Graph" ;
  :hasStartNode :startNode ;
  :hasEndNode :endNode
] :hasRelationshipProperty :relationshipProperty1 .

## Turtle End ##
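To make that concrete, here is a minimal sketch of a typical LPG edge -- say, Alice -KNOWS-> Bob with a "since" property -- expressed with that same pattern; :alice, :bob, :since, and the "KNOWS" label are illustrative names only. The edge label and its properties are just ordinary triples about the relationship node.

## Turtle Start ##

@prefix :    <#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

## A minimal sketch only; node and property names are illustrative.

[ a :Relationship ;
  :hasLabel "KNOWS" ;
  :hasStartNode :alice ;
  :hasEndNode :bob
] :since "2020-01-01"^^xsd:date .

## Turtle End ##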
My fundamental point is that interoperability doesn't require an inadvertent magnification of the core weaknesses of RDF, and by consequence SPARQL -- which ultimately boils down to general misunderstanding due to a variety of historical challenges (some self-made, IMHO). Both of these standards on the very best of days are still barely understood by the broader public (i.e., developers, architects, decision makers, VCs), so why are we adding more confusion at a stage in the game where clarity needs to be paramount?

I don't believe LPG and RDF interoperability is a problem solved via a spec, especially when one party to the problem (LPG vendors) has absolutely no interest in any baseline semantics and will remain eternally allergic to the letters R-D-F.

> Also note that even the nesting in my example did not fully solve the problem at hand.

Naturally :)

Kingsley

>
> Regards,
>
> Ora
>
>
> On 12/5/21, 7:00 PM, "David Booth" <david@dbooth.org> wrote:
>
> CAUTION: This email originated from outside of the organization. Do not click links or open attachments unless you can confirm the sender and know the content is safe.
>
>
> >> On 12/3/21 6:31 AM, Pierre-Antoine Champin wrote:
> >>> In my view, the impedance mismatch between RDF and PGs is not due to some arbitrary restriction on the RDF model. It is due to the fact that RDF is a logic, that can be represented as a graph, while PG is a graph data model, without any semantic commitment.
>
> I respectfully but very much disagree. I see RDF being used to solve problems, just like PGs. And although I like RDF's grounding in semantics, I have never seen an RDF application that truly depended on that semantic grounding. Consider this:
>
> - For an application in which you control all of the data, clearly your application does not depend on RDF's semantics, because your application could just as well CHOOSE to apply RDF's semantics.
>
> - And for an application in which you do NOT control all of the data -- I'm thinking here primarily of Linked Data applications -- do you really think that those applications would not work if the data producers had published PGs for you to consume instead of RDF (and your application used PGs)? Personally, I seriously doubt it.
>
> Even with RDF's grounding in a standard semantics, every application developer who uses RDF from other sources needs to look carefully at that external data in advance to see if its semantics matches the needs of the application. Otherwise the application will likely produce garbage output. In other words, even though RDF itself has a standard semantic grounding, that grounding is no get-out-of-jail-free card to bypass the need to apply application-specific semantics.
>
> I have always viewed the most significant differences between RDF and PGs as being purely practical choices of graph representation. But maybe this is just a difference in perception?
>
> Best wishes,
> David Booth
>
>
--
Regards,

Kingsley Idehen
Founder & CEO
OpenLink Software
Home Page: http://www.openlinksw.com
Community Support: https://community.openlinksw.com

Weblogs (Blogs):
  Company Blog: https://medium.com/openlink-software-blog
  Virtuoso Blog: https://medium.com/virtuoso-blog
  Data Access Drivers Blog: https://medium.com/openlink-odbc-jdbc-ado-net-data-access-drivers

Personal Weblogs (Blogs):
  Medium Blog: https://medium.com/@kidehen
  Legacy Blogs: http://www.openlinksw.com/blog/~kidehen/
                http://kidehen.blogspot.com

Profile Pages:
  Pinterest: https://www.pinterest.com/kidehen/
  Quora: https://www.quora.com/profile/Kingsley-Uyi-Idehen
  Twitter: https://twitter.com/kidehen
  Google+: https://plus.google.com/+KingsleyIdehen/about
  LinkedIn: http://www.linkedin.com/in/kidehen

Web Identities (WebID):
  Personal: http://kingsley.idehen.net/public_home/kidehen/profile.ttl#i
          : http://id.myopenlink.net/DAV/home/KingsleyUyiIdehen/Public/kingsley.ttl#this
Attachments
- application/pkcs7-signature attachment: S/MIME Cryptographic Signature
Received on Monday, 6 December 2021 15:37:51 UTC