- From: Manu Sporny <msporny@digitalbazaar.com>
- Date: Mon, 3 May 2021 10:05:00 -0400
- To: Ivan Herman <ivan@w3.org>, Dan Brickley <danbri@danbri.org>
- Cc: Ramanathan Guha <guha@google.com>, Dan Brickley <danbri@google.com>, Phil Archer <phil.archer@gs1.org>, Aidan Hogan <aidhog@gmail.com>, semantic-web <semantic-web@w3.org>, Pierre-Antoine Champin <pierre-antoine@w3.org>
Just providing my own thoughts on this as an Editor of the specifications that are being considered for the standards track. I agree with much of what Ivan has stated. Some more thoughts below:

On 5/3/21 5:06 AM, Ivan Herman wrote:

> Would it apply to schemas published at http: URIs or only https: URIs?

It could... or it might not. All of the technologies are optional, not mandatory expectations on the entire RDF / Semantic Web / Linked Data communities. If the technology is helpful to your use case, use it... if not, ignore it.

> Are we convinced that there is application-level value in having
> assurances over instance data without also having them for the schemas and
> ontologies they are underpinned by?

Yes, I am. Much of the work in Verifiable Credentials utilizes schemas that are cached client-side (usually permanently, and enforced by software). We don't need schemas to adopt the technology for it to be useful. It would be more useful if schema publishing used the technologies, but I don't think anyone is placing that as a MUST along this road (because there is no need to create a dependency there).

> Is there an expectation that schema/ontology publishing practice would need
> to change to accommodate these scenarios?

No, I don't think there is an expectation. There is a possibility there that would need years of examination once this technology is standardized... but that can happen in parallel. If it helps the schema publishing ecosystem -- great... if it doesn't, the technology is still useful elsewhere.

> Would schema-publishing organizations like Dublin Core, Schema.org,
> Wikidata, DBpedia, be expected to publish a JSON-LD (1.0? 1.1?) context
> file? What change management, versioning, etc. practices would be
> required? Would special new schemas be needed instead?

I don't think there is any such expectation at present. It's too early to tell whether schema publishing organizations or data consumers would find this technology useful enough to mandate... but that's also why it would be good to have the schema publishing organizations at the table.

> For eg. if instance data created in 2019 uses a schema ex:Foo type last
> updated in 2021, but which has since 2018 contained an assertion of
> owl:equivalentClass to ex2:Bar, and an rdfs:subClassOf ex3:Xyz, are changes
> to the definitions of these supposed to be relevant to the trustability of
> the instance data? If so, why does
> https://w3c.github.io/lds-wg-charter/index.html not discuss the role of
> schema/ontology definitions in all this?

That is a complex question -- and I don't think we have a clear answer today (and may not for a while). That question, however, can be contemplated in parallel with the work -- though it seems to be out of scope here. Schema security needs to contemplate many things... being able to digitally sign the schema is just one of those things (at a lower layer). At a higher layer is how one would interpret the sorts of statements you outline... and I expect those to be outside the scope of the (presumably lower level) current charter.

-- manu

--
Manu Sporny - https://www.linkedin.com/in/manusporny/
Founder/CEO - Digital Bazaar, Inc.
blog: Veres One Decentralized Identifier Blockchain Launches
https://tinyurl.com/veres-one-launches
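[Editorial note: the client-side schema/context caching Manu describes above can be sketched roughly as follows. This is an illustrative Python sketch, not code from any VC implementation; the context URL is real, but the context body here is a placeholder, and `load_context` is a hypothetical helper name.]

```python
# Sketch of client-side context caching as commonly practiced in
# Verifiable Credentials software: context URLs resolve against a
# static, locally bundled map, and anything not pinned in that map
# is rejected rather than fetched over the network.

# Contexts pinned at build time. The body below is a placeholder,
# not the real contents of the credentials/v1 context.
STATIC_CONTEXTS = {
    "https://www.w3.org/2018/credentials/v1": {
        "@context": {"@version": 1.1},
    },
}

def load_context(url):
    """Resolve a context URL from the static cache; never hit the network."""
    try:
        return STATIC_CONTEXTS[url]
    except KeyError:
        # Enforced by software: unknown contexts are an error, not a fetch.
        raise ValueError(f"Refusing to fetch unpinned context: {url}")

# A pinned context resolves locally; an unpinned one raises.
ctx = load_context("https://www.w3.org/2018/credentials/v1")
assert "@context" in ctx
```

Because the cache is permanent and enforced, instance data can be verified without trusting whatever the schema publisher is serving at the URL today -- which is why, as noted above, schema publishers need not change anything for the technology to be useful.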
Received on Monday, 3 May 2021 14:05:29 UTC