RE: Layering/Composability of Verifiable Credentials (was: Re: Market Adoption of VCs by Fortune 500s (was: Re: DHS Verifier Confidence/Assurance Level Expectation/Treatment of non-publicly defined Vocabulary/Terminology -- by using @vocab))

One more update … this will be the last until after/around IIW.

Internet Credential Architecture Reference Model 0.69 ft. InterCredential 4-Corner Interop Model
https://www.youtube.com/watch?v=6dEWQqzznRE&list=PLU-rWqHm5p44AsU5GLsc1bHC7P8zolAAf&index=55&t=2685s

This version 0.69 of the InterCredential tutorial presents the InterCredential Credential Composition and Media Types models, in addition to the InterCredential 4-Corner Credential Interoperability model.
The first two models are closely related: the latter is the cross-product of the InterCredential Credential Abstract Data Model and the range of potential Credential Data Model Serializations.

InterCredential Examples (unchanged)
https://www.youtube.com/watch?v=uo4RuT__IXw&list=PLU-rWqHm5p44AsU5GLsc1bHC7P8zolAAf&index=5&t=1377s


Cheers,
Michael Herman
Web 7.0

From: Brent Shambaugh <brent.shambaugh@gmail.com>
Sent: Thursday, February 9, 2023 9:43 PM
To: Michael Herman (Trusted Digital Web) <mwherman@parallelspace.net>
Cc: Nate Otto <nate@ottonomy.net>; public-credentials@w3.org; Manu Sporny <msporny@digitalbazaar.com>; Christopher Allen <christophera@lifewithalacrity.com>; Daniel Hardman (daniel.hardman@gmail.com) <daniel.hardman@gmail.com>; Kristina Yasuda (kristina.yasuda@microsoft.com) <kristina.yasuda@microsoft.com>
Subject: Re: Layering/Composability of Verifiable Credentials (was: Re: Market Adoption of VCs by Fortune 500s (was: Re: DHS Verifier Confidence/Assurance Level Expectation/Treatment of non-publicly defined Vocabulary/Terminology -- by using @vocab))

As always, thank you Nate, Manu, Michael, Christopher, Kim, Daniel, and all:
I am reminded that more help could always be used in the DIF interop group. I'm thankful for the views and people we've hosted, although that has been Kaliya's and Daniel's doing thus far.
I have my own perspective on what might work best, but it is only one perspective, and I was MIA during the early days even though I was part of the Web Payments CG years ago.
I concur with Nate that the learning curve for verifiable credentials is huge. I haven't mastered the protocol layer or really gotten VCs working from scratch (terror of breaking machines), and it is especially hard for an individual. I can attest that Kim Duffy almost single-handedly helped me build momentum in the space, because she treated me like a human and encouraged me along the way despite my voiced fears and struggles.
Bringing it back to the topic: views like this are always welcome, and I've found this thread informative.

-Brent Shambaugh

GitHub: https://github.com/bshambaugh

Website: http://bshambaugh.org/

LinkedIN: https://www.linkedin.com/in/brent-shambaugh-9b91259

Skype: brent.shambaugh
Twitter: https://twitter.com/Brent_Shambaugh

WebID: http://bshambaugh.org/foaf.rdf#me



On Wed, Feb 8, 2023 at 11:15 PM Michael Herman (Trusted Digital Web) <mwherman@parallelspace.net> wrote:
Why does the W3C not simply give *developers the option* to completely leave out the JSON-LD/RDF VC extensions if they don’t need them for their particular ecosystem/trade association/decentralized system?

I discuss an approach here (a single slide):
https://www.youtube.com/watch?v=uo4RuT__IXw&list=PLU-rWqHm5p44AsU5GLsc1bHC7P8zolAAf&index=5&t=2685s


Examples (1 slide): https://www.youtube.com/watch?v=uo4RuT__IXw&list=PLU-rWqHm5p44AsU5GLsc1bHC7P8zolAAf&index=5&t=1377s
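As a rough sketch of what "leaving out" could look like (illustrative only; the identifiers and claims below are made up, and this is not taken from the slides):

    # Illustrative only: the same claims with and without the JSON-LD machinery.
    # Identifiers and claim values are made up; the @context URL is the standard VC 1.1 context.
    vc_with_jsonld = {
        "@context": ["https://www.w3.org/2018/credentials/v1"],
        "type": ["VerifiableCredential"],
        "issuer": "did:example:issuer123",
        "issuanceDate": "2023-02-09T00:00:00Z",
        "credentialSubject": {
            "id": "did:example:subject456",
            "memberOf": "Example Trade Association",
        },
    }

    # A plain-JSON credential for an ecosystem that doesn't need the RDF extension points.
    plain_credential = {k: v for k, v in vc_with_jsonld.items() if k != "@context"}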


Why does the W3C insist on making VCs more complicated for *all developers* than they need to be?

Michael Herman
Web 7.0


From: Nate Otto <nate@ottonomy.net>
Sent: Wednesday, February 8, 2023 1:35 PM
To: public-credentials@w3.org
Cc: Christopher Allen <christophera@lifewithalacrity.com>; Manu Sporny <msporny@digitalbazaar.com>
Subject: Re: Layering/Composability of Verifiable Credentials (was: Re: Market Adoption of VCs by Fortune 500s (was: Re: DHS Verifier Confidence/Assurance Level Expectation/Treatment of non-publicly defined Vocabulary/Terminology -- by using @vocab))

Christopher Allen <christophera@lifewithalacrity.com> wrote:
>  (For instance, you really need to use SPARQL if you are serious about using JSON-LD data in a large database given an open-world model, and using that requires you to have a deeper understanding of RDF that JSON-LD abstracts out.)

For what it's worth, I also agreed with most of the rest of your message, Christopher, but didn't quite feel this assumption was right. That's from my perspective having worn, over time, the hats of product owner, full-stack developer, and engineering manager on a project (Badgr) that implemented Open Badges 2.0 import/export in JSON-LD in production with hundreds of thousands of users. Our databases were MySQL and Mongo, and we did not have JSON-LD-specific tooling built in at the database or ORM layer. We did all of our JSON-LD processing in application code prior to storage in the database. Our business logic never really tried to embrace the open-worldedness of the OB data model; it stuffed extra data into a JSONField and only used it for exports or data presentations to users, not for any automated understanding of achievement data. This was the right choice for that product, and it seems to work pretty well in a bunch of other products I've seen in this space as well.
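A minimal sketch of that pattern (hypothetical property and function names, not Badgr's actual schema): split an incoming assertion into the handful of properties the business logic acts on and a catch-all blob that is stored verbatim and only merged back in on export.

    # Hypothetical sketch of the "process in application code, stash the rest" pattern.
    KNOWN_PROPERTIES = {"id", "type", "name", "description", "issuer", "recipient", "issuedOn"}

    def split_assertion(assertion: dict) -> tuple[dict, dict]:
        """Separate the properties our business logic acts on from everything else."""
        known = {k: v for k, v in assertion.items() if k in KNOWN_PROPERTIES}
        extra = {k: v for k, v in assertion.items() if k not in KNOWN_PROPERTIES}
        return known, extra  # 'extra' would go into a JSONField, untouched

    def export_assertion(known: dict, extra: dict) -> dict:
        """On export, merge the untouched extra data back in so nothing is lost."""
        return {**extra, **known}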

As it turned out, only a small number of developers on the team needed much JSON-LD or RDF knowledge. I mostly carried that burden alone and just made sure that, when relevant sections of the system were modified across a team of upwards of 50 people (including over a dozen code committers), important functionality didn't break. Most of the important functionality was covered by automated testing in any case. I do agree in general that RDF concept knowledge and practical JSON-LD knowledge are helpful to have somewhere on the team or among project advisors when implementing VC 1.1, OB 3.0, CTDL, or other JSON-LD-based specs or vocabularies in this space.



Manu Sporny <msporny@digitalbazaar.com> wrote:
> Our focus at present is on improving the ecosystem/community tooling so developers don't have to do as much "low-level coding" as they had to do in the past and can focus more at the application layer...

I very much like this focus, and I'm contributing to VC (OB 3.0 flavor) tooling that aims to serve the same purpose. This is the same pattern that I found successful in Badgr. The JSON-LD processing for OB 2.0 was all done in an open-source package published by the 1EdTech standards body (code: https://github.com/1EdTech/openbadges-validator-core; hosted service: https://openbadgesvalidator.imsglobal.org/). Other implementers besides us used the same package, either over its HTTP API or directly as a Python module in their services, to canonicalize and validate OB inputs into a consistent format (compacted into the OB v2 context). The presence of this tooling, I think, helped developers adopt OB 2.0 more easily. Issuer implementers could pretty much just copy the examples from the OB 2.0 spec (https://openbadgespec.org/) and not really care why they were putting an @context property in the document, as long as the validator didn't give them any errors. Verifier-side implementers could integrate with the validator API and likewise not care much about JSON-LD, other than recognizing that there might be some additional properties in the documents beyond those specified in the spec.
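As a rough illustration of that canonicalize-and-compact step (this is not the validator package's own API; it uses the general-purpose pyld library instead, and assumes network access so the remote OB v2 context can be fetched):

    # Illustrative only: compact an incoming assertion into the OB v2 context with pyld,
    # rather than with the 1EdTech validator package itself.
    from pyld import jsonld

    # The Open Badges 2.0 context; pyld's document loader fetches it over the network.
    OB_V2_CONTEXT = {"@context": "https://w3id.org/openbadges/v2"}

    def compact_to_ob_v2(doc: dict) -> dict:
        """Return the document compacted into the Open Badges v2 context."""
        return jsonld.compact(doc, OB_V2_CONTEXT)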

The issuer quickstart guide (https://github.com/1EdTech/openbadges-specification/blob/4a6b1417c0311b6b8673e2c53872623b22691a55/ob_v3p0/impl/getting-started.md) the OB community is writing for OB 3.0, which uses the VC Data Model 1.1, is a bit more involved than the one for 2.0 (comments and suggestions are welcome for a couple more weeks), but issuers (90%+ of the ecosystem) still don't need to think too much about the JSON-LD details; they just need to include the relevant contexts and follow the JSON Schema provided by the OB 3.0 spec. When it comes to making a proof, I hope they don't roll their own, but instead use a library for their selected proof technique written by cryptography experts. Key management is still a significant challenge, and there are other technical hurdles that I hope we can make easier for implementers by providing tooling that addresses more of the layers of the VC/OB stack.
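For the "follow the JSON Schema" step, a hedged sketch (the filenames below are made up for illustration; use the schema files actually published with the OB 3.0 spec):

    # Hypothetical sketch: validate a credential against a locally saved OB 3.0 JSON Schema.
    import json
    from jsonschema import validate, ValidationError

    with open("ob_v3p0_achievementcredential_schema.json") as f:  # made-up filename
        schema = json.load(f)
    with open("my_credential.json") as f:  # the issuer's credential document
        credential = json.load(f)

    try:
        validate(instance=credential, schema=schema)
        print("Credential matches the schema.")
    except ValidationError as err:
        print(f"Schema violation: {err.message}")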

As of 2020, 1EdTech (formerly IMS Global) counted over 43 million issued Open Badges (http://content.imsglobal.org/badge-count-2020/badge-count-2020/). Knowing their sources, these were all in Open Badges 2.0 JSON-LD format, with a long-tail distribution across source platforms. As most of these platforms upgrade to OB 3.0 over the next 12 months (all the major ones have committed to do so), these credentials and the millions more issued since 2020 will become available to recipients in Verifiable Credentials 1.1 JSON-LD format (maybe some will be JWT-flavored). A relatively small number of individuals will need to know the deep inner workings of JSON-LD and RDF in order to effect this transition. I'm sure there will be technical difficulties along the way, but we're going to work collaboratively, and I think the next few months' worth of advances in shared tooling could be really helpful in making this possible.



Anyway, long story short, I've been in the role of "the person who worries about the JSON-LD/RDF stuff so that other people don't need to" in product teams and standards groups fairly successfully over the years, and I've advised more than a dozen companies on how to implement OB 2.0 in JSON-LD (now I'm actively advising companies on OB 3.0/VC). In that process, I found that it helps when somebody I'm talking to has more than a surface-level understanding of JSON-LD, but it is absolutely possible to do everything necessary with almost zero specific knowledge of these concepts, as long as the implementer is knowledgeable about JSON.


Nate Otto
nate@ottonomy.net
Founder, Skybridge Skills
he/him/his
