Re: Canonicalization: VC-JWT is the URDNA2015 of JWTs.

Just chiming in here to share my experience, as I think it can be valuable
to the group.

I started learning about VCs about 50 days ago, from zero prior knowledge.
Even after that much time, it is still difficult to follow some pieces of
the standard; in my opinion, that is because it is trying to do too much.
So two statements resonate strongly with me: "*our goal should be to make
the VC Data Model easy to use and interoperable with W3C and IETF
standards*", as well as *"Doing a bad job at 'a lot' is way worse than
doing a great job at 'a little'"* (this is the Unix philosophy, essentially).

Another fun fact from a newbie: I initially thought that what Orie proposed
as VC-JWS was how VCs were already being secured, before learning that they
were in fact JWTs. I was surprised to find out that information included in
the VC was also duplicated into registered JWT claims (iss, for example).
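
For other newcomers, here is roughly what surprised me, sketched as a
decoded VC-JWT payload (the DIDs and dates below are made up): the issuer
shows up both inside the `vc` claim and again as the registered `iss` claim.

    decoded_payload = {
        "iss": "did:example:issuer-123",         # registered JWT claim
        "sub": "did:example:subject-456",        # mirrors credentialSubject.id
        "nbf": 1671000000,                       # mirrors issuanceDate
        "vc": {
            "@context": ["https://www.w3.org/2018/credentials/v1"],
            "type": ["VerifiableCredential"],
            "issuer": "did:example:issuer-123",  # same information again
            "issuanceDate": "2022-12-14T00:00:00Z",
            "credentialSubject": {"id": "did:example:subject-456"},
        },
    }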

Given my experiences above, I agree with all parts of this. Thanks Orie.

On Wed, Dec 14, 2022 at 11:55 AM Mike Prorock <mprorock@mesur.io> wrote:

> Orie,
> Thank you for compiling all of this.  I think it is a good summary of the
> issues and complications currently resulting from the various
> transformations involved in signing and verification.
>
> big +1 to this:
>
>> VC-JWT could be made consistent with the three approaches taken above, by
>> treating the "credential content" as the claim set,
>> and by promoting the relevant "proof metadata" parameters to the JWT
>> header.
>
>
> A future VC-CWT could do the same, but rely on securing a content type of
>> `application/credential+cbor`.
>
>
> As I mentioned at TPAC, our goal should be to make the VC Data Model easy
>> to use and interoperable with W3C and IETF standards.
>
>
> I am also very much in favor of your suggestion here:
>
>> I believe the working group should strongly consider withdrawing the
>> VC-JWT work item, and instead focus on profiling JWS and COSE Sign1 to
>> secure a generic JSON data model.
>
>
>
> Mike Prorock
> CTO, Founder
> https://mesur.io/
>
>
>
> On Wed, Dec 14, 2022 at 9:48 AM Orie Steele <orie@transmute.industries>
> wrote:
>
>> Friends,
>>
>> I believe that transforming data before and after signing and verifying
>> is a mistake, and will lead to severe interoperability and security issues.
>>
>> I believe that VC-JWT is currently implementing a form of
>> canonicalization / transformations that, even if it were to be
>> drastically improved, would still be harmful.
>>
>> Here are some refreshers on this topic.
>>
>> - https://en.wikipedia.org/wiki/Canonicalization
>> -
>> https://www.cisa.gov/uscert/bsi/articles/knowledge/coding-practices/ensure-input-properly-canonicalized
>> - https://en.wikipedia.org/wiki/Graph_canonization
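
To make the concern about transformations concrete for myself, here is a
small sketch of the kind of information loss such a mapping can introduce
(the issuer values are made up, and the mapping function is my own
illustration, not any particular implementation): two differently shaped
credentials can collapse to the same `iss` claim, so a verifier that
reconstructs the credential from the claim set after verification cannot
know which shape was originally signed.

    credential_a = {"issuer": "did:example:issuer-123"}
    credential_b = {"issuer": {"id": "did:example:issuer-123",
                               "name": "Example Issuer"}}

    def to_iss(credential):
        # One plausible way to derive the registered "iss" claim.
        issuer = credential["issuer"]
        return issuer if isinstance(issuer, str) else issuer["id"]

    # Both inputs collapse to the same claim value.
    assert to_iss(credential_a) == to_iss(credential_b) == "did:example:issuer-123"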
>>
>> Several months ago, I implemented another version of vc-jwt, that
>> attempted to resolve these issues by replicating claims from the JWT claim
>> set to the header.
>>
>> https://github.com/transmute-industries/verifiable-credentials
>>
>> VC-JWT could be made consistent with the three approaches taken above, by
>> treating the "credential content" as the claim set,
>> and by promoting the relevant "proof metadata" parameters to the JWT
>> header.
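
If I understand the proposal correctly, it would look roughly like the
sketch below: the payload is the credential exactly as authored, and the
proof metadata moves into the JOSE protected header (the identifiers and
the `application/credential+json` content type are my assumptions).

    protected_header = {
        "alg": "ES256",
        "kid": "did:example:issuer-123#key-1",
        "iss": "did:example:issuer-123",
        "cty": "application/credential+json",
    }
    payload = {
        "@context": ["https://www.w3.org/2018/credentials/v1"],
        "type": ["VerifiableCredential"],
        "issuer": "did:example:issuer-123",
        "issuanceDate": "2022-12-14T00:00:00Z",
        "credentialSubject": {"id": "did:example:subject-456"},
    }
    # The JWS would sign the payload bytes as-is; no claims are remapped
    # before signing or reconstructed after verification.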
>>
>> A future VC-CWT could do the same, but rely on securing a content type of
>> `application/credential+cbor`.
>>
>> As I mentioned at TPAC, our goal should be to make the VC Data Model easy
>> to use and interoperable with W3C and IETF standards.
>>
>> The working group should avoid "reinventing" IETF standards such as JWT /
>> CWT.
>>
>> The working group should also avoid "making IETF / IANA the core data
>> model"... since that is essentially already an option: it is what you get
>> when you use a vanilla JWT.
>>
>> I'd also like to share some of the feedback we got when chartering the
>> SCITT WG at IETF:
>>
>> https://mailarchive.ietf.org/arch/msg/scitt/EPz6i2X84TSKkwehRVq6mUpWeaU/
>>
>> > I think it's important to focus the efforts of this group on work that
>> can be feasibly accomplished. To that end, I think we ought to make sure
>> that we shape the protocol and architecture to work with different identity
>> formats (as needed for different deployments), but not discuss anything
>> else about how those identities are obtained, validated, etc. As a point of
>> comparison, TLS basically punts on how X.509 certificates are
>> authenticated, and I think we ought to do the same thing here.
>>
>> There are a few points here that are relevant:
>>
>> 1. It's important to focus a WG on what can feasibly be accomplished...
>> Doing a bad job at "a lot" is way worse than doing a great job at "a
>> little".
>> 2. W3C Verifiable Credentials are not limited to one identity format, so
>> we should avoid making assumptions in the core data model that are biased
>> by or derived from an identity format, for example X.509 or OIDC.
>>
>> Here is another point of feedback we got, that can be interpreted more
>> generally as a criticism of JWT and CWT, not specifically VC-JWT.
>>
>> https://mailarchive.ietf.org/arch/msg/scitt/x6xlPoNvOaS6WfZA1U7Tg6B0-zY/
>>
>> > In order to validate the signature, with VCs (and JWT in general) you
>> have to reach into the payload (and decode it) to extract the issuer (the
>> DID) in order to discover the public key used to sign. In our model, we
>> separate those layers more strictly, *where the issuer is part of the
>> signature envelope header* and *you don't have to reach into the payload
>> to validate the signature*. That's how we stay payload agnostic in
>> SCITT. Whether that's the best strategy is another question, but it seems
>> cleaner to do it that way in my opinion.
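
The contrast described here, as I understand it (all identifiers below are
made up): with a classic VC-JWT the verifier must base64url-decode the
payload to find `iss` before it can discover the verification key, whereas
in a SCITT-style envelope the issuer lives in the protected header, so key
discovery never touches the payload.

    # Classic JWT: the issuer is inside the (encoded) payload.
    jwt_payload = {
        "iss": "did:example:issuer-123",
        "vc": {"type": ["VerifiableCredential"]},
    }

    # SCITT-style envelope: the issuer is part of the signature envelope header.
    envelope = {
        "protected_header": {
            "alg": "ES256",
            "issuer": "did:example:issuer-123",
            "kid": "did:example:issuer-123#key-1",
        },
        "payload": b"<opaque, content-typed credential bytes>",
        "signature": b"<signature over protected header and payload>",
    }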
>>
>> In my opinion, the approach taken in SCITT is superior to JWT / CWT for
>> cases where you have a widely diverse set of payloads ( which is the case
>> for software supply chain artifacts, and I believe is also the case for W3C
>> Verifiable Credentials ).
>>
>> There are a lot of similarities between W3C Credentials and software
>> supply chain artifacts secured with SCITT at IETF.
>>
>> The most obvious one is that both use cases are meant to cover a very
>> diverse set of data structures and details, where registering every single
>> term in say:
>>
>> https://www.iana.org/assignments/cwt/cwt.xhtml
>>
>> is likely not a path to success... because of the need to be clear on
>> terminology, and the "cost" of registering in CBOR ( where small tags are
>> worth more ).
>>
>> It is true, we could register all the *required*  terms from the core
>> data model, but this would introduce near duplicates into the IETF
>> registries...
>>
>> I believe this approach would be harmful both to W3C and IETF.
>>
>> I believe the working group should strongly consider withdrawing the
>> VC-JWT work item, and instead focus on profiling JWS and COSE Sign1 to
>> secure a generic JSON data model.
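
For the COSE Sign1 side of this, my rough mental model is the structural
sketch below (not a signed message; the content type string, the key id,
and the use of the cbor2 library are my own assumptions about what such a
profile could look like).

    import json
    import cbor2

    credential = {
        "@context": ["https://www.w3.org/2018/credentials/v1"],
        "type": ["VerifiableCredential"],
        "issuer": "did:example:issuer-123",
    }

    protected = cbor2.dumps({
        1: -7,                               # alg: ES256
        3: "application/credential+json",    # content type of the payload
        4: b"did:example:issuer-123#key-1",  # kid
    })

    # COSE_Sign1 is the CBOR array [protected, unprotected, payload, signature],
    # usually wrapped in CBOR tag 18.
    cose_sign1 = cbor2.dumps(cbor2.CBORTag(18, [
        protected,
        {},                                  # unprotected header
        json.dumps(credential).encode(),     # the credential, untransformed
        b"signature-placeholder",            # produced by the COSE signing step
    ]))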
>>
>> ... alternatively, we should consider directly addressing the "long term
>> vision" for JWT / CWT based verifiable credentials holistically,
>> potentially even from a green field perspective.
>>
>> I believe that, in the case the WG does produce a *JWT profile*, it would
>> be good to design it from the ground up, based on WG consensus, as opposed
>> to continuing to work through the current version, which appears to lack
>> working group consensus in several critical areas.
>>
>> We're having a similar greenfield conversation on this issue, so I feel
>> it's wise for the working group to be consistent in addressing lack of
>> consensus:
>>
>> https://github.com/w3c/vc-data-model/issues/947
>>
>> There does not appear to be consensus on what the data model is, so there
>> cannot be consensus on how best to secure it.
>>
>> Potentially this represents an opportunity for the working group to fix
>> both issues in a superior manner.
>>
>> Regards,
>>
>> OS
>>
>> --
>> *ORIE STEELE*
>> Chief Technical Officer
>> www.transmute.industries
>>
>>
>

Received on Wednesday, 14 December 2022 21:13:38 UTC