RE: Why I can no longer support OAuth2 (was: Re: Why can't I pay using a Verifiable Credential?)

Manu, Tim, all,

+1 on moving past OAuth2 for our VC work. 

From a B2B/B2G point of view, two things are important: 
     (A) persistent, authenticated channels between organizations, and 
     (B) semantically rich credentials that machines can process across ecosystems.


Below are our observations from B2B/B2G use cases:

1) Persistent channels are a business requirement in B2B/B2G

Large supply chains run 24/7. Flows are automated. Parties must exchange revocations, updates, and counter-signatures without user clicks. OpenID4VP’s baseline is still request/response over HTTPS with redirects. That is fine for consumer web authentication, but it does not give a durable, encrypted, pairwise channel that both sides can use for automation and event-driven updates. 

Protocols such as DIDComm provide exactly that: authenticated, end-to-end encrypted, long-lived relationships with message threading and delivery receipts. This supports asynchronous pickup, which many org wallets behind firewalls and message queues need. There is also a formal analysis and verification of the DIDComm protocol: https://eprint.iacr.org/2024/1361

Use-case example from EU projects: a business registry update can legally trigger the revocation of a Legal Person Identifier (LPID) in the EU/EEA. If an LPID stored in an enterprise wallet is revoked while €100M of goods are stuck with supply chain partners or in customs, you cannot wait for a user to re-click a redirect flow. You need the registry to push an updated LPID credential over a secure, persistent channel, seconds later, to clear the blockage. Protocols such as DIDComm fit this automation pattern; OAuth redirect flows do not. The need for persistent channels is a common pattern in B2B messaging for server-to-server exchanges between legal (and government) entities. We expect EU regulators to issue further regulatory and technical requirements, grounded in legal controls, for this capability by the end of the year.
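A pushed credential update in this pattern is just a DIDComm v2 plaintext message that the registry's agent encrypts to the wallet's DID and delivers (or queues with a mediator), with no user interaction. A minimal sketch of such a message; the envelope fields follow the DIDComm v2 message structure, while the DIDs, protocol URI, and body layout are illustrative assumptions, not a normative profile:

```python
import json
import time
import uuid

def build_credential_update(sender_did, recipient_did, thread_id, credential):
    """Build a DIDComm v2 plaintext message carrying a re-issued credential.

    The envelope fields (id, type, from, to, thid, created_time, body) follow
    the DIDComm v2 message structure; the protocol URI and body layout are
    illustrative placeholders, not a standardized protocol.
    """
    return {
        "id": str(uuid.uuid4()),          # unique message id
        "type": "https://example.org/credential-update/1.0/offer",  # hypothetical protocol URI
        "from": sender_did,
        "to": [recipient_did],
        "thid": thread_id,                # threads the push back to the original exchange
        "created_time": int(time.time()),
        "body": {"credential": credential},
    }

msg = build_credential_update(
    "did:example:registry", "did:example:orgwallet",
    "lpid-issuance-001", {"id": "urn:uuid:lpid-credential"},
)
print(json.dumps(msg, indent=2))
```

The `thid` threading is what lets the wallet correlate the unsolicited push with the earlier issuance, which a redirect-based flow has no equivalent for.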


2) B2B needs semantic richness; SD-JWT (as used with OID4VC) is not a good fit

B2B credentials carry structured trade, product, industry, digital twin and conformance data. These require shared vocabularies, versioning, and machine reasoning across domains. JSON-LD is the dominant approach for this in many data spaces and public programs:
- UN/CEFACT guidance for cross-border trade explicitly tells implementers to use JSON-LD vocabularies and the correct @context. (UNECE)
- Gaia-X / Manufacturing-X / Catena-X Credentials (all B2B sovereign data sharing industry ecosystems) are JSON-LD and rely on linked data semantics and policies. 
- European Blockchain Service Infrastructure (EBSI) profiles are built on W3C VCs and linked-data proofs. GS1 also documents VC/DI adoption with JSON-LD. 

By contrast, SD-JWT VC is a compact JSON token format with selective disclosure, designed for semantically simple data structures such as human identity attributes and driving licence cards. It does not carry JSON-LD semantics; claim meanings are not globally linked, by design. That is fine for minimal human KYC or consumer login, but it undermines interoperability for rich B2B payloads (customs, product passports, quality, ESG). SD-JWT could embed JSON-LD objects, but that would only add complexity.

If we want verifiers to validate complex business claims consistently across ecosystems, we should use the VC Data Model and Data Integrity proofs, not SD-JWT tokens as the primary data model. 
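To make the contrast concrete, here is the shape of a VC Data Model 2.0 credential: the @context entries bind every claim name to a global IRI, so a verifier in another ecosystem can resolve exactly what a claim means. The base context is normative; the trade-vocabulary context URL, the claim names, and the status list URL below are illustrative assumptions:

```python
import json

# A minimal W3C VC Data Model 2.0 credential. The first context entry is the
# normative base context; the second is a hypothetical domain vocabulary, and
# the issuer, subject, and status values are placeholders.
credential = {
    "@context": [
        "https://www.w3.org/ns/credentials/v2",
        "https://example.org/trade-vocab/v1",   # hypothetical vocabulary
    ],
    "type": ["VerifiableCredential", "LegalPersonIdentifier"],
    "issuer": "did:example:business-registry",
    "validFrom": "2025-08-01T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:orgwallet",
        "legalName": "Example Trading GmbH",
    },
    "credentialStatus": {
        "type": "BitstringStatusListEntry",
        "statusPurpose": "revocation",
        "statusListIndex": "94567",
        "statusListCredential": "https://registry.example/status/3",
    },
}
print(json.dumps(credential, indent=2))
```

An SD-JWT carrying the same claims would leave "legalName" as a bare string key with no linked definition, which is exactly the gap described above.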


3) Security and delegation: favour message-level signatures and capabilities

OAuth2 can be hardened (mTLS, DPoP, FAPI 2.0), but the result is still a bearer-token mindset. B2B/B2G workflows benefit from simpler, explicit cryptography on each request and delegation:
- HTTP Message Signatures give non-repudiable, per-message authentication and integrity. 
- Authorization Capabilities / trust chaining give chainable delegation and authorisation suited to organizations and automation (some work remains on defining chained-credential semantics). This is where we should put our effort, as Manu noted.
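What message-level signing buys over a bearer token is that each request carries its own verifiable, non-repudiable statement of what was sent and by whom. A sketch of the RFC 9421 signature base that a signer covers; the component list, key id, URL, and digest value are illustrative, and a real deployment should use a maintained HTTP Message Signatures library rather than hand-rolling this:

```python
def signature_base(components, params):
    """Assemble an RFC 9421 signature base: one line per covered
    component, then the @signature-params line, which is also signed."""
    lines = [f'"{name}": {value}' for name, value in components]
    lines.append(f'"@signature-params": {params}')
    return "\n".join(lines)

# Illustrative covered components for a signed presentation request.
covered = [
    ("@method", "POST"),
    ("@target-uri", "https://supplier.example/presentations"),
    ("content-digest", "sha-256=:X48E9qOokqqrvdts8nOJRJN3OWDUoyWxBf7kbu9DBPE=:"),
]
params = '("@method" "@target-uri" "content-digest");created=1755550000;keyid="org-key-1"'
base = signature_base(covered, params)
print(base)
# The signer signs `base` with its private key; the signature travels in the
# Signature header and `params` in the Signature-Input header of the request.
```

Because the signature covers the method, target URI, and body digest, replaying or redirecting the request elsewhere invalidates it, which is the property a stolen bearer token lacks.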


4) Protocol stack that meets these B2B/B2G needs

A pragmatic stack looks like this:
- Transport/Channel: DIDComm v2 for persistent, E2E-encrypted, asynchronous exchanges (issuance, presentation, updates, problem reports). 
- Interaction Protocols: WACI / DIDComm Issue-Credential 3.0 and Present-Proof 3.0 profiles for org wallets. 
- Data Model & Proofs: W3C VC Data Model 2.0 with JSON-LD contexts; Data Integrity proof suites; Bitstring Status List for revocation and suspension status.
- APIs: VC-API for server wallets and verifiers, including audit-friendly, message-signed exchanges. 
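The status-list piece of this stack is simple enough to sketch end to end: the status credential carries a GZIP-compressed, base64url-encoded bitstring, and a verifier checks the bit at the subject credential's statusListIndex. A sketch assuming the Bitstring Status List bit layout (index 0 is the leftmost bit of the first byte); a production verifier would of course also verify the status credential's own proof first:

```python
import base64
import gzip

def is_revoked(encoded_list: str, index: int) -> bool:
    """Check one bit in a Bitstring Status List.

    `encoded_list` is the base64url (unpadded) encoding of the
    GZIP-compressed bitstring; bit 0 is the leftmost bit of byte 0.
    """
    padded = encoded_list + "=" * (-len(encoded_list) % 4)
    bitstring = gzip.decompress(base64.urlsafe_b64decode(padded))
    return bool((bitstring[index // 8] >> (7 - index % 8)) & 1)

# Build a tiny example list with bit 5 set, then query it.
raw = bytearray(16)          # 128 status bits (real lists are far larger)
raw[0] |= 1 << (7 - 5)       # mark index 5 as revoked
encoded = base64.urlsafe_b64encode(gzip.compress(bytes(raw))).decode().rstrip("=")
print(is_revoked(encoded, 5), is_revoked(encoded, 6))  # → True False
```

This check is a pure function over fetched data, so a server wallet can run it on every automated exchange without any interactive flow.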

OID4VCI/OID4VP remain useful for browser-to-wallet consumer flows and platform DC-API integrations, but they are not the right default for org-to-org, gov-to-org, M2M, and A2A automation. Even the OID4VP spec states that its baseline rides on HTTPS redirects; it is not a persistent channel. Persistent communication channels and agent-to-agent (A2A) exchanges will dominate the technical requirements discussion.


Given the above, we support leaving OAuth2, OID4VCI, and OID4VP behind for B2B/B2G use cases, and focusing our profiles and test suites on DIDComm + VC-API + JSON-LD + message signatures/capabilities. This aligns with what many sovereign data-exchange initiatives already do, and it meets the operational needs of industry and government.


References and prior analysis: our write-up on organizational wallet protocols and why SD-JWT / OID4VC are not a fit for B2B automation. As part of our work in the European Wallet Consortium (EWC, an EU Commission Large Scale Pilot), we summarised our findings here: https://github.com/spherity/eudi-wallet-rfcs/blob/org-wallet-protocols-and-formats/organisational-wallet-protocol-and-format-discussion.md


Carsten Stöcker,
Spherity GmbH

-----Original Message-----
From: Marcus Engvall <marcus@engvall.email> 
Sent: Monday, 18 August 2025 00:24
To: Manu Sporny <msporny@digitalbazaar.com>
Cc: W3C Credentials Community Group <public-credentials@w3.org>; Tim Bouma <trbouma@gmail.com>
Subject: Re: Why I can no longer support OAuth2 (was: Re: Why can't I pay using a Verifiable Credential?)

I think that while discussion can be had whether or not OAuth’s security model is appropriate in 2025 (especially with new protocols like GNAP and ZCAP), the criticism of OAuth being dependent on insecure bearer tokens is not reflective of modern best practice in high-assurance environments.

For instance, OpenID FAPI 2.0 explicitly specifies that conforming implementations must use *either* OAuth 2.0 mTLS (RFC 8705) or OAuth 2.0 DPoP (RFC 9449), both of which cover the threat vectors mentioned in the article. Message signing with HTTP Message Signatures is optional in case non-repudiation is required. OAuth 2.1, still in the draft stage, also mentions both RFCs, but stops short from recommending or requiring them.

In my opinion, the primary issue with OAuth, which is also highlighted in the article, is that it is a complicated patchwork of standards built upon a narrow model of authorisation. It’s optimised for implementation flexibility at the cost of varying levels of security guarantees in said implementations. If you properly implement high-assurance OAuth 2.0, however, you will have a modern, reliable, and secure system. The onus however is still on you and your counterparties to actually implement it properly, which is admittedly a big ask for certain organisations.

—
Marcus Engvall
https://www.linkedin.com/in/qasaur/

> On 17 Aug 2025, at 18:50, Manu Sporny <msporny@digitalbazaar.com> wrote:
> 
> On Wed, Aug 13, 2025 at 5:18 PM Tim Bouma <trbouma@gmail.com> wrote:
>> Several months back I took a hard look at OAuth and what I learned from my implementation endeavour, I wrote this.
>> 
> https://open.substack.com/pub/trbouma/p/why-i-can-no-longer-support-oauth
> 
> This is an excellent article, Tim. It matches our experience at 
> Digital Bazaar implementing both OAuth2 and the OID4VCI/OID4VP 
> protocols. We have repeatedly experienced developers in the ecosystem 
> accidentally leak their credentials via OAuth2 because of a lack of 
> understanding in how it works, or the pitfalls with "framework-based 
> standards".
> 
> OAuth2 was what drove us to incubate and standardize HTTP Signatures, 
> which does at least depend on cryptography to prove authorization.
> 
> Dick Hardt has recently written a related post on how OAuth2 is not a 
> good fit for things like MCP (and how HTTP Message Signatures are
> better):
> 
> https://www.linkedin.com/posts/dickhardt_mcp-oauth-ai-activity-7358178115673616384-RKzc/
> 
> ... granted, he still tries to loop OAuth2 back into the mix (a 
> mistake, IMHO -- we should just leave OAuth2 behind and move on to 
> stuff like Authorization Capabilities), but dropping the complicated
> OAuth2 dances for clients is certainly an improvement.
> 
> In any case, just wanted to echo what you wrote Tim -- OAuth2 was 
> better than sharing passwords a decade or more ago, but we need to 
> move on to real cryptographic authentication, authorization, and 
> delegation.
> 
> -- manu
> 
> --
> Manu Sporny - https://www.linkedin.com/in/manusporny/
> Founder/CEO - Digital Bazaar, Inc.
> https://www.digitalbazaar.com/
> 

--
Spherity GmbH <https://www.spherity.com/>  |  Emil-Figge-Straße 80  |  44227 Dortmund

LinkedIn <https://www.linkedin.com/company/spherity>  |  X <https://twitter.com/spherity>  |  YouTube <https://www.youtube.com/@spherity2407>

Managing Directors: Dr. Carsten Stöcker, Dr. Michael Rüther

Registered in Dortmund HRB 31566

Received on Tuesday, 19 August 2025 19:19:18 UTC