Re: The German Government slams JSON-LD

How do I say I need JSON-LD without saying I love JSON-LD

Sent from Outlook for Android<https://aka.ms/AAb9ysg>
________________________________
From: Markus Sabadello <markus@danubetech.com>
Sent: Friday, February 27, 2026 5:28:42 PM
To: public-credentials@w3.org <public-credentials@w3.org>
Subject: Re: The German Government slams JSON-LD



+1 to all that.

Two funny observations:

1. The section which says "Here’s exactly how structured data + semantics + governance flow together, without JSON-LD inside the credential." has a screenshot which seems to contain several well-known JSON-LD terms such as "JsonWebKey2020" and "publicKeyJwk".

2. The specification for Attestation Rulebooks seems to be based on several existing RDF vocabularies:
https://github.com/eu-digital-identity-wallet/eudi-doc-standards-and-technical-specifications/blob/main/docs/technical-specifications/ts11-interfaces-and-formats-for-catalogue-of-attributes-and-catalogue-of-schemes.md

Markus

On 2/27/26 1:59 PM, Lluís Alfons Ariño Martín wrote:

Dear all,

The article raises operational questions that deserve serious engagement, and its conciliatory tone is welcome. Several of its observations about the verification flow are correct. But the article also contains factual errors, conflates distinct technical concepts, and rests on a false dichotomy that, if left unaddressed, could mislead the ecosystem at a critical moment - precisely when the draft amending Implementing Regulations are open for public consultation.

I want to address the key issues as concisely as I can, while providing enough technical grounding for this audience to evaluate them independently.

1. SD-JWT VC does not provide unlinkability

SD-JWT VC provides selective disclosure - the holder can choose which claims to reveal. But it does not provide unlinkability. When the same SD-JWT VC credential is presented to two different verifiers, the signature and key binding remain constant across presentations, making them correlatable. This problem is amplified in the EUDI architecture, where presentations may be routed through qualified validation services - centralised points that receive a high volume of credential presentations and could technically link them across relying parties. Without additional measures such as single-use credentials (which impose a significant availability burden on issuers), SD-JWT remains linkable by design.
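The linkability issue can be sketched in a few lines of Python. This is a toy model, not a real SD-JWT implementation, and all values are hypothetical:

```python
# Illustrative toy, not a real SD-JWT implementation; all values are hypothetical.
# Selective disclosure lets the holder reveal different claims to different
# verifiers, but the issuer signature and holder key binding stay constant.

def present(sd_jwt, disclosed):
    """Build a presentation revealing only the chosen claims."""
    return {
        "issuer_signature": sd_jwt["issuer_signature"],  # constant per credential
        "cnf_key": sd_jwt["cnf_key"],                    # constant per credential
        "claims": {k: v for k, v in sd_jwt["claims"].items() if k in disclosed},
    }

credential = {
    "issuer_signature": "sig-3f9a...",  # fixed at issuance
    "cnf_key": "key-71bc...",           # holder's key-binding key
    "claims": {"given_name": "Ana", "birthdate": "1990-01-01", "nationality": "ES"},
}

to_verifier_a = present(credential, {"given_name"})
to_verifier_b = present(credential, {"nationality"})

# Different claims are revealed, yet the stable signature/key pair lets two
# verifiers (or an intermediary seeing both) link the presentations:
correlatable = (to_verifier_a["issuer_signature"] == to_verifier_b["issuer_signature"]
                and to_verifier_a["cnf_key"] == to_verifier_b["cnf_key"])
print(correlatable)  # True
```

A BBS-style proof, by contrast, would produce a fresh, cryptographically distinct proof value for each presentation, leaving the two verifiers nothing stable to match on.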

To be clear: this is not a format-specific deficiency - any credential format that couples the signature to the payload without zero-knowledge proof mechanisms faces the same issue (W3C VC signed with JAdES included). The difference is that BBS cryptosuites, which are native to the JSON-LD W3C-VC ecosystem, provide a concrete path to unlinkability: each presentation generates a cryptographically distinct proof that cannot be correlated with other presentations of the same credential, even by colluding verifiers. This is not a minor distinction - it is the core privacy property that leading European cryptographers criticised the EUDI Wallet for lacking in 2024.

Conflating selective disclosure with unlinkability is a category error that obscures what is at stake in the format debate.

2. The article addresses only half the verification flow

The article correctly observes that a verifier must construct the Presentation Definition before seeing the credential (what I'll call Moment 1: request construction). It then concludes that semantics inside the credential are "operationally useless" because they "arrive too late."

This ignores the second half of the flow - Moment 2: response interpretation. When a German verifier receives a credential issued in Spain, it needs to know what the fields mean, not just that they exist. @context provides precisely this: a machine-readable link from each property in the credential to its definition in a governed vocabulary.

The rulebook tells the verifier what to ask for. @context tells the verifier what it has received. Both are necessary. The article addresses only the first.

Cross-border interoperability is fundamentally a Moment 2 problem. A verifier in one Member State receiving a credential from another needs to determine that the fields it received correspond to the concepts it expected, even when different national vocabularies, languages, or qualification frameworks are in play.

This is what @context enables - and no amount of pre-agreed schemas eliminates this need at the point of credential interpretation.

3. JSON Schema cannot replace @context - they operate at different levels

The article asserts that "all the semantics defined in JSON-LD contexts can be expressed just as well in JSON Schemas with descriptions." This is false.

JSON Schema validates structure: this field is a string, this array has at least one element, this value comes from an enumerated list. It answers: "is this credential well-formed?"

@context provides semantic binding: this field corresponds to this concept in this ontology, accessible via a resolvable URI. It answers: "what does this credential mean?"

JSON Schema cannot express that "architect" in a Spanish credential and "Architekt" in a German credential refer to the same regulated profession under Directive 2005/36/EC. It cannot express that a competency described using ESCO in one Member State is equivalent to a competency described using a national framework in another. SHACL - the Shapes Constraint Language for RDF, which is explicitly supported in ETSI TS 119 472-1 clause 7.2.1.3 as "ShaclSchemaCredential" - can validate both structural and semantic constraints. JSON Schema cannot.

Adding a "description" string to a JSON Schema property gives you human-readable documentation. Adding a URI to @context gives you machine-readable semantic binding to a formal ontology. These are categorically different capabilities.
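The distinction can be made concrete with a toy Python sketch. The schema, term names, and concept URI below are hypothetical, and the validator is a simplified stand-in for a real JSON Schema library:

```python
# Toy contrast between structural validation and semantic binding.
# Schema, term names, and the concept URI are hypothetical illustrations.

# JSON Schema level: "is this credential well-formed?"
schema = {
    "type": "object",
    "properties": {
        "profession": {"type": "string", "description": "Regulated profession"},
    },
    "required": ["profession"],
}

def validates(doc, schema):
    # Stand-in for a full JSON Schema validator, covering just this schema.
    required_ok = all(k in doc for k in schema["required"])
    types_ok = all(isinstance(doc[k], str)
                   for k, spec in schema["properties"].items()
                   if spec["type"] == "string" and k in doc)
    return isinstance(doc, dict) and required_ok and types_ok

es_credential = {"profession": "arquitecto"}
de_credential = {"profession": "Architekt"}

# Both are structurally valid, but the schema's "description" is opaque prose:
assert validates(es_credential, schema) and validates(de_credential, schema)

# @context level: "what does this credential mean?" Each issuer's context maps
# its local term to a shared concept IRI (here a made-up identifier for a
# regulated profession under Directive 2005/36/EC):
es_context = {"arquitecto": "https://example.eu/professions/2005-36-EC/architect"}
de_context = {"Architekt": "https://example.eu/professions/2005-36-EC/architect"}

same_concept = es_context["arquitecto"] == de_context["Architekt"]
print(same_concept)  # True
```

The schema accepts both credentials without ever saying whether their values denote the same concept; only the semantic mapping establishes that.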

Conflating them undermines the core argument of the article.

4. The rulebook proposal reinvents @context without the standardisation

The article's own example includes "rulebookURI" - a field that points to an external document defining the semantics of the credential, plus "schemaURIs" pointing to external schema definitions.

This is functionally identical to what @context does: linking the credential to its semantic definitions via URIs. The difference is that @context is a W3C standard with deterministic processing rules (JSON-LD Processing Algorithms 1.1, W3C Recommendation), a global ecosystem of implementations across education (Open Badges 3.0, European Learning Model, Europass Digital Credentials), supply chain (Catena-X), and identity (EBSI, DC4EU), and years of interoperability testing. "rulebookURI" is a bespoke field with no formal specification, no standardised processing algorithm, and no implementation track record.

Replacing a standardised mechanism with a non-standardised one that does the same thing is not simplification. It is a regression from standardised to ad hoc.

5. The vocabulary scale problem

A question in the LinkedIn thread deserves attention here, because it identifies the architectural fault line: would controlled vocabularies like ESCO (approximately 14,000 skills, 27 languages, regular update cycles) need to be copied into each rulebook, or referenced by each rulebook?

If copied into: every vocabulary update requires synchronised updates across every rulebook that references it. Multiply ESCO by the European Learning Model, the thousands of regulated professions under Directive 2005/36/EC, and 27 national qualification frameworks, and you have an unscalable maintenance explosion.

If referenced by: the rulebook needs a standardised, machine-readable mechanism for resolving those references by URI. That mechanism is @context - or rather, it is exactly what @context already provides.

The article's position, followed to its logical conclusion, requires either vocabulary duplication (unscalable) or the reinvention of @context within the rulebook architecture itself.

6. The false dichotomy at the heart of the article

This is the central analytical error. The article presents a binary choice: either JSON-LD inside the credential or semantics in rulebooks. In reality, these are not alternatives. @context is the mechanism that links the credential to its governing vocabulary - which can perfectly well be a rulebook, a catalogue, or any governed semantic framework.

A credential with @context pointing to a rulebook-governed vocabulary is not "JSON-LD complexity inside the credential." It is a single JSON property containing a URI that provides machine-readable linkage to the governing semantic framework. Processing it requires a URI lookup, not an RDF reasoning engine.

The article attacks a strawman - heavy semantic-web processing inside every wallet - rather than what @context actually does: a standardised pointer from the credential to its vocabulary. This is precisely what the article's own "rulebookURI" attempts to do, but without the standardisation.
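What consuming @context actually involves can be shown in a few lines: one resolution of a (cacheable) context document, then a dictionary lookup per term. The context URI and its content below are hypothetical:

```python
# What a consumer actually does with @context: one (cacheable) fetch of the
# context document, then a dictionary lookup per term - no RDF reasoning engine.
# The context URI and its content are hypothetical.

credential = {
    "@context": "https://example.eu/rulebooks/diploma/v1/context.json",
    "degree": "Licenciado en Arquitectura",
}

def fetch_context(uri):
    # Stand-in for an HTTP GET of the context document at `uri`.
    return {"degree": "https://example.eu/vocab/elm#degree"}

context = fetch_context(credential["@context"])

# Expanding compact terms to full IRIs is a plain lookup:
expanded = {context.get(k, k): v for k, v in credential.items() if k != "@context"}
print(expanded)  # {'https://example.eu/vocab/elm#degree': 'Licenciado en Arquitectura'}
```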

7. What the formal specifications actually say

ETSI TS 119 472-1 V1.1.1 (December 2025) - which is part of the formal EUDI Wallet technical framework - explicitly mandates @context for JSON-LD W3C-VC EAAs (requirement EAA-7.2.1.2-01) and specifies that it shall contain URIs referencing documents that map URLs to short-form aliases (EAA-7.2.1.2-03). It supports SHACL as a schema mechanism alongside JSON Schema (EAA-7.2.1.3-03). It references BBS cryptosuites for selective disclosure with embedded proofs (EAA-7.4-02, citing W3C Candidate Recommendation "Data Integrity BBS Cryptosuites v1.0").

The ecosystem the article describes - where @context is unnecessary, JSON Schema suffices, and BBS is dead - does not match the ecosystem that ETSI's own experts have specified.

The formal technical framework already provides for exactly the capabilities the article argues against.

8. BBS is not "dead" - it is in process

The article declares "BBS+ is dead. Regulation killed it." Several corrections are needed. First, BBS+ is not BBS - the BBS cryptosuites currently under development at IETF are a different scheme with different security properties; the terminology matters. Second, the claim that they are unusable because they are "not approved by BSI or ANSI" is circular: they are not yet approved because the standardisation process is ongoing, not because they have been evaluated and rejected. The same logic would have disqualified ECDSA before its own approval. Third - and this is architecturally important - neither BSI nor ANSI has a mandate to define cryptographic standards at European level. That mandate belongs to the European Standards Organisations. BSI itself references ETSI TS 119 312 in its national standards. The article elevates national agency positions to a regulatory veto they do not hold.

ETSI TS 119 312 - the European cryptographic algorithm catalogue - is currently under revision, and the inclusion of privacy-preserving cryptographic mechanisms including BBS is within scope. It should be noted that the direct reference to this TS was removed in the latest adaptations of the Implementing Acts, which creates a regulatory gap that needs to be addressed. But the reference chain remains indirect (through EN 319 411-2 via EN 319 411-1), and the ongoing revision is expected to be completed in the near term.

To be clear: it is true that the EUDI Wallet ecosystem requires approved cryptographic algorithms, and BBS is not yet approved. This is a fact, not a disputed point. But "not yet approved" is not the same as "dead" or "killed by regulation." It means the standardisation process has not concluded.

The question is not whether BBS will be standardised at European level, but when - and whether the regulatory framework will be ready to accommodate it when it is.

9. The cost asymmetry the article doesn't acknowledge

The article proposes that sectors "lift" their JSON-LD vocabularies into rulebook-format JSON Schemas and presents this as a painless migration. It is not. Rewriting the European Learning Model, ESCO mappings, and EQF ontologies as JSON Schemas would lose semantic expressiveness (because JSON Schema cannot represent ontological relationships), impose a massive re-engineering cost on the sectors that invested earliest, and require those sectors to adopt a solution they already evaluated and rejected precisely because it could not meet their cross-border interoperability requirements.

The education sector did not choose JSON-LD by accident or fashion. It chose it because JSON Schema could not express the semantic relationships needed for cross-border credential recognition across 27 Member States with different national qualification frameworks.

In summary

The article is right that verifiers need to know what to ask for before seeing the credential. It is right that governance, versioning, and policy belong in governed registries. It is right that the ecosystem needs to be as simple as possible.

But it is wrong that SD-JWT VC provides unlinkability. Wrong that JSON Schema can express what @context expresses. Wrong that @context is heavy semantic-web processing inside wallets. Wrong that BBS has been killed by regulation. And wrong that the choice is between JSON-LD inside credentials and semantics in rulebooks - because @context is the bridge between the two, not an alternative to either.

The EUDI Wallet ecosystem needs three things that are not in competition: governed semantic frameworks (rulebooks, catalogues), a standardised mechanism for linking credentials to those frameworks (@context), and privacy-preserving cryptography for presentation (e.g. BBS). Removing the second does not simplify the architecture. It severs the link between the credential and its meaning - and either leaves that link broken or forces its reinvention under a different name, without the standardisation.

Lluís


From: carsten.stoecker@spherity.com <carsten.stoecker@spherity.com>
Date: Friday, 27 February 2026 at 12:06
To: 'Melvin Carvalho' <melvincarvalho@gmail.com>, 'Anders Rundgren' <anders.rundgren.net@gmail.com>
Cc: 'W3C Credentials CG' <public-credentials@w3.org>
Subject: Re: The German Government slams JSON-LD


Hi all,


The subject line “The German Government slams JSON-LD” is not supported by the sources linked in the post.



(1) The original message in this W3C mailing list thread only links to a Medium article. It does not quote any German government statement or publication.



(2) The Medium article is authored by an individual on Medium. It is not an official publication of the German Federal Government.

(3) The author of the Medium article is publicly listed as working on the EUDI Wallet topic at SPRIND, the Federal Agency for Breakthrough Innovation (SPRIN-D), whose sole shareholder is the Federal Government. This does not mean that a personal Medium post represents either SPRIND or an official German government position.



(4) Germany uses JSON-LD in production public-sector data infrastructure today. Evidence:

     + GovData’s metadata catalogue is available via endpoints “in RDF, Turtle and JSON-LD”

     + IT-Planungsrat/GovData documentation describes harvesting DCAT-AP.de RDF endpoints, including RDF-XML, JSON-LD, and Turtle

     + DCAT-AP.de documentation (Pflegehandbuch) references RDF/XML and JSON-LD conventions

     + Mobilithek metadata upload documentation expects metadata as JSON-LD or RDF/XML



(5) In European data sovereignty and space trust models, JSON-LD and W3C VC concepts are in active use (especially for machine-readable, semantic statements).

     + Example: Gaia-X, Manufacturing-X and Catena-X credentials (and those of many data spaces in other European countries) are described as W3C VCDM in JSON-LD
     + The same data space work describes EDC catalogue exchange as DCAT (dcat:Catalog) serialized as JSON-LD, and credential exchange in the DCP protocol in JSON-LD



(6) Eclipse Dataspace Components (EDC) is a key connector implementation in this space, supported by the German Government's R&D arm: https://github.com/eclipse-edc/Connector



There are differences between JSON-only credential formats (e.g., SD-JWT VC or ISO mdoc) and JSON-LD credentials with Data Integrity. B2B and B2G scenarios often require machine-processable semantics, policy enforcement, and provenance across many parties and systems, which can differ from typical citizen identity use cases. The community should assess these requirements case by case and choose architecture alternatives and credential profiles accordingly.



Regards,

Carsten


From: Melvin Carvalho <melvincarvalho@gmail.com>
Sent: Friday, 27 February 2026 07:25
To: Anders Rundgren <anders.rundgren.net@gmail.com>
Cc: W3C Credentials CG <public-credentials@w3.org>
Subject: Re: The German Government slams JSON-LD

On Fri, 27 Feb 2026 at 7:04, Anders Rundgren <anders.rundgren.net@gmail.com> wrote:

https://mmollik.medium.com/why-the-eudi-wallet-should-stop-pretending-it-needs-json-ld-in-credentials-a37d09a26d06



I appreciate the elegance of JSON-LD, but I recognize it can be too heavyweight for certain applications. That's why I sought a more lightweight alternative with Linked Objects.



https://linkedobjects.org/



I will be doing a new round of work on this, quite soon.




Anders



Spherity GmbH<https://www.spherity.com/>  |  Emil-Figge-Straße 80  |  44227 Dortmund

LinkedIn<https://www.linkedin.com/company/spherity>   |  YouTube<https://www.youtube.com/@spherity2407>

Managing Directors: Dr. Carsten Stöcker, Dr. Michael Rüther

Registered in Dortmund HRB 31566

Received on Friday, 27 February 2026 16:33:20 UTC