DPV TOM extension with more PETs

Dear DPV CG members,

We are looking for ways to describe conditions when making sensitive 
data available for re-use.
For access control, we are likely to use the Data Use Ontology (DUO) 
by GA4GH.

However, that does not cover the technical measures a data provider 
takes when making sensitive data available.
Usually, this boils down to a set of privacy-enhancing technologies 
(PETs) such as pseudonymisation, filtering the data, and making the 
data available for analysis only, not for download.

The DPV TOM module describes some of these PETs, like pseudonymisation, 
synthetic data, secure MPC, and (fully) homomorphic encryption.
However, some others are missing, in particular algorithm-to-data and 
federated machine learning.

Would there be interest to add these concepts as technological measures 
to future versions of DPV?
If not, could anyone recommend other ontologies that describe these 
concepts, preferably ones that work well with DPV and/or ODRL?

The main concept we're interested in is algorithm-to-data: rather than 
making sensitive data available for download, the data provider runs the 
analysis requested by a researcher and only makes the result of that 
analysis available. There are a few variants, but a variant where the 
data is made available in a secure environment is now referred to as a 
"Trusted Research Environment" (TRE) in academic contexts. We and some of 
our partners offer such an environment, and we would like to describe 
it in a machine-readable format.
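To make the proposal concrete, here is a minimal Turtle sketch of how such concepts might be modelled as specialisations of DPV's TechnicalMeasure. The terms ex:AlgorithmToData and ex:TrustedResearchEnvironment (and the ex: namespace) are hypothetical placeholders, not existing DPV terms:

```turtle
@prefix dpv:  <https://w3id.org/dpv#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix ex:   <https://example.org/pets#> .  # hypothetical namespace

# Hypothetical concept, not (yet) in DPV:
ex:AlgorithmToData a rdfs:Class ;
    rdfs:subClassOf dpv:TechnicalMeasure ;
    rdfs:label "Algorithm-to-Data" ;
    rdfs:comment "The data provider runs the requested analysis and only releases the result, not the data." .

# A TRE could then be expressed as a specific variant:
ex:TrustedResearchEnvironment a rdfs:Class ;
    rdfs:subClassOf ex:AlgorithmToData ;
    rdfs:label "Trusted Research Environment (TRE)" .
```

Something along these lines could then be referenced from an ODRL policy or DPV description of the conditions under which the data is made available.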

With kind regards,
Freek Dijkstra

-- 
Freek Dijkstra
| SURF Innovation Lab |
| M +31 6 4484 7459 |
| Available on Mon, Tue, Wed, Thu |

SURF is the collaborative organisation for ICT in Dutch education and research

Received on Thursday, 26 March 2026 12:45:51 UTC