- From: UGent-imec <Beatriz.Esteves@UGent.be>
- Date: Thu, 26 Mar 2026 15:16:46 +0000
- To: Freek Dijkstra <freek.dijkstra@surf.nl>, "public-dpvcg@w3.org" <public-dpvcg@w3.org>
- Message-ID: <GV2PR09MB84024AA1D20D64649DE621338456A@GV2PR09MB8402.eurprd09.prod.outlook.com>
Dear Freek,

Thanks for reaching out to the DPVCG. It is really nice to see that SURF is looking at the work we do in DPV. I would be very happy and interested to work with you on these topics. Perhaps we should have a follow-up call to discuss it in more detail?

Best regards,
Beatriz Esteves
Postdoctoral Researcher
IDLab, Ghent University - imec

________________________________
- From: Freek Dijkstra <freek.dijkstra@surf.nl>
- Sent: Thursday, March 26, 2026 11:01:48 AM
- To: public-dpvcg@w3.org <public-dpvcg@w3.org>
- Subject: DPV TOM extension with more PETs

Dear DPV CG members,

We are looking for ways to describe the conditions under which sensitive data is made available for re-use. For access control, we are likely to use the Data Use Ontology (DUO) by GA4GH. However, DUO does not cover the technical measures that a data provider takes when making sensitive data available. Usually, this boils down to a set of privacy-enhancing technologies (PETs) such as pseudonymisation, filtering the data, and making the data available for analysis only, not for download.

The DPV TOM module describes some of these PETs, such as pseudonymisation, synthetic data, secure MPC, and (fully) homomorphic encryption. However, some others are missing, in particular algorithm-to-data and federated machine learning. Would there be interest in adding these concepts as technical measures to future versions of DPV? If not, could anyone recommend other ontologies that describe these concepts, preferably ones that work well with DPV and/or ODRL?

The main concept we are interested in is algorithm-to-data: rather than making sensitive data available for download, the data provider runs the analysis requested by a researcher and only makes the result of that analysis available.
There are a few variants, but the variant where the data is made available in a secure environment is now referred to as a "Trusted Research Environment" (TRE) in academic contexts. We and some of our partners offer such an environment, and we would like to describe it in a machine-readable format.

With kind regards,
Freek Dijkstra
--
Freek Dijkstra | SURF Innovation Lab | M +31 6 4484 7459 | Available on Mon, Tue, Wed, Thu | SURF is the collaborative organisation for ICT in Dutch education and research
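As a minimal sketch of the kind of machine-readable description asked for above: assuming DPV's `dpv:hasTechnicalMeasure` property and `dpv:Pseudonymisation` concept from the TOM module, and placing the *proposed* algorithm-to-data and federated-learning concepts in a hypothetical `ex:` namespace (they do not exist in DPV today), a dataset's measures could be serialised as Turtle like this:

```python
# Sketch only: dpv:hasTechnicalMeasure and dpv:Pseudonymisation are assumed
# DPV terms; ex:AlgorithmToData and ex:FederatedLearning are hypothetical
# concepts proposed in this thread, kept in an example namespace.

PREFIXES = """\
@prefix dpv: <https://w3id.org/dpv#> .
@prefix ex:  <https://example.org/pets#> .
"""

def describe_measures(resource: str, measures: list[str]) -> str:
    """Serialise a resource and its technical measures as a Turtle snippet."""
    lines = [PREFIXES, resource]
    for i, m in enumerate(measures):
        sep = " ;" if i < len(measures) - 1 else " ."  # ';' between, '.' last
        lines.append(f"    dpv:hasTechnicalMeasure {m}{sep}")
    return "\n".join(lines)

turtle = describe_measures(
    "ex:SensitiveDataset",
    ["dpv:Pseudonymisation",   # existing DPV TOM concept
     "ex:AlgorithmToData",     # hypothetical proposed concept
     "ex:FederatedLearning"],  # hypothetical proposed concept
)
print(turtle)
```

A TRE offering could publish such a snippet alongside its DUO access conditions, so that both the usage terms and the technical measures are negotiable in one machine-readable place.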
Received on Thursday, 26 March 2026 15:16:54 UTC