Re: Novel (to me) architecture for control of personal data

It's helpful to consider the value propositions and business models that
can drive adoption of SSI because they can provide objective input into
layering standards. More directly, standardized VCs can participate in
different value / business opportunities depending on how we design and
layer applicable protocols.

As the real-world example in the forwarded thread below implies, there are
three principal value propositions:

1 - Aggregation by a trusted delegate (an Account Aggregator, or AA) for
convenience in policy-based authorization, and the business opportunity
offered by standards that enable free choice among these delegates.

2 - Increased trust (reduced surveillance) and reduced cost (security and
regulatory) for the AA by keeping the contents of the transaction out of
the AA.

3 - Increased privacy by blinding the issuer and verifier from each other.

#1 and #2 are independent of #3. Absolute blinding adds overall complexity
and cost for authenticity and revocation. Relaxing the absolute requirement
of #3 through audit and accountability mechanisms such as logs and notaries
can still provide much of the privacy value.

So we have three patterns for applying VCs in the marketplace. They range
from absolute blinding (
https://github.com/w3c/vc-data-model/issues/821#issuecomment-1241110115)
with no Account Aggregator, through the BIS / India pattern of an
encryption-isolated AA, to a delegated-authorization AA that also isolates
the content of the transaction but does not use the AA as a proxy.

Across this spectrum, VCs solve the authenticity problem for Data Providers
and reduce the risk of the authorization decision for Data Users making a
request to the AA.

VCs will be most successful in the marketplace if we design protocols that
facilitate VC use (and reuse) across all three patterns of control.

Adrian

---------- Forwarded message ---------
From: devi prasad <dprasadm@gmail.com>
Date: Sat, Sep 10, 2022 at 4:05 AM
Subject: Re: Novel (to me) architecture for control of personal data
To: Steve Capell <steve.capell@gmail.com>
CC: Adrian Gropper <agropper@healthurl.com>, W3C Credentials Community
Group <public-credentials@w3.org>, Chris Gough <
christopher.d.gough@gmail.com>, <namlleps.drahcir@gmail.com>



Steve, the consent manager is an Account Aggregator (AA) in this ecosystem.
AAs are regulated by the central bank - the Reserve Bank of India (RBI).
The Financial Information Users (FIUs) generally pay per
consent/transaction - the amount is not fixed by RBI. It is left to
the market.

Refer to the official site for more details: https://api.rebit.org.in/

IMO, substantial trust is placed in AAs when a Financial Information User
(FIU) fetches financial data from a Financial Information Provider (FIP; a
bank, for example).
Ephemeral keys are used per data fetch between FIU and FIP. ECDH using
Curve25519 is mandatory.
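The per-fetch key agreement pattern can be sketched as below. This is a toy finite-field Diffie-Hellman for illustration only: the real system mandates ECDH on Curve25519 (X25519), and the group parameters and helper names here are stand-ins, not the ReBIT specification.

```python
import secrets

# Toy Diffie-Hellman illustrating the per-fetch ephemeral key agreement
# between an FIU and an FIP. The real system mandates ECDH on Curve25519
# (X25519); this small prime group is illustrative and NOT secure.
P = 2**127 - 1  # a Mersenne prime; stand-in for real group parameters
G = 3

def ephemeral_keypair():
    # A fresh secret is generated for every data fetch.
    secret = secrets.randbelow(P - 2) + 1
    return secret, pow(G, secret, P)

fiu_secret, fiu_public = ephemeral_keypair()
fip_secret, fip_public = ephemeral_keypair()

# Public values are exchanged; each side combines its own secret with
# the peer's public value and arrives at the same shared secret, from
# which a symmetric data-encryption key would then be derived.
fiu_shared = pow(fip_public, fiu_secret, P)
fip_shared = pow(fiu_public, fip_secret, P)
assert fiu_shared == fip_shared
```

Because the key pairs are ephemeral, compromise of one fetch's key material does not expose earlier or later fetches.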

There's a central registry of FIPs and FIUs maintained by the non-profit
organization, Sahamati - https://sahamati.org.in/.
This is indeed an interesting model that works at India scale.

The data request from FIU to an FIP via AA is documented here:
https://swagger-ui.rebit.org.in/?url=https://specifications.rebit.org.in/api_specifications/account_aggregator/FIP_1_1_3.yaml#/Data%20Flow/post_FI_request
This is the context in which, I think, an FIU must trust the AA: the FIU
exposes the initial key material it wants to share with the FIP for
subsequent cryptographic operations (signing as well as data encryption).
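To make that trust point concrete, here is a toy sketch (an insecure stand-in group rather than the mandated X25519, and not the ReBIT protocol) of why a key-relaying intermediary must be trusted: a dishonest relay could substitute its own key material and sit in the middle.

```python
import secrets

# Toy Diffie-Hellman showing why trust in a key-relaying AA matters:
# a dishonest relay can substitute its own public value for each
# party's (man-in-the-middle). Insecure toy parameters, for illustration.
P = 2**127 - 1
G = 3

def keypair():
    s = secrets.randbelow(P - 2) + 1
    return s, pow(G, s, P)

fiu_s, fiu_pub = keypair()
fip_s, fip_pub = keypair()
aa_s, aa_pub = keypair()  # a hypothetical malicious AA's own key pair

# Honest relay: FIU and FIP derive one shared secret the AA cannot compute.
assert pow(fip_pub, fiu_s, P) == pow(fiu_pub, fip_s, P)

# Dishonest relay: the AA hands each side its own public value, so it
# shares one secret with the FIU and another with the FIP, and could
# decrypt and re-encrypt everything flowing between them.
assert pow(aa_pub, fiu_s, P) == pow(fiu_pub, aa_s, P)
assert pow(aa_pub, fip_s, P) == pow(fip_pub, aa_s, P)
```

This is why regulation of AAs (and the central registry of participants) carries real weight in the design.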

This text (by a licensed AA) offers more details:
https://docs.setu.co/data/account-aggregator/encryption

Regards
Devi Prasad



On Sat, Sep 10, 2022 at 3:05 AM Steve Capell <steve.capell@gmail.com> wrote:

> 
> Thanks for sharing that Adrian
>
> It’s very interesting - all the more so because it’s apparently live and
> working in the world’s most populous democracy.  Some thoughts occur to me:
>
> 1 - It doesn’t say how the key management works for that encrypted flow of
> private data. It's orchestrated by the consent manager but not visible to
> the consent manager.  I guess it must be asymmetric encryption based on
> public key discovery of data users.
>
> 2 - it doesn’t say how the commercial model works. Who pays for the
> consent manager service? The data user maybe?  Or is the consent manager a
> government run public good utility?
>
> 3 - although it’s different, I’m not entirely sure how / whether it’s
> better than a VC / EDV model where the subject is also the consent manager.
> It might have something to do with the question about commercial
> incentives.  Possibly the most interesting thing about the Indian model is
> not the tech pattern but the commercial model for a fee-charging consent
> manager whose profit motive is to protect the data subject’s data rather
> than to profit from the aggregation / analysis / resale of it.
>
> Not expecting you to answer these questions Adrian - just sharing them as
> they occur to me ;)
>
> Kind regards
>
> Steven Capell
> Mob: 0410 437854
>
> On 10 Sep 2022, at 12:55 am, Adrian Gropper <agropper@healthurl.com>
> wrote:
>
> 
>
> https://www.brookings.edu/blog/future-development/2022/09/08/give-people-control-of-their-data/
>
> Builds around delegation to an intermediary that does not see the data
> itself.
>
> Adrian
>
>
>

Received on Saturday, 10 September 2022 17:49:30 UTC