Re: new ACLU Legislative Guidance

First, let me be clear that I am still very proud of the work all of us
have done on the VC and DID data models and glad that work is continuing,
including VC Barcodes.

If only the work on the protocols and pilots had put people first and
left the work on things to some other groups.

More specifically, here are my responses to some of Manu’s questions:

- On interaction with some civil society .orgs, I too tried my best to
involve them. My understanding is that they could not afford to engage at
the highly technical level at which the work was being done. Our community
could have made different choices, such as putting digital driver’s
licenses, delegation, and notarization first. That was our choice, and
efforts to raise this point were met with strong enough pushback from
leadership that I chose to step back and limit my participation to an
occasional post. To this day, I have avoided saying how I feel about this
outside of our community, because I do respect all of you and am hoping we
can do better.

- With respect to the human use cases (such as government-issued licenses,
delegation, contextual human reputation, and notarization by regulated
intermediate issuers), I am not aware of any work in any part of our
broader community on these, except in a reactive mode for mDL. If our scope
were clearer, we might have been able to engage more diverse standards
groups such as the IETF.

- On biometrics: as yet another example, had we put people first, we would
have considered biometrics as _the_ top priority in identity. Now, we’re
playing catch-up with mDL, Clear, and id.me. I agree with Manu on the
strong character of the folks in this community, but our actions have left
a void that other interests have been happy to fill. Without our leadership
in this respect, politicians are more easily manipulated.

- On “chain of custody”: driver’s licenses, paper-first credentials,
barcodes, etc., are self-verifying in use and do not inherently depend on
certified hardware for their management. Why did the community decide to
confuse the subject and the holder in the way it did? How does this
confusion impact government-issued identities and credentials that are,
like Aadhaar, destined to become the foundation for private use? AAMVA has
been particularly closed, vague, and unhelpful in this respect.

- On delegation, IETF GNAP has done the heavy lifting here for years. Let’s
start by building on top of it instead of extending OAuth and OIDC, which
have been shown to be easily captured by private interests.

- On Sybil resistance and reputation, I am not aware of a single deployment
or pilot based on VCs and DIDs for this use case. I do see good work on
BBS+ and ZKPs that could lead to advances in this direction, but that work
has to compete for attention with a slew of other approaches that are not
as respectful of human rights. This is another example of why we should not
treat humans and machines in the same working groups.

I hope this list is clear and concrete enough to warrant more conversation.

Adrian

On Sat, Oct 12, 2024 at 2:25 PM Manu Sporny <msporny@digitalbazaar.com>
wrote:

> On Fri, Oct 11, 2024 at 10:17 PM Adrian Gropper <agropper@healthurl.com>
> wrote:
> > Manu’s review is a good start. But the admittedly good intentions of our
> > community must not pave the road to digital hell.
>
> Oh, yes, clearly. I don't think any of us get out of bed in the
> morning so sure of ourselves that we blindly proceed without careful
> deliberation on the technical, legal, ethical, moral, and political
> choices that are being made as this technology goes into production.
>
> I also hope that no one in this community is under the impression that
> any of us have this all figured out. We don't... but when stuff like
> this ACLU report comes out, we talk about it and debate it openly,
> which is not happening for many of the alternative technologies you
> mentioned.
>
> That we are able to have public discussions, and have been having
> these discussions for over a decade in this community, and have been
> acting on the outcomes of those discussions in ways that result in
> global standards that attempt to address the ACLU, EFF, and EPIC's
> concerns (many of them valid) is one of the more important aspects of
> this community. This is what I was getting at wrt. building these
> technologies in the open, in full transparency, for all aspects of the
> design, incubation, standardization, and deployment process.
>
> > In my opinion, our community has left the hard work of protecting human
> > rights to politicians and lawyers at almost every fork in our journey.
>
> You paint with too broad of a brush. I know many people in this
> community that hold protecting human rights as a necessary duty of
> care -- you're not the only one, Adrian. :)
>
> In fact, of those that have been with the community the longest, and
> have built and deployed systems into production... of those
> individuals, I know of no one that does not care for or blatantly
> disregards human rights. On the contrary, I can't think of a single
> person that wouldn't be deeply disturbed and saddened if the
> technology they are building is used to violate human rights.
>
> That doesn't mean it can't happen. I know many of us are regularly
> concerned about the "unknown unknowns", the unintended side effects of
> the technologies we are building. There's only so much a tool can do,
> and at some point, the law needs to step in and take over. We don't
> make laws at W3C, we standardize technologies, but that doesn't mean
> those technologies are not guided by principles and ethics.
>
> Some further thoughts on your points below...
>
> > - we went ahead without participation by EFF, ACLU, and EPIC
>
> Definitely not true. I have personally reached out to each of those
> organizations, and others, and requested that they engage and
> participate in the standards setting process and have done so for
> years. I know others in this community that have done the same, and
> they have engaged, and continue to engage (per the article that Kaliya
> linked to that kicked off this whole discussion). Perhaps not as much
> as we'd like, and perhaps not in the way that you'd prefer, but it's
> not true that we are proceeding without participation.
>
> > - we combined non-human use cases like supply chain management with
> > human ones
>
> Verifiable Credentials are a generalized technology that enables an
> entity to say anything about anything. There is no differentiation or
> "combining" of use cases there and I have no idea how we'd try and
> force that if we thought it was a good idea.
>
> That said, the requirements for non-human use cases are different than
> ones involving humans, and in those cases, many of us building these
> standards and solutions are keenly aware of that difference and the
> human rights implications.
>
> I don't really understand what you're getting at here.
>
> > - we completely ignored the role of biometrics
>
> Did we? How? I don't know if you're arguing for more biometrics, less
> biometrics, or no biometrics. What is "the role" and what were you
> hoping would happen?
>
> > - we relied too much on chain-of-custody models that promote coercive
> > practices
>
> Can you provide an example of a "chain of custody model" that the
> community is promoting?
>
> > - we ignored the importance of reputation and other Sybil-resistance
> > issues in practical applications
>
> My recollection is that we've spent considerable time talking about
> reputation and Sybil resistance in this community, and that is largely
> what drove some of the privacy-preserving solutions that have been
> deployed to date. What else were you hoping would happen that isn't in
> process or might not happen?
>
> > - we ignored the fundamental need for delegation in human affairs
>
> While I agree that we're not there yet and need to focus more on this
> in the coming years... that we "ignored" it seems a bit much. What
> would an ideal solution look like to you?
>
> > - we were very sure of ourselves even as ISO, Clear, and id.me gained
> > scale
>
> "Sure of ourselves", in what way? In that we have failed to stop the
> worst parts of ISO mDL, Clear, and id.me from being pushed to
> production and large scale? We all know that the best solution
> sometimes doesn't win in the short term, and sometimes even fails to
> win in the long term. That doesn't mean we should stop trying to make
> the world a better place by putting well thought out, viable
> alternatives into the market.
>
> The notion that Verifiable Credentials would be one of the global
> standards used by some of the largest nation states on the planet was
> unthinkable when we started this bottom-up movement over a decade ago;
> that the technologies we have created here are peers to initiatives
> put forward by far more monied and/or powerful interests continues to
> fill us with awe, even though that was always the plan.
>
> I don't know about others, but I'm certainly not so sure that the most
> ideal, secure, privacy-respecting, and human rights-respecting
> technologies will win in the end. We've certainly got some real
> stinkers in the market now that are in use. I hope the best
> technologies win in the end, but that's the struggle many of us have
> signed up for knowing full well that none of this is guaranteed, a
> straight path, or an easy win.
>
> There are monied interests that benefit from the status quo or are
> pushing towards what we believe to be dystopian outcomes. I know we
> won't get to a better future if we don't keep going. We have to keep
> fighting for what we believe is the best thing for global society...
> and that is inclusive of technology, ethics, model legislation, and to
> your point, human rights.
>
> You might be mistaking being boundlessly determined for being too sure
> of ourselves. The former can be filled with doubt and dread while the
> latter is not. I'd put most of those that are contributing in this
> community to be simultaneously filled with doubt and dread while being
> boundlessly determined. I can understand how that might come across as
> misplaced confidence to some.
>
> > I appreciate Manu’s academic review but I see little indication that our
> > community is heading to a healthy outcome.
>
> Then what needs to change, Adrian? Can you define what you mean by a
> healthy outcome? This is an open forum, those that debate, design,
> incubate, build, and deploy in this community have moved the needle in
> positive ways over the years. What concrete, additional set of actions
> do you think we should be taking?
>
> -- manu
>
> --
> Manu Sporny - https://www.linkedin.com/in/manusporny/
> Founder/CEO - Digital Bazaar, Inc.
> https://www.digitalbazaar.com/
>
>

Received on Saturday, 12 October 2024 21:29:47 UTC