- From: <meetings@w3c-ccg.org>
- Date: Tue, 3 Feb 2026 18:57:17 -0500
- To: public-credentials@w3.org
- Message-ID: <CA+ChqYfLg9m1e=pOJ=MyofMOHXfatOGASz6WYR2eYvdPT=2wxg@mail.gmail.com>
Meeting Summary: CCG Atlantic Weekly - 2026/02/03
This meeting focused on updates from community members and a presentation
from Scott Jones of Realize, a computer vision company, discussing their
work in identity verification and potential collaborations.
Topics Covered:
- *Community Updates:*
- Vote for the new Verifiable Credentials Working Group charter is
open until the end of the month.
- The Verifiable Credential Render Method specification, including an
HTML-based render method, is progressing in the W3C VCWG.
- Announcement of a credential summit in Philadelphia next month.
- *Presentation by Scott Jones (Realize):*
- Introduction to Realize, a computer vision company with origins in
advertising technology.
- Discussion of their evolution into identity verification,
particularly face verification and authentication, leveraging their
experience with real-world, non-hygienic data.
- Overview of their work with Meta, including account authentication
and fake celebrity ad detection.
- Explanation of the limitations of current heavyweight ID
verification and the "white space" for less invasive, yet
secure, solutions.
- Introduction of their "Passkey Plus" concept, combining passkeys
with biometric person binding for enhanced security.
- Discussion of "continuous verification" for ongoing identity
assurance, particularly relevant in the gig economy.
- Emphasis on their "personhood," "uniqueness," and "attributes"
verification capabilities, achieved without government IDs.
- Presentation of their defense layers against presentation attacks,
deep fakes, and device integrity issues.
- Details on their privacy-by-design architecture, including
client-side processing and immediate deletion of biometric images.
- Highlighting their commitment to responsible AI and demographic
fairness, with validated performance across diverse skin tones and ages.
- Discussion of a collaboration with the Cyros group for a
passkey-enabled digital identity wallet validating personhood and age.
- Exploration of collaboration opportunities with the CCG, including
standards alignment and a global human verification credential network.
Key Points:
- *Realize's Differentiators:* Their core strength lies in their ability
to handle real-world, imperfect conditions for biometric analysis, stemming
from their ad-tech background. They emphasize a privacy-first, client-side
processing approach, avoiding central biometric databases and deleting
images immediately.
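The embed-then-delete pattern described in this point can be sketched roughly as below. This is an illustrative sketch only: `extract_embedding` is a hypothetical stand-in for a real face-embedding model, and the 0.8 threshold is invented for illustration.

```python
# Illustrative sketch of embed-then-delete: compute an embedding, discard
# the raw image, and compare people only via embeddings.
import os
import hashlib
import numpy as np

def extract_embedding(image_bytes: bytes, dim: int = 128) -> np.ndarray:
    """Stand-in: derive a deterministic unit vector from the image bytes.
    A real system would run a face-recognition model here."""
    seed = int.from_bytes(hashlib.sha256(image_bytes).digest()[:4], "big")
    v = np.random.default_rng(seed).standard_normal(dim)
    return v / np.linalg.norm(v)

def enroll(image_path: str) -> np.ndarray:
    """Compute the embedding, then delete the raw image immediately."""
    with open(image_path, "rb") as f:
        emb = extract_embedding(f.read())
    os.remove(image_path)  # the biometric image is never retained
    return emb

def same_person(a: np.ndarray, b: np.ndarray, threshold: float = 0.8) -> bool:
    """Cosine similarity of unit vectors reduces to a dot product."""
    return float(a @ b) >= threshold
```

Under this construct, only the embedding survives enrollment; the raw image is gone by the time `enroll` returns.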
- *Passkey Plus:* This concept aims to address the limitations of
passkeys by adding person binding (verifying unique human identity) to
device binding, filling a critical gap.
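The "Passkey Plus" idea (device binding AND person binding) can be sketched as a conjunction of two checks. This is a minimal sketch under stated assumptions: the HMAC stands in for real WebAuthn assertion verification, and all function names are invented.

```python
# Illustrative sketch: a "Passkey Plus" login succeeds only when BOTH the
# device-bound assertion and the biometric person binding check out.
# HMAC is a stand-in for WebAuthn signature verification.
import hashlib
import hmac
import numpy as np

def verify_passkey(credential_key: bytes, challenge: bytes, assertion: bytes) -> bool:
    """Device binding: does the assertion prove possession of the key?"""
    expected = hmac.new(credential_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(assertion, expected)

def verify_person(enrolled: np.ndarray, live: np.ndarray, thr: float = 0.8) -> bool:
    """Person binding: does a fresh capture match the enrolled embedding?"""
    return float(enrolled @ live) >= thr

def passkey_plus_login(credential_key, challenge, assertion, enrolled, live) -> bool:
    """Both device binding AND person binding must hold."""
    return (verify_passkey(credential_key, challenge, assertion)
            and verify_person(enrolled, live))
```

A stolen device (valid passkey, wrong face) or a correct face on an unbound device would both fail under this conjunction, which is the gap the concept aims to fill.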
- *Continuous Verification:* Realize offers a solution for ongoing
identity assurance within a session, preventing account takeovers and
ensuring the same individual remains active, particularly useful in
high-security environments and gig economies.
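The continuous-verification loop described here can be sketched as random spot checks against a session baseline. The capture hook, sampling rate, and threshold below are all hypothetical.

```python
# Illustrative sketch of continuous verification: spot-check a stream of
# frame embeddings against the session's enrolled baseline at random
# intervals, keeping friction low between checks.
import random
import numpy as np

def continuous_verify(baseline: np.ndarray, frame_embeddings,
                      thr: float = 0.8, sample_rate: float = 0.3,
                      rng=None) -> list:
    """Return indices of sampled frames that no longer match the baseline."""
    rng = rng or random.Random()
    failures = []
    for i, emb in enumerate(frame_embeddings):
        if rng.random() > sample_rate:   # check only at random intervals
            continue                     # to keep the session frictionless
        if float(baseline @ emb) < thr:  # same cosine check as enrollment
            failures.append(i)           # possible takeover or bot swap
    return failures
```

Any flagged index would be a signal to escalate to a stronger check (e.g. a full liveness check) rather than an automatic lockout.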
- *Responsible AI and Fairness:* Realize highlights their commitment to
demographic fairness, evidenced by their performance on darker skin tones
and age verification, contrasting with the perceived "wild west" of AI.
- *Collaboration Interest:* Realize is keen to integrate with the W3C
ecosystem, contribute to standards (human verification credential schema,
confidence method specification), and potentially build a global human
verification credential network.
- *Zero-Knowledge Proofs:* Realize is actively exploring ZKPs for
client-side biometric matching and ZK pseudonyms for credentials, aligning
with the community's direction.
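The ZK pseudonym property mentioned here can be illustrated, though NOT implemented, with a hash: one holder secret yields a stable pseudonym per verifier, and pseudonyms at different verifiers are unlinkable without the secret. A real ZK pseudonym additionally proves, in zero knowledge, that the pseudonym was derived from a credentialed secret; the sketch below omits that entirely.

```python
# NOT a zero-knowledge construction: a hash-based stand-in that only
# illustrates the per-verifier pseudonym property discussed above.
import hashlib
import hmac

def pseudonym(holder_secret: bytes, verifier_id: str) -> str:
    """Stable per verifier; unlinkable across verifiers without the secret."""
    return hmac.new(holder_secret, verifier_id.encode(), hashlib.sha256).hexdigest()
```
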
- *Challenges and Future Work:* Key areas of ongoing development and
open questions include minimizing friction in passkey creation, optimizing
credential refresh intervals, privacy-preserving revocation, and
cross-community uniqueness measurement.
- *Revocation:* Current revocation methods involve tenanted collections
of embeddings tied to specific customer use cases, with rules for data
retention and siloed storage.
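The tenanted-collection model described here can be sketched as per-customer silos: uniqueness is checked only within a silo, and revocation deletes locally. Class and method names below are invented for illustration.

```python
# Illustrative sketch of tenanted embedding collections: each customer's
# embeddings live in a separate silo, so there is no single global index.
import numpy as np

class TenantedEmbeddingStore:
    def __init__(self, threshold: float = 0.8):
        self.silos = {}            # tenant -> {user_id: embedding}
        self.threshold = threshold

    def enroll(self, tenant: str, user_id: str, emb: np.ndarray) -> bool:
        """Reject if a matching face already exists in this tenant's silo."""
        silo = self.silos.setdefault(tenant, {})
        if any(float(emb @ e) >= self.threshold for e in silo.values()):
            return False           # duplicate within this community only
        silo[user_id] = emb
        return True

    def revoke(self, tenant: str, user_id: str) -> None:
        """Revocation is siloed: other tenants' collections are unaffected."""
        self.silos.get(tenant, {}).pop(user_id, None)
```

Scoping the dedupe check to one tenant is what avoids the "honeypot" of a single global biometric index, at the cost of not detecting duplicates across communities.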
Text:
https://meet.w3c-ccg.org/archives/w3c-ccg-ccg-atlantic-weekly-2026-02-03.md
Video:
https://meet.w3c-ccg.org/archives/w3c-ccg-ccg-atlantic-weekly-2026-02-03.mp4
*CCG Atlantic Weekly - 2026/02/03 11:54 EST - Transcript* *Attendees*
Benjamin Young, Dave Lehn, Dmitri Zagidulin, Elaine Wooton, Erica Connell,
Geun-Hyung Kim, Gregory Natran, Harrison Tang, JeffO - HumanOS, Jennie
Meier, Joe Andrieu, Kaliya Identity Woman, Kayode Ezike, Leo Sorokin,
Mahmoud Alkhraishi, Manu Sporny, Parth Bhatt, Phillip Long, Rob Padula,
Scott Jones, Ted Thibodeau Jr, Will Abramson
*Transcript*
Harrison Tang: Hey Scott, nice to see you again.
Scott Jones: Hey there, Harrison. Great to see you, too.
Harrison Tang: Yeah, thanks for jumping on and spending the time. I think
either Mahmoud or Will will actually host this meeting later. Yeah.
Mahmoud Alkhraishi: Hello. We're just going to get started in two minutes.
Mahmoud Alkhraishi: Okay, let's get going. Thank you everyone for joining
us today. It is Tuesday, February 3rd for our regularly scheduled CCG call.
As a quick reminder, please make sure that you read and adhere to our code
of ethics and professional conduct. An IP note: anyone making substantive
contributions to the CCG must have signed the contributor agreement. Also,
if you're not a member of the CCG, please join; we're going to put a link
to that in chat. Before we go on to today's call, does anyone have any
announcements or any community updates they'd like to provide?
Manu Sporny: Hey Mahmoud. Thanks. I sent an email out to the mailing list on
this, but there is the vote for the new verifiable credentials working
group charter. I'll put the link in the chat channel for that. That went
out at the beginning, I guess, maybe a week ago. The vote is open until the
end of this month, but the sooner you get your vote in, the better
understanding we have of where we are on the vote. So please poke the W3C
member companies to go ahead and vote on the charter. This includes seven
new specifications that the group would like to work on. So
that's item one.
00:05:00
Manu Sporny: the second item is that as folks know we incubated a
specification called the verifiable credential render method in this
community that was handed off to the W3C verifiable credential working
group a while ago. Work has continued on that work item there. We meet
every other Wednesday to kind of push that work forward as an official VCWG
work item.
Manu Sporny: there is a render method that's probably of interest to
everyone. It's an HTML-based render method. So, it's a sandboxed HTML
render method that does advanced rendering of, things that have complex
layouts, like education certificates, transcripts, vital records, supply
chain documents, things like that. So, please take a look at that. I think
we've finished the first draft of that and we want to make sure it works
for the whole community. And that's it. Back over to you, Mahmoud.
Mahmoud Alkhraishi: Thank you, and please make sure you go and vote on that
charter. Does anyone else have any announcements they'd like to make? All
right.
Mahmoud Alkhraishi: Hearing none. go ahead.
Phillip Long: I want to, quickly.
Phillip Long: Yeah, just for folks in the US, there is a credential summit
next month in Philadelphia sponsored by One in Tech. It's the 17th through
the 21st, I believe. The link is in the chat.
Mahmoud Alkhraishi: Thank you, Phil. Does anyone else have any other
announcements they'd like to make? All right. Scott is here joining Scott
Jones, thank you for doing that. Would you mind doing a quick intro of
yourself and then walking us through your slides?
Scott Jones: Yeah. Hi everyone. Apologies, getting Google to cooperate with
me and it's not cooperating. There we go. Hang on. One more try here. Cool.
Hi everyone. I am Scott Jones. I'm VP of product with Realize; we are a
computer vision company. We were founded in 2007. So whenever I say that, I
always make the joke that we kind of feel like we've been waiting for this
present-day AI renaissance to catch up for quite a while.
Scott Jones: the genesis of why I'm here meeting all of you and excited to
be here. Thank you for having me. I've been meeting members of the SSI and
decentralized identity communities; it hasn't quite been a year yet, but it
will be coming up on a year next quarter. And I'm excited to share what
we've been building at Realize and explore how we might collaborate with
members of the W3C community. We come from an unusual background and I'm
excited to tell you more about that. We did not start in identity. We
actually started in advertising technology, but it gave us some really
interesting perspectives on problems relevant to identity challenges at
large, but also a perspective on what this community is working on.
Scott Jones: so excited to walk you through some materials, kind of like a
Rorschach test, not exactly ink blots, but concepts, and get your reaction
on it. So, our origin story, as I said, we did not come out of identity. we
actually got started with proprietary models for measuring attention and
emotion off of webcams and in the context of advertising. So in effect
helping the world's largest brands and agencies derisk their media spend by
testing channel specific variations of ads with a target audience. that
genesis that origin story actually gave us some really interesting
advantages as we shifted into identity. I'll tell you a little bit more
about that shift but the opportunities it afforded us or the advantages
relate to the real world conditions of online market research. it's not
hygienic.
Scott Jones: It's not predictable. You'll find people are taking surveys
all over the place, from the toilet, honestly, from their car, from
situations where you can barely see them, weird angles, etc. They're not
necessarily going to hold up their phone and give you a pristine image. so
you have to deal with those conditions. And by virtue of that, all of our
models that we were developing going back not quite 20 years at this point,
but let's say 10 to 15 years came out of that environment. So built in was
this notion of handling occlusions. You can't even see someone. Their face
is blocked. They're not giving you a pristine image. the lighting is not
great, etc. And that afforded us this really interesting differentiation as
we got into the identity category. And that story started three years ago.
we already had a relationship with Meta at that point. we were helping them
build avatars. It's a service we call AI data collection.
00:10:00
Scott Jones: because we are so effective at building face models and
ethically sourcing and building the whole pipelines that customers hire us
to help them. And in that case we were helping Meta build avatars. I had
just joined the company at this point, so it was a little more than three
years ago, and we were focused still on attention and emotion.
But at that point very beginning of 2023 we had learned from Meta that they
were looking for a face verification model. we threw our hat in the ring of
their competition and ended up winning by March of 23. They selected us as
their winner, but it wasn't quite that easy. It took a year and a half to
get into production. and the way they used us starting from December of
2024 and then onward, they've been aggressively rolling us out. The way
they used us gave us all this information on how our technology can be
valuable and differentiated in the context of identity. So, they've been
using us.
Scott Jones: The first use case was fake celebrity ads and fake celebrity
profiles, using face verification to detect those accounts or those ads.
But then the lion's share since then has shifted into account
authentication. So if a user can't remember their password or they think
there's been an account takeover, the user now has an option to submit a
video selfie. Meta's pipeline extracts frames from that selfie, plus known
profile photos, and, interestingly, if a government ID has been submitted
in the past, they will capture the image from that government ID and send
all of that to our model. And I'll pick on Harrison because I saw him
earlier on the call. The idea: let's imagine Harrison went through that
selfie experience trying to regain access to his account.
Scott Jones: known profile images of Harrison, frames from the selfie he
just created, and, if they have it, an image from his government ID would
all be sent to our model to essentially ask: do you think this is Harrison?
And the differentiation we found and the impact from this is what got us
into the identity space. We were finding this opportunity and then really
capitalizing on the problems we found that we solve, and we've quickly
grown. Since they went live just a bit more than a year ago, and with the
expansion we've had with other customers as well, we've quickly grown to a
global scale. So last year we did 125 verification calls to our system in
total at the end of the year, and this month we're averaging more than a
billion calls per day, and billions of that is just Meta alone.
Scott Jones: another part of our origin story, I'm in Chapel Hill, North
Carolina. I'm freezing right now. I've got my Harry Potter gloves on, but
this company is largely European. I'm kind of like the lone wolf in the US.
we are headquartered in London and actually Estonia. Our tech team is in
Budapest. We are GDPR native. It's built into everything we do. And we
structure ourselves per the spirit of GDPR as data processors. getting into
this identity space, the realization we had, it really came out of our ad
testing business where that business gave us a front row seat to the steady
rise of fraud online. So click farms, people lying about who they are. Huge
problem on the internet at large, obviously, as all of you know, but in the
market research space, it's truly insane. You can actually expect on the
buy side,
40% of what you get on average will be bad, and you can't tell until you
buy it.
Scott Jones: double clicking on that industry and then expanding from
there, it gave us the realization really expressed on this chart. a
heavyweight ID verification check is really the best thing you can do on
the internet still today, predicated on a liveness check and a selfie to
ascertain: is this a real human being? Is that a real government ID? And do
I think this person is that person on that ID? Great for high-stakes use
cases, but way too expensive for a lot of what we do.
Scott Jones: So it tends to cost a dollar or more. it tends to result in a
lot of churn. It's very invasive. It takes multiple minutes. and a lot of
users don't want to do it unless there's a value exchange there. so the
idea is saying okay if that's the most I can do if I can draw a spectrum
here that's the most I can do. What's less than that that still works? And
the realization again that came from market research but has been validated
across a variety of verticals is there's really not much you can rely on.
People are still using CAPTCHAs. They're using bot detection, device and
network fingerprinting. If AI is not already defeating it, humans can
literally learn on YouTube, on courses available, how to usurp those
systems. SMS and email authentication are still hooks that people are
hanging their hat on, but those are too easily usurped too. It's very
well known how to get past those systems.
00:15:00
Scott Jones: so what we saw was this very interesting white space at the
intersection of being very easy to use but also giving a good level of
security, and what we designed does multiple things at once. So it keeps
out bots just like a captcha while also being extremely easy to use and
frictionless. But then it also validates uniqueness and demographics, and
once you go through it once, you can be seamlessly reauthenticated, just
like what Meta is doing at hyperscale now. So then the opportunity, as
again I said, I was meeting and networking into DID and SSI communities
from last summer onward.
Scott Jones: the opportunity we found related to this community is the
holder binding, and the idea that you're solving credential issuance and
verification very elegantly, but the notion of: how can you actually verify
that a holder is truly human, really unique, and actually there in the
moment when you're trying to get these attestations? It's been known
that biometrics are a way to do this that scales but there's different
approaches that have been taken including things like specialized hardware
so we've taken a focus this kind of comes out of our origin story of humans
and human understanding going back to ad testing starting in the early
teens of this century and the idea of really focusing on that third
question about humanity. Are you unique? And are you really present? And
that's where we really doubled down with our opinion and approach and point
of view. just to look at a very popular example right now on how to address
this problem.
Scott Jones: I watched their presentation to this group May of last year I
believe. World ID is one that everyone's looking at. So you all know their
story. founded about six years ago, truly launched two to three years ago.
designed as an anonymous open digital identity infrastructure. they use
bespoke hardware that is tied to the iris, sophisticated anti-spoofing, and
more. But the challenge is the adoption of that. So in these years, I
believe the latest count was 15 million verified humans and 1,500 of these
orbs distributed globally, six of those in the US. So the realization is
that hardware distribution is really the challenge of what
they're trying to achieve here and they've had that realization too I
believe with their shift towards more what they can do on the mobile side
with mobile verifications.
Scott Jones: Our approach is coming from a different direction. Again, this
genesis in adtech where and ad testing where I could not mandate what
device a user's on. we're compatible with any device that can run a camera,
any resolution. and I had to be flexible like that with the technology we
built. we cannot require pristine conditions. I can't tell this user, hey,
can you please lift your camera up to your face? I can barely see you. We
had to deal with these real world conditions where the user is kind of
left to their own volition how they want to deal with you in these moments,
and you can't require them to behave in the way that you want. And that
all led us into this path we're on now where we're going into client side
processing, which I think is the future of all of this, where models can
run locally on the client side with ZKP to allow people to attest who they
are without
revealing who exactly they are.
Scott Jones: not storing biometric images ever. By policy, we generate
embeddings and delete the images. We're using those embeddings only for
the notion of uniqueness. And I know there's been philosophy on: how
irreversible are they truly, if you had a quantum computer and God himself
backing the project? But the idea is that under normal conditions, these
are irreversible. And then adding to that the notion of zero knowledge
proof generation for attributes. So the framing we've come out of this
with, which has been really interesting and something we're bringing to
market too, is this notion of passkey for personhood, and how this
technology can actually uplevel the promise of passkeys and the traction
they've gotten but fill some critical gaps. So that's one thing to talk
about first. This is part of the framework we've been developing. We're
calling it Passkeys Plus; again, this notion that passkeys solve a lot of
problems.
Scott Jones: they've transformed authentication, but they solve device
binding; they don't solve person binding. I think that's a critical way to
frame it. So what they do solve: very difficult to potentially impossible,
phishing-resistant authentication; there's no password to remember or leak
that can be compromised; cryptographic proof; a seamless ability to sync
across devices. But what they don't tell you: who is creating this account?
Is this human unique in my system? How many devices do they have passkeys
on, for example? Has control of this account changed hands, and what can
the user do if the account's lost? Because typically passkeys are tied
directly to a device and you hit a dead end there. So this idea of putting
biometric person binding with passkey authentication is the opportunity
that we found that we're working on right now.
00:20:00
Scott Jones: And then another one that's been really interesting is not
just doing upfront verification, but something we're calling continuous
verification. And this is live now in the context of gig economies. The
opportunity with upfront verification, it makes a lot of sense. And this
could be that high stakes moment where I want to check your government ID.
I want to pay a dollar. I want to make sure this is really to pick on him
again, Harrison. I want to make sure it's really Harrison right now. and
that typical use cases would be account creation, high-value transactions,
age-gated content, credential issuance, those key moments where you want to
raise the bar, but then the idea of great I've validated it's the right
person at the start of the session or day zero or I've gone deep on that.
Now they're in the session. what can I do to still ascertain it's still the
right person?
Scott Jones: And what we've kind of pioneered here is this notion of
continuous verification where we can run seamlessly in the background. You
don't have to do a full liveness check; that already happened getting the
user into the session. Now you can imagine kind of the flashing light of a
camera briefly coming on at random intervals to say is a human still there?
They didn't replace themselves with a bot. Is it still the right human? And
they didn't bring in someone else to take over for them. and we're live
right now in the gig economy. The context there is an AI annotation
platform where they're kind of incentivized. They get paid great hourly
rates to give feedback to models and there's a fraud incentive. what if I
could get someone else to join me and do this work with me? so the idea
that it defeats account takeovers. Make sure it's the same person in the
right session. and is great for high security environments where you need
that continuity of control to be verified.
Scott Jones: So what we verify we really distill it down into this notion
of personhood or humanness. Is that a real actual human being in front of
the camera? It's not a mask. It's not an image. It's not a deep fake.
secondly, uniqueness. Is this the same person that it was before? or is
this actually a unique person on a new account or do they already have
another account pretending to be or masquerading as someone else? And then
the notion of attributes. We started with demographics on age and gender
using models that we've developed. there's more in the pipeline we've been
asked about, but this is just generally the idea of what more
dimensionality you can provide without having to see a government ID. So
that a classic example I use would be in the context of a dating app. yes,
we can be validated as a human with my technology. I'm unique. I don't
already have an account, but I'm claiming to be a 20-year-old woman. I'm
trying to catfish people. Our technology will allow you to know, yeah, it's
a real human.
Scott Jones: They don't have another account right now, but they're not a
20-year-old woman. And depending on what you're trying to solve for, you
could
use that intelligence again without having to see a government ID. So, the
threat landscape, I've seen u multiple presentations to this group. I know
these are not new concepts to you. You all are well aware. presentation
attacks, what can happen in the moment when somebody is being presented on
photos, videos, masks, deep fakes, even digital injection attacks. Can I
even bypass that camera entirely? deep fakes and face swaps. and kind of
the leading edge of all the technologies that's available now for that. And
then device integrity signals. is that a virtual camera? is the feed being
manipulated? Is that a cracked OS entirely? what is the state of the device
where all of this is happening? And then at the defense layers we're
working with.
Scott Jones: So, a server-side deep fake and liveness detection pipeline,
where we're currently server side but moving to the client side, and then
a breadth of options. We call this kind of progressive verification, where
you have options on what you want to do for those high-risk moments, or a
lot of suspicion, or when risk or signals are telling you this is a moment
that matters. You can raise the bar entirely. In the middle is a less
active liveness check that's very seamless to do. And then the lowest
level of it is more of that continuous verification I mentioned where
you've already gone past the riskiest moments and now you just want to make
sure are they still there. privacy by design is part of our architecture.
What I'm showing now is a client side vision of what we're working on
that's in development right now.
Scott Jones: So from the user device capturing an image off the camera the
model running locally to process to the server the embedding can be
transmitted and then a credential can be issued with ZKP on it. the image
itself in this construct never leaves the device. it's an irreversible
transformation. Again, assuming quantum computers and massive projects are
not going against this given everything we know about kind of the
state-of-the-art right now. And then selective disclosure, it affords the
user to share, for example, that they are 18 years old or greater without
revealing their exact age. This is just an example of a deployment we're
working on. We're live right now in the cloud, air-gapped, on-prem.
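The selective-disclosure idea Scott describes, sharing "18 or older" without revealing an exact age, can be sketched with per-claim signatures. This is a simplified stand-in, not his system's actual scheme: HMAC replaces a real issuer signature, and all names are hypothetical.

```python
# Illustrative sketch: per-claim signatures let a holder disclose one claim
# (e.g. age_over_18) without revealing others (e.g. exact age).
# HMAC stands in for a real issuer signature scheme.
import hashlib
import hmac
import json

def _sign(issuer_key: bytes, name: str, value) -> str:
    payload = json.dumps([name, value], sort_keys=True).encode()
    return hmac.new(issuer_key, payload, hashlib.sha256).hexdigest()

def issue(issuer_key: bytes, claims: dict) -> dict:
    """Issuer signs each claim independently; holder keeps the full set."""
    return {n: {"value": v, "sig": _sign(issuer_key, n, v)}
            for n, v in claims.items()}

def present(credential: dict, claim: str) -> dict:
    """Holder reveals only the requested claim and its signature."""
    return {claim: credential[claim]}

def verify(issuer_key: bytes, presentation: dict) -> bool:
    """Verifier checks signatures without seeing undisclosed claims."""
    return all(hmac.compare_digest(e["sig"], _sign(issuer_key, n, e["value"]))
               for n, e in presentation.items())
```

A production scheme would use public-key signatures (so verifiers don't hold the issuer secret) and address linkability, which is where the ZKP work mentioned later comes in.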
00:25:00
Scott Jones: and in those cloud deployments the construct is images are
processed and then immediately deleted and never stored. And the idea being
that we don't need a central single biometric database to do this but
rather we're taking it from the angle of community-scoped uniqueness. So
Meta, for example; the users of communities inside the W3C's sort of
bounds; open source developers as a community. You can imagine these
different sort
of cohorts where you can create these collections and scope the embeddings
to that. So you don't need a single global index as a sort of honeypot.
another big differentiation we've had coming out of our evaluation with
meta and then others is responsible AI in general but specifically here is
about demographic fairness.
Scott Jones: So coming out of our ad testing business we've been validated
as having the largest in the wild collection of faces. It covers several
million identities from more than 93 countries. But most importantly we
have GDPR compliant consent giving us the rights to test and train models
on those faces. And what we've learned is the majority of face models on
the market are using publicly available data sets to test and train. Those
data
sets index largely on lighter skin tones. It makes the accuracy numbers
look great when they're presented. But unfortunately, in those real world,
in the wild conditions, their models are perceived as being unfair and not
robust. And on the unfairness side, that's led to several years now of bad
PR about racist models.
Scott Jones: so what I'm showing now is an extract from a fairness
evaluation we did last year. But part of our journey to production with
Meta was they actually challenged us with their responsible AI board's new
demographic fairness testing protocol. that was one of the hoops we had to
go through over an 18-month journey. And in that test, we showed fairness
across darker skin tones. It's actually never been demonstrated publicly
before. or nothing like that's been seen or demonstrated with public
visibility. So we're very much disrupting kind of the status quo of the
perceived wild west of AI, with stolen data, unfair models, etc. We are
consciously the antithesis of that and take it very seriously. on the age
side, after getting into face verification with meta, they asked us to get
into age verification and deep fake detection.
Scott Jones: we came a long way very quickly on age and this is all part of
the kit we offer. We're already more accurate overall than the industry
incumbent for computer vision age estimation; that's what you're seeing
here, these are mean absolute errors across these age bands. And what
we've also found is, for the bands of users 13 to 17, we're already more
accurate than humans in estimating those. We're continuing to work in this
space as well on age verification, as it's that added element of the
attributes of the users that we can provide without needing to see a
government ID or reveal who people are. Also exciting to note, we have
active work underway, just to kind of
paint a picture of the possibilities. So with the Cyros group and their WW
wallet we met them last quarter and we are actively building together.
Scott Jones: the premise is a passkey-enabled digital identity wallet that
can validate personhood and age, and the flow we're working on is: a
verification using our technology, a liveness check and a uniqueness check
with the camera; credential issuance of a human verification credential.
That credential is stored in the wallet with a passkey binding, and then
the user can present it at any participating relying party. And the notion
therein: the user verifies once, the credential is stored in the wallet,
and you can present that proof to any service that wants it. WW wallet
was originally, before we met them, looking to integrate a captcha check
to do this into their wallet, but we were able to uplevel it with the
humanity
check but there's a lot of open questions. It's not that easy. So here's
the items we're working on.
00:30:00
Scott Jones: So the notion of passkey creation friction is the first part.
this does require a user interaction. We are all about seamless experiences
for the user. So the question is how do you minimize friction while
maintaining security and user sovereignty. We don't want things just
happening in the background. We want the user to know but we don't want
them to have to be tortured to participate in this. From there what's the
optimal interval for refreshing these credentials? how often should
humanness be reverified? Is it context dependent?
Scott Jones: Is it fixed? what's the heristic on that? privacy proverb
preserving revocation of these credentials. how can you revoke them without
creating a trackable event? and then the notion of cross community
uniqueness. So when is it valuable and imperative to measure uniqueness
across communities? we're also currently exploring zero knowledge proofs
with this and using ZK pseudonyms as the first scheme we're working on. and
the idea that these are quite challenging problems as I'm sure everyone on
this call So, we're sharing what we're working on rather than just say
being real handwavy and saying, "Hey, we figured it all out." very much
underway and very exciting. So, the idea of how does that work? what is my
perception of how it fits into the verified credential model? The idea that
the verify service is an issuer.
Scott Jones: it issues this credential once the user verifies from their
camera and they attain that human verification credential. From there the
holder is the user and the wallet, and they can then present this proof to
the verifying party, which can confirm their humanness or their age
without ever accessing the biometric data itself. And then again, at the
bottom, this idea of what it all handles: that humanity proof, uniqueness,
attesting your age, and then reauthentication, so you can be seamlessly
reauthenticated over and over and over. We're currently operating in these
spaces, social media... I don't have to read them all, but you can tell
there's been a breadth of where we've seen these same challenges, and the
opportunities where the
existing
Scott Jones: kind of authentication tools are obsolete, a little too
heavy-handed. Or maybe "a little too" is a lightweight way of saying it:
they can be way too heavy and way too expensive, depending on what you're
trying to solve for. So we're finding validation on that across a variety
of verticals. So, thinking about the collaboration opportunities to offer
this group for consideration: we'd love to contribute to this ecosystem and
are working steadfastly on trying to position ourselves to do so. Standards
alignment is one thought: how could we integrate with the standards managed
by this group to ensure we're interoperable with the broader ecosystem?
Another is a human verification credential schema: is there a standardized
way to express proof of humanness, uniqueness, and liveness?
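One way to picture the schema question above is a credential payload loosely following the W3C Verifiable Credentials data model. The sketch below is purely illustrative: the `HumanVerificationCredential` type and the `humanness`, `uniqueness`, and `ageOver` claim names are hypothetical, not an agreed or proposed schema, and the DIDs are placeholders.

```python
# Hypothetical shape for a "human verification" credential, loosely
# following the W3C VC data model. The credential type and the claim
# names under credentialSubject are illustrative assumptions only.
human_verification_credential = {
    "@context": ["https://www.w3.org/ns/credentials/v2"],
    "type": ["VerifiableCredential", "HumanVerificationCredential"],
    "issuer": "did:example:verify-service",   # the verify service as issuer
    "validFrom": "2026-02-03T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:holder",
        "humanness": True,    # passed a liveness / personhood check
        "uniqueness": True,   # unique within some stated scope
        "ageOver": 18,        # attribute claim made without a government ID
    },
}

assert "VerifiableCredential" in human_verification_credential["type"]
```

The point of a shared schema would be that any verifier could consume these three claims without caring which vendor performed the underlying biometric check.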
Scott Jones: Aligning as well on the confidence method specification. And
then something else we're working on is a global human verification
credential network, just given the rapid scale at which we've been able to
deploy already. We're looking for founding partners to build the
infrastructure. Imagine a much more scaled version of the WW wallet
opportunity, where human verification credentials can be stored and
presented across services. So, in summary, the premise of what we're
working on: we've already reached very large scale very quickly.
Scott Jones: We don't require specialized hardware, truly anything that can
run a camera; we're currently live on devices as old as 2010 or older.
Privacy-first by design: processing is client-side, with no biometric
storage at all on the client side, and for cloud-enabled integrations we
don't store images. We delete them immediately. There's the robustness and
fairness that's differentiated, and then we see ourselves as being
credential-ready for integrations into ecosystems like this. I wasn't quite
sounding like an auctioneer, but I think that was fast. I know I'm normally
a fast speaker, but we've reached the end.
Mahmoud Alkhraishi: Thank you so much.
Scott Jones: I now see chats here.
Mahmoud Alkhraishi: Let me walk you through the questions, then. The first
one is from Ted: can we get a link to the stack?
Scott Jones: Certainly. What was the first one you said?
Mahmoud Alkhraishi: Awesome. If you could just share with the chairs or
share directly with the community group, that would be wonderful. You can
do either.
Scott Jones: Share it with... Cool. Yep.
Mahmoud Alkhraishi: So it's myself, Will, or Deng. And yeah, the second
question is: what's the data model that you're currently using for your
identity credential?
Scott Jones: I will have to come back to you on that.
Mahmoud Alkhraishi: What's the format?
00:35:00
Scott Jones: I'm not sure I can answer that. I will sync with our tech team.
Mahmoud Alkhraishi: Manu?
Manu Sporny: Yeah, thanks. Hi Scott, wonderful to meet you. Great
presentation. I'm one of the editors for the Verifiable Credentials
specification and the Decentralized Identifiers specification, and we also
work in the retail sector doing age verification and things of that nature.
So I'm really, really interested in the stuff you were talking about. I
think the key differentiator you went through was the privacy-preserving
approach. I think one of the biggest challenges with biometrics is that
when these systems are used today, and you can take any of your
competitors, usually you open a video stream and you send that video stream
to who knows where, right?
Manu Sporny: And you have no idea if the third party is storing it or if
they're training on it. There are all these things which are really
stopping people from using the technology in places where it could be used.
So, I think the thing you talked about that is really aligned with this
community is the zero-knowledge-based, client-side biometric matching. If
we can figure out how to do that, and educate people that, hey, all of this
is happening on your device, your image isn't going somewhere else, there's
no video feed going somewhere else, I think that is really compelling and
is something that a number of us in the community have been arguing for for
a long time. So that's great. I noticed that you said confidence method is
a place you could hook in. Yes, absolutely.
Manu Sporny: Joe, I think, is on the call today, and he's leading that
work. He might have dropped off, but that would be a great place for you
guys to get slotted in. It makes perfect sense for you to provide a
mechanism there. There's also some stuff in the verifiable credential
evidence field: how did you find out that this person was a human? What are
the mechanisms there?
Scott Jones: Yeah.
Manu Sporny: I think the template format, your embeddings format, would be
a really great thing to work on. And I will note that this is actively
being standardized right now, so the time to engage is now, because in six
months we're probably going to close up what we can do in version 10.0. And
it'd be great if some of this technology you're talking about, the privacy-
preserving stuff, was there. So I guess the first question to you is: are
you willing to contribute some of this technology to a global open
standard, and do you realize that that means you also have to provide the
patent stuff in your binary format to the global standard? Or is that not
on the table?
Manu Sporny: And then I'm really interested in the zero-knowledge stuff
that this community is doing around BBS. So, not the Longfellow stuff, but
something more lightweight. So I'm interested in figuring out if there's a
collaboration there. But I think the question is, you're talking about
interacting with the community. That's great. One of the things is that
you,…
Manu Sporny: when you join this community and make it a part of a global
standard, there is a certain amount of IP that you have to release for
usage in the standard. And have you guys had that discussion yet?
Scott Jones: We have and…
Scott Jones: I don't have a crisp answer on what it would be, but my
understanding is we can be particular about, what are the right words to
use, I was going to use the word "exposed": what sort of IP is exposed
versus what can still remain behind the covers, if that makes sense. So
that was the lens we were looking through. We've not figured that out yet,
but we'd love to negotiate on it.
Mahmoud Alkhraishi: Are there any other questions? I had one personally.
You mentioned that you're doing revocation today. How does that work?
Mahmoud Alkhraishi: What are you doing to revoke identities? Is it all just
on a centralized system, or is there anything beyond that?
Scott Jones: Generally, the idea is that we have tenanted collections tied
to…
Scott Jones: however customers are using it. Those collections are
embeddings and a face ID, and no images are stored. Those embeddings, those
collections, have rules around how long we'd even retain them, depending on
the use case and what the customer is trying to solve for. So that's
generally the idea. The images are used to create the embeddings, and the
embeddings themselves are encapsulated in storage as vector storage. God,
an engineer would say that better: a vector search database.
00:40:00
Scott Jones: And they are tenanted, so it's not like a giant repository,
but rather siloed.
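The tenanted-collection idea described here can be sketched as a small data structure: embeddings keyed by a face ID, no images stored, siloed per customer, with per-tenant retention rules and explicit revocation. This is a toy illustration of the concepts, not an actual Realize API; the class name, method names, and default retention are all assumptions.

```python
import time
from collections import defaultdict

class TenantedEmbeddingStore:
    """Toy per-tenant embedding store with retention rules (illustrative only)."""

    def __init__(self):
        # tenant -> {face_id: (embedding, expires_at)}; each tenant is siloed
        self._silos = defaultdict(dict)
        self._retention = {}  # tenant -> retention period in seconds

    def set_retention(self, tenant: str, seconds: float) -> None:
        self._retention[tenant] = seconds

    def enroll(self, tenant: str, face_id: str, embedding: list) -> None:
        # Only the embedding vector is kept; the source image is never stored.
        ttl = self._retention.get(tenant, 3600.0)
        self._silos[tenant][face_id] = (embedding, time.time() + ttl)

    def revoke(self, tenant: str, face_id: str) -> None:
        # Revocation within one tenant's silo; other tenants are unaffected.
        self._silos[tenant].pop(face_id, None)

    def purge_expired(self) -> None:
        now = time.time()
        for silo in self._silos.values():
            for fid in [f for f, (_, exp) in silo.items() if exp <= now]:
                del silo[fid]

store = TenantedEmbeddingStore()
store.set_retention("customer-a", 60.0)           # use-case-specific retention
store.enroll("customer-a", "face-123", [0.1, 0.2, 0.3])
store.revoke("customer-a", "face-123")            # revocation removes the embedding
assert "face-123" not in store._silos["customer-a"]
```

A production system would put the embeddings behind a vector search database, as Scott mentions, but the siloing and retention logic would look conceptually similar.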
Mahmoud Alkhraishi: Thank you. does anyone else have any questions? Manu.
Manu Sporny: Yeah, I guess I've got a few, and these might be questions for
your engineers, but what I'm presuming about the way your system works is:
somebody would go and enroll with your technology. It doesn't have to be at
a centralized site; you could have it at many different ones. But the end
result of that enrollment process is going to be a set of embeddings that
you then issue as a verifiable credential that the person puts in their
digital wallet. Which basically means, I think, the credential that you're
issuing, and it might be bound to something else, like an employee ID card
or something like that,
Manu Sporny: that thing is given wholly to the individual and they get to
put it where they want and they can take it wherever they want when they
show up to some kind of ric gate authentication gate either online or in
the real world what they're going to do is they're going to present that
credential and then the embedding data from let's say the confidence method
goes over to the entity trying to verify actually no sorry what I would
expect to happen is those embeddings are used in the wallet or…
Scott Jones: All right. Nope.
Manu Sporny: or they're not given to the verifier. So there's some kind of
secure process that is run in the wallet, or in some other system that the
individual trusts and has control over. There's a zero-knowledge proof
that's produced, and then that zero-knowledge proof is the thing that's
actually sent to the verifier, where the verifier just gets something
saying, effectively: yeah, I checked this person's driver's license photo
against a liveness check and it came out correct, and here's your proof.
You don't need the biometric data. You don't need the facial template. You
don't need any of that stuff. A trusted machine running in their wallet, or
elsewhere, that they trust has produced this proof, and that's all you
need. Is that correct? Is that okay?
Scott Jones: This is my understanding. Yep. Yep.
Manu Sporny: And that is absolutely the model that we want to go towards
for biometric stuff. None of this... I know you said that sometimes you
also take video streams for deeper verification, but for some of this
light-touch verification, like age verification, you really don't need to
be that invasive with the data streams.
Scott Jones: Right. In the context of a wallet,…
Manu Sporny: Okay, good to know.
Mahmoud Alkhraishi: Is the process that Manu described live today, or…
Mahmoud Alkhraishi: is that something that you guys are building with the
Cirrus Foundation?
Scott Jones: It's being built right now, but generally the idea of a system
that takes images and validates liveness, uniqueness, and demographics,
that's live and available. Thank you for having me.
Mahmoud Alkhraishi: Does anyone else have anything they would like to ask?
Scott, thank you for your time. Thank you so much, everybody, for showing
up today. That's going to conclude our call. Have a great rest of your
week. Thank you.
Scott Jones: Wow, explosive clapping. I haven't seen that before.
Harrison Tang: Thanks.
Scott Jones: Have a great rest of your day. Bye.
Meeting ended after 00:44:24 👋
*This editable transcript was computer generated and might contain errors.
People can also change the text after it was created.*
Received on Tuesday, 3 February 2026 23:57:28 UTC