
Re: Solutions to the NASCAR problem?

From: <henry.story@bblfish.net>
Date: Tue, 24 Nov 2015 19:04:37 +0000
Cc: Dave Longley <dlongley@digitalbazaar.com>, Steven Rowat <steven_rowat@sunshine.net>, W3C Credentials Community Group <public-credentials@w3.org>, public-webid <public-webid@w3.org>
Message-Id: <911C4FE2-6877-4416-988B-D4ED6992A113@bblfish.net>
To: Chadwick David <d.w.chadwick@kent.ac.uk>

> On 23 Nov 2015, at 16:52, David Chadwick <d.w.chadwick@kent.ac.uk> wrote:
> Hi Henry
> replace authentication with authorisation then what you say is correct.
> All your 4 credentials are authz ones

This is how I see things from my experience programming these systems.


Authentication is a process of verifying the authorship of some statements:
  a) that the statement is about the user presenting it
  b) that the statement was really made by some other agent
For example:

1. With HTTP Signatures, a nice minimal IETF draft which I have implemented
on both the client and the server ( https://github.com/solid/solid-spec/issues/52 ),
the statement made by the client is that it is in possession of the private
key corresponding to the public key identified by a URI known to the
server. My server checks whether that URI can be dereferenced to fetch the key,
allowing a key to be used cross-origin. But this would also work for
single-origin keys.
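The signing and verification steps can be sketched as follows. This is a
minimal illustration using the JDK crypto APIs, not the actual solid-spec
implementation; the header names and the in-memory keypair are simplifying
assumptions (in reality the public key is fetched from the keyId URI):

```scala
import java.security.{KeyPairGenerator, Signature}
import java.util.Base64

object HttpSigSketch {
  // Build the signing string from the selected headers, one "name: value"
  // line per header with lowercased names, as in the HTTP Signatures draft.
  def signingString(headers: Seq[(String, String)]): String =
    headers.map { case (k, v) => s"${k.toLowerCase}: $v" }.mkString("\n")

  def main(args: Array[String]): Unit = {
    // Stand-in for the client's keypair; in reality the public key would be
    // published at the URI that the Signature header's keyId points to.
    val kpg = KeyPairGenerator.getInstance("RSA")
    kpg.initialize(2048)
    val kp = kpg.generateKeyPair()

    val toSign = signingString(Seq(
      "(request-target)" -> "get /protected/resource",
      "date"             -> "Tue, 24 Nov 2015 19:04:37 GMT"))

    // Client side: sign the signing string with the private key
    val signer = Signature.getInstance("SHA256withRSA")
    signer.initSign(kp.getPrivate)
    signer.update(toSign.getBytes("UTF-8"))
    val sig = Base64.getEncoder.encodeToString(signer.sign())

    // Server side: dereference keyId to fetch the public key (elided here),
    // rebuild the same signing string, and verify the signature
    val verifier = Signature.getInstance("SHA256withRSA")
    verifier.initVerify(kp.getPublic)
    verifier.update(toSign.getBytes("UTF-8"))
    println(verifier.verify(Base64.getDecoder.decode(sig)))  // prints: true
  }
}
```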

2. With WebID authentication over TLS ( see https://webid.info/ ), the
X509 certificate can be self-signed or not.

 a. The server could just verify that the client has the private key
  corresponding to the public key in the certificate, and use the public key
  as an identifier, as FIDO does.

 b. If the certificate is self-signed, then the server needs to verify that
  the public key in the certificate is actually published in the WebID
  profile. This would give the server another verified claim.

 c. If the certificate were signed by some other entity ( a CA, or some
  entity with an Issuer Alternative Name WebID ), then the server could also
  verify that the issuer actually signed the statements.
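Step (b) above — checking that the certificate's key is published in the
WebID profile — can be sketched like this. RsaKey is a toy stand-in for the
modulus/exponent data that would really come from the X509 certificate and
from parsing the RDF of the profile:

```scala
// RsaKey is a toy stand-in for the modulus/exponent pair extracted from
// the X509 certificate and from the RDF in the WebID profile.
case class RsaKey(modulusHex: String, exponent: BigInt)

object WebIdSketch {
  // Step (b): does the profile publish the key presented in the certificate?
  // Hex modulus comparison is case-insensitive, since serializations differ.
  def profilePublishesKey(profileKeys: Seq[RsaKey], certKey: RsaKey): Boolean =
    profileKeys.exists(k =>
      k.exponent == certKey.exponent &&
      k.modulusHex.equalsIgnoreCase(certKey.modulusHex))

  def main(args: Array[String]): Unit = {
    val certKey = RsaKey("A1B2C3", BigInt(65537))          // from the TLS cert
    val profileKeys = Seq(RsaKey("a1b2c3", BigInt(65537))) // parsed from profile
    println(profilePublishesKey(profileKeys, certKey))     // prints: true
  }
}
```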

3. With OpenID, the OpenID verification steps need to be followed, and one
  then ends up with the claim that the user is identified by the OpenID.

I call these identity statements Principals, and a user can be identified by
any number of them, forming a Subject.

 see:  https://github.com/read-write-web/rww-play/blob/bfa76510b44cdcc7acb003ac004809d89f4632f0/app/rww/auth/AuthN.scala#L35

We could have the following Principals:
  - PublicKeyPrincipal(publickey)
  - WebIDPrincipal(webid)
  - WebKeyPrincipal(webkey)
  - OpenIdPrincipal(openid)
  - and more

In order to enhance my code for Credentials, I would also want to add to each
Principal a slot for a credential: a statement about the Principal made by some
other Subject. For WebID and WebKey we currently just fetch these statements
from the web, but they could also be passed by the client. If we had those, then
these statements would also be authenticated as having been made by the issuer.
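A sketch of this Principal / Subject model, loosely following the names in
AuthN.scala; the Credential slot is the proposed extension, and its fields
are illustrative assumptions, not a spec:

```scala
// The verified identity statements, each usable as an identifier.
sealed trait Principal { def id: String }
case class PublicKeyPrincipal(publicKey: String) extends Principal { def id = publicKey }
case class WebIDPrincipal(webid: String)         extends Principal { def id = webid }
case class WebKeyPrincipal(webkey: String)       extends Principal { def id = webkey }
case class OpenIdPrincipal(openid: String)       extends Principal { def id = openid }

// Proposed extension: a statement about a Principal made by some other
// Subject (the issuer). Field names are invented for illustration.
case class Credential(about: Principal, issuer: String, claim: String)

// A Subject groups every Principal verified for one agent, plus credentials.
case class Subject(principals: List[Principal],
                   credentials: List[Credential] = Nil)

object SubjectDemo {
  def main(args: Array[String]): Unit = {
    val subject = Subject(List(
      WebIDPrincipal("https://bblfish.net/people/henry/card#me"),
      WebKeyPrincipal("https://bblfish.net/keys/key1#")))
    println(subject.principals.map(_.id).mkString(", "))
  }
}
```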


Once the server has a list of Principals ( and signed claims ), it can verify
whether any one of them is authorized for the action requested ( GET, PUT,
PATCH, POST, SEARCH, ... ) on the resource. This is the Authorization step.

To do this the server looks at the WebAccessControl resource for the resource
requested. That resource can state what properties the subject must be proven
to have in order to access the given resource in the given mode.
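A toy version of that check, assuming the ACL has already been parsed into
simple rules. The AclRule shape here is invented for illustration; the real
WebAccessControl resource is an RDF document linked from the resource:

```scala
// AclRule is an invented, already-parsed form of a WebAccessControl rule.
case class AclRule(resource: String, mode: String, agents: Set[String])

object WacSketch {
  // Authorization step: is any verified principal allowed the requested
  // mode on the requested resource?
  def authorized(rules: Seq[AclRule], resource: String, mode: String,
                 principalIds: Set[String]): Boolean =
    rules.exists(r => r.resource == resource && r.mode == mode &&
                      r.agents.intersect(principalIds).nonEmpty)

  def main(args: Array[String]): Unit = {
    val rules = Seq(
      AclRule("/2013/card", "Read", Set("*")),
      AclRule("/2013/protected", "Read",
              Set("https://bblfish.net/people/henry/card#me")))
    println(authorized(rules, "/2013/protected", "Read",
                       Set("https://bblfish.net/people/henry/card#me")))
    // prints: true
  }
}
```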

The code to do this is not that long, and it can be done asynchronously in a reactive 
fashion. Here is code that does both WebID and WebKey authentication: 


Later I will add PublicKey, e-mail, OpenID, and other methods to tie into the existing ...

The WebAccessControl resource could also specify whom it trusts for claims
about a given statement. This would then allow very flexible trust networks to
be built, as described in the previous mail, and get us out of the sclerosis
that we have with fixed Certificate Authorities. It is then up to the
resource's ACL to state how much it trusts statements made by different
authorities. After all, it is unlikely that all of us will agree on whom to
trust. Here we have a protocol that gives maximum flexibility to all.


This is the process that ties Authentication and Authorization together. It is
just one small function:


Notice that because TLS authentication can happen after the connection is
established, we can try it after having tried HTTP auth.
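That ordering can be sketched as one small function, with stand-in
authentication methods (the real code works on full requests, reactively,
and the returned identifiers here are just hypothetical examples):

```scala
object AuthChain {
  type Request = Map[String, String]

  // Stand-in authentication methods; the real versions verify signatures and
  // certificates. Each returns the authenticated identifier, if any.
  def httpSigAuth(req: Request): Option[String] =
    req.get("Signature").map(_ => "https://bblfish.net/keys/key1#")
  def tlsAuth(req: Request): Option[String] =
    req.get("ClientCert").map(_ => "https://bblfish.net/people/henry/card#me")

  // Try HTTP auth first; only fall back to TLS client-cert auth if needed,
  // since TLS renegotiation can still happen after the connection is open.
  def authenticate(req: Request): Option[String] =
    httpSigAuth(req).orElse(tlsAuth(req))

  def main(args: Array[String]): Unit = {
    println(authenticate(Map("ClientCert" -> "...")))
    println(authenticate(Map.empty))
  }
}
```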

> regards
> David
> On 23/11/2015 11:01, henry.story@bblfish.net wrote:
>>> On 21 Nov 2015, at 16:41, Dave Longley <dlongley@digitalbazaar.com
>>> <mailto:dlongley@digitalbazaar.com>> wrote:
>>> On 11/21/2015 11:30 AM, Steven Rowat wrote:
>>>> On 11/21/15 7:31 AM, Dave Longley wrote:
>>>>> On 11/21/2015 02:11 AM, Anders Rundgren wrote:
>>>>>> I'm interested hearing what's available and what's cooking:
>>>>>> http://indiewebcamp.com/NASCAR_problem
>>>>>> Just the core (and links), no TL;DR BS please.
>>>>> There's a very simple demo here:
>>>>> https://authorization.io
>>>> Interesting. But I'm not sure it functioned as intended in my browser.
>>>> Some steps were fully Graphic UI, whereas others, confusingly, printed
>>>> full code on screen.
>>> Yes, the demo is very rough. I sent it primarily for Anders with very
>>> little information, per his request.
>>>> I.e., in part of step two, at "Issuer Dashboard" what's between the
>>>> '++++++' below appeared instead of buttons that I was expecting. This
>>>> happened at two other places; but a GUI was fully functional before and
>>>> after, in other steps. Is this the way it's supposed to function at
>>>> present? [I'm using Firefox 43 on Mac OS 10.6.8]
>>> If you'd like to try something a bit more fleshed out (but still rough
>>> in places and untested on Mac + Firefox), you can take a look at the
>>> demo we presented on the Credentials CG call a little while back. I
>>> recommend using Chrome.
>>> 1. Go to https://demo-idp.identus.org. Sign up for a fake/demo account
>>> there.
>>> 2. Go to https://demo-issuer.identus.org. Log in and issue yourself
>>> some credentials.
>>> 3. Go to https://demo-consumer.identus.org. Go through the various
>>> demos, presenting credentials from step #2.
>>> This demo has only been tested on Chrome on Windows and Linux, and
>>> Firefox on Linux. Using Ubuntu requires no special configuration, but I
>>> know that if you're using Debian you have to click a box to allow 3rd
>>> party sites to access localStorage via an iframe ... so maybe there is
>>> something that is similarly required for Firefox.
>>> The demo shows three of the players in a potential future Identity
>>> Credentials ecosystem: Identity Providers, Issuers, and Consumers. As a
>>> credential holder, you are in the middle of those players.
>>> You use an IdP to store, provide, and manage your credentials.
>>> Authentication with your IdP is just username+password at the moment,
>>> but this could be anything in the future. The same authentication that
>>> occurs at the issuer (credential-based authentication) can also be used
>>> once you have a decentralized identity.
>>> Issuers issue credentials to you that you store at your IdP ... where
>>> the issuer makes a browser API call to request storage (potential target
>>> for standardization at W3C). Credentials are digitally-signed so that
>>> consumers only need to trust issuers in the system, not your IdP. This
>>> gives you agility to choose/change IdPs as you see fit -- or to use
>>> whatever IdP you want (run your own, etc).
>>> Consumers consume credentials. They may ask for them using a browser API
>>> call (again, target for standardization). The browser will figure out
>>> who your IdP is so you can provide the credentials.
>>> The browser API is polyfilled using this library:
>>> https://github.com/digitalbazaar/credentials-polyfill
>>> It is meant to be an extension to the Credential Management API that
>>> Mike West has been working on:
>>> https://w3c.github.io/webappsec-credential-management/
>>> The Credentials CG has been in communication with him to ensure its
>>> design allows for these extensions.
>>> In addition to the browser API, another piece is required to polyfill
>>> what the browser must do. The browser needs to be able to discover
>>> users' IdPs; this is polyfilled by the authorization.io website. Part of
>>> the Credentials CG work is to make it possible for people to be in full
>>> control of their own identifiers. This is what the "Decentralized
>>> Identifier" (or DID) is about. This is basically an identifier that you
>>> can claim cryptographic ownership over (by generating public/private
>>> keypair and doing a claim).
>> I have a feeling that the missing piece here is that of the resource on
>> which the request 
>> is being made by the user. I prefer to think of resources needing
>> authentication, 
>> rather than whole sites, as different resources may require different
>> levels of
>> authentication. We get this on social networks all the time: some posts
>> require one to be friends with someone, other family, and others are public.
>> Many people don't have too many accounts, as it is a lot of work to
>> maintain one.
>> We should envisage cases that are more complex.
>> Our use case makes the hyperdata structure of the web site very visible,
>> so that it may well be that the same client needs access to 
>>  (1) http://shop.example/drinks/beer/buy     <- proof of age
>>  (2) http://shop.example/profile/123/        <- proof of account creation
>>  (3) http://shop.example/bike/bmwk1200rs/buy <- proof of drivers licence 
>>  (4) http://shop.example/xxx/kalashnikov  <- proof of participation in
>> well regulated militia
>> The client may have as policy only ever to authenticate when needed and
>> with the right credentials, as it is interested in making sure that the
>> prices are not being tailored to the higher value of its profile.
>> So we take it as a given that following RESTful architecture, it is 
>> resources that require authentication. Hence we make it easy for any 
>> resource to describe the type of agents and individuals that need access.
>> As shown on the Web Access Control page diagram, each resource (may)
>> point to
>> a description showing what types of agents have access.
>> http://www.w3.org/wiki/WebAccessControl
>> In that diagram the <2013/card> document is open to read by
>> anyone, and the </2013/protected> resource is only visible to members
>> of a specific group.
>> At present we have only considered WebID authentication as that was good
>> enough for our current needs, for a few small teams, where we need to 
>> explore many other things including UI frameworks, LDP, etc...
>> But of course the description of the agents that get access to a
>> resource could
>> be much richer. The class of agents could be described as the class of
>> people
>> who can prove they are over 18, and that the site accepts certain actors as
>> verifiers of such proof. There may be a well established list for each
>> country of
>> such actors. In the case of driving licences it is clear that for most
>> countries in 
>> the world the list of agencies able to prove the ability to drive is
>> quite restricted.
>> So each country could list its agencies, and list the agencies in
>> different countries 
>> that fulfill the same role.
>> A client that comes across such a resource would then 
>> 1. receive a 401 response
>> 2. follow the link to the Web ACL resource
>> 3. discover that it needs to show proof of a driving ability, and what
>> the relevant 
>>   authorities for such claims are. ( the site may point to a few, but
>> the client
>>   may also compare that with its trust anchors, to find an overlap ).
>> For example, in the US opening a bank account may be good enough to get
>> a rifle, but in Switzerland it requires being active in the militia.
>> 4. check the Credentials Wallet for any such claim
>>  a. if such a claim exists ask the user ( or follow a policy under the
>> users control for such situations )
>>  b. if such a claim does not exist, alert the user, and provide means
>> of him acquiring the credentials. This may require either just going to
>> the authority and getting the existing credentials, or doing a full
>> driving course, and passing the tests.
>>  c. one may want to discover on the side whether the sales requirements
>> are actually legally sufficient for one's own authorities. I may be able
>> to buy a gun in the US, but not be allowed to with those credentials in
>> France.
>> 5. if the user provided the credential use it for the next attempt to
>> accomplish the action
>> So to summarise: to solve the NASCAR problem the client needs to 
>> 1) have some way to access the list of credentials of the user,
>>  and their properties
>> 2) know which credentials are usable for the resource
>> 3) be able to discover how to create appropriate credentials
>> The Web Access Control "API" provides the minimum needed to answer 2) and
>> 3) . With this it then becomes possible for the client to retrieve the
>> right set of credentials.
>> Because there are so many credential possibilities, and so many 
>> attributes that may need to be verified, it is clear that this has
>> to be built up from the beginning in an extensible manner.
>> Btw, issuers themselves can have WebIDs, and I developed the notion
>> of institutional web of trust in 2011 at the eID conference
>> in Switzerland
>>   http://bblfish.net/blog/2012/04/30/
>>> The concept is similar to getting a bitcoin wallet, but without the
>>> unnecessary complexities of the blockchain or mining, etc. Once you have
>>> this identifier, which is a URL with the new scheme "did", you can
>>> associate credentials with it. You can fetch that URL and get back
>>> public information about the identity, such as the IdP that is presently
>>> associated with it. The IdP acts as an agent for storing/getting
>>> credentials for that identifier.
>>> It should be noted, that credentials can be associated with any URL, for
>>> example an HTTPS one. So the system is designed to work with these as
>>> well -- and the technology could be standardized in steps over time. The
>>> first "baby" step could involve registration of your IdP with the
>>> browser rather than with a decentralized network. Of course, when
>>> registering with the decentralized network, you get better portability
>>> characteristics and so forth. The decentralized network or the
>>> technology for it isn't built out yet, it is polyfilled, also by
>>> authorization.io.
>>> Hopefully this explains some of the background and the concepts that are
>>> being experimented with here.
>>> One more quick note about how the authentication works. When credentials
>>> are selected at your IdP, they are wrapped up as an "Identity" (an
>>> identity being a collection of identity credentials that assert various
>>> claims about the identity). This identity and the target domain (the
>>> consumer's) are digitally signed at authorization.io (which would be the
>>> browser in the future, it is just a polyfill). The digital signature
>>> includes an identifier, a URL, for the public key used.
>>> If the URL has a "did" scheme, the related DID document is retrieved
>>> from the decentralized network (polyfilled by authorization.io). Based
>>> on some cryptography and a ledger, the consumer can trust the
>>> authenticity of the DID document and can look for the public key therein
>>> -- and then check the signature. More details about this decentralized
>>> network are slowly being worked out here:
>>> http://opencreds.org/specs/source/webdht/
>>> If the URL has an "HTTPS" scheme, the CA system can be piggy-backed off
>>> of, using the WebID protocol for authentication. Essentially, the public
>>> key URL can be dereferenced, returning Linked Data that asserts the
>>> owner of the key, which is a WebID, another HTTPS URL. That WebID URL
>>> can be dereferenced to get more Linked Data that will assert that the
>>> public key is truly owned by that WebID. The trust here, again,
>>> leverages the existing CA system.
>>> -- 
>>> Dave Longley
>>> CTO
>>> Digital Bazaar, Inc.
Received on Tuesday, 24 November 2015 19:05:13 UTC

This archive was generated by hypermail 2.3.1 : Wednesday, 11 July 2018 21:19:26 UTC