Re: What we were using public key authentication for

> On 24 Mar 2016, at 00:25, Harry Halpin <hhalpin@ibiblio.org> wrote:
> 
> 
> 
> On Wed, Mar 23, 2016 at 6:08 PM, Tim Berners-Lee <timbl@w3.org> wrote:
> 
>> Virginie
>> 
>> You asked at the AC lunch for me to describe what it was that we had been using  public key authentication for.
>> 
>> This is the system we have been using for someone to be able to identify themselves as the owner of a web page, like a blog.
>> 
>> - A user of a web browser runs a web app A
>> - The web app uses data stored in various different web sites B (Like health data, bank data, calendar, etc)
>> - The user has a global ID which is an https: URI which points to public info about them (or a persona) like their blog
>> - The access control on the data stores is set to allow access referring to that ID string
>> - The user accesses the data from within the webapp by using a private key which is stored securely on the device (or hardware gadget)
>> - The user’s public key is accessible only through unfakable "browser chrome"
>> - The cert is NOT signed by anything — it can be self signed. (the trust is from the public key being on the web)
>> - The public web home page for the user lists the corresponding public key publicly
>> - The client cert does contain the global ID URL
>> 
>> The essential characteristics of the system which the browser architecture seems to fight against:
>> 
>> - The whole point is to have a public identity (like your W3C or github id) which can be used to show you are the same person on multiple sites
>> - It isn’t just about logging onto different web sites, it is about accessing the same data you own on many sites B from many different sites A
>> - The signature on the cert is not part of the system, so the keychain won’t recognize the cert as “trusted”.  This does NOT use the PKI
>> 
>> We have been doing this using client certs, but the UI was unloved, the keygen API was weird and had bugs, and browsers have waged a war against client certs.  But the system worked!   Until keygen got turned off in Chrome and Firefox a few days ago.
>> 
>> So can we get this functionality using web crypto in the short term?
> 
> Here's my best guess!
>  
> 
>> Suppose we
>> 1) generate the keys in a web app using web crypto, store them in ephemeral storage, and regenerate them any time the user clears the key storage. Use them in a custom HTML form based auth session with the server in which we invent our own PK based authentication protocol.
> One way to prevent keys from being destroyed if localStorage is cleared is to 'wrap' long-term private keys with another per-session key, upload the long-term (i.e. exportable) keys to your server, and then download and unwrap those keys when you need them. That long-term keypair could have a public key that you could then display on your URL w/i a WebID.
> 
> The main issue is you'd probably still want a key to wrap with. To get a per-session key, I'd use something like Secure Remote Password (SRP) as a temporary measure and then upgrade to WebAuthn (FIDO 2.0) when it happens [1]. Another option is HOBA (no passwords), though I think it needs recoding in WebCrypto [2].

Tim forgot to mention another aspect of our use case, which is that there will in the end be millions of applications, hosted on many origins, that allow us to interact with various types of data linked across the web. People would choose the application required for their task, tuned for their cultural background, designed for their accessibility needs, etc. Each of these apps comes from a different origin, both because each app is built by a different organisation, and because app security is origin bound.

On your proposal a user would end up needing to remember a password for each application, in order to fetch the public/private key pair stored on each origin, which puts us back to the usability problems of password maintenance and reliance on keychains. Also it seems that HOBA won't do, as it relies itself on the WebCrypto API and so would suffer the same problem if the key storage were cleared.  The problems with symmetric keys (passwords) are well known, and that is why asymmetric key crypto was such a groundbreaking invention.


>> 2) generate the keys using math, possibly web crypto with “exportable” keys, and download a .pem file to the user’s desktop.  Get the user to click on the .pem and go through the process of installing the cert on their site. Hope, fingers crossed, that the browsers don’t just block the use of client certs altogether!
> 
> What would the self-signed cert get you that a key would not? It seems that localStorage vs. the TLS keystore is the issue, where it's believed the TLS keystore is much more secure? That would be a good thing for a security audit to investigate, such as one by iSEC Partners.

The relevant difference is not between TLS and non-TLS stores, but rather the ability to use public key cryptography across origins _with_ user consent.  Currently TLS client certificates as implemented in browsers enable public key authentication across origins with user consent supported by the browser chrome. And they are the only technology deployed in browsers that does this.

The WebCrypto API:
 • does not have chrome support, so the user has to trust the JS from an origin to correctly notify the user when authenticating across origins. User consent is therefore optional here. As a result the WebCrypto API actually allows applications to create super cookies and authenticate the user across all origins without the user knowing about it. Still, one cannot really fault the WebCrypto API for this, as it is easy for developers to roll their own crypto libraries, even without the WebCrypto API, and do the same thing one way or another.
 • can only create a key per application origin, and therefore requires an extra method to tie identities together, so that an application can view information generated by another application that should be visible to the same user. [Note 1]
[Note 1]: Even systems like FIDO, which start with a philosophy of unlinkability, have to acknowledge that even the simple requirements of current web users require linkability, and furthermore that they cannot stop it. See their documentation in the section "OpenID, SAML, and OAuth":
   https://fidoalliance.org/specs/fido-uaf-v1.0-ps-20141208/fido-uaf-overview-v1.0-ps-20141208.html#relationship-to-other-technologies

I actually think the WebCrypto API can be put to good use to launch a more HTTP/2 friendly way of authenticating using HTTP Signatures:
https://github.com/solid/solid-spec/issues/52

But I imagine TLS also has advantages; otherwise why would Microsoft and Mozilla be working on TLS 1.3 integration into HTTP/2?
"Reactive Certificate-Based Client Authentication in HTTP/2"
https://tools.ietf.org/html/draft-thomson-http2-client-certs-01

My guess is that TLS allows one to prove point to point security, whereas HTTP Signatures by themselves, if they do not surface the TLS session in a header, do not allow one to guarantee that the packets were not viewed (and perhaps modified, for chunked responses) by an intermediary...

As they say in the WebCrypto API, this is subtle. It needs to be discussed with care and in a cross-disciplinary fashion, since we are moving between cryptography, protocols, web architecture and user interface design.

>>  3) Switch the whole thing over to using OpenID Connect to connect people to some password-protected site?

> OAuth can be used to transfer information (such as data from your access control system). You would need to have your WebID data store be OAuth enabled as your 'identity provider.'
> 
> There's nothing preventing an OAuth provider (MIT CSAIL is also moving from certs to OpenID Connect) from working with a URI-based identifier (RDF) and shipping RDF-based data around. In fact, you could do all sorts of fancy stuff like encrypting the data by storing it in JSON-LD (I think you could likely do N3 as well if you normalized a bytestring via base64) in a JWT and signing it using a JWS.

Can you describe your thinking in more detail? I am not sure what you mean by OAuth being used to transfer information. It seems to me that HTTP is much better suited to transferring linked data, in whatever format.

We are of course quite aware that Web Access Control [note 2] can be used with different authentication schemes. Each resource could, through a WWW-Authenticate header, specify the types of authentication protocols it supports, be it e-mail, one-time password, OpenID, HTTP Signature, or even the proposed reactive TLS 1.3 based authentication.  I have started looking into that with OAuth too [note 3].
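To make that concrete, here is an illustrative sketch of a resource advertising several acceptable schemes in one WWW-Authenticate header, and a client picking the first one it supports. The scheme names echo those mentioned above; the realm values and the splitting heuristic are invented for the example:

```javascript
// Illustrative only: multiple challenges in one WWW-Authenticate header.
const wwwAuthenticate =
  'Signature realm="https://data.example/", ' +
  'Bearer realm="https://data.example/", ' +
  'Basic realm="https://data.example/"';

// Split on scheme boundaries: a comma followed by a token and a space.
function advertisedSchemes(header) {
  return header.split(/,\s(?=[A-Za-z]+\s)/).map((c) => c.split(' ')[0]);
}

// The client walks the advertised schemes in order and takes the first
// one it knows how to perform.
function pickScheme(header, supported) {
  return advertisedSchemes(header).find((s) => supported.includes(s)) || null;
}

console.log(advertisedSchemes(wwwAuthenticate)); // [ 'Signature', 'Bearer', 'Basic' ]
console.log(pickScheme(wwwAuthenticate, ['Bearer', 'Signature'])); // 'Signature'
```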

What is needed, then, is to look at each of these protocols and analyse them in detail along a number of different dimensions.

Manu Sporny has actually started work on this:
  http://manu.sporny.org/2015/credentials-retrospective/

[Note 2] https://www.w3.org/wiki/WebAccessControl
[Note 3] https://github.com/solid/solid/issues/66

>  
> 
> If the FIDO system will give us all we need by the end of this/next year, that will be great, but we need something now that we can improvise, say using web crypto and JS.
> 
> 
> [1] http://srp.stanford.edu/
> [2] https://hoba.ie/
> 
> 
> 
> Best
> 
> Tim
> 
> 
> 
> 
> 

Received on Thursday, 24 March 2016 10:07:43 UTC