
Re: Verifiable Claims Telecon Minutes for 2016-03-29

From: Henry Story <henry.story@bblfish.net>
Date: Thu, 31 Mar 2016 05:53:56 +0100
Cc: Carvalho Melvin <melvincarvalho@gmail.com>, Manu Sporny <msporny@digitalbazaar.com>, Kaliya IDwoman <kaliya-id@identitywoman.net>, Credentials CG <public-credentials@w3.org>
Message-Id: <AF51C161-6F9D-4862-A517-3AF8C92AF69F@bblfish.net>
To: Anders Rundgren <anders.rundgren.net@gmail.com>

> On 30 Mar 2016, at 19:51, Anders Rundgren <anders.rundgren.net@gmail.com> wrote:
> 
> On 2016-03-30 19:47, Henry Story wrote:
>> 
>>> On 30 Mar 2016, at 15:55, Anders Rundgren <anders.rundgren.net@gmail.com> wrote:
>>> 
>>> On 2016-03-30 16:49, Melvin Carvalho wrote:
>>>> 
>>>> 
>>>> On 30 March 2016 at 16:39, Anders Rundgren <anders.rundgren.net@gmail.com> wrote:
>>>> In addition to technical issues it is also worth noting that new developments
>>>> in this space are likely to get limited support from the (browser) platform vendors:
>>>> https://lists.w3.org/Archives/Public/www-tag/2016Mar/0001.html
>>>> 
>>>> Apparently it is not enough to be the inventor of the Web and being knighted by the Queen
>>>> to keep even the old stuff intact!
>>>> 
>>>> A correction to this: Firefox have confirmed that they WILL follow the TAG's recent advice and not deprecate any used functionality until there is a suitable replacement.
>>> 
>>> Suitable replacement?  Since the core issue (when you connect all the dots out there in various lists and forums...), rather is the deprecation of client certificates on the Web, the only imaginable replacement is FIDO alliance tokens and technologies.
>> 
>> If you look carefully, client certificates have not been deprecated. Hardware supported certificates are supported still.
>> What has been removed by Chrome is the easy low cost generation of client certificates via keygen. keygen was a bit
>> broken, true, but it should be easy to fix those, one way or another.
>> 
>> See https://github.com/w3ctag/client-certificates
>> 
>> There is of course a lot of potential to improve certificates. X509 is not a be all end all. It works, but there is huge room for
>> improvement.
> 
> If we stick to X509 client certificates and browsers, no such improvements are in sight.  The client-certificates write-up is a political paper to show "good will".
> 
> The deprecation I refer to is for example mentioned here:
> https://lists.w3.org/Archives/Public/public-webappsec/2015Sep/0093.html
> 
> "and it puts authentication at a layer (the TLS handshake) where it is fundamentally
>  problematic to the commonplace scalability and performance architecture of 
> anything but  hobbyist-level applications"

This article makes a few statements that one should be wary of.

1. Conflating <keygen> with certificates
----------------------------------------

eg: "<keygen> entangles being identified with being authenticated..."

<keygen> only creates a public/private key pair, so it does not entangle being identified
with being authenticated. What he means is that X509 certificates do that. Admittedly, <keygen>
is also tied to the server sending an X509 certificate back, but browsers could easily be
extended to allow other types of certificates to be sent back and added to the keychain.

2. Identity and authentication
------------------------------

So if we correct that to what he presumably meant, namely
"X509 certificates entangle being identified with being authenticated..."

then yes, RFC 5280 requires a DN or a subject alternative name.

But there is no need to tie client certificate authentication to X509 certificates. Indeed the
"Reactive Certificate-Based Client Authentication in HTTP/2" proposal by Microsoft and Mozilla
https://tools.ietf.org/html/draft-thomson-http2-client-certs-02
by moving the certificate exchange from the TLS layer to the HTTP layer, opens up a move away
from X509. Citing from the introduction:

   "In this document, a mechanism for doing certificate-based client
   authentication via HTTP/2 frames is defined.  This mechanism can be
   implemented at the HTTP layer without requiring new TLS stack
   behavior and without breaking the existing interface between HTTP and
   applications which employ client certificates."


Once one moves to the HTTP layer, everything is much more flexible. One
could easily render the semantics of X509 certificates explicit by mapping
the data over to RDF, and then use, say, Linked Data Signatures to send signed
graphs in whatever syntax is the fashion du jour.
(see https://web-payments.org/specs/source/ld-signatures/ )

Where the X509 syntax requires the DN to be filled in, a signed graph
could just use a blank node in the subject position. Nothing logically requires
that the user identify herself directly using a name. She could also indirectly
identify herself by just sending the public key, or a reference to the key as
in HTTP Signatures.

So if the client only wishes to use such a certificate with
one origin, then that would be very close to FIDO. 

3. TLS vs HTTP layer
--------------------

(Your quote above) 
> "and it puts authentication at a layer (the TLS handshake) where it is fundamentally
>  problematic to the commonplace scalability and performance architecture of 
> anything but  hobbyist-level applications"

This is a good point with regard to HTTP/2. For HTTP/1.x this was not such a problem. 

Now usually what happens is that one tries to evolve a working technology slowly to improve it,
as other parts of the stack improve. This is clearly what Mozilla and Microsoft are
doing in

"Reactive Certificate-Based Client Authentication in HTTP/2"
https://tools.ietf.org/html/draft-thomson-http2-client-certs-02

So yes, the criticism is reasonable for HTTP/2, but there are also reasonable answers to
it which are backward compatible, build on existing experience, are forward looking,
and carry no IP baggage.

4. Browsers & UI
----------------

The article also claims that browsers are incapable of improving their UI, stating that
the current stack

"locks the experiences and evolution of the direct relationships between users and
services to the inconsistent and slow moving world of browser UI,"

It is really up to the browser vendors to prove that wrong. Some do a much better job
than others in that respect, which shows that this is clearly not a fundamental problem.
In my view it is more related to the fact that, as currently used, client certificates
have mostly been required by the military or large organisations, for which the stack works
and for which UI is not a big problem. But with the WebID enhancement ( http://webid.info/spec )
the same stack can be made cheap and available to all, and there are many reasons for
browsers to want to help make the whole web a safer place.

If I were a browser vendor I would be reacting strongly to such statements. After all 
if browsers cannot innovate in UI, then where can they innovate? 



> 
> Anders
> 
> 
> 
>> 
>>> 
>>> Creating "a better keygen" is clearly not considered.
>>> 
>>>>  
>>>> 
>>>> Personally, I advocate for solutions that make third-party extensions of the Web (browser)
>>>> architecture a reality because then you can iterate and experiment a bit before launching
>>>> new schemes, regardless if it is a proprietary product or a standard-to-be.
>>>> 
>>>> Anders
>>>> 
>>>> 
>>>> 
>>>> 
>>> 
>> 
> 
Received on Thursday, 31 March 2016 04:54:27 UTC
