
Peter's wish protocol

From: Henry Story <henry.story@bblfish.net>
Date: Tue, 1 Feb 2011 23:06:04 +0100
Message-Id: <DF862704-7F18-4EBD-9E01-BBD5E3EF629A@bblfish.net>
To: WebID XG <public-xg-webid@w3.org>
On 1 Feb 2011, at 20:27, Peter Williams wrote in a thread archived at
http://www.w3.org/mid/SNT143-w44720C811FBEFF7E72DA3992E50@phx.gbl

> The cert is a way of "getting browsers to do the security primitive" called the SSL handshake. It's nothing more. Arguably, the cert communicates the webid, and cert enrollment at least ties the webid URI to the public key, in a self-signed blob.

( The certificates can be self signed or not. )

The public key passed in the certificate is of major importance, as it is that public key
the server will use in the TLS connection to verify that the client knows the corresponding private key.

The certificate then further claims that the owner (or knower?) of that public key has a global
identifier, call it wid. The relying party then does a dictionary lookup on the meaning of wid in the global distributed dictionary we know as the web, and finds that the meaning of that term is: whoever knows the corresponding private key.
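As a sketch of that final check, assuming the profile document has already been dereferenced and its published RSA keys extracted as (modulus, exponent) pairs — the fetching, parsing, and TLS machinery are omitted, and all key values below are made up for illustration, not real key material:

```python
# Sketch of the relying party's check: is the key proven in the TLS
# handshake among the keys the WebID profile publishes for that URI?
# All values are hypothetical; real moduli are 1024+ bits.

def webid_claim_verified(cert_key, profile_keys):
    """True if the (modulus, exponent) pair from the client certificate
    appears among the keys published at the WebID URI."""
    return cert_key in profile_keys

cert_key = (0xC0FFEE, 65537)                      # key from the client cert
profile_keys = {(0xC0FFEE, 65537), (0xBEEF, 3)}   # keys found in the profile

print(webid_claim_verified(cert_key, profile_keys))  # → True
```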

But come to think of it I see your point. The public key could also be fetched at the WebID profile, served over https in any number of formats, such as rdfa and it would work, and the client would never need to send the certificate to the server. 

>  One NICE thing about having ClientHello communicate the webid is ...it DEPRIVES the world of PKI the excuse to try yet again to sell client cert lifecycle management processes, forcing them to focus on the profile doc instead.
>  
> here is my current wish list for a skeleton ideal scheme (due PURELY to the discussions held here, which I find stimulating).

> 1. clientHello communicates webid claim

With the client_certificate_url extension? How much bandwidth is really saved there when only a very minimal certificate goes down the wire? Or rather, how many packets are saved, as those are the basic units of measurement? That would help us understand the importance of this.

> 2. EE cert for client auth is ephemerally minted and (self-)signed by browser, thereby authenticating clientHello and its webid claim

What is an EE cert?

Anyway, all the client needs to do is sign something with the private key corresponding to the
certificate selected by the user.
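As a toy illustration of why that suffices (textbook RSA with tiny, insecure numbers, nothing like real TLS): the server can check the signature using the public key from the certificate, but only the holder of the private exponent could have produced it.

```python
# Toy RSA, tiny insecure numbers, purely to illustrate "proof of possession":
# the public key (n, e) verifies; only the private exponent d can sign.

n, e, d = 3233, 17, 2753       # 3233 = 61 * 53; classic textbook key pair
challenge = 1234               # stand-in for the data to be signed

signature = pow(challenge, d, n)           # client: sign with private key
assert pow(signature, e, n) == challenge   # server: verify with public key
print("client proved possession of the private key")
```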

>  
> 3. new "cert type" defined per the TLS spec with help from IETF, in which that ephemeral EE cert is NOT ASN.1 on the wire but an xmldsig-signed datum. Other certs in the SSL message's client cert chain (if any) retain their ASN.1 value, to bring valuable legacy interoperability to bear while ensuring we do not project legacy formats further.

So creating a new cert format type is not really that important, is it, if you fetch the key remotely? A foaf file publishing a public key, served over https at the WebID location, is enough.
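For illustration, such a foaf file might look roughly like this in Turtle, using the cert/rsa vocabularies that foaf+ssl was experimenting with at the time (the exact vocabulary was still in flux, and the modulus value here is a made-up placeholder, not a real key):

```turtle
@prefix cert: <http://www.w3.org/ns/auth/cert#> .
@prefix rsa:  <http://www.w3.org/ns/auth/rsa#> .
@prefix foaf: <http://xmlns.com/foaf/0.1/> .

<#me> a foaf:Person ;
    foaf:name "J. Random Hacker" .   # hypothetical person

[] a rsa:RSAPublicKey ;
    cert:identity <#me> ;
    rsa:modulus "cafe...babe"^^cert:hex ;        # placeholder, not a real modulus
    rsa:public_exponent "65537"^^cert:int .
```

The relying party dereferences the WebID, parses a graph like this, and compares the published modulus and exponent against the key proven in the TLS handshake.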

Though I am very much in favor of certs being in XML when sent by the client, if it can be shown that the space issues are not serious. (Binary XML?)

> 4. client cert support in CGI and page javascript APIs support client certs in ASN.1 and xmldsig, to drive new generation of apps.

Ok, so those are jobs for library writers, and that will happen if there is a need. The immediate need is the Social Web (see my "Philosophy and the Social Web" http://www.slideshare.net/bblfish/philosophy-and-the-social-web-5583083 to get an idea of some of the serious political, philosophical, and social forces that are moving us all to participate on this list).

That need cannot wait for browsers to be changed. It has to start now with what is available. And developers/companies won't do much with SSL or TLS unless there is a nicely written-down standard for it that is endorsed, mostly because the received opinion is that client certificates are not usable (a received opinion formed without taking linked data into account). So it is a key requirement for this group to have a spec that can be worked on and made usable NOW by developers.

Your ideas above sound like nice optimisation tricks and improvements that can be added to future browsers. I think they would be worth investigating as WebID 2.0, or something that can even be done in parallel with the minimal WebID protocol that we know as foaf+ssl.

But the real need for WebID is to get the Social Web going. Without adoption of the minimal spec, the advanced specs will not go anywhere. I am for releasing early and often, and not getting too far ahead of the needs. But since we are an incubator group, I think we could have a WebID 2.0 protocol sketch like this, giving some longer-term directions as to where IETF/W3C evolution could lead from our experience with WebID 1.0. It's a question of how much time it takes to work on.

Henry


Social Web Architect
http://bblfish.net/
Received on Tuesday, 1 February 2011 22:06:41 UTC
