
RE: Documenting implicit assumptions?

From: Peter Williams <home_pw@msn.com>
Date: Mon, 31 Jan 2011 13:02:35 -0800
Message-ID: <SNT143-w53CAFBE5CBD120FB5463DF92E20@phx.gbl>
CC: <public-xg-webid@w3.org>

 
> subjectAltName tightly binds WebID to x509v3 certificates, x509v3 
> certificates with subjectAltName extensions are very hard to produce 
> with common libraries (unless you have a custom setup - e.g. openssl).

I managed to get a "commodity" Windows Server 2008 Enterprise Edition certificate server to make one. It was not hard (for someone who has done nothing but certs for 20 years...). I also got a Windows command-line tool to do it - relevant because the modern toolchains supporting the major rollout of the Windows Identity Foundation lean heavily on self-signed certs stored in metadata documents for endpoint authentication.
 
So, a few hundred million Windows-type admins should have no problem, using tools that come with the product. It would be a minor effort for a Windows product team to "productize" it, issuing some templates that orchestrate things and supplying the right language in the help text.
 
Not flogging Windows - simply pointing out the skill level required and the accessibility already available.
 
Using those tools, it was also easy to put the WebID in the certificatePolicies field, which has essentially unlimited flexibility in string-type support. While the URI to a CPS document must be an IA5String, defining a new class of URI pointing to a different document type (RDF, FOAF...) could mandate a more capable string type.
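The certificatePolicies route can be sketched with OpenSSL's config-file syntax as well. This is a sketch under assumptions: the policy OID and the URI are hypothetical placeholders I am inventing for illustration, and it uses the standard CPS qualifier (which is where RFC 5280's IA5String restriction applies).

```shell
# Sketch only: embed a URI in the certificatePolicies extension via a
# CPS policy qualifier. The OID and URI are hypothetical placeholders.
cat > policy.cnf <<'EOF'
[req]
distinguished_name = dn
x509_extensions    = exts
prompt             = no
[dn]
CN = WebID policy test
[exts]
certificatePolicies = ia5org,@webid_policy
[webid_policy]
policyIdentifier = 1.3.6.1.4.1.99999.1
CPS.1 = "https://example.org/profile#me"
EOF

openssl req -x509 -newkey rsa:2048 -nodes \
    -keyout pol.key -out pol.crt -days 365 -config policy.cnf

# Inspect the resulting policies extension:
openssl x509 -in pol.crt -noout -text | grep -A4 "Certificate Policies"
```

A new qualifier type (rather than reusing CPS) would be what lets a spec mandate a different string type, as the paragraph above suggests.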
 
------------
 
But we are getting ahead of ourselves. We still have to weigh: do we want the WebID protocol to work with five-year-old browsers, or not? That largely drives how much of the old cert machinery to keep around, and how much 30-year-old stuff to start working out of the way, so it doesn't keep clogging things for the next 50 years. I know this issue of "it works today" was very much on Henry's mind in the initial stages of conception - making sure it actually works in the wild of today. That was critical to getting here...
 
If I were the funding agency on this, with my style, I'd probably back both: a quick-and-dirty effort at $100k to build traction for a really major overhaul at $1 million. The quick-and-dirty has to make the case for folks to buy in to the risk of the major overhaul delivering really big changes. It has to show the way to dump some of the older legacy, ensure only certain legacy carries forward, and fundamentally retool the street from copper to fiber... It also has to find a way to talk to IETF PKIX, so the two initiatives stay on good terms.
 
My gut tells me (since https has stagnated for a decade now) that there is pent-up capability for a really radical leap forward here. And this time (Peter holding his breath, in mixed fear and admiration) W3C culture and process gets to drive https. It is because the semweb technologies are similarly pent up that I see an opportunity - the identity and semweb things are releasing each other, to major social benefit.
 
For my part, I'm putting a lot of trust in this group. I've waited 10 years for a Henry to turn up and use the URI (specifically) that I had put in certs, and I want his idea to run - somehow or other. He cued into my original gamble: that one day self-signed certs and URIs could line up. If it's too late for PKIX client certs to evolve further for this new world... I don't care. It can be a digsig XML blob playing the role of the self-signed cert, for all I care. Just beware politics; there are billion-dollar movers and shakers tied to certs, for a decade or more yet...
 
Received on Monday, 31 January 2011 21:03:29 GMT

This archive was generated by hypermail 2.2.0+W3C-0.50 : Monday, 31 January 2011 21:03:30 GMT