RE: Limitations of Attribute Certs - Re: Position Paper for W3C Workshop on Identity

You have to remember that DoD is a large institution, and NSA is a component
(independent of DoD in tradition, but fully integrated at the same time). I
have to assume MoD/GCHQ/DERA are the same (not that I know a damn thing
about any of them). NSA is engineering and operations, pure and simple. Tell
it to buy OSI equipment built to ISO standards, and that's what it does. Tell
it to invest in a 10-year standards effort by lending staff and money to ISO
(including the PKI and Directory groups, to "argue for" US agendas), and
that is what it does. It's a mission, as ordered. This culture is different
from NASA's: tell NASA to buy OSI and it does an open-secret deal with Cisco
to duly supply OSI (and, oh look, they supplied TCP/IP for free). Guess
which got deployed on the NASA Science Internet?

One has to understand this contrast between conservatism (NSA) and
duplicity (NASA/NSI); it's part of the American scene. It's neither right
nor wrong. It's just what it is, and it impacts society at large due to the
sheer buying power of the US economy. It's the same in crypto - which went
from weird national-security space pre-1990 (James Bond, M, and some guy
from the CIA wandering around Barbados being picked up by super-secret
submarines) to what we have today - where crypto is now part of the general
economy. I got to live that moment of change (from the inside, at one point;
and then from the outside - which is where the initiative now belongs).

In many ways WebID is the culmination of that process - because it ties a
knot around an era (of stooge committees, doing what their funding agencies
tell them to do, while pretending otherwise). We shall see at the coming
identity conference whether the era is actually over, or still gasping for
air - as W3C gets drawn into the mysterious loop of "influence", "expert
groups", and "folks in the know" (and feedback from "persons anonymous",
etc). Given the spying issue, I suspect there are many gasps left.

With all certs, id and attribute, you have to remember the idea was to have
an offline issuer, to control the unwanted impacts that follow from a
failure of the trust assumptions in the core CA concept (think Comodo,
recently, which violated the core duty of a traditional CA... by failing to
prevent a fraudulent cert issuance). VeriSign and then NSA indeed changed
those assumptions (thank Dave Solo at BBN and then Citibank for that). The
vulnerabilities of secure channels on OPEN networks built using public-key
crypto mechanisms (including RSA) slowly became understood (and now
represent great spying opportunities against the unwary using the crappier
implementations, shush!). But they also allowed the previously sacred
offline/online boundary to be fuzzed, even as channels on PRIVATE networks
moved over to OPEN networks full of unconstrained attackers with huge
computational resources.

Now, with the CMS protocol (built into Windows, and even Mozilla), remote
operations between browser and cert issuer occur over messaging (SOAP
expressed in ASN.1, I'm afraid), as the CA can be fronted by a CMS - a cert
management system: an online responder. This changed the game - offline
certs, designed for the associated trust and risk assumptions, were no
longer, ahem, "offline". In the W3C world, xmldsig plays the role of the
PKCS#7 standards that secure the CMS channels, but no semantic equivalent of
the operations defined for that channel exists (except the paltry keygen
tag). Thus W3C has no real equivalent of AA blobs, or AA issuing - except
RDF/foaf-cards (!) - and some of the more arcane features of XML-DSig that
nobody uses.
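
To make the contrast concrete: in the WebID pattern, as I understand it, the
browser-minted, self-signed id cert simply points (via its SubjectAltName
URI) at a foaf card, and the card - not an AA blob - carries the attributes.
Below is a minimal sketch of minting such a cert, assuming Python with a
recent pyca/cryptography package; the WebID URI and names are purely
hypothetical, not anything defined by the XG.

# Minimal sketch: a self-signed "id cert" whose SubjectAltName URI points at
# a foaf card; the card, not an AA blob, carries the attributes.
# Assumes a recent pyca/cryptography; the URI and names are hypothetical.
import datetime

from cryptography import x509
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.x509.oid import NameOID

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, u"Example WebID")])

cert = (
    x509.CertificateBuilder()
    .subject_name(name)
    .issuer_name(name)  # self-signed: issuer == subject, no CA in the loop
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(datetime.datetime.utcnow())
    .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=365))
    .add_extension(
        x509.SubjectAlternativeName(
            [x509.UniformResourceIdentifier(u"https://example.org/card#me")]
        ),
        critical=False,
    )
    .sign(key, hashes.SHA256())
)

# The relying party dereferences the SAN URI and reads the attributes from
# the foaf card, instead of parsing an attribute cert.
print(cert.extensions.get_extension_for_class(x509.SubjectAlternativeName).value)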

Now that one has online minting capabilities for all cert blobs (for LANs
and enterprise intranets), one is left with the "ideas" of Attribute
Authorities, supposed to be distinct from Certification Authorities in tone,
culture, focus and administration style. And those ideas were open in
nature (coming mostly from Canadian lines of thought, which contrast nicely
with the overly-militarized US economy). Sometimes those ideas, long brewed
in ISO, take decades to find their moment in time. ISO doesn't build
products; it invests in openness, general quality, and market development in
general (it's part of the UN concept, remember). Remember, it took 30 years
for X.500 to go mainstream in Office 365 in the cloud (and it took from 1986
to 1996 for id certs -- in their ISO format, anyway -- to make it to the
commodity mainstream: Netscape 2.0 browser SSL, Lotus Notes 4 email).

ISO and W3C are on the same side of the coin. They are just open systems. In
many ways, TBL took a slice of open systems and massaged it. Good for him,
and his knighthood. I don't think it was worth more to the world than Turing
was, though (and he didn't exactly get a knighthood). But such is business.
I don't deserve my wealth either; just the right spot at the right time,
mixing VISA, porn, NSA and certs in just the right amounts to make VeriSign
a runaway success, doing something folks cared about at that point in web
history.

Yes, the web has gone beyond the ISO world. But, as Al Gore tried to say, it
takes time to fund initiatives that you intend to be big. And, as a manager,
you don't even know what they will grow up to be. You just know it's the
right KIND of future. All it MUST do is be bigger and better and more
inclusive.

I ain't here in WebID for the wonderful world of ASN.1 formats, or to work
with committees full of folks trying to get knighthoods or invites to
society weddings while wearing funny hats and Prussian uniforms! Crypto and
the web are joined at the hip, somehow re-defining the privacy space, post
imperialism and the centralized control of telco. Self-signed cert blobs, be
they id or attribute, distributed in signed metadata are just the latest
step in a 30+ year evolution that is still ongoing. For now, we can simulate
the signed metadata by using an https "signing" endpoint. At some point,
though, we have to sign the RDF graph properly - even if we just sign the
damn content-transfer encoding. Then the abstract triple is an id cert, an
attribute cert, and... more. We have moved beyond the need for layer 6, per
se.
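
To make that stop-gap concrete, here is a minimal sketch of signing the
serialised bytes of a foaf card - the content-transfer encoding rather than
the abstract graph - again assuming pyca/cryptography; the Turtle snippet
and the key handling are purely illustrative.

# Sketch of the stop-gap: sign the concrete serialisation (the
# "content-transfer encoding") of the card, not the abstract RDF graph.
# Assumes pyca/cryptography; the Turtle and key handling are illustrative.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

card_bytes = b"""@prefix foaf: <http://xmlns.com/foaf/0.1/> .
<https://example.org/card#me> a foaf:Person ;
    foaf:name "Example Person" .
"""

# Detached signature over the exact bytes served for the card.
signature = key.sign(card_bytes, padding.PKCS1v15(), hashes.SHA256())

# A relying party holding the public key (e.g. taken from the id cert)
# verifies the same byte stream; re-serialising the graph breaks the
# signature, which is precisely the limitation of signing the encoding.
key.public_key().verify(signature, card_bytes, padding.PKCS1v15(),
                        hashes.SHA256())
print("signature over the serialised card verifies")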

-----Original Message-----
From: public-xg-webid-request@w3.org [mailto:public-xg-webid-request@w3.org]
On Behalf Of Henry Story
Sent: Sunday, April 24, 2011 4:15 AM
To: peter williams
Cc: 'WebID XG'
Subject: Limitations of Attribute Certs - Re: Position Paper for W3C
Workshop on Identity



On 24 Apr 2011, at 01:47, peter williams wrote:
> 
> Arguably, pushing a signed XML blob in websso (or a mac-signed 
> openid), or pulling an OAUTH record.... plays the "role" anticipated 
> for the AA cert (and indeed the role played by pulling a foaf card).
> None of them have the lifecycle properties of AA, but they have the
> functional aspects done.
> Furthermore, websso protocols are tuned up for the web (redirects, 
> auto-posts, etc); whereas signed AA blobs were not web-specific. They 
> really focused on being added to the SSL handshake as an additional 
> cert type, which never happened. Lots of DoD politics around the
> Defense Messaging System "influenced" the US defense vendors fronting 
> DoD in IETF/IESG PKI/SSL WGs, who duly ensured AA went nowhere. At the 
> time,  DoD was ordained to be in charge of civilian infrastructure - 
> and they had the WGs tied up to do their bidding. If THEY didn't want 
> it for US, the internet standards duly reflected that.

That is an interesting piece of history, Peter - thanks for sharing.

You need to go further in your investigation, though. Your argument is that
politics were involved in DoD. That of course is a magical word for "it just
did not succeed as well as we hoped". Politics is involved everywhere, in
particular in the web. And the web, if anything, would be even more
political, because it is global: a technology spanning countries that during
most of the 20th century were the worst of enemies. The web is available to
and used by competing companies, countries at war with each other, and so
on...

So you need to inquire into these properties of the web that have led to
this astounding success, despite all the possible politics.

My claim is that the architecture of the web is fundamental to this success,
and that it rests on principles of simplicity, transparency and
understandability, among others. Perhaps a principle of building towards
complexity in a layered, programmatic manner. I don't know exactly.

But clearly XML is already an improvement over binary formats that are
difficult to read and process. (The BouncyCastle Java/C# API is useful, but
I find it still really difficult to use, for example.)

Publishing documents that can be linked together is what creates network
effects, which clearly outweigh the differences in political power, allowing
the parties to agree on the logical minimum that is needed to get to the
next level.
 
If you can get yourself to think along these lines, you will see how all
these old technologies can be webified cleanly and simply (it has to be that
way) and slowly introduced into the global space. Things of course will look
very different from what was initially anticipated.

Henry

Social Web Architect
http://bblfish.net/
