W3C home > Mailing lists > Public > public-xg-webid@w3.org > February 2011

RE: [keyassure] publishing the public key

From: peter williams <home_pw@msn.com>
Date: Sun, 20 Feb 2011 09:34:07 -0800
Message-ID: <SNT143-ds51387C80CD7E9561AC7B092D60@phx.gbl>
To: "'Henry Story'" <henry.story@bblfish.net>
CC: "'WebID Incubator Group WG'" <public-xg-webid@w3.org>
NSA infamously defined a value type for subjectPublicKey with two keys within
(and other facts about those keys, such as security labels), leveraging what
the embedding notation for "spk in a bit-string wrapper-value"
provided. (The notation for certs in 1986 recognized that different formats
for public key(s) existed in the military key-fill-centric crypto hardware
of the day, and this wrapper technique allowed for variations of key-fill
formats.) This was at a time when there was no formal extensibility model
for the X.509 cert type. (There actually was one, but folks would argue
endlessly about whether it really existed, or whether it would be appropriate
to use it if it did exist; so nothing happened...)

It was also done at a time when there was no expectation that any one entity
would have multiple EE certs per named user/entity. This assumption was
eventually challenged, and folks successfully argued for a change when X.509
standardized a framework not only for the authn cert but for the authz cert
(a shameless ISO-ization of the ECMA/Kerberos PAC). Having challenged the
"singular cert is everything" notion, even the singular authn cert split -
between certs with keys for signing and certs for orchestrating key exchange
(in the likes of the SSL handshake). A further rationale developed, splitting
SSL client authn-grade signing (e.g. WebID) from NR-grade signing (in
S/MIME, CMS, and CRMF in the Mozilla world). In the ephemeral ciphersuites
of SSL, further splits happened, allowing temporary certs to be signed. In the
world of W3C-era SSL MITM proxy firewalls, one sees further cert
re-signings, as authority-spoofing firewalls are introduced by corporate
gateways and ISPs (for home users). In the Windows world there is strong
splitting, where different certs (with the same public key) bearing different
applicationKeyUsage tags can drive and distinguish ten (or n) different https
applications (of which the browser is just one class).

By this time, the FBI was paranoid about folks having key-exchange keys that
were not "controlled" by governments, oppressive or otherwise. NSA found a
rationale that enabled it to help out its immature cousin agency, given to
a control-and-threat culture rather than a trust-and-security culture -
distinguishing at least one key type in an authn cert (the NR key) that even
a military control culture would not want to hold in escrow, since escrowing
it undermined the dignity of the individual service member.

NR keys were thus distinguished from signing keys, which were distinguished
from key-agreement keys. Thus came about the keyUsage, extendedKeyUsage, and
later the application-centric usage extensions that could flag these purposes.
(There are three because some folks would argue endlessly
about... existence, and various religious positions that induced community
schisms. One US agency actively funded those schisms, too (anyway... it didn't
matter between whom or over what), simply to slow down the rate of
cryptographic usage while it developed subversion countermeasures for
commodity crypto.)

So now you understand that there are various ways to embed control systems
[designs] into certs, each type leveraging the notation's design to borrow
its crucial canonicalization and signing feature (and its support and use in
a billion or two PCs).

If you want to leverage the 1986-88 era X.509 design model in the web, you
still can - by NOT conforming with PKIX - by using the v1 cert (vs the
IETF's v3-only cert). Do so, and you define your own subjectPublicKey type,
serialize values, and wrap the resultant bytes as the content of an ISO
BIT STRING (stick tag 03, a length, and an unused-bits octet on the front).
Obviously, the average SSL server will be unable to parse it, having no code
for the new value.
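That wrapping step can be sketched in a few lines of Scala. This is an
illustrative minimal DER encoder, not library code (the object name
BitStringWrap is mine); it assumes the wrapped content stays under 256 bytes,
and it prepends the BIT STRING tag (0x03), a length, and the leading
unused-bits octet that DER requires:

```scala
// Minimal sketch of wrapping arbitrary key bytes as a DER BIT STRING,
// as a v1-era custom subjectPublicKey would require. Assumes < 256 bytes.
object BitStringWrap {
  def wrap(content: Array[Byte]): Array[Byte] = {
    // DER BIT STRINGs lead with a count of unused bits; zero for whole octets
    val inner = 0.toByte +: content
    val header =
      if (inner.length < 128) Array(0x03.toByte, inner.length.toByte)
      else Array(0x03.toByte, 0x81.toByte, inner.length.toByte) // long form
    header ++ inner
  }
}
```

A server with no code for the new inner value would, as noted, simply fail to
parse what this produces.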

If you want, you can define a type that is a UTCString, which itself embeds
some JavaScript as its value. Wrapped in a bit-string, it substitutes for
the type/value recognized by servers today. Once again, the code lockup in
your average SUN server will not recognize it, and there are no extensibility
points from that class of vendor.

All pretty trivial stuff (stuff a string in a string); but exactly the kind
of world I wanted when I wrote our "how-to cert" book for VB programmers -
wanting this lowly programmer to feel empowered when designing crypto/key
management systems (since academic-type folks who can say "ontology" are too
easily subverted with research grants, removal of travel privileges,
blackballing from the Royal Society of this or that, etc.).

In the v3 cert era (2000 and later), it's easier (really, honest... no
guff...) to embed JSON using the v3 extensibility framework rather than the
old-fashioned spk field - stuffing the JavaScript/JSON in a v3 extension
(string in a string... as in certs for dummies), which is the more modern way
AND what the framework is specifically for (letting you design your own
control system). It has the advantage over the v1 model that it breaks MUCH
less operational stuff (a benefit which is probably irrelevant here, since
breaking the IESG's work on CAs (simply as some kind of gesture) is part of
the unstated mission).
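The "string in a string" shape can be sketched too. This is a hypothetical
helper (the names JsonExtension and tlv are mine, not any library's) that
produces the DER layout a v3 extnValue takes - an OCTET STRING (tag 0x04)
wrapping, here, a UTF8String (tag 0x0C) holding the JSON - restricted to
short-form lengths, so payloads under 128 bytes:

```scala
// Hedged sketch of a JSON payload packed as UTF8String-inside-OCTET-STRING,
// the DER shape of a custom v3 extension value. Short payloads only.
object JsonExtension {
  private def tlv(tag: Int, body: Array[Byte]): Array[Byte] = {
    require(body.length < 128, "short-form DER length only in this sketch")
    Array(tag.toByte, body.length.toByte) ++ body
  }
  def encode(json: String): Array[Byte] =
    tlv(0x04, tlv(0x0C, json.getBytes("UTF-8"))) // OCTET STRING { UTF8String }
}
```

Assigning the extension a private OID and dropping these bytes into a v3 cert
is then the CA's (or your own toolchain's) job.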

This is all good in the bigger picture, as the IETF PKIX work is only a part
of what the https world actually supports.

There are also other fallbacks once the self-signed cert is "removed" by the
Mozilla Foundation et al. CRLs and OCSP (and perhaps the WebID protocol
that apes their "status service") are well supported now by the commodity
infrastructure. Thus, one can indirectly extend certs these days without
worrying about the signing block, since the (authenticated) status message
can add, by contextual reference, that which the CA-signed cert will probably
not be allowed to signal directly. Just like a universal Turing machine, one
just has to encode in the subject name of the cert a (coded) denotational
reference that authenticates the third-party source to the relying party.
One boots past the control system, then, with an indirection (or two...).
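One hedged way to picture that indirection: assume the coded reference is
simply a URI carried in the subject CN (an illustrative convention of my own,
not anything standardized), which the relying party extracts and then
dereferences instead of trusting extra fields in the cert itself:

```scala
// Sketch: pull a URI-shaped reference out of a cert's subject DN.
// The CN-carries-a-URI convention is assumed for illustration only.
import javax.security.auth.x500.X500Principal

object SubjectRef {
  def uriFromSubject(subject: X500Principal): Option[String] =
    subject.getName(X500Principal.RFC2253)
      .split(',')
      .map(_.trim)
      .collectFirst { case rdn if rdn.startsWith("CN=") => rdn.stripPrefix("CN=") }
      .filter(_.contains("://")) // keep only URI-shaped values
}
```

The relying party would then fetch that URI (over an authenticated channel)
and let the response, not the cert body, carry the extended claims.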

-----Original Message-----
From: public-xg-webid-request@w3.org [mailto:public-xg-webid-request@w3.org]
On Behalf Of Henry Story
Sent: Sunday, February 20, 2011 5:16 AM
To: keyassure@ietf.org
Cc: WebID Incubator Group WG; Peter Gutmann
Subject: Re: [keyassure] publishing the public key

  I am CCing this to the WebID XG Group because it is also relevant there
  to ISSUE-39: "Simplify how public keys are expressed"
  which is discussing something very similar to this issue
  For people on that list the thread here is

On 20 Feb 2011, at 04:01, Peter Gutmann wrote:

> Henry Story <henry.story@bblfish.net> writes:
>> Is that the same as X509? 
> It *is* X.509, just used as a key bag (with optional attributes).

A bag with only one key though. Sounds more like a singleton.
Just reading RFC5280, I don't see the option of putting more than one key in
there https://datatracker.ietf.org/doc/rfc5280/

(Though I suppose extensions could be designed for that, though to what

Anyway, here is how we in DANE define the public key, exactly as in RFC 5280:

SubjectPublicKeyInfo  ::=  SEQUENCE  {
        algorithm            AlgorithmIdentifier,
        subjectPublicKey     BIT STRING  }

Then one just does as they do and points people to more RFCs:

[[  Subject Public Key Info

   This field is used to carry the public key and identify the algorithm
   with which the key is used (e.g., RSA, DSA, or Diffie-Hellman).  The
   algorithm is identified using the AlgorithmIdentifier structure
   specified in Section  The object identifiers for the
   supported algorithms and the methods for encoding the public key
   materials (public key and parameters) are specified in [RFC3279],
   [RFC4055], and [RFC4491].
]]

>> The code to select the subset is going to be at most a few lines.
> How do you get from CertCreateContext() to 
> turn-this-encoded-blob-into-a- public-key-context?

Hey, that's a clever way to get free programming time from people: Taunt
them that they can't do something :-) And you get the added pleasure of
forcing me to learn ASN.1, which I have been trying to avoid... 

Note how decoding the key is three lines of code.

So here it goes in Scala (requires Sun JVM).

// create an RSA Key

import java.security.spec._
import java.math.BigInteger
val keySpec = new RSAPublicKeySpec(
  new BigInteger("..."), // modulus string elided in the archive
  new BigInteger("65537"))

import java.security.KeyFactory
import java.security.interfaces._
val keyFactory = KeyFactory.getInstance("RSA")
val rsaKey = keyFactory.generatePublic(keySpec)

// Base64 encode it, ready for publication

import sun.misc.BASE64Encoder
new BASE64Encoder().encode( rsaKey.getEncoded )

// In pastebin ( http://pastebin.com/TEpMBJK5 )
// you will see the base64 string this returns

// Decode the key, which could have been found in DNSsec

import sun.security.util.DerValue
import sun.security.x509.X509Key
val der = new DerValue( rsaKey.getEncoded )
val newKey = X509Key.parse(der)

The output from the scala shell is shown in 

>> Currently we are not asking to remove the other options. Just to see 
>> if this option is possible, and to work out what the advantages and 
>> disadvantages would be.
> Well I'm OK with that, as long as it's made optional so implementers 
> can ignore it at their leisure.  Putting it in a separate RFC would 
> make this even easier.

Now that the argument that this is so difficult has been shown to be wrong,
I think we can perhaps push the discussion further on this. The reasons for
or against this here won't be the same as on the WebID XG, but it should be
instructive nonetheless.

> Peter.

Social Web Architect
Received on Sunday, 20 February 2011 17:34:50 UTC

This archive was generated by hypermail 2.3.1 : Tuesday, 6 January 2015 21:06:22 UTC