RE: Proposal: Chrome privacy CA

OK, this WG is not here to discuss the merits or the design of a Privacy CA. If there are certain changes needed to WebAuthn to allow a particular attestation type to be used, then let’s have that discussion; otherwise we should take this discussion elsewhere.

From: Denis Pinkas [mailto:denis.w3c@free.fr]
Sent: Thursday, November 9, 2017 2:29 AM
To: Adam Langley <agl@google.com>
Cc: W3C Web Authn WG <public-webauthn@w3.org>
Subject: Re: Proposal: Chrome privacy CA

Hi Adam,
On Wed, Nov 8, 2017 at 9:02 AM, Denis Pinkas <denis.w3c@free.fr<mailto:denis.w3c@free.fr>> wrote:
By default, self-attestations should be used. So nothing at all would be given away. This is what comes out when you apply "Privacy by Design" and "Privacy by Default" rules.

By "self-attestation" I'm assuming that you have in mind that the private key itself signs the registration data, as opposed to some attestation key in the token. This would show possession of the private key.

However, no U2F device that I'm aware of does this and the CTAP2 spec doesn't have this either, I think. So, while I'd be very happy for tokens to do that by default, that ship may have sailed.
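
To make that concrete, here is a minimal sketch (TypeScript with WebCrypto; all names are illustrative, and it assumes an ES256 credential whose DER-encoded ECDSA signature has already been converted to the raw r||s form WebCrypto expects) of what an RP would check if the credential private key itself signed the registration data:

  // Verify a "self-attestation" style registration signature with the
  // credential public key itself, rather than with a separate attestation key.
  async function verifySelfAttestation(
    credentialPublicKey: CryptoKey,  // imported from the registration response
    authenticatorData: Uint8Array,
    clientDataHash: Uint8Array,      // SHA-256 of the client data JSON
    rawSignature: Uint8Array,
  ): Promise<boolean> {
    // The signed message is authenticatorData || clientDataHash.
    const signed = new Uint8Array(authenticatorData.length + clientDataHash.length);
    signed.set(authenticatorData, 0);
    signed.set(clientDataHash, authenticatorData.length);

    return crypto.subtle.verify(
      { name: "ECDSA", hash: "SHA-256" },
      credentialPublicKey,
      rawSignature,
      signed,
    );
  }

A successful check here is exactly what demonstrates possession of the private key, without any attestation key being involved.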

But we might agree that privacy by default makes sense. Our requests (2) and (3) say that, by default, RPs should not get attestation data. In that case, no information need be sent to any Privacy CA.

You say that "by default, RPs should not get attestation data". I am not sure how to interpret this sentence.
An attestation is always needed, either a certificate attestation (where currently a batch of hardware tokens share the same private key)
or a self-attestation (where a self-signed certificate is generated using the private key used to open the account).
This means that the RP must always get and verify attestation data.



Revocation of a specific token is currently impossible in FIDO since all RPs are considered to be equal. If you remove this equality, then a whole new set of possibilities emerges.
So saying that it is a "non-goal of FIDO" may not be an appropriate statement. From a functional point of view, it would be nice to be able to revoke a specific token,
and this is possible as soon as you introduce a distinction between:

  *   RPs that need to know whether an authenticator conforms to some security and functional requirements,
and in this case the use of an attestation certificate will be required.
  *   RPs that don't need to know whether an authenticator conforms to some security and functional requirements,
and in this case the use of a self-signed certificate will be required.
This means that a given hardware token should be able either to generate a self-signed certificate or to use its attestation certificate, upon request from the client.

I think we're in agreement there, although we do have to deal with the existing population of tokens in the world. Thus we're asking for a flag from RPs to indicate which of those two groups they're in (point #5), and we'll have Chrome generate a dummy attestation certificate for RPs that don't care about requirements. That's not as good as having the token sign the registration message with the registration private key, but it's the best that we can do with the current population of tokens.
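
As an illustration of such an RP-side flag (a sketch only: the attestation member shown follows WebAuthn's attestation conveyance preference, and everything else is placeholder registration data):

  // "none" lets the browser strip or replace the attestation (e.g. with a
  // dummy certificate); "direct" asks for the token's real attestation.
  async function register() {
    const credential = await navigator.credentials.create({
      publicKey: {
        rp: { id: "example.com", name: "Example RP" },
        user: { id: new Uint8Array(16), name: "alice", displayName: "Alice" },
        challenge: crypto.getRandomValues(new Uint8Array(32)),
        pubKeyCredParams: [{ type: "public-key", alg: -7 }], // ES256
        attestation: "none", // the default, privacy-preserving choice
      },
    });
    return credential;
  }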

It is not a good way at all to solve the problem. By doing so, you introduce an easy way to create a bunch of dummy registrations,
because at registration time you do not even know whether the user knows the private key corresponding to the public key that is being registered.
In other words, you do not have any PoP (Proof of Possession of the private key) at registration time. Adding a signature of the registration message
with the private key would be the smartest and the simplest way to correct the problem, rather than, as currently, asking to create a self-signed certificate
to keep a single profile for the registration protocol.


If CTAP2 were to define a "no-op" attestation form that tokens would use on request, we would be happy with that.
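
For illustration, such a no-op attestation statement could be as simple as the object below (this is the structure that would be CBOR-encoded; it mirrors the empty-attStmt shape of a "none" attestation format):

  // Hypothetical shape of a "no-op" attestation object: a format identifier,
  // an empty attestation statement (no signature, no certificates, nothing
  // that could link tokens together), and the authenticator data carrying
  // the newly created credential.
  const noOpAttestationObject = {
    fmt: "none",
    attStmt: {},
    authData: new Uint8Array(), // authenticator data incl. the new credential
  };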



Hardware authenticators do not last forever. Since current hardware authenticators are unable to support revocation,
it may be time to think of an architecture able to support revocation. My proposal paves the way for such an architecture.
The topic of the FIDO Alliance webinar of Wednesday, September 20, 2017 was: "NIST 800-63 Guidance and FIDO Authentication".

The agenda was:

  *   Introduction/Updates on FIDO Alliance
  *   Overview of NIST 800-63-3
  *   Exceeding NIST Guidance with FIDO Authentication
One of the speakers was Paul Grassi, Senior Standards and Technology Advisor, NIST.
According" to NISP SP 800-63A, FIDO hardware authenticators are in the category of "Multi-Factor Cryptographic Device Authenticators".
However currently FIDO authenticators do not comply with a requirement from section 5.1.9.1 of NISP SP 800-63A which states:
5.1.9.1 Multi-Factor Cryptographic Device Authenticators
Multi-factor cryptographic device authenticators use tamper-resistant hardware
to encapsulate a secret key that is unique to the authenticator.
FIDO authenticators do not encapsulate a private key that is unique to the authenticator, but one that is shared by a batch of authenticators.
FIDO authenticators also do not comply with another requirement, from section 5.2.1 of NIST SP 800-63B, which states:
5.2.1 Physical Authenticators

CSPs SHALL provide subscriber instructions on how to appropriately protect the authenticator against theft or loss.
The CSP SHALL provide a mechanism to revoke or suspend the authenticator immediately upon notification from subscriber
that loss or theft of the authenticator is suspected.

If revocation is "a non-goal" for FIDO authenticators, is it reasonable to claim at the same time that FIDO authenticators are exceeding NIST Guidance?

Both within an enterprise environment and in an Internet environment, revocation is an important concern. It is time to address and solve this issue.



If you want to remain compatible with the current situation, you always request the use of an attestation certificate ... if, in some way or another,
you know that the same certificate is being used for a batch of at least 100K tokens. BTW, today there is no means defined to know whether an attestation certificate
applies to a single token or to a batch of tokens.

Making the distinction between two kinds of RPs opens the door to new possibilities, in particular, the revocation of a specific token.
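
To illustrate the point (a sketch only, with illustrative names): once each token presents a unique attestation certificate, an RP or CSP can meaningfully maintain a revocation list for lost or stolen devices.

  // Minimal sketch of per-token revocation. In practice this would be a
  // CRL/OCSP response or a CSP-maintained database, not an in-memory set.
  const revokedAttestationSerials = new Set<string>([
    // serial numbers of attestation certificates reported lost or stolen
  ]);

  function isTokenRevoked(attestationCertSerialHex: string): boolean {
    // With a shared batch certificate this check is useless: revoking one
    // serial would revoke ~100K tokens at once. With a unique certificate
    // per token, it revokes exactly the lost or stolen device.
    return revokedAttestationSerials.has(attestationCertSerialHex.toLowerCase());
  }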


That is a balance between the wider interests of common users (who do not wish to have a tracking token) and the interests of closed, enterprise deployments,
which might like to be able to identify specific tokens, track them in inventory systems, etc.

The balance is not simply between two parties (common users and RPs) but between three parties:

  *   common users,
  *   RPs that need to know whether an authenticator conforms to some security and functional requirements, and
  *   RPs that don't need to know whether an authenticator conforms to some security and functional requirements.
Item #5 in our requests attempts to reconcile this a little differently: it envisions a signal to tokens that they are being used in a context where individual attestation is appropriate, so that they might then return a different, unique attestation certificate.

By default, the use of an attestation certificate (whether unique or shared) would be inappropriate. By exception (i.e. upon request), it would be appropriate.


In Chrome, that signal would be wired up to our enterprise policy to allow administrators to enable direct attestation for enumerated RP IDs.
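
A sketch of how such a policy gate could behave in the browser (the policy data and names below are purely illustrative, not Chrome's actual implementation):

  // Direct (individually identifying) attestation is only passed through for
  // RP IDs an administrator has explicitly enumerated; everything else gets
  // the privacy-preserving treatment discussed above.
  const permittedDirectAttestationRpIds = new Set<string>([
    "corp.example.com", // hypothetical enterprise RP ID from admin policy
  ]);

  function mayReturnDirectAttestation(rpId: string, requested: string): boolean {
    return requested === "direct" && permittedDirectAttestationRpIds.has(rpId);
  }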

FIDO is intended to be usable on the Internet, not only within an enterprise environment.
Within an enterprise environment, the added value of FIDO will be "ease of use" and "security"; it will not be "privacy".
Within an enterprise environment privacy is not the major concern, but on the Internet it is.

I do not object to any of that. While I don't think tokens should use individual attestation certificates by default (because not all browsers will use a Privacy CA in all likelihood), we've no problem with them doing so when signalled that they're in an enterprise environment.

Coming back to the Privacy CA proposal, you postulate that many consumer services in reality don’t care about the provenance of authenticators. That's true.

You also consider that, "in enterprise environments, there are usually the need for a much stricter enforcement of authenticator types
and as such it is important in these cases for the device to disclose information about itself to the RP". I would not phrase it this way,
but I would say that some RPs (whether or not in an enterprise environment) need a much stricter enforcement of authenticator types.

So the requirement is not necessarily associated with the enterprise environment.

You are trying to find a solution for hardware authenticators that are not compliant with NIST SP 800-63.

I am proposing a solution for hardware authenticators that would be compliant with NIST SP 800-63.

Since the functionalities of the Google Privacy CA are going to remain private and not verifiable by independent parties,
the information that could be gathered, as well as its use, is going to be unknown. This is a serious concern.

If the Privacy CA were deployed, it would introduce a Big Brother component into the FIDO architecture.
Such a component should be and can be avoided.

If the project is going to proceed anyway, then at a minimum the drawbacks of the approach should be mentioned in the proposal.
Denis


Cheers

AGL

Received on Thursday, 9 November 2017 12:38:46 UTC