Re: Proposal: Chrome privacy CA

Hi Christiaan,

> Hi Denis & Johan,
>
> Apologies for my slow response here.
>
>     For non-enterprise deployment, you are asking RPs to trust the
>     Privacy CA infrastructure which is unspecified, uncertified and
>     probably even unaudited.
>
> You're right, but I think if there was a requirement for auditing from 
> RPs we could get behind that.

Johan is indeed right. You are asking RPs to trust another 
infrastructure that is left "unspecified, uncertified and probably even 
unaudited".

With the Privacy CA infrastructure, Google would be introducing a 
potential /spying component/ into the model. The Privacy CA infrastructure 
would allow Google to gather information such as the identity of the 
token vendors and the rate of use of such hardware tokens. That 
information could then be used to perform a market survey for the 
benefit of Google and/or be sold to some unspecified parties. Google 
would be the only one able to perform such a market survey.

>  This is the weakest point in the proposal. The Privacy CA security 
> level essentially becomes the security level of all authenticators.
>
>     An RP can’t trust its attestation of a level 5 authenticator
>     unless the infrastructure that generated this attestation is also
>     certified to an equivalent security level.
>
> True, but I think this is a solvable challenge.

It is questionable whether this challenge really needs to be solved 
considering the previous point.

>  Why not make this a service you offer to your Chrome users but allow 
> them to override this at the request of the RP?
>
>
> Allowing tokens to be identified in this way will inevitably lead to 
> fragmentation and we'd like to try
> and move the ecosystem to a place where "all tokens meeting a certain 
> bar is considered equal".

The proposal below is able to solve the problem.
>
>     There is a simpler solution able to address the same concerns:
>
>     -For those RPs that need to know whether an authenticator conforms
>     to some security and functional requirements,
>     the use of an attestation certificate will be required.
>
>     -For those RPs that don't need to know whether an authenticator
>     conforms to some security and functional requirements,
>     the use of a self-signed certificate will be required.
>
>     This means that a given hardware authenticator should be able to
>     support, upon the request from the user or the client,
>     either an attestation certificate or a self-signed certificate.
>
>     There would be no need to introduce the concept of an "enterprise
>     environment".
>     There would be no need to support Privacy CAs.
>
>     This would eliminate privacy concerns with the Privacy CA which
>     would otherwise be able to gather information
>     about the use of the authenticators. This would also eliminate
>     bottlenecks to the Privacy CAs.
>
> While I agree with this, in practice I think many RPs will ask for 
> attestation just because they can.
> Many RPs (such as Google) don't necessarily care about attestation, 
> but we do feel that we'd like to identify batches of tokens if 
> revocation is ever needed.
> The privacy CA provides a model for that.

You said: "/many RPs will ask for attestation just because they can/". 
No, it is not that simple. In order to verify an attestation certificate,
the RP needs to obtain at least one root key per hardware token vendor 
and to make sure that these root keys are the right ones.
Presently, there is no specification that tells RPs where they can 
(securely) obtain these root keys.
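To illustrate the burden, here is a deliberately simplified sketch of 
the trust check an RP would have to perform. It models the batch 
attestation certificate as a bare key signed by a vendor root (a toy 
stand-in for full X.509 chain validation), using the Python 
`cryptography` package; all names are illustrative, not taken from any 
spec:

```python
# Toy model of the check an RP must perform before trusting an attestation:
# the batch attestation key must be certified (here: simply signed) by a
# vendor root key the RP already trusts. How the RP securely obtains
# `trusted_roots` in the first place is exactly the unspecified part.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat
from cryptography.exceptions import InvalidSignature

def spki(key: ec.EllipticCurvePublicKey) -> bytes:
    """Serialize a public key as DER SubjectPublicKeyInfo."""
    return key.public_bytes(Encoding.DER, PublicFormat.SubjectPublicKeyInfo)

def batch_key_is_trusted(batch_key, vendor_signature, trusted_roots) -> bool:
    """True iff some trusted vendor root signed this batch attestation key."""
    for root in trusted_roots:
        try:
            root.verify(vendor_signature, spki(batch_key),
                        ec.ECDSA(hashes.SHA256()))
            return True
        except InvalidSignature:
            continue
    return False
```

Every vendor added to the ecosystem means another entry in `trusted_roots` 
that each RP must somehow obtain and keep current.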

The easiest path for RPs will be to use self-attestations *and to verify 
these self-attestations*, because with the current protocol this is 
the only way to get a proof of possession (PoP) of the private key at 
registration time.
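To make the PoP point concrete, here is a sketch of such a check. It is 
illustrative only: the ECDSA-over-P-256 choice and the function name 
are my assumptions, not spec text, and extraction of the fields from 
the real CBOR/CTAP messages is omitted (using the Python `cryptography` 
package):

```python
# Sketch of a self-attestation check as proof of possession (PoP):
# the signature over (authenticatorData || clientDataHash) must verify
# under the newly registered credential public key itself.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.exceptions import InvalidSignature

def verify_self_attestation(credential_public_key: ec.EllipticCurvePublicKey,
                            authenticator_data: bytes,
                            client_data_hash: bytes,
                            signature: bytes) -> bool:
    """Return True iff the registrant holds the credential private key."""
    signed = authenticator_data + client_data_hash
    try:
        credential_public_key.verify(signature, signed,
                                     ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False
```

If this check passes, the RP has its proof of possession without 
learning anything about the token's make or batch, and without needing 
any vendor root keys at all.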

You also said: "/identify batches of tokens if revocation is ever 
needed/". In case of revocation, the first need is NOT to revoke a batch 
of tokens but to *revoke an individual token*. The Privacy CA model will 
be of no help in achieving that objective.

> The third sentence from the extract states:
>
>     These relying parties rarely (if ever) have the need to *uniquely*
>     *identify* authenticators.
>
>     As soon as the use of attestation certificates will be limited to
>     RPs that really have the need to know that an authenticator
>     conforms to some security and functional requirements, it will be
>     possible to avoid batch certificates and to allow authenticators
>     to be individually revoked.
>
>     It should be observed that RPs that need to know whether an
>     authenticator conforms to some security and functional requirements
>     usually already "know their customers" with their family name,
>     first name, address and even birth date, so using a hardware
>     authenticator that can be individually identified would not be a
>     major problem.
>

> This isn't the only problem we're concerned about. The FIDO ideal is 
> to have one token per user, that's used across the web without fear of 
> the device
> being trackable amongst RPs. In this example, multiple RPs colluding 
> will be able to know I'm using the same device for both services.
> We want to discourage the world where a user needs multiple tokens, 
> one per RP.

We are on the same track: I don't want users to have to use one hardware 
token per RP either. However, not all RPs are born equal. A difference 
can be made between:

      * RPs that "need to know their customers" and would also like
        to know whether an authenticator conforms to some security and
        functional requirements.
      * RPs that don't really need to know whether an authenticator
        conforms to some security and functional requirements. These RPs
        would not, in any way, be able to link their user accounts.

RPs that "need to know their customers" would indeed be able to make 
sure that they have the same customer by looking at the attestation 
certificate.
However, they would nevertheless be able to do so by using the family 
name, the first name, the address, the birth date and, most often, the 
email address.

Weighing the advantages of this simple model against its disadvantages 
is rather easy when compared with the drawbacks of the Privacy CA model.

>
>     This would simplify the overall model rather than adding an extra
>     level of complexity.
>
> That's true, but the risk of market fragmentation remains.

The Privacy CA infrastructure is itself likely to introduce market 
fragmentation, by giving a single vendor a position that other vendors 
will not be able to obtain.

Denis

>
> /christiaan
>
> On Mon, Nov 6, 2017 at 10:19 AM, Denis Pinkas <denis.w3c@free.fr 
> <mailto:denis.w3c@free.fr>> wrote:
>
>     Hi Christiaan,
>
>     Answering to the comments sent on the list by Johan Verrept  and
>     myself would be appreciated.
>
>     Regards,
>
>     Denis
>
>>     Hi folks,
>>
>>     In the interest of moving the spec forward and not causing any
>>     undue delays here I just want to make it clear that Google's
>>     intention is *not *to hold the current spec advancement of FIDO2
>>     to ID, or WebAuthN to CR, hostage. We fully support the current
>>     train, but want to ensure that we discuss and address these open
>>     issues in the next RDs and WDs. Please note that these issues are
>>     of utmost importance to us and that our implementation of the
>>     /current spec /will already take some of this thinking into account.
>>
>>     Regards,
>>     Christiaan
>>
>>     On Thu, Nov 2, 2017 at 9:40 AM, J.C. Jones <jc@mozilla.com
>>     <mailto:jc@mozilla.com>> wrote:
>>
>>         Mozilla definitely agrees that the AAGUID needs to move out
>>         of the signed payload for Privacy CA to function correctly.
>>         That is a change we feel needs to happen ASAP. Thank you for
>>         identifying the problem!
>>
>>         We're also considering internally to enforce either a Privacy
>>         CA or Self-Attestation mode on all authenticators when users
>>         are using our Private Browsing feature, and exposing the
>>         ability to enforce the same to Tor Browser. The UX
>>         implications here are still very TBD.
>>
>>
>>
>>         On Wed, Nov 1, 2017 at 3:51 PM, Hodges, Jeff
>>         <jeff.hodges@paypal.com <mailto:jeff.hodges@paypal.com>> wrote:
>>
>>             On 11/1/17, 9:21 AM, "Christiaan Brand"
>>             <cbrand@google.com <mailto:cbrand@google.com>> wrote:
>>
>>             > Please see attached a proposal from Google regarding the "Privacy CA" model that
>>
>>             > Chrome will be adopting.  ...  Please note that this document is a WIP,
>>             but I wanted to
>>
>>             > make sure we give everyone an early glimpse into our thinking so we could
>>             refine the
>>
>>             > proposal as we go along while making sure we have the necessary plumbing in
>>
>>             > WebAuthN to support this model.
>>
>>             here's a plain-text rendering of "External FIDO Privacy
>>             CA design.pdf"'s content, in case it is useful for
>>             replying/commenting/etc..
>>
>>             FIDO Privacy CA
>>
>>             Proposal
>>
>>             Adam Langley < agl@google.com <mailto:agl@google.com> >
>>
>>             Matt Braithwaite < mab@google.com <mailto:mab@google.com> >
>>
>>             Christiaan Brand < cbrand@google.com <mailto:cbrand@google.com> >
>>
>>             Alexei Czeskis < aczeskis@google.com <mailto:aczeskis@google.com> >
>>
>>             Dirk Balfanz < balfanz@google.com <mailto:balfanz@google.com> >
>>
>>             (October 2017)
>>
>>             This document is intended to inform other webauthn
>>             parties about our
>>
>>             plans and to encourage them to implement the changes to
>>             webauthn
>>
>>             that's required to support this model.
>>
>>             The problem
>>
>>             Webauthn and FIDO supports the concept of attestation:
>>             the ability
>>
>>             for a relying party to determine the provenance of the
>>
>>             authenticator-to-be-registered. We postulate that many
>>             consumers
>>
>>             services in reality don't care about provenance of
>>             authenticators
>>
>>             especially when deployed as a second factor, since the
>>             primary threat
>>
>>             model is scalable, remote attacks. However, in enterprise
>>
>>             environments there are usually the need for a much stricter
>>
>>             enforcement of authenticator types and as such it is
>>             important in
>>
>>             these cases for the device to disclose information about
>>             itself to
>>
>>             the RP.
>>
>>             In the current FIDO 1 world an RP would take the batch
>>             attestation
>>
>>             certificate sent by the authenticator and query the FIDO
>>             MDS to
>>
>>             determine the relevant attributes: whether the device has
>>             passed
>>
>>             protocol compliance testing, and perhaps the relevant
>>             security level
>>
>>             of the device. We want to make implementing
>>             attestation-checking
>>
>>             easier, so that sites are more likely to do it correctly
>>             and the
>>
>>             webauthn experience overall will be better for users.
>>
>>             The solution
>>
>>             Chrome intends to implement what we are calling a Privacy
>>             CA, but
>>
>>             which might be called an "attestation proxy". During
>>             webauthn and
>>
>>             U2F registrations the attestation certificate and
>>             signature from the
>>
>>             token will be sent to a Privacy CA, along with the hash
>>             of the signed
>>
>>             data. We are planning on following this same model for
>>             built-in
>>
>>             Authenticators on Android too, even when registrations
>>             are performed
>>
>>             by apps on the device.
>>
>>             The Privacy CA will:
>>
>>               1. Check the attestation signature, given the attestation
>>
>>                  certificate  and signed data hash.
>>
>>               2. Check the certificate (or other attestation) against
>>             its local
>>
>>                  policies.
>>
>>               3. If the signature is valid and the certificate is
>>             recognised, it
>>
>>                  will return a new packed attestation certificate and
>>             signature
>>
>>                  of the same hash to the Chrome instance.
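[For what it is worth, the three quoted steps can be sketched as 
follows. This is a toy model of my own, not the actual service: bare EC 
keys stand in for certificates, and the "local policies" are just a set 
of recognised keys (using the Python `cryptography` package).]

```python
# Toy model of the quoted three-step Privacy CA flow:
#   1. verify the token's attestation signature over the signed-data hash,
#   2. check the attestation against local policy,
#   3. re-sign the same hash under the Privacy CA's own key.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec, utils
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat
from cryptography.exceptions import InvalidSignature

def spki(key):
    return key.public_bytes(Encoding.DER, PublicFormat.SubjectPublicKeyInfo)

class PrivacyCA:
    def __init__(self, recognised_spkis):
        self._policy = recognised_spkis          # step 2's "local policies"
        self._ca_key = ec.generate_private_key(ec.SECP256R1())

    def reattest(self, attestation_key, signed_data_hash, token_signature):
        # Step 1: check the token's signature over the (prehashed) data.
        try:
            attestation_key.verify(
                token_signature, signed_data_hash,
                ec.ECDSA(utils.Prehashed(hashes.SHA256())))
        except InvalidSignature:
            return None
        # Step 2: check the attestation against local policy.
        if spki(attestation_key) not in self._policy:
            return None
        # Step 3: re-attest the same hash under the Privacy CA's own key.
        new_sig = self._ca_key.sign(
            signed_data_hash, ec.ECDSA(utils.Prehashed(hashes.SHA256())))
        return self._ca_key.public_key(), new_sig
```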
>>
>>             Chrome will pass the Privacy CA certificate to the
>>             calling Javascript
>>
>>             so that the token appears to be attested by the Privacy
>>             CA. If the
>>
>>             Privacy CA returns an error then Chrome will substitute a
>>             generic,
>>
>>             meaningless attestation certificate for U2F and, in webauthn,
>>
>>             potentially return a dummy attestation type.
>>
>>             The Privacy CA will support two levels of attestation:
>>             Basic and FIDO
>>
>>             Security Certification Level tbd (hardware attestation +
>>             some code
>>
>>             review coverage). In time, we intend for both of these to
>>             be defined
>>
>>             by the MDS, which the Privacy CA will reload regularly. In
>>
>>             exceptional circumstances (such as a security issue with
>>             a token that
>>
>>             should be responded to immediately), or in order to
>>             bootstrap the
>>
>>             system before the MDS is ready, we may augment the MDS data.
>>
>>             The two levels of attestation will be exposed as two
>>             different
>>
>>             attestation roots.
>>
>>             The certificates from the Privacy CA, and the randomly
>>             generated
>>
>>             certificates from Chrome, will copy the transport type
>>             extension from
>>
>>             the token's certificate.
>>
>>             The Privacy CA will not learn of the sites that a user is
>>             registering
>>
>>             with because it only receives the hash of the signed
>>             data, and that
>>
>>             hash includes a random challenge which blinds the
>>             included rpID.
>>
>>             Enterprise cases
>>
>>             We do acknowledge that there are other relying parties
>>             out there that
>>
>>             have an obligation to ensure that the authenticators they
>>             accept meet
>>
>>             a certain security and usability bar, while not
>>             necessarily having
>>
>>             control over the client platform. These relying parties
>>             rarely (if
>>
>>             ever) have the need to uniquely identify authenticators
>>             or even
>>
>>             authenticator vendors, but rather are interested in being
>>             able to
>>
>>             tell whether an authenticator conforms to some minimum
>>             requirement.
>>
>>             Chrome has a mature enterprise policy system. A policy
>>             control will
>>
>>             be added to allow a token's attestation certificate to be
>>             returned
>>
>>             directly to the calling Javascript for whitelisted rpIDs.
>>
>>             This will obviously not apply to clients that don't have the
>>
>>             enterprise policy installed, but we note that a token
>>             need only be
>>
>>             registered on a configured client. It can then be used in
>>             other
>>
>>             machines.
>>
>>             Retrospective unblinding of tokens
>>
>>             We understand that the ability to identify affected users
>>             when a
>>
>>             security issue with a token is disclosed is desired.
>>             While issues
>>
>>             with weak keys and bad key-handle construction can likely be
>>
>>             identified without attestation information, some cases
>>             cannot be
>>
>>             spotted that way.
>>
>>             In order to support this, Privacy CA certificates will
>>             contain an
>>
>>             extension containing a series of 32-byte values. The
>>             first 16 bytes
>>
>>             of each value will be random and the remaining 16 bytes
>>             will be the
>>
>>             truncated HMAC-SHA256 of that random value under a key.
>>             The HMAC key
>>
>>             will be specific to some property of the token's
>>             certificate, for
>>
>>             example the certificate itself, the Issuer name, or
>>             perhaps the
>>
>>             AAGUID. A given certificate can contain several such
>>             32-byte values
>>
>>             and thus may be identified by several different properties.
>>
>>             Google will maintain a public URL serving a JSON file
>>             containing the
>>
>>             HMAC keys corresponding to token certificates that are
>>             linked to a
>>
>>             known security issue. In this way, the Privacy CA can
>>             retrospectively
>>
>>             unblind Privacy CA certificates.
>>
>>             For example, if a specific batch of tokens is found to be
>>             flawed, the
>>
>>             HMAC key linked to the specific attestation certificate
>>             for that
>>
>>             batch can be published. RPs that wish to take special
>>             measures to
>>
>>             respond to the flaw can search their recorded attestation
>>
>>             certificates and, for those from the Privacy CA, look for a
>>
>>             value/HMAC pair that matches when using the published
>>             HMAC key.
>>
>>             Individual certificates
>>
>>             Since Chrome will have an enterprise policy control for
>>             direct
>>
>>             attestation, it can expose that signal to the token in
>>             case the token
>>
>>             should wish to use an individual attestation certificate
>>             in that
>>
>>             situation.
>>
>>             Would ECDAA be a better choice than a privacy CA?
>>
>>             Firstly, ECDAA is currently moot as millions of U2F
>>             tokens are
>>
>>             already deployed with batch certificates. We have to
>>             support them in
>>
>>             any case. Secondly, ECDAA is a smarter way to do batch
>>             attestation,
>>
>>             but it still inherently exposes vendor and, likely,
>>             model, and so
>>
>>             causes many of the same concerns as batch certificates.
>>
>>             Access to the Privacy CA
>>
>>             Expediency requires that Chrome's Privacy CA be run by
>>             Google, at
>>
>>             least at first. We are open to other browsers using our
>>             Privacy CA
>>
>>             should they so desire.
>>
>>             Requests from webauthn and FIDO
>>
>>             In priority order:
>>
>>             1. That the AAGUID be moved from the signed registration
>>             data to the
>>
>>             token's attestation certificate.
>>
>>             2. That an option be provided at registration time for
>>             sites to
>>
>>             indicate whether they "care" about attestation. If not, the
>>
>>             Privacy CA round-trip can be omitted. PR is here
>>             <https://github.com/w3c/webauthn/pull/636
>>             <https://github.com/w3c/webauthn/pull/636>>.
>>
>>             3. That the option default to false, i.e. so that people
>>             implementing
>>
>>             webauthn in the long-tail of sites and who will never
>>             care about
>>
>>             attestation, get the correct behaviour by default.
>>
>>             4. That the browser be able to add a blinding value
>>             that's included
>>
>>             in the signed registration data. (This eliminates the
>>             need for the
>>
>>             RP's registration challenge to have enough entropy to
>>             blind the
>>
>>             rpID from the Privacy CA.) This does not require a change
>>             to the API
>>
>>             and is simply something the browser could do today.
>>
>>             5. A boolean in the CTAP2 registration message to
>>             indicate to tokens
>>
>>             that individual attestation certificates may be used.
>>
>>             end
>>
>>
>>
>
>

Received on Tuesday, 7 November 2017 10:03:38 UTC