Re: [webauthn] Signature format needs to be defined

The signature format is determined by the key's algorithm, which is specified [in the “alg” parameter](https://w3c.github.io/webauthn/#sec-attested-credential-data) during registration. So, for example, the [newly registered](https://w3c.github.io/webauthn/#sctn-cose-alg-reg) RSA algorithms won't produce a DER-structured signature, but rather a big-endian RSA group element, because that's what [RFC 8017 says](https://tools.ietf.org/html/rfc8017#section-8.2.1).
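
As a minimal sketch of what that means for a relying party (the pyca/cryptography library and the placeholder signed data below are purely illustrative assumptions, not anything the spec mandates): an RS256 signature is a single modulus-length octet string with no ASN.1 framing, and it is verified as-is.

```python
# Sketch: an RSASSA-PKCS1-v1_5 (RS256-style) signature is a raw,
# modulus-length, big-endian octet string per RFC 8017 -- no DER to parse.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
signed_data = b"authenticatorData || SHA-256(clientDataJSON)"  # placeholder

signature = private_key.sign(signed_data, padding.PKCS1v15(), hashes.SHA256())
assert len(signature) == 256  # exactly the modulus length, no ASN.1 framing

private_key.public_key().verify(
    signature, signed_data, padding.PKCS1v15(), hashes.SHA256()
)  # raises InvalidSignature on failure
```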

(Although it is worth noting that, as best I can tell, ES256 is defined [here](https://tools.ietf.org/html/rfc8152#section-8.1), and it says:

>The signature is encoded by converting the integers into byte strings of the same length as the key size.  The length is rounded up to the nearest byte and is left padded with zero bits to get to the correct length.  The two integers are then concatenated together to form a byte string that is the resulting signature.

However, both Chrome and Firefox actually return a standard, ASN.1 DER-encoded [RFC 3279](https://tools.ietf.org/html/rfc3279#section-2.2.3) ECDSA signature in this case.)
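
To illustrate the difference (again using pyca/cryptography as an assumed helper, with placeholder data): the DER Ecdsa-Sig-Value that Chrome and Firefox return can be handed to a DER-based verifier directly, while the raw r || s concatenation described by RFC 8152 would need to be re-encoded as DER first.

```python
# Sketch: handling both ES256 signature encodings a relying party might see.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.asymmetric.utils import (
    decode_dss_signature,
    encode_dss_signature,
)

private_key = ec.generate_private_key(ec.SECP256R1())
public_key = private_key.public_key()
signed_data = b"authenticatorData || SHA-256(clientDataJSON)"  # placeholder

# What the browsers return today: ASN.1 DER (RFC 3279 Ecdsa-Sig-Value),
# which can be verified directly.
der_sig = private_key.sign(signed_data, ec.ECDSA(hashes.SHA256()))
public_key.verify(der_sig, signed_data, ec.ECDSA(hashes.SHA256()))

# The RFC 8152 form would instead be two fixed-length big-endian integers.
r, s = decode_dss_signature(der_sig)
raw_sig = r.to_bytes(32, "big") + s.to_bytes(32, "big")  # 64 bytes for ES256

# Converting raw r || s back to DER before verification:
r2 = int.from_bytes(raw_sig[:32], "big")
s2 = int.from_bytes(raw_sig[32:], "big")
public_key.verify(encode_dss_signature(r2, s2), signed_data,
                  ec.ECDSA(hashes.SHA256()))
```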

-- 
GitHub Notification of comment by agl
Please view or discuss this issue at https://github.com/w3c/webauthn/issues/799#issuecomment-365323510 using your GitHub account

Received on Tuesday, 13 February 2018 16:35:41 UTC