Exactly what is cryptographic hardening? (was: Re: Updating SafeCurves for 2022...)

On 5/26/22 4:49 PM, John, Anil wrote:
> If the (2) scenario is valid, does it automatically include (1)? And what
> are the trade-offs and considerations that are involved in deploying
> something like (2) into the ecosystem?

TL;DR: There is a stark difference in outcomes between the two approaches.
"Cryptographic agility" is reactive, while "cryptographic hardening" is
proactive. Most professional security compliance schemes have a checkbox for
"supports cryptographic agility" (reactive), but they never actually test that
agility at scale across an ecosystem -- which is exactly what cryptographic
hardening is about: being proactive.

This email, which should've really been a blog post (sorry!), explores the
security theatre around the concept of cryptographic agility with respect to
signed artefacts, such as Verifiable Credentials.

What is "Cryptographic Agility"?
--------------------------------

When you get into an argument with anyone who cares about security...
cryptographic agility is often ill-defined and wielded as a cudgel.

"Oh, but where is your 'cryptographic agility'!? You have to be prepared for
when your cryptographic scheme fails!"

This argument has, misguidedly, been used against the Data Integrity approach
(of having tightly focused cryptosuites) for years now.

... and so, kitchen-sink specifications are created, such as the list of
acceptable cryptographic ciphers in TLS and the list of supported curves in
JOSE (JWA, specifically). These run counter to the notion of having tightly
scoped, auditable cryptography code. The design of ed25519 was partly a
reaction to all the variability and footguns of the cryptographic schemes
that came before it.

As far as we know, as of today, there's not much to get wrong with ed25519...
but there are a lot of ways you can get in trouble with RSA. We shouldn't give
application developers control over knobs and levers that they don't have the
training to modify. It's like putting a car driver behind the controls of a
bulldozer and hoping they don't cause any damage. The JOSE stack suffers from
this design flaw, IMHO -- TLS 1.3 has learned from this and is getting better
by reducing variability in the newer cipher suites.

... and all this hand-wringing over "cryptographic agility" completely misses
the bigger danger: ecosystems that are cryptographically agile on paper have
demonstrated themselves to be the opposite when it comes time to upgrade, due
to environmental factors like FIPS compliance requirements, HSM limitations,
and IT teams that are terrified to upgrade their infrastructure.

So, what do people mean when they say "cryptographic agility"? I suggest that
they mean item 1: "demonstrate the ability to be reactive and upgrade if it
ever comes to that", but not item 2: "be proactive and always deploy multiple
signatures as a best practice". If you do item 2, you cover item 1, but in a
way that has far better readiness outcomes.

Is Cryptographic Agility Alone Enough?
--------------------------------------

Those of us who use security libraries have been trained to think in terms of
"when X breaks, be ready to deploy Y." So, when SHA-1 is broken, be ready to
deploy SHA-256... and that is what happened, and the rollout took /years/
(it's still going on). That's the failure mode of modern "cryptographic
agility"... and it's a common occurrence in ecosystems that have digitally
signed assets (like X.509 certificates).

For example, SHA-1 was broken around 2005[1], and NIST formally deprecated
its use in 2011, disallowing its use for digital signatures in 2013...
DigiCert didn't stop using SHA-1 until December 1, 2020[2]. They were not
alone -- many Certificate Authorities were dealing with legacy deployments
and could not upgrade on any reasonable time frame. That's 7 years between
NIST disallowing the algorithm and Certificate Authorities catching up. The
delay happened because the ecosystem wasn't ready to switch... there were
just too many legacy systems at play. So, while everyone ticked the
"cryptographic agility" checkbox... the industry as a whole was anything but
agile.

An ecosystem demonstrating that it is capable of cryptographic agility isn't
enough, because when the time comes, systems have to be upgraded -- and
mission-critical systems often cannot be upgraded overnight. The problem
isn't whether any single system is cryptographically agile -- we can all
throw a kitchen sink of cryptography libraries at our systems and proclaim:
"Of course we can digitally sign using all these different mechanisms! We are
cryptographically agile! We are ready!" The real problem is that there are
all these digitally signed artefacts floating around in an ecosystem, and
when there is a compromise, EVERYONE has to go and update all of those
digitally signed artefacts. That takes time. It takes time for the industry
to agree on the new path forward and deploy it to mission-critical systems.
It takes years to upgrade mission-critical systems.

Now, people will argue that there are test suites that exercise TLS
interoperability across a wide variety of ciphersuites, and that many
implementations pass those tests... and that is true. TLS implementations
tend to do a pretty good job of being agile, AND highly paid security teams
tend to be far more ready to switch their servers to newer ciphersuites.
However, there are very large swaths of the Internet that don't have
dedicated security teams, whose IT admins are stretched thin -- they don't
know the appropriate ciphersuites to use this year, or which ones have
critical vulnerabilities, and so on.

Verifiable Credentials are in a different space than TLS, though. Our
ecosystem generates long-lived cryptographic artefacts, much like X.509
domain certificates. TLS (the connection-level stuff) tends to be far easier
to upgrade (because it's mostly point-to-point) than a certificate (which is
mostly one-to-many). There are other reasons as well, such as the application
layer where VCs are used being far more complex than a low-level TLS
connection.

All that to say: cryptographic agility is probably not good enough for our
ecosystem -- or, at the very least, we should do better than being reactive,
because we now have technologies that allow us to be proactive instead.

Can we get to "Cryptographic Hardening"?
----------------------------------------

What if we were to digitally sign a Verifiable Credential using multiple
cryptographic schemes? For example, secp256r1 (NIST-approved), BBS+ (for
selective disclosure), and SPHINCS+ (post-quantum secure).
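
To make that concrete, here's a rough sketch (in TypeScript) of a credential
carrying three parallel Data Integrity proofs -- a "proof set". The
cryptosuite identifiers and proof values below are illustrative placeholders,
not normative names:

// Illustrative only: one credential, three independent signatures.
// Cryptosuite names and proofValue strings are placeholders.
const credential = {
  "@context": ["https://www.w3.org/2018/credentials/v1"],
  "type": ["VerifiableCredential"],
  "issuer": "did:example:issuer",
  "credentialSubject": { "id": "did:example:subject" },
  "proof": [
    // NIST-approved scheme, for compliance
    { "type": "DataIntegrityProof", "cryptosuite": "ecdsa-secp256r1",
      "proofValue": "z3Fa..." },
    // selective disclosure
    { "type": "DataIntegrityProof", "cryptosuite": "bbs-plus",
      "proofValue": "z8Qu..." },
    // post-quantum secure
    { "type": "DataIntegrityProof", "cryptosuite": "sphincs-plus",
      "proofValue": "z9Jx..." }
  ]
};

If any one of these schemes is broken, a verifier can ignore that proof and
rely on the remaining ones.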

For the purposes of this discussion, I'm going to call this approach
"cryptographic hardening" to contrast it with "cryptographic agility". The
cryptographic hardening approach ensures that we are proactive when we
digitally sign long-lived Verifiable Credentials in our ecosystems, so that
we can improve upon the 7-year upgrade cycle that the Certificate Authorities
fell victim to when SHA-1 was found to be compromised.

So, if secp256r1 is ever broken, we still have the other two schemes to fall
back on while systems go through a multi-year upgrade cycle.
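
The fallback logic is straightforward to express in code. A minimal sketch,
assuming a hypothetical registry of per-cryptosuite verify functions (the
stubs below stand in for real implementations):

type Proof = { cryptosuite: string; proofValue: string };
type Verify = (doc: object, proof: Proof) => boolean;

// Suites this verifier still trusts. When secp256r1 is broken, its
// entry is simply removed; credentials carrying a proof set remain
// usable via their other signatures.
const trustedSuites: Map<string, Verify> = new Map([
  ["bbs-plus", (_doc, _proof) => true],      // placeholder stub
  ["sphincs-plus", (_doc, _proof) => true],  // placeholder stub
]);

function verifyCredential(doc: object, proofs: Proof[]): boolean {
  // Accept if ANY proof from a still-trusted suite verifies.
  return proofs.some(p => {
    const verify = trustedSuites.get(p.cryptosuite);
    return verify !== undefined && verify(doc, p);
  });
}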

Cryptographic hardening has another benefit: it allows more modern,
experimental cryptographic schemes to be deployed in parallel with older,
more "officially approved" schemes. So, an organization that must issue its
credentials using NIST-approved secp256r1 crypto might also be able to issue
the same credential with BBS+ or SPHINCS+ signatures in parallel. It gets to
be compliant, while allowing any other entity in the ecosystem to "take the
risk" and consume the newer BBS+ and SPHINCS+ crypto.

This might just be what NIST needs to enable more modern cryptography to be
used beside cryptography that they have approved.

So, what are the downsides?
---------------------------

For one, it's more complicated. Protocols that use cryptographically hardened
Verifiable Credentials will need to know when to use one signature over
another. But this is a fairly straightforward problem to solve, and one we
must solve anyway for verifiers: the verifier simply states the types of
signature schemes it accepts, and the digital wallet only sends those
signatures over with the VC.
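
A sketch of that wallet-side selection (names hypothetical): the verifier
advertises the cryptosuites it accepts, in preference order, and the wallet
filters the credential's proof set down to matching proofs:

// Same illustrative Proof type as in the earlier sketch.
type Proof = { cryptosuite: string; proofValue: string };

// Keep only the proofs the verifier accepts, ordered by the
// verifier's stated preference.
function selectProofs(proofs: Proof[], accepted: string[]): Proof[] {
  return accepted
    .map(suite => proofs.find(p => p.cryptosuite === suite))
    .filter((p): p is Proof => p !== undefined);
}

// e.g., a verifier that only accepts NIST-approved suites:
//   selectProofs(credential.proof, ["ecdsa-secp256r1"]);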

The other issue is that some cryptographic envelope formats don't support
cryptographic hardening. X.509 only supports one signature, as do JWTs. The
Data Integrity specifications, by contrast, support attaching multiple proofs
to the same document, as illustrated above.

... and, like just about everything in software engineering, it comes down to
the trade-offs we want to make.

As a community, we have a requirement to support NIST-approved cryptography,
but also to support things like selective disclosure (BBS+), and I'm pretty
sure some of us will want post-quantum signatures well in advance of HSMs
supporting them. So, cryptographic hardening feels like an option with some
compelling upsides and some manageable downsides.

Thoughts? Are there other benefits or drawbacks to cryptographic hardening
over just supporting cryptographic agility?

-- manu

[1]https://www.schneier.com/blog/archives/2005/02/sha1_broken.html
[2]https://knowledge.digicert.com/alerts/end-of-issuance-for-sha1-code-signing-certificates.html

-- 
Manu Sporny - https://www.linkedin.com/in/manusporny/
Founder/CEO - Digital Bazaar, Inc.
News: Digital Bazaar Announces New Case Studies (2021)
https://www.digitalbazaar.com/
