RE: Defaults: Getting concrete (round 2)

> -----Original Message-----
> From: Ryan Sleevi [mailto:sleevi@google.com]
> Sent: Monday, April 22, 2013 11:09 AM
> To: Richard Barnes
> Cc: Wan-Teh Chang; Web Cryptography Working Group
> Subject: Re: Defaults: Getting concrete (round 2)
> 
> On Sun, Apr 21, 2013 at 6:49 PM, Richard Barnes <rbarnes@bbn.com> wrote:
> >
> > On Apr 19, 2013, at 6:47 PM, Ryan Sleevi <sleevi@google.com> wrote:
> >
> >> On Fri, Apr 19, 2013 at 3:16 PM, Wan-Teh Chang <wtc@google.com>
> wrote:
> >>> On Thu, Apr 18, 2013 at 11:14 AM, Richard Barnes <rbarnes@bbn.com>
> wrote:
> >>>>
> >>>> I agree that there are lots of protocols that have defined ways to shove
> things into the counter and IV fields for CTR and GCM.  They can always
> override the default.
> >>>>
> >>>> I'm more concerned about newer protocols that haven't done
> something similar (and probably don't need to).  Those protocols just need
> something that meets the security requirements, and it's easy enough for
> the UA to provide that.
> >>>>
> >>>> We've also seen that application designers can also get counter/IV
> generation badly wrong, as with the recent nonce reuse issue in JOSE:
> >>>> <http://www.ietf.org/mail-archive/web/jose/current/msg01967.html>
> >>>>
> >>>> So while you're right that there are protocols that will not make use of
> the default, I think that newer things can benefit from having a safe default
> here.
> >>>
> >>> 1. Let's first consider the counter field for the CTR mode.
> >>>
> >>> Unless the UA knows about all the CTR mode encryptions that have
> >>> been done with the key in question, the UA cannot generate a new
> >>> counter value that hasn't been used before.
> >>>
> >>> This requires the UA to be the exclusive user of the key in question.
> >>> But if the API allows the key to be exported, the UA won't be the
> >>> exclusive user of the key.
> >>
> >> Or a new key imported that has been used previously.
> >>
> >> I definitely don't think implementations should be trying to track
> >> what the 'used' counters are - that's certainly the realm and
> >> responsibility of a high-level protocol, and no API does this.
> >
> > Obviously, in the fully general case, there's no way the UA can guarantee
> uniqueness.
> >
> > There is one clear case where the UA knows for sure the entire set of
> counters that has been used, namely for non-exportable keys generated by
> the UA.  Likewise for exportable keys that have not been exported (which
> the UA knows).
> 
> No, this is not at all a realistic statement.
> 
> No crypto library tracks IVs. When you think of the space of possible IVs
> (2^128), it's entirely unreasonable to suggest that UAs should.
> Otherwise, you've implemented a trivial DoS. Heck, you've implemented
> arbitrary storage, simply by allowing a web application to flip bits in IV
> consumption.
> 
> Given how much push back there was simply to track *algorithm* usage, I'm
> truly surprised that a discussion about tracking *IVs* is being entertained.
> 
> And to what end? The security assurances provided by such a solution are
> dwarfed by the complexity of implementation and the security risks therein.
> 
> I again propose that we drop this topic.
+1

IV: No crypto layer tracks IVs, nor should it. In practice, IVs are random numbers.
CTR: Unless there is a monotonic counter, nobody tracks counter values (protocols may do so for session keys, which amounts to keeping a counter). Besides, maintaining monotonic counters across distributed systems that may share the same key is a pain in the backside. (A minimal sketch of app-side IV/counter generation follows below.)
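To make that concrete, here is a minimal sketch of the application (not the UA) drawing fresh randomness per encryption. Caveats: it uses today's promise-based subtle.encrypt surface rather than the CryptoOperation interface in the draft under discussion, and the function names are illustrative only, not part of any spec.

  // Sketch only: application-side generation of IV / counter block.

  // AES-GCM: fresh random 96-bit IV per message, RBG-based as in SP 800-38D 8.2.2.
  async function gcmEncrypt(key, plaintext) {
    const iv = crypto.getRandomValues(new Uint8Array(12));
    const ciphertext = await crypto.subtle.encrypt(
        { name: "AES-GCM", iv: iv }, key, plaintext);
    // The IV need not be secret, so transmit it alongside the ciphertext.
    return { iv: iv, ciphertext: ciphertext };
  }

  // AES-CTR: 128-bit counter block = random 96-bit nonce || 32-bit block counter,
  // so only the low 32 bits increment within this single encryption.
  async function ctrEncrypt(key, plaintext) {
    const counter = new Uint8Array(16);                  // low 4 bytes stay zero
    crypto.getRandomValues(counter.subarray(0, 12));     // random nonce portion
    const ciphertext = await crypto.subtle.encrypt(
        { name: "AES-CTR", counter: counter, length: 32 }, key, plaintext);
    return { counter: counter, ciphertext: ciphertext };
  }

Either way, the uniqueness obligation sits with the application or the protocol above it, which is where it belongs.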

> 
> >
> > As Wan-Teh points out, an RBG-based CTR approach can offer very low
> probability of counter reuse.  For a 32-bit counter / 96-bit nonce, the
> probability of nonce re-use would be 2^-96 as long as no single encryption
> processed more than 16GB.  It wouldn't be strictly FIPS compliant, but it
> would be practically there.  And if apps care, they can generate their own IVs
> to ensure uniqueness.
> 
> We should NOT be baking cryptographic protocols into the low-level API
> - which is exactly what this is.
> 
> >
> > I would posit that these use cases -- UA-generated keys and encryptions
> <16GB -- cover a broad enough swathe of likely usages that they are worth
> addressing.
> 
> Strongly disagree here, for the reasons above.
> 
> >
> >
> >>> 2. As to the IV field for GCM, I think the UA can use the RBG-based
> >>> construction of the IV in Section 8.2.2 of NIST SP 800-38D. I
> >>> believe this is what you proposed.
> >>>
> >>> 3. This makes me wonder if an RBG-based construction of the counter
> >>> field for the CTR mode would also be acceptable if the probability
> >>> of reusing a counter value is low enough.
> >>>
> >>> Wan-Teh
> >>>
> >>
> >> I think the inconsistency argument should be the one we're looking at
> here.
> >>
> >> The argument for having the UA generate the IV is not one being made
> >> on technical requirements, but simply on the basis that "People (may)
> >> use it incorrectly."
> >
> > I guess our disagreement is on the risk = likelihood * impact calculation for
> IV re-use.  I'm claiming that the likelihood of a non-crypto-expert developer
> re-using an IV is non-negligible, and the impact is likely to be severe.  You are
> apparently claiming that either or both of these factors is effectively zero.
> That doesn't really seem plausible to me.
> 
> I suggest your logic is fundamentally flawed by attempting to design for the
> non-crypto-expert here. We've repeatedly had this conversation
> - particularly whenever defaults are brought up. We've repeatedly assessed
> that "no API can serve two masters" - attempting to split the API like that
> only serves to make a worse API (which I think the discussion on this thread
> clearly shows how wildly inconsistent and unpredictable it becomes).
> 
> Again, I'm extremely sympathetic to the non-crypto-expert here - but I don't
> think it's at all reasonable or well-thought-out to try to shoe-horn it in as a
> last minute design consideration. These are, again, the exact same points
> that were raised the last time we discussed defaults - and from a number of
> people, not just me.
> 
> >
> >
> >> The fact that the IV needs to be protected (outside of the AEAD case)
> >> should be a compelling reason enough for us to suggest it's a false
> >> argument being presented.
> >
> > Could you clarify in what sense the IV needs to be protected?  I assume
> you don't mean confidentiality protection [1].  And in any case, I don't really
> see how that bears on how the IV is generated.
> 
> Integrity, not confidentiality. The fundamental issue of the "Cryptographic
> Doom Principle".
> 
> You're arguing that IV generation prevents footguns by some measurable
> sense. I'm repeatedly asserting that this is demonstrably not the case
> - and that whatever incremental value derived from trying to do so is vastly
> eclipsed by both the implementation and cognitive complexity from the
> wildly inconsistent API needed to service this.
> 
> >
> > --Richard
> >
> >
> >
> > [1] From SP 800-38A: "The IV need not be secret, so the IV, or
> > information sufficient to determine the IV, may be transmitted with the
> ciphertext."
> >
> >
> >
> >
> >

Received on Monday, 22 April 2013 20:43:59 UTC