[whatwg] Masking and threat models (Re: PeerConnection: encryption feedback)

On Thu, Mar 24, 2011 at 11:23 AM, Glenn Maynard <glenn at zewt.org> wrote:

> It's expensive resilience: 16 bytes of added overhead for every datagram.
>  That's overhead added to every PeerConnection datagram protocol, in order
> to help hide problems in something catastrophically broken and inherently
> insecure.


I've been trying to come up with a way to reduce this overhead while still
achieving the guaranteed masking that Adam wants.

16 bytes of random data isn't actually required.  That's what you need with
CBC, where the whole IV has to be random so it won't repeat or be predicted.
With CTR we can do better: combine a 6-byte monotonically increasing
sequence number with 6 bytes of random data.  (The sequence number is
needed anyway, so it's not added overhead here.)  The sequence number
guarantees a counter value is never reused; the random data keeps the
counter unpredictable, which is what the masking needs.
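To make that concrete, here's a minimal sender-side sketch, assuming AES as
the block cipher and Python's `cryptography` package for CTR; the key
handling and output framing are placeholders of mine, not anything
specified for PeerConnection:

import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def mask_datagram(key: bytes, seq: int, payload: bytes) -> bytes:
    # 6-byte monotonically increasing sequence number + 6 random bytes.
    seq_bytes = seq.to_bytes(6, "big")
    rand_bytes = os.urandom(6)
    # Zero-pad to a full 16-byte block so CTR has room to increment.
    counter_block = seq_bytes + rand_bytes + b"\x00" * 4
    encryptor = Cipher(algorithms.AES(key), modes.CTR(counter_block)).encryptor()
    ciphertext = encryptor.update(payload) + encryptor.finalize()
    # Only the 12 explicit bytes travel on the wire; the padding is implicit.
    return seq_bytes + rand_bytes + ciphertext

The four zero bytes give 2^32 blocks of counter room per datagram, far more
than any datagram can use, so incrementing never carries into the random
bytes.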

For example, SSSSSSRRRRRR, where each S is a sequence-number byte and each
R is a random byte.  The whole thing is the counter input to CTR (with some
zero-padding so CTR has room to increment).  Only SSSSSS is input to the
replay-prevention algorithm.
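On the receiving side, a correspondingly rough sketch of a replay check
that looks only at the 6-byte sequence number; the sliding-window approach
and the window size are illustrative choices of mine (in the spirit of the
DTLS/IPsec anti-replay window), not a proposal:

WINDOW = 64  # arbitrary window size for the sketch

class ReplayFilter:
    """Accepts each 6-byte sequence number at most once, within a window."""

    def __init__(self):
        self.highest = -1
        self.seen = set()

    def accept(self, seq_bytes: bytes) -> bool:
        seq = int.from_bytes(seq_bytes, "big")
        if seq <= self.highest - WINDOW or seq in self.seen:
            return False  # too old to track, or a replay: drop it
        self.seen.add(seq)
        self.highest = max(self.highest, seq)
        # Forget sequence numbers that have fallen out of the window.
        self.seen = {s for s in self.seen if s > self.highest - WINDOW}
        return True

The random half of the header never feeds into this check, which is the
point of the last sentence above.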

It still feels wasteful, but it's easier to swallow six bytes of overhead
than sixteen.

-- 
Glenn Maynard

Received on Thursday, 24 March 2011 09:04:37 UTC