
Cost analysis: (was: Getting to Consensus: CONTINUATION-related issues)

From: Martin Thomson <martin.thomson@gmail.com>
Date: Fri, 18 Jul 2014 11:24:22 -0700
Message-ID: <CABkgnnWmBUNKFDH8JKz8GKRgZDaS=1f6yQ0C6CdF_zv=QnPR8A@mail.gmail.com>
To: Jason Greene <jason.greene@redhat.com>
Cc: Michael Sweet <msweet@apple.com>, Nicholas Hurley <hurley@todesschaf.org>, HTTP Working Group <ietf-http-wg@w3.org>
On 18 July 2014 10:57, Jason Greene <jason.greene@redhat.com> wrote:
> It’s extra complexity, but the implementation isn’t difficult (a cake walk compared to other aspects of the spec). I can certainly appreciate the perspective from implementations that don’t want to touch their code though.

I realize that this is a standard sophist technique in this forum, but
I find that selectively trivializing various aspects of the space
isn't particularly constructive.  Let's try to be even-handed in our
analysis.

On the one side:

CONTINUATION has a cost in code complexity.  It defers the discovery
of what might be a surprisingly large amount of state.
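(As an illustrative sketch only, not anything from the thread: a receiver that honors CONTINUATION must buffer header-block fragments of unknown total size until END_HEADERS arrives, so the full state commitment is only discovered at the end.  The class and names below are hypothetical.)

```python
class HeaderAssembler:
    """Accumulates header-block fragments across HEADERS/CONTINUATION frames."""

    def __init__(self):
        self.fragments = []
        self.buffered = 0  # state the receiver only discovers incrementally

    def on_fragment(self, payload: bytes, end_headers: bool):
        self.fragments.append(payload)
        self.buffered += len(payload)
        if end_headers:
            # Whole block received: hand it off for HPACK decoding.
            block = b"".join(self.fragments)
            self.fragments.clear()
            self.buffered = 0
            return block
        # More CONTINUATION frames expected; buffered state keeps growing.
        return None
```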

On the other:

A hard cap on size (i.e., option A) has a cost in code complexity.  It
requires that encoders be prepared to double their state commitment so
that they can roll back their header table when the cap is hit.  And
once you take compression into account, it still does not prevent a
surprisingly large quantity of header information.
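(Again purely as a hedged sketch of the rollback cost, not an implementation from the thread: an encoder under a hard cap has to snapshot its dynamic table before encoding, so that speculative insertions can be undone when the cap is hit, which is roughly the doubled state commitment described above.  The class, cap semantics, and wire format here are simplified assumptions.)

```python
class CappedEncoder:
    def __init__(self, cap: int):
        self.cap = cap
        self.table = []  # simplified stand-in for the HPACK dynamic table

    def encode(self, headers):
        snapshot = list(self.table)  # second copy of the table state
        out = bytearray()
        for name, value in headers:
            self.table.append((name, value))  # speculative insertion
            out += name.encode() + b": " + value.encode() + b"\r\n"
            if len(out) > self.cap:
                self.table = snapshot  # roll back when the cap is hit
                return None
        return bytes(out)
```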

Option B seems almost orthogonal to these points, providing a 431
(Request Header Fields Too Large) warning.  I'm not really certain how
it would be used.

Did I miss anything else pertinent to the discussion?
Received on Friday, 18 July 2014 18:24:50 UTC
