Re: HTTP header compression benchmark

Sure.  However, 15 minutes is probably a bit much; I'll shoot for less
(but I don't know how Q&A-hungry the group will be ;-)).

Happy hacking!

Christian

On 07/28/13 08:56, Mark Nottingham wrote:
> Thanks, Christian.
>
> Could you give a short (~15 minute) summary at our Wednesday meeting?
>
> Regards,
>
>
> On Jul 20, 2013, at 7:54 PM, Christian Grothoff <christian@grothoff.org> wrote:
>
>> Dear all,
>>
>> I've recently published a benchmark for HTTP header compression based on
>> captured real-world HTTP traffic data.  The benchmark contains five sets of
>> approximately one million HTTP request and response headers.  Details on how
>> the data was collected and preprocessed, the C code (for generation,
>> preprocessing, and sample compressions using gzip and bzip2), and some
>> statistics on the benchmark data are all available at
>> https://gnunet.org/httpbenchmark/.
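>>
>> For reference, a one-shot zlib compression of a single header might look
>> roughly like the following (just an illustrative sketch, not the published
>> benchmark code; the header string is made up; link with -lz):
>>
>> #include <stdio.h>
>> #include <string.h>
>> #include <zlib.h>
>>
>> int main(void)
>> {
>>   /* made-up sample header; the benchmark uses captured real traffic */
>>   const char *hdr =
>>     "GET / HTTP/1.1\r\nHost: example.com\r\n"
>>     "Accept: text/html\r\nUser-Agent: demo\r\n\r\n";
>>   unsigned char out[1024];
>>   uLongf out_len = sizeof(out);
>>
>>   /* compress2 is zlib's one-shot wrapper around deflate */
>>   if (Z_OK != compress2(out, &out_len,
>>                         (const Bytef *) hdr, strlen(hdr),
>>                         Z_BEST_COMPRESSION))
>>     return 1;
>>   printf("%u -> %lu bytes\n",
>>          (unsigned) strlen(hdr), (unsigned long) out_len);
>>   return 0;
>> }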
>>
>> Note that these are not just raw, isolated HTTP headers, but complete
>> request sequences.  The benchmark thus allows you to assess the performance
>> of compression algorithms that compress the traffic of an entire SPDY/HTTP
>> 2.0 session (exploiting differences between successive headers, etc.), not
>> just algorithms that process each header in isolation.
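>>
>> To make the session aspect concrete, here is a made-up sketch (again not
>> code from the benchmark) that feeds a sequence of headers through one
>> shared zlib deflate stream, so each header is coded against the previous
>> ones (link with -lz):
>>
>> #include <stdio.h>
>> #include <string.h>
>> #include <zlib.h>
>>
>> int main(void)
>> {
>>   /* made-up headers; note how much they share with each other */
>>   const char *headers[] = {
>>     "GET /a HTTP/1.1\r\nHost: example.com\r\nAccept: text/html\r\n\r\n",
>>     "GET /b HTTP/1.1\r\nHost: example.com\r\nAccept: text/html\r\n\r\n",
>>     "GET /c HTTP/1.1\r\nHost: example.com\r\nAccept: text/html\r\n\r\n"
>>   };
>>   unsigned char out[1024];
>>   z_stream zs;
>>   unsigned i;
>>
>>   memset(&zs, 0, sizeof(zs));
>>   if (Z_OK != deflateInit(&zs, Z_BEST_COMPRESSION))
>>     return 1;
>>   for (i = 0; i < 3; i++)
>>   {
>>     zs.next_in = (Bytef *) headers[i];
>>     zs.avail_in = strlen(headers[i]);
>>     zs.next_out = out;
>>     zs.avail_out = sizeof(out);
>>     /* Z_SYNC_FLUSH emits all output for this header while keeping
>>        the compression context (dictionary) for the next one */
>>     if (Z_OK != deflate(&zs, Z_SYNC_FLUSH))
>>       break;
>>     printf("header %u: %u -> %u bytes\n", i,
>>            (unsigned) strlen(headers[i]),
>>            (unsigned) (sizeof(out) - zs.avail_out));
>>   }
>>   deflateEnd(&zs);
>>   return 0;
>> }
>>
>> The later headers should compress to far fewer bytes than the first,
>> which is exactly the redundancy a per-header compressor cannot exploit.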
>>
>> I hope this will be useful for informed discussions on HTTP 2.0 header compression;
>> I'll be available for discussions on the benchmark at IETF 87.
>>
>> Happy hacking!
>>
>> Christian
>>
>
> --
> Mark Nottingham   http://www.mnot.net/

Received on Sunday, 28 July 2013 08:21:48 UTC