- From: David Morris <dwm@xpasc.com>
- Date: Thu, 30 Jul 2009 18:38:52 -0700 (PDT)
- cc: "'HTTP Working Group'" <ietf-http-wg@w3.org>
Oops ... this is where I blush and apologize for not correctly remembering the sequence of issues I encountered. Going back to capture examples, I found that the content now ungzips correctly. Digging further, it turns out that I had hit this problem before I discovered that the chunked transfer encoding was being incorrectly removed. Since I had already added built-in gunzip to the extract tool by then, I had never gone back to retry it. Sorry about crying wolf without double-checking my results.

Dave Morris

On Thu, 30 Jul 2009, Henrik Nordstrom wrote:

> On Wed, 2009-07-29 at 22:17 -0700, David Morris wrote:
>> While doing some recent packet capture / object extraction work as part
>> of an effort to study the current degree of content-encoding by live web
>> sites, I needed to remove gzip encoding ... naively thought I could just
>> save the data as a *.gz file (tried .zip also) and post-process it. I
>> found about 800 gzip-encoded responses out of about 2400 and decoded them
>> all assuming no gzip or zlib wrapper. I'm not 100% sure about my
>> conclusions (I could probably provide a couple of example payloads still
>> gzipped in file form if someone with more knowledge would care to look),
>> but it appears to me that 'some incorrect' is likely 'most implementations
>> are incorrect'.
>
> Interesting. Are you saying that you saw this breakage for content
> announced with a gzip content encoding as well, or did you mean to say
> deflate encoding above?
>
> Regards
> Henrik
>
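[Editor's note: the confusion in this thread is between the three framings a compressed HTTP body can arrive in: a gzip wrapper (Content-Encoding: gzip), a zlib wrapper (what RFC 2616 means by Content-Encoding: deflate), and raw deflate with no wrapper at all (what many broken "deflate" senders emit). The following is a minimal sketch, not the tool described in the mail, showing how one might probe an extracted payload for each framing using Python's zlib module.]

    import zlib

    def decompress_http_body(data: bytes) -> bytes:
        """Try the three framings seen in the wild for compressed HTTP bodies.

        wbits = 16 + MAX_WBITS -> gzip wrapper  (Content-Encoding: gzip)
        wbits = MAX_WBITS      -> zlib wrapper  (Content-Encoding: deflate per spec)
        wbits = -MAX_WBITS     -> raw deflate   (common non-conforming 'deflate')
        """
        for wbits in (16 + zlib.MAX_WBITS, zlib.MAX_WBITS, -zlib.MAX_WBITS):
            try:
                return zlib.decompress(data, wbits)
            except zlib.error:
                continue
        raise ValueError("payload is not gzip, zlib, or raw deflate")

[A payload that only decodes with the third setting is the "no gzip or zlib wrapper" case David describes; one that decodes with the first is a correctly gzip-encoded response, which would only fail to extract if the chunked transfer encoding had not been stripped first.]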
Received on Friday, 31 July 2009 01:39:35 UTC