- From: Larry Masinter <masinter@adobe.com>
- Date: Sun, 29 Jul 2012 10:57:25 -0700
- To: Adam Barth <w3c@adambarth.com>
- CC: "ietf-http-wg@w3.org" <ietf-http-wg@w3.org>
- Message-ID: <c3ae1b1c-912e-4191-84ce-d50caef9a0e3@blur>
What are the use cases and requirements for such gateways? A general-purpose omni-site 2.0->1.1 gateway might need to sniff sites not known to have correct content-type headers, but 1.1->2.0 gateways shouldn't need to change content-type .... sniffing leaves correctly labeled content intact, of course. :)

Connected by DROID on Verizon Wireless

-----Original message-----
From: Adam Barth <w3c@adambarth.com>
To: Larry Masinter <masinter@adobe.com>
Cc: "ietf-http-wg@w3.org" <ietf-http-wg@w3.org>
Sent: Sun, Jul 29, 2012 00:53:44 GMT+00:00
Subject: Re: Straw-man for our next charter

From the charter:

---8<---
Changes to the existing semantics of HTTP are out of scope in order to preserve the meaning of messages that might cross a 1.1 --> 2.0 --> 1.1 request chain.
--->8---

http://datatracker.ietf.org/wg/httpbis/charter/

Changing how user agents interpret the Content-Type header would change the semantics of HTTP and is therefore out of scope for HTTP/2.0 according to our current charter.

Adam

On Sat, Jul 28, 2012 at 8:01 AM, Larry Masinter <masinter@adobe.com> wrote:

The sniffing I was in particular hoping to stop is content-type sniffing.
http://tools.ietf.org/html/draft-ietf-websec-mime-sniff-03

"Many web servers supply incorrect Content-Type header fields with their HTTP responses. In order to be compatible with these servers, user agents consider the content of HTTP responses as well as the Content-Type header fields when determining the effective media type of the response."

If browsers suddenly stopped sniffing HTTP/1.1 content, it would break existing web sites, so of course the browser makers are reluctant to do that.

However, if it were a requirement to supply a _correct_ content-type header for HTTP/2.0, and no HTTP/2.0 client sniffed, then sites upgrading to HTTP/2.0 would fix their content-type sending (because when deploying HTTP/2.0 they would have to in order to get any browser to work with them).

Basically, sniffing is a wart which backward compatibility keeps in place. Introducing a new version is a unique opportunity to remove it.

The improved performance would come from not having to look at the content to determine its type before routing to the appropriate processor.

Larry

-----Original Message-----
From: Amos Jeffries [mailto:squid3@treenet.co.nz]
Sent: Friday, July 27, 2012 11:53 PM
To: ietf-http-wg@w3.org
Subject: Re: Straw-man for our next charter

On 28/07/2012 6:39 p.m., Larry Masinter wrote:
> re changes to semantics: consider the possibility of eliminating
> "sniffing" in HTTP/2.0. If sniffing is justified for compatibility
> with deployed servers, could we eliminate sniffing for 2.0 sites?
>
> It would improve reliability, security, and even performance. Yes,
> popular browsers would have to agree not to sniff sites running 2.0,
> so that sites wanting 2.0 benefits will fix their configuration.
>
> Likely there are many other warts that can be removed if there is a
> version upgrade.

Which of the several meanings of "sniffing" are you talking about exactly?

AYJ
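For readers unfamiliar with what content-type sniffing looks like in practice, below is a minimal sketch of the idea being debated: a sniffing client inspects the first bytes of the response body and may override the declared Content-Type. This is not the algorithm from draft-ietf-websec-mime-sniff-03 (which is far more detailed); the function name, byte patterns, and thresholds are illustrative assumptions only.

```python
# Minimal illustration of content-type sniffing. NOT the algorithm from
# draft-ietf-websec-mime-sniff-03; names and patterns are invented for
# illustration.

HTML_MARKERS = (b"<!DOCTYPE HTML", b"<HTML", b"<HEAD", b"<BODY", b"<SCRIPT")

def sniff_media_type(declared_type: str, body_prefix: bytes) -> str:
    """Return the 'effective' media type a sniffing client might use.

    declared_type -- value of the Content-Type header, e.g. "text/plain"
    body_prefix   -- the first few hundred bytes of the response body
    """
    head = body_prefix.lstrip()[:512].upper()

    # A PNG magic number wins over the header entirely.
    if body_prefix.startswith(b"\x89PNG\r\n\x1a\n"):
        return "image/png"

    # A text/plain response that clearly begins with markup gets
    # reinterpreted as HTML -- the behavior the thread proposes dropping
    # for HTTP/2.0 so that intermediaries can trust the header alone.
    if declared_type == "text/plain" and any(head.startswith(m) for m in HTML_MARKERS):
        return "text/html"

    return declared_type
```

A non-sniffing HTTP/2.0 client or gateway would simply return `declared_type` unconditionally, which is where the routing and performance benefit mentioned above would come from: the message can be dispatched without buffering or examining the body.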
Received on Sunday, 29 July 2012 17:57:37 UTC