
RE: Re[2]: Straw-man for our next charter

From: Larry Masinter <masinter@adobe.com>
Date: Sun, 29 Jul 2012 20:24:38 -0700
To: Adam Barth <w3c@adambarth.com>, "Adrien W. de Croy" <adrien@qbik.com>
CC: "ietf-http-wg@w3.org" <ietf-http-wg@w3.org>
Message-ID: <C68CB012D9182D408CED7B884F441D4D1E2D86A111@nambxv01a.corp.adobe.com>
HTTP 2.0 can tighten requirements where loose interpretation in HTTP 1.x leads to performance, reliability, security problems.

For some areas (like pipelining) it would require browsers and other HTTP agents to implement MORE, and perhaps that isn't realistic.
But for sniffing, we would not be asking browsers to implement more, but rather to turn off heuristic interpretation.
So I think it's feasible and quite consistent with the goals and can meet the gateway requirements as well.

There's no need to sign content. There's just a need for tooling to help site owners make their web sites "HTTP 2.0 ready".
Getting ready seems to require a variety of optimizations anyway, and cleaning up content-type labels would be one of them.
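A minimal sketch of what such readiness tooling might check: compare the declared Content-Type against a guess from leading magic bytes and flag disagreements. The signature table and function names below are illustrative assumptions, not taken from any draft or spec:

```python
# Illustrative "HTTP 2.0 readiness" check: flag responses whose declared
# Content-Type contradicts a magic-byte guess. The SIGNATURES table is a
# tiny hypothetical sample, not the full sniffing table from any draft.

SIGNATURES = {
    b"\x89PNG\r\n\x1a\n": "image/png",
    b"GIF87a": "image/gif",
    b"GIF89a": "image/gif",
    b"%PDF-": "application/pdf",
}

def sniffed_type(body: bytes):
    """Return a media type guessed from leading magic bytes, or None."""
    for magic, mtype in SIGNATURES.items():
        if body.startswith(magic):
            return mtype
    return None

def label_mismatch(declared: str, body: bytes) -> bool:
    """True when the declared Content-Type contradicts the magic-byte guess."""
    guess = sniffed_type(body)
    return guess is not None and guess != declared.split(";")[0].strip()
```

For example, a PNG body served as text/html would be flagged, while a body with no recognized signature is left alone (tooling can only act on what it can recognize).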

Perhaps this is just a matter of reversing the sense of the "nosniff" header: 2.0 defaults nosniff to true, while 1.1 defaults it to false. Gateways can just make it explicit.
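As a sketch of the gateway side, a 1.1-to-2.0 gateway could make the default explicit by appending "X-Content-Type-Options: nosniff" whenever the origin omits it. WSGI and the middleware name here are assumptions chosen purely for illustration:

```python
# Hypothetical gateway middleware (WSGI, for illustration only): make the
# nosniff default explicit by adding the header to any downstream response
# that doesn't already carry it.

def nosniff_middleware(app):
    def wrapped(environ, start_response):
        def patched_start(status, headers, exc_info=None):
            if not any(name.lower() == "x-content-type-options"
                       for name, _ in headers):
                headers = headers + [("X-Content-Type-Options", "nosniff")]
            return start_response(status, headers, exc_info)
        return app(environ, patched_start)
    return wrapped
```

A gateway written this way leaves responses that already declare a sniffing policy untouched, which matches the idea that it is only making an implicit default explicit.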



-----Original Message-----
From: Adam Barth [mailto:w3c@adambarth.com] 
Sent: Sunday, July 29, 2012 5:11 PM
To: Adrien W. de Croy
Cc: Larry Masinter; ietf-http-wg@w3.org
Subject: Re: Re[2]: Straw-man for our next charter

On Sun, Jul 29, 2012 at 3:59 PM, Adrien W. de Croy <adrien@qbik.com> wrote:
>
> We see this problem a lot at the gateway.  We have processing agents that
> only want to process say text/html, and really don't like getting streamed
> MP4s labelled as text/html by some brain-dead server
>
> But in the end, where does the server get the C-T from?  Most just do a map
> lookup on file extension.
>
> Even if we tried to push the meta-data into the resource itself, so it could
> be specified by the actual author (think about the hosted site, where the
> site maintainer has no control over content types the server will send, or
> not easily), then how do we trust that information?  Some attacker can label
> whatever content as whatever type if they can find some purpose to do so.
>
> In the end, I think it basically makes Content-Type largely unreliable.  I
> don't see this changing with 2.0 (at least not properly), unless we
> introduce the concept of trust - either sign content by someone vouching for
> its type, or run RBLs of known bad servers.
>
> Do we even need C-T if clients are sniffing anyway?

It's certainly used in an essential way in the web browser security
model.  In any case, I'm pretty sure this discussion is getting
off-topic.

Adam


> ------ Original Message ------
> From: "Larry Masinter" <masinter@adobe.com>
> To: "ietf-http-wg@w3.org" <ietf-http-wg@w3.org>
> Sent: 29/07/2012 3:01:08 a.m.
> Subject: RE: Straw-man for our next charter
>>
>> The sniffing I was in particular hoping to stop is content-type sniffing.
>> http://tools.ietf.org/html/draft-ietf-websec-mime-sniff-03
>>
>> " Many web servers supply incorrect Content-Type header fields with
>>  their HTTP responses.  In order to be compatible with these servers,
>>  user agents consider the content of HTTP responses as well as the
>>  Content-Type header fields when determining the effective media type
>>  of the response."
>>
>> If browsers suddenly stopped sniffing HTTP/1.1 content, it would break
>> existing web sites, so of course the browser makers are reluctant to do
>> that.
>>
>> However, if it was a requirement to supply a _correct_ content-type header
>> for HTTP/2.0, and no HTTP/2.0 client sniffed, then sites upgrading to
>> HTTP/2.0 would fix their content-type sending (because when they were
>> deploying HTTP/2.0 they would have to in order to get any browser to work
>> with them.)
>>
>> Basically, sniffing is a wart which backward compatibility keeps in place.
>> Introducing a new version is a unique opportunity to remove it.
>>
>> The improved performance would come from not having to look at the content
>> to determine its type before routing it to the appropriate processor.
>>
>> Larry
>>
>>
>> -----Original Message-----
>> From: Amos Jeffries [mailto:squid3@treenet.co.nz]
>> Sent: Friday, July 27, 2012 11:53 PM
>> To: ietf-http-wg@w3.org
>> Subject: Re: Straw-man for our next charter
>>
>> On 28/07/2012 6:39 p.m., Larry Masinter wrote:
>>
>>>
>>> re changes to semantics: consider the possibility of eliminating
>>> "sniffing" in HTTP/2.0. If sniffing is justified for compatibility
>>> with deployed servers, could we eliminate sniffing for 2.0 sites?
>>>
>>> It would improve reliability, security, and even performance. Yes,
>>> popular browsers would have to agree not to sniff sites running 2.0,
>>> so that sites wanting 2.0 benefits will fix their configuration.
>>>
>>> Likely there are many other warts that can be removed if there is a
>>> version upgrade.
>>>
>>
>>
>> Which of the several meanings of "sniffing" are you talking about exactly?
>>
>> AYJ
>>
>>
>>
>>
>
>
Received on Monday, 30 July 2012 03:25:14 GMT
