RE: Minimizing/avoiding User-Agent, was: SPDY Header Frames

I was trying to understand the impact of HTTP/2.0 or SPDY on intermediaries.

I think metadata will always be in cleartext, so redirects and proxying will still be low-cost operations.

Even if major traffic sources such as Google, Facebook, and Twitter go the TLS route, I think legitimate intermediaries (service providers doing traffic classification and traffic policy control, CDNs doing embedded video optimization) can always use TLS proxy functionality to achieve their goals.
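
Just to make the idea concrete (this is only a sketch of my own, not anything from the drafts): a TLS-terminating intermediary in Go might look roughly like the following, where the origin URL and certificate paths are placeholders.

// Minimal sketch of a TLS-terminating intermediary: it terminates TLS
// from the client, sees request metadata in cleartext, and forwards the
// request to the origin. The origin URL and cert/key paths are assumptions.
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
)

func main() {
	origin, err := url.Parse("https://origin.example.com") // hypothetical origin
	if err != nil {
		log.Fatal(err)
	}
	proxy := httputil.NewSingleHostReverseProxy(origin)

	handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		// After TLS termination the intermediary can classify traffic,
		// apply policy, or optimize responses before forwarding.
		log.Printf("client=%s method=%s path=%s ua=%q",
			r.RemoteAddr, r.Method, r.URL.Path, r.UserAgent())
		proxy.ServeHTTP(w, r)
	})

	// cert.pem/key.pem stand in for whatever certificate the intermediary
	// presents to clients (a CDN's own certificate for the site, or a
	// locally trusted CA's certificate in an SP/enterprise deployment).
	log.Fatal(http.ListenAndServeTLS(":443", "cert.pem", "key.pem", handler))
}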

I'm not sure I understand the impact completely yet, but I think intermediaries won't be severely impacted even if most of the major players choose to use TLS.

Anil 

-----Original Message-----
From: Nicolas Mailhot [mailto:nicolas.mailhot@laposte.net] 
Sent: Wednesday, July 18, 2012 12:54 AM
To: Julian Reschke
Cc: Nicolas Mailhot; HTTP Working Group
Subject: Re: Minimizing/avoiding User-Agent, was: SPDY Header Frames


On Tue, 17 July 2012 at 21:03, Julian Reschke wrote:
> On 2012-07-17 20:50, Nicolas Mailhot wrote:
>> Julian Reschke <julian.reschke@...> writes:
>>
>>>
>>> On 2012-07-17 15:38, Poul-Henning Kamp wrote:
>>
>>>> There must be a smarter way than "User-Agent:"...
>>>
>>> Actually one nice potential optimization is if the server can declare
>>> that it's not interested in the User-Agent at all; see
>>> <http://tools.ietf.org/html/draft-nottingham-http-browser-hints-03#section-5.7>
>>
>> The server may not be interested, but intermediaries may still be
>>
>> (while ugly User-Agent special-casing is quite useful for proxy
>> operators who have to contend with web clients that were never really
>> tested with proxies and misbehave in a big way)
>
> Could you elaborate? What kind of misbehavior are you referring to?

Typical misbehaviour is an inability to cope with intermediary-inserted
redirects and error codes, combined with retrying in a loop (because the
developer could not be bothered to write error-handling code and considers
every error on the Internet transient, so with enough retrying the client
will eventually get whatever it expected to get at first).
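
(For contrast, here is a rough Go sketch of the bounded error handling such clients are missing; the URL, the status codes chosen for retry, and the backoff are illustrative assumptions, not anything defined in this thread.)

// Sketch of the error handling the broken clients are missing: bounded
// retries with backoff, retrying only on errors that are plausibly
// transient. The URL is a placeholder.
package main

import (
	"fmt"
	"net/http"
	"time"
)

func fetchWithRetry(url string, maxAttempts int) (*http.Response, error) {
	var lastErr error
	for attempt := 1; attempt <= maxAttempts; attempt++ {
		resp, err := http.Get(url)
		if err != nil {
			// Network-level failure: plausibly transient, retry with backoff.
			lastErr = err
		} else if resp.StatusCode >= 200 && resp.StatusCode < 300 {
			return resp, nil
		} else if resp.StatusCode == 502 || resp.StatusCode == 503 || resp.StatusCode == 504 {
			// Upstream/gateway trouble: worth a bounded number of retries.
			resp.Body.Close()
			lastErr = fmt.Errorf("transient upstream error: %s", resp.Status)
		} else {
			// Anything else (including intermediary-inserted error pages)
			// is not going to fix itself: give up instead of looping.
			resp.Body.Close()
			return nil, fmt.Errorf("non-retryable response: %s", resp.Status)
		}
		time.Sleep(time.Duration(attempt) * time.Second) // crude linear backoff
	}
	return nil, lastErr
}

func main() {
	// example.invalid is a placeholder; the point is the bounded retry loop.
	if _, err := fetchWithRetry("http://example.invalid/resource", 3); err != nil {
		fmt.Println("giving up:", err)
	}
}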

Then you get the web clients that try to interpret error pages as if they
were the content they originally expected, sometimes with weird results.

Then you get the 'pump as much as you can' web client that will starve
everything else.
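
(Purely as an illustration of how an intermediary might keep such a client in check, here is a crude per-transfer throughput cap in Go; the rate and the demo values are arbitrary, and a real proxy would use a shared per-client budget instead.)

// throttledReader caps the average rate at which data can be read
// through it, so one greedy transfer cannot monopolise a link.
package main

import (
	"io"
	"log"
	"strings"
	"time"
)

type throttledReader struct {
	r           io.Reader
	bytesPerSec int
}

func (t *throttledReader) Read(p []byte) (int, error) {
	// Hand out at most one second's budget per call, then make sure the
	// call takes about a second, so the average rate stays capped.
	if len(p) > t.bytesPerSec {
		p = p[:t.bytesPerSec]
	}
	start := time.Now()
	n, err := t.r.Read(p)
	if n > 0 {
		if elapsed := time.Since(start); elapsed < time.Second {
			time.Sleep(time.Second - elapsed)
		}
	}
	return n, err
}

func main() {
	// Toy demo: a 64-byte body copied at 16 bytes/second takes ~4 seconds.
	body := strings.NewReader(strings.Repeat("x", 64))
	start := time.Now()
	n, _ := io.Copy(io.Discard, &throttledReader{r: body, bytesPerSec: 16})
	log.Printf("copied %d bytes in %s", n, time.Since(start).Round(time.Second))
}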

It's usually less invasive to blacklist or limit just the specific
troublesome client rather than targeting the URLs it tries to access or
the system it runs on: the buggy part is the web client, the user who
installed it may be doing legitimate work with other web clients at the
same time, the broken web client will misbehave the same way tomorrow with
other web sites, and this way the protection is in place before other
systems are infected with the brokenware.
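
(A minimal Go sketch of that kind of client-specific special-casing; the User-Agent substrings and the handler are made up for illustration.)

// Sketch of User-Agent special-casing at an intermediary: known-broken
// clients are rejected (or could be rate-limited) while everything else
// passes through untouched.
package main

import (
	"log"
	"net/http"
	"strings"
)

// User-Agent substrings of clients known to misbehave through this proxy.
// These names are made up for illustration.
var brokenUserAgents = []string{
	"BrokenDownloader/1.0", // retries intermediary errors in a tight loop
	"GreedyFetcher",        // saturates the uplink
}

// uaFilter blocks only the specific troublesome clients; other clients on
// the same machine, and other URLs, are left alone.
func uaFilter(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		ua := r.UserAgent()
		for _, bad := range brokenUserAgents {
			if strings.Contains(ua, bad) {
				http.Error(w, "client blocked by proxy policy", http.StatusForbidden)
				return
			}
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	// In a real deployment "next" would be the proxy's forwarding handler;
	// here it is a stand-in that just answers 200.
	next := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
	})
	log.Fatal(http.ListenAndServe(":8080", uaFilter(next)))
}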

I really hope HTTP/2 makes intermediaries first-class citizens and
clarifies intermediary/web-client signalling so such things happen less
often in the HTTP/2 future (though bugs will always exist, so a client
signature to home in on is nice).

-- 
Nicolas Mailhot

Received on Wednesday, 18 July 2012 11:34:59 UTC