
Re: Minimizing/avoiding User-Agent, was: SPDY Header Frames

From: Mark Nottingham <mnot@mnot.net>
Date: Wed, 18 Jul 2012 10:24:22 +1000
Cc: "Julian Reschke" <julian.reschke@gmx.de>, "HTTP Working Group" <ietf-http-wg@w3.org>
Message-Id: <AA365728-A181-4B8B-9693-E63E1EFA6909@mnot.net>
To: Nicolas Mailhot <nicolas.mailhot@laposte.net>
Just a clarification (without commenting on what roles intermediaries should play) --

The intent of the small-hdrs hint is NOT for clients to omit the UA header entirely:


>    o  Description: When true, this hint indicates that clients can omit
>       the Accept and Accept-Charset request headers when communicating
>       with the resource, and that they can use a shortened version of
>       the User-Agent header.

Browser vendors tend to emit a long (and somewhat unnatural) UA header, since any significant change to it would hurt interop. small-hdrs says "don't worry, I'm not doing weird browser sniffing on historic UA strings, so send me a compact, simple UA" -- not "don't send me a UA".

This may be an opportunity to introduce a profile of the UA header, but I didn't attempt that in the current draft (hoping that browser implementers can figure out a shorter UA header on their own).
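To make the distinction concrete, here is a minimal sketch of how a client might react to the hint. The header values, the dict-based hint representation, and the compact UA string are all illustrative assumptions, not anything defined by the draft; draft-nottingham-http-browser-hints specifies the actual advertisement mechanism.

```python
# Hypothetical sketch of client behaviour under the small-hdrs hint.
# Header values and the hint representation are assumptions for
# illustration; see draft-nottingham-http-browser-hints for the real
# mechanics.

FULL_UA = ("Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.11 "
           "(KHTML, like Gecko) Chrome/20.0.1132.57 Safari/536.11")
COMPACT_UA = "Chrome/20"  # a compact, simple UA, per the hint's intent

def request_headers(server_hints):
    """Build request headers, honouring a small-hdrs hint if present."""
    headers = {
        "User-Agent": FULL_UA,
        "Accept": "text/html,application/xhtml+xml;q=0.9,*/*;q=0.8",
        "Accept-Charset": "utf-8;q=0.7,*;q=0.3",
    }
    if server_hints.get("small-hdrs"):
        # The hint lets the client drop Accept and Accept-Charset
        # entirely, and shorten User-Agent -- but never omit UA.
        del headers["Accept"]
        del headers["Accept-Charset"]
        headers["User-Agent"] = COMPACT_UA
    return headers
```

Note that User-Agent is present on both code paths; only its length changes, which is the point of the clarification above.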


On 18/07/2012, at 5:23 AM, Nicolas Mailhot wrote:

> Le Mar 17 juillet 2012 21:03, Julian Reschke a écrit :
>> On 2012-07-17 20:50, Nicolas Mailhot wrote:
>>> Julian Reschke <julian.reschke@...> writes:
>>>> On 2012-07-17 15:38, Poul-Henning Kamp wrote:
>>>>> There must be a smarter way than "User-Agent:"...
>>>> Actually one nice potential optimization is if the server can declare
>>>> that it's not interested in the User-Agent at all; see
>>>> <http://tools.ietf.org/html/draft-nottingham-http-browser-hints-03#section-5.7>
>>> The server may not be interested, but intermediaries may still be
>>> (ugly as it is, user-agent special-casing is quite useful for proxy
>>> operators that have to contend with web clients that were never
>>> really tested with proxies and misbehave in a big way)
>> Could you elaborate? What kind of misbehavior are you referring to?
> Typical misbehaviour is the inability to cope with intermediary-inserted
> redirects and error codes, and retrying in a loop (because the developer
> could not be bothered to write error-handling code and assumes that all
> errors on the Internet are transient, so with enough retrying the client
> will get whatever it expected to get at first).
> Then you get the web clients that try to interpret error pages as if they
> were whatever they expected to get at first, with sometimes weird results.
> Then you get the 'pump as much as you can' web clients that starve
> everything else.
> It's usually less invasive to blacklist or limit just the specific
> troublesome client than to target the URLs it tries to access or the
> system it runs on (because the buggy part is the web client, the user who
> installed it may be doing legitimate work with other web clients at the
> same time, the broken web client will misbehave the same way tomorrow
> with other web sites, and this way the protection is in place before
> other systems are infected with brokenware).
> I really hope HTTP/2 makes intermediaries first-class citizens and
> clarifies intermediary/web-client signalling so such things happen less
> often in the http/2 future (though bugs will always exist, so a client
> signature to home in on is nice).
> -- 
> Nicolas Mailhot

Mark Nottingham   http://www.mnot.net/
Received on Wednesday, 18 July 2012 00:24:51 UTC
