Re: Minimizing/avoiding User-Agent, was: SPDY Header Frames

From: Nicolas Mailhot <nicolas.mailhot@laposte.net>
Date: Tue, 17 Jul 2012 22:34:07 +0200
Message-ID: <23ec05d25c37cb07b6159ee725ee29ca.squirrel@arekh.dyndns.org>
To: "Karl Dubost" <karld@opera.com>
Cc: "Nicolas Mailhot" <nicolas.mailhot@laposte.net>, "HTTP Working Group" <ietf-http-wg@w3.org>

On Tue, 17 July 2012 22:06, Karl Dubost wrote:
> Nicolas,
>
> interesting but I would like to understand the issues you are having.
>
>
> On 17 July 2012, at 15:23, Nicolas Mailhot wrote:
>> On Tue, 17 July 2012 21:03, Julian Reschke wrote:
>>> Could you elaborate? What kind of misbehavior are you referring to?
>>
>> Typical misbehaviour is inability to cope with intermediary-inserted
>> redirect and error codes, and retrying in a loop
>
> Do you have a precise example of such an issue?

We have seen:
- broken web clients trying to get the proxy to proxify its own auth portal
  (a messy loop, especially when the load balancer moved the client to
  another proxy, so the proxies started looping over one another; the fat
  web client was written in Delphi or another obsolete Windows tech; we
  protect against this pattern now);
- clients that retried the same access several thousand times a second, all
  day round (stuffed our logs and made resolving an actual incident that
  happened concurrently hard);
- infrastructure load-testing when Akamai launched its HD service and users
  pushed us to the limit (still investigating how best to block the Akamai
  P2P video downloader);
- some Google Chrome service that really could not accept the proxy refusal
  (not Chrome itself);
- Windows widgets that thought wall time was better queried over the
  internet (it's nice to learn a computer can't emulate a basic clock
  without the web);
- etc.

Though most offenders do not bother to declare a user agent at all, which is
why I'm currently trying to get approval to block any access attempt that
lacks a User-Agent header.
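
To illustrate the idea (only a sketch, not our actual proxy setup; the
handler class and the port are made up for the example), refusing
header-less requests is a one-line check at the intermediary:

  # Minimal sketch: refuse any request that arrives without a
  # User-Agent header, returning 403 instead of forwarding it.
  from http.server import BaseHTTPRequestHandler, HTTPServer

  class RequireUserAgentHandler(BaseHTTPRequestHandler):
      def do_GET(self):
          if not self.headers.get("User-Agent"):
              # No User-Agent at all: reject instead of forwarding upstream.
              self.send_error(403, "Requests without a User-Agent header are refused")
              return
          # A real intermediary would forward the request upstream here;
          # this sketch just acknowledges well-behaved clients.
          self.send_response(200)
          self.send_header("Content-Type", "text/plain")
          self.end_headers()
          self.wfile.write(b"OK\n")

  if __name__ == "__main__":
      # Port 8080 is an arbitrary choice for the example.
      HTTPServer(("", 8080), RequireUserAgentHandler).serve_forever()

In practice the same check would sit in the proxy's ACLs rather than in
application code, but the effect is the same: no User-Agent, no service.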

>> It's usually less invasive to blacklist or limit just the specific
>> troublesome client
>
> seen from the client side. Note that it happens quite often that people
> use a "regular user agent" to run a spambot. So the user agent string is
> not an absolute signature of the user agent behind.

Spambots either do not know how to authenticate to the proxy (not because of
the evil way we do proxy auth, for lack of a good standard, but because they
don't know the user's password), or they are caught by its anti-malware
scanning (for HTTP), by the IPS probes, or by the evil-sites list provided
by the manufacturer (that's why having the full URL in the clear is useful
at the intermediary level; with only the IP or hostname, the security
equipment would not catch bot access patterns).

-- 
Nicolas Mailhot
Received on Tuesday, 17 July 2012 20:34:53 GMT