- From: Mark Nottingham <mnot@mnot.net>
- Date: Tue, 20 Oct 2009 09:32:48 +1100
- To: Adrien de Croy <adrien@qbik.com>
- Cc: "William A. Rowe, Jr." <wrowe@rowe-clan.net>, HTTP Working Group <ietf-http-wg@w3.org>
Adrien proposed:

> "Implementors of client applications SHOULD give consideration to
> effects that a client's use of resources may have on the network
> (both local and non-local), and design clients to act responsibly
> within any network they participate in. Some intermediaries and
> servers are known to limit the number of concurrent connections, or
> rate of requests. An excessive number of connections has also been
> known to cause issues on congested shared networks. In the past
> HTTP has recommended a maximum number of concurrent connections a
> client should make, however this limit has also caused problems in
> some applications. It is also believed that any recommendation on
> number of concurrent connections made now will not apply properly to
> all applications, and will become obsolete with advances in
> technology."

And William proposed:

>> """
>> Clients attempting to establish simultaneous connections SHOULD
>> anticipate the server to reject excessive attempts to establish
>> additional connections, and gracefully degrade to passing all
>> requests through the successfully established connection(s),
>> rather than retrying.
>> """

My .02 - both of these proposed requirements are quite vague and can't
be tested. I'd put forth that client authors know full well whether
they're being abusive, and a few sentences in HTTP isn't going to
convince them not to be.

At the most, we might add something like

"""
Note that servers might reject traffic that they deem abusive,
including an excessive number of connections from a client.
"""

(IMHO)

--
Mark Nottingham     http://www.mnot.net/
Received on Monday, 19 October 2009 22:33:24 UTC
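The graceful-degradation behaviour William describes can be sketched in code. This is a toy simulation, not any real HTTP library: the `Server`, `Client`, and `ConnectionRejected` names are illustrative, and the server here is just a connection counter. The point it demonstrates is the proposed client behaviour: open as many parallel connections as desired, but on rejection stop (rather than retrying) and multiplex all pending requests over the connections that were successfully established.

```python
class ConnectionRejected(Exception):
    """Raised when the server refuses an additional connection."""


class Server:
    """Toy server that caps concurrent connections from one client."""

    def __init__(self, max_conns):
        self.max_conns = max_conns
        self.open_conns = 0

    def connect(self):
        if self.open_conns >= self.max_conns:
            raise ConnectionRejected()
        self.open_conns += 1
        return self.open_conns  # simple connection id


class Client:
    """Client following the proposed behaviour: anticipate rejection,
    keep whatever connections succeeded, and do not retry."""

    def __init__(self, server, desired_conns):
        self.conns = []
        for _ in range(desired_conns):
            try:
                self.conns.append(server.connect())
            except ConnectionRejected:
                # Gracefully degrade: stop attempting, use what we have.
                break

    def send(self, requests):
        # Multiplex all requests round-robin over the surviving
        # connections instead of waiting for more to open.
        return {req: self.conns[i % len(self.conns)]
                for i, req in enumerate(requests)}


server = Server(max_conns=2)
client = Client(server, desired_conns=6)
assert len(client.conns) == 2  # only two attempts were accepted
routing = client.send(["a", "b", "c", "d"])
assert set(routing.values()) == {1, 2}  # every request still goes out
```

Whether such behaviour belongs in the specification is exactly what is in dispute above; the sketch only shows that "degrade rather than retry" is a small, testable policy on the client side.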