- From: Ian Hickson <ian@hixie.ch>
- Date: Mon, 1 Nov 2010 20:13:34 +0000 (UTC)
- To: "Eric J. Bowman" <eric@bisonsystems.net>
- cc: Adam Barth <ietf@adambarth.com>, httpbis <ietf-http-wg@w3.org>

On Mon, 1 Nov 2010, Eric J. Bowman wrote:
>
> In fact, many user agents (googlebot) only interpret content. Many of
> the issues you're suggesting HTTP be changed to account for, simply
> don't apply to user-agents *unless* they're rendering content.

As someone who works for Google, I assure you that we are interested in
seeing strict client-side conformance requirements, including those that
describe required error handling, in the HTTP specification. We want our
crawling infrastructure to interoperate with Web browsers, and currently
to do so we have to spend significant resources reverse-engineering those
browsers. It would be better for us, as well as fostering improved
competition in our space, if the specification could just define all this
in detail so we didn't have to reverse-engineer anything.

-- 
Ian Hickson               U+1047E                )\._.,--....,'``.    fL
http://ln.hixie.ch/       U+263A                /,   _.. \   _\  ;`._ ,.
Things that are impossible just take longer.   `._.-(,_..'--(,_..'`-.;.'
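[Editor's illustration, not part of the original message: a minimal sketch of the kind of error handling at issue. It assumes a crawler that wants to match typical browser leniency when parsing an HTTP/1.x status line; the function name, regex, and the specific tolerated deviations (missing reason phrase, bare-LF line endings, extra whitespace) are illustrative assumptions, not behavior defined by the HTTP specification of the time.]

```python
# Hypothetical sketch: lenient status-line parsing of the sort a crawler
# would otherwise have to reverse-engineer from browser behavior.
import re

def parse_status_line(line: bytes):
    """Parse an HTTP/1.x status line with browser-like leniency.

    Tolerates a missing reason phrase, a bare-LF terminator, and extra
    whitespace between fields -- inputs real servers have been seen to
    send, which a strictly spec-conforming parser would reject.
    """
    # Strip either CRLF or a bare LF terminator.
    text = line.rstrip(b"\r\n").decode("latin-1", errors="replace")
    # Grammar: HTTP-version SP status-code [SP reason-phrase],
    # with \s+ instead of a single SP and the reason phrase optional.
    m = re.match(r"HTTP/(\d)\.(\d)\s+(\d{3})(?:\s+(.*))?$", text)
    if m is None:
        raise ValueError(f"unparseable status line: {text!r}")
    major, minor, code, reason = m.groups()
    return (int(major), int(minor)), int(code), reason or ""

# Lines a strict parser would reject but this parser accepts:
print(parse_status_line(b"HTTP/1.1 200\n"))               # no reason phrase, bare LF
print(parse_status_line(b"HTTP/1.1  404  Not Found\r\n")) # extra whitespace
```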
Received on Monday, 1 November 2010 20:14:03 UTC