- From: Daniel Dardailler <danield@w3.org>
- Date: Wed, 10 Mar 1999 10:18:51 +0100
- To: "Silas S. Brown" <ssb22@cam.ac.uk>
- cc: w3c-wai-er-ig@w3.org, ping@lfw.org
> > > > Let me see if I get this clearly: you expect HTTP1.0 agents to
> > > > implement your extension before they implement a piece of 1.1 ?
>
> HTTP agents don't need to know about this at all. The only people who
> need to know about this are the ones who write gateways.

That's why gateways are HTTP agents.

> using something like Warning would probably not be a good idea, because
> other agents might think they're too clever and do something, whereas
> actually they're not supposed to know.

if the proper new warning code is used, why would they do something ?

> How did the Standard for Robots Exclusion come about?

I don't know, probably before HTTP1.1 stabilized and new work on
extension started.
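[Editor's sketch, not part of the original message: the Warning-header point above can be made concrete. HTTP/1.1 already defines warn-code 214 "Transformation applied" for exactly this case, and an agent that does not understand the Warning header simply ignores it. The following minimal Python sketch assumes a hypothetical transforming gateway; the host name gateway.example.org, the port, and the response body are illustrative placeholders only.]

```python
# Minimal sketch of a transforming gateway that flags its output with the
# HTTP/1.1 Warning header (warn-code 214, "Transformation applied") instead
# of inventing a new extension header.  Host name, port and body text are
# placeholders for illustration.
from http.server import BaseHTTPRequestHandler, HTTPServer

class TransformingGateway(BaseHTTPRequestHandler):
    def do_GET(self):
        # Pretend we fetched the upstream resource and simplified it
        # for the requesting user agent.
        body = b"<html><body>Simplified view of the original page.</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        # Tell downstream agents that the content was transformed.
        # Agents that do not know the Warning header ignore it.
        self.send_header("Warning",
                         '214 gateway.example.org "Transformation applied"')
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Arbitrary local port, for demonstration only.
    HTTPServer(("localhost", 8080), TransformingGateway).serve_forever()
```

[Because 214 is a registered warn-code rather than a new header, an agent that does understand Warning learns only that a transformation took place; this is the sense of the question "if the proper new warning code is used, why would they do something?"]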
Received on Wednesday, 10 March 1999 04:18:57 UTC