- From: Noah Mendelsohn <nrm@arcanedomain.com>
- Date: Wed, 18 Feb 2015 23:08:04 -0500
- To: Bjoern Hoehrmann <derhoermi@gmx.net>
- CC: "www-tag@w3.org List" <www-tag@w3.org>
On 2/17/2015 3:32 PM, Bjoern Hoehrmann wrote:

> The requirement you cited is a MUST-level requirement, it is not a preference.

Yes.

> You are arguing that such a crawler cannot be implemented without violating the protocol.

Yes, it violates a MUST. That doesn't mean that building such a crawler is a bad thing. Lots of us violate specifications in the privacy of our own organizations to achieve some purpose or other; debugging is a classic case, where we retain things that specifications say should not be cached, and so on. But yes, I don't think it conforms to RFC 3986 and friends for the dereference of an https-scheme URI.

In your example, if I understand it, this is being done to answer a question about the state of the Web itself (what's deployed with weak encryption, etc.). The situation with ISPs violating the specification seems to me very different in spirit: ISPs are doing this specifically to interfere with the contract between users and resource providers, in exactly the situation the specification was written to address.

Noah
Received on Thursday, 19 February 2015 04:08:26 UTC