- From: Eliot Lear <lear@cisco.com>
- Date: Tue, 8 Dec 2015 08:16:08 +0100
- To: Martin Thomson <martin.thomson@gmail.com>
- Cc: HTTP Working Group <ietf-http-wg@w3.org>
- Message-ID: <566683B8.1060406@cisco.com>
Hi Martin,

Thanks for your response. Please see below.

On 12/7/15 10:51 PM, Martin Thomson wrote:
> I think that your proposal had some sort of implication that malware
> scanning was the responsibility of some sort of intermediary. That's
> probably unintentional. I think that all we need to do is acknowledge
> that this is a (value-neutral) choice.

Based on your explanation I now understand what you're saying. You have indeed surfaced an unstated architectural claim held by each of us. Mine is this: the more opportunities there are for bad links, the more infections there will be. The converse holds true, of course. Furthermore, not all systems will be inoculated against malware. Let us assume that this statement is true, or else Google and Yahoo! are needlessly spending a whole lot of money scrubbing such links.

Yours, I think, is this: the end system is in the best position to protect itself against malware. I largely agree.

Let's agree that these two architectural principles are not mutually exclusive. I will add one more assertion that I believe we can also agree on, which is that some people can be fooled into clicking on links. Phishing is not the only threat vector. A full-scale automated system that uses this encryption mechanism would also be at risk, and perhaps more dangerous. But the fact that there is at least one threat that is well understood by a lot of people is enough to recognize the problem.

The threat we must address is the exacerbation of attacks (phishing being one example) against encrypted files where attackers have access to recipients' public keys/certs. Stating that "clients might need to use other means of protection..." addresses the latter architectural statement without acknowledging the former. We need to go just a bit further and simply state that "all systems that receive the encrypted object are advised to take what precautions they can to have some confidence that the object is free of malware."
I would then suggest that several examples are in order, so as not to be too opaque.

As an aside, part of the problem with this discussion is the use of the term "intermediary". While perhaps correct in some sense, it is so vague that it introduces ambiguity and may cause additional confusion in such discussions. In one of the cases I mentioned, we have a classic cache, which neatly fits the term "intermediary" as everyone understands it. But the term is expansive in its use, and thus a tad misleading, in the case where the server is *the* origin server for the object. The above change covers both cases, by the way.

Eliot
Received on Tuesday, 8 December 2015 07:16:39 UTC