
Re: Considering the pressure to turn HTTPS into a three-party protocol

From: Bjoern Hoehrmann <derhoermi@gmx.net>
Date: Mon, 16 Feb 2015 23:47:09 +0100
To: Noah Mendelsohn <nrm@arcanedomain.com>
Cc: "www-tag@w3.org List" <www-tag@w3.org>
Message-ID: <u3q4ea11pjfgoedu7a8iljd4gtgnv71kkd@hive.bjoern.hoehrmann.de>
* Noah Mendelsohn wrote:
>"2.7.2.  https URI Scheme
>    The "https" URI scheme is hereby defined for the purpose of minting
>    identifiers according to their association with the hierarchical
>    namespace governed by a potential HTTP origin server listening to a
>    given TCP port for TLS-secured connections ([RFC5246]).
>    All of the requirements listed above for the "http" scheme are also
>    requirements for the "https" scheme, except that TCP port 443 is the
>    default if the port subcomponent is empty or not given, and >>the user
>    agent MUST ensure that its connection to the origin server is secured
>    through the use of strong encryption, end-to-end, prior to sending
>    the first HTTP request.<<" [Emphasis mine...Noah]
>
>I think you can make the case that this REQUIRES unmodified end-to-end
>communication to the origin server. This is not specific to browsers,
>users, or configuration of machines. It's the definition of the https URI
>scheme.
>
>Now, we can also point out that the user agent is, well, the user's agent. 
>If a user chooses to modify his/her agent with an extension that does 
>something other than what the normative specifications require, the user 
>can do that, but hasn't he or she now turned it into something 
>non-conforming? Isn't that what's happening when the user (intentionally or 
>otherwise), grants permission for an extension, ISP proxy, or other means
>of replacing the end-to-end check with a MITM?

The specification defines the behavior of user agents, not the behavior
of a locked-down kiosk in some heteronomous computing environment, and
users are naturally the ultimate authority for any trust decisions their
agents may make. It would be ridiculous, for instance, to argue that a
web crawler designed to scan web sites for malware violates the rules of
the protocol when it scans web sites that only support ciphers no longer
considered secure enough for financial transactions.
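To make that concrete, here is a minimal sketch (my own illustration, not
anything from the spec) of how such a scanner's operator might express that
trust decision in Python's ssl module: the operator deliberately relaxes the
cipher policy a browser would enforce, while certificate verification itself
stays on. The "@SECLEVEL=0" setting assumes a reasonably recent OpenSSL.

```python
import ssl

# Hypothetical malware-scanner configuration: the operator (the "user")
# chooses to accept legacy cipher suites that interactive browsers no
# longer allow, so the crawler can still reach poorly maintained sites.
ctx = ssl.create_default_context()

# OpenSSL security level 0 re-enables weak, legacy cipher suites.
# Note that hostname checking and certificate verification remain
# enabled -- only the cipher policy is relaxed.
ctx.set_ciphers("DEFAULT:@SECLEVEL=0")
```

Whether that configuration is conforming is exactly the question of who the
"user" behind the user agent is; the agent is still doing what its user told
it to do.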
Björn Höhrmann · mailto:bjoern@hoehrmann.de · http://bjoern.hoehrmann.de
D-10243 Berlin · PGP Pub. KeyID: 0xA4357E78 · http://www.bjoernsworld.de
Available for hire in Berlin (early 2015) · http://www.websitedev.de/
Received on Monday, 16 February 2015 22:47:45 UTC
