- From: Mike Belshe <mike@belshe.com>
- Date: Thu, 19 Sep 2013 00:00:28 -0700
- To: Willy Tarreau <w@1wt.eu>
- Cc: Stephen Farrell <stephen.farrell@cs.tcd.ie>, "Roy T. Fielding" <fielding@gbiv.com>, Mark Nottingham <mnot@mnot.net>, IETF HTTP WG <ietf-http-wg@w3.org>
- Message-ID: <CABaLYCsc6eLv7y=25SUrqnUdiVka3VBxqfy8T2n2OFntR-qp0Q@mail.gmail.com>
Mark - I was about to write that I didn't like your proposal :-(  But after reading Willy's argument, maybe I see a route where it can be useful.

Specifically, Willy's retort is the common one - discussing "dumb administrators" and the fact that TLS is not a panacea. This argument doesn't appeal to me because any of us can craft a reasonable story about the clueless guy to support our own point of view. And the clueless guys don't read protocol specs anyway :-)

But we don't need to talk about the clueless guy, because even smart administrators who do read these specs don't always know when to encrypt. This could be demonstrated with an example. Appending to your own text:

Common use of HTTP often carries a considerable amount of sensitive data; this might include cookies [RFC6265], application data, and even patterns of access. When used without encryption, HTTP leaves this data vulnerable to passive interception. There are known instances where third parties have exploited the in-the-clear nature of "http://" URIs to obtain sensitive information for a variety of purposes. Use of TLS [RFC2818] can help to mitigate such passive interception attacks. Note that TLS itself has many modes of use with different security properties, and there are currently known attacks against server authentication in "https://" URIs.

*Content providers should be aware that while content may be public information, and seemingly not "sensitive", the fact that a consumer is observed accessing that information may still be sensitive. For example, suppose an encyclopedia content provider has a public page on sexually transmitted diseases. There is nothing confidential or private in that page. A medical student may access that page as research, and not care whether it is publicly known that they viewed it. However, a patient may be accessing that page because they are researching a sensitive issue. This user, unlike either the content provider or the medical student, benefits from additional privacy when the content is delivered via HTTP over TLS rather than HTTP alone.*

I dunno - I'm still not in love with this - words without teeth are very close to pontificating, and I'd rather hold out for a protocol that actually walks the talk... Can we do that instead? :-)

Mike

On Wed, Sep 18, 2013 at 10:53 PM, Willy Tarreau <w@1wt.eu> wrote:

> On Thu, Sep 19, 2013 at 02:55:00AM +0100, Stephen Farrell wrote:
> > "Properly used, TLS provides good confidentiality
>
> The problem is precisely here. The mechanism is too complex for the
> casual web admin to deploy correctly and to understand the
> implications of his choices, not to mention the client side, which
> is generally worse as soon as it's not a browser. TLS is only safe
> iff properly used, and very few people know how to use it properly.
> Thus they deploy, feel safe, and believe they have nothing else to
> care about.
>
> The only really safe implementations I have seen were in clear text.
> Why? Simply because their authors knew that a TLS deployment would
> eventually be degraded by clueless admins, so they considered that
> they needed something robust even when TLS was broken. As a result
> they did all the work in the application (encrypting/signing
> sensitive data, timestamping/signing HTTP headers), and the transport
> was as safe as a good TLS deployment, without the risk that the
> transport would be degraded further.
>
> That's why I don't like promoting it as the easiest path to
> confidentiality. It's only one element, but we too often spread the
> word that it's sufficient, which is totally wrong and
> counter-productive.
>
> Willy
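A minimal sketch of the application-layer hardening Willy describes: signing and timestamping sensitive values so their integrity and freshness can still be verified even if the transport's TLS is misconfigured or stripped. The header names, shared-key handling, and freshness window below are illustrative assumptions, not anything specified by HTTP or proposed in this thread.

```python
import hashlib
import hmac
import time

SHARED_KEY = b"replace-with-a-real-secret"  # assumed out-of-band shared key
MAX_AGE_SECONDS = 300                        # assumed freshness window


def sign_headers(headers: dict[str, str]) -> dict[str, str]:
    """Return the headers plus a timestamp and an HMAC over both."""
    signed = dict(headers)
    signed["X-App-Timestamp"] = str(int(time.time()))
    payload = "\n".join(f"{k.lower()}:{v}" for k, v in sorted(signed.items()))
    signed["X-App-Signature"] = hmac.new(
        SHARED_KEY, payload.encode(), hashlib.sha256
    ).hexdigest()
    return signed


def verify_headers(headers: dict[str, str]) -> bool:
    """Check the HMAC and reject stale or tampered headers."""
    received = dict(headers)
    signature = received.pop("X-App-Signature", "")
    payload = "\n".join(f"{k.lower()}:{v}" for k, v in sorted(received.items()))
    expected = hmac.new(SHARED_KEY, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(signature, expected):
        return False
    age = time.time() - int(received.get("X-App-Timestamp", "0"))
    return 0 <= age <= MAX_AGE_SECONDS


if __name__ == "__main__":
    outgoing = sign_headers({"Cookie": "session=abc123", "Host": "example.com"})
    assert verify_headers(outgoing)       # intact headers verify
    outgoing["Cookie"] = "session=evil"   # a tampered value fails verification
    assert not verify_headers(outgoing)
```

This only covers integrity and replay protection at the application layer; confidentiality of the values themselves would still require encrypting them (or a properly deployed TLS layer), which is exactly the trade-off being debated above.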
Received on Thursday, 19 September 2013 07:00:58 UTC