
Re: Semantics of HTTPS

From: Willy Tarreau <w@1wt.eu>
Date: Tue, 7 Aug 2012 07:44:10 +0200
To: "Adrien W. de Croy" <adrien@qbik.com>
Cc: Stephen Farrell <stephen.farrell@cs.tcd.ie>, Mark Nottingham <mnot@mnot.net>, "ietf-http-wg@w3.org Group" <ietf-http-wg@w3.org>
Message-ID: <20120807054410.GE7647@1wt.eu>
On Mon, Aug 06, 2012 at 11:37:15PM +0000, Adrien W. de Croy wrote:
> >Well just block those sites if you must. I don't see why inspection
> >is somehow better. I do see that some people might think inspection
> >is better, but if that's a falsehood then no conclusion can be drawn.
> >(False => anything, logically.) Evidence of the effectiveness of
> >MITM inspection (vs. endpoint mechanisms) would be good, but seems
> >to be missing.
> >
> I agree, I'd like to see some more evidence, e.g. rates of prevention 
> of malware (which commonly uses https to retrieve payload).
> 
> But there are plenty of security software vendors who will tell you 
> that malware spreads more and more with https.

This is not just a claim: about 8 years ago I observed the first
malware that communicated with the outside world only over https. At
that time it couldn't get past proxy authentication, so we had to
connect it directly to the internet to watch it escape the machine.
Now this seems to have become the norm, since anti-virus and
anti-malware proxies are supposedly deployed almost everywhere.

> They do a good job of convincing customers they need to scan https.

Our customers regularly ask us for this, because at the moment they
have to strike a balance between opening the "trusted" https sites
their users ask for and not opening too much, to limit the risk of
infection. With more sites migrating to https, it is becoming really
hard to avoid doing MITM. And doing so causes real security issues of
its own, because nothing was designed to inform the user about the
original site's certificate.

> >You left out an important thing: it requires sites (e.g. a bank)
> >to trust proxies the site has never heard of with the site's
> >customer data, e.g. payment information.
> >
> 
> Sure, my point was client-centric.  
> However, banks are already in this situation.  I use online systems of 
> 3 banks.  None require me to use a client certificate.  Is this just a 
> meteor waiting to hit?
> 
> They already are demonstrating either ignorance or trust of MITM 
> proxies operated by client organisations.  I won't do them the 
> disrespect of claiming it's ignorance.

No, one of my regular customers is a bank, and they're facing a deeper
issue: certificate management. Supporting it for millions of customers
is really, really tricky. Bank customers are not only geeks; a large
number of them are complete novices who run into issues you wouldn't
even imagine. Some do not hesitate to send an invoice for a PC repair,
claiming that a change in the site's design broke their PC... Others
happily connect to the site with ads all over the screen... Managing
certificates for such users is a real burden; it would cost a lot in
support. I'm not saying it's not going to happen, I'm just doubting it
will happen soon, considering that their #1 enemy is the MITB (the
man-in-the-browser, malware that sees everything in the clear) and not
the network.

Anyway, that's why I think CONNECT will still be used once we have
GET https://. GET will be used for normal https, and CONNECT for a
short whitelist of sites which must not be inspected and which must be
trusted.
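
To make the split concrete, here is a minimal sketch of such a
proxy-side policy. The hostnames and function names are hypothetical,
not from any spec or proposal: CONNECT to a whitelisted host is
tunnelled blindly, absolute-form GET https:// is terminated and
scanned by the proxy, and everything else falls through.

```python
# Hypothetical whitelist of sites the proxy must never inspect
# (illustrative names, not from the discussion).
TRUSTED_TUNNEL_ONLY = {"bank.example", "healthcare.example"}

def proxy_policy(request_line: str) -> str:
    """Classify a proxy request line into a handling decision."""
    method, target, _version = request_line.split(" ", 2)
    if method == "CONNECT":
        host = target.rsplit(":", 1)[0]      # target is host:port
        if host in TRUSTED_TUNNEL_ONLY:
            return "tunnel"                  # blind relay, no inspection
        return "deny"                        # force the inspectable GET form
    if method == "GET" and target.startswith("https://"):
        return "inspect"                     # proxy terminates TLS and scans
    return "forward"                         # plain http:// traffic
```

With such a policy the user-visible trade-off is explicit: a CONNECT
tunnel preserves end-to-end privacy for the whitelisted sites, while
the GET https:// form lets the proxy provide malware scanning for
everything else.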

> >Do we really want to engineer the web so as to allow a company
> >proxy to prevent payments to the company's favourite bad cause?
> >That's what's being enabled here. Its a bad plan.
> >
> 
> It can already happen.  If we want to stop it, that's yet another 
> direction to move in.  What we're proposing has no impact on that.

Also, proxy usage is clearly defined in the HTTP spec, so the HTTP
spec already allows a MITM to reject some websites based on the
destination address. Should we say the IETF was wrong to allow this?
I don't think so; whatever we design will make it possible for someone
along the path to roughly classify the traffic and decide what to do
with it. SNI/NPN already provide similar information that can be used
for filtering traffic.
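
As an illustration of that point, the sketch below (my own example,
not anything proposed in this thread) pulls the SNI hostname out of a
raw, unencrypted TLS ClientHello, which is all an on-path box needs to
classify the traffic without terminating TLS. Offsets follow the
TLS record and ClientHello layout (RFC 5246 / RFC 6066); a real
implementation would also have to handle fragmented records.

```python
import ssl

def extract_sni(data: bytes):
    """Return the SNI hostname from a raw TLS ClientHello record, or None."""
    if len(data) < 5 or data[0] != 0x16:              # 0x16 = handshake record
        return None
    hs = data[5:5 + int.from_bytes(data[3:5], "big")]
    if not hs or hs[0] != 0x01:                       # 0x01 = client_hello
        return None
    p = 4 + 2 + 32                                    # hs header, version, random
    p += 1 + hs[p]                                    # session_id
    p += 2 + int.from_bytes(hs[p:p + 2], "big")       # cipher_suites
    p += 1 + hs[p]                                    # compression_methods
    end = p + 2 + int.from_bytes(hs[p:p + 2], "big")  # extensions block
    p += 2
    while p + 4 <= end:                               # iterate extensions
        etype = int.from_bytes(hs[p:p + 2], "big")
        elen = int.from_bytes(hs[p + 2:p + 4], "big")
        body = hs[p + 4:p + 4 + elen]
        if etype == 0 and len(body) >= 5 and body[2] == 0:  # server_name
            nlen = int.from_bytes(body[3:5], "big")
            return body[5:5 + nlen].decode("ascii")
        p += 4 + elen
    return None

# Generate a real ClientHello in memory to exercise the parser.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE
incoming, outgoing = ssl.MemoryBIO(), ssl.MemoryBIO()
tls = ctx.wrap_bio(incoming, outgoing, server_hostname="example.com")
try:
    tls.do_handshake()                                # writes the ClientHello
except ssl.SSLWantReadError:
    pass                                              # expected: no peer reply
client_hello = outgoing.read()
```

Here extract_sni(client_hello) recovers "example.com" from the bytes
alone, which is exactly the classification signal a filtering box gets
today from CONNECT host names or SNI, with no new protocol machinery.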

So we're not proposing anything new, just suggesting another way to
use a proxy to relay https so that the user decides whether he prefers
privacy or malware protection. Right now he cannot choose.

Willy
Received on Tuesday, 7 August 2012 05:44:44 GMT