- From: ianG <iang@iang.org>
- Date: Mon, 22 Dec 2014 02:11:18 +0000
- To: michael.martinez@xenite.org, Donald Stufft <donald.stufft@gmail.com>
- CC: Chris Palmer <palmer@google.com>, "public-webappsec@w3.org" <public-webappsec@w3.org>, Daniel Kahn Gillmor <dkg@fifthhorseman.net>, blink-dev <blink-dev@chromium.org>, security-dev <security-dev@chromium.org>, mozilla-dev-security@lists.mozilla.org
On 19/12/2014 01:36 am, Michael Martinez wrote:
> I have been asked offlist to stand down for at least a day to let this
> discussion cool down. And it's obvious that those of you who are
> defending TLS against nothing I have said are tuning out what I am
> trying to say. This WILL be my last reply even though I know there are
> other responses that really are missing the point.
>
> On 12/18/2014 7:54 PM, Donald Stufft wrote:
>> Agreed. The paper only looks at mobile apps, of which only some were
>> found to be compromised. But those of you responding with objections
>> are completely missing the point. Google wants everyone to switch
>> over to using secure protocols and the execution will not only never
>> be perfect, the hackers already have sufficient information about how
>> the SYSTEM works that they are seeking other ways to bypass the
>> security. All they have to do is insert a rogue proxy somewhere in
>> the middle, and they can do that in a lot of different ways.
>> You’re missing a step here, “All they have to do is insert a rogue
>> proxy somewhere in the middle AND either get a certificate incorrectly
>> issued to them (really hard to do) or the client software in question
>> needs to not properly implement TLS. In this as far as anyone is
>> currently aware Chrome (and Mozilla, and all the major browsers) are
>> currently implementing TLS correctly so unless someone finds a bug
>> there (which would be promptly fixed) and the CA vendors are not
>> currently mis-issuing certificates.
>
> This has nothing to do with whether TLS works. Google brought its
> proposal to flag all non-secure Websites to the security lists. In
> responding to their proposal I have shown you multiple examples of how
> user privacy is violated in spite of the use of TLS. It's not that TLS
> doesn't work. It's that it is being used in a system that is full of
> holes.

In part, this is about one group trying to fix one problem with one protocol. Of course it highlights the problems in other places. But does that mean we shouldn't do the small fixes needed in one area?

> People who use their work networks to browse the Web have to accept that
> their employers have a legal right to monitor their activity; if they
> are in a public wifi spot and they connect to the wrong router, they are
> hosed. Even in a secure connection the employers and bad guys can still
> see where the packets are going if they control access point.

Yes, they can see the packets, but this is far less damaging. Really, the whole issue of traffic analysis is huge, difficult, and probably best left out of scope.

> I don't care how much TLS is improved from this point on (because in the
> end I have to trust my credit card data with the idiot storing that data
> in an unencrypted database that isn't properly firewalled).

OK, here, there is a sort of point. Back in the early 2000s, CAs stopped marketing that certs made sites secure. You won't find CAs advertising that the cert is needed for security. That's because annoying people like us made the point that they might have to substantiate that claim in court. And they knew they couldn't do it before a jury of (our) peers.

But it is still the case that people use the word "security" with websites and certs too loosely. We can't really avoid that. But that bad language should not be the reason not to do things.
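(An aside on the "rogue proxy" exchange quoted above, since that step is the crux: what "properly implement TLS" means there comes down to one client-side check. Here is a rough sketch using Python's stdlib ssl module -- proxy.example and the other names are made up for illustration, this is nobody's actual code -- of the difference between a client that does the check and one that doesn't:)

    # Rough sketch only: "proxy.example" stands in for a hypothetical rogue
    # middlebox that terminates TLS with its own certificate;
    # www.example.com is the site the user thinks they are talking to.
    import socket
    import ssl

    PROXY_ADDR = ("proxy.example", 443)   # hypothetical interception point
    INTENDED_HOST = "www.example.com"     # what the user typed

    # 1. A client that implements TLS properly: the default context requires
    #    a chain to a trusted CA *and* a certificate naming www.example.com.
    #    The rogue proxy can only satisfy that with a mis-issued certificate,
    #    so the handshake fails and the interception is detected.
    strict = ssl.create_default_context()
    try:
        with socket.create_connection(PROXY_ADDR, timeout=5) as raw:
            with strict.wrap_socket(raw, server_hostname=INTENDED_HOST):
                print("no interception (or a mis-issued certificate)")
    except ssl.SSLCertVerificationError as err:
        print("interception detected:", err.verify_message)

    # 2. The sort of broken client the quoted paper reportedly found in some
    #    mobile apps: verification switched off, so the proxy's own
    #    certificate is accepted and the "secure" channel is readable by the
    #    man in the middle.
    sloppy = ssl.create_default_context()
    sloppy.check_hostname = False
    sloppy.verify_mode = ssl.CERT_NONE
    with socket.create_connection(PROXY_ADDR, timeout=5) as raw:
        with sloppy.wrap_socket(raw, server_hostname=INTENDED_HOST):
            print("handshake 'succeeded' against the rogue proxy")

Note that even the strict client has already told the middlebox which site it wanted: the hostname travels in clear in the ClientHello (the SNI field). That is the traffic-analysis leak mentioned above -- they can see where the packets are going, just not what is in them.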
> I do care
> about whether the security community joins Google's campaign to convert
> the Web to using a protocol that is totally inappropriate in situations
> where no secure data exchange is required. Sure, I get that a Website
> login can be sniffed in a public wifi spot; that is why hackers use
> methods to bypass TLS protections.
>
> I ask that if you want to respond to me, then respond to my questions.
> Please don't bring up TLS and Chrome again. That isn't what this is about.
>
> Website owners will want to know why users should not trust their sites
> when they don't ask for or require credit card information. This
> proposal is part of Google's long-term campaign to change the entire
> Web. They have yet to explain why the Web needs to get off of HTTP. The
> fact a small percentage of people don't want anyone to know what sites
> they browse isn't good enough.

As I posted earlier, there are several motives:

1. To stop passive eavesdropping by eg/ie the NSA.
2. To stop datamining by ISPs, broadly, etc, and also to stop ISPs changing content.
3. To provide fuller authentication of websites, cf phishing.
4. To sell more certificates.

Let me talk about 3. Phishing started up in 2003, and since then has been used as the wedge to open up and finance the entire cyber fraud industrial sector. Perhaps worse, it is the lead tool in cyberwar, where "spear phishing" is used to spike open an employee's laptop and then get into a company.

Now, why does phishing work? Simple. The browser cannot tell the user the difference between a secure connection and an insecure connection. Pretty darn simple, isn't it? It's all because of the *existence* of HTTP, which is "nothing": no security statement, and nothing for the user to notice. In security terms this is a complete, utter security 101 fail. We all know this.

So how do we get out of the trap? Long hard analysis always leads to the same end-point: we should deprecate HTTP and go HTTPS for everything. Google's proposal is a midway point. Now, right at the moment, there is a building swell of support to head in that direction. Sure, it is messy. But the alternative is "no security" in effect, or a security facade, or security theatre, or a false sense of security. Whatever term you like.

>> Indicating that the connection isn’t secure isn’t forcing everyone to
>> use HTTPS. Disabling HTTP access altogether would do that, but nobody
>> is suggesting that.

Well, I wouldn't hide it: disabling HTTP altogether is possible in the future. But of course this is not part of the current proposal, and we're probably talking about 10 years hence, minimum, before there are moves to drop HTTP.

Also, the talk of name & shame is rather aggressive and fearmongering. It will probably happen, but it is unlikely to be effective; it never has been in the past.

>> All this proposal really does is have the user
>> agents be honest and ethically inform their users of the properties of
>> their current connection.

Correct -- the current proposal is about providing more of the complete security information, especially the "absence of SSL".

> It's an act of intimidation. You may not see it that way but many of
> the Website owners who have to deal with the implications of lost user
> trust DO. And how the victims of intimidation feel is very important in
> a discussion of the tools being used.

This is an over-reaction. Most website users will not see this unless they are laggards.

> There is nothing ethical in Google's proposal.

I think you may be confusing this with calls for name & shame. Google isn't doing that.
It will probably follow, but it isn't going to be that important. As it happens, we've been trying these things since 2005, when SSL v2 was first discovered to be a barrier to security. And the results were not scary.

Or you may be referring to the above comment about "honest and ethical informing", which is actually true. The sad part is that before this moment, browser vendors have been dishonest and unethical. Now, there is change in the air. If one browser starts putting better info up, then others should follow suit.

> It is a dirty,
> underhanded propaganda tactic that sidesteps a fair and open public
> discussion with the people who will be most affected. They are enticing
> the security community into supporting a sweeping change without fully
> explaining WHY they want it or why it should even be expected to provide
> any benefit.

Well -- it's complicated. They are talking to the other vendors and suggesting they are going ahead with changes that were first identified decades ago. This is not an underhanded propaganda effort at all; it's an awakening of some security spirit. But you are right that they are not explaining this to the general public. That's likely impossible.

> Earlier this year Google disclosed that it was giving a slight boost in
> its search results to Websites that use HTTPS. Many marketers
> immediately began discussing the implications of this algorithmic change
> but they don't know how to do the math.

Yes, all part of the same campaign. Good.

> If the same boost is applied to every site then the signal washes out in
> the mix. But it is a carrot that Google has dangled in front of the
> donkeys.

Yeah, whatever. I'm not seeing the problem. I care little for the marketeers' confusion.

> They love to dangle carrots. See those carrots for what they are.

Yawn.

> DEMAND A FULL EXPLANATION FROM GOOGLE for why this proposal should be
> adopted.

Well, frankly, Google aren't going to do that. Firstly, they do not say anything that isn't needed, because of SEC rules. Same as any company. Secondly, they aren't talking to you, they're talking to other vendors. Thirdly, this will mostly improve the position of users, not make it worse for them. Fourthly, as you can see from the convoluted discussions on this thread, literally nobody agrees with anything anyone says; the security effect of SSL is so layered in settled detritus that it's a historical artifact, not a system.

That said, if you're keen on battling towards better understanding, then it will take a while. It took me about 5 years to unravel the industry; don't expect it to take 5 days.

iang
Received on Monday, 22 December 2014 16:58:46 UTC