Re: Fwd (TAG): Draft finding - "Transitioning the Web to HTTPS"

Nick Doty wrote:
>
> Other responses on this thread (and in particular links from Domenic
> and Chris) might give more info than I can provide, but I’ll try to
> answer briefly a couple of these questions.
> 

And I sincerely thank you for that, even if my contrariness rankles.

The problem, as I see it, is that the debate is framed as no-auth HTTP
vs. HTTPS -- with no discussion of HTTP Digest and how it might be
improved to solve the problems HTTPS purports to solve, without what I
see as HTTPS's drawbacks.

>
> Eric J. Bowman wrote:
>
> >> 
> >> To summarize, sites are still exposing information about their
> >> users when they force visitors to use HTTP, even if there are no
> >> authentication cookies. In particular, the user’s reading habits
> >> are exposed (which page on your site are they reading? does that
> >> page contain words of political interest?). Non-authentication
> >> cookies are used to surveil users or identify them for more
> >> invasive attack [0].
> >> 
> > 
> > Sorry, but is this problem actually solved by HTTPS in an era of
> > supercookies? HTTPS doesn't hide the visited site's IP from network
> > admins. So I still believe (until convinced otherwise, but I'm a
> > hard audience) W3C should come up with something better instead of
> > copping out to ubiquitous HTTPS despite its known problems.
> 
> Super-cookies and browser fingerprinting do create privacy issues for
> Web users. But I’m not sure why we would conclude from that that
> HTTPS won’t be helpful. In particular, HTTPS provides improvements in
> confidentiality from network attackers (passive or active) not from
> the site you’re communicating with. (That’s rather the point: you
> want to communicate with the site you’re communicating with!) That
> sites can use persistent identifiers to re-identify you is an
> orthogonal problem, but we at least wouldn’t want to provide
> visibility into all of those identifiers to a passive adversary on
> the network.
> 

I'm not sure why we should conclude that Digest Auth is unhelpful. ;-)
My problem with HTTPS is that the weakest link in the CA chain negates
any amount of money I pay Thawte for a cert, or any cert I get for free
from anybody. The hoped-for trust model never materialized, although I
hear it's coming in Summer 2015 along with the latest blockbuster
superhero film...

I'm skeptical of solving the fingerprinting problem at the
application-protocol layer. Though I do find it curious that those
HTML5 "features" I didn't like the smell of in the first place, for
breaking the back button, also happen to be the basis of
supercookies...

As for knowing you're communicating with whom you think you're
communicating: means exist to detect content tampering by ISPs, Web
hosts, black hats, and even end users over HTTP:

http://www.cs.washington.edu/research/security/web-tripwire/nsdi-2008.pdf

Those means also apply to HTTPS, because going down that road doesn't
solve the content-injection problem, as the study shows. I find it
interesting that Content-MD5 showed this isn't feasible at the protocol
layer (due to range requests), yet it's mostly trivial to simply not
present a login box to the user if the page it sits on has been
changed, regardless of protocol.
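
To make that "suppress the login box" idea concrete, here is a minimal
sketch in Python. The URL and the recorded checksum are hypothetical
placeholders, and a real tripwire runs client-side as in the study, but
the comparison is the same:

    import hashlib
    import urllib.request

    PAGE_URL = "http://example.com/login.html"  # hypothetical page
    KNOWN_SHA256 = "..."  # checksum recorded at publish time (placeholder)

    # Fetch the page as an end user would and hash the complete entity
    # body. (Hashing the complete body is exactly what Content-MD5
    # couldn't guarantee once range requests enter the picture.)
    body = urllib.request.urlopen(PAGE_URL).read()

    if hashlib.sha256(body).hexdigest() != KNOWN_SHA256:
        print("Page differs from published copy; suppress the login box.")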

The thing about HTTP Digest is that the user agent _can_ authenticate
the server if the login happens over TLS; subsequent communication can
then use plain HTTP and caching -- even for authenticated users, even
with content personalization. Which moots my CPU argument.
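
That flow, sketched with Python's requests library (the endpoints and
credentials are made up; requests handles the Digest challenge and
response automatically):

    import requests
    from requests.auth import HTTPDigestAuth

    auth = HTTPDigestAuth("alice", "secret")  # hypothetical credentials

    # Log in over TLS: the server is authenticated by its certificate,
    # and the Digest handshake never crosses the wire in the clear.
    requests.get("https://example.com/login", auth=auth)

    # Subsequent requests ride plain, cacheable HTTP, still carrying
    # Digest credentials rather than a bearer cookie.
    requests.get("http://example.com/members/page", auth=auth)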

>
> HTTPS is also not a silver bullet, as all agree. But users can
> benefit from the network adversary not knowing all the details of the
> pages they visit even if the destination IP address is revealed.
> 

The user in me is content to know whether a page has been tampered
with, or not. The admin in me knows anyone can set up an account to see
hidden content and pronounce guilt by association, even if I've not
viewed that particular content; and he isn't sure that using Tor to
access certain sites doesn't put a bigger target on my back than just
visiting over HTTP without even logging in. See anarchology.org.

Which is the long way of saying that the site owner in me doesn't care
to have the onus of user privacy on his back. Certainly not from a
customer-service standpoint; it's like marketing a restaurant as
"healthy" when everyone has their own definition of that term.

All I can do, regardless of protocol, is inform users when the content
I've published isn't the content they see -- but I don't need HTTPS to
do that for the 10% or so of affected requests, according to the study
I linked. The remaining 0.3% represent users who will "discover" Tor
next week, and they aren't worth the cost of moving to HTTPS.

Or they're black hats. One thing I've learned doing more than one
small-mountain-town law-firm website: be prepared to get slashdotted if
your client takes on someone who's represented by a hired gun. Spend
their money on anycast DNS service; I had no issues during a 250 Gbps
(!) DNS-based attack earlier this year on another UltraDNS client.

The one service I still provide former clients is reselling DNS. It
doesn't pay the bills; the value is in the references. Paying for low
latency at the DNS layer is more cost-effective than the better
CPUs/hosting required for an HTTPS migration, particularly looking at
DME's pricing and latency. Spending money to *increase* latency just
isn't my bag, baby.

>
> >> Also, without integrity guarantees, HTTP sites expose their users
> >> to the risk of running any script the attacker wishes to introduce,
> >> including potentially asking for access to sensitive device APIs.
> >> Network attackers can also introduce identifiers or modify content
> >> for HTTP browsing. That is, integrity also helps with
> >> confidentiality and other privacy concerns [1].
> > 
> > But still, why HTTPS? When I can solve those problems using HTTP
> > Digest and auth-int? To solve a problem nobody I know has ever come
> > across in the first place? So many of the reasons to not use HTTP
> > come across as FUD when I delve deeper -- theoretical problems pale
> > in comparison to actual problems, there are enough of those to go
> > around without resorting to scare tactics.
> 
> Among the documented security issues with HTTP Digest is that it’s
> vulnerable to a man-in-the-middle network adversary. This is not at
> all just a theoretical problem. For example, see the lists in Chris
> Palmer’s email in this thread, or the list in the Chromium FAQ
> referred to earlier:
>

There's also RFC 4169. None of the explanations of HTTP Digest's
vulnerabilities assume auth-int + AKAv2 with no RFC 2069 fallback.
Change MD5 to bcrypt and I get "pretty good privacy" without having to
accept higher latency by giving up Celerons/T1s and caching, without
being relegated to the post-Net-Neutrality slow lane unless I pay
extortion, etc.
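
For reference, here's the auth-int calculation from RFC 2617, sketched
in Python with the hash algorithm made pluggable. The pluggable part is
my suggestion above, not anything RFC 2617 defines (it specifies only
MD5), and bcrypt specifically would further need a shared salt and cost
carried in the challenge:

    import hashlib

    def hexdigest(data: str, algo: str = "md5") -> str:
        return hashlib.new(algo, data.encode()).hexdigest()

    def digest_response(user, realm, password, method, uri,
                        nonce, nc, cnonce, body, algo="md5"):
        # qop=auth-int folds a hash of the entity body into HA2, so a
        # man-in-the-middle that rewrites the body invalidates the
        # response.
        ha1 = hexdigest(f"{user}:{realm}:{password}", algo)
        ha2 = hexdigest(f"{method}:{uri}:{hexdigest(body, algo)}", algo)
        return hexdigest(f"{ha1}:{nonce}:{nc}:{cnonce}:auth-int:{ha2}", algo)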

Giving the gewgaws, like ad services, a better and cheaper way of doing
things would reduce the invalid-cert popup problem -- the Doom of the
Web. (Film at 11.)

> 
> (It’s also intended for user authentication, not for authenticating
> the site. It isn’t intended to provide any measure of
> confidentiality. As RFC 2617 points out, message integrity is not
> provided against a man-in-the-middle attacker.)
> 

OK, but is the client authenticating the server all that important for
every request, or just for logging in? I don't see how that can avoid
involving a third party, but that's the inherent weakness of the CA
system. Can't we just authenticate content? If it's been altered, users
don't log in from there, or trust what they're seeing. Problem solved,
in a way Web developers can control (unlike invalid-cert popups),
without needing ubiquitous HTTPS or overwhelming end users with
invalid-cert warnings.

-Eric

Received on Saturday, 20 December 2014 10:04:40 UTC