
Re: [MIX] 4 possible solutions to the problem of Mixed Content Blocking stalling HTTPS deployment

From: Eric Mill <eric@konklone.com>
Date: Wed, 4 Feb 2015 02:42:14 -0500
Message-ID: <CANBOYLXMgiP3HDc6aCyFF9K1QMDaWig-JsGrA7Zx5=vxbkm-NQ@mail.gmail.com>
To: Tom Ritter <tom@ritter.vg>
Cc: Peter Eckersley <pde@eff.org>, "public-webappsec@w3.org" <public-webappsec@w3.org>, technologists@eff.org
On Tue, Feb 3, 2015 at 9:38 PM, Tom Ritter <tom@ritter.vg> wrote:

> On 2 February 2015 at 18:21, Peter Eckersley <pde@eff.org> wrote:
> > 0. Really force millions of site operators to edit all their code.  If
> > we're going to do this and expect to win, we had better provide much,
> > much better tools for doing it.  Perhaps Let's Encrypt can enable
> > report-only CSP and host analytics somewhere to tell webmasters where
> > things are going wrong.  Perhaps, instead of a mysterious MCB shield,
> > browsers could provide a list of required edits to HTML and JS source
> > files to fix MC.  Even with measures like this, I believe option 0 would
> > leave far too much of the work to be done by millions of fallible
> > humans.
> I think this will be necessary for people whose first priority is to
> provide a secure browsing experience to their users.  I expect that
> Google's effort to move to HTTPS everywhere was not a half-dozen
> edits, but a pretty laborious process.  But it was what was necessary.
> And (like I mentioned in the other thread) I think a CSP mechanism for
> "Tell me about the non-SSL includes I have" is a great idea.
I haven't used them yet, but CSP reports for insecure includes seem like a
potentially fantastic triage mechanism, so a site can evaluate what work it
will have to do before it can safely turn on HTTPS for all browsers.

This means that the CSP would need to be delivered over HTTP, and not by
Let's Encrypt (unless LE had a mode explicitly meant for sites to triage
HTTPS migration).
> > 2. Alter their HSTS implementations, so that those help solve rather than
> > exacerbate mixed content situations.  Perhaps this is only realistic
> > within first party origins, plus those third party origins that have
> > themselves enabled HSTS.

agl's main argument against considering HSTS before MCB was that it would
lead to inconsistent site behavior, based on whether the browser had seen
HSTS or not.

But if a site is in the HSTS preload list, this concern doesn't apply. And
I can think of some high value domains (youtube.com and its variants,
common high profile CDNs) that would make a huge dent on this front.
(YouTube is in the preload list, but only for PKP and not HSTS.)
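For reference, the header a site serves to opt into HSTS -- and, with the
`preload` token, to signal eligibility for the browser preload lists -- looks
like this (the max-age value is just a typical one-year choice):

```http
Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
```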

> > 3. Add a new directive to the HSTS header, which sites (and the Let's
> > Encrypt agent working on behalf of sites) can set.  It could be called
> > the "easy" or "helpful" bit.  In slogan form, the semantics of this
> > would be "if you're a modern client that knows how to do Helpful HSTS,
> > this site should be entirely HTTPS; if you're an old client that has
> > MCB/HSTS but doesn't know about Helpful HSTS, leave the site on HTTP".

This is helpful, but it is also quite a long game -- while it may get more
traffic encrypted in the short term, it doesn't contribute to helping
browsers deprecate HTTP until Helpful-HSTS-enabled clients are
overwhelmingly in use. And during all that time, there's little pressure on
site owners to move their site fully over to HTTPS.
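Concretely, Peter's proposed bit would presumably ride on the existing HSTS
header as a new directive; the `helpful` token name comes from his proposal
above, and the syntax here is only a sketch, not anything specified:

```http
Strict-Transport-Security: max-age=31536000; includeSubDomains; helpful
```

An old client would ignore the unknown directive and keep its usual MCB/HSTS
behavior, which is what makes the backward-compatibility question below tricky.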

> > There's a question about how to get the HSTS Helpful bit to the
> > client if the server is trying to leave the HTTP version of their site
> > alive for clients with traditional MCB implementations.

Seems like you'd have to make the site's canonical URL HTTPS, and do UA
detection to downgrade clients to HTTP? Hopefully I'm wrong about that and
there are better ways?

> > The Helpful bit should probably also have a way for site operators to
> > request and control automatic upgrades of embedded third party
> > resources.  That could range from "try every third party resource over
> > HTTPS, all of the time", through a whitelist of domains for which
> > this should happen, through to the full gory details of putting
> > something flexible like HTTPS Everywhere rulesets in the HSTS directive.
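For context, an HTTPS Everywhere ruleset is a small XML document roughly like
the following (the ruleset name and hostname are illustrative):

```xml
<ruleset name="Example CDN">
  <target host="cdn.example.com" />
  <rule from="^http:" to="https:" />
</ruleset>
```

Embedding anything that expressive in an HSTS directive would be a significant
departure from the header's current flat token syntax.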

I wonder if the simpler solution here is to push the HSTS preload list even
harder, and to give inclusion in that list the added benefit that you don't
need to go around fixing mixed content warnings, because the browser is
willing to glance at the preload list first before making an MCB decision.
Of all these options, so far I think only tools for 0, and maybe 1, would
help a large site move entirely to HTTPS and HSTS in the next 1-2 years.
That's not the only goal here, I know, but it is my current focus.

-- Eric

konklone.com | @konklone <https://twitter.com/konklone>
Received on Wednesday, 4 February 2015 07:43:27 UTC
