
Re: "Mixed Content" draft up for review.

From: Yan Zhu <yan@eff.org>
Date: Mon, 02 Jun 2014 06:46:05 -0700
Message-ID: <538C801D.8000008@eff.org>
To: rsleevi@chromium.org, Mike West <mkwst@google.com>
CC: Daniel Veditz <dveditz@mozilla.com>, Anne van Kesteren <annevk@annevk.nl>, palmer@chromium.org, Brad Hill <bhill@paypal.com>, Tanvi Vyas <tanvi@mozilla.com>, "public-webappsec@w3.org" <public-webappsec@w3.org>
On 06/02/2014 06:15 AM, Ryan Sleevi wrote:
> 
> On Jun 2, 2014 5:35 AM, "Mike West" <mkwst@google.com
> <mailto:mkwst@google.com>> wrote:
>>
>> I believe that both Chrome and Firefox currently implement the
>> opposite of what's specced here (i.e. mixed content checking happens,
>> then HSTS happens). I think the relevant folks consider that a bug,
>> but I'd invite Ryan and Chris to weigh in (Hi Ryan and Chris, whom I
>> just added to CC!). If I've misinterpreted things, I'm happy to change
>> that piece.
>>  
> 
> Correct. Mixed content detection is performed over the URLs as written
> in the source. If you (the author) say to load something over HTTP,
> while the active document is sourced from HTTPS (possibly due to HSTS),
> then we treat that as mixed content.
> 
> That's because even though the fetch will (probably) be secured via
> HSTS, you attempted to fetch insecurely.
> 
> I believe we have debated this with Mozilla in the past as to
> whether it was a bug or a feature. Our (Chrome) view is that it's a
> feature, and that it is more important to warn authors about the
> potential mixed content than to have users rely on HSTS. Mozilla was
> debating whether the mixed content checking should happen based on the
> effective transport.
> 
> With respect to where HSTS fits within Fetch, as implemented in
> Chrome, it is treated as a synthetic redirect, "as-if" a 301 was
> received. We do not rewrite the content in situ, since any
> programmatically induced loads also need to evaluate HSTS policy.
> Further, at any point during a fetch of a subresource, the HSTS policy
> may be invalidated (max-age of 0), further supporting the unconditional
> treatment of such loads as mixed.
> 
> Additionally, there has been debate about where and how to handle
> extension rewriting. Tools like HTTPS Everywhere currently get the same
> treatment/behaviour as HSTS, meaning mixed content signalling, whereas
> they would like to be able to suppress it, since loading over HTTPS
> was not the author's intent but rather something the extension did.
> The desire here is to base mixed content on the 'effective' URL (or
> first non-synthetic URL). However, that has not been implemented in
> Chrome yet.
> 
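[A minimal sketch of the ordering Ryan describes: mixed content is
checked against the URL as written in the source, and only afterwards is
HSTS applied as a synthetic redirect. The function names and the
in-memory HSTS store here are hypothetical illustrations, not Chrome's
actual internals.]

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical HSTS store: hosts with a current (non-expired) policy.
hsts_hosts = {"example.com"}

def is_mixed_content(document_url: str, request_url: str) -> bool:
    # Step 1: the check runs on the URL as written in the source,
    # before any HSTS upgrade is applied.
    return (urlsplit(document_url).scheme == "https"
            and urlsplit(request_url).scheme == "http")

def apply_hsts(request_url: str) -> str:
    # Step 2: HSTS modeled as a synthetic redirect, "as-if" a 301
    # upgraded the scheme; the source URL is never rewritten in situ.
    parts = urlsplit(request_url)
    if parts.scheme == "http" and parts.hostname in hsts_hosts:
        return urlunsplit(("https",) + tuple(parts)[1:])
    return request_url

def fetch(document_url: str, request_url: str):
    mixed = is_mixed_content(document_url, request_url)
    final_url = apply_hsts(request_url)
    return mixed, final_url
```

So `fetch("https://example.com/", "http://example.com/img.png")` is
flagged as mixed even though the actual fetch goes out over HTTPS.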

Feel free to ignore the following if it's not useful, but I figured I
would chime in as a relevant browser extension developer:

Mixed content blocking running before HSTS and extension rewrites in
Chrome, and subsequently Firefox, forced us to disable 25% of the HTTPS
Everywhere rewrite rules and generated a lot of bug reports. Since many
of these were cases where HTTPS Everywhere would have eliminated the
mixed content errors entirely by rewriting HTTP resources to HTTPS, we
were somewhat frustrated and considered this a bug.

It sounded like a wontfix in Chrome, but we opened a bug report against
Firefox: https://bugzilla.mozilla.org/show_bug.cgi?id=878890. Peter and
Mike worked on a Firefox patch to make the mixed content blocker fire
after HSTS and extension rewrites, which looks like it may get accepted.
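[For contrast, a sketch of the ordering the Firefox patch aims for:
rewrites run first, and the mixed content check then inspects the
effective URL. The helper and the toy rewrite rule below are
hypothetical, not the patch's actual code.]

```python
def check_after_rewrites(document_url, request_url, rewriters):
    # Apply HSTS / extension rewrites first; the mixed content
    # decision is then based on the effective URL.
    effective = request_url
    for rewrite in rewriters:
        effective = rewrite(effective)
    is_mixed = (document_url.startswith("https:")
                and effective.startswith("http:"))
    return is_mixed, effective

def upgrade_cdn(url):
    # Toy HTTPS Everywhere-style rule: upgrade one known host.
    return url.replace("http://cdn.example/", "https://cdn.example/", 1)
```

Here `check_after_rewrites("https://site.example/",
"http://cdn.example/app.js", [upgrade_cdn])` reports no mixed content,
since the effective load is secure; a host with no matching rule would
still be flagged.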

> Is that what you were looking for, Mike?
> 


-- 
Yan Zhu  <yan@eff.org>, <yan@torproject.org>
Staff Technologist
Electronic Frontier Foundation                  https://www.eff.org
815 Eddy Street, San Francisco, CA  94109       +1 415 436 9333 x134


Received on Monday, 2 June 2014 20:58:22 UTC
