Re: UPGRADE: Do we need granular control?

I think we should leave the v1 spec as-is and see whether there is 
demand for something more granular in v2.

upgrade-insecure-subresources is an easy way for a website to 
eliminate mixed content issues without doing much work.  A developer 
who wants to make the list more granular is going to have to put in 
some work to find all the domains the site needs to upgrade (or all 
the domains it doesn't want to upgrade).  For smaller sites, it might 
be just as easy to upgrade the links in their HTML.  If we do go down 
this road, we should consider supporting both a whitelist and a 
blacklist, so a website can say "upgrade everything except this one 
host that isn't HTTPS yet" without having to figure out all the 
domains it embeds.  It would be nice to see some demand for the 
feature before we put in the work to support it.
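
For illustration, today's all-or-nothing opt-in is a single header:

    Content-Security-Policy: upgrade-insecure-requests

A blacklist variant might hypothetically look something like the 
following (made-up syntax, purely a sketch; nothing like this is in 
any spec):

    Content-Security-Policy: upgrade-insecure-subresources; except-hosts legacy.example.com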

upgrade-insecure-navigations doesn't make much sense to me.  What is 
the threat we are trying to protect against?  I understand upgrading 
navigations to the same domain (the way upgrade-insecure-requests is 
currently spec'ed and implemented in Firefox 42).  But why upgrade 
navigations to external links?
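
For concreteness, here is my understanding of what the current 
directive does on a page served from https://example.com (the hosts 
are hypothetical, just to illustrate):

    <img src="http://example.com/logo.png">    fetched as https://
    <img src="http://cdn.example.net/a.png">   fetched as https://
    <a href="http://example.com/next">         navigates to https://
    <a href="http://other.example.org/">       left as http://

upgrade-insecure-navigations would presumably extend the upgrade to 
that last case as well, and that is the part I don't see a threat 
model for.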

~Tanvi

On 8/10/15 8:29 PM, Mike West wrote:
> On Tue, Aug 11, 2015 at 12:14 AM, yan <yan@eff.org> wrote:
>
>     + pde
>
>     On 8/10/15 1:59 PM, Brad Hill wrote:
>
>         I think that we could call it done and think about adding
>         just 'upgrade-insecure-navigations' to a Level 2.  I think
>         it is beneficial to have that scope expansion available as
>         extra behavior, but I don't see any good use cases to
>         formally "decompose" upgrade-insecure-subresources out of
>         the existing behavior (where it could only be used to
>         weaken mixed content fetching, which we don't want to do
>         and which won't necessarily ever produce good results).
>
>
>     Firefox and Chrome don't block passive mixed content. Tor
>     Browser doesn't even block active mixed content by default.
>
>
> Really? That seems like a bad default choice.
>
>     In both cases, sites can use the header with a source set to
>     upgrade as many resources as possible. I think it's good to
>     encourage this best-effort behavior.
>
>
> I guess I'm simply wondering whether anyone will use the granularity.
> My suspicion is that developers who have a deep enough understanding
> of the hosts to which they can securely connect are capable of simply
> coding their sites such that they connect to those hosts over HTTPS.
> But it might be the case that making the ramp more gradual could be
> useful even for those types of sites.
>
> Dan, Tanvi, is this something you folks would implement in Firefox?
>
> -mike
