
Re: UPGRADE: Do we need granular control?

From: Mike West <mkwst@google.com>
Date: Wed, 12 Aug 2015 21:10:36 +0200
Message-ID: <CAKXHy=d3t6gUyaGKiY3Lpf6TOe7VXpMpZKLLFdo_hqvxkENnVw@mail.gmail.com>
To: Tanvi Vyas <tanvi@mozilla.com>
Cc: yan <yan@eff.org>, Brad Hill <hillbrad@gmail.com>, "public-webappsec@w3.org" <public-webappsec@w3.org>, Alex Russell <slightlyoff@google.com>, Peter Eckersley <pde@eff.org>, Dan Veditz <dveditz@mozilla.com>, Christoph Kerschbaumer <ckerschbaumer@mozilla.com>
Ok, then. That feels like a rough consensus to wait on a split.

I've punted the relevant bug to The Future, which I think clears the way
for a CfC to publish the spec as a CR. Thanks for your feedback!
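For reference, the Level 1 directive stays a single switch: a site opts in to upgrading all of its insecure requests with one Content-Security-Policy directive, delivered as a response header (or a meta tag):

    Content-Security-Policy: upgrade-insecure-requests

The split discussed below would decompose that into separate upgrade-insecure-subresources and upgrade-insecure-navigations directives; those names are still hypothetical, as nothing has been specced.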

-mike

--
Mike West <mkwst@google.com>, @mikewest

Google Germany GmbH, Dienerstrasse 12, 80331 München,
Germany, Registergericht und -nummer: Hamburg, HRB 86891, Sitz der
Gesellschaft: Hamburg, Geschäftsführer: Graham Law, Christine Elizabeth
Flores
(Sorry; I'm legally required to add this exciting detail to emails. Bleh.)

On Wed, Aug 12, 2015 at 8:51 PM, Tanvi Vyas <tanvi@mozilla.com> wrote:

> I think we should leave the v1 spec as is and see if we have a demand for
> something more in v2.
>
> upgrade-insecure-subresources is an easy way for a website to eliminate
> mixed content issues without doing much work.  A developer who wants to
> make the list more granular is going to have to put in some work to find
> all the domains it needs to upgrade (or all the domains it doesn't want
> to upgrade).  For smaller sites, it might be just as easy to upgrade the
> links in their HTML.  If we do go down this road, we should consider
> supporting both a whitelist and a blacklist, so websites can say
> "upgrade everything but this one site that isn't HTTPS yet" without
> having to figure out all the domains they embed.  It would be nice to
> see some demand for the feature before we put in the work to support it.
>
> upgrade-insecure-navigations doesn't make much sense to me.  What is the
> threat we are trying to protect against?  I understand if we upgrade for
> the same domain (the way upgrade-insecure-requests is currently spec'ed and
> implemented in Firefox 42).  But why upgrade for external links?
>
> ~Tanvi
>
>
> On 8/10/15 8:29 PM, Mike West wrote:
>
> On Tue, Aug 11, 2015 at 12:14 AM, yan <yan@eff.org> wrote:
>
>> + pde
>>
>> On 8/10/15 1:59 PM, Brad Hill wrote:
>>
>>> I think that we could call it done and think about adding just
>>> 'upgrade-insecure-navigations' to a Level 2.  I think it is beneficial
>>> to have that scope expansion available as extra behavior, but I don't
>>> see any good use cases to formally "decompose"
>>> upgrade-insecure-subresources out of the existing behavior (where it
>>> could only be used to weaken mixed content fetching, which we don't
>>> want to do and which wouldn't necessarily ever produce good results).
>>>
>>
>> Neither Firefox nor Chrome blocks passive mixed content. Tor Browser
>> doesn't even block active mixed content by default.
>
>
> Really? That seems like a bad default choice.
>
>
>> In both cases, sites can use the header with a source set to upgrade as
>> many resources as possible. I think it's good to encourage this best-effort
>> behavior.
>>
>
> I guess I'm simply wondering whether anyone will use the granularity. My
> suspicion is that developers who have a deep enough understanding of the
> hosts to which they can securely connect are capable of simply coding
> their sites to connect to those hosts over HTTPS. But it might be the
> case that making the ramp more gradual could be useful even for those
> kinds of sites.
>
> Dan, Tanvi, is this something you folks would implement in Firefox?
>
> -mike
>
>
>
Received on Wednesday, 12 August 2015 19:11:24 UTC
