
Re: [MIX] Require HTTPS scripts to be able to do anything HTTP scripts can do.

From: Daniel Kahn Gillmor <dkg@fifthhorseman.net>
Date: Mon, 05 Jan 2015 12:24:31 -0500
Message-ID: <54AAC8CF.6030405@fifthhorseman.net>
To: WebAppSec WG <public-webappsec@w3.org>
On 01/05/2015 11:40 AM, Anne van Kesteren wrote:
> On Mon, Jan 5, 2015 at 5:26 PM, Daniel Kahn Gillmor <dkg@fifthhorseman.net> wrote:
>> If we allow access to http data from https web applications, there will
>> be no way to make these guarantees to the user, which would make the web
>> much weaker as a whole.
> 
> I don't necessarily disagree, but
> 
> a) we already allow this to some extent (see "Mixed Content") so it's
> not the best carrot

Currently only for images and other "passive" or "optionally-blockable"
[0] content, right?  This is a weakness that we're accepting, and one
that is typically indicated to the user (albeit in a way that most
people don't understand).

[0] http://www.w3.org/TR/mixed-content/#category-optionally-blockable
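For concreteness, the split in [0] can be caricatured like this — a toy sketch only; the spec's real algorithm keys off the request context, not a bare content label:

```python
# Toy classifier for the mixed-content categories discussed above.
# Illustrative simplification of [0], not the spec's actual algorithm.

OPTIONALLY_BLOCKABLE = {"image", "audio", "video"}  # "passive" content

def classify(page_scheme: str, resource_scheme: str, kind: str) -> str:
    if page_scheme != "https" or resource_scheme != "http":
        return "not-mixed"
    if kind in OPTIONALLY_BLOCKABLE:
        return "optionally-blockable"  # may load; UI marked "dubious"
    return "blockable"                 # e.g. scripts: blocked outright
```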

An https origin that loads passive mixed content is the "dubious" state
referenced in [1].

[1] https://www.chromium.org/Home/chromium-security/marking-http-as-non-secure

Or are you talking about something else?

> b) the server could still fetch data without an authenticated
> connection (see how Google was owned by the NSA)

You mean client → https origin → http datastore?  That is certainly
possible, and an information security failure on the part of the origin.
However, it does mean that an attacker on-path between the user and the
web application will still not be able to monitor or modify web
application traffic.

The webapp's origin server is acting in effect as a proxy here, with all
attendant responsibilities.  If they fail those responsibilities, that
doesn't mean we should move those failures to the client directly.
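A minimal sketch of that server-side pattern; the backend hostname and handler shape are illustrative assumptions, not anything from this thread:

```python
# Sketch of the "client -> https origin -> http datastore" pattern.
# The browser's leg to the origin is TLS; this server-side fetch is
# cleartext, so only an attacker between origin and backend (not one
# between user and origin) can read or tamper with it.
from urllib.request import urlopen

BACKEND = "http://datastore.internal.example"  # hypothetical cleartext backend

def handle_request(path: str, fetch=urlopen) -> bytes:
    # `fetch` is injectable for testing; defaults to a real HTTP fetch.
    # The client never sees this cleartext leg -- that is exactly the
    # origin acting as a proxy, with the attendant responsibilities.
    with fetch(BACKEND + path) as resp:
        return resp.read()
```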

> c) executables are not bound by these limitations and are currently
> leapfrogging the web on phones

Those executables are not using the network in a secure way; they are
actively leaking information about their users.  This is not something
to emulate.

> d) we are in fact planning on allowing tainted cross-scheme responses
> due to service workers and point a) above (for images, sound, video)

Is this any different from passive mixed content?

The right fix for passive mixed content is to gradually deprecate it
further, not to expand its scope.

> It's a rather fragile setup that we have and I guess the question is
> whether the current setup is efficient in helping us getting towards
> near universal authenticated encryption or whether we should change a
> few variables.

I have no illusions that an all-https web will mean that communications
security is a solved problem.  But carving out more exceptions for
cleartext seems like a step in the wrong direction, because it
effectively lowers the ceiling of what protections the web can provide
to the user for the communications channels used.

We should be trying to raise the floor instead.

	--dkg


Received on Monday, 5 January 2015 17:24:59 UTC
