
Re: Limiting requests from the internet to the intranet.

From: Brian Smith <brian@briansmith.org>
Date: Fri, 8 Jan 2016 10:49:57 -1000
Message-ID: <CAFewVt5bbcEB78m+aK+308x+GcDgbnPvxG9cdCD4=9mrpErWAw@mail.gmail.com>
To: Mike West <mkwst@google.com>
Cc: "public-webappsec@w3.org" <public-webappsec@w3.org>, Brad Hill <hillbrad@gmail.com>, Dan Veditz <dveditz@mozilla.com>, Ryan Sleevi <sleevi@google.com>, Justin Schuh <jschuh@google.com>, Devdatta Akhawe <dev@dropbox.com>, Anne van Kesteren <annevk@annevk.nl>, Chris Palmer <palmer@google.com>
Mike West <mkwst@google.com> wrote:

> I've put together a kinder, gentler take on hardening the user agent
> against the kinds of attacks that such requests enable:
> https://mikewest.github.io/cors-rfc1918/. It's pretty rough, as I've only
> poked at it sporadically over the holidays, but I think there's enough
> there to get a conversation going.
>

First, it seems wrong that no router makers are represented in this thread.
(I heard that Chromium OS is the foundation of Google OnHub, which is an OS
for routers, so the Googlers are router software makers in some sense.
However, IIUC, Google OnHub uses an iPhone or Android app for
configuration, not a web UI, so I guess OnHub isn't relevant to this
discussion.) We should make some effort to bring the router makers into the
discussion or move to a venue that is more relevant to them.

Anyway, the premise of this work is that SOHO router makers (and makers of
similar devices) are doing such a bad job at securing their configuration
web apps that browsers need to do special things to defend the routers that
they don't do to defend other web apps. But why are router makers doing a
bad job? Are they doing worse than web app developers in general? How so?
Are their products somehow more at risk than web apps in general? How so?

Mike's nice document says "[...] a router’s web-based administration
interface must be designed and implemented to defend against CSRF on its
own, and should not rely on a UA that behaves as specified in this
document." My hypothesis is that the people making the vulnerable software
aren't "web developers working on a router" but more "networking developers
working on a web interface." Accordingly, it may be unreasonable to just
say "defend against CSRF" and expect them to effectively do so.

A better alternative, I think, would be to specify more clearly what is
meant by "defend against CSRF" in a document specifically targeting the
specific nature of SOHO routers and similar devices. That way router makers
can finish their work to "defend against CSRF" on their end sooner. I also
think it is essential to have a test suite and a reference implementation
for router makers to read, use, and copy. Note that existing documentation
on defending against CSRF from OWASP and others is either quite hand-wavy
or framework-specific. In the case of the OWASP documentation, there are
too many choices, IMO, such that one could easily get overwhelmed and get
trapped by the paradox of choice.
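To make that concrete, here is a minimal sketch of one well-known pattern such a document could prescribe: a synchronizer token bound to the session via HMAC. This is my illustration, not anything from Mike's draft or from a real router firmware; the names (issue_token, verify_token) and the per-device key are assumptions.

```python
# Sketch of an HMAC-based synchronizer-token CSRF defense for a
# router-style web UI. All names here are illustrative.
import hashlib
import hmac
import secrets

# A per-device secret, e.g. generated at first boot and persisted.
SECRET_KEY = secrets.token_bytes(32)

def issue_token(session_id: str) -> str:
    """Derive a CSRF token bound to the session; embed it in every
    state-changing form the UI serves."""
    return hmac.new(SECRET_KEY, session_id.encode(), hashlib.sha256).hexdigest()

def verify_token(session_id: str, submitted: str) -> bool:
    """Constant-time check of the token echoed back with a request.
    Reject the request if this returns False."""
    expected = issue_token(session_id)
    return hmac.compare_digest(expected, submitted)
```

The point is that a router maker should not have to rediscover details like the constant-time comparison; a targeted document plus a reference implementation hands them this directly.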

Then, with a test suite in hand, we can look at what additional mechanisms
a browser would need to implement. Interestingly, the test suite for the
new browser functionality would be the same as, or a subset of, the test
suite for the routers' own mechanisms to "defend against CSRF." Thus, doing
the test suite first should not slow down the development of any browser
changes.

Conversely, it is difficult to understand the given proposal without a test
suite. For example, to what extent is it important or unimportant to
disallow public->private top-level navigation? Is it only important to
disallow that kind of navigation if the browser supports
http://user:password@host URLs, or is blocking navigation to
http://user:password@host URLs sufficient? A test suite should be able to easily
answer such questions.
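Even the most basic test in such a suite needs an agreed definition of "private." As a sketch of how small that building block is, here is a check using Python's standard library; the helper name is mine, and treating loopback and link-local as private is an assumption consistent with, but not quoted from, Mike's draft.

```python
# Classify an address as "private" for public->private request tests.
# The helper name and the inclusion of loopback/link-local are my
# assumptions, not taken from the cors-rfc1918 draft text.
import ipaddress

def is_private_address(host: str) -> bool:
    """True for RFC 1918 ranges (10/8, 172.16/12, 192.168/16) and for
    loopback/link-local addresses, which stdlib also flags as private."""
    addr = ipaddress.ip_address(host)
    return addr.is_private
```

A test case for public->private navigation would then pair this classifier with a page served from a non-private address that attempts to navigate to a private one.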

We know this is a high-risk project from past experience. Mozilla tried to
solve this problem in Firefox and had to back out the change [1]. Already
in this thread we have people saying that the proposed browser changes
would break their products. Browser developers lack good visibility into
intranets and other private networks to find and understand problems. These
are all indications that any change to the default navigation, iframe
embedding, or XHR behavior of web browsers to mitigate the issues is likely
to take many iterations, and thus a lot of time, to get right. Thus, a
parallel approach of outreach to device makers and browser development
makes the most sense.

tl;dr:
* Let's make sure that the makers of the products that we're trying to help
are actually involved in the discussion.
* Let's build an open source test suite that device makers can use to
improve their products.
* Let's document, more specifically and precisely, what security measures
router makers need to use to defend themselves against CSRF and other
attacks.
* Let's create a mockup router web UI, or modify an open source web UI, to
use as a reference implementation to help router makers.
* Let's derive and evaluate any spec for changing browser behavior from the
test suite.
* Let's recognize that there is a high risk of failure for changing browser
behavior and that changing browser behavior only helps to a limited extent.
* Let's trade high fives all around when it's all done.

[1] https://bugzilla.mozilla.org/show_bug.cgi?id=354493

Cheers
Brian
-- 
https://briansmith.org/
Received on Friday, 8 January 2016 20:50:29 UTC
