
Re: Origin vs Authority; use of HTTPS (draft-nottingham-site-meta-01)

From: Breno de Medeiros <breno@google.com>
Date: Mon, 23 Feb 2009 15:05:01 -0800
Message-ID: <29fb00360902231505j129276a5m8f890c9877e717af@mail.gmail.com>
To: Adam Barth <w3c@adambarth.com>
Cc: Mark Nottingham <mnot@mnot.net>, Ben Laurie <benl@google.com>, Eran Hammer-Lahav <eran@hueniverse.com>, "www-talk@w3.org" <www-talk@w3.org>
On Mon, Feb 23, 2009 at 2:23 PM, Adam Barth <w3c@adambarth.com> wrote:

> On Mon, Feb 23, 2009 at 2:07 PM, Mark Nottingham <mnot@mnot.net> wrote:
> > To me, what's interesting here is that the problems you're illustrating
> > have never been an issue AFAIK with robots.txt,
>
> I recently reviewed a security paper that measured whether consumers
> of robots.txt follow redirects.  I'm not sure if their results are
> public yet, but some consumers follow redirects while others don't,
> causing interoperability problems.
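[The behavioural split Adam describes can be sketched in a few lines. This is a hypothetical consumer, not taken from any real crawler; the class and function names are mine, and the only difference between the two modes is the redirect policy:]

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Refuse all redirects: returning None makes urllib raise
    HTTPError on a 3xx instead of following it."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def fetch_robots(host, follow_redirects):
    """Fetch /robots.txt; a non-following consumer treats any 3xx as
    'no policy available', a following consumer obeys whatever file
    the redirect chain ends at -- two different effective policies."""
    url = "http://%s/robots.txt" % host
    if follow_redirects:
        opener = urllib.request.build_opener()        # default: follows 3xx
    else:
        opener = urllib.request.build_opener(NoRedirect())
    try:
        with opener.open(url, timeout=10) as resp:
            return resp.read().decode("utf-8", "replace")
    except urllib.error.HTTPError:
        return None
```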
> > and they didn't even come up as a
> > concern during the discussions of P3P. I wasn't there for sitemaps, but
> > AFAICT they've been deployed without the risk of unauthorised control of
> > URIs being mentioned.
>
> That just means they aren't interesting enough targets for attackers.
> For high-stakes metadata repositories, like crossdomain.xml, you find
> that people don't follow redirects.  If I recall correctly,
> crossdomain.xml started off allowing redirects but had to break
> backwards compatibility to stop sites from getting hacked.

crossdomain.xml was introduced to support a few specific applications
(notably Flash), and it did not take into account the security requirements
of the application context. Tough.
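[For comparison, the stricter rule Adam mentions, where any 3xx on the policy file is treated as "no policy", can be sketched as below. The function names are mine and purely illustrative, not Adobe's actual implementation:]

```python
import http.client

def status_grants_policy(status):
    """Only a direct 200 yields a usable policy; any 3xx is treated as
    'no policy', so a redirect can never hand policy authority to
    another host."""
    return status == 200

def fetch_policy_same_origin(host, path="/crossdomain.xml"):
    """Fetch the policy file without following redirects at all."""
    conn = http.client.HTTPConnection(host, timeout=10)
    try:
        conn.request("GET", path)
        resp = conn.getresponse()
        if not status_grants_policy(resp.status):
            return None    # redirected or errored: no cross-domain access
        return resp.read()
    finally:
        conn.close()
```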

> > I think the reason for this is that once the mechanism gets deployment,
> > site operators are aware of the import of allowing control of this URL,
> > and take steps to assure that it isn't allowed if it's going to cause a
> > problem.
>
> This is a terrible approach to security.  We shouldn't make it even
> harder to deploy a secure Web server by introducing more landmines
> that you have to avoid stepping on.
> > They haven't done that yet in this case (and thus you were able to get
> > /host-meta) because this isn't deployed -- or even useful -- yet.
>
> TinyURL doesn't appear to let me create a redirect with a "." in the
> name, stopping me from creating a fake robots.txt or crossdomain.xml
> metadata store.  Similar to how MySpace and Twitter didn't let me make
> a profile with a "-" in the name, I wouldn't hang my hat on this for
> security.
> > I would agree that this is not a perfectly secure solution, but I do
> > think it's good enough.
>
> The net result is that most people aren't going to use host-meta for
> security-sensitive metadata.  The interoperability cost will be too
> high.
>
> Why not introduce a proper delegation mechanism instead of re-using
> HTTP redirects?  That would let you address the delegation use case
> without the security issue.

Because at this point there is no consensus on what a general delegation
mechanism would look like. Quite possibly it will turn out to be
application-specific. It is probably a better idea to see how this plays
out, how useful people find it to be, and whether there are generic
concerns that can be addressed in a spec. The alternative is to write a
spec that introduces complexity to solve problems that we conjecture might
exist in yet-to-be-developed applications. The risk then is that the spec
will not see adoption, or that implementors will deploy partial compliance
in an ad-hoc fashion, which is also a danger to interoperability.
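[To make the contrast concrete, here is one speculative shape an explicit, in-document delegation could take. Nothing like this appears in draft-nottingham-site-meta-01; the toy line-based format and the "Delegate" field name are invented. The point is that authority comes only from a statement inside the origin's own /host-meta, never from an HTTP redirect:]

```python
def parse_host_meta(text):
    """Parse 'Field: value' lines into a dict (toy format, not XRD)."""
    entries = {}
    for line in text.splitlines():
        if ":" in line:
            key, value = line.split(":", 1)
            entries[key.strip().lower()] = value.strip()
    return entries

def resolve_metadata_host(origin_host, host_meta_text):
    """Honor delegation only when the origin's own host-meta states it;
    a 3xx response from the server would simply be ignored."""
    entries = parse_host_meta(host_meta_text)
    delegate = entries.get("delegate")
    return delegate if delegate else origin_host
```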

> > Of course, a mention in security considerations is worthwhile.
>
> Indeed.
>
> Adam


+1 (650) 214-1007 desk
+1 (408) 212-0135 (Grand Central)
MTV-41-3 : 383-A
PST (GMT-8) / PDT(GMT-7)
Received on Monday, 23 February 2009 23:05:39 UTC

This archive was generated by hypermail 2.3.1 : Tuesday, 6 January 2015 21:33:07 UTC