
Re: Origin vs Authority; use of HTTPS (draft-nottingham-site-meta-01)

From: Adam Barth <w3c@adambarth.com>
Date: Wed, 11 Feb 2009 15:57:32 -0800
Message-ID: <7789133a0902111557q7fe68722l91d11f00cf4b1da4@mail.gmail.com>
To: Breno de Medeiros <breno@google.com>
Cc: Eran Hammer-Lahav <eran@hueniverse.com>, "www-talk@w3.org" <www-talk@w3.org>

On Wed, Feb 11, 2009 at 3:32 PM, Breno de Medeiros <breno@google.com> wrote:
> In that case, content-type is a mild defense. Can you give me an example
> where a web-site administrator will allow files to be hosted at '/'?

There are enough of these sites to force Adobe to break backwards
compatibility in a Flash security release.

> I can find some fairly interesting names to host at '/'
>
> E.g.: favicon.ico, .htaccess, robots.txt, ...

OMG, you changed my favicon!  .htaccess only matters if Apache
interprets it (e.g., uploading an .htaccess file to Gmail doesn't do
anything interesting).

> Trying to secure such environments seems to me a waste of time, quite
> frankly.

Clearly, Adobe doesn't share your opinion.

> The most interesting threat of files uploaded to root is via defacement.
> This solution does nothing against that threat.

If you can deface my server, then I've got big problems already (e.g.,
my Web site is totally hacked).  Not addressing this issue creates a
security problem where none currently exists.

>> 1) Require host-meta to be served with a particular, novel Content-Type.
>
> Not feasible, because of limitations on developers that implement these
> server-to-server techniques.

That's an opinion.  We'll see if you're forced to patch the spec when
you're confronted with a horde of Web servers that you've just made
vulnerable to attack.

>> 2) Add a section to Security Considerations that explains that
>> applications using host-meta should consider adding requirement (1).
>
> No. I would suggest adding a Security Considerations that say that host-meta
> SHOULD NOT be relied upon for ANY security-sensitive purposes _of_its_own_,

Then how are we to address use case (1)?

> and that applications that require levels of integrity against defacement
> attacks, etc., should implement real security techniques. Frankly, I think
> content-type does very little for security of such applications.

Your argument for why strict Content-Type handling is insecure is that
a more powerful attacker can win anyway.  My argument is that we have
implementation experience showing that we need to defend against
these threats.

I did a little more digging, and it looks like Silverlight's
clientaccesspolicy.xml also requires strict Content-Type processing:

http://msdn.microsoft.com/en-us/library/cc645032(VS.95).aspx

That makes 3 out of 3 systems that use strict Content-Type processing.

Microsoft's solution to the limited hosting environment problem
appears to be quite clever, actually.  I couldn't find documentation
(and haven't put in the effort to reverse engineer the behavior), but
it looks like they require a content type of application/xml, which
they get for free even from limited hosting providers by naming their
file with a ".xml" extension.  This is clever because it protects all
the sites I listed earlier: those sites would already have an XSS
vulnerability if they let an attacker control an application/xml
resource on their server.

Adam
Received on Wednesday, 11 February 2009 23:58:12 GMT
