
Re: Sandboxed iframes (was Re: Seamless iframes + CSS3 selectors = bad idea)

From: <sird@rckc.at>
Date: Wed, 20 Jan 2010 10:04:10 +0800
Message-ID: <8ba534861001191804k5b8ae1c1if828baee13aafbc0@mail.gmail.com>
To: Ian Hickson <ian@hixie.ch>
Cc: Adam Barth <w3c@adambarth.com>, public-web-security@w3.org
Hmm, I agree with Adam Barth: bad developers exist in the world, and stupid
filters that survive exist as well.. Anyway, I think data: URIs should be on
several blacklists already, since they are already vulnerable on FF.
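
Just to illustrate the difference (my own sketch; the function name and the
allowed-scheme list are made up for this example, not from any spec), a pure
whitelist check sidesteps the blacklist problem entirely:

```python
# Sketch only: whitelist the schemes you accept, instead of trying to
# blacklist "javascript:", "data:", and whatever comes next.
from urllib.parse import urlparse

ALLOWED_SCHEMES = {"http", "https"}  # everything not listed is rejected

def is_safe_url(url):
    """Accept a URL only if its scheme is explicitly whitelisted."""
    scheme = urlparse(url.strip()).scheme.lower()
    return scheme in ALLOWED_SCHEMES

# is_safe_url("https://example.com/")  -> True
# is_safe_url("javascript:alert(1)")   -> False
# is_safe_url("data:text/html,x")      -> False
```

Note that relative URLs (empty scheme) are rejected too; a real filter would
have to decide how to resolve those first.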

And on another subject,

data:text/sandboxed-html,<USER DATA>

would be an awesome way to sandbox content inline, almost without needing the
sandbox="" attribute at all. But I would like to ask you to reconsider adding
a new attribute for sandboxed content, to keep old browsers from popping a
download dialog:

<iframe sandbox-src="">

Wouldn't that solve that specific issue? I imagine some people would just
click download/open, and on all browsers except IE8 the content would then
run with local filesystem access..

This way you also get a failsafe in case the UA doesn't support it:

<iframe sandbox-src="usercontent.php?id=1331" src="error-nosandbox.html"/>
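
For illustration (my own sketch; text/sandboxed-html is just the hypothetical
type proposed above, not a registered MIME type), the inline data: URL could
be assembled server-side like this:

```python
# Sketch only: wrap untrusted markup in the hypothetical
# data:text/sandboxed-html scheme proposed above. Percent-encoding
# keeps the user data from escaping the URL.
from urllib.parse import quote

def sandboxed_data_url(user_html):
    """Build a data: URL carrying the (hypothetical) sandboxed type."""
    return "data:text/sandboxed-html," + quote(user_html)

# sandboxed_data_url('<b>hi</b>')
#   -> 'data:text/sandboxed-html,%3Cb%3Ehi%3C/b%3E'
```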

Anyway, regarding:

On Sun, 6 Dec 2009, sird@rckc.at wrote:
> >
> > Starting from abarth message:
> > http://sla.ckers.org/forum/read.php?13,31377#msg-31430
> >
> > Anyway, maybe I misunderstood what he said. I thought he meant that in
> > Chrome it was a new and exclusive origin (different from the parent one),
> > and my tests sort of confirmed that.
> >
> > On Firefox, for example, the origin is something a little weird (probably
> > the same thing Maciej just explained), where you have a different origin
> > but access to parent/opener..
>
> Is the current spec text satisfactory?
>

Yes, it is :)

Greetings!!
-- Eduardo
http://www.sirdarckcat.net/

Sent from Hangzhou, 33, China

On Wed, Jan 20, 2010 at 9:52 AM, Ian Hickson <ian@hixie.ch> wrote:

> On Tue, 19 Jan 2010, Adam Barth wrote:
> > On Tue, Jan 19, 2010 at 5:05 PM, Ian Hickson <ian@hixie.ch> wrote:
> > > On Sun, 6 Dec 2009, Adam Barth wrote:
> > >> In some sense, a site needs to vet all URLs for javascript: URLs, but
> > >> this behavior means that every time you see "javascript:" in an XSS
> > >> filter, the filter is probably insecure unless you also see "data:"
> > >> right next door.
> > >
> > > Any system relying on blacklisting URLs or schemes is just asking for
> > > trouble. You simply cannot do a truly secure filtering mechanism with
> > > anything but a pure whitelisting mechanism, where _everything_ is
> > > whitelisted, including URL schemes.
> >
> > We've had this argument a couple of times.  I worry that we're relying
> > too much on authors using strong XSS filters.  I don't have any solid
> > data about how often folks use blacklist-based XSS filters. Anecdotally,
> > I've certainly seen them in the wild.
>
> Sure, but they're (almost certainly) already vulnerable. We could argue
> against almost any addition to the platform on the basis that it could
> introduce a vulnerability in a blacklist-based filter.
>
> --
> Ian Hickson               U+1047E                )\._.,--....,'``.    fL
> http://ln.hixie.ch/       U+263A                /,   _.. \   _\  ;`._ ,.
> Things that are impossible just take longer.   `._.-(,_..'--(,_..'`-.;.'
>
>
Received on Wednesday, 20 January 2010 02:05:03 GMT

This archive was generated by hypermail 2.2.0+W3C-0.50 : Sunday, 19 December 2010 00:16:02 GMT