
Re: HTML imports: new XSS hole?

From: James M Snell <jasnell@gmail.com>
Date: Mon, 2 Jun 2014 06:22:58 -0700
Message-ID: <CABP7RbeZex9vsH83jP1qBR28wpY5axK6eK2+0QcWETWnwJ5QKw@mail.gmail.com>
To: Boris Zbarsky <bzbarsky@mit.edu>
Cc: WebApps WG <public-webapps@w3.org>
Yes, that's true. Content filters are likely to miss the links themselves.
Hopefully the imported documents themselves get filtered, but there's no
guarantee. One assumption we can reasonably make is that any implementation
that knows how to follow import links ought to know that the imported
documents need to be filtered. I'm not aware of any current user agents that
automatically follow and execute link tags without being import-aware.
On Jun 2, 2014 6:12 AM, "Boris Zbarsky" <bzbarsky@mit.edu> wrote:

> On 6/2/14, 9:02 AM, James M Snell wrote:
>
>> I suppose that If you
>> needed the ability to sandbox them further, just wrap them inside a
>> sandboxed iframe.
>>
>
> The worry here is sites that currently have html filters for user-provided
> content that don't know about <link> being able to run scripts.  Clearly
> once a site knows about this they can adopt various mitigation strategies.
>  The question is whether we're creating XSS vulnerabilities in sites that
> are currently not vulnerable by adding this functionality.
>
> -Boris
>
> P.S. A correctly written whitelist filter will filter these things out.
>  Are we confident this is standard practice now?
>
>
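The whitelist point above can be illustrated with a minimal sketch in Python: a filter that only emits tags on an explicit allowlist drops `<link rel="import">` without ever having heard of HTML imports, whereas a blacklist written before imports existed would pass it through. The allowlist contents and the payload URL below are hypothetical, and this is only an illustration of the principle, not a production sanitizer (it discards attributes entirely, for instance).

```python
from html.parser import HTMLParser

# Hypothetical allowlist; anything not listed here is dropped,
# so <link> never survives even if the filter predates imports.
ALLOWED = {"p", "a", "b", "i", "em", "strong"}

class WhitelistFilter(HTMLParser):
    """Keeps only allowlisted tags (attributes stripped for brevity)."""
    def __init__(self):
        super().__init__(convert_charrefs=True)
        self.out = []

    def handle_starttag(self, tag, attrs):
        if tag in ALLOWED:
            self.out.append(f"<{tag}>")

    def handle_endtag(self, tag):
        if tag in ALLOWED:
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        self.out.append(data)

def sanitize(html):
    f = WhitelistFilter()
    f.feed(html)
    f.close()
    return "".join(f.out)

# Hypothetical user-provided content carrying an import link.
payload = '<p>hello</p><link rel="import" href="https://evil.example/x.html">'
print(sanitize(payload))
```

Running this emits only `<p>hello</p>`; the import link is gone. The open question in the thread is exactly the one Boris raises: filters built as blacklists (strip `<script>`, `onload`, etc.) rather than whitelists would not catch the new vector.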
Received on Monday, 2 June 2014 13:23:32 UTC