- From: Ian Hickson <ian@hixie.ch>
- Date: Wed, 20 Jan 2010 01:52:02 +0000 (UTC)
- To: Adam Barth <w3c@adambarth.com>
- Cc: public-web-security@w3.org
On Tue, 19 Jan 2010, Adam Barth wrote:
> On Tue, Jan 19, 2010 at 5:05 PM, Ian Hickson <ian@hixie.ch> wrote:
> > On Sun, 6 Dec 2009, Adam Barth wrote:
> >> In some sense, a site needs to vet all URLs for javascript URLs, but
> >> this behavior means that every time you see "javascript:" in an XSS
> >> filter, they're probably insecure unless you also see "data:" right
> >> next door.
> >
> > Any system relying on blacklisting URLs or schemes is just asking for
> > trouble. You simply cannot do a truly secure filtering mechanism with
> > anything but a pure whitelisting mechanism, where _everything_ is
> > whitelisted, including URL schemes.
>
> We've had this argument a couple of times. I worry that we're relying
> too much on authors using strong XSS filters. I don't have any solid
> data about how often folks use blacklist-based XSS filters. Anecdotally,
> I've certainly seen them in the wild.

Sure, but they're (almost certainly) already vulnerable. We could argue
against almost any addition to the platform on the basis that it could
introduce a vulnerability in a blacklist-based filter.

-- 
Ian Hickson               U+1047E                )\._.,--....,'``.    fL
http://ln.hixie.ch/       U+263A                /,   _.. \   _\  ;`._ ,.
Things that are impossible just take longer.   `._.-(,_..'--(,_..'`-.;.'
Received on Wednesday, 20 January 2010 01:52:29 UTC
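[Editor's note: the blacklist-versus-whitelist distinction the thread turns on can be made concrete with a minimal sketch. The snippet below is not from the original discussion; the function and constant names are hypothetical, and the example only illustrates why a scheme blacklist that checks for "javascript:" is bypassed by "data:" while a scheme whitelist is not.]

```python
from urllib.parse import urlparse

# Blacklist of the kind Barth describes: "safe" only until a scheme the
# author never thought of (data:, vbscript:, ...) shows up in the wild.
BLOCKED_SCHEMES = {"javascript"}

# Whitelist in the spirit of Hickson's point: only schemes the author has
# explicitly vetted are ever allowed through.
ALLOWED_SCHEMES = {"http", "https"}

def blacklist_allows(url: str) -> bool:
    """Reject only schemes on the blacklist; everything else passes."""
    return urlparse(url).scheme.lower() not in BLOCKED_SCHEMES

def whitelist_allows(url: str) -> bool:
    """Accept only schemes on the whitelist; everything else is rejected."""
    return urlparse(url).scheme.lower() in ALLOWED_SCHEMES

payload = "data:text/html,<script>alert(1)</script>"
print(blacklist_allows(payload))                  # True  -- filter bypassed
print(whitelist_allows(payload))                  # False -- rejected
print(whitelist_allows("https://example.com/"))   # True
```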