- From: Adam Barth <w3c@adambarth.com>
- Date: Tue, 19 Jan 2010 17:48:56 -0800
- To: Ian Hickson <ian@hixie.ch>
- Cc: Maciej Stachowiak <mjs@apple.com>, "sird@rckc.at" <sird@rckc.at>, Boris Zbarsky <bzbarsky@mit.edu>, gaz Heyes <gazheyes@gmail.com>, public-web-security@w3.org
On Tue, Jan 19, 2010 at 5:05 PM, Ian Hickson <ian@hixie.ch> wrote:
> On Sun, 6 Dec 2009, Adam Barth wrote:
>> In some sense, a site needs to vet all URLs for javascript URLs, but
>> this behavior means that every time you see "javascript:" in an XSS
>> filter, the filter is probably insecure unless you also see "data:"
>> right next door.
>
> Any system relying on blacklisting URLs or schemes is just asking for
> trouble. You simply cannot build a truly secure filtering mechanism with
> anything but a pure whitelisting mechanism, where _everything_ is
> whitelisted, including URL schemes.

We've had this argument a couple of times. I worry that we're relying
too much on authors using strong XSS filters. I don't have any solid
data about how often folks use blacklist-based XSS filters.
Anecdotally, I've certainly seen them in the wild.

Adam
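[Editor's note: a minimal sketch of the two filtering approaches the thread contrasts. Neither function is from the thread; the names, the scheme lists, and the use of the standard URL parser are assumptions made for illustration. The point is the failure mode: a blacklist fails open on any scheme its author forgot, while a whitelist fails closed.]

```typescript
// Hypothetical blacklist filter: reject URLs whose scheme is on a deny
// list. This is the pattern Adam warns about -- checking only for
// "javascript:" silently passes "data:" URLs (and any other scheme the
// author did not anticipate).
const DENIED_SCHEMES = ["javascript:"];

function blacklistAllows(url: string): boolean {
  const normalized = url.trim().toLowerCase();
  return !DENIED_SCHEMES.some((scheme) => normalized.startsWith(scheme));
}

// Hypothetical whitelist filter: accept only schemes known to be safe
// and reject everything else by default. This is the pure-whitelisting
// mechanism Ian describes: an unanticipated scheme fails closed.
const ALLOWED_SCHEMES = ["http:", "https:", "mailto:"];

function whitelistAllows(url: string): boolean {
  let parsed: URL;
  try {
    parsed = new URL(url); // relative/unparseable URLs throw here
  } catch {
    return false; // fail closed on anything the parser rejects
  }
  return ALLOWED_SCHEMES.includes(parsed.protocol);
}

// The difference in practice:
blacklistAllows("javascript:alert(1)");                      // false -- caught
blacklistAllows("data:text/html,<script>alert(1)</script>"); // true  -- missed
whitelistAllows("data:text/html,<script>alert(1)</script>"); // false -- fails closed
whitelistAllows("https://example.com/");                     // true
```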
Received on Wednesday, 20 January 2010 01:49:48 UTC