- From: Boris Zbarsky <bzbarsky@MIT.EDU>
- Date: Mon, 02 Jun 2014 10:08:27 -0400
- To: James M Snell <jasnell@gmail.com>
- CC: WebApps WG <public-webapps@w3.org>
On 6/2/14, 9:54 AM, James M Snell wrote:
> I'm not saying it's perfect. Not by any stretch. I'm saying it shouldn't
> be worse.

I don't understand why you think it's not worse.

> and content filters will need to evolve.

And until they do, we may have vulnerable pages, right? How is that not worse?

Say an OS added some new functionality that meant software running on that OS would be insecure unless it got patched. Would you consider that acceptable? This is a pretty similar situation.

The only thing that might make this OK is if good whitelist-based filters are overwhelmingly used in practice.

> Perhaps an additional
> strongly worded warning in the spec would be helpful.

By what mechanism would someone who created a web page a year ago see this warning and go update their page?

-Boris
Received on Monday, 2 June 2014 14:08:56 UTC