- From: Aryeh Gregor <Simetrical+w3c@gmail.com>
- Date: Thu, 2 Sep 2010 15:53:30 -0400
On Thu, Sep 2, 2010 at 12:21 AM, Boris Zbarsky <bzbarsky at mit.edu> wrote:
> On 9/1/10 4:46 PM, Aryeh Gregor wrote:
>> Is this realistically possible unless the author deliberately crafts
>> the file?
>
> I'm not an audio/video format expert; I have no idea.  Does it matter?

Yes.  If false positives were realistically possible by accident, that
would count strongly against sniffing.  If they're not, that at least is
not an issue.

> Why is it not a problem if there are suddenly use cases that are
> impossible because the browser will ignore the author's intent?

Which use-cases?

>> have any issues ever been caused by this kind of sniffing problem?
>
> As far as I know, yes (of the "remotely take control of the computer"
> kind).
>
>> Are there clear problems that have arisen in other cases?
>
> See above.
>
>> The problem can't plausibly arise with media files -- if you can
>> execute a vulnerability via getting the user to view a media file,
>> it's probably via arbitrary code execution.  In that case you don't
>> need to disguise yourself, just get the viewer to go to your own
>> website and do whatever you want, since there are no same-domain
>> restrictions.
>
> See above about people who take steps to protect themselves when
> problems like this arise and would be screwed over by sniffing.

Okay, but we're talking about standardizing sniffing in a spec.  As long
as browsers' behavior in processing a given resource is well-defined and
reliable, a proxy could work fine by just implementing the same
algorithm.  There's no reason that the proxy has to only look at MIME
types, is there?  It simplifies the proxy a bit, but not much.  It will
already have to do some content sniffing to identify what content is
dangerous, unless it's just going to block everything of that file type
(which I'm assuming isn't the case).

Put another way: the problem here is not that browsers sniff.  It's that
browsers don't behave interoperably or predictably.
Speccing a precise sniffing algorithm that everyone's willing to follow
allows proxies to reliably know what browsers will do with any given
resource.  What will cause problems is what you seem to be arguing for
-- *not* speccing sniffing, so that browsers that sniff do so in an ad
hoc, undefined manner that's difficult to predict.  For the use-case of
filtering exploits, it doesn't really matter what the behavior is, so
long as it's consistent.  Or am I missing something here?
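(To make the "proxy implements the same algorithm" point concrete, here
is a minimal, table-driven sketch of signature-based media sniffing.
This is illustrative only -- the signature table and function name are
my own, not the algorithm from any spec -- but it shows why such
sniffing is deterministic: the same bytes always yield the same answer,
so a browser and a filtering proxy running the same table must agree.)

```python
def sniff_media_type(data: bytes, declared_type: str) -> str:
    """Deterministic, table-driven sniffing sketch (not from any spec).

    If the leading bytes match a known media signature, the sniffed
    type wins; otherwise the declared Content-Type is used as-is.
    """
    # (byte offset, magic bytes, resulting type) -- well-known magics:
    signatures = [
        (0, b"OggS", "application/ogg"),       # Ogg container
        (0, b"\x1aE\xdf\xa3", "video/webm"),   # Matroska/WebM EBML header
        (4, b"ftyp", "video/mp4"),             # ISO BMFF / MP4 box header
        (0, b"ID3", "audio/mpeg"),             # MP3 with an ID3v2 tag
    ]
    for offset, magic, sniffed in signatures:
        if data[offset:offset + len(magic)] == magic:
            return sniffed
    return declared_type
```

A proxy and a browser sharing this table behave identically: bytes
starting with `OggS` are treated as Ogg no matter what the server
declared, and unrecognized bytes fall back to the declared type.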
Received on Thursday, 2 September 2010 12:53:30 UTC