
Re: (XMLHttpRequest 2) Proposal for cross-site extensions to XMLHttpRequest

From: Ian Hickson <ian@hixie.ch>
Date: Fri, 14 Apr 2006 21:10:54 +0000 (UTC)
To: Mark Nottingham <mnot@yahoo-inc.com>
Cc: public-webapi@w3.org
Message-ID: <Pine.LNX.4.62.0604142039420.21459@dhalsim.dreamhost.com>

On Fri, 14 Apr 2006, Mark Nottingham wrote:
> > 
> > In any case this is largely academic. In practice there won't be that 
> > many resources that you'll be accessing cross-site, especially in the 
> > course of a single session.
> That's a big assumption to make. I have use cases where a large variety 
> of resources would need to be accessed cross-site.
> BTW, would you consider these URIs to have different policies?
> http://www.example.com/search?a=b
> http://www.example.com/search?c=d

Yes. But if you're doing a POST, why not include the variables in the 
entity body?

> > Since in reality the problem is with the response, not the request, 
> > I'm starting to become of the opinion that there aren't any unsafe 
> > methods.
> POST, PUT and DELETE are unsafe; are you suggesting that we redefine 
> HTTP's concept of safety?

By "unsafe" I mean "opening new vulnerabilities", sorry.

> > > As stated before, I'm not sure the existence of one hole justifies 
> > > the intentional opening of other holes.
> > 
> > It's not "one hole". Most of the Web works this way, always has.
> I was referring to the ability to do a POST; obviously GET is possible 
> through a variety of methods, but that's OK, because it's safe.

In that case I'm confused; you can't do a POST with a <script> element. 
Did you mean <form>? In any case, it is still possible to do POST 
submissions to arbitrary URIs without any protection whatsoever today, and 
that can't change (the Web relies on it). There's no reason to protect 
against things that you can do anyway -- it's like putting a steel door 
with iris recognition access controls on the front of your house, with an 
open window next to it. All it does is make it harder to do the right 
thing; the wrong thing remains easy.
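To make the "open window" concrete, here is a minimal sketch of the cross-site POST any page can already make today, with no protection: a hidden, auto-submitting form. The target URL and field names are made up for illustration; the string-building helper is mine, not anything from a spec.

```typescript
// Sketch: build the markup for a hidden, auto-submitting cross-site form.
// The action URL and field names are hypothetical examples.
function autoSubmitFormHTML(action: string, fields: Record<string, string>): string {
  const inputs = Object.entries(fields)
    .map(([name, value]) => `<input type="hidden" name="${name}" value="${value}">`)
    .join("");
  // When this markup is loaded, the browser submits the form and attaches
  // the victim's cookies for the target origin automatically; the user
  // never initiates anything.
  return `<form method="POST" action="${action}">${inputs}</form>` +
         `<script>document.forms[0].submit()</scr` + `ipt>`;
}

const html = autoSubmitFormHTML("https://bank.example/transfer",
                                { to: "attacker", amount: "1000" });
```

This is exactly the kind of request the Web has always permitted, which is why a steel door on XMLHttpRequest alone would protect nothing.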

> The attack I'm concerned about is an attacker writing some XHR code that 
> sends a request with some side effect on another server (say, your bank 
> account). XHR introduces a new attack vector here because it sends the 
> request with your cookies; the user doesn't have to initiate the 
> interaction.

Forms already send requests with your cookies and the user already doesn't 
have to initiate the interaction. Cross-site XMLHttpRequest does not 
introduce a new vulnerability here. The only differences between 
XMLHttpRequest cross-site requests and existing cross-site requests are 
that XMLHttpRequest would let you read the return value, and would let you 
change (some of) the HTTP headers arbitrarily.
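The comparison above can be summarized as data. This is only an illustrative sketch; the capability names are mine, not from any specification.

```typescript
// Illustrative comparison of what each kind of cross-site request can do.
// Field names are hypothetical shorthand for the capabilities discussed above.
interface Capability {
  sendsCookies: boolean;   // request carries the user's cookies
  readsResponse: boolean;  // requesting page can read the response body
  setsHeaders: boolean;    // requesting page can set HTTP headers
}

// What a cross-site <form> POST can already do today:
const formPost: Capability =
  { sendsCookies: true, readsResponse: false, setsHeaders: false };

// What cross-site XMLHttpRequest would additionally allow:
const crossSiteXHR: Capability =
  { sendsCookies: true, readsResponse: true, setsHeaders: true };
```

The cookie-carrying request is not new; only response reading and header setting are.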

> It's true that it's possible to muck around with script tags and HTML 
> forms to send an arbitrary POST without interaction (the "one hole"), 
> but the existence of one accidental attack vector isn't justification 
> for intentionally creating (and standardising) another bigger one (not 
> just POST, but other methods as well).

Sure, that's why I'm proposing that non-GET requests should have the 
pre-flight check.
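The proposed rule can be sketched as a one-line predicate: GET goes straight through, since it is already possible cross-site today, while any other method must first be authorized by the target site. The function name is hypothetical.

```typescript
// Sketch of the proposed rule: only non-GET cross-site requests require a
// pre-flight authorization check from the target server before being sent.
// Name and shape are illustrative, not from any spec.
function needsPreflight(method: string): boolean {
  return method.toUpperCase() !== "GET";
}
```

Under this sketch, `needsPreflight("POST")`, `needsPreflight("PUT")`, and `needsPreflight("DELETE")` are all true, addressing exactly the methods HTTP defines as unsafe.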

> I do wonder how long it will take for the browser preferences and 
> proxies to catch up; doubtless some people will want [this Referer] 
> blocked too. I'm reminded of SOAPAction.

Well, if host A is contacting host B, it can already send the full path 
and everything. So this header doesn't introduce a new privacy leak, and I 
see no reason why anyone would block it (especially since doing so 
introduces security vulnerabilities).

Ian Hickson               U+1047E                )\._.,--....,'``.    fL
http://ln.hixie.ch/       U+263A                /,   _.. \   _\  ;`._ ,.
Things that are impossible just take longer.   `._.-(,_..'--(,_..'`-.;.'
Received on Friday, 14 April 2006 21:11:06 UTC
