Re: CSR and Mozilla - Clarifying HTTP Header Filtering

On Tue, 19 Feb 2008 15:33:02 +0100, Mark Baker <distobj@acm.org> wrote:
> On 2/19/08, Anne van Kesteren <annevk@opera.com> wrote:
>> On Tue, 19 Feb 2008 05:21:12 +0100, Mark Baker <distobj@acm.org> wrote:
>> > http://lists.w3.org/Archives/Public/public-webapi/2006May/0008.html
>>
>> No, these are completely different cases. What you're referring to is
>> fine for same-origin requests, and same-origin requests still allow it.
>> Non same-origin requests probably require a different policy though.
>
> I think it's the same case.  The issue in both cases is that the
> script should always be subordinate to the user agent whose job it is
> to ensure that the messages it sends are valid HTTP messages that
> don't misrepresent either the user or its own capabilities.

The issue is that the cross-site GET requests that are possible today do  
not involve arbitrary headers made up by the author. Servers could  
therefore be vulnerable to cross-site GET requests that do have arbitrary  
headers set. This is a new attack vector and has nothing to do with the  
same-origin header blacklist.
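To illustrate the attack vector (a hypothetical sketch; the header name and function are illustrative, not from this thread): some servers treat the presence of a custom header as proof that a request came from same-origin script, precisely because plain cross-site GETs (links, images, form submissions) cannot set arbitrary headers.

```python
def is_trusted(request_headers):
    # Naive CSRF defence: assumes only same-origin XMLHttpRequest
    # could have set this custom header on the request.
    return request_headers.get("X-Requested-With") == "XMLHttpRequest"

# If cross-site GETs could carry author-set headers without any
# preflight, this defence would silently break:
print(is_trusted({"X-Requested-With": "XMLHttpRequest"}))  # True
print(is_trusted({}))                                      # False
```

A server like this is safe against today's cross-site GETs, but not against ones that can carry author-supplied headers.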

Having said that, Henri Sivonen suggested that the preflight OPTIONS  
request could also be performed for cross-site GET requests where the  
author has set new headers. You'd basically get:

   if method == "GET" and not authorHeaders:
      crossSiteRequest()
   else:
      crossSiteRequestWithPreflight()
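The decision above can be sketched as a small predicate (a runnable sketch of the suggestion; the function name is hypothetical, not from the thread):

```python
def needs_preflight(method, author_headers):
    # A GET with no author-supplied headers matches what cross-site
    # requests can already do today, so no preflight is needed; any
    # author-set header (or non-GET method) triggers the OPTIONS
    # preflight first.
    return not (method == "GET" and not author_headers)

print(needs_preflight("GET", {}))                # False
print(needs_preflight("GET", {"X-Foo": "bar"}))  # True
print(needs_preflight("POST", {}))               # True
```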


-- 
Anne van Kesteren
<http://annevankesteren.nl/>
<http://www.opera.com/>

Received on Tuesday, 19 February 2008 14:52:16 UTC