Re: The HTTP Origin Header (draft-abarth-origin)

On 23/01/2009, at 12:45 PM, Adam Barth wrote:
>> I don't think this argument is valid. The number of proxies that do  
>> this is
>> very small.
>
> In fact, this occurs for about 3% of users, which is a significant  
> number.

I'd like to dig into that. You believe that most of the suppression of
the Referer header is done in proxies, based on the differences seen
between HTTP and HTTPS. However, there are also considerable
differences between the block rates for same-domain vs. cross-domain
requests; are you implying that these proxies are parsing the Referer
and only blocking those that are cross-domain? If so, that seems an
odd rationale; a person or company blocking referers for the sake of
privacy would presumably do so for all values, not just cross-domain
referers. Likewise, someone doing it to hide intranet URLs would be
more likely to hide only those, rather than stopping all cross-domain
referers.

Additionally, distinguishing cross-domain requests from same-domain
ones is more expensive to implement in an intermediary, and these
implementers are famously sensitive to performance issues. All of the
products I'm aware of make it easy to block a header wholesale, but
would require a relatively expensive (and therefore less likely)
callout (e.g., with ICAP) to block it selectively based upon request
state.
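
To be concrete (a rough sketch of my own, not any particular
product's implementation): wholesale removal is a single unconditional
operation, while selective removal means parsing and comparing two
URLs on every request:

    from urllib.parse import urlsplit

    def strip_referer_wholesale(headers):
        # One unconditional operation; cheap, and trivially expressed
        # as a static filter rule in most intermediaries.
        headers.pop("Referer", None)

    def strip_referer_cross_domain_only(headers, request_url):
        # Per-request logic: parse both URLs and compare hosts before
        # deciding -- the kind of thing that typically needs a callout
        # (e.g. ICAP) rather than a simple header-filter rule. (A real
        # cross-domain test would also need registrable-domain /
        # public-suffix handling, making it more complex still.)
        referer = headers.get("Referer")
        if referer is None:
            return
        if urlsplit(referer).hostname != urlsplit(request_url).hostname:
            headers.pop("Referer", None)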

On the other hand, I do notice that Firefox can be configured to block
Referer headers selectively, both in terms of same-site vs. cross-site
and HTTP vs. HTTPS:
   http://kb.mozillazine.org/Network.http.sendRefererHeader
   http://kb.mozillazine.org/Network.http.sendSecureXSiteReferrer
Couldn't that account for at least a portion of the discrepancies you  
saw?

BTW, did you look for vanilla wafers
<http://www.junkbusters.com/ijbfaq.html#wafers> to see how much of
this stripping could be attributed to JunkBuster?

Also, did you find any rationale for the difference between rates seen  
on network A vs. network B? It's a pretty wide range...

The numbers I found especially interesting were for stripping of
same-site, XMLHttpRequest-generated Referer headers, which came in at
(eyeballing Figure 3) about 0.6% on HTTP and 0.2% on HTTPS
(discounting the Firefox 1.x bug, which isn't relevant to this
discussion, since updated browsers are a precondition here). Aren't
those numbers closer to what one would expect? In particular, they're
much closer to the numbers you measured for "custom" headers, which
suggests that implementations that white-list headers are a
significant factor (along with statistical error, of course)...

Lastly -- Figure 3 says that its unit is requests; wouldn't IP  
addresses be a more useful number here? Or, better yet, unique clients  
(tracked with a cookie)? Otherwise it seems that the results could be  
skewed by, for example, a single very active proxy. Likewise, did you  
record the geographical distribution of clients? It would be nice to  
have assurances that this sample represents a global audience, and not  
just a selective (read: US) one.
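
By way of illustration (a sketch only; the record layout here is made
up, not taken from your data set), counting per client rather than per
request would look something like:

    def blocking_rate_by_client(records):
        # records: iterable of (client_id, referer_was_blocked) pairs,
        # where client_id is e.g. the value of a per-browser cookie
        # (hypothetical). Aggregating per client keeps a single very
        # active proxy or client from dominating the result, which
        # per-request counting does not.
        seen = {}
        for client_id, blocked in records:
            seen[client_id] = seen.get(client_id, False) or blocked
        return (sum(seen.values()) / len(seen)) if seen else 0.0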


>> Those that exist will come under pressure to change their
>> behaviour when their users notice that Web applications that rely  
>> upon this
>> feature break behind those proxies.
>
> Unfortunately, these proxies prevent Web sites from relying on the
> Referer header, and so the operators of these proxies never come under
> pressure to stop suppressing the header.

Certainly they do. If Cool New Cross-Site Web Apps break behind such a
proxy, and they explain to the user why they're broken, both ISPs and
companies will come under pressure.


>> In other words, just as the authors are satisfied having a staged  
>> deployment
>> where some browsers on the Web support cross-site requests using  
>> their
>> mechanism, while some still do not, I see no reason why it isn't  
>> just as
>> acceptable to accept that some proxies will support it, and some  
>> will not;
>> over time, more will.
>
> The difference is that there is a deployment plan for self-interested
> agents to bring about the usefulness of the Origin header for CSRF
> defense, but there is no such plan for the Referer header.
>
> Origin header deployment plan:
>
> 1) Each user agent implements the Origin header and advertises that
> it is more secure than those that do not implement the header.
>
> 2) Sites add rules to their Web application firewalls to protect
> those of their users that use a supporting browser.
>
> 3) Users choose supporting user agents because they are more secure.
>
> 4) Supporting user agents come to dominate the market.

IMO 99% of the driving factor for deployment here is going to be new
features -- supporting cross-site XMLHttpRequest with authentication,
etc.
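
(For reference, I read step 2 as describing something along these
lines -- a rough sketch, with names and structure of my own rather
than from the draft:)

    def origin_allowed(headers, trusted_origins):
        # Reject a state-changing request if it carries an Origin
        # header naming a site that isn't trusted; anything else is
        # allowed.
        origin = headers.get("Origin")
        if origin is not None and origin not in trusted_origins:
            return False
        return True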


>> unless the server decides to
>> also allow requests that omit the Origin header altogether.
>
> In fact, the ID requires participating sites to do this.
>>
>> If they do that,
>> however, it seems like a non-programmatic, non-Origin-creating  
>> browser would
>> leave a hole...
>
> Can you explain this hole?


Well, we'd be in the same situation as today; a current
(non-Origin-sending) browser would still be able to make cross-site
requests (using IMG, form.submit, etc.).
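
Concretely, since the ID requires participating sites to accept
requests that omit the header, a check like the sketch above waves
those requests straight through (continuing that sketch; the origin
and cookie values here are made up):

    # A forged cross-site form POST from a non-Origin browser carries
    # the victim's cookies but no Origin header, so the check allows
    # it -- exactly the situation we have today.
    assert origin_allowed({"Cookie": "session=abc123"},
                          {"https://app.example.com"})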

Cheers,

--
Mark Nottingham     http://www.mnot.net/

Received on Sunday, 25 January 2009 04:31:40 UTC