
Re: The HTTP Origin Header (draft-abarth-origin)

From: Roy T. Fielding <fielding@gbiv.com>
Date: Fri, 23 Jan 2009 17:17:18 -0800
Message-Id: <67F16D8F-74C2-44CF-A057-AB650DAC9E29@gbiv.com>
Cc: Mark Nottingham <mnot@mnot.net>, Larry Masinter <LMM@acm.org>, ietf-http-wg@w3.org, Lisa Dusseault <ldusseault@commerce.net>
To: Adam Barth <w3c@adambarth.com>

On Jan 22, 2009, at 7:51 PM, Adam Barth wrote:
> On Thu, Jan 22, 2009 at 6:29 PM, Roy T. Fielding  
> <fielding@gbiv.com> wrote:
>> The feature of "defend themselves against CSRF by identifying
>> the referral page" is satisfied by "don't allow requests that
>> lack an appropriate Referer".  Your estimate that it would also
>> block some 3% of false negatives does not lessen its defense.
>> The 3% would get an error message in response.
>
> These 3% of potential users would be unable to use the Web site.  In
> talking with folks who run large Web sites, I've been told that
> excluding 3% of your potential customers is not acceptable.

No, those 3% of users would have to use TLS/SSL, which the site could
redirect to automatically (and which it should have been using anyway
for a state-changing authenticated request).  As your study shows, https
exchanges are not modified by those proxies.
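The defense described above can be sketched in a few lines. This is a hedged illustration, not code from the thread: reject state-changing requests whose Referer points off-site, and bounce Referer-less plain-HTTP requests to https, where proxies are reported not to strip the header. `SITE_HOST` and the function name are hypothetical.

```python
# Sketch of a strict Referer-based CSRF check with an https fallback.
# Assumption: the site serves both http and https, and only the
# plain-http path is exposed to Referer-stripping proxies.
from urllib.parse import urlsplit

SITE_HOST = "example.com"  # hypothetical site host

def check_state_change(method, referer, is_https):
    """Return 'allow', 'redirect-https', or 'reject' for a request."""
    if method in ("GET", "HEAD"):
        return "allow"                      # not state-changing
    if referer:
        # Allow only if the referral page is on this site.
        host = urlsplit(referer).hostname
        return "allow" if host == SITE_HOST else "reject"
    # Referer missing: a proxy may have stripped it.  Over plain
    # HTTP, retry the request over TLS instead of failing outright.
    return "reject" if is_https else "redirect-https"
```

The redirect branch is what lifts the success rate for users behind Referer-stripping proxies, at the cost of one extra round trip.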

Obviously, you would have had a different response if you had asked
them a more accurate question, like "would you prefer a 97% success
rate today or a 0% success rate today?"

Referer was introduced in 1993, deployed by spiders in 1994, grudgingly
accepted by browsers in 1995, and only reached the 90% range sometime
around the 1999-2000 timeframe.  The notion that we can introduce Origin
now and reach anywhere near 97% correct deployment is absurd.

Meanwhile, those same sites will implement this CSRF protection
using Referer, if they need it, since 97% protection is obviously better
than none if they want to protect against CSRF.  That percentage
can already be improved to 99% if Referer-less state-changing requests
are (re)directed to an https site.  So, what you are actually asking is
that we standardize an entirely new header field in the vague hope
that it will eventually be deployed correctly on better than 97% of
browsers in the wild so that some 2.x% of those browsers that happen
to still be behind old non-CSRF-protecting proxies at that future
date might be able to use a separate codepath on the server that
duplicates all of the security checks for Referer instead of just
using an https connection, assuming that none of the browser
developers introduce a bug that makes Origin less secure than Referer.
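The "separate codepath" objected to above would look roughly like the following. This is a hedged sketch, not anything proposed in the thread: an Origin check that merely duplicates the Referer host comparison, falling back to Referer when Origin is absent. `SITE_HOST` and the function name are illustrative.

```python
# Sketch of a server accepting either header as proof of same-site
# origin.  Note the Origin branch is a near-duplicate of the Referer
# branch -- the duplication Fielding argues against maintaining.
from urllib.parse import urlsplit

SITE_HOST = "example.com"  # hypothetical site host

def same_site(origin, referer):
    """True if either header shows the request came from this site."""
    if origin:
        # Origin carries scheme://host[:port], no path.
        return urlsplit(origin).hostname == SITE_HOST
    if referer:
        # Same comparison again, on the full referring URL.
        return urlsplit(referer).hostname == SITE_HOST
    return False
```

Any bug fixed in one branch must be fixed in the other, which is part of the maintenance cost being weighed against the 2.x% of browsers the new header would help.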

Sorry, the answer from this server developer is NO.  Spend the
effort on improving the definition of Referer instead so that it
will be blocked by fewer proxies, and then help educate those
proxy vendors as to why they shouldn't make their own users more
vulnerable to CSRF attacks.  I suspect that you will have little
problem convincing sites to upgrade their old proxy if you can
show them a CSRF attack on salesforce.com, Google domains, or AWS
that is enabled by the proxy vendor's bad code or config.

>> Your claims are based on the assumption that those very same
>> 3% proxies will forward the Origin header unchanged.
>
> This is not an assumption.  In April 2008, we measured how often various
> headers were suppressed for 283,945 browsers that viewed an
> advertisement we placed with a minor ad network.  We observed that the
> Referer header was suppressed for approximately 3% of requests, whereas
> the Origin header was suppressed for only 0.029-0.047% of requests (95%
> confidence).  For more detailed results and a description of the
> methodology, please see Section 4.2.1 of
> http://www.adambarth.com/papers/2008/barth-jackson-mitchell-b.pdf

I read the paper.  I am not sure whether automated ad serving technology
would make the sample self-selecting (and therefore not
representative of deployed browsers), but I doubt that it would change
the analysis much.  Regardless, Origin is not deployed and there is
no reason to believe that it would ever reach the 97% deployment
threshold.  Heck, HTTP parsing isn't implemented correctly in far
more than 3% of the browsers in the wild.

>> Your assumption is wrong.
>
> What evidence do you have to back up this claim?

16 years of Web protocol implementations, 14 years being the editor
of the HTTP specs, and 13 years in which my contact information has
been supplied for people to send "why doesn't HTTP work?" messages
for both standards and Apache httpd.  For example, there was a period
when some MS folks thought that HTTP/1.1 chunked encoding didn't work,
until I pointed out that all of their tests were being made through
their own corporate firewall (eating characters).  I get that stuff
all the time.

....Roy
Received on Saturday, 24 January 2009 01:17:56 GMT
