Re: CSR and Mozilla - Clarifying HTTP Header Filtering

On 2/20/08, Jonas Sicking <jonas@sicking.cc> wrote:
> Mark Baker wrote:
>  > On 2/20/08, Anne van Kesteren <annevk@opera.com> wrote:
>  >> On Wed, 20 Feb 2008 15:15:39 +0100, Mark Baker <distobj@acm.org> wrote:
>  >>> Your premise seems to be that in the future, the community might rally
>  >>> around and widely deploy, brain-dead extensions which attempt to
>  >>> violate the fundamental semantics of HTTP, in this case the safety of
>  >>> GET messages.  IMO, that's not a realistic concern.
>  >> I'm not talking about communities, or brain-dead extensions. I'm talking
>  >> about the theoretical possibility that this might already be deployed on
>  >> some servers around the world (or something of equivalent nature) and that
>  >> therefore allowing such cross-domain GET requests with custom headers
>  >> introduces a new attack vector. And introducing a new attack vector is
>  >> something we should avoid, regardless of whether being vulnerable to that
>  >> attack vector relies on violating the fundamental semantics of HTTP.
>  >
>  > It's not a new attack vector, because I can already use curl to send a
>  > GET message which causes the harm you're worried about.  AFAICT, all
>  > that changes in a cross-site scenario is that the attacker uses the
>  > client as an anonymizer, something that can already be done with open
>  > proxies (of various flavours).  Is that worth crippling the spec in
>  > such a fundamental way?  Not IMO.
>
>
> When you use curl you will not be able to include the auth headers or
>  cookies of other users. You will also not be able to use curl to connect
>  to websites inside firewalls.
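
For concreteness, here is a minimal sketch of the two requests being
compared. The host, path, header name, and the idea that the GET is
state-changing are all made up for illustration; whether any deployed
service actually behaves this way is exactly the point in dispute:

  // Cross-domain GET with a custom header, as a page script might issue
  // it under the draft access-control proposal (hypothetical URL and
  // header name):
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "http://intranet.example.org/wiki/delete?page=42");
  xhr.setRequestHeader("X-Foo", "bar");
  xhr.send();

  // Roughly the same request from the command line:
  //   curl -H "X-Foo: bar" http://intranet.example.org/wiki/delete?page=42
  // The difference Jonas raises: the browser request originates from the
  // victim's network position (possibly inside a firewall) and, if the
  // mechanism ever attached them, would carry the victim's cookies or
  // auth headers; curl run from an attacker's own machine has neither.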

But Anne was concerned about existing sites, vis-à-vis his Amazon
example.  So which is it?

Mark.
-- 
Mark Baker.  Ottawa, Ontario, CANADA.         http://www.markbaker.ca
Coactus; Web-inspired integration strategies  http://www.coactus.com

Received on Thursday, 21 February 2008 15:41:14 UTC