Re: CSRF and Mozilla - Clarifying HTTP Header Filtering

On 2/20/08, Anne van Kesteren <annevk@opera.com> wrote:
> On Wed, 20 Feb 2008 15:15:39 +0100, Mark Baker <distobj@acm.org> wrote:
> > Your premise seems to be that in the future, the community might rally
> > around and widely deploy, brain-dead extensions which attempt to
> > violate the fundamental semantics of HTTP, in this case the safety of
> > GET messages.  IMO, that's not a realistic concern.
>
> I'm not talking about communities, or brain-dead extensions. I'm talking
> about the theoretical possibility that this might already be deployed on
> some servers around the world (or something of equivalent nature) and that
> therefore allowing such cross-domain GET requests with custom headers
> introduces a new attack vector. And introducing a new attack vector is
> something we should avoid, regardless of whether being vulnerable to that
> attack vector relies on violating the fundamental semantics of HTTP.

It's not a new attack vector, because I can already use curl to send a
GET message which causes the harm you're worried about.  AFAICT, all
that changes in a cross-site scenario is that the attacker uses the
client as an anonymizer, something that can already be done with open
proxies (of various flavours).  Is that worth crippling the spec in
such a fundamental way?  Not IMO.
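To make the curl point concrete: any non-browser HTTP client can already attach arbitrary custom headers to a GET request. A minimal sketch using Python's standard library (the URL and header name here are hypothetical, chosen only for illustration; the request is constructed but not sent):

```python
import urllib.request

# Build a GET request carrying an arbitrary custom header -- exactly what
# curl -H "X-Custom-Header: anything" http://example.org/resource would send.
# Nothing browser-side gates this capability today.
req = urllib.request.Request(
    "http://example.org/resource",
    headers={"X-Custom-Header": "anything"},
    method="GET",
)

print(req.get_method())  # GET
```

The point is only that the capability exists outside the browser; a cross-site request from a browser adds nothing to what a server must already be prepared to receive.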

Also, I have no pity for any Web admin who suffers harm as a direct
result of permitting badly designed Web apps to be deployed on their
servers.

Mark.
-- 
Mark Baker.  Ottawa, Ontario, CANADA.         http://www.markbaker.ca
Coactus; Web-inspired integration strategies  http://www.coactus.com

Received on Wednesday, 20 February 2008 18:43:17 UTC