[whatwg] ContextAgnosticXmlHttpRequest: an informal RFC

On Wed, 9 Mar 2005 08:57:12 +0000, Jim Ley <jim.ley at gmail.com> wrote:
> On Tue, 8 Mar 2005 19:09:43 -0800, Chris Holland <frenchy at gmail.com> wrote:
> > Well, the value of the Referer header I'm talking about in this case
> > would always be the URI of the document originating the
> > ContextAgnosticXmlHttpRequest, NOT the *document*'s referrer. Based on
> > this requirement, I should be able to rely on this header to protect
> > my service.
> 
> How do you know it's not just some random client with a referrer
> that happens to meet your idea of accurate?  Even if implementors of
> your version of the object were religiously accurate in following this
> rule, no other HTTP implementation need do it.
> 

Well, this is part of a much broader issue. If you offer an HTTP
service, any piece of software out there is free to abuse it; an
end-user just has to install said piece of software and run it. What
we're talking about here is a malicious web *document* gaining access
to data from a foreign host. If the User Agent is a traditional web
browser, the only way a given document could ever initiate a request
to a host different from the one that served it would be through a
ContextAgnosticHttpRequest (I'm liking this name less and less, sorry
about that), and every such request would infallibly carry the full
URI of the initiating document as the value of the "Referer" header.

I agree with both of you that this alone isn't likely to be a good
solution to the whole permission issue, as it would require providers
of existing services that live behind firewalls to take steps to
protect themselves just to avoid being vulnerable should such a
feature ever roll out into mainstream browsers. The solution should
instead require providers of HTTP/XML services to "opt in" to allowing
their documents to be retrieved by foreign hosts, hence the next
proposal of "X-Allow-Foreign-Host".
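
As a rough sketch of how that opt-in could look from the service's
side (again, the domain names are only placeholders), a willing
service might answer with something like:

  HTTP/1.1 200 OK
  Content-Type: text/xml
  X-Allow-Foreign-Host: .example.com

meaning only documents served from hosts under example.com would be
allowed to retrieve it; "All" would open it up to any foreign document,
and "None" (or, presumably, the header's absence) would preserve
today's same-host-only behaviour.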

I do however still think that the "Referer" header *should* be sent in
all cases, so providers of HTTP/XML services can at least get a better
idea of "where all this traffic is coming from". I could also see it
being used to fine-tune access restrictions to my service, that is,
*once* I have already *opted in* to allowing foreign documents to
access my service.
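
Purely as an illustration of that fine-tuning, and not as part of the
proposal itself, a service that has opted in via X-Allow-Foreign-Host
could additionally filter on the Referer host with something along
these lines (a TypeScript sketch; the function name, the whitelist,
and the host names are mine, not anything specced):

  // Illustrative sketch only: decide whether a foreign request should
  // be served, given the Referer sent with the request and the host
  // suffixes this service is willing to accept once it has opted in.
  function isAllowedForeignReferer(
    referer: string | undefined,
    allowedHostSuffixes: string[],
  ): boolean {
    if (!referer) {
      return false; // no Referer at all: treat as untrusted
    }
    let refererHost: string;
    try {
      refererHost = new URL(referer).hostname;
    } catch {
      return false; // malformed Referer URI
    }
    return allowedHostSuffixes.some(
      (suffix) =>
        refererHost === suffix.replace(/^\./, "") || // exact host match
        refererHost.endsWith(suffix),                // subdomain match
    );
  }

  // e.g. isAllowedForeignReferer(
  //        "http://pages.example.com/portfolio.html", [".example.com"])
  //      returns true, while a Referer from any other host returns false.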

> > How about requiring that a service set an extra HTTP header
> > to offer its content to "foreign" hosts:
> >
> > X-Allow-Foreign-Host: All | None | .someforeigndomain.com |
> > .somehost.someforeigndomain.com
> 
> This is a much better proposal than the stealing of URIs in my domain
> to mean some special thing.  We're already plagued by the favicon bugs
> in Firefox hammering our servers with requests for documents we never
> defined.

cool.

> > All this, I believe, tends to bleed into your own idea of establishing
> > some sort of trust relationship. To that end, I need to spend more
> > time grokking 11.4 from your document. I think I'm getting there.
> 
> 11.4 isn't particularly relevant, surely?  That's about Cross-document,
> so both documents would need to exist on the client before any
> communication could occur.

Snap. Ah ... indeed.

> 
> > I was basically trying to
> > further limit the types of documents you could ever retrieve, to
> > purely valid XML documents, so no random text or Tag Soup HTML
> > document could be arbitrarily leeched.
> 
> Please don't have any solution that limits the user to XML; it's a
> pointless arbitrary restriction that offers nothing but serious
> performance hits to the client and complications to the user.

Well, it would appear XML validity already is a restriction, but okee.

-c


-- 
Chris Holland
http://chrisholland.blogspot.com/

Received on Wednesday, 9 March 2005 01:31:50 UTC