Re: XHR header blacklist rationale

>> I also made it clear that the user agent is not to set any headers 
>> other than those on that list and those permitted to be set if the 
>> author has not set them (as explained under the send() algorithm).
> 
> So, why are the headers below on the list?
> 
>     * Accept-Charset
>     * Accept-Encoding

I do see a reason why a UA wouldn't want content to set these.

Which charsets and encodings a UA supports varies a great deal from UA to 
UA. Today that is fine, because the set of supported encodings isn't 
exposed to web content at all; it is purely an interface between the 
server and the UA.

If web content can set these headers, it is highly likely that authors 
will inadvertently create UA-dependent pages that work in UAs supporting 
certain encodings but break in UAs that don't. That would effectively 
force all UAs to support the same de facto set of encodings in order to 
work with the web.
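
To make that concrete, here is a rough sketch (TypeScript; the URL, the 
header value, and the error handling are purely illustrative, not taken 
from any draft) of the kind of code authors would end up writing if these 
headers were settable, and of why it only works in UAs that happen to 
decode the encodings it names:

    const xhr = new XMLHttpRequest();
    xhr.open("GET", "/data.json");

    try {
      // If this call were honoured, the page would be pinned to whatever
      // encodings the author happened to test with.
      xhr.setRequestHeader("Accept-Encoding", "gzip, bzip2");
    } catch (e) {
      // Implementations differ: some throw for blacklisted headers, others
      // silently drop the call and keep their own negotiated value.
    }

    xhr.onload = () => {
      // The page now implicitly assumes every UA can decode whatever the
      // server sends back for that Accept-Encoding value; a UA that can't
      // gets a body it cannot use, and the page breaks there.
      console.log(xhr.responseText);
    };
    xhr.send();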

>     * Expect

I don't know much about this header, so I'll let others speak to it.

>     * Referer
>     * User-Agent

These should absolutely not be under the control of web content.

The Referer header is used by some web servers for security checks, so 
allowing it to be set from script would defeat those checks. Servers 
can't currently rely on the header being present, since some firewalls 
and proxies filter it out; however, they can rely on it being truthful 
when it is there.
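
A typical check looks roughly like the sketch below (the function name, 
the trusted-origin parameter, and the fallback for a missing header are 
my own illustration, not anything from a spec):

    // The server trusts Referer when present precisely because web
    // content cannot forge it today.
    function refererLooksTrustworthy(
      headers: Record<string, string | undefined>,
      trustedOrigin: string // e.g. "https://example.org"
    ): boolean {
      const referer = headers["referer"];
      if (referer === undefined) {
        // Some firewalls/proxies strip the header, so its absence alone
        // can't be treated as hostile; the server falls back to other checks.
        return true;
      }
      // When the header is there, the server relies on it being truthful.
      return referer === trustedOrigin || referer.startsWith(trustedOrigin + "/");
    }

If script could set Referer on its own requests, that last comparison 
would stop meaning anything.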

The User-Agent header is used a lot for logging and for measuring various 
aspects (OS, UA, etc.) of a site's user base. Allowing it to be "spoofed" 
by a web page would severely reduce its usefulness. You mention in a 
different mail that you want to be able to set it in order to work around 
servers that send different content to different UAs based on this 
header. However, if we let web content set this header, servers would no 
longer be able to rely on the User-Agent header at all and would likely 
start using even worse mechanisms.
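
As a rough illustration of the logging use (the classifier below is mine, 
not from any real analytics package), server-side measurement often boils 
down to bucketing requests by the User-Agent string:

    // Crude classifier of the sort a site's log analysis might run; the
    // counts only mean something if the header was set by the UA itself.
    function classifyUserAgent(userAgent: string | undefined): string {
      if (userAgent === undefined) return "unknown";
      if (userAgent.includes("Firefox")) return "firefox";
      if (userAgent.includes("Opera")) return "opera";
      if (userAgent.includes("MSIE")) return "ie";
      if (userAgent.includes("Safari")) return "safari";
      return "other";
    }

    // Aggregated per request, e.g.:
    //   counts[classifyUserAgent(request.headers["user-agent"])] += 1;

Let pages overwrite the header on their own requests and those buckets no 
longer reflect the actual user base.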

/ Jonas
