
Re: Origin vs Authority; use of HTTPS (draft-nottingham-site-meta-01)

From: Eran Hammer-Lahav <eran@hueniverse.com>
Date: Wed, 11 Feb 2009 15:26:57 -0700
To: Adam Barth <w3c@adambarth.com>
CC: "www-talk@w3.org" <www-talk@w3.org>, Mark Nottingham <mnot@mnot.net>
Message-ID: <C5B890B1.1276E%eran@hueniverse.com>
But you are missing the entire application layer here! A browser will not use host-meta directly. It will use an application spec that builds on host-meta, and that application, if security is a concern, will specify such requirements to ensure interoperability. It is not the job of host-meta to tell applications what is good for them.


On 2/11/09 12:27 PM, "Adam Barth" <w3c@adambarth.com> wrote:

That would cause interoperability problems where user agents that care
about security would be incompatible with sites implemented with
insecure user agents in mind.  Based on past history, this leads to a
race to the bottom where no user agent can be both popular and secure.

On Wed, Feb 11, 2009 at 11:46 AM, Eran Hammer-Lahav <eran@hueniverse.com> wrote:
> How about clearly identifying the threat in the spec instead of making this
> a requirement?
> On 2/11/09 10:14 AM, "Adam Barth" <w3c@adambarth.com> wrote:
> On Tue, Feb 10, 2009 at 11:51 PM, Eran Hammer-Lahav <eran@hueniverse.com>
> wrote:
>>> In particular, you should require that the host-meta file be served
>>> with a specific MIME type (ignore the response if the MIME type is
>>> wrong).  This protects servers that let users upload content from
>>> attackers uploading a bogus host-meta file.
>> I am not sure the value added in security (which I find hard to buy) is
>> worth excluding many hosting solutions where people do not always have
>> access to setting Content-Type headers. After all, the focus on an HTTP
>> GET based solution was meant to provide the most accessible approach.
> Adobe found the security case compelling enough to break backwards
> compatibility in their crossdomain.xml policy file system to enforce
> this requirement.  Most serious Web sites opt in to requiring an
> explicit Content-Type.  For example,
> $ wget http://mail.google.com/crossdomain.xml --save-headers
> $ cat crossdomain.xml
> HTTP/1.0 200 OK
> Content-Type: text/x-cross-domain-policy
> Last-Modified: Tue, 04 Mar 2008 21:38:05 GMT
> Set-Cookie: ***REDACTED***
> Date: Wed, 11 Feb 2009 18:07:40 GMT
> Server: gws
> Cache-Control: private, x-gzip-ok=""
> Expires: Wed, 11 Feb 2009 18:07:40 GMT
> <?xml version="1.0"?>
> <!DOCTYPE cross-domain-policy SYSTEM
> "http://www.macromedia.com/xml/dtds/cross-domain-policy.dtd">
> <cross-domain-policy>
>   <site-control permitted-cross-domain-policies="by-content-type" />
> </cross-domain-policy>
> Google Gears has also recently issued a security patch enforcing the
> same Content-Type checks to protect their users from similar attacks.
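[The Content-Type check under discussion can be sketched as follows. This is an illustrative sketch, not part of the thread; the expected media type name is an assumption, since the thread does not fix one.]

```python
# Hypothetical sketch of the proposed requirement: a client fetches
# host-meta and discards the response unless it carries the expected
# media type. EXPECTED_TYPE is an assumed placeholder value.

EXPECTED_TYPE = "application/host-meta"  # assumption; not defined in the thread

def parse_media_type(content_type):
    """Return the bare media type, stripping parameters such as charset."""
    return content_type.split(";", 1)[0].strip().lower()

def accept_host_meta(headers, body):
    """Return the body only if the Content-Type check passes, else None."""
    content_type = headers.get("Content-Type", "")
    if parse_media_type(content_type) != EXPECTED_TYPE:
        return None  # ignore the response, as the proposed requirement mandates
    return body
```

Under this rule, a bogus host-meta file uploaded to a user-content area would be served with the host's default type (e.g. text/plain) and ignored, which is the attack the check is meant to stop.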
>>> Also, if you want this feature to be useful for Web browsers, you
>>> should align the scope of the host-meta file with the notion of origin
>>> (not authority).
>> The scope is host/port/protocol. The protocol is not stated explicitly
>> but is very much implied. I'll leave it up to Mark to address wording.
>> As for the term 'origin', I'd rather do anything but get involved with
>> another term at this point.
> I'd greatly prefer that this was stated explicitly.  Why leave such
> a critical security requirement implied?
> Adam
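[The origin-versus-authority distinction in the exchange above can be made concrete with a small sketch. This is illustrative only: an authority is host:port, while an origin is the (scheme, host, port) triple, so origin scoping separates http and https metadata even on the same host and port.]

```python
# Illustrative sketch: authority vs. origin scoping for a metadata file.
from urllib.parse import urlsplit

DEFAULT_PORTS = {"http": 80, "https": 443}

def origin(url):
    """(scheme, host, port) triple -- the browser notion of origin."""
    parts = urlsplit(url)
    port = parts.port or DEFAULT_PORTS.get(parts.scheme)
    return (parts.scheme, parts.hostname, port)

def authority(url):
    """(host, port) pair -- scheme is ignored."""
    parts = urlsplit(url)
    port = parts.port or DEFAULT_PORTS.get(parts.scheme)
    return (parts.hostname, port)
```

Two URLs that differ only in scheme share an authority but not an origin, which is why Adam asks for the protocol to be an explicit part of the scope.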
Received on Wednesday, 11 February 2009 22:27:41 UTC
