
Re: Fallback flow for /site-meta for top level domains

From: Ben Laurie <benl@google.com>
Date: Wed, 3 Dec 2008 12:32:16 +0000
Message-ID: <1b587cab0812030432l7d8949a2o70c3a14a7b54b268@mail.gmail.com>
To: Mark Nottingham <mnot@mnot.net>
Cc: Eran Hammer-Lahav <eran@hueniverse.com>, "www-talk@w3.org" <www-talk@w3.org>, Jonathan Rees <jar@creativecommons.org>

On Wed, Dec 3, 2008 at 10:38 AM, Mark Nottingham <mnot@mnot.net> wrote:
> Considering that one of your core use cases for this is security-related,
> I'm surprised that you're effectively arguing that HTTP and HTTPS URLs with
> the same authority be collapsed into one name space.
> Many standards and common practices currently sandbox policy and metadata to
> a single URL scheme + authority by default, including robots.txt, p3p.xml,
> cookie scoping,

Surely cookies are scoped to HTTP and HTTPS by default.

> automated redirection processing in HTTP,

I don't know what this is.

> cache invalidation, OPTIONS metadata, cross-site scripting

There are standards for XSS???

> and I'm sure quite a
> few more. This is the de facto standard for what a "Web site" is, and while
> there are many other colloquial meanings of that phrase, this is current
> technical practice.
> Trying to establish a standard for site-wide metadata that doesn't follow
> this practice is IMO doomed to sow yet more confusion about an already
> muddled area, and potentially open up security as well as usability and
> technical problems.
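As a concrete illustration of the scheme + authority scoping being defended here, a minimal sketch (the helper name and the error behaviour for non-HTTP schemes are my own assumptions, not anything the draft specifies):

```python
from urllib.parse import urlsplit, urlunsplit

def site_meta_url(uri):
    """Map a URI to the /site-meta location for its own
    scheme + authority; scope is not shared across schemes.
    Hypothetical helper, not part of the draft."""
    parts = urlsplit(uri)
    if parts.scheme not in ("http", "https"):
        # Under this scoping there is no authoritative /site-meta
        # for mailto:, ftp:, etc.; only HTTP(S) origins have one.
        raise ValueError("no /site-meta scope for scheme %r" % parts.scheme)
    return urlunsplit((parts.scheme, parts.netloc, "/site-meta", "", ""))
```

So http://example.com/ and https://example.com/ resolve to two distinct metadata documents rather than being collapsed into one namespace.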
> That said, there's nothing to stop a particular application -- e.g., OpenID
> -- saying that for a particular purpose, site-meta should be checked on an
> HTTP URL even though the URL presented is mailto: (for example), or even
> that www.example.com should be tried if example.com isn't available
> (although I still don't think it's necessary).
> What I'm not willing to do is enshrine these things in standards that are
> supposed to help extend the Web architecture, not dilute it. The fact that a
> few $2 Web hosts don't provide adequate control to their customers in 2008
> should not affect something so fundamental as the definition of what a Web
> site is for the next 30 years (if this succeeds, of course).
> Cheers,
> On 03/12/2008, at 6:32 PM, Eran Hammer-Lahav wrote:
>>> On 02/12/2008, at 4:24 PM, Mark Nottingham wrote:
>>> /site-meta on http://foobar.com/ doesn't (and can't, on its own) make
>>> any authoritative assertions about mailto:dirk@foobar.com; even though
>>> the authority is the same, the URI scheme is different.
>>> I know this particular issue is an important one to the OpenID folks,
>>> but there needs to be a very careful and broad discussion of allowing
>>> policy and metadata from HTTP to be considered *automatically*
>>> authoritative for other protocols.
>> I do not consider /site-meta to be about HTTP resources. It is metadata
>> about the domain authority and uses HTTP as the protocol to deliver that
>> document. It can equally link to HTTP URIs as to other URIs (e.g. point to
>> its robots.txt available at an ftp:// URI). I think it is safe to assume
>> that whoever controls the domain controls any URI scheme within that domain.
>> Companies can split control between departments, but if you go high enough there
>> is one entity which owns everything under that authority.
>> HTTP clearly allows 'GET mailto:eran@example.com', but what is actually
>> served is up to the server. In theory, it could serve a 303 with a Link header
>> to the XRD describing the identifier. The problem, of course, is that most
>> web servers will fail on such request, or at least most platforms will not
>> allow the developer easy access to control the response to such requests.
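For concreteness, the exchange described here might look something like this on the wire (the XRD location and the Link relation are illustrative assumptions, not anything the draft specifies):

```
GET mailto:eran@example.com HTTP/1.1
Host: example.com

HTTP/1.1 303 See Other
Location: http://example.com/describes/eran
Link: <http://example.com/describes/eran>; rel="describedby";
      type="application/xrd+xml"
```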
>> But the point is, nowhere is the HTTP protocol restricted to providing
>> information about HTTP URIs alone. The fact that user-agents will use HTTP
>> when the URI scheme is HTTP and use FTP when the URI scheme is FTP is more
>> of a practical convention than a strict requirement.
>> The issue of what constitutes authoritative metadata with regard to the
>> domain authority is not something we can resolve beyond the reasonable
>> expectation that the entity that controls the domain has sufficient
>> authority. Can the profile.yahoo.com admin be considered the authority for
>> my profile page? In the context of discovery, I believe the answer is yes.
>> Philosophically, I can argue that only the profile owner has the authority
>> to control that page, but such control, in today's infrastructure, is
>> eventually enforced by the domain admin anyway.
>> EHL
> --
> Mark Nottingham     http://www.mnot.net/
Received on Wednesday, 3 December 2008 13:24:48 UTC