- From: <noah_mendelsohn@us.ibm.com>
- Date: Wed, 30 Jul 2008 11:04:50 -0400
- To: Dan Connolly <connolly@w3.org>
- Cc: www-tag <www-tag@w3.org>
My gut feeling is that this might better be done by retrieval of hypermedia
documents rather than through maintenance of a centralized list. For
example, what if an HTTP GET from http://uk (are retrievals from top-level
domains supported?) returned a document with a list of public suffixes
such as "co.uk"? You could, I suppose, also establish some standard
subdomain, so instead of retrieving from "uk" you'd retrieve from
http://domain_description.uk. Browsers could then use recursive
retrievals to build up the pertinent parts of the public suffix table locally.
That seems much more scalable and appropriately distributed than a centralized
list. Am I missing something obvious?
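As a minimal sketch of the recursive-retrieval idea above (everything here is hypothetical: the domain_description documents don't exist, and fetch() stands in for an actual HTTP GET, reading instead from a canned map):

```python
# Hypothetical per-domain "suffix documents": each domain's document
# lists the public suffixes registered directly beneath it.
SUFFIX_DOCS = {
    "us": ["ca.us"],
    "ca.us": ["k12.ca.us", "pvt.k12.ca.us"],
}

def fetch(domain):
    """Stand-in for HTTP GET http://domain_description.<domain>."""
    return SUFFIX_DOCS.get(domain, [])

def public_suffixes(domain, table=None):
    """Recursively retrieve suffix documents to build the pertinent
    part of the public suffix table locally."""
    if table is None:
        table = set()
    table.add(domain)
    for child in fetch(domain):
        public_suffixes(child, table)
    return table
```

A browser would only walk the branches it actually needs, so each site operator publishes its own structure rather than a central list being copied into software.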
Noah
--------------------------------------
Noah Mendelsohn
IBM Corporation
One Rogers Street
Cambridge, MA 02142
1-617-693-4036
--------------------------------------
Dan Connolly <connolly@w3.org>
Sent by: www-tag-request@w3.org
06/19/2008 12:01 PM
To: www-tag <www-tag@w3.org>
cc:
Subject: public suffix list: when opacity meets security
[metaDataInURI-31 siteData-36]
I wonder how the principle of opacity applies in this case...
http://www.w3.org/TR/webarch/#pr-uri-opacity
The proposal is:
[[
The Mozilla Project (http://www.mozilla.org/), responsible for the
Firefox web browser, requests your help.
We are maintaining a list of all "Public Suffixes". A Public Suffix is a
domain label under which internet users can directly register domains.
Examples of Public Suffixes are ".net", ".org.uk" and ".pvt.k12.ca.us".
In other words, the list is an encoding of the "structure" of each
top-level domain, so a TLD may contain many Public Suffixes. This
information is used by web browsers for several purposes - for example,
to make sure they have secure cookie-setting policies. For more details,
see http://publicsuffix.org/learn/.
]]
-- Gervase Markham (Monday, 9 June)
http://lists.w3.org/Archives/Public/ietf-http-wg/2008AprJun/0483.html
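For context, a minimal sketch of how a browser might use such a list in a cookie-setting policy (the function name and the check are hypothetical; the suffixes are the examples from the quote above):

```python
# Public suffixes taken from the examples in the Mozilla request.
PUBLIC_SUFFIXES = {"net", "org.uk", "pvt.k12.ca.us"}

def may_set_cookie(host, cookie_domain):
    """Hypothetical check: refuse cookies scoped to a public suffix,
    since any registrant under that suffix could read them."""
    if cookie_domain in PUBLIC_SUFFIXES:
        return False
    # Otherwise the cookie domain must be the host itself or a parent of it.
    return host == cookie_domain or host.endswith("." + cookie_domain)
```

So a page at www.example.org.uk could set a cookie for example.org.uk but not for org.uk, which is the "secure cookie-setting policies" use the quote mentions.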
arguments against include:
[[
By proper design you can easily make cross-site cookies
verifiable. Set out the goal that a site must indicate that cross-site
cookies are allowed for them to be accepted, and then work from there.
There are many paths to get there, and the more you delegate it
to the owners and operators of the sites, the better.
The big question is what that design should look like, but it's
certainly not a central repository with copies hardcoded into software.
]]
-- Henrik Nordstrom 10 Jun 2008
http://lists.w3.org/Archives/Public/ietf-http-wg/2008AprJun/0552.html
tracker: ISSUE-31, ISSUE-36
--
Dan Connolly, W3C http://www.w3.org/People/Connolly/
gpg D3C2 887B 0F92 6005 C541 0875 0F91 96DE 6E52 C29E
Received on Wednesday, 30 July 2008 15:04:07 UTC