W3C home > Mailing lists > Public > public-tracking@w3.org > March 2012

RE: Initial feedback on the well-known URI Proposal

From: Shane Wiley <wileys@yahoo-inc.com>
Date: Tue, 6 Mar 2012 19:02:16 -0800
To: "Aleecia M. McDonald" <aleecia@aleecia.com>, "public-tracking@w3.org (public-tracking@w3.org)" <public-tracking@w3.org>
Message-ID: <63294A1959410048A33AEE161379C8023D104172D6@SP2-EX07VS02.ds.corp.yahoo.com>

Most web sites in the world today are hosted through a 3rd party service (GoDaddy, Y! Small Biz, etc.), but as you've pointed out, these tend to be medium to small publishers.  By "hosted by the 3rd party" I don't mean to say they operate under that service's domain name - merely that they use tools from that 3rd party to build and maintain their sites, even when served through their own domains.

I'll sidestep the P3P issues, as your assessment with respect to Y! Small Biz isn't correct (we can discuss offline if you're interested - I don't want to drag this group through the details), and I don't believe it's an important point for this conversation, other than to say there could be an option for 3rd parties to build tools that enable small publishers to support DNT (at some fee, I would assume), or hosting services could build these tools themselves to offer to their customers.

Agreed - content hosting sites (Hulu, YouTube, Y! Flickr, etc.) are a separate issue and are arguably outside the scope of DNT, as these would likely be considered a 1st party in most cases (or at least a Service Provider in others).

I have two WordPress accounts myself; you have the option to either go with the hosted solution or host the WordPress software separately through your own domain (both of mine go through my own domains).  In the latter case you have full control of the environment, but as I stated earlier, in my case via GoDaddy, many of the tools were easily available to me with automated installation options.  I have the ability to create single pages at my discretion, but attempting to self-implement automated response header code would be more complex (no visual representation of success), whereas with a static page it's fairly easy to see if I "got it right".

Similar to P3P testing tools, as we get further along this part of the conversation, I'm hopeful someone in the WG will step up and volunteer to build a Well-Known URI testing tool for publishers.
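In the meantime, a sketch of what such a checker might look like - shown in Python purely for illustration, and assuming a hypothetical "/.well-known/dnt" path and a "Tk" response header name, neither of which is settled in the proposal - could be as simple as:

```python
# Hypothetical DNT support checker for publishers. The "/.well-known/dnt"
# path and the "Tk" response header name are placeholders for illustration,
# not settled spec.
import urllib.request
import urllib.error

def check_dnt_support(origin):
    """Report which DNT mechanisms an origin appears to support."""
    result = {"well_known_uri": False, "response_header": False}
    try:
        resp = urllib.request.urlopen(origin + "/.well-known/dnt", timeout=10)
        # A 200 at the well-known location counts as the static-page mechanism.
        result["well_known_uri"] = resp.status == 200
        # Any "Tk" header on the response counts as the header mechanism.
        result["response_header"] = resp.headers.get("Tk") is not None
    except (urllib.error.URLError, OSError):
        # Treat connection errors and 4xx/5xx responses as "not supported".
        pass
    return result
```

A publisher could point this at their own domain and get the same immediate "did I get it right" feedback for headers that a static page gives visually.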

Thank you,

From: Aleecia M. McDonald [mailto:aleecia@aleecia.com]
Sent: Tuesday, March 06, 2012 9:36 PM
To: public-tracking@w3.org (public-tracking@w3.org)
Subject: Re: Initial feedback on the well-known URI Proposal

Hi Shane,

Ok, we're talking about very different use cases here. I think you are talking less about generic small website owners and more about hosted sites -- an important market, but not the one I thought you were describing. Let's see if I can understand your use case, and other hosted scenarios like yours.

Pretend I'm using Yahoo! Small Biz. From memory, with at least some Yahoo! business tools I inherit a P3P policy that may or may not match my actual privacy practices, and I probably inherit a human-readable privacy policy too. I don't mean to pick on Yahoo! for this - others didn't support P3P at all - but I think with the benefit of hindsight, we might try to create a path where users can set their own DNT responses rather than have something set for them that might turn out to be untrue. (Actually, we might look at saying some variety of Thou Shalt Not Send a DNT Signal Your Users Did Not Mean. Perhaps we want to examine something similar for hosted sites sending DNT on behalf of site owners who have no idea...)

So I go to http://smallbusiness.yahoo.com/. Keen, I can get a domain name. If I can arbitrarily upload *.html pages in any directory structure I like (can I?), then yes, having http://www.myhosteddomain.com/knownlocation/dnt.html works just fine for me. And here I agree with you: for a hosted environment, in a world where it's hard to get a shell account any more, it doesn't matter if I can copy & paste simple code. For response headers, unless I'm now missing something in the other direction, I'm going to want access to the server, and that's going to fail. Unless, of course, the hosted solution does some very easy configuration to support DNT. But certainly that's one more thing that can hold up adoption.

How about Flickr? If I host photos there, what does it look like if I want to handle DNT for my photos? YouTube? Hulu? My first impression here is that a site that simply hosts content, not entire sites, is a different problem.

Let's say instead I go to WordPress, which Wikipedia tells me powers about 15% of the "top million" websites. I could still get my own domain, and we're into the use case I was thinking about initially. Or, I could wind up with a hosted site, which seems to just have pages in the format example.wordpress.com/year/month/day/title. Does anyone know WordPress well enough off-hand to know if I could create my own subdirectory and file name, or if it's only generated URLs in the hosted environments? For that matter: hosted WordPress does analytics and comment spam blocking automagically. Depending on how they do that, it might not be possible for hosted WordPress customers to be DNT compliant, unless WordPress is willing to either change or at least disclose their practices.

It's possible this is an edge case where the type of response doesn't actually matter that much -- the general bureaucracy from hosting will trump all else. Without spending much time on research, my quick reaction here is that no matter what we do, a non-trivial proportion of hosted sites will need active involvement of the hosting party to be able to support DNT. This seems like a good thing to think about while looking for early adopters.


On Mar 6, 2012, at 3:26 PM, Shane Wiley wrote:


This is probably a situation where less than 1% of web site owners have technical skills at your level or above.  Having worked with many small business owners via Yahoo! Small Biz, I've seen their reliance on pre-packaged products and the difficulties they have in even operating those, and I believe they would be COMPLETELY lost if asked to manage something more detailed from a response header coding perspective.  On the other hand, if they had to create a single page on their site (well-known URI) with a specific format of text, they could probably accomplish this.

- Shane

From: Aleecia M. McDonald [mailto:aleecia@aleecia.com]
Sent: Tuesday, March 06, 2012 5:44 PM
To: public-tracking@w3.org (public-tracking@w3.org)
Subject: Re: Initial feedback on the well-known URI Proposal

On Mar 6, 2012, at 4:13 AM, Shane Wiley wrote:

The one choice that does appear to be off the table at this point (unless someone strongly disagrees) is response headers in isolation, as it would take years before medium to small web sites would be able to support DNT (this would require standard web server systems to come with off-the-shelf support for response headers).  Agreed?

Not following - perhaps my mistake here. I should think that adding a response header to www.aleecia.com would be trivial for a competent developer, and that the code would be easy enough to have sample code for copy & paste cargo cult programming for the non-competent programmers (read: people like me). It might take longer to get PHP installed, updated, and running...
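To put a rough size on "trivial": a sketch in Python, standing in for the PHP case (and with the "Tk: 1" header name and value as placeholders I'm inventing, since the response format isn't settled), would be a few lines:

```python
# Illustrative only: the sort of snippet a site owner might copy & paste.
# The "Tk: 1" header name/value is a placeholder, not settled spec.
from http.server import SimpleHTTPRequestHandler

class DNTHandler(SimpleHTTPRequestHandler):
    """Serve files as usual, attaching a tracking-status response header."""

    def end_headers(self):
        # The one-line addition: every response gains the extra header.
        self.send_header("Tk", "1")
        super().end_headers()
```

Attaching the header really is the one extra line; the equivalent in a PHP page or an Apache config would be similarly short.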

What am I missing? Where is this hard?

Received on Wednesday, 7 March 2012 03:03:12 UTC

This archive was generated by hypermail 2.3.1 : Friday, 3 November 2017 21:44:46 UTC