
Re: PR Hording at W3.org

From: Susan Lesch <lesch@w3.org>
Date: Fri, 22 Jul 2005 12:26:34 -0700
Message-ID: <42E1486A.5060100@w3.org>
To: donny@prnewsnow.com
Cc: site-comments@w3.org
Hello, Donny,

I have forwarded your note to W3C's Webmaster, who may have more
information about robots.txt than I do.

donny@prnewsnow.com wrote:
> It came to my attention that W3C has placed a disallow rule in its
> robots.txt to prevent robots from seeing the check pages.
> I find this counter to the goal of W3C.  SEO firms would jump at
> the chance to become compliant for a link back to the page.
> I went to the effort of making all 21,000 articles XHTML compliant
> (about 5% are still straggling), placing the link to the W3C Validator
> on all the pages.  Now this builds up your PR respect with Google, but
> you don't really need it.
> Throw a valid dog a bone and allow the page to be read to find the link
> back to the valid page.  It's the small reward that will have every SEO
> firm dancing to be XHTML compliant.
> Donny Lairson
> President PR News Now
> http://www.prnewsnow.com
> 29 GunMuse Lane
> Lakewood NM 88254
> 469 228 2183
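
[Editorial note: the robots.txt rule Donny describes would look roughly
like the sketch below. The path shown is illustrative only; the actual
entry in W3C's robots.txt at the time may have differed.]

```text
# Sketch of a robots.txt entry that blocks crawlers from
# validator result ("check") pages. Path is hypothetical.
User-agent: *
Disallow: /check
```

A rule like this tells compliant crawlers not to fetch any URL whose
path starts with /check, which would keep validator result pages, and
any links on them, out of search engine indexes.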

Thank you for writing,
Susan Lesch           http://www.w3.org/People/Lesch/
mailto:lesch@w3.org               tel:+1.858.483.4819
World Wide Web Consortium (W3C)    http://www.w3.org/

Received on Friday, 22 July 2005 19:26:42 UTC

This archive was generated by hypermail 2.4.0 : Monday, 18 April 2022 20:33:44 UTC