- From: Sandy Smith <ssmith@forumone.com>
- Date: Sat, 22 Jan 2005 18:27:46 +0000
- To: Trejkaz <trejkaz@trypticon.org>
- Cc: www-html@w3.org
Many Web applications render content to static HTML files for efficiency and to reduce server load. Thus, they cannot re-create a page for a given user agent. For the same reason, dynamic transformation of those static files is not a reasonable expectation.

On Jan 22, 2005, at 10:18 AM, Trejkaz wrote:

> On Sunday 23 January 2005 02:07, Mark Birbeck wrote:
>> For example, all the blogging software companies could indicate that the
>> 'type' of the page was a blog. Or they could mark up the comments area as
>> 'comments'.
>
> I would hope it's the latter, unless you intend to penalise the owner of the
> weblog the same way that the spammers are penalised.
>
> I'm starting to wonder why this couldn't have been achieved without markup at
> all. The web server _knows_ that it's Googlebot requesting the page.
> Couldn't the web application simply omit the entire comment section? Or add
> any magic attributes only for bots? Or if more than one bot had to be
> supported, could they not introduce a new HTTP header?
>
> TX
>
> --
> Email: Trejkaz Xaoza <trejkaz@trypticon.org>
> Web site: http://xaoza.net/
> Jabber ID: trejkaz@jabber.zim.net.au
> GPG Fingerprint: 9EEB 97D7 8F7B 7977 F39F A62C B8C7 BC8B 037E EA73
>

--
Sandy Smith, Senior Programmer
Forum One Communications
<ssmith@forumone.com>
http://www.forumone.com/
tel. (703) 548-1855 x28
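
A minimal sketch of the approach Trejkaz describes, varying output by user agent, might look like the following. This is illustrative only: Python is an arbitrary choice, and the names (KNOWN_BOTS, render_page) are hypothetical rather than part of any real blogging package. The check itself is trivial, which is part of the objection above: it can only run when the page is generated per request, not when a pre-rendered static file is served as-is.

    # Hypothetical sketch: omit the comment section for known crawlers.
    # KNOWN_BOTS and render_page are illustrative names, not a real API.

    KNOWN_BOTS = ("Googlebot", "msnbot", "Slurp")

    def render_page(entry_html, comments_html, user_agent):
        """Return the full page, dropping reader comments for known bots."""
        is_bot = any(bot in user_agent for bot in KNOWN_BOTS)
        body = entry_html if is_bot else entry_html + comments_html
        return "<html><body>%s</body></html>" % body

    # Example: a crawler request gets the entry only, no comments.
    print(render_page("<p>Post</p>",
                      "<div class=\"comments\">...</div>",
                      "Googlebot/2.1 (+http://www.google.com/bot.html)"))
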
Received on Saturday, 22 January 2005 21:36:19 UTC