[whatwg] Proposal: Exclude robots on a per-link basis

Hello

Viewing the logs of applications I wrote, I noticed that a considerable 
number of requests come from robots following links such as "Add to 
shopping cart" or "Remember this item" - links that typically point 
back to the page they appear on, with a GET parameter that triggers an 
action on the server.

Trying to find a way to tell robots not to follow these links, I came 
across the microformats Robots Exclusion Profile 
http://microformats.org/wiki/robots-exclusion and the @rel=nofollow 
keyword. The latter does not look robot-specific to me (it actually 
states that the author wants to discourage following the link), and I 
must admit that I don't fully understand the Robots Exclusion Profile 
approach. If that approach already serves this purpose, please feel 
free to ignore this proposal. It might then be helpful to add a hint 
about it somewhere in section 4.12 of the HTML spec.
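
For comparison, the existing mechanisms I'm aware of look like this (the 
meta variant affects every link on the page, which is too coarse here, 
and nofollow, as I read the spec, only says the author does not endorse 
the link):

<!-- per page: asks robots not to follow any link on this page -->
<meta name="robots" content="nofollow">

<!-- per link: "not endorsed by the author", not robot-specific -->
<a href="page.html?add-item=1" rel="nofollow">Add to cart</a>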

I propose adding either a new attribute, a new link type, or a new 
keyword for @rel, whichever is most consistent with the HTML structure:
<a href="page.html?add-item=1" robots-nofollow>Add to cart</a>
<a href="page.html?add-item=1" rel="robots-nofollow">Add to cart</a>
<a href="page.html?add-item=1" type="robots-nofollow">Add to cart</a>

-- 
Markus

Received on Saturday, 26 November 2011 04:20:28 UTC