Re: robots.rdf

<quote>
I was wondering if the robots exclusion protocol
could be leveraged to make life easier for rdf-aware agents, in a way that
would be a lot less effort than going for something like full-blown P3P.
</quote>
Less effort than P3P??

<quote>
There are two ways that I am aware of the protocol being used at present -
either in a metatag (e.g. <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">)
or in a robots.txt file in the root directory of the server. The former
couldn't really add much of value to the existing situation, but the latter
might have a lot of potential. If robots.txt contained information
specifically aimed at rdf agents, then a lot of the ad hoc link
following/metatag scrunching that might otherwise have to be employed by
these agents wouldn't be necessary.
</quote>
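The robots.txt route described above already has standard-library support, so an rdf agent honouring it is cheap to sketch. Below is a minimal illustration, assuming a purely hypothetical "User-agent: rdf-agent" section that a site owner might add; the paths are made up for the example.

```python
# Sketch of an rdf-aware agent honouring robots.txt, using Python's
# standard urllib.robotparser. The "rdf-agent" section is hypothetical.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: rdf-agent
Disallow: /private/

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The hypothetical rdf-agent may fetch anything outside /private/ ...
print(parser.can_fetch("rdf-agent", "/data/foo.rdf"))   # True
print(parser.can_fetch("rdf-agent", "/private/x.rdf"))  # False
# ... while robots not named in the file are excluded entirely.
print(parser.can_fetch("other-bot", "/data/foo.rdf"))   # False
```

The point is that the same file robots already fetch could carry the extra information, so no new discovery mechanism (and none of the ad hoc link following) would be needed.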
Incidentally, since you brought the P3P thingy in, the smart way would
instead be to stick any RDF you want at its well-known location (which
was designed precisely to allow this), so browsers like IE6 etc.
will just munch the metadata with a single GET.
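The "single GET" point is just URL construction: the agent never chases links, it resolves one fixed path against the site root. A minimal sketch, using P3P's actual well-known location /w3c/p3p.xml (the example site is of course made up):

```python
# Sketch: resolving a well-known location against a site, as P3P does
# with /w3c/p3p.xml. One GET to this URL fetches the site's metadata.
from urllib.parse import urljoin

WELL_KNOWN_PATH = "/w3c/p3p.xml"  # P3P's well-known location

def well_known_url(page: str, path: str = WELL_KNOWN_PATH) -> str:
    """Build the single URL an agent would GET, with no link-chasing."""
    return urljoin(page, path)

print(well_known_url("http://example.org/some/deep/page"))
# -> http://example.org/w3c/p3p.xml
```

Any RDF parked at such a location would be discoverable the same way.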

Note this also relates to the sitemap thread (in fact, that was one
of the possible applications we had in mind).

-M

Received on Wednesday, 9 January 2002 13:46:34 UTC