Re: Some thoughts on a new publication approach

Hi Martin,

Strong words! Let me try to clarify a few things. :)

Le 27 oct. 2013 à 01:06, Martin J. Dürst <duerst@it.aoyama.ac.jp> a écrit :
> I don't think we should do the search engines' work.

Yes we can. :)
Think of it as a twofold thing: with external search engines, we have little to no control over their ranking algorithms, search strategies, etc. With a local search engine, you can make everything browsable and searchable in a way that makes sense for your information architecture.

See for example the W3C mailing-list search engine.

> Even more, this proposal, as far as I understand it, would essentially make it impossible to go search for "foo-tech Last Call".

It depends on what you call impossible. If you mean using the common search engines, then yes. If you rely on a local search engine, or even run your own, then no, there is no issue.

> That would be bad. I'm not exactly sure how to express this, but in some way, tweaking robots.txt for this feels like censorship to me.

Nope. Censorship would be removing the content entirely. The old versions are still accessible: through linking, nobody forbids you from browsing them or even collecting them. I think I have the full TR space up to 2008.
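
For what it's worth, the kind of robots.txt tweak under discussion might look like the sketch below. The paths are hypothetical, just to illustrate the idea of keeping crawlers off dated snapshots while the latest version stays indexable and the old documents remain fully browsable:

```
# Hypothetical sketch, not the actual W3C robots.txt.
# Dated snapshots stay online and linkable, but are not crawled.
User-agent: *
Disallow: /TR/2013/
Disallow: /TR/2012/
# The undated "latest version" URIs remain indexable by default.
```

Note that Disallow only affects crawling, not access: anyone following a link still gets the document.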

Hope it helps.

-- 
Karl Dubost
http://www.la-grange.net/karl/

Received on Monday, 28 October 2013 01:40:16 UTC