- From: Hugh Glaser <hg@ecs.soton.ac.uk>
- Date: Wed, 15 Oct 2008 15:19:33 +0100
- To: Alan Ruttenberg <alanruttenberg@gmail.com>, Reto Bachmann-Gmür <reto.bachmann@trialox.org>
- CC: carmen r <_@whats-your.name>, semantic-web at W3C <semantic-web@w3.org>, Eric Miller <em@zepheira.com>
Peer-to-peer networks deal with this problem (and others) quite effectively.
(Michael Hausenblas pointed this out to me.)

On 15/10/2008 15:03, "Alan Ruttenberg" <alanruttenberg@gmail.com> wrote:

> I believe that the model we should look to is the Linux distribution system.
> There are a number of mirrors, each of which is coequal. One can explicitly
> choose which site to use or have one randomly assigned. In a federation of
> PURLs, one site turned casino would be quickly removed from the list. I think
> this is quite feasible to accomplish for PURL servers, have discussed this
> with the developers, and hope to see a prototype some time in the near future.
>
> -Alan
>
> On Wed, Oct 15, 2008 at 9:12 AM, Reto Bachmann-Gmür
> <reto.bachmann@trialox.org> wrote:
>>
>> carmen r said the following on 2008-10-14 15:28:
>>> On Tue Oct 14, 2008 at 12:39:54AM +0100, Giovanni Tummarello wrote:
>>>
>>>> Hi Martin, all, yes, it is a service that was planned.
>>>>
>>>> Unfortunately the cache system we have is based on HBase, which is
>>>> still at a very early stage and badly crashed on us recently. We're in
>>>> the process of updating and restoring it.
>>>> It will take some time, but it is coming; I will announce it when ready
>>>> (probably together with a simple library for transparent failover).
>>>>
>>>> So a semantic web client could simply do an HTTP request on the URL and,
>>>> if that fails, switch over to Sindice or whoever else wants to do that.
>>>>
>>>> I agree this service is badly needed. I don't think the Semantic Web can
>>>> be that interesting if a client doesn't mash or chain together several
>>>> resources automatically, with the consequent dramatic chances of
>>>> failure, hence the need for one or more backup servers (which however
>>>>
>>>
>>> I think it is a fundamental enough need to warrant architectural
>>> consideration,
>>>
>>> I mean at the level of HTTP.
>>>
>>> I am not saying HTTP should go away; probably some bblfish way of doing it
>>> without inventing a new protocol (heck, BitTorrent still uses HTTP for
>>> parts).
>>>
>>>
>> If purl comes back up we are lucky, but maybe we could learn something
>> anyway.
>>
>> Having names for fundamental terms based on the DNS system is a
>> weakness. What will we do if purl.org gets taken over by a casino site?
>> Will we argue that the terms keep their meaning even if the casino site
>> says something else? In my sci-fi post [1] I've scheduled this topic for
>> 2015. Hash URIs or other non-HTTP URIs have the advantage of stability,
>> but their meaning is harder to look up; could we combine the approaches?
>> Should we have protocol-independent terms with evolving meaning, as in
>> natural languages?
>>
>>> We need alternatives to the Google "we are your backup server" system.
>>>
>> Indeed.
>>
>> Reto
>>
>> 1. http://lists.w3.org/Archives/Public/semantic-web/2008Jan/0118.html
Received on Wednesday, 15 October 2008 14:20:29 UTC
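A minimal sketch of the client-side failover idea discussed in the thread: dereference the URI directly over HTTP and, if that fails, retry against a list of cache or mirror services, picked in random order in the spirit of the Linux-mirror analogy. The cache endpoints and the `fetch_with_fallback` helper below are hypothetical placeholders for illustration only, not real Sindice or PURL-federation APIs.

```python
import random
import urllib.error
import urllib.parse
import urllib.request

# Hypothetical fallback services that could serve a cached copy of a resource,
# expressed as templates into which the original URI is substituted.
CACHE_TEMPLATES = [
    "http://cache.example.org/lookup?uri={uri}",
    "http://mirror.example.net/fetch?uri={uri}",
]


def fetch_with_fallback(uri, timeout=10):
    """Try the URI itself first; on failure, try the fallback caches in random order."""
    mirrors = [
        template.format(uri=urllib.parse.quote(uri, safe=""))
        for template in CACHE_TEMPLATES
    ]
    # Randomise the mirrors so no single fallback service carries all the load,
    # but always attempt the original URI first.
    random.shuffle(mirrors)
    last_error = None
    for candidate in [uri] + mirrors:
        try:
            request = urllib.request.Request(
                candidate, headers={"Accept": "application/rdf+xml"}
            )
            with urllib.request.urlopen(request, timeout=timeout) as response:
                return response.read()
        except (urllib.error.URLError, OSError) as exc:
            last_error = exc  # remember the failure and move on to the next source
    raise RuntimeError(f"All sources failed for {uri}") from last_error
```

In a real deployment the mirror list itself would presumably be maintained by the federation (so a hijacked host could be removed quickly), rather than hard-coded in the client as it is in this sketch.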