Re: Wikipedia incremental updates

Hi Nicolas,

The upcoming DBpedia Live service is hosted at:

	http://dbpedia-live.openlinksw.com/sparql

This service extracts live updates from Wikipedia, as can be seen at:

	http://dbpedia-live.openlinksw.com/stats/

It is still undergoing testing but is available for use, and it is scheduled to replace the default DBpedia service soon.
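
For example, the endpoint can be queried over HTTP like any other SPARQL endpoint. Below is a minimal Python sketch; the query itself and the JSON handling are illustrative, not part of the service documentation:

    import json
    import urllib.parse
    import urllib.request

    ENDPOINT = "http://dbpedia-live.openlinksw.com/sparql"
    # Trivial example query; replace with whatever you need.
    QUERY = "SELECT ?s ?p ?o WHERE { ?s ?p ?o } LIMIT 10"

    params = urllib.parse.urlencode({
        "query": QUERY,
        "format": "application/sparql-results+json",
    })
    with urllib.request.urlopen(ENDPOINT + "?" + params) as resp:
        results = json.load(resp)

    # Standard SPARQL JSON results layout: results -> bindings -> variable -> value.
    for b in results["results"]["bindings"]:
        print(b["s"]["value"], b["p"]["value"], b["o"]["value"])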

Best Regards
Hugh Williams
Professional Services
OpenLink Software
Web: http://www.openlinksw.com
Support: http://support.openlinksw.com
Forums: http://boards.openlinksw.com/support
Twitter: http://twitter.com/OpenLink

On 22 Jan 2010, at 03:35, Nicolas Torzec wrote:

> Hi there,
> 
> I am using open data sets such as Wikipedia for data mining and knowledge acquisition purposes; the entities and relations extracted are then exposed and consumed via indices.
> 
> I already retrieve and process each new Wikipedia static dump as it becomes available, but I would like to go beyond this and use incremental/live updates to stay more in sync with Wikipedia content.
> 
> I know that I could use some Web services and IRC channels to track changes in Wikipedia. However, besides the fact that the Web service was designed more for tracking individual changes than for monitoring Wikipedia continuously, both methods would still require parsing the update messages (to extract the URLs of the new/modified/deleted pages) and then retrieving the actual pages.
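> 
> For concreteness, here is a minimal Python sketch of that polling approach against the standard MediaWiki api.php recentchanges list; the polling loop, interval, and page-retrieval step are illustrative:
> 
>     import json
>     import time
>     import urllib.parse
>     import urllib.request
> 
>     API = "http://en.wikipedia.org/w/api.php"
> 
>     def recent_changes(limit=50):
>         params = urllib.parse.urlencode({
>             "action": "query",
>             "list": "recentchanges",
>             "rcprop": "title|ids|timestamp",
>             "rclimit": limit,
>             "format": "json",
>         })
>         with urllib.request.urlopen(API + "?" + params) as resp:
>             return json.load(resp)["query"]["recentchanges"]
> 
>     seen = set()
>     while True:
>         for change in recent_changes():
>             if change["rcid"] in seen:
>                 continue
>             seen.add(change["rcid"])
>             # Here one would fetch the changed page itself and reprocess it.
>             print(change["timestamp"], change["title"])
>         time.sleep(30)  # poll interval is a guess; tune as needed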
> 
> Does anyone have experience with this?
> 
> Is there any other way to retrieve incremental updates in a reliable and continuous manner, especially in the same format as the static dumps? (MySQL replication, incremental dumps, ...)
> 
> I have also read that DBpedia is trying to be more in sync with Wikipedia content. How do they plan to stay in sync with Wikipedia updates?
> 
> 
> Thanks for your help.
> 
> Best,
> Nicolas Torzec.

Received on Friday, 22 January 2010 04:36:13 UTC