
Re: Reasoning over millions of triples - any working reasoners?

From: Ivan Mikhailov <imikhailov@openlinksw.com>
Date: Sat, 22 Jan 2011 13:42:31 +0600
To: Sampo Syreeni <decoy@iki.fi>
Cc: Harry Halpin <hhalpin@ibiblio.org>, Semantic Web <semantic-web@w3.org>
Message-ID: <1295682151.5222.4672.camel@octo.iv.dev.null>

Hello Sampo,

On Sat, 2011-01-22 at 00:19 +0200, Sampo Syreeni wrote:
> On 2011-01-19, Ivan Mikhailov wrote:
> > Virtuoso deals with owl:sameAs in a scalable way, so you can try. Of 
> > course, a single chain 50 million connections long would cause 
> > problems, but more traditional cases should work fine. Google for 
> > "virtuoso owl:same-as input:inference" may be the fastest way to get 
> > more hints.

> A mere fifty million indices mean nothing when you mind your algos, and 
> especially if you really utilize the current parallel hardware we have 
> to its fullest. It'd actually be a shame if you couldn't sustain 50M 
> such arbitrary reductions per hour, guaranteed, full-time, on a single 
> Duo, even using something much less efficient than well-hand-optimized C 
> and/or assembly.

I did not mean sustained reduction speed over some frozen state; we
had to answer multiple queries simultaneously, in interactive style, on
fresh data.
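As an aside, the usual way to keep owl:sameAs resolution cheap even for long chains is to collapse the assertions into equivalence classes with union-find. The sketch below is only an illustration of that idea (it is not Virtuoso's actual implementation, and the `ex:` IRIs are made up): with path compression and union by rank, looking up a canonical representative stays near-constant time even when the sameAs links form one long chain.

```python
# Sketch: canonicalizing owl:sameAs identifiers with union-find
# (path compression + union by rank). Illustrative only -- not how
# any particular triple store implements it.

class SameAsIndex:
    def __init__(self):
        self.parent = {}
        self.rank = {}

    def find(self, iri):
        # Register unseen IRIs as their own canonical representative.
        if iri not in self.parent:
            self.parent[iri] = iri
            self.rank[iri] = 0
            return iri
        # Locate the root of this equivalence class.
        root = iri
        while self.parent[root] != root:
            root = self.parent[root]
        # Path compression: repoint every node on the chain at the
        # root, so a chain of length n is flattened after one walk.
        while self.parent[iri] != root:
            self.parent[iri], iri = root, self.parent[iri]
        return root

    def same_as(self, a, b):
        # Record an owl:sameAs assertion by merging the two classes.
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        if self.rank[ra] < self.rank[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        if self.rank[ra] == self.rank[rb]:
            self.rank[ra] += 1

idx = SameAsIndex()
# A chain a0 sameAs a1 sameAs ... sameAs a9999: after the first
# lookup the whole chain is flattened, so later lookups are cheap.
for i in range(9999):
    idx.same_as(f"ex:a{i}", f"ex:a{i+1}")
assert idx.find("ex:a0") == idx.find("ex:a9999")
```

Building the classes is roughly linear in the number of sameAs triples; the hard part in a live store, as noted above, is keeping them correct while fresh assertions arrive and queries run concurrently.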

Best Regards,

Ivan Mikhailov
OpenLink Software
Received on Saturday, 22 January 2011 07:43:04 UTC
