SwiftOWLIM ver. 2.9.0 loads Wordnet in two minutes

Ontotext announces version 2.9.0 of SwiftOWLIM

OWLIM (http://www.ontotext.com/owlim/) is a high-performance RDF repository with support for light-weight OWL inference and rule entailment. SwiftOWLIM is open-source and implemented purely in Java, as a Storage and Inference Layer (SAIL) for the Sesame RDF database.

With its new version, SwiftOWLIM remains the fastest RDF(S) and OWL engine: it can load, persist, and perform inference against millions of statements, at the unmatched speed of 200,000 statements/sec. It now takes advantage of a new multi-threaded inferencer, designed to match today's multi-core CPUs, and gets a further performance boost from version 1.6 of the JDK. As a result, SwiftOWLIM is effectively twice as fast in a typical server setup.

Still, the main improvements in version 2.9.0 concern reliability and usability. OWLIM can now reason against larger datasets with more complicated semantics, without the need for additional tuning. The rule compiler went through several fixes and improvements, driven by feedback from multiple users. Among the improvements are better transaction isolation and better management of implicit (inferred) statements.

OWLIM is now much easier to install and experiment with. The distribution package is largely self-sufficient and contains all the necessary runtime libraries. The new Getting-Started application template allows new applications using OWLIM to be bootstrapped within a few minutes. The sample application loads several ontologies and data files, evaluates queries, and modifies the repository. One can even test OWLIM without compiling any Java code, simply by specifying the files to be loaded and the queries to be evaluated.
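
To convey the idea of such a no-code setup, a configuration for the Getting-Started template might look roughly like the following. The parameter names and file layout here are hypothetical, shown only for illustration; consult the template shipped in the distribution for the actual format.

```
# Hypothetical Getting-Started configuration (illustrative only;
# see the template in the distribution for the real parameter names)
import  = ontology/my-schema.owl;data/my-data.nt
queries = queries/sample-queries.txt
ruleset = owl-horst
```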

The development of OWLIM is driven by real applications, publicly recognized benchmarks, and popular ontologies and datasets. After LUBM and a few other benchmarks, we have tested OWLIM against the standard RDF(S)/OWL representation of Wordnet, developed by W3C's Semantic Web Best Practices and Deployment Working Group. One can experiment with it simply by reconfiguring the Getting-Started template; such a setup is provided in the distribution. This Wordnet representation consists of 1.9 million explicit statements, encoded in a fragment of OWL Lite. On a notebook, SwiftOWLIM loads them and infers an additional 6.3 million implicit statements within two minutes. Since all the necessary semantics is supported, it answers various non-trivial queries correctly in real time.
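
As an illustration of the kind of query involved, something along the lines of the following SPARQL query retrieves all (direct and inferred) hypernyms of the synsets labelled "dog" by following the hyponymOf property. The namespace and property names follow the W3C WordNet conversion as we recall it; treat this as a sketch rather than a verbatim sample from the distribution.

```
PREFIX wn20schema: <http://www.w3.org/2006/03/wn/wn20/schema/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

# With hyponymOf handled transitively, the materialized closure
# answers this with simple lookups, no recursion at query time.
SELECT ?synset ?hypernym
WHERE {
  ?synset rdfs:label "dog" .
  ?synset wn20schema:hyponymOf ?hypernym .
}
```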

Release Notes, version 2.9.0: 

  - Multi-threaded inference: loading speed improves by 37-71% on a dual-CPU (4-core) server, depending on the rule-set; 33% speed-up on a desktop machine (P4 with hyper-threading);
  - Improved transaction isolation: corresponding to the READ COMMITTED level in an RDBMS;
  - Transitive closure optimization: the materialization of the "closure" of transitive properties can be switched off. This prevents the generation of O(N^2) implicit statements for a chain of N individuals connected through a transitive property, and dramatically improves scalability and performance on datasets with long "chains" over transitive properties;
  - Stack-safe inference: versions 2.8.3/4 offered a "stack-safe" mode that allowed very "deep" inference chains to be handled, but made OWLIM slower. The reasoning algorithm is now stack-safe without a performance penalty or the need for a specific configuration parameter;
  - Improved management of implicit and explicit statements: separate retrieval of explicit and implicit statements is now straightforward;
  - Rule compiler fix: it can now process rules with a virtually unlimited number of premises;
  - Getting-Started introduced: a sample application setup (incl. source code, binaries, scripts, and configurations) that allows easy bootstrapping of applications using OWLIM;
  - Wordnet: a sample application loading W3C's RDF/OWL representation of Wordnet is provided;
  - Distribution improvements: OWLIM is now packaged with all necessary runtime libraries, and numerous improvements to the accompanying scripts make running OWLIM trivial.
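
To see where the O(N^2) figure in the transitive-closure item comes from, here is a small self-contained sketch in plain Java (no OWLIM dependency; all names are ours, not OWLIM's) that materializes the closure of a chain by naive forward chaining. A chain of N individuals carries N-1 explicit statements, but its closure holds N*(N-1)/2 statements in total.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class TransitiveClosureDemo {

    /** Forward-chain (a,b) and (b,c) entail (a,c), to a fixed point. */
    static Set<List<Integer>> closure(Set<List<Integer>> explicit) {
        Set<List<Integer>> all = new HashSet<List<Integer>>(explicit);
        boolean changed = true;
        while (changed) {
            changed = false;
            List<List<Integer>> snapshot = new ArrayList<List<Integer>>(all);
            for (List<Integer> ab : snapshot) {
                for (List<Integer> bc : snapshot) {
                    if (ab.get(1).equals(bc.get(0))
                            && all.add(Arrays.asList(ab.get(0), bc.get(1)))) {
                        changed = true;
                    }
                }
            }
        }
        return all;
    }

    public static void main(String[] args) {
        int n = 10; // chain of 10 individuals: 9 explicit statements
        Set<List<Integer>> explicit = new HashSet<List<Integer>>();
        for (int i = 1; i < n; i++) {
            explicit.add(Arrays.asList(i, i + 1));
        }
        Set<List<Integer>> all = closure(explicit);
        // For N = 10: 45 statements in total, i.e. 36 implicit ones
        // on top of the 9 explicit edges of the chain.
        System.out.println("explicit=" + explicit.size()
                + " total=" + all.size()
                + " implicit=" + (all.size() - explicit.size()));
    }
}
```

Switching materialization off trades this quadratic blow-up for recursive evaluation at query time, which is why the optimization matters so much for long chains.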
SwiftOWLIM can be downloaded at http://www.ontotext.com/owlim/. 
We are eager to hear your feedback: experiences, comments, problems, and requests are all welcome on our mailing lists:
http://www.ontotext.com/owlim/mailing-lists-info.html


The OWLIM team  
Ontotext Lab.
Sirma Group Corp.

Received on Tuesday, 12 June 2007 18:54:05 UTC