
[announcement] OWLIM semantic repository v2.8.2 released

From: Damyan Ognyanoff <damyan@sirma.bg>
Date: Mon, 6 Mar 2006 17:23:56 +0200
Message-ID: <005701c64131$ffa76040$cd80a8c0@sirma.int>
To: "sekt" <sekt@aifb.uni-karlsruhe.de>
Cc: <www-rdf-interest@w3.org>, <semantic-web@w3.org>, <dip-all@lists.deri.org>, <seweb-list@lists.deri.org>, <sesame-interest@lists.sourceforge.net>, <interested-in-OWLIM@ontotext.com>
(*** We apologize if you receive multiple copies of this announcement ***)

Ontotext is happy to announce the release of version 2.8.2 of the OWLIM semantic repository, http://www.ontotext.com/owlim/. The new version is faster and more scalable than its predecessors, and it supports richer semantics. The major news is that OWLIM is now much more configurable: one can choose among seven different types of entailment (semantics) and control the size of its main index (trading RAM for performance).

OWLIM is a high-performance semantic repository, packaged as a Storage and Inference Layer (SAIL) for the Sesame RDF database (v.1.2.1-1.2.4). OWLIM uses the TRREE (Triple Reasoning and Rule Entailment Engine) to perform RDFS, OWL DLP, and OWL Horst reasoning. The most expressive language supported is a combination of limited OWL Lite and unconstrained RDFS. Reasoning and query evaluation are performed in memory, while a reliable persistence strategy ensures data preservation, consistency, and integrity.

OWLIM has been proven to scale to tens of millions of statements while maintaining an upload speed of tens of thousands of statements per second. It can manage millions of statements even on commodity desktop hardware. OWLIM completes the LUBM(50,0) benchmark within 6 minutes on a machine costing about $1,000. Given 6 GB of RAM on an entry-level server, OWLIM loads LUBM(300,0) in about 50 minutes.

The changes in version 2.8.2, with respect to 2.8, can be summarized as follows:

        TRREE: OWLIM uses the TRREE engine for in-memory reasoning and query evaluation. TRREE is a newer version of the IRRE engine that was part of OWLIM v.2.8.

        7 different inference modes: OWLIM can be configured with one of three pre-built rule sets that support, respectively, the semantics of RDFS, OWL Horst, and a specific fragment we name owl-max (combining OWL Lite with unrestricted RDFS). Each rule set can be switched to a "partial-rdfs" mode, in which some of the normative RDFS entailments are dropped for performance reasons. In addition, entailment is optional altogether, so OWLIM can be used as a plain RDF store with inference switched off; the three full rule sets, their partial variants, and the no-inference mode together give the seven modes.

        Extended OWL support: owl:oneOf, owl:minCardinality, owl:maxCardinality, owl:cardinality; partial OWL-Lite T-Box (schema-level) reasoning added. 

        Configurable index size: allows the user to manage the tradeoff between required RAM and performance.
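By way of illustration, a repository exposing these options in a Sesame 1.x-style configuration might look like the sketch below. The SAIL class name and the parameter keys (`ruleset`, `partial-rdfs`, `index-size`) and their values are assumptions made up for this example, not the documented OWLIM settings; the OWLIM user guide lists the actual keys and values.

```xml
<!-- Hypothetical Sesame 1.x repository entry stacking OWLIM as a SAIL.
     The class name and all parameter keys/values below are illustrative
     assumptions, not the documented OWLIM configuration. -->
<repository id="owlim-demo">
  <sail class="com.ontotext.owlim.SailImpl">
    <!-- one of the pre-built rule sets: rdfs | owl-horst | owl-max -->
    <param name="ruleset" value="owl-horst"/>
    <!-- partial mode: drop some normative RDFS entailments for speed -->
    <param name="partial-rdfs" value="false"/>
    <!-- main index size: a larger value trades RAM for performance -->
    <param name="index-size" value="2000000"/>
  </sail>
</repository>
```

Disabling the rule set entirely would then correspond to using OWLIM as a plain RDF store with no entailment.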

Damyan Ognyanoff

Ontotext Lab., Sirma AI
Received on Monday, 6 March 2006 15:24:40 UTC
