ANNOUNCE: SW Inferencing Skeleton in Haskell (Swish)

I've been working on building some tools for semantic web inferencing using 
Haskell.  (Haskell is a pure functional programming language with higher 
order functions, lazy evaluation and more;  see http://www.haskell.org/ )

The first phase of this has been posted on my web site at:

   http://www.ninebynine.org/Software/Intro.html#Swish  (links)
   http://www.ninebynine.org/Software/Swish-0.1.zip     (software source)
   http://www.ninebynine.org/Software/swish-0.1.html    (documentation)

This first phase is just a platform for future development, consisting of:
- A data type and functions for manipulating RDF models
   (extended to handle N3 formulae), with a simple interface for creating,
   combining, accessing and comparing graphs (a rough sketch of this kind
   of interface appears just after this list)
- A Notation3 parser
- A basic Notation3 formatter
- A command-line utility for testing the above
- Numerous test programs
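To give a flavour of what "creating, combining, accessing and comparing
graphs" might look like, here is a small, self-contained Haskell sketch.
It is NOT the Swish API -- the types and names (Node, Graph, mkGraph,
addGraphs, triples, sameGraph) are invented purely for illustration -- but
it shows the general shape of a simple triple-based graph interface:

module Main where

import Data.List ( nub, sort )

-- An RDF node: a URI reference, a literal, or a blank (unnamed) node.
data Node = URI String | Lit String | Blank String
    deriving (Eq, Ord, Show)

-- A triple, and a graph as a set of triples (represented as a list).
type Triple = (Node, Node, Node)
newtype Graph = Graph [Triple]
    deriving (Show)

-- Create a graph from a list of triples, dropping duplicates.
mkGraph :: [Triple] -> Graph
mkGraph = Graph . nub

-- Access the triples of a graph.
triples :: Graph -> [Triple]
triples (Graph ts) = ts

-- Combine two graphs (a plain union; bnode handling is ignored here --
-- see the sketch further down for merging with bnode renaming).
addGraphs :: Graph -> Graph -> Graph
addGraphs g1 g2 = mkGraph (triples g1 ++ triples g2)

-- Compare two graphs, here by simple label-sensitive equality.
sameGraph :: Graph -> Graph -> Bool
sameGraph g1 g2 = sort (triples g1) == sort (triples g2)

main :: IO ()
main = do
    let g1 = mkGraph [ (URI "ex:a", URI "ex:p", Lit "hello") ]
        g2 = mkGraph [ (URI "ex:a", URI "ex:q", Blank "b1") ]
    print (sameGraph g1 g1)            -- True
    print (triples (addGraphs g1 g2))  -- the two triples combined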

The code's a bit tacky in places (by Haskell standards) but it does seem to
work pretty reliably.  It has been tested with the Hugs and GHC Haskell 
systems (see document for details) -- GHC can be used to create a 
stand-alone executable.

For anyone who tries to run this, I draw your attention to the Haskell 
environment descriptions in section 4.3 of the document: when my son 
test-installed it he didn't read them, and consequently some of the support 
libraries were not located.

As it stands, it doesn't do much other than testing for graph equivalence 
(allowing for differing bnode identifiers) and merging graphs (renaming 
bnodes as needed to avoid introducing any additional entailments).  I'm 
currently working on designing some modules for doing simple queries and 
inferences on the RDF data.
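For the curious, here is a second self-contained sketch (again using an
invented toy representation, not the Swish one) of roughly what those two
operations mean: an equivalence test that ignores the particular bnode
labels used, and a merge that renames clashing bnodes so that no accidental
co-references -- and hence no extra entailments -- are introduced.  The
equivalence test below is a naive brute-force search over label bijections,
fine for tiny graphs but certainly not how Swish does it:

module Main where

import Data.List ( nub, sort, permutations )

-- Same toy representation as in the earlier sketch.
data Node = URI String | Lit String | Blank String
    deriving (Eq, Ord, Show)

type Triple = (Node, Node, Node)
type Graph  = [Triple]

-- Blank node labels occurring in a graph.
bnodes :: Graph -> [String]
bnodes g = nub [ l | (s,p,o) <- g, Blank l <- [s,p,o] ]

-- Apply a (simultaneous) bnode relabelling to a graph.
relabel :: [(String,String)] -> Graph -> Graph
relabel m = map (\(s,p,o) -> (f s, f p, f o))
  where
    f (Blank l) = Blank (maybe l id (lookup l m))
    f n         = n

-- Graph equivalence allowing for differing bnode identifiers:
-- try every bijection between the two graphs' bnode labels.
-- (Brute force: only sensible for very small graphs.)
equivalent :: Graph -> Graph -> Bool
equivalent g1 g2 =
    length b1 == length b2 &&
    any (\p -> sort (relabel (zip b1 p) g1) == sort g2) (permutations b2)
  where
    b1 = bnodes g1
    b2 = bnodes g2

-- Merge two graphs, renaming bnodes in the second that clash with the
-- first, so no accidental co-references (and no extra entailments) are
-- introduced.  (A real implementation would also make sure the new
-- labels are fresh.)
merge :: Graph -> Graph -> Graph
merge g1 g2 = nub (g1 ++ relabel renaming g2)
  where
    clashes  = [ l | l <- bnodes g2, l `elem` bnodes g1 ]
    renaming = zip clashes [ "_m" ++ show n | n <- [1 :: Int ..] ]

main :: IO ()
main = do
    let ga = [ (URI "ex:a", URI "ex:p", Blank "x") ]
        gb = [ (URI "ex:a", URI "ex:p", Blank "y") ]
    print (equivalent ga gb)   -- True: they differ only in bnode labels
    print (merge ga ga)        -- the second copy's bnode is renamed, so
                               -- the result keeps two distinct triples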

#g


-------------------
Graham Klyne
<GK@NineByNine.org>
PGP: 0FAA 69FF C083 000B A2E9  A131 01B9 1C7A DBCA CB5E
