
Re: ANNOUNCE: SW Inferencing Skeleton in Haskell (Swish)

From: Jos De_Roo <jos.deroo@agfa.com>
Date: Sun, 8 Jun 2003 12:53:59 +0200
To: Graham Klyne <gk@ninebynine.org>
Cc: RDF interest group <www-rdf-interest@w3.org>, naudts_vannoten@yahoo.com
Message-ID: <OF114230D5.B17903B6-ONC1256D3F.0038C402-C1256D3F.003BE21A@agfa.be>


Graham, this is very rdf-interesting!

Guido Naudts, who is writing his Master's thesis,
is also using Haskell successfully.
There's a current *draft* at
http://www.agfa.com/w3c/2002/02/thesis/thesis.pdf
and test cases and running code at
http://www.agfa.com/w3c/2002/02/thesis/

Guido is still very busy; here's a sample session

__   __ __  __  ____   ___
_______________________________________________
||   || ||  || ||  || ||__      Hugs 98: Based on the Haskell 98 standard
||___|| ||__|| ||__||  __||     Copyright (c) 1994-2001
||---||         ___||           World Wide Web: http://haskell.org/hugs
||   ||                         Report bugs to: hugs-bugs@haskell.org
||   || Version: December 2001
_______________________________________________

Haskell 98 mode: Restart with command line option -98 to enable extensions

Reading file "C:\Hugs98\lib\Prelude.hs":

Hugs session for:
C:\Hugs98\lib\Prelude.hs
Type :? for help
Prelude> :l N3Engine
Reading file "N3Engine.hs":
Reading file "Utils.hs":
Reading file "IO.hs":
Reading file "C:\Hugs98\lib\Ix.hs":
Reading file "IO.hs":
Reading file "Utils.hs":
Reading file "TripleData.hs":
Reading file "XML.hs":
Reading file "N3Parser.hs":
Reading file "C:\Hugs98\lib\hugs\Interact.hs":
Reading file "C:\Hugs98\lib\exts\Observe.lhs":
Reading file "N3Parser.hs":
Reading file "C:\Hugs98\lib\Array.hs":
Reading file "C:\Hugs98\lib\List.hs":
Reading file "C:\Hugs98\lib\Maybe.hs":
Reading file "C:\Hugs98\lib\List.hs":
Reading file "C:\Hugs98\lib\Array.hs":
Reading file "LoadTree.hs":
Reading file "N3Unify.hs":
Reading file "UnifIn.hs":
Reading file "TripleApi.hs":
Reading file "Subgraphs.hs":
Reading file "GenerateDB.hs":
Reading file "Transform.hs":
Reading file "ToN3.hs":
Reading file "Transform.hs":
Reading file "Subgraphs.hs":
Reading file "N3Unify.hs":
Reading file "AltsOut.hs":
Reading file "AltsIn.hs":
Reading file "C:\Hugs98\lib\Time.hs":
Reading file "C:\Hugs98\lib\Locale.hs":
Reading file "C:\Hugs98\lib\Char.hs":
Reading file "C:\Hugs98\lib\exts\IOExts.hs":
Reading file "C:\Hugs98\lib\exts\IORef.lhs":
Reading file "C:\Hugs98\lib\exts\IOExts.hs":
Reading file "C:\Hugs98\lib\Time.hs":
Reading file "SubAnons.hs":
Reading file "N3Engine.hs":

Hugs session for:
C:\Hugs98\lib\Prelude.hs
C:\Hugs98\lib\Ix.hs
IO.hs
Utils.hs
TripleData.hs
XML.hs
C:\Hugs98\lib\hugs\Interact.hs
C:\Hugs98\lib\exts\Observe.lhs
N3Parser.hs
C:\Hugs98\lib\Maybe.hs
C:\Hugs98\lib\List.hs
C:\Hugs98\lib\Array.hs
LoadTree.hs
UnifIn.hs
TripleApi.hs
GenerateDB.hs
ToN3.hs
Transform.hs
Subgraphs.hs
N3Unify.hs
AltsOut.hs
AltsIn.hs
C:\Hugs98\lib\Locale.hs
C:\Hugs98\lib\Char.hs
C:\Hugs98\lib\exts\IORef.lhs
C:\Hugs98\lib\exts\IOExts.hs
C:\Hugs98\lib\Time.hs
SubAnons.hs
N3Engine.hs
N3Engine> en5
Starting the parse ...

N3Trace:

140
Query:

{:Frans gc:granddaughter __:x. }

Solution:

{:Frans gc:granddaughter :Goedele. }

Solution:

{:Frans gc:granddaughter :Veerle. }

Solution:

{:Frans gc:granddaughter :Nele. }

Solution:

{:Frans gc:granddaughter :Ann. }

Solution:

{:Frans gc:granddaughter :Ann_Sophie. }

Solution:

{:Frans gc:granddaughter :Valerie. }

Solution:

{:Frans gc:granddaughter :Stephanie. }

Solution:

{:Frans gc:granddaughter :Louise. }

Solution:

{:Frans gc:granddaughter :Bieke. }

Solution:

{:Frans gc:granddaughter :Tineke. }

Solution:

{:Frans gc:granddaughter :Stefanie. }

Solution:

{:Frans gc:granddaughter :Lien. }

Working time:
 seconds 86 micros 0

N3Engine>
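For readers without the thesis code to hand, here is a tiny self-contained sketch (my own toy code, not Guido's engine) of what answering a query pattern such as {:Frans gc:granddaughter __:x. } against a triple store amounts to: each pattern position either names a term exactly or is a variable that binds to whatever it matches.

```haskell
-- Toy triple-pattern matching, in the spirit of the session above.
-- All names here are illustrative; this is not the N3Engine code.

data Term = URI String | Var String   -- ground terms vs. query variables
  deriving (Eq, Show)

type Triple = (Term, Term, Term)

-- A pattern term matches a data term if it is a variable,
-- or if the two terms are identical.
matchTerm :: Term -> Term -> Bool
matchTerm (Var _) _ = True
matchTerm p t       = p == t

-- One list of variable bindings per matching triple in the store.
query :: Triple -> [Triple] -> [[(String, Term)]]
query (ps, pp, po) db =
  [ bind ps s ++ bind pp p ++ bind po o
  | (s, p, o) <- db
  , matchTerm ps s, matchTerm pp p, matchTerm po o ]
  where
    bind (Var v) t = [(v, t)]
    bind _       _ = []

-- A fragment of the granddaughter data from the trace above.
granddaughters :: [Triple]
granddaughters =
  [ (URI ":Frans", URI "gc:granddaughter", URI ":Goedele")
  , (URI ":Frans", URI "gc:granddaughter", URI ":Veerle")
  , (URI ":Frans", URI "gc:granddaughter", URI ":Nele")
  ]

main :: IO ()
main = mapM_ print
  (query (URI ":Frans", URI "gc:granddaughter", Var "x") granddaughters)
```

The real engine of course does full unification and rule chasing (the "N3Trace: 140" above counts inference steps), not just one-pattern lookup.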



--
Jos De Roo, AGFA http://www.agfa.com/w3c/jdroo/


                                                                                                                          
Graham Klyne <gk@ninebynine.org>
Sent by: www-rdf-interest-request@w3.org
To: RDF interest group <www-rdf-interest@w3.org>
Subject: ANNOUNCE: SW Inferencing Skeleton in Haskell (Swish)
2003-06-06 08:05 PM





I've been working on building some tools for semantic web inferencing using
Haskell.  (Haskell is a pure functional programming language with higher-order
functions, lazy evaluation and more; see http://www.haskell.org/ )

The first phase of this has been posted on my web site at:

   http://www.ninebynine.org/Software/Intro.html#Swish  (links)
   http://www.ninebynine.org/Software/Swish-0.1.zip     (software source)
   http://www.ninebynine.org/Software/swish-0.1.html    (documentation)

This first phase is just a platform for future development, consisting of:
- A data type and functions for manipulating RDF models
   (extended to handle N3 formulae), with a simple interface for creating,
   combining, accessing and comparing graphs
- A Notation3 parser
- A basic Notation3 formatter
- A command-line utility for testing the above
- Numerous test programs
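To make the first bullet concrete, here is a minimal sketch of what such a graph interface could look like. All the names (Graph, addTriple, combine, sameGraph) are my invention for illustration, not Swish's actual API, and this combine is a naive set union that ignores the bnode renaming a real merge needs.

```haskell
-- Illustrative graph interface: create, combine, access, compare.
-- Not the Swish API; see the Swish documentation for the real thing.
import Data.List (sort, union)

data Node = URIRef String | Lit String | Blank String
  deriving (Eq, Ord, Show)

newtype Graph = Graph [(Node, Node, Node)]
  deriving Show

emptyGraph :: Graph
emptyGraph = Graph []

addTriple :: (Node, Node, Node) -> Graph -> Graph
addTriple t (Graph ts) = Graph (t : ts)

-- Naive combination: set union of the two arc lists.
combine :: Graph -> Graph -> Graph
combine (Graph a) (Graph b) = Graph (a `union` b)

arcs :: Graph -> [(Node, Node, Node)]
arcs (Graph ts) = ts

-- Order-insensitive comparison; sound only for ground graphs
-- (bnode-invariant comparison needs more machinery).
sameGraph :: Graph -> Graph -> Bool
sameGraph (Graph a) (Graph b) = sort a == sort b

main :: IO ()
main = print (sameGraph g1 g2)   -- prints True
  where
    g1 = addTriple t1 (addTriple t2 emptyGraph)
    g2 = addTriple t2 (addTriple t1 emptyGraph)
    t1 = (URIRef "ex:a", URIRef "ex:p", Lit "1")
    t2 = (URIRef "ex:a", URIRef "ex:p", Lit "2")
```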

The code's a bit tacky in places (by Haskell standards) but it does seem to
work pretty reliably.  It has been tested with the Hugs and GHC Haskell
systems (see document for details) -- GHC can be used to create a
stand-alone executable.

For anyone who tries to run this, I draw your attention to the Haskell
environment descriptions in section 4.3 of the document ... when my son
test-installed it he didn't read this and consequently some of the support
libraries were not located.

As it stands, it doesn't do much other than testing for graph equivalence
(allowing for differing bnode identifiers) and merging graphs (renaming
bnodes as needed to avoid introducing any additional entailments).  I'm
currently working on designing some modules for doing simple queries and
inferences on the RDF data.
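The merging side of that bnode discipline can be sketched in a few lines: rename each graph's blank nodes apart before taking the union, so a label shared by accident never becomes shared by reference. This is toy code of mine, not Swish's implementation (which also handles the harder, comparison side).

```haskell
-- Sketch of entailment-preserving merge: rename bnodes apart, then union.
-- Representation and names are illustrative only.

data Node = URIRef String | Lit String | Blank String
  deriving (Eq, Ord, Show)

type Triple = (Node, Node, Node)

-- Prefix every blank-node label; URIs and literals pass through.
renameBlanks :: String -> [Triple] -> [Triple]
renameBlanks pre = map rn
  where
    rn (s, p, o) = (r s, r p, r o)
    r (Blank b)  = Blank (pre ++ b)
    r n          = n

-- Merge = concatenation after renaming the two graphs' bnodes apart,
-- so _:x in g1 and _:x in g2 stay distinct nodes.
merge :: [Triple] -> [Triple] -> [Triple]
merge g1 g2 = renameBlanks "a_" g1 ++ renameBlanks "b_" g2

main :: IO ()
main = mapM_ print (merge g1 g2)
  where
    -- the same label _:x in two source graphs denotes different nodes
    g1 = [(Blank "x", URIRef "foaf:name", Lit "Alice")]
    g2 = [(Blank "x", URIRef "foaf:name", Lit "Bob")]
```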

#g


-------------------
Graham Klyne
<GK@NineByNine.org>
PGP: 0FAA 69FF C083 000B A2E9  A131 01B9 1C7A DBCA CB5E
Received on Sunday, 8 June 2003 13:14:25 GMT

This archive was generated by hypermail 2.2.0+W3C-0.50 : Monday, 7 December 2009 10:51:59 GMT