
SPARQL & memory use

From: Danny Ayers <danny.ayers@gmail.com>
Date: Sun, 7 Feb 2010 21:30:52 +0100
Message-ID: <1f2ed5cd1002071230r112dfc8rc6d719bd3b2eb6c1@mail.gmail.com>
To: Semantic Web <semantic-web@w3.org>
A little while ago I was convinced I should learn Scala (thanks Reto)
- not there yet by any means, but I've noticed a few things along the way.
If you use Jena, try Scala - you have the same lib stuff available but
in a syntax that's a lot friendlier than Java.

Already I digress.

So my server, on which I run a blog, is falling over on a daily basis
with 1GB immediate mem (+2GB swap) available (and a lousy admin).
Today, like a fool, I coded a SPARQL endpoint for it.

Here's the rub I want to rub against you - to get even moderately
general queries I had to say -Xmx2048m -Xmn1024m, which damn near
drove my local laptop (with 4GB available) into the ground.

Which is where I could use your expertise.

Because I was working with only a tiny model (about 40MB, most of
it literals), I thought I could keep it in memory and query away. D'oh!

Would using, e.g., a MySQL-backed model help here? Or something else?
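(For comparison, a disk-backed store like Jena TDB keeps the triples on disk and only pulls the working set into memory, so the heap doesn't have to hold the whole model. A minimal sketch, assuming the Jena and TDB jars are on the classpath - the directory path is just illustrative:)

```scala
// Sketch: swap the in-memory model for a disk-backed Jena TDB store.
// Triples live on disk; only the working set is cached in memory.
import com.hp.hpl.jena.tdb.TDBFactory
import com.hp.hpl.jena.query.{QueryFactory, QueryExecutionFactory}

object TdbSketch {
  def main(args: Array[String]) {
    // Illustrative path - point this at a real directory on the server.
    val dataset = TDBFactory.createDataset("/tmp/gradino-tdb")
    val model = dataset.getDefaultModel

    val query = QueryFactory.create(
      "SELECT ?s ?p ?o WHERE { ?s ?p ?o } LIMIT 10")
    val qexec = QueryExecutionFactory.create(query, model)
    try {
      val results = qexec.execSelect()
      while (results.hasNext) println(results.next())
    } finally {
      qexec.close()
      dataset.close()
    }
  }
}
```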

What strategies (and code?) do we have to detect a memory-munching query?
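(One cheap guard I can think of: parse each incoming query with ARQ and clamp its LIMIT before executing, so a bare "SELECT * WHERE { ?s ?p ?o }" can't try to materialise the whole model. A sketch, assuming Jena ARQ on the classpath - the cap value is arbitrary:)

```scala
// Sketch: cap the LIMIT on incoming SPARQL queries before execution.
import com.hp.hpl.jena.query.{Query, QueryFactory}

object QueryGuard {
  val MaxRows = 500L  // illustrative cap, tune to taste

  def clamp(sparql: String): Query = {
    val q = QueryFactory.create(sparql)
    // Query.getLimit returns Query.NOLIMIT when no LIMIT clause is present.
    if (q.getLimit == Query.NOLIMIT || q.getLimit > MaxRows)
      q.setLimit(MaxRows)
    q
  }
}
```

(This only bounds the result set, of course - a pathological join can still chew memory mid-evaluation, which is where a disk-backed store or a hard heap cap has to pick up the slack.)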

Or should I simply go on the game to raise funds for more slicehost memory?

The thing itself is here:
http://dannyayers.com

(already set to clear the decks every hour)

Source is near here:
http://hyperdata.org/wiki/wiki/Gradino

Cheers,
Danny.

-- 
http://danny.ayers.name
Received on Sunday, 7 February 2010 20:31:25 GMT
