Re: Presenting on JSON-LD and MongoDB at NoSQL Now!

Sandro, thanks for the feedback!

...

On Aug 20, 2012, at 9:46 AM, Sandro Hawke <sandro@w3.org> wrote:

> On 08/18/2012 09:38 PM, Gregg Kellogg wrote:
>> I'm presenting on the use of JSON-LD with MongoDB to enable Single Page Applications at the NoSQL Now! conference in San Jose this week.
>> 
>> My slides are available on SlideShare [1]. There is plenty of time for updates if people have any suggestions.
>> 
>> Gregg Kellogg
>> gregg@greggkellogg.net
>> 
>> [1] http://www.slideshare.net/gkellogg1/jsonld-and-mongodb
> 
> Good work, very clear.
> 
> Minor comments:
> 
>    slide 14 s/hight/height/
> 
>    slide 21 - really, we call it "chaining" instead of "nesting"? 
> Nesting seems like a better term for this structure than chaining; or 
> maybe I'm misunderstanding it.

Nesting is the correct term, as used in the syntax spec; I had a brain freeze :P
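For anyone following along without the slides, here's a minimal, illustrative example of that nesting, where the object of a property is embedded directly rather than referenced by @id (the data is made up, not taken from the deck):

```ruby
# A nested JSON-LD document, expressed as a Ruby hash: the "knows" value
# is an embedded node object instead of a bare @id reference.
nested = {
  '@context' => {
    'name'  => 'http://xmlns.com/foaf/0.1/name',
    'knows' => 'http://xmlns.com/foaf/0.1/knows'
  },
  '@id'   => 'http://example.org/people/gregg',
  'name'  => 'Gregg Kellogg',
  'knows' => {
    '@id'  => 'http://example.org/people/sandro',
    'name' => 'Sandro Hawke'
  }
}
```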

>    slide 24 - personally, I'd rather see that slide titled "Turtle 
> Mapping", since as far as I'm concerned JSON LD (at least the parts 
> shown here) is RDF, just like Turtle.  So what this slide is really 
> showing is that this JSON-LD text can be programmatically transformed 
> into other RDF serializations.     But maybe you want to distance 
> JSON-LD from RDF.

No, not my intent. However, much of the audience may not be so familiar with RDF, and JSON-LD is intended to be usable on its own, outside of the RDF ecosystem. But I can talk through that; I'll take your suggestion of "Turtle Mapping".

> Will you make the point verbally that one could easily use JSON-LD or 
> Turtle or RDF/XML over the wire, and if you're using a library on the 
> client side, you'll never know the difference?    Or would that just 
> confuse this audience?

I think that when talking about REST, which I do, content negotiation is a reasonable topic. In fact, in my specific use, the REST endpoints return either HTML or JSON-LD, depending on the Accept header; they could also return RDF/XML or Turtle (most other services I've written are agnostic about this, and just provide whatever formats the library has available).
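In essence, the endpoint just inspects the Accept header and picks a representation. A sketch of that decision in plain Ruby (the resource shape and helper name are illustrative, not from the actual application):

```ruby
require 'json'

# Given an Accept header value and a resource hash, choose a media type
# and serialized body. A real Sinatra app would use request.accept and
# could add RDF/XML or Turtle branches the same way.
def negotiate(accept_header, resource)
  if accept_header.to_s.include?('application/ld+json')
    ['application/ld+json', resource.to_json]
  else
    ['text/html', "<h1>#{resource['name']}</h1>"]
  end
end

type, body = negotiate('application/ld+json',
                       '@id' => '/players/1', 'name' => 'Example')
```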

This talk is really motivating Single Page Web Apps, where JSON is commonly used for state transfer. But I think I can add the content-negotiation bit in.

> Will you make the point that while MongoDB makes a splendid back-end 
> store for this, there are many other stores (SPARQL ones) that are more 
> optimized for graph-based data, and might be more performant for some 
> applications?      (I'm still waiting for someone to put a SPARQL front 
> end to a Mongo setup like this, so we can have some proper benchmarking.)

Frankly, what I'm outlining is not really optimized for SPARQL. Rob Vesse did something like this [1], which Pius Uzamere made use of in his Ruby RDF/Mongo repository adaptor [2]. For performance, you're much better off storing triples as documents than full subject definitions.
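To make the contrast concrete, here are the two document shapes side by side; the field names are illustrative, not taken from rdf-mongo:

```ruby
# One document per triple: easy to index on s/p/o for SPARQL-style
# pattern matching, but a subject is scattered across many documents.
triple_doc = {
  's' => 'http://example.org/players/1',
  'p' => 'http://xmlns.com/foaf/0.1/name',
  'o' => 'Alice'
}

# One document per subject definition: a single fetch returns everything
# about the subject, which suits returning JSON-LD to a client directly.
subject_doc = {
  '@id'   => 'http://example.org/players/1',
  'name'  => 'Alice',
  'knows' => [{ '@id' => 'http://example.org/players/2' }]
}
```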

This application is tuned more for returning collections of subject definitions within fairly narrow query bounds (resources relevant to a particular point in game play) than for general querying. In fact, I've considered that an evolution of the application I'm working on may well move to a generic SPARQL store with good JSON-LD support, such as Stardog.

I think it's fair to point out the limitations of this system, and that the same application could be supported, with fairly minimal back-end changes, using such a triple store.

> If I were not a Linked Data fan, I imagine I'd be wondering what JSON-LD 
> brings to this application?  Why not just MongoDB all the way?        
> Actually, I'm a little confused -- are you using an existing MongoDB 
> REST interface or did you write your own JSON-LD specific one?

I actually have a blog post I'll put up later today or tomorrow that describes the application more, and how it relates to both JSON-LD and RDFa. Suffice it to say that, when trying to represent data in a wiki, getting structure out first as RDFa is important. JSON-LD then becomes a vehicle for aggregating that information and providing it to other applications in an efficient way.

The back end is actually written using Ruby/Sinatra, which transforms the HTTP queries into Mongo queries and, of course, deals with the content negotiation. There is also some necessary transformation from BSON to JSON-LD (mostly @id fields, and date objects which need string representations), so using Mongo directly would be difficult.
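That BSON-to-JSON-LD cleanup step might look something like the following sketch: map the Mongo _id onto an @id IRI, and render Time values as ISO 8601 strings. The field names and IRI scheme here are assumptions for illustration, not the real application's:

```ruby
require 'time'

# Convert a Mongo-style document hash into a JSON-LD-friendly hash:
# '_id' becomes an '@id' IRI, and Time values become ISO 8601 strings.
def bson_to_jsonld(doc, base = 'http://example.org/resource/')
  doc.each_with_object({}) do |(key, value), out|
    if key == '_id'
      out['@id'] = "#{base}#{value}"
    else
      out[key] = value.is_a?(Time) ? value.utc.iso8601 : value
    end
  end
end
```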

It's also an interesting use of OWL and JSON-LD to create a client-based CMS; but that's another story.

The site should become generally available in a couple of months. The code which implements it should be open sourced around the same time.

>        -- Sandro

Gregg

[1] http://www.dotnetrdf.org/blogitem.asp?blogID=35
[2] https://github.com/pius/rdf-mongo

Received on Monday, 20 August 2012 17:12:11 UTC