
Re: Why Framing and Normalization

From: Dave Longley <dlongley@digitalbazaar.com>
Date: Tue, 06 Sep 2011 15:01:30 -0400
Message-ID: <4E666E0A.3010109@digitalbazaar.com>
To: public-linked-json@w3.org
On 09/03/2011 12:19 PM, Ivan Herman wrote:
> On Sep 3, 2011, at 17:46 , Dave Longley wrote:
>> Despite some measure of reinventing the wheel (or at the very least, reusing parts of a similar car), we decided that it might be a bit much to require an entire SPARQL stack in order for people to use JSON-LD framing. This is in part because the framing feature itself can stand alone fairly easily, but also because we expect SPARQL to be a wholly foreign technology to most JSON developers, with a much higher learning curve than building a frame that looks just like the objects you work with in your code.
>> There is also something to be said for not requiring a SPARQL implementation in JavaScript or in the web service from which you acquire your data. For example, in PaySwarm, we built a WordPress plugin to serve web pages that contain listings of assets (in this case, blogged recipes) for sale, marked up in RDFa. There is also other data marked up in RDFa on those pages that the PaySwarm purchase process is entirely uninterested in, but neither the PaySwarm code nor the WordPress site needs a SPARQL stack in order for the PaySwarm code to obtain the data, frame just the listing, and then use it when performing the purchase of an asset.
>> I don't think that we should require an understanding of SPARQL or access to an implementation of it in order for people to use the framing feature in JSON-LD. Using SPARQL might not sound like a big deal to the RDF community, but I suspect that it would be a tougher sell in the JSON community. However, none of this means that it isn't a good idea to add a framing feature to SPARQL.
> I understand all this. And I did _not_ mean to use SPARQL literally or through an implementation, sorry if I was not clear. However, my feeling is that the concepts of SPARQL might be useful and maybe even better in framing (sic!) the whole idea and the algorithms. For example, SPARQL gives you a separation between the patterns for the triples on the one hand and what you construct out of them on the other; that does not look like a horribly complex concept to me, and it may be a better way of describing what we need to achieve. The concept of an explicit variable is useful in this respect; it is, in some way, hidden in the current frame language. Again, I do not think this is something of huge complexity (after all, many developers have been exposed to SQL, which operates with similar concepts). Also, describing the algorithm in terms of graph matching on the pattern and then using the variables to construct a frame does not seem very complicated. Of course, there are a number of SPARQL features that, at least at first, we would not have (OPTIONAL, UNION, FILTER, GRAPH, FROM, etc.), which might keep things simple. And, I presume, the concepts would need to be cast into a JSON syntax.
> I realize that I am hopelessly influenced by my background and, from where I stand, the current frame algorithm seems to be way more complicated than describing the same effect with graph matching and construction. But that may be only me...
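To make the pattern/construct separation Ivan describes concrete, here is a toy sketch in Python, very loosely modeled on SPARQL's pattern matching and CONSTRUCT steps. All names (match, construct, the sample triples) are invented for illustration; this is not SPARQL, nor the JSON-LD framing algorithm.

```python
# Toy sketch of the SPARQL-style separation Ivan describes:
# first match a triple pattern (with explicit variables), then
# construct output objects from the resulting variable bindings.
# Every name here is hypothetical, invented for illustration.

def match(pattern, triples):
    """Yield one dict of variable bindings per matching triple.
    Variables are strings starting with '?'."""
    for triple in triples:
        binding = {}
        for part, value in zip(pattern, triple):
            if part.startswith("?"):
                binding[part] = value      # bind the variable
            elif part != value:
                break                      # constant mismatch
        else:
            yield binding

def construct(template, bindings):
    """Build output objects from a template, substituting any
    bound variables; this mirrors SPARQL's CONSTRUCT step."""
    return [{key: b.get(var, var) for key, var in template.items()}
            for b in bindings]

triples = [
    ("asset1", "title", "Recipe"),
    ("asset1", "price", "0.50"),
]
pattern = ("?s", "title", "?title")
template = {"id": "?s", "title": "?title"}

print(construct(template, match(pattern, triples)))
# → [{'id': 'asset1', 'title': 'Recipe'}]
```

The point of the sketch is only the separation of concerns: the pattern says what to find, the template says what to build, and explicit variables connect the two.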

If what you're suggesting is simply to draw on some ideas from SPARQL to 
better explain framing or to simplify the framing algorithm, that sounds 
good to me. I'm not that familiar with SPARQL, so I can't say whether 
bringing some of its concepts into the framing discussion would better 
explain it to JSON developers. JSON developers are familiar with 
objects, and I don't think the idea of filling a skeleton object with 
data is too complex a concept for them to grasp; I'm a bit more wary 
about explaining framing as "graph matching on a pattern", but if people 
end up thinking that gets the concept across better, that's fine with me.
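The "filling a skeleton object with data" idea could be sketched roughly as follows in Python. This is a deliberately simplified toy, not the real JSON-LD framing algorithm (it ignores @context, embedding, defaults, and everything else); all names and sample data are invented for illustration.

```python
# Toy sketch of "fill a skeleton object with data": the frame is a
# skeleton that looks like the objects you want back, and framing
# selects matching nodes and shapes them to the frame's keys.
# Hypothetical names throughout; not the actual framing algorithm.

def fill_frame(frame, nodes):
    """Return copies of nodes whose @type matches the frame's,
    keeping only the properties the frame's skeleton lists."""
    shaped_nodes = []
    for node in nodes:
        if node.get("@type") != frame.get("@type"):
            continue                       # frame selects by type
        shaped = {key: node[key] for key in frame if key in node}
        shaped_nodes.append(shaped)
    return shaped_nodes

nodes = [
    {"@type": "Listing", "title": "Blogged recipe", "price": "0.50",
     "comments": "Looks tasty"},
    {"@type": "Comment", "text": "Unrelated page data"},
]
# The frame looks just like the object you want to work with:
frame = {"@type": "Listing", "title": None, "price": None}

print(fill_frame(frame, nodes))
# → [{'@type': 'Listing', 'title': 'Blogged recipe', 'price': '0.50'}]
```

A JSON developer reads the frame as "give me Listing objects shaped like this", which is the intuition behind the PaySwarm example above: frame just the listing and ignore the rest of the page's data.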

Dave Longley
Digital Bazaar, Inc.
Received on Tuesday, 6 September 2011 19:01:55 UTC
