
Re: [JSON] market segments

From: Nathan <nathan@webr3.org>
Date: Thu, 17 Mar 2011 04:22:19 +0000
Message-ID: <4D818C7B.4080503@webr3.org>
To: Sandro Hawke <sandro@w3.org>
CC: Thomas Steiner <tsteiner@google.com>, RDF Working Group <public-rdf-wg@w3.org>
Sandro Hawke wrote:
> On Thu, 2011-03-17 at 00:20 +0100, Thomas Steiner wrote:
>> Hello all,
>> While I love "GRDDL for JSON" as a name, I'm still not sure it is a
>> generalizable functionality that would be straightforward to offer.
>> As I said today on IRC, isn't it one-offs for each and every single
>> JSON data provider? Isn't the objective of this WG to come up with
>> something that JSON data providers would use in the first place? We
>> can still provide (or document how to provide)
>> mappings/goggles/"GRDDL", but in my opinion this shouldn't be our
>> primary goal. Lots of questions to be discussed at the F2F or earlier.
>> Maybe I simply got the concept wrong, though. Thanks for corrections
>> in either case.
> Thinking about it, I'm seeing a spectrum of the folks who send data
> using JSON and their relationship to RDF and this effort.  In order of
> least-RDF-friendly to most-RDF-friendly.....
> Level 1 - Folks who publish json data which cannot be easily mapped to
> RDF and are not willing to do anything about it.  If a translator-to-RDF
> is done for their data, it will have to be complex (probably using a
> Turing-complete language) and done without their participation.
> Level 2 - Folks who publish json data which has a fairly clean mapping
> to RDF (basically namespaces prefixes and property types), but are still
> not willing to do anything at all to help RDF consumers.
> Level 3 - Folks who are willing to help with the mapping, and have the
> mapping information served from their website in a way that machines can
> find.  But they don't want non-RDF json users to see any hint of RDF in
> their actual data stream.
> Level 4 - Folks who are willing to add a few little flags to their json
> data, such as the URL of the mapping-to-RDF declaration.
> Level 5 - Folks who are willing to include all the necessary information
> for mapping their json to RDF, so their data can be converted without
> additional dereferences.    This probably goes along with having a json
> data design that is close enough to RDF that the mapping is pretty
> trivial, so maybe they had to design their json a bit differently.
> Level 6 - Folks who are willing to make their main json feeds
> specifically designed for RDF compatibility, probably making their json
> seem a little odd to people with a feel for json.   These are the folks
> that I think the charter requires us to address; it's debatable how much
> we should cater to levels 1-5.   (I think helping levels 3-5 is
> desirable, but might not be practical.)
> Level 7 - Folks who are willing to provide their data in RDFa, RDF/XML,
> or Turtle.   These folks don't actually need us to do anything in the
> json space.
> I think there are also some different segments among developers (json
> users):
> Group A - Developers consuming data from a relatively small number of
> sites, willing to do custom coding for each site (eg hardcoding how to
> read news streams from twitter *and* facebook).   These folks probably
> want very comfortable and obvious json they can use without libraries.
> That's what they get now, and they won't like the json getting messier.  I
> don't think we can offer anything to this group; our job is to avoid making
> their lives difficult or annoying them.
> Groups B+C - Developers who do not want to do custom coding per data
> source.  They want to see those news streams in some common interface,
> so they don't need special code for each one.  That sounds like RDF to
> me.   (These developers are also working on behalf of users who don't
> want to hire more developers for each data site.)  Now, this group can
> be divided into:
>  -- Group B, Developers who want the generality of RDF but don't want to
> use any sort of RDF API.  For them we could try to make a format the
> Level-6 producers will publish and which Group-B developers can consume
> without complex code or a library/API.
>  -- Group C, Developers who want the generality of RDF, but are willing
> to use an RDF API.   What they really need is for their libraries to
> support Level-N sources, for as low an N as possible.  They would be
> most happy if they have RDF API access to even Level-1 sources, without
> having to do any work.  (This could potentially be done with some sort
> of open repository of javascript modules which translate various
> publishers' json to RDF.  I think level-3 is as low as this WG can go,
> though.)
> Looking at this matrix of 21 boxes (in my head), I'm seeing two sweet
> spots, where we could do some good: 6-B and 3/4/5/6-C.  I'm worried that
> 6-B is over-constrained (anything we do will annoy Group A -- we went
> through this with RDF and XML), so I think our best bet is probably
> 3/4/5/6-C.     At least, that's how it looks to me right now.   This is
> the space of standardizing annotations, occurring inline or out-of-band,
> which enable deterministic, efficient conversion of "normal" json into
> RDF triples, by a library.    
> Note that for this space (1) we don't need to serialize every kind of
> RDF graph, and (2) the json is "pretty" (but not RDFy) for Group-A
> folks, but can be converted by standard code, eventually built into
> various libraries, to RDF for Group-C folks.  Group B may just be out of
> luck any way we slice it.

Fantastic summary again, Sandro :) I agree completely: 3/4/5/6-C.
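To make the 3/4/5/6-C space concrete for myself, here's a rough sketch of how a Level-4 style feed plus an out-of-band mapping could be converted deterministically by a generic library. Every name here (the "@mapping" flag, the mapping shape, the URIs) is invented for illustration, not a proposed syntax:

```javascript
// A "normal" json feed carrying one extra flag -- a hypothetical
// "@mapping" key pointing at a mapping-to-RDF declaration (Level 4):
const feed = {
  "@mapping": "http://example.org/mappings/news.json", // hypothetical flag
  "id": "http://example.org/item/1",
  "title": "Hello",
  "author": "http://example.org/people/alice"
};

// The (separately fetched) mapping declaration: property name ->
// predicate URI, plus whether the value is a resource or a literal.
const mapping = {
  "title":  { predicate: "http://purl.org/dc/terms/title",   type: "literal" },
  "author": { predicate: "http://purl.org/dc/terms/creator", type: "uri" }
};

// Deterministic conversion of the feed into N-Triples lines:
function toTriples(obj, map) {
  const subject = "<" + obj.id + ">";
  const triples = [];
  for (const [key, rule] of Object.entries(map)) {
    if (!(key in obj)) continue;
    const object = rule.type === "uri"
      ? "<" + obj[key] + ">"
      : JSON.stringify(obj[key]); // JSON string escaping ~= N-Triples literal
    triples.push(subject + " <" + rule.predicate + "> " + object + " .");
  }
  return triples;
}
```

The point being that once the mapping is standardized, the function above is generic library code, not per-site custom code.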

However, I would like to add that, imho, there is a growing need to 
standardize something separate, like jTriples or Talis' RDF/JSON, for 
Level 7 above. Ideally, if we produce text/ntriples or text/nquads, it 
would be nice to have a JSON production of that standardized and 
registered as application/ntriples+json or suchlike; it could be an easy 
win with minimal specification.
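For the record, a minimal sketch of what such a JSON production could look like, loosely following the subject -> predicate -> array-of-typed-objects shape that Talis' RDF/JSON already uses (the media type name is only my suggestion above, nothing registered):

```javascript
// Triples as plain records (subject/predicate URIs, typed object):
const triples = [
  { s: "http://example.org/item/1",
    p: "http://purl.org/dc/terms/title",
    o: { value: "Hello", type: "literal" } }
];

// Group them into the RDF/JSON-like shape:
// { subjectURI: { predicateURI: [ { value, type }, ... ] } }
function toRdfJson(triples) {
  const out = {};
  for (const t of triples) {
    out[t.s] = out[t.s] || {};
    (out[t.s][t.p] = out[t.s][t.p] || []).push(t.o);
  }
  return out;
}

// JSON.stringify(toRdfJson(triples)) would then be the body served
// as, say, application/ntriples+json.
```

That really is all the specification such a format would need, which is why it seems like an easy win to me.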


Received on Thursday, 17 March 2011 04:23:22 UTC
