- From: ☮ elf Pavlik ☮ <perpetual-tripper@wwelves.org>
- Date: Mon, 13 Apr 2015 13:03:09 +0200
- To: Erik Wilde <dret@berkeley.edu>, "public-socialweb@w3.org" <public-socialweb@w3.org>
- CC: James M Snell <jasnell@gmail.com>
- Message-ID: <552BA26D.5010203@wwelves.org>
On 04/13/2015 05:58 AM, Erik Wilde wrote:
> hello elf.

Hello Erik,

> On 2015-04-12 13:47, ☮ elf Pavlik ☮ wrote:
>> While spec defines how to expand just few common prefixes, it also uses
>> in example *gr:Location* not included in this list.
>> http://www.w3.org/TR/activitystreams-core/#fig-an-object-that-is-both-a-place-and-a-gr-location
>
> i am not arguing that the spec does not map what's in the spec. i am
> pointing out that per the current spec mapping cannot be done in all
> cases because no context is required for extensions, and JSON-LD is not
> a required processing model.

To my understanding:

* NOT required: http://www.w3.org/TR/json-ld-api/
* required: http://www.w3.org/TR/activitystreams-core/#compact-iris
  where handling of JSON-LD Compact IRIs requires information from the @context.
* required: http://www.w3.org/TR/activitystreams-core/#naturalLanguageValues
  similarly for Natural Language Values: "The default language for document or
  an individual object can be established using the JSON-LD @language keyword
  within a @context definition"

Last but not least, I see a JSON-LD @context as providing a standardized way
to address the *describe data using URIs* requirement from the WG charter:

"A transfer syntax for social data such as activities (such as status updates)
should include at least the ability to describe the data using URIs in an
extensible manner, time-stamping, and should include a serialization
compatible with Javascript (JSON) and possibly JSON-LD."
http://www.w3.org/2013/socialweb/social-wg-charter.html#scope

Would you suggest some other, preferably standardized, ways than a JSON-LD
@context for using URIs in JSON? Especially for object *keys*
(property/predicate) and not only values (object); of course the value of
"@id" denotes the triple's subject (EAV/SPO). Since the charter states
"possibly compatible with JSON-LD", I think we should provide strong technical
reasoning which clearly explains what makes it not possible to stay compatible
with JSON-LD!

>>> since for now we're still saying we're "JSON-based" and should also
>>> demonstrate what this means when you're *not* operating in RDF-land, we
>>> have to be clear what it means when people are not as careful as you,
>>> and what it means for these two different user groups (JSON users and
>>> RDF users) to interact via AS2.
>> Once again, IMO to make it possible we really need to come up with smart
>> strategy for using JSON-LD contexts
>
> mostly, i am talking about that we need to come up with a robust way to
> define what extensions have to look like, and what consumers are
> supposed to report to applications when they encounter one that is not
> based on JSON-LD.

What kinds of JSON data not 'based on' JSON-LD do we need to support?

* no JSON-LD @context at all
** publisher can't use Compact IRIs (PROPOSAL: just use full IRIs? similar to
   https://tools.ietf.org/html/rfc5988#section-5.3)
** publisher can't set a default @language (TODO propose solutions)
* uses terms not mapped to URIs in the normative/recommended JSON-LD @context
** TODO explain what happens when a consumer runs JSON-LD algorithms,
   e.g. expansion
* uses arrays of arrays
** TODO explain what happens when a consumer chooses to run JSON-LD
   algorithms, e.g. expansion

If you can provide some sample data we could make this conversation more
concrete and include that data in our test suites. As a starting point, I
sketched just below the kind of document I have in mind.
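(Just a rough sketch; the extension property names below are invented for
illustration and don't come from any published vocabulary.)

  {
    "@type": "Note",
    "content": "My cat learned a new trick!",
    "trickDifficulty": 7,
    "http://vocab.example.org/pets#breed": "Norwegian Forest Cat"
  }

It has no @context at all, one extension term ("trickDifficulty") not mapped
to any URI, and one extension key which already is a full IRI, in the RFC 5988
style mentioned above. The open question for the test suite (the TODOs above)
is then what a consumer which chooses to run JSON-LD algorithms such as
expansion should report to the application for those two extension keys.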
>>> you can think of this as the exact equivalent of pre-infoset XML: people
>>> could be disciplined and use it in the way later formalized by XMLNS,
>>> but pre-XMLNS, it was ok for people to produce non-XMLNS XML, and the
>>> question then is what's that supposed to mean. we keep dodging that
>>> question, and i think it's going to bite us.
>> I must admit not knowing so much about history around XML technologies.
>
> that's ok, but we're in exactly the same spot. so for those old enough
> to remember XML, here's where we are:
>
> - XMLNS was a useful idea and introduced its own abstract layer, the
> infoset. pretty much all XML techs in existence today are strictly
> speaking not XML techs, but infoset techs.
>
> - producing non-XMLNS XML (i.e., XML that cannot be parsed into an
> infoset) these days will get you rejected by almost any toolchain out
> there.
>
> - there are well-defined expectations what infoset-based techs should
> make available to applications, and that seems to work well.

Thank you for this additional elaboration. What do you see as the drawback of
what I understand to be a shift towards making XMLNS and the infoset a
requirement?

> i have yet to see any proposal how apart from magic hardcoded mapping
> rules based on JSON-LD contexts, AS2 will robustly handle extensions
> that are not conforming to JSON-LD. i guess we'll see once we have test
> cases exploring that end of the format spectrum, and making statements
> about what should be made available to applications and how.

I don't understand what you mean by "magic hardcoded mapping rules based on
JSON-LD context". I see those mappings as *machine readable* first of all,
even without supporting all of the JSON-LD Processing Algorithms. Contexts
could also evolve over time; their maintainers just need to stay very careful
not to make changes which would alter the meaning of data already published.

e.g.

{ "@context": "http://schema.org/" }

`$ curl http://schema.org/ -H "Accept: application/ld+json"`

I hope tomorrow we can take some time for a more in-depth conversation...

Cheers!
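P.S. To make the "machine readable mappings" point a bit more concrete, a
context is really just data which a consumer can dereference and use as a
simple term/prefix table, roughly like this (a minimal sketch, not the actual
normative AS2 context):

  {
    "@context": {
      "as": "http://www.w3.org/ns/activitystreams#",
      "Note": "as:Note",
      "content": "as:content",
      "gr": "http://purl.org/goodrelations/v1#"
    }
  }

Even a consumer which doesn't implement the full JSON-LD Processing Algorithms
can read such a table and map a term like "gr:Location" to a full URI, which
is all I mean by machine readable, nothing hardcoded.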
Received on Monday, 13 April 2015 11:04:03 UTC