- From: Sandro Hawke <sandro@w3.org>
- Date: Wed, 03 Apr 2013 10:57:13 -0400
- To: W3C RDF WG <public-rdf-wg@w3.org>
A few issues came up during my review, related to handling of typed values.
My understanding is that RDF allows typed values to be expressed in
canonical form, like "1.1"^^xs:double, or in non-canonical form, like
"1.10"^^xs:double. I suspect all existing RDF parsers & generators
preserve the form, keeping things like trailing zeros. On the other
hand, deeper systems like SPARQL engines often convert to canonical
form. Basically, it would be silly to require triplestores to
remember not just the actual number being stored but also arbitrary
additional data about how it was formatted, so we don't do that.
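Here's a quick Python illustration of that lexical-form-vs-value
distinction, just to make it concrete:

    # Two lexical forms of xs:double, one canonical and one not,
    # denote exactly the same value...
    assert float("1.1") == float("1.10")
    # ...but they are distinct lexical forms (distinct RDF literals):
    assert "1.1" != "1.10"
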
I think it's clear that all RDF consumers should accept input in either
canonical or non-canonical form. They may preserve things like
trailing zeros, but they should not depend on them.
With JSON, technically the syntax allows trailing zeros, etc., but JSON
parsers are going to lose that information, because they return an
actual IEEE double instead of a string for that value. So, in RDF
terms, it makes sense to treat JSON-LD like a triplestore; it's going to
lose any non-canonical form of the lexical representation of datatyped
values.
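For instance, with Python's standard json module (I'd expect most other
JSON parsers to behave the same way):

    import json
    # The parser hands back an IEEE double; the trailing zero is gone.
    print(json.loads('{"temp": 1.10}'))        # {'temp': 1.1}
    # Round-tripping through the parser canonicalizes the lexical form.
    print(json.dumps(json.loads('1.10')))      # 1.1
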
JSON-LD has two ways to express datatyped values in JSON. It can
either use native types (like 1.1) or expanded types (like { "@value":
"1.1", "@type": "http://www.w3.org/2001/XMLSchema#double" }).
Issue 1: How should RDF->JSON-LD converters handle typed values for
which there is a JSON native type and which are not in canonical form?
Should they be converted to a JSON native value, in which case
formatting (trailing zeros, etc.) is lost, or should they remain in
expanded form, so the full details remain? I prefer just using native
values,
since I think it makes the system easier to use. Basically, I think
it's fine for components to convert things to canonical form when they
can. I'd also be okay with leaving it open to implementors.
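To make the two options concrete, here's a minimal Python sketch; the
function and its behavior are just my own illustration, not anything
from the spec:

    XSD_DOUBLE = "http://www.w3.org/2001/XMLSchema#double"

    def double_to_jsonld(lexical, prefer_native=True):
        # Option A (my preference): emit a native JSON number, accepting
        # that non-canonical formatting like "1.10" comes out as 1.1.
        if prefer_native:
            return float(lexical)
        # Option B: keep the expanded form, so the exact lexical form
        # of the literal survives.
        return {"@value": lexical, "@type": XSD_DOUBLE}

With that, double_to_jsonld("1.10") comes out as 1.1, while
double_to_jsonld("1.10", prefer_native=False) keeps the original "1.10"
string intact.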
Issue 2: What about values which are going to be slightly changed by
being converted to native form, like
"1.99999999999999999999"^^xs:double? If that gets output in JSON as
1.99999999999999999999, I believe it'll get read back in as
"2"^^xs:integer. Should these be forced to
remain in expanded form, to avoid this changing of value and even type?
On this, I suggest expanded form be required if practical -- but I'm not
sure it's always clear when it's required.
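For the record, here's what happens to that particular literal in
Python; exactly what comes back depends on the serializer, e.g.
JavaScript's JSON.stringify writes 2.0 as plain 2, while Python's json
keeps the decimal point:

    import json
    # The nearest IEEE double to 1.99999999999999999999 is exactly 2.0,
    # so the value changes the moment it becomes a native JSON number.
    assert float("1.99999999999999999999") == 2.0
    print(json.dumps(float("1.99999999999999999999")))   # 2.0
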
These issues basically boil down to what kind of fidelity is to be
guaranteed by round-tripping RDF->JSON-LD->RDF.
-- Sandro