- From: Andy Seaborne <andy.seaborne@epimorphics.com>
- Date: Thu, 04 Apr 2013 13:57:41 +0100
- To: public-rdf-wg@w3.org
On 03/04/13 15:57, Sandro Hawke wrote:
> A few issues came up during my review, related to handling of typed values.
>
> My understanding is RDF allows typed values to be expressed in canonical
> form, like "1.1"^^xs:double or non-canonical form like
> "1.10"^^xs:double. I suspect all existing RDF parsers & generators
> preserve the form, keeping things like trailing zeros. On the other
> hand, deeper systems like SPARQL engines often convert to canonical
> form. Basically, it would be silly to require triplestores to
> remember not just the actual number being stored but also arbitrary
> additional data about how it was formatted, so we don't do that.
>
> I think it's clear all RDF consumers should accept input in either
> canonical or non-canonical form. They may preserve things like
> trailing zeros, but they should not depend on them.
>
> With JSON, technically the syntax allows trailing zeros, etc, but JSON
> parsers are going to lose that information, because they return an
> actual IEEE double instead of a string for that value. So, in RDF
> terms, it makes sense to treat JSON-LD like a triplestore; it's going to
> lose any non-canonical form of the lexical representation of datatyped
> values.
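(A quick Python sketch of that point — the property name is made up; once
json.loads has run, the non-canonical lexical form is gone:)

```python
import json

# The parser hands back an IEEE double, not the original string,
# so the trailing zero in 1.10 cannot survive the round trip.
doc = '{"value": 1.10}'
parsed = json.loads(doc)
print(parsed["value"])     # 1.1
print(json.dumps(parsed))  # {"value": 1.1}
```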
>
> JSON-LD has two ways to express datatyped values in JSON. It can
> either use native types (like 1.1) or expanded types (like { "@value":
> "1.1", "@type": "http://www.w3.org/2001/XMLSchema#double" }).
>
> Issue 1: How should RDF->JSON converters handle typed values for which
> there is a JSON native type but which are not in canonical form? Should
> they be converted to a JSON native value, in which case formatting
> (trailing zeros, etc.) is lost, or should they remain in expanded form,
> so the full details remain? I prefer just using native values,
> since I think it makes the system easier to use. Basically, I think
> it's fine for components to convert things to canonical form when they
> can. I'd also be okay with leaving it open to implementors.
>
> Issue 2: What about values which are going to be slightly changed by
> being converted to native form, like
> "1.99999999999999999999"^^xs:double? If that gets output in JSON as
> 1.99999999999999999999, I believe it'll get read back in as "2"^^xs:int.
Minor:
"1.99999999999999999999"^^xs:double isn't a double, or at least not the
double you may think it is. It is outside the precision range for
doubles, which is 53 bits. As JSON effectively uses doubles, the two do
roughly align on xs:double.
http://www.w3.org/TR/xmlschema11-2/#double
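(A sketch of that precision loss in Python, whose floats are IEEE
doubles: the 21-digit lexical form rounds to exactly 2.0 on parsing.)

```python
import json

# 53 bits of significand give roughly 15-17 decimal digits, so the
# nearest representable double to 1.99999999999999999999 is 2.0.
v = json.loads("1.99999999999999999999")
print(v)         # 2.0
print(v == 2.0)  # True
```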
A decimal is arbitrary precision in XSD:
"1.99999999999999999999"^^xs:decimal
The minimum required precision for decimals is 18 digits.
and also
http://www.w3.org/TR/xsd-precisionDecimal/
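(By contrast, an arbitrary-precision decimal type keeps the lexical value
intact — a sketch using Python's decimal module as a stand-in:)

```python
from decimal import Decimal

# Decimal parses the lexical form exactly; no rounding to 2 occurs.
d = Decimal("1.99999999999999999999")
print(d)       # 1.99999999999999999999
print(d == 2)  # False
```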
> Should these be forced to
> remain in expanded form, to avoid this changing of value and even type?
> On this, I suggest expanded form be required if practical -- but I'm not
> sure it's always clear when it's required.
>
> These basically boil down to what kind of fidelity is to be guaranteed
> by round-tripping RDF->JSONLD->RDF.
which is a very good question
Andy
>
> -- Sandro
Received on Thursday, 4 April 2013 12:58:13 UTC