- From: Josh Mandel <Joshua.Mandel@childrens.harvard.edu>
- Date: Tue, 8 Nov 2011 19:29:59 -0800
- To: public-linked-json@w3.org
I mentioned to Dave on IRC that I was seeing some not-huge graphs that are very slow to frame. He pointed out that this is likely because they're slow to normalize, and the current framing algorithm pre-normalizes. I don't know if this is a common use case, but for me framing is important while normalization is less so. I promised to send a complete example of this kind of graph, so here goes:

---

https://github.com/jmandel/jsonld.js/tree/slowness/smart

Here's a JSON document representing ~1800 triples that takes 7 seconds to normalize. Lots of blank nodes :-)

    $ node smart-test.js
    8 Nov 19:15:51 - Normalization time: 7021
    8 Nov 19:15:51 - Framing time: 7189
    8 Nov 19:15:51 - 1876 Triples time: 6585

---

Incidentally, the sample data come from an RDF ontology I'm building for medical data (lab results in this case). All the data in this example are synthetic and/or anonymized. (More info at http://smartplatforms.org).

-Josh
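[Editor's note: a minimal sketch of the kind of timing harness the output above suggests. The `timeIt` helper and `expensiveGraphOp` stand-in are hypothetical, not from Josh's `smart-test.js`; the real script would pass the JSON-LD document to the library's normalize and frame calls where the stand-in workload appears here.]

```javascript
// Hypothetical timing harness: measure wall-clock time for a graph
// operation and log it in a "<label> time: <ms>" format similar to
// the output shown above.
function timeIt(label, fn) {
  const start = Date.now();
  const result = fn();               // run the operation synchronously
  const elapsed = Date.now() - start;
  console.log(`${label} time: ${elapsed}`);
  return result;
}

// Stand-in for an expensive graph operation (e.g. normalizing or framing
// a blank-node-heavy JSON-LD document); here just a busy loop so the
// sketch runs without any library installed.
function expensiveGraphOp(n) {
  let acc = 0;
  for (let i = 0; i < n; i++) acc += i;
  return acc;
}

timeIt('Normalization', () => expensiveGraphOp(1e6));
```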
Received on Wednesday, 9 November 2011 08:30:21 UTC