
Re: [whatwg] asynchronous JSON.parse

From: Robin Berjon <robin@w3.org>
Date: Fri, 08 Mar 2013 10:44:50 +0100
Message-ID: <5139B312.50604@w3.org>
To: Tobie Langel <tobie.langel@gmail.com>
Cc: whatwg@whatwg.org, David Rajchenbach-Teller <dteller@mozilla.com>
On 07/03/2013 23:34 , Tobie Langel wrote:
> In which case, isn't part of the solution to paginate your data, and
> parse those pages separately?

Assuming you can modify the backend. Also, data doesn't necessarily have 
to get all that bulky before you notice the cost on a somewhat sluggish 
device.

> Even if an async API for JSON existed, wouldn't the perf bottleneck
> then simply fall on whatever processing needs to be done afterwards?

But for that part you're in control of whether your processing is 
blocking or not.

> Wouldn't some form of event-based API be more indicated? E.g.:
>
> var parser = JSON.parser();
> parser.parse(src);
> parser.onparse = function(e) { doSomething(e.data); };

I'm not sure how that snippet would be different from a single callback API.

There could possibly be value in an event-based API if you could set it 
up with a filter, e.g. JSON.filtered("$.*").then(function (item) {}); 
which would call you for every item in the root object. Getting an event 
for every information item that the parser processes would likely flood 
you in events.
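To make that concrete, here's a rough sketch of what such a filtered API might feel like. JSON.filtered() doesn't exist anywhere; this emulation (filteredParse is a name I made up) still does one blocking JSON.parse, then delivers the root object's items one per turn of the event loop, so only the delivery is spread out, not the parsing itself:

```javascript
// Hypothetical emulation of a filtered, event-per-item parse API.
// Note: the parse itself is still synchronous here; a real async
// API would have to move that work off the main thread.
function filteredParse(src, onItem) {
  return new Promise(function (resolve) {
    var root = JSON.parse(src);          // still a blocking parse
    var entries = Object.entries(root);  // "$.*": root-level items only
    var i = 0;
    (function step() {
      if (i >= entries.length) { resolve(); return; }
      var entry = entries[i++];
      onItem(entry[0], entry[1]);        // one callback per item
      setTimeout(step, 0);               // yield between items
    })();
  });
}

// Usage: called once for every item in the root object.
filteredParse('{"a": 1, "b": 2}', function (key, value) {
  console.log(key, value);
});
```

The point of the filter is exactly what's described above: the consumer opts into a coarse granularity (root-level items) instead of being flooded with an event per information item.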

Yet another option is a pull API. There's a lot of experience from the 
XML planet in APIs with specific performance characteristics. They would 
obviously be a lot simpler for JSON; I wonder how well that experience 
translates.
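For comparison, a pull API inverts the control flow: instead of the parser pushing events at you, you ask it for the next token when you're ready (XMLStreamReader-style). Nothing like this is specced for JSON; the sketch below (jsonPullParser is an invented name) is a bare tokenizer with no validation, just to show the shape:

```javascript
// Hypothetical pull-style JSON tokenizer: the caller drives parsing
// by repeatedly asking for the next token. Tokenizer only, no
// well-formedness checking.
function jsonPullParser(src) {
  var pos = 0;
  var punct = new Set(["{", "}", "[", "]", ":", ","]);
  return {
    next: function () {
      while (pos < src.length && /\s/.test(src[pos])) pos++;
      if (pos >= src.length) return { type: "eof" };
      var ch = src[pos];
      if (punct.has(ch)) { pos++; return { type: "punct", value: ch }; }
      if (ch === '"') {
        var end = pos + 1;  // scan to the closing, unescaped quote
        while (src[end] !== '"' || src[end - 1] === "\\") end++;
        var str = JSON.parse(src.slice(pos, end + 1));
        pos = end + 1;
        return { type: "string", value: str };
      }
      // number, true, false, null: read until a delimiter
      var stop = pos;
      while (stop < src.length && !punct.has(src[stop]) &&
             !/\s/.test(src[stop])) stop++;
      var lit = JSON.parse(src.slice(pos, stop));
      pos = stop;
      return { type: "literal", value: lit };
    }
  };
}

// Usage: the consumer controls the pace, and can stop early.
var p = jsonPullParser('{"a": [1, true]}');
var token;
while ((token = p.next()).type !== "eof") console.log(token);
```

As noted above, JSON's grammar is so much smaller than XML's that a real pull parser would be far simpler than its XML counterparts; whether the ergonomics carry over is the open question.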

-- 
Robin Berjon - http://berjon.com/ - @robinberjon
Received on Friday, 8 March 2013 09:45:33 GMT
