
Re: stable paging

From: John Arwe <johnarwe@us.ibm.com>
Date: Mon, 17 Feb 2014 08:33:52 -0500
To: Linked Data Platform WG <public-ldp-wg@w3.org>
Message-ID: <OF4EB86D38.60B8C66D-ON85257C82.0046C9C7-85257C82.004A8481@us.ibm.com>
> ....  Lossy paging would result in postings not 
> being shown to some people in some circumstance, which is likely to 
> be unacceptable. 

This makes it sound as if the real chafing point is the client's inability 
to detect when it never sees something (i.e., when it needs to "start 
over" if it cares about completeness), which is different from having a 
problem with lossy paging per se.  In our current implementations (see 
other email), we also ended up giving clients a signal by which they could 
know that they missed something and hence need to start over if they care 
about completeness; [1] is the spec many of them are following.

[1] http://open-services.net/wiki/core/TrackedResourceSet-2.0/
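For what it's worth, the restart behavior such a signal enables can be 
sketched roughly as follows.  This is purely illustrative: fetch_page and 
the members/state/next field names are invented for the sketch, not taken 
from [1] or from any of the paging drafts.

```python
def read_all(fetch_page, first_url, max_restarts=3):
    """Collect every member of a paged collection, restarting whenever
    the server's state token changes mid-traversal (meaning the client
    may have missed something).  fetch_page(url) is assumed to return a
    dict with "members", "state", and an optional "next" URL."""
    for _ in range(max_restarts):
        members, token, url = [], None, first_url
        while url:
            page = fetch_page(url)
            if token is None:
                token = page["state"]          # remember first page's token
            elif page["state"] != token:
                break                          # collection changed: restart
            members.extend(page["members"])
            url = page.get("next")
        else:
            return members                     # finished with a stable token
    raise RuntimeError("collection kept changing; gave up")
```

The point of the sketch is only that the client can tell the difference 
between "done and complete" and "done but possibly lossy" when the server 
exposes such a token.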

> .... As with static paging, the server can, at any time, give 
> up on a particular paging function and answer 410 GONE for those 
> URLs.  ...

This is an interesting variation.  Many client apps are written to treat 
4xx codes as errors, yet "page gone" is something of an "expected error" - 
more like a 5xx in some ways: nothing in the client's code caused the 410 
(though that would be true of 410 in general, aside from cases where the 
same code already deleted the request-URI for which the 410 is sent).

Nit: "Stable" seems a bit strong.  This is more a bounded-loss case, isn't 
it?

> ..., but each triple which could 
> theoretically ever be in the graph is assigned to a particular page.

Does this imply that you need a closed model in order to implement it? 
Otherwise the number of triples that could theoretically ever be in the 
graph is infinite, so you fall somewhere in the space between needing 
infinitely many pages, having some pages that are too large to transfer 
(defeating the purpose), and having an infinite number of mapping 
functions.  It's sounding like some of the exchanges the WG has had on 
'reasoning' ... theoretically NP, but in practice not so bad.
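One way to read "every possible triple is assigned to a page" without 
enumerating an infinite set is a fixed mapping function over the triple 
itself, e.g. a hash.  This is my guess at the shape of the thing, not the 
proposal's actual mechanism, and note it dodges the infinite-pages horn 
only by accepting the unbounded-page-size horn above:

```python
import hashlib

def page_of(triple, num_pages=16):
    """Deterministically assign an (s, p, o) triple to a page index.
    Any triple that could ever exist already has a page, with no need
    to enumerate the (infinite) space of possible triples."""
    key = "\x00".join(triple).encode("utf-8")
    return int.from_bytes(hashlib.sha256(key).digest()[:8], "big") % num_pages

def paginate(triples, num_pages=16):
    """Bucket the triples currently in a graph by their fixed page."""
    pages = [[] for _ in range(num_pages)]
    for t in triples:
        pages[page_of(t, num_pages)].append(t)
    return pages
```

The catch, as above, is that any given page can grow without bound, so a 
hostile or merely busy graph can still make a page too large to transfer.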

I'm wondering if generic graph stores would have any problem with it, 
since they definitionally have open models and hence know basically 
nothing about the triples that might theoretically exist over time in a 
resource.


Best Regards, John

Voice US 845-435-9470  BluePages
Tivoli OSLC Lead - Show me the Scenario
Received on Monday, 17 February 2014 13:34:23 UTC
