Re: {a draft proposal} How to publish SPARQL endpoint limits/metadata?

On 10/14/13 9:16 AM, Ghislain Atemezing wrote:
> Hi Frans, all
> I've started a draft vocabulary for discussion re this thread.
> I've re-used the following classes from other vocabulary namespaces:
> sd:Feature, http:Message, interval:CalendarInterval, dctype:Software,
> and org:Organization.
>> Below are some things that I think would benefit communication between
>> SPARQL endpoints and user agents. Please know that I am a novice in
>> Linked Data, so perhaps some of these are already covered by existing
>> standards or best practices, or do not make sense.
>>
>>  1. The maximum number of results per request (hard limit)
>>  2. The number of remaining requests (this could be used for a
>>     throttling mechanism that allows only a certain number of requests
>>     per unit of time per IP address. I remember interacting with a data
>>     service on the web that put this information in the HTTP
>>     response headers)
>>  3. The time period of the next scheduled downtime
>>  4. The version(s) of the protocol that are supported
>>  5. (the URI of) a document that contains a human-readable SLA or fair
>>     use policy for the service
>>  6. URIs of mirrors
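
To make these six items concrete, here is a rough Turtle sketch of how they
might hang off an sd:Service description. The ex: terms are placeholders
invented for illustration only, not the terms in Ghislain's attached draft:

@prefix sd:  <http://www.w3.org/ns/sparql-service-description#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix ex:  <http://example.org/endpoint-limits#> .   # placeholder namespace

<http://example.org/sparql> a sd:Service ;
    sd:endpoint <http://example.org/sparql> ;
    # 4. supported query language / protocol version
    sd:supportedLanguage sd:SPARQL11Query ;
    # 1. hard limit on the number of results per request
    ex:maxResultsPerRequest 10000 ;
    # 2. remaining request budget (could equally be exposed via HTTP response headers)
    ex:remainingRequests 500 ;
    # 3. next scheduled downtime (the draft reuses interval:CalendarInterval for this)
    ex:nextScheduledDowntime "2013-10-20T02:00:00Z"^^xsd:dateTime ;
    # 5. human-readable SLA / fair-use policy document
    ex:fairUsePolicy <http://example.org/sla> ;
    # 6. mirror endpoints
    ex:mirror <http://mirror.example.org/sparql> .

Whatever the final property names turn out to be, publishing them alongside
the existing sd: service description means a user agent only needs one lookup
to discover both the endpoint's capabilities and its limits.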
> Please find attached the .ttl file and a graph representation.
>
> So, let's start from here?!
> @TODO: Any idea for a place to collaboratively update this draft?
> GitHub? A W3C Community Group? A volunteer to host it on his/her server?
>
> HTH
>
> Ghislain
>
Best to use an owl:imports relation to hook in terms from the other 
vocabularies :-)
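
For example, something along these lines in the vocabulary's ontology header.
The ontology IRI below is a placeholder, and the imported IRIs are the usual
ones for sd:, http:, dctype:, org: and interval:, but worth double-checking
against the namespaces the draft actually uses:

@prefix owl: <http://www.w3.org/2002/07/owl#> .

<http://example.org/endpoint-limits>                       # placeholder IRI for the draft vocabulary
    a owl:Ontology ;
    owl:imports
        <http://www.w3.org/ns/sparql-service-description> ,   # sd:
        <http://www.w3.org/2011/http> ,                        # http: (HTTP Vocabulary in RDF)
        <http://purl.org/dc/dcmitype/> ,                       # dctype:
        <http://www.w3.org/ns/org#> ,                          # org:
        <http://reference.data.gov.uk/def/intervals> .         # interval:

That way, consumers that do any reasoning pick up the semantics of the
imported terms rather than just the prefix bindings.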

-- 

Regards,

Kingsley Idehen	
Founder & CEO
OpenLink Software
Company Web: http://www.openlinksw.com
Personal Weblog: http://www.openlinksw.com/blog/~kidehen
Twitter/Identi.ca handle: @kidehen
Google+ Profile: https://plus.google.com/112399767740508618350/about
LinkedIn Profile: http://www.linkedin.com/in/kidehen

Received on Monday, 14 October 2013 18:08:12 UTC