Re: GSP validator and tests

From: Chime Ogbuji <chimezie@gmail.com>
Date: Sat, 20 Oct 2012 15:24:39 -0400
To: Andy Seaborne <andy.seaborne@epimorphics.com>
Cc: public-rdf-dawg@w3.org
Message-ID: <E3028DC8EFF54E72B845B5EE8120DAC9@gmail.com>
On Friday, October 19, 2012 at 11:49 AM, Andy Seaborne wrote:
> On 17/10/12 06:14, Chime Ogbuji wrote:
> > There is now a validator for GSP implementations listening here:
> > 
> > http://metacognition.info/gsp_validation/gsp.validator.form
> Progress report:
> == EARL
> Did I miss a way to get an EARL report? c.f. the protocol validator.
No, you didn't.  Looking at the minutes, we only committed to set up "a validator instead of machine-readable tests".  So, I didn't add any capability to generate EARL at the time, but I have since added content-negotiable EARL reporting to the validator.

I imagine it would also make it easier to integrate the results into Gregg's implementation report.
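For what it's worth, a minimal sketch of the kind of content negotiation involved. This is not the validator's actual code; the supported media types and the HTML fallback are my assumptions, and q-values are ignored in favor of header order for brevity:

```python
# Hypothetical sketch: pick an EARL report serialization from an Accept
# header. Media types and fallback are assumptions, not the validator's code.

def choose_earl_format(accept_header):
    """Return the first supported media type in the Accept header,
    or fall back to a human-readable HTML report."""
    supported = ["text/turtle", "application/rdf+xml", "text/html"]
    for part in accept_header.split(","):
        media_type = part.split(";")[0].strip()  # drop q-value parameters
        if media_type in supported:
            return media_type
    return "text/html"  # assumed fallback

print(choose_earl_format("text/turtle, text/html;q=0.5"))  # text/turtle
```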
> == The validator only tried 18 of 22 tests
> Fuseki did not receive any multipart tests (which it does not implement) 
> (test 15 and check 16)

Yes, it turns out there was some more work to be done on the validator for these tests and I have added the multipart/form-data tests.  
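For reference, a rough sketch of how a multipart/form-data body for such a test might be assembled by hand. The boundary, form field name, and payload here are illustrative assumptions, not the values the validator actually sends:

```python
# Hypothetical sketch of a multipart/form-data request body carrying RDF
# payloads; boundary, field name, and content are illustrative assumptions.

def multipart_body(boundary, files):
    """files: list of (filename, content_type, payload) tuples."""
    parts = []
    for filename, content_type, payload in files:
        parts.append(
            "--%s\r\n"
            'Content-Disposition: form-data; name="graph"; filename="%s"\r\n'
            "Content-Type: %s\r\n"
            "\r\n"
            "%s\r\n" % (boundary, filename, content_type, payload)
        )
    parts.append("--%s--\r\n" % boundary)  # closing delimiter
    return "".join(parts)

body = multipart_body(
    "xyzBOUNDARYxyz",
    [("data.ttl", "text/turtle", "<http://example/s> <http://example/p> 1 .")],
)
```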
> I may never implement these because it conflicts with treating the graph 
> store itself RESTfully - i.e. POST of data means append to the store 
> (quads or triples)

Can you elaborate on how this is a conflict?  A graph store is not the same as RDF graph content, so it seems reasonable (IMO) that POST requests to each would behave differently: POST of data to RDF graph content means append to it, while POST of data to the store means create a new graph in the store.

The store is more a container of the other, and this is exactly how the protocol that motivated this interface - AtomPub - handles it.
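The distinction I have in mind could be sketched like this, with the store modeled as a dictionary of graphs. The graph-naming scheme for minted IRIs is a made-up assumption, as is the behavior for each target:

```python
# Toy model of the POST distinction described above: POST to the store URL
# creates a new graph; POST to a graph's URL appends to it. The minting
# scheme for new graph IRIs is an assumption for illustration only.

def handle_post(base, graphs, target, triples):
    """base: the graph store URL; graphs: dict of graph IRI -> set of triples."""
    if target == base:
        # POST to the graph store itself: mint a new graph.
        new_iri = "%s/g%d" % (base, len(graphs) + 1)
        graphs[new_iri] = set(triples)
        return 201, new_iri  # 201 Created, with the new graph's location
    # POST to RDF graph content: append the triples.
    graphs.setdefault(target, set()).update(triples)
    return 200, target

graphs = {}
status, loc = handle_post("http://example/ds", graphs,
                          "http://example/ds", [("s", "p", "o")])
status2, _ = handle_post("http://example/ds", graphs, loc, [("s", "p", "o2")])
```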
> == Rerunning the tests I got:
> gsp.validator.run =>
> [[
> Internal Server Error
> Server got itself in trouble
> Error 500
> ]]
> so then I lost the report at that point :-(
Uh oh.  I have addressed the error your run raised, as well as others.  Please try again when you get a chance.
> == Test 5 - "PUT - default graph"
> A PUT to /ds/?default
> failed
> expected status 201, received 204 ()
> ?? The default graph always exists so it should never return 201 (created)
I agree - in fact, my own implementation was failing this test because it returns 204.  I have changed the expected status on the test web page and in the validator.
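The agreed rule could be summarized in code like this; the `"urn:x-default"` key standing in for the default graph is a made-up placeholder:

```python
# Sketch of the status rule above: PUT to a graph that already exists
# (and the default graph always exists) answers 204 No Content, while a
# PUT that creates a new named graph answers 201 Created.
# The default-graph key is a hypothetical placeholder.

def put_status(graphs, target, triples, default_graph="urn:x-default"):
    existed = target == default_graph or target in graphs
    graphs[target] = set(triples)  # PUT replaces the graph's contents
    return 204 if existed else 201

g = {}
print(put_status(g, "urn:x-default", []))          # 204: always exists
print(put_status(g, "http://example/g1", []))      # 201: freshly created
```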
> == Turtle content type:
> The content type for Turtle is
> Content-type: text/turtle; charset=utf-8
I have made these changes in the tests and the validator.
> == Coverage
> Are there no tests for graph= ?
> PUTting via direct and GETting via indirect, and vice versa, would be 
> good tests to have.

I have changed the tests to account for this.  In particular, "PUT - empty graph" and "GET of PUT - Initial state" now use indirect graph identification.
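For anyone following along, the two forms of graph identification in the GSP look roughly like this: direct naming dereferences the graph IRI itself, while indirect naming passes the graph IRI (or `default`) as a query parameter on the store URL. The example base URL is of course made up:

```python
# Direct vs. indirect graph identification in the Graph Store Protocol.
from urllib.parse import quote

def direct_url(graph_iri):
    # Direct: the graph IRI itself is the request URL.
    return graph_iri

def indirect_url(graphstore, graph_iri):
    # Indirect: the store URL carries the graph IRI, percent-encoded,
    # in the ?graph= query parameter.
    return "%s?graph=%s" % (graphstore, quote(graph_iri, safe=""))

def default_graph_url(graphstore):
    # Indirect identification of the default graph.
    return graphstore + "?default"

print(indirect_url("http://example/ds", "http://example/g1"))
# -> http://example/ds?graph=http%3A%2F%2Fexample%2Fg1
```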
> == Empty graphs
> Whether a graph store responds 200 or 404 to an empty graph is best left 
> untested or at least documented in the tests - the same issues arise in 
> SPARQL Update.

FYI, my implementation fails all the empty graph tests (because it doesn't support empty graphs).  When you say 'or at least documented in the tests', 
do you mean adding a notice to the empty graph tests, stating that this (what we have now) is just one way an implementation could respond? 
> A quad store might respond 404 to an empty graph or even always say 200 
> to any graph.

The latter case makes it hard to *fully* test DELETE, doesn't it?  A 200 OK response to DELETE implies that a subsequent request for the DELETEd resource must return 404, even for a store that supports empty graphs.

I think empty graph tests should be removed.  What do you think?
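A toy model of the point about DELETE: if DELETE succeeds with 200, a later request for the same graph should observably yield 404, which a store that answers 200 for every conceivable graph cannot express. This is only an illustration of the argument, not anyone's implementation:

```python
# Toy store illustrating that a successful DELETE must be observable:
# after DELETE returns 200, a GET for the same graph should return 404.

class ToyStore:
    def __init__(self):
        self.graphs = {}

    def delete(self, iri):
        if iri in self.graphs:
            del self.graphs[iri]
            return 200
        return 404

    def get(self, iri):
        return 200 if iri in self.graphs else 404

s = ToyStore()
s.graphs["http://example/g1"] = {("s", "p", "o")}
print(s.delete("http://example/g1"))  # 200
print(s.get("http://example/g1"))     # 404: the DELETE is observable
```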
> The only other option is to acknowledge that GSP forces choice of 
> implementation technology.

I think that would be too heavy-handed.  Each implementation that supports empty graphs just needs to reconcile this capability with expected behavior over HTTP.  I, for example, have chosen not 
to implement empty graphs because (IMO and amongst other reasons) they don't make sense through an HTTP lens.
> == Practical assumptions
> This test was running with an experimental service that was processing 
> any protocol and any request. This is less secure (normal security 
> frameworks can't spot updates so easily - e.g. Apache Shiro).

Yes, I agree.  I had the same hesitation about my test endpoint.  Do you have suggestions?  Or perhaps we just bring down our instances after running the tests?  I would rather not host the validator indefinitely, for the same reasons.
> I had to add /ds as the graph store as well. Fuseki does not support 
> graph store on "/" because it conflicts with serving up HTML pages. 
> Standard servlet dispatch by name gets confused, and having to analyse 
> each request in the application is unnecessary work and makes it hard to 
> integrate with other protocols (SPARQL protocol, plain old pages).

> == What is the graph store?
> (discussion point)
> I was initially confused by GRAPHSTORE:
> [[
> $GRAPHSTORE$ is the path of the URL of the graph store
> ]]
> but operations include:
> PUT /person/1.ttl
> so doesn't that mean GRAPHSTORE is fixed as "/person" or as "/" because 
> graphs named directly are within the naming tree of the graph store?

> This is all a bit of a problem because it is presuming how services are named
> [[
> Graph Store - A mutable repository of RDF graphs managed by one or more 
> services
> ]]
> so does the "graph store" not have a URI? only the services?

In retrospect, that should read (they all should be changed to this):

PUT $GRAPHSTORE$/person/1.ttl 

That is how the validator creates the URLs it uses in the tests. Basically, $GRAPHSTORE$ is both a common prefix of the graph IRIs
and the graph store URL that the GSP implementation is listening on; each request appends a different suffix to form the request / graph IRI. 

This is just for the convenience of running the tests.
> For Fuseki, the graph store and the services normally have different 
> URLs (it helps with security amongst other things).

It is the same with the Akamu GSP implementations.  
> For the tests, I was running with an all-purpose processor on the graph 
> store URL

Ok, so given the correction I mentioned above, that should work for you, right? 

The validator has been updated and is live, so you can try again and let me know if you run into any more trouble.

Thanks for the detailed feedback.

Chime Ogbuji

Sent with Sparrow (http://www.sparrowmailapp.com) 
Received on Saturday, 20 October 2012 19:25:09 UTC
