RE: saving tested annotations? (model testing)

Sounds about right to me.
-Tim Cole
 
From: Shane McCarron [mailto:shane@spec-ops.io] 
Sent: Thursday, September 08, 2016 9:26 AM
To: Cole, Timothy W <t-cole3@illinois.edu>
Cc: W3C Public Annotation List <public-annotation@w3.org>
Subject: Re: saving tested annotations? (model testing)
 
You are correct - there is no way to record metadata like that in WPT as it stands.  What I would suggest is that the implementer submit XXNN-inputN.json or something as part of the PR to demonstrate what input was tested.  These could even go in a folder with one input file per test case (which would lend itself to automation in the future).  In that case the layout would look something like:
 
XXNN-input/
   annotations/
      annotationAgentOptionals-manual.jsonld
      annotationMusts-manual.jsonld
      annotationOptionals-manual.jsonld
 
Optional, of course.  But if the vendor likes, that would be most helpful.  
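 
For illustration, one of the saved input files (e.g. annotationMusts-manual.jsonld) might contain just the members the model requires. This is a sketch assuming the Web Annotation Data Model's required properties, not an actual file from the suite; the id and target URLs are placeholders:
 
```json
{
  "@context": "http://www.w3.org/ns/anno.jsonld",
  "id": "http://example.org/anno1",
  "type": "Annotation",
  "target": "http://example.com/page1"
}
```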
 
 
On Thu, Sep 8, 2016 at 8:25 AM, Cole, Timothy W <t-cole3@illinois.edu> wrote:
Shane- 
 
As best I can tell, our test suite does not record anywhere the json-ld that is submitted and tested. Is this correct? 
 
(We do now have a place to record a mapping between codes and clients: http://w3c.github.io/test-results/annotation-model/README.md )
 
If so, would it be worth providing a means by which implementers can submit the json-ld tested alongside the test result reports they submit?  This would be useful if bugs are discovered in our test schemas (likely); tests could then be rerun.
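 
If the tested inputs were saved in a folder per test case as discussed in this thread, rerunning a check after a schema fix could be as simple as walking that folder. A minimal sketch, assuming the folder layout from Shane's example and a hard-coded list of the Web Annotation model's required members; the file names and helper names here are illustrative, not part of the actual suite:
 
```python
import json
from pathlib import Path

# Members the Web Annotation Data Model requires on every annotation.
REQUIRED = {"@context", "id", "type", "target"}

def missing_musts(doc: dict) -> list:
    """Return the required members absent from an annotation, sorted."""
    return sorted(REQUIRED - doc.keys())

def recheck_folder(folder: str) -> dict:
    """Map each saved .jsonld input file to its missing required members."""
    results = {}
    for path in Path(folder).glob("annotations/*.jsonld"):
        with open(path) as f:
            results[path.name] = missing_musts(json.load(f))
    return results

# A complete annotation passes; an incomplete one reports what is missing.
anno = {
    "@context": "http://www.w3.org/ns/anno.jsonld",
    "id": "http://example.org/anno1",
    "type": "Annotation",
    "target": "http://example.com/page1",
}
print(missing_musts(anno))                    # []
print(missing_musts({"type": "Annotation"}))  # ['@context', 'id', 'target']
```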
 
Thanks,
 
Tim Cole



 
-- 
Shane McCarron
Projects Manager, Spec-Ops

Received on Thursday, 8 September 2016 16:04:21 UTC