Re: saving tested annotations? (model testing)

You are correct - there is no way to record metadata like that in WPT as it
stands.  What I would suggest is that the implementer submit
XXNN-inputN.json or something similar as part of the PR to demonstrate what
input was tested.  These could even go in a folder with one input file per
test case (which could enable automation in the future).  In that case,
something like:

XXNN-input/
   annotations/
      annotationAgentOptionals-manual.jsonld
      annotationMusts-manual.jsonld
      annotationOptionals-manual.jsonld
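
For illustration, each of those files would just hold the exact JSON-LD the
implementer fed to the corresponding test.  A minimal, purely hypothetical
Web Annotation input might look like:

   {
     "@context": "http://www.w3.org/ns/anno.jsonld",
     "id": "http://example.org/anno1",
     "type": "Annotation",
     "body": "http://example.org/post1",
     "target": "http://example.com/page1"
   }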

Optional, of course.  But if a vendor is willing to include these, it would
be most helpful.
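
To show what the automation angle might look like: with the inputs stored
per test case as above, rerunning them after a schema fix becomes a small
script.  A rough sketch in Python (the paths, the single schema file, and
the pairing of inputs to schemas are all hypothetical - the real suite
organizes its schemas per test):

   import glob
   import json
   import jsonschema  # pip install jsonschema

   # Hypothetical: one schema standing in for whatever the test used.
   with open("annotation-model/schemas/annotationMusts.json") as f:
       schema = json.load(f)

   # Re-validate every stored input against the (possibly fixed) schema.
   for path in glob.glob("XXNN-input/annotations/*.jsonld"):
       with open(path) as f:
           annotation = json.load(f)
       try:
           jsonschema.validate(annotation, schema)
           print("{0}: PASS".format(path))
       except jsonschema.ValidationError as e:
           print("{0}: FAIL - {1}".format(path, e.message))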


On Thu, Sep 8, 2016 at 8:25 AM, Cole, Timothy W <t-cole3@illinois.edu>
wrote:

> Shane-
>
> As best I can tell, the results submitted through our test suite do not
> record anywhere the JSON-LD that was tested. Is this correct?
>
> (We do now have a place to record a mapping between codes and clients
> http://w3c.github.io/test-results/annotation-model/README.md)
>
> If so, would it be worth providing a means by which implementers can
> submit the JSON-LD they tested alongside the test result reports they
> submit? This would be useful if bugs are discovered in our test schemas
> (likely); the tests could then be rerun.
>
> Thanks,
>
> Tim Cole
>



-- 
Shane McCarron
Projects Manager, Spec-Ops
