RE: annotation data model -- running test scripts

Somewhere I saw succeedAndContinue; I'll switch to passAndContinue and see if that works. Sorry for the confusion.
Regarding errorMessage: yes, I would like to see that before, or even instead of, the AJV error messages, which are really not at all useful.
-Tim Cole
From: Shane McCarron [] 
Sent: Friday, August 12, 2016 9:43 AM
To: Cole, Timothy W <>
Cc: W3C Public Annotation List <>
Subject: Re: annotation data model -- running test scripts
Oh, and according to the documentation in, the values for onUnexpectedResult are: 
|onUnexpectedResult   | `failAndContinue`, `failAndSkip`, `failAndAbort`, `passAndContinue`, `passAndSkip`, `passAndAbort` | Action to take when the result is not as expected. Default is `failAndContinue` |
See <> 
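For illustration, a hedged sketch of how that option might appear in a .test file — the "name" key and the overall shape are assumptions about the harness format; only "assertions" and "onUnexpectedResult" are confirmed above:

```json
{
  "name": "3.1-annotationMustKeys",
  "onUnexpectedResult": "failAndSkip",
  "assertions": [
    "annotations/3.1-annotationMustKeys.json"
  ]
}
```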
On Thu, Aug 11, 2016 at 9:39 PM, Cole, Timothy W < <> > wrote:
Okay, I broke a selection of the individual schemas / assertions into individual files. This helped in that I now see the name of each assertion one-by-one, along with results, but I don't see the errorMessage. And while expectedResult seems to operate as expected, onUnexpectedResult seems to have no effect: regardless of whether the value is failAndContinue or succeedAndContinue, the test harness reports fail when the result is unexpected.
For the example tests so far, see: <> <> 
or run both old (some of which reference schemas no longer available) and new tests by going to <> 
and specifying for path either /annotation-model/annotations or /annotation-model/bodiesTargets
We can discuss tomorrow...
-Tim Cole


From: Shane McCarron [ <> ]
Sent: Thursday, August 11, 2016 15:41
To: Cole, Timothy W
Cc: W3C Public Annotation List
Subject: Re: annotation data model -- running test scripts
Comments inline: 
On Thu, Aug 11, 2016 at 11:01 AM, Timothy Cole < <> > wrote:
We've made progress on rounding out the json schemas / assertions needed for testing the model.  Still more to generate, but now trying to create and generate the test scripts to understand better how that part of the process works.  
One glitch is that the test harness processes assertions (json schemas) as an entire file, i.e., it does not seem to allow referencing individual schemas within a single .json file. We can work around this, but I wanted to make sure I wasn't missing something obvious. So for example,
3.1-annotationMustKeys.json contains 6 sub-schemas in #/definitions/.  So I wrote the assertions part of the test script thusly:
  "assertions": [
The test harness ignored the fragment part of the ids. If there's a way, when specifying assertions, to reference sub-schemas directly, please let me know. Again, not critical, but it would be nice for a couple of logistical reasons.
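For illustration, a fragment-style reference of the kind described would look roughly like this — the definition names here are hypothetical stand-ins for the six sub-schemas in #/definitions/:

```json
{
  "assertions": [
    "3.1-annotationMustKeys.json#/definitions/annotationContextKey",
    "3.1-annotationMustKeys.json#/definitions/annotationIdKey",
    "3.1-annotationMustKeys.json#/definitions/annotationTypeKey"
  ]
}
```

As noted, the harness dropped everything after the `#`, so each entry resolved to the whole file rather than the intended sub-schema.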
Assume this doesn't work.  I don't have the cycles right now to try to sort it out and we are weeks behind.  Either put them in the definitions directory and then reference them inline in the .test file, or split them apart.
Otherwise the tests worked as expected, catching the errors I introduced. When errors are encountered, AJV error messages appear in the JSON output from the test harness. However, the schemas themselves include errorMessage values, and these values don't appear in the JSON output as best I can tell. Do these error messages appear elsewhere, or is there something different I should be doing to make these messages appear when a schema fails?
The error message should be attached to the AJV output.  I will look into this.
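For reference, a minimal sketch of the errorMessage pattern in question, assuming it sits at the schema level alongside the constraints — the $schema draft, key names, and message text are illustrative, not taken from the actual test files:

```json
{
  "$schema": "http://json-schema.org/draft-04/schema#",
  "title": "Annotation must have required keys",
  "type": "object",
  "required": ["@context", "id", "type"],
  "errorMessage": "An Annotation MUST have @context, id, and type keys."
}
```

The open question above is whether the harness surfaces that errorMessage string in place of (or alongside) AJV's generated messages when validation fails.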
Could you clarify the meaning of the testType parameter?
It has no meaning.  
FYI, I will eliminate the old test scripts that reference schemas no longer present so as to streamline running of the tests. I expect to add several more manual tests in appropriate folders today.
Tim Cole

Shane McCarron 
Projects Manager, Spec-Ops

Received on Friday, 12 August 2016 14:46:17 UTC