RE: Model testing - body and targets and suggestions for testing UI

Okay, though if A and/or B at least were easy, it would make a better first impression as we recruit implementers...

Regarding D, when you do get a chance to look at it, one option might be to add checkboxes that simply suppress reporting of failed assertions that are SHOULD or MAY (as opposed to MUST), and that suppress the test harness and AJV error messages. The report would then tell us that the annotation passed all MUST assertions and which features associated with the SHOULD and MAY assertions it implemented. The checkboxes could be unchecked by those who want to see the full detail...
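A rough sketch of what that checkbox filtering might look like, assuming each assertion result records its RFC 2119 level; the names here (filterResults, showOptionalFailures, the result shape) are made up for illustration, not the harness's actual API:

```javascript
// Suppress failed SHOULD/MAY assertions unless the reader asks for them.
// Passes and MUST failures are always reported.
function filterResults(results, opts) {
  return results.filter(r => {
    if (r.passed) return true;            // always report passes
    if (r.level === 'must') return true;  // MUST failures always shown
    return !!opts.showOptionalFailures;   // should/may failures only on request
  });
}

// Example: with the checkbox unchecked, only the MUST failure is reported.
const results = [
  { id: 'a1', level: 'must',   passed: false },
  { id: 'a2', level: 'should', passed: false },
  { id: 'a3', level: 'may',    passed: true  },
];
const shown = filterResults(results, { showOptionalFailures: false });
// shown contains a1 and a3 only
```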

-Tim Cole
From: Shane McCarron []
Sent: Thursday, September 01, 2016 11:16
To: Cole, Timothy W
Cc: W3C Public Annotation List
Subject: Re: Model testing - body and targets and suggestions for testing UI

I will look into making these improvements.  However, they don't impact the initial integration.  We should pull in the updated tests ASAP.

On Thu, Sep 1, 2016 at 9:56 AM, Timothy Cole <<>> wrote:

New data model tests (i.e., checking body and target constraints) have been added to the annotation-level tests already available on the testdev server. These are ready to be migrated up to production, but as mentioned last night we need to get rid of the old, deprecated tests and JSON files first. I also have some feedback on the test script UI that would be nice to address if not too difficult (see below). Where we are with the data model tests:

Mandatory tests - all annotations should pass all assertions checked by these 3 tests:
1. Annotation-level:<>  (14 assertions)
2. Body-level:<>  (16 assertions)
3. Target-level:<>  (15 assertions)

Recommended / Optional tests - most individual annotations will "fail" a majority of these assertions:
1. Annotation-level:<>  (15 assertions)
2. Annotation-level Agents:<>  (16 assertions)
3. Body-level:<>   (28 Assertions)
4. Target-level:<>  (25 assertions)
5. Body and Target-level Agents:<>  (16 assertions)


A. The text box for inputting the JSON-LD annotation and the Check JSON button need to appear on the form before the list of assertions being checked. The list of assertions is of variable length and can be quite long. When you have to paste in your annotation 8 times, it is better that the input box sit higher on the form and always in the same spot.

B. Per discussion, the title of each assertion contains mark-down. This renders formatted on the input form, but in the summary of results the mark-down is not being processed. Any way to fix this?

C. When a test 'fails', the error message from the assertion is displayed first (good), but then the AJV error message and test script trace are concatenated after it, which is not useful to the person submitting annotations for testing. Is there any way to suppress the AJV and test harness trace messages (e.g., in hidden HTML), or to format them less prominently?

D. Most annotations will fail most recommended / optional assertions. These assertions are intended to identify which features have been implemented in any given annotation and do not really go to validation per se, and relatively few individual annotations implement more than a handful of optional features. Example 44 from our model passes only 3 out of 25 target 'tests'. Is there any option on SHOULD and MAY assertions to tone down the red small-caps FAIL message? Something in yellow, and maybe a different word than FAIL? Alternatively, we may need to discuss granularity further - separate post forthcoming.
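One possible mapping for the toned-down badge: reserve the red FAIL for MUST assertions and render an unmet SHOULD/MAY as a softer yellow label. The function name, wording, and CSS class names below are placeholders, not the harness's existing styling:

```javascript
// Map an assertion's RFC 2119 level and outcome to a report badge.
function resultBadge(level, passed) {
  if (passed) return { text: 'PASS', cssClass: 'badge-pass' };
  if (level === 'must') return { text: 'FAIL', cssClass: 'badge-fail' };   // red
  return { text: 'NOT IMPLEMENTED', cssClass: 'badge-optional' };          // yellow
}
```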

Any easy fixes that help ameliorate these issues before moving the latest tests into production would be appreciated.


Tim Cole

Shane McCarron
Projects Manager, Spec-Ops

Received on Thursday, 1 September 2016 16:31:30 UTC